
If I can use this with a local LLM it could be useful.


Yeah. This seems like an area where a “tiny” (2-4GB) local model would be more than sufficient to generate very high quality queries and schema answers to the vast majority of questions. To the point that it feels outright wasteful to pay a frontier model for it.


Ollama support is included by default; you just add the endpoint URL yourself.
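For context, Ollama serves an OpenAI-compatible API on localhost:11434, so pointing a tool at it usually just means supplying that base URL. A minimal sketch of the request you'd send, assuming the tool speaks the OpenAI chat format (the model name here is illustrative, any locally pulled model works):

```python
import json

# Ollama's OpenAI-compatible endpoint (default local install).
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style payload to POST to {OLLAMA_BASE_URL}/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Example: asking a small local model a schema question.
payload = build_chat_request("llama3.2:3b", "List the tables in this schema.")
print(json.dumps(payload, indent=2))
```

Any client that lets you override the API base URL can be pointed at `OLLAMA_BASE_URL` this way; no API key is required for a local instance.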



