Hacker News

It may not be exactly as good, but it's also a 7B parameter model vs. 175B parameters for GPT-3.5 (i.e., text-davinci-003). People are running the new model on their phones, laptops, and Raspberry Pis.

People's mouths were watering over the commercial implications of the recent 90% drop in cost for the new ChatGPT model. Now imagine getting similar performance from a model that requires <5% of the parameters.
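The "<5%" figure checks out as back-of-the-envelope arithmetic; here's a quick sketch. The 4-bit quantization figure is an assumption based on how people commonly run the 7B model on consumer hardware, not something stated above:

```python
# Rough math: parameter ratio and weight-storage footprint.
# 7B and 175B come from the comment above; 4-bit quantization
# is an assumed setting for running the model locally.

def param_ratio(small_b: float, large_b: float) -> float:
    """Fraction of parameters the smaller model uses."""
    return small_b / large_b

def weights_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate storage for the weights alone, in GB."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

ratio = param_ratio(7, 175)   # 0.04, i.e. 4% -- the "<5%" claim
mem = weights_gb(7, 4)        # ~3.5 GB at 4-bit quantization
print(f"{ratio:.0%} of the parameters, ~{mem:.1f} GB of weights")
```

At ~3.5 GB of weights, it's plausible that the model fits in memory on a laptop or a higher-end phone, which matches the reports above.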




Not to mention, I think you want your knowledge in some kind of database and have the AI do only the conversational part.

That would be a much easier and more reliable (and explainable) way to do it: just find and parrot the facts.
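The "find and parrot the facts" idea can be sketched in a few lines: retrieve matching facts from a store first, then hand only those facts to the model to phrase. The fact list and the keyword-overlap scoring below are hypothetical stand-ins for a real database and retriever:

```python
# Minimal retrieval sketch: the knowledge lives in a plain fact store,
# and the language model would only be asked to phrase what's retrieved.
# FACTS and the scoring function are illustrative assumptions.

FACTS = [
    "The Eiffel Tower is 330 metres tall.",
    "The Eiffel Tower was completed in 1889.",
    "Mount Everest is 8,849 metres tall.",
]

def retrieve(question: str, facts: list[str], k: int = 2) -> list[str]:
    """Rank facts by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        facts,
        key=lambda f: len(q_words & set(f.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, facts: list[str]) -> str:
    """Constrain the model to the retrieved facts only."""
    context = "\n".join(f"- {f}" for f in retrieve(question, facts))
    return f"Answer using only these facts:\n{context}\nQ: {question}\nA:"

print(build_prompt("How tall is the Eiffel Tower?", FACTS))
```

Because the answer is grounded in retrieved rows rather than the model's weights, a wrong answer can be traced back to either the retrieval step or the store, which is what makes this approach more explainable.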





