
100%. I don't think we should at all minimize the decades of research it took to get to the current "generative AI boom", but once transformers were introduced in 2017, we basically found that just throwing more and more data at the problem gave you better results.

And not to discount other important advances like RLHF, but the reason everyone describes the big model companies as having "no moat" is that how to recreate these models isn't really a secret. That is basically the complete opposite of, say, companies that really do build "the most challenging products that have been created in all of human history." E.g. nobody has been able to recreate TSMC's success, which requires not only billions of dollars but a highly educated, trained, and specialized workforce.
