Hacker News

"He found a few dozen ink strokes - and some complete letters - that could be labeled and used as training data.

Before long, the model was unveiling traces of crackle invisible to his own eye. Soon, these traces began to form letters and hints of actual words."

This does not sound like a "Large Language Model (LLM)" or any other model trained on a large data set, like the sort hyped by so-called "tech" companies; this sounds relatively small. What am I missing? (Besides brain cells.)




Indeed, it’s a machine learning model, but not a large one. Who called it large?
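To illustrate the point, here is a minimal sketch (not the actual Vesuvius Challenge code, and the data is synthetic) of how a supervised classifier trained on just a few dozen labeled image patches can still learn something useful. It is a plain logistic regression over hypothetical 8x8 grayscale patches, where "ink" patches are assumed to be slightly darker on average:

```python
# Minimal sketch (assumption: NOT the actual method from the article):
# a tiny logistic-regression "ink detector" trained on 40 labeled
# patches, showing that a useful supervised model need not be large.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 40 flattened 8x8 grayscale patches.
n, d = 40, 64
labels = np.repeat([1, 0], n // 2)            # 1 = ink, 0 = no ink
patches = rng.normal(0.5, 0.1, size=(n, d))
patches[labels == 1] -= 0.15                  # ink slightly darkens pixels

# Logistic regression by plain gradient descent: 65 parameters total,
# the opposite end of the scale from an LLM.
w, b = np.zeros(d), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(patches @ w + b)))  # predicted P(ink)
    grad = p - labels                              # per-sample error
    w -= 0.1 * patches.T @ grad / n
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(patches @ w + b))) > 0.5).astype(int)
accuracy = (preds == labels).mean()
print(f"training accuracy on 40 patches: {accuracy:.2f}")
```

With only 40 examples the model fits its training set easily; the real difficulty in the scroll work was producing those labeled examples in the first place, not model scale.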


He made $40,000 without needing a large data set. No proprietary, corporate LLM needed.


calculated on his laptop whilst at a party



