Hacker News

The whole point of NPU-enabled devices is to run models locally, so that your data never leaves your device. This is a huge privacy win.



They're trying to have it both ways, and it's not clear to me as a consumer what is local and what is cloud. (As a developer, I can tell they're doing a few things locally on the NPU, like OCR and webcam background blur, but they are not running ChatGPT on a laptop anytime soon.)


Although the line can get fuzzy when they want to ship a feature that's too big to run locally. Android has run into that: some of the AI features run locally, some run on Google's servers, and some run locally or on Google's servers depending on which device you happen to have.
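To make the routing pattern concrete: this is a minimal sketch of device-dependent dispatch, with every name, class, and number made up for illustration. It is not any real Android API; it just shows the idea that the same feature can resolve to "local" on one device and "cloud" on another.

```python
# Hypothetical local-vs-cloud dispatch sketch (all names and thresholds invented).
# A feature runs on-device only if the device's NPU meets its assumed
# throughput requirement; otherwise the request falls back to a server.

from dataclasses import dataclass


@dataclass
class Device:
    has_npu: bool
    npu_tops: float  # assumed NPU throughput metric, in TOPS


@dataclass
class Feature:
    name: str
    min_tops: float  # minimum NPU throughput needed to run locally


def dispatch(device: Device, feature: Feature) -> str:
    """Return where this feature would run for this device."""
    if device.has_npu and device.npu_tops >= feature.min_tops:
        return "local"
    return "cloud"


flagship = Device(has_npu=True, npu_tops=45.0)
budget = Device(has_npu=False, npu_tops=0.0)

blur = Feature("background-blur", min_tops=5.0)
summarize = Feature("summarization", min_tops=60.0)

print(dispatch(flagship, blur))       # local: small model, capable NPU
print(dispatch(budget, blur))         # cloud: no NPU at all
print(dispatch(flagship, summarize))  # cloud: model too big even for this NPU
```

The last case is the fuzzy one the comment describes: the exact same feature is "local" or "cloud" purely as a function of the hardware it happens to land on.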


The whole point is making consumers pay the cost of running LLMs (both in hardware and power), not protecting your privacy; they will still collect your data to train better models.


The whole point of enshittification is that companies don't need your data but they take it anyway.





