Hacker News

> ig_signals_cupid_better_recall_v1.mlmodelc

Are they using on-device ML? Why?




They're probably just using an already-trained model. They're almost certainly not training a model on-device.

AFAIK both the iPhone and the Pixel have chips specifically designed to run the forward pass of a deep neural network. I don't know much about them though.
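For context, the `.mlmodelc` extension in the quoted filename is a compiled Core ML model, which is exactly how you'd ship a pre-trained model for on-device inference on iOS. A minimal sketch of loading and running one (the feature name "signal" here is made up for illustration; only the filename comes from the thread):

```swift
import CoreML

// Locate the compiled Core ML model bundled with the app.
guard let url = Bundle.main.url(
    forResource: "ig_signals_cupid_better_recall_v1",
    withExtension: "mlmodelc") else {
    fatalError("model not found in bundle")
}

// .all lets Core ML dispatch to the Neural Engine / GPU when available.
let config = MLModelConfiguration()
config.computeUnits = .all

let model = try MLModel(contentsOf: url, configuration: config)

// Inference is a single forward pass; no training happens on-device.
// The input feature name and value are hypothetical.
let input = try MLDictionaryFeatureProvider(dictionary: ["signal": 0.5])
let output = try model.prediction(from: input)
```

The heavy lifting (training) happens server-side; the phone only ever evaluates the frozen model.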


True - but what model would they want to use on-device?

All the feed-ranking stuff is done server-side, I assume... And apart from that, the app is just for posting and reading text and pictures. I don't really see any use for ML there.


> True - but what model would they want to use on-device?

Video/image filters. Very popular on Snapchat, Instagram, TikTok, etc. If you want to compete with those services, you need them.


All sorts of crap like facial recognition or object identification.

Every bit of ML you can offload on your clients is one you don't have to pay for.


If one could profitably offload server-side CPU to the client, then every app would be mining bitcoin on-device.

But it turns out that user complaints of a laggy app, hot phone and bad battery life (and therefore lost users and lost ad revenue) far outweigh any CPU time savings on the server.


I have a suspicion it could be something that does voice recognition via gyroscope data. I believe so for a few reasons:

1. The gyroscope generates a lot of data every millisecond, so it doesn't make any sense to send all of it to the server.

2. You only need to recognise certain keywords/movements of the user for somewhat accurate ad tracking and behavioural analysis.

3. You don't need to prompt the user for permission to access it.
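Point 1 holds up to rough arithmetic. With assumed (not sourced) numbers of a 100 Hz sample rate, 3 axes, and 4-byte floats, streaming the raw sensor data is orders of magnitude more traffic than uploading on-device detection results:

```python
# Back-of-the-envelope: raw gyroscope stream vs. uploading only results.
# All figures below are assumptions for illustration.
sample_rate_hz = 100      # typical mobile gyroscope rate
axes = 3                  # x, y, z
bytes_per_sample = 4      # 32-bit float per axis
seconds_per_day = 86_400

raw_bytes_per_day = sample_rate_hz * axes * bytes_per_sample * seconds_per_day
print(raw_bytes_per_day)          # 103_680_000 bytes, ~104 MB/day

# If on-device ML instead uploads a few hundred small event records a day:
events_per_day = 500
bytes_per_event = 32
event_bytes_per_day = events_per_day * bytes_per_event
print(event_bytes_per_day)        # 16_000 bytes, ~16 KB/day
```

That gap (megabytes vs. kilobytes per user per day) is the usual argument for running the model on the phone and shipping only the outputs.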





