I don't mean it's easy to do, I mean they're doing it. Google is literally announcing a cloud service that uses an LLM to analyze whether code snippets are safe. This is an obviously bad idea, and it's not theoretical: it already exists.

I don't necessarily disagree that it was a leap, but it's a leap we've taken now.




I think I mostly am agreeing with you but I see it through a liability lens.

When Google's cloud service to analyze code snippets screws up, nothing in the ToU should absolve Google of the responsibility they rightly bear if they ship a system with known and understood flaws that are not clearly disclosed to the user (and I don't mean buried in the middle of a paragraph on page 36 of the ToU).

If forcing Google to accept liability kills the service dead, then good - that's probably the right outcome until the service reaches a level of safety at which Google is willing to accept the resultant risk.
