But isn't alignment always going to rely on the cooperation of users? What benefit does the delay offer other than the delay itself?



Why would it rely on cooperation if users don't have the means to change the model enough to misalign it?



