
The discussion here has me wondering whether code produced by an advanced AI would need to use the same coding patterns / abstractions that we've come up with over the past several decades.

If a human won't be maintaining the code and a v2 could literally be rewritten from scratch, would we end up with giant balls of spaghetti code that only the AI could truly understand? At some point will we treat this code like a closed source library that exposes the API we want but whose implementation is unknown to us?




We already don’t understand the AI's inner workings exactly. If those algorithms keep getting optimized, then maybe we’ll just end up with black boxes of “neurons” that somehow do the thing. Machine code might exist only to run the GPU instance.


Totally. I find the videos of people asking ChatGPT to make them "a web app that does X"—which causes it to print out gobs of JS, HTML, and CSS—hilariously human-focused. In a machine-focused world, wouldn't it just spit out an optimized binary executable, containing the web server too if necessary? Why would it need to separate the concerns at all?




