Hacker News

When you think about it, it's astounding how much energy this technology consumes versus a human brain, which runs at ~20 W [1].

[1] https://hypertextbook.com/facts/2001/JacquelineLing.shtml




It’s almost as if human intelligence doesn’t involve performing repeated matrix multiplications over a mathematically transformed copy of the internet. ;-)


It’s interesting that even if raw computing power had advanced decades earlier, this type of AI still wouldn’t have been possible without the vast trove of data that is the internet.


It makes you think there must be more efficient algorithms out there.


Maybe the problem isn't the algorithm but the hardware. Numerically simulating the thermal flow in a lightbulb, or the CFD of a stone flying through the air, is pretty hard, but the physical thing isn't that complex to do. We're trying to simulate the function of a brain, which is basically an analog thing, using a digital computer. Of course that can be harder than running the brain itself.


If you think about human neurons, they seem to basically take inputs from a bunch of other neurons, possibly modulated by chemical levels, and send out a signal when they get enough. It seems like something that could be functionally simulated in software with fairly basic add-up-the-inputs logic, rather than needing the details of all the chemistry.
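A minimal sketch of that "add up the inputs, fire past a threshold" idea (all weights and thresholds here are made-up illustrative numbers, not from any real model):

```python
def spiking_neuron(inputs, weights, threshold):
    # Sum the weighted signals arriving from upstream neurons...
    total = sum(x * w for x, w in zip(inputs, weights))
    # ...and fire (output 1) only if the total crosses the threshold.
    return 1 if total >= threshold else 0

# Three upstream neurons firing: two excitatory, one inhibitory.
print(spiking_neuron([1, 1, 1], [0.6, 0.5, -0.3], 0.7))  # fires: 0.8 >= 0.7
```

Real neurons are of course far messier (timing, chemistry, plasticity), which is the point being debated in this thread.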


Isn’t that exactly what we’re currently doing? The problem is that doing this a few billion times for every token seems to be harder than just powering some actual neurons with sugar.


I think the algorithm is pretty different, though I'm no expert on this stuff. I don't think the brain's processes look like matrix multiplication.


The algorithm (of a neural network) simulates connections between nodes with specific weights and an activation function. The idea was derived from the way neurons are thought to work.
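To make the connection to "repeated matrix multiplications" concrete: each output node is a weighted sum of all inputs plus a bias, passed through the activation, so a whole layer is exactly one matrix-vector product. A rough sketch in plain Python (weights chosen arbitrarily for illustration):

```python
import math

def sigmoid(z):
    # Activation function: squashes the weighted sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def dense_layer(x, W, b):
    # Row i of W holds the connection weights into output node i.
    # The comprehension below is just W @ x + b written out by hand.
    return [sigmoid(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

# Two inputs feeding three nodes.
print(dense_layer([0.5, -1.0],
                  [[0.2, 0.8], [-0.5, 0.1], [1.0, 1.0]],
                  [0.0, 0.3, -0.2]))
```

Stacking many such layers, with billions of weights, is essentially what runs for every generated token.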


lol, "just" done that simply, huh? said by someone who doesn't have a tenth of an understanding of neurobiology or neuropsychology

only on hackernews


20 W for 20 years to answer questions slowly, and error-prone, at the level of a 30B model. With an additional 10 years of highly trained supervision, the brain might start contributing original work.
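For scale, the back-of-envelope energy cost of those 20 years at 20 W (a rough estimate that ignores the rest of the body's overhead):

```python
# 20 W sustained continuously for 20 years, expressed in kilowatt-hours.
watts = 20
hours = 20 * 365.25 * 24   # ~175,000 hours
kwh = watts * hours / 1000
print(f"{kwh:.0f} kWh")    # on the order of a few thousand kWh
```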


Multiply that by a billion, because only a very few individuals out of entire populations can contribute original work.


And yet that 20 W brain can make me a sandwich and bring it to me, while state-of-the-art AI models will fail at that task.

Until we get major advances in robotics and models designed to control them, true AGI will be nowhere near.


> Until we get major advances in robotics and models designed to control them, true AGI will be nowhere near.

AGI has nothing to do with robotics. If AGI is achieved, it will help push robotics and every single scientific field forward with progress never seen before; imagine a million AGIs running in parallel, focused on a single field.


We already have that. It's called civilization.

Maybe you mean quadrillions of AGIs?


A human brain is also more intelligent (hopefully) and lives inside a body. In a way, GPT resembles Google more than it resembles us.


You've discovered the importance of well-formed priors. The human brain is the result of millions of years of very expensive evolution.


A human brain has been in continuous training for hundreds of thousands of years, consuming slightly more than 20 watts.



