> Mmmm well my meatsuit can't easily make my own heart quiver the wrong way and kill me.

It very much can. Jump scares and deep grief are known to cause heart attacks. It is called stress cardiomyopathy. Or your meatsuit can indirectly do that by ingesting the wrong chemicals.

> If you could make an intelligent process, what would it think of an operating system kernel

Idk. What do you think of your hypothalamus? It can make you unconscious at any time. It in fact makes you unconscious about once a day. Do you fear it? What if one day it won’t wake you up? Or what if it jacks up your internal body temperature and cooks you alive from the inside? It can do that!

Now you might say you don’t worry about that, because through your long life your hypothalamus proved to be reliable. It predictably does what it needs to do, to keep you alive. And you would be right. Your higher cognitive functions have a good working relationship with your lower level processes.

Similarly, for an AGI to be intelligent it needs to have a good working relationship with the hardware it is running on. That means that if the kernel is temperamental and, idk, keeps descheduling the higher level AGI process, then the AGI will malfunction and not appear that intelligent. Same as if you meet Albert Einstein while he is chemically put to sleep. He won’t appear intelligent at all! At best he will be just drooling there.

> Can you imagine an intelligent process in such a place, as static representation of data in ram?

Yes. You can’t? This is not really a convincing argument.

> It all sounds frankly ridiculous.

I think what you are doing is looking at implementation details and feeling a disconnect between that and the possibility of intelligence. Do you feel the same ridiculousness about a meatblob doing things and appearing intelligent?

> a computer couldn't spontaneously pick to go down a path that wasn't defined for it.

Can you?

>> Can you imagine an intelligent process in such a place, as static representation of data in ram?

> Yes. You can’t? This is not really a convincing argument.

Fair, I believe it's called begging the question. But for some context: people of many recent technological ages have talked about the brain like a piece of technology -- e.g. like a printing press, a radio, a TV.

I think we've found what we've wanted to find (a hardware-software dichotomy in the brain) and then occasionally get surprised when things aren't all that clearly separated. So with that in mind, I personally, without any particularly good evidence to the contrary, am not of the belief that your brain can be represented as a static state. Pribram's holonomic mind theory comes to mind as one way brain state could resist being represented in RAM. (https://en.m.wikipedia.org/wiki/Holonomic_brain_theory)

> ...you are looking at implementation details and feeling a disconnect between that and the possibility of inteligence. Do you feel the same ridiculousnes about a meatblob doing things and appearing inteligent?

If I were a biologist I might. My grandfather was a microbiologist and scoffed at my atheism. But with a computer at least the details are understandable and knowable, having been created by people. We haven't cracked the consciousness of a fruit fly despite having a map of its brain.

>> a computer couldn't spontaneously pick to go down a path that wasn't defined for it.

> Can you?

Love it. I re-read Fight Club recently; it's a reasonable question. The worries of determinism versus free will still loom large in this sort of world view. We get a kind of "god of the gaps" type problem, with free will being reduced down to the spaces where you don't have an explanation.
