> Sure, if our measuring stick is information, then there is no difference in kind, merely a difference in complexity. But the complexity difference between the two is worlds apart, thus substantiating the distinction I'm pointing to.
I agree 100% that computational complexity can be used to make a meaningful distinction. It's not clear that that's the case here, though. That is, I agree that the quantity of information is worlds apart, but if the information content is all of the same kind and requires no special changes to the computational model needed to process it, then I don't think the magnitude of information is relevant.
> For something to be intrinsically contentful, it has to intrinsically pick out the intended target of reference, not merely be the source of entropy from which another process picks out a target.
I don't think this distinction is meaningful, because of the descriptor "intrinsic". It suggests that an agent's thoughts are somehow divorced from the environment that bootstrapped them, as if the thoughts somehow originated themselves.
The referent of one of my thoughts is an abstract internal model I formed from my sensory memories of the thing. So if "intrinsically contentful" information is simply information that refers to an internal model generated from sensory data, then this would suggest that even dumb AI-driven non-player characters (NPCs) in video games have thoughts with intrinsic content, since they act on internal models built from sensing their game environment.
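To make the NPC point concrete, here's a minimal sketch (all names invented for illustration) of an agent whose behavior refers only to an internal model it builds from "sensory" readings of its game world, never to the world directly:

```python
# Toy sketch (hypothetical names): an NPC that acts only on an internal
# model built from sensed observations of its environment.
from dataclasses import dataclass, field


@dataclass
class NPC:
    # Internal model: last-known positions of things the NPC has sensed.
    world_model: dict = field(default_factory=dict)

    def sense(self, observations: dict) -> None:
        """Update the internal model from raw observations of the environment."""
        self.world_model.update(observations)

    def act(self) -> str:
        """Choose an action using only the internal model, not the world itself."""
        if "player" in self.world_model:
            return f"move_toward {self.world_model['player']}"
        return "patrol"


npc = NPC()
npc.sense({"player": (3, 4)})  # sensory input builds the model
print(npc.act())               # behavior refers to the model, not the raw world
```

If "refers to a sensory-built internal model" is the criterion, something this simple already satisfies it, which is why I'm skeptical the criterion captures "intrinsic" content.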
> But to deny the notion of intrinsic content because you can't currently write one down is short-sighted.
Maybe, but I'm not yet convinced that there's real substance to the distinction you're trying to make. I'm all for making meaningful distinctions, and perhaps "mental states" driven by internal sensory-built models is such a distinction, but I'm not sure "thought" or "information with intrinsic content" are good labels for it. "Thought" seems like overstepping given the NPC analogy, and per the above, "intrinsic" doesn't seem like a good fit either.