I think it’s impossible because I don’t think you can leapfrog evolution, and there’s almost certainly hidden context essential to our intelligence and perception that you can only get through a very, very long history.
However, I do think it’s quite possible and likely that we’ll create mimics that convince most people they’re intelligent. But I think that’s a weird kind of reflection: anything we make is limited to mirroring our observable past and existing collective thought, and can never be truly introspective, because true introspection requires evolutionary context and hidden motivation you can’t transfer.
Evolution is a process that works without an intelligent steward; in a way it's a brute-force technique. Plus, nothing in evolution is optimizing for intelligence. It is merely a happy accident that humans ended up with the brains we did, and a different environment could yield a drastically different evolutionary history.
It doesn't seem very logical to think that because evolution took so long to get us to where we are now, we consequently won't be able to design an intelligent AGI system.
It’d take a while to justify this argument to the extent I think it’s justified, but I think we’re in an inescapable frame set by evolution, and our adaptation to our environment probably goes a lot deeper than what we can see. I don’t think the visible context is the full context, and I think true intelligence probably requires an implicit understanding of context that is invisible to us.