Opinions about what's necessary for AGI are a dime a dozen. You stated yours as though it were fact, and you claim it's incompatible with Eliezer's view. I don't find your position particularly clear or compelling. But even if your forecast about what's needed for AGI is essentially accurate, I don't think it has much to do with Eliezer's claims. It can simultaneously be true that AGI will make use of information gathering, verifying capability, and something like a "network of peers", AND that Eliezer's core claims are correct. Even granting your view as fact, I don't see how it constitutes a disagreement with Eliezer, except maybe in an incredibly vague "intelligence is hard, bro" sort of way.