Do you have any argument besides "he writes long-windedly with insular terminology" and "there are multiple organizations affiliated with him"? E.g.
>It's a typical path that a coercive group makes multiple 360° turns.
What is the evidence that Eliezer is running a coercive group? Whom are they coercing into what?
I fully get that LessWrong et al. are obnoxious, use their own terminology, have far-out ideas about the future of AI that are probably wrong, and hold weird moral systems (though any moral system is weird when you look at it hard enough, imho)... But you are making the charge that Eliezer is running a cult, and doing so on purpose, which is a more specific claim than just running an annoying community around some non-mainstream ideas.