I didn't read that as a technical argument, but as a sociological one. If there are two apps, one that is easy/legal to obtain but has a backdoor, and another that may be harder/illegal to obtain but has no backdoor, which is the bad guy going to choose? Unless you can somehow force every app to use the backdoor, you can't make the bad guy go through it. Therefore, "You cannot create a back door that only the good guys can go through"
The proposal is to scan every message and report only if there is a match. What bad guy can possibly take advantage of that without coercing Signal? And if Signal can be coerced, they can backdoor it all to begin with. Call it a feature instead of a backdoor if it sounds better lol.
Control of the list used to scan your messages has the same security properties as Signal's code itself, and a threat actor who wants to exploit the system must control both that list and the match-reporting system (Signal's servers). An actor who controls both does not need to abuse this system; it would be easier to just push a RAT alongside Signal.
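To make the threat model concrete, here is a minimal sketch of the mechanism being debated. Everything here is hypothetical: real client-side-scanning proposals match perceptual hashes of known images rather than exact digests, and the names `SCAN_LIST`, `scan_message`, and `maybe_report` are invented for illustration. The point is that whoever controls the list and the reporting channel controls what gets flagged.

```python
import hashlib

# Hypothetical scan list shipped with (or pushed to) the client.
# Real proposals use perceptual hashes of known media, not exact
# SHA-256 digests; this is only an illustration of the structure.
SCAN_LIST = {
    hashlib.sha256(b"known-bad-content").hexdigest(),
}

def scan_message(payload: bytes) -> bool:
    """Return True if the payload matches an entry on the scan list."""
    return hashlib.sha256(payload).hexdigest() in SCAN_LIST

def maybe_report(payload: bytes, report) -> None:
    # Reports go to the operator's servers. An actor controlling
    # both SCAN_LIST and this channel decides what is reported --
    # the same actor who could ship a backdoored client outright.
    if scan_message(payload):
        report(payload)
```

Note that both the list contents and the reporting endpoint are distributed through the same update channel as the app itself, which is the point being made above: compromising them requires the same access as compromising the client.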
> "You cannot create a back door that only the good guys can go through"
Perhaps for crypto protocols. Software systems do this all the time in the form of software updates, usage monitoring, and even unattended remote-support accounts. And just because a vuln might be found in a system in the future does not mean the system is vulnerable at the time of design.
Not only are clean backdoors possible, software engineers design them cleanly all the time and make them sound nice and fluffy when they are the ones accessing the backdoor.