There are a lot of ways they could've built trust without taking on the full burden of proving a negative: which of them, if any, are they doing?
Open sourcing their watch word and recording features specifically, so people can verify for themselves that the code does what it says and isn't doing anything sketchy?
Hardware lights wired so that any recording beyond the watch word is visible and verifiable by the end user, and the device can't record when the light is off?
Local streaming and auditable downloads of the last N hours of input as Amazon heard it after the watch word, so you can check for misrecordings and compare "intended usage" times against observed times, and confirm that you and Amazon see the same thing? (A sketch of what that audit could look like is below.)
If you really want to go all out, putting protections into their TOS, like explicit no-train permissions on passing utterances made without intent, or adding an SLA to their subscription that refunds subscription and legal costs and provides an explicit legal cause of action if they were recording when they said they weren't?
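To be concrete about the third one: here's a rough sketch of what that self-audit could look like. Everything here is hypothetical, the CSV exports ("recordings.csv" from Amazon's side, "my_usage_log.csv" as your own notes of when you actually said the watch word) don't exist as real features today; the point is just that the comparison would be mechanically checkable if the data were exposed.

    # Hypothetical self-audit: compare the recordings Amazon says it captured
    # against the times you know you actually used the watch word.
    # Both input files are assumed formats; no such export exists today.

    import csv
    from datetime import datetime, timedelta

    TOLERANCE = timedelta(seconds=30)  # how close an intended use must be to a recording

    def load_timestamps(path, column):
        """Read ISO-8601 timestamps from one column of a CSV file."""
        with open(path, newline="") as f:
            return [datetime.fromisoformat(row[column]) for row in csv.DictReader(f)]

    def unexplained_recordings(recorded, intended, tolerance=TOLERANCE):
        """Return recordings with no intended use within `tolerance`."""
        return [
            r for r in recorded
            if not any(abs(r - i) <= tolerance for i in intended)
        ]

    if __name__ == "__main__":
        recorded = load_timestamps("recordings.csv", "timestamp")     # what Amazon reports
        intended = load_timestamps("my_usage_log.csv", "timestamp")   # when you meant to use it

        for ts in unexplained_recordings(recorded, intended):
            print(f"Recording at {ts.isoformat()} has no matching intended use")

The script itself is trivial; what matters is that giving users the raw timestamps turns "trust us" into something anyone can diff.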
If you explicitly want to promote trust, there are actually a ton of ways to do it, and "remove even more of your existing privacy guardrails" isn't one of them.
On the first two, if you already think they're blatantly lying about functionality, why would you think the software in the device is the same as the source you got, or that it can't record with the light off?