An encrypted chat app purchased by Amazon Web Services in 2021 is full of child sexual abuse material (CSAM), a problem Amazon is reportedly aware of and has chosen to do almost nothing about.
More than 72 state and federal child sexual abuse or child pornography cases reviewed by NBC News have cited Wickr Me as the platform chosen by the defendant. Wickr has, according to these documents, cooperated only in rare circumstances when the law required it to do so.
While Amazon is obviously not the only tech company dealing with CSAM on its platforms, the company has done far less to cut down on this material than others facing the same issue. And the actual extent of the CSAM problem on Wickr goes far beyond these court cases, experts say.
Almost nothing proactive —
Wickr is nowhere near as popular as iMessage or WhatsApp, but that doesn't exempt it from protecting the users it does have. Yet Wickr itself reported only 15 cases of CSAM last year, while users themselves reported more than 3,500. This, NBC News points out, makes it clear that it is users, not Wickr, who are discovering CSAM on the app.
The presence of child sexual abuse material on Wickr is particularly concerning given that messages sent on the app are end-to-end encrypted. This means that, barring a search warrant, only the users sending and receiving the messages can see them; not even Wickr can read their contents. That degree of privacy makes the app all too easy to use for malicious purposes.
Where companies like Facebook have instituted automated scanning systems to detect CSAM before it spreads, Wickr has no such systems in place. Making matters worse, Wickr doesn't require users to verify any personal information when they sign up, which makes accounts difficult to trace back to the actual people behind them.
Amazon is ‘committed’ —
Amazon, which purchased Wickr last year, said in response to NBC News's story that it is "committed" to preventing CSAM in every part of its business, noting that Wickr's terms of use explicitly prohibit illegal activity. "We act quickly on reports of illegal behavior, respond immediately to requests from law enforcement, and take the appropriate actions," a company spokesperson said.
Because Wickr makes so little proactive effort to combat CSAM, it has become known as a reliable place to share such content. Communities on Reddit, Tumblr, and Twitter discuss Wickr in thinly veiled code words, swap handles, and generally reinforce the idea that the app is safe from prying eyes.
This is the most fundamental dilemma presented by end-to-end encryption: extreme privacy is incredibly easy to abuse. It's a tradeoff Big Tech grapples with every day, and one with no quick answer.