Schneier has also added the words “This story is wrong” to his original blog post. “The only source for that post was a Forbes essay by Kalev Leetaru, which links to a previous Forbes essay by him, which links to a video presentation from a Facebook developers conference.” But that Forbes contributor has also responded, saying that he’d first asked Facebook three times about when they’d deploy the backdoor in WhatsApp — and never received a response.
Asked again on July 25th about the company’s plans for “moderating end to end encrypted conversations such as WhatsApp by using on device algorithms,” a company spokesperson did not dispute the statement, instead pointing to Zuckerberg’s blog post calling for precisely such filtering in its end-to-end encrypted products including WhatsApp [apparently this blog post], but declined to comment when asked for more detail about precisely when such an integration might happen… [T]here are myriad unanswered questions, with the company declining to answer any of the questions posed to it regarding why it is investing in building a technology that appears to serve little purpose outside filtering end-to-end encrypted communications and which so precisely matches Zuckerberg’s call. Moreover, beyond its F8 presentation, given Zuckerberg’s call for filtering of its end-to-end encrypted products, how does the company plan on accomplishing this apparent contradiction with the very meaning of end-to-end encryption?
The company’s lack of transparency and unwillingness to answer even the most basic questions about how it plans to balance the protections of end-to-end encryption in its products including WhatsApp with the need to eliminate illegal content reminds us of the giant leap of faith we take when we use closed encryption products whose source we cannot review… Governments are increasingly demanding some kind of compromise regarding end-to-end encryption that would permit them to prevent such tools from being used to conduct illegal activity. What would happen if WhatsApp were to receive a lawful court order from a government instructing it to insert such content moderation within the WhatsApp client and provide real-time notification to the government of posts that match the filter, along with a copy of the offending content?
Asked about this scenario, Carl Woog, Director of Communications for WhatsApp, stated that he was not aware of any such cases to date and noted that “we’ve repeatedly defended end-to-end encryption before the courts, most notably in Brazil.” When it was noted that the Brazilian case involved the encryption itself, rather than a court order to install a real-time filter and bypass directly within the client before and after the encryption process at national scale, which would preserve the encryption, Woog initially said he would look into providing a response, but ultimately did not respond.
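To make the scenario above concrete: a client-side filter of this kind would run on the device itself, checking a message against a blocklist *before* it is encrypted, so the encryption pipeline is untouched while matches can still be reported out of band. The following is a minimal illustrative sketch only; every name, the hash-matching approach, and the toy cipher are assumptions for illustration, not WhatsApp's actual design.

```python
import hashlib

# Hypothetical on-device blocklist of SHA-256 hashes of disallowed content
# (illustrative only; a real system might use perceptual hashes or ML models).
BLOCKLIST = {hashlib.sha256(b"banned example").hexdigest()}

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Stand-in for a real end-to-end cipher (e.g. the Signal protocol).
    # XOR with a repeating key is NOT secure; it only marks the boundary
    # where plaintext becomes ciphertext.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes):
    """The filter runs on the device, before encryption: the server never
    sees plaintext, yet a match can still be reported separately."""
    matched = hashlib.sha256(plaintext).hexdigest() in BLOCKLIST
    ciphertext = encrypt(plaintext, key)  # encryption itself is unchanged
    report = {"match": True} if matched else None  # side channel to operator
    return ciphertext, report
```

The point of the sketch is that nothing about the encryption changes; the bypass sits entirely on the endpoint, which is why such an order would "preserve the encryption" in a narrow technical sense while defeating its purpose.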
Given Zuckerberg’s call for moderation of the company’s end-to-end encryption products and given that Facebook’s on-device content moderation appears to answer directly to this call, Woog was asked whether its on-device moderation might be applied in future to its other end-to-end encrypted products rather than WhatsApp. After initially saying he would look into providing a response, Woog ultimately did not respond.
Here are the exact words from Zuckerberg’s March blog post. It said Facebook is “working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages, and we will continue to invest in this work.”