Facebook Insists No Security 'Backdoor' Is Planned for WhatsApp (medium.com)
An anonymous reader shares a report: Billions of people use the messaging tool WhatsApp, which added end-to-end encryption for every form of communication available on its platform back in 2016. This ensures that conversations between users and their contacts -- whether they occur via text or voice calls -- are private, inaccessible even to the company itself. But several recent posts published to Forbes' blogging platform call WhatsApp's future security into question. The posts, which were written by contributor Kalev Leetaru, allege that Facebook, WhatsApp's parent company, plans to detect abuse by implementing a feature to scan messages directly on people's phones before they are encrypted. The posts gained significant attention: A blog post by technologist Bruce Schneier rehashing one of the Forbes posts has the headline "Facebook Plans on Backdooring WhatsApp." It is a claim Facebook unequivocally denies.
"We haven't added a backdoor to WhatsApp," Will Cathcart, WhatsApp's vice president of product management, wrote in a statement. "To be crystal clear, we have not done this, have zero plans to do so, and if we ever did, it would be quite obvious and detectable that we had done it. We understand the serious concerns this type of approach would raise, which is why we are opposed to it."
UPDATE: Later Friday technologist Bruce Schneier wrote that after reviewing responses from WhatsApp, he's concluded that reports of a pre-encryption backdoor are a false alarm. He also says he got an equally strong confirmation from WhatsApp's Privacy Policy Manager Nate Cardozo, who Facebook hired last December from EFF. "He basically leveraged his historical reputation to assure me that WhatsApp, and Facebook in general, would never do something like this."
"We haven't added a backdoor to WhatsApp," Will Cathcart, WhatsApp's vice president of product management, wrote in a statement. "To be crystal clear, we have not done this, have zero plans to do so, and if we ever did, it would be quite obvious and detectable that we had done it. We understand the serious concerns this type of approach would raise, which is why we are opposed to it."
UPDATE: Later Friday technologist Bruce Schneier wrote that after reviewing responses from WhatsApp, he's concluded that reports of a pre-encryption backdoor are a false alarm. He also says he got an equally strong confirmation from WhatsApp's Privacy Policy Manager Nate Cardozo, who Facebook hired last December from EFF. "He basically leveraged his historical reputation to assure me that WhatsApp, and Facebook in general, would never do something like this."
Comment removed (Score:5, Insightful)
Re:Yeah, sure (Score:4, Insightful)
I don't know about that. I'd put them at about the same level of trustworthiness.
None.
Re: (Score:2)
At some level, politicians fear the will of the people. What does Facebook fear?
Re: Yeah, sure (Score:2)
"What does Facebook fear?"
Vigorous enforcement of the antitrust laws.
Re: (Score:2)
Why the downvotes? Because an AC signed it? Does that disqualify a known fact?
Re: (Score:2)
Actually, denying it and then doing it would open them up to criminal prosecution in the EU. It would also destroy their reputation completely. Hence this sounds pretty credible. Not that I think they have any honor or concern for their customers, but it strongly suggests they know too many people would not accept such a change.
Now, Facebook has no credibility and no trustworthiness, but Bruce Schneier has a ton of it and he thinks these things are not planned after getting
Re: (Score:2)
In this case though, it looks like the usual crappy Forbes reporting getting the wrong end of the stick. Don't take my word for it though, take Bruce Schneier's. If there is one credible voice in all this, it's his.
The short story is that Facebook did a presentation about using AI to classify images in order to block banned content (violent images, pornography, etc.) from being uploaded to Facebook. Forbes somehow decided that they were going to apply it to WhatsApp too, even though the presentation doesn't mention WhatsApp.
"We haven't added a backdoor to WhatsApp," (Score:4, Funny)
{ INSERT DOUBLESPEAK LOREM IPSUM } - "We've just decrypted and re-crypted your data in realtime, it's not a backdoor. It's a front door key under your mat that we put there to get in whenever we want, is all."
Oddly-specific "denial" (Score:5, Insightful)
Here's the specific statement made by WhatsApp's Will Cathcart.
"We haven't added a backdoor to WhatsApp. The Forbes contributor referred to a technical talk about client side AI in general to conclude that we might do client side scanning of content on WhatsApp for anti-abuse purposes.
To be crystal clear, we have not done this, have zero plans to do so, and if we ever did it would be quite obvious and detectable that we had done it. We understand the serious concerns this type of approach would raise which is why we are opposed to it."
That sounds very definitive... but after you've read it closely, you might note it seems to have an extremely narrow scope.
- "Backdoor" is generally understood to be specifically about decryption of encrypted traffic, while the report was about grabbing messages before they're encrypted at all
- Scanning for purposes other than anti-abuse is not covered by this
- Copying unencrypted messages on-device and sending them to Facebook's central servers is not covered by this
- Any scanning which can't be construed as AI-driven might not be covered by this
Not to mention Facebook's repeated general history of doing shady things and/or playing semantic games, then getting caught and saying "oh gosh it was a mistake don't worry we won't do it again".
Re: (Score:2)
I'm not sure if you're autistic or a tin foil hatter, but man, are you overthinking this. You have a PR person specifically replying to a media article, written in news speak. He's not publishing a definitive mathematical proof.
You need to look at everything in the context of why it is written and for whom it is written. Backdoor has a specific meaning to *YOU*. It is also a word used in direct reply to the original Forbes article, which proceeded to discuss how the process worked and then concluded "outlined
Re: (Score:2)
How would they get away with it though?
Anyone can run Wireshark. Anyone can look inside the app's .apk and decompile back to Java code. Any backdoor that sends unencrypted or duplicate data is going to be found pretty damn quickly.
If there were to be a backdoor, it would likely be a flaw in the crypto that allows spooks to decrypt it. WhatsApp uses the Signal system so it should be possible to verify that their implementation matches the reference one, but errors could be subtle and difficult to detect. And
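For anyone who wants to try the "watch the wire" part, here's a rough sketch of the most basic check, in Python with scapy: capture your own traffic while sending a known test message, then grep the payloads for it. The capture filename and the canary string are made up for illustration, and a clean result only rules out the crudest kind of leak (plaintext on the wire), not a duplicated upload inside TLS or a subtle flaw in the crypto itself.

# Rough sketch, not a real audit: scan a packet capture for a known plaintext canary.
# "whatsapp_session.pcap" and the canary string are hypothetical placeholders.
from scapy.all import rdpcap, Raw

CANARY = b"my one-off canary message 1234"

packets = rdpcap("whatsapp_session.pcap")
hits = [pkt.summary() for pkt in packets
        if pkt.haslayer(Raw) and CANARY in bytes(pkt[Raw].load)]

if hits:
    print("Canary found in cleartext on the wire:")
    for h in hits:
        print("  ", h)
else:
    print("No cleartext canary seen (necessary, but not sufficient, evidence).")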
Backdoor? (Score:1)
Why would they need a backdoor? All the traffic goes through their servers anyway...
Re: (Score:2)
Some very basic understanding of encryption is required to participate in this discussion. You seem to lack that.
Re: (Score:2)
Or that there's no reason to plan because it was planned and done a while ago.
Don't care (Score:2)
Wouldn't touch anything Facebook has anything to do with, regardless of this story or the veracity of the claims for and against backdoors in their subsidiary apps.
This is a shady company. Period. Couldn't pay me to deal with them or use anything they have their fingers in. They cannot be trusted. Period.
Their users are their product. Let me say that again, YOU are what Facebook sells. Your interests are irrelevant to them. You are just another data point to leech off. All the other shenanigans
Facebook lies (Score:2)
Again.
No backdoors because they call them something else (Score:5, Funny)
Re: (Score:2)
I don't think you can even depend on that.
Indeed (Score:2)
I bet they find this 'Inconceivable'!
Why plan them! (Score:3)
Score 0 (Score:2)
The downvote pattern in this topic is atypical. Best to browse unfiltered.
Slippery eels. (Score:4, Informative)
Of course no back door is planned. That's because all the back doors they want are already there.
Inaccessible to the company? (Score:2)
Of course it is accessible to the company - they decrypt the contents of the messages and display them on the user's screen.
FB could easily decrypt the contents and send it to their servers at either end of the conversation. Or they could send the key that they use to decrypt and display the messages to the server, and decrypt them there.
The end-to-end encryption is only useful for stopping anyone else from eavesdropping.
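To make that point concrete, here's a toy sketch with PyNaCl (not WhatsApp's actual code, and the key exchange is simplified): the ciphertext is all the server ever needs to see, but the plaintext necessarily exists inside the app on both ends, and that is exactly where a client-side scanner or exfiltration hook would sit.

# Toy illustration only (pip install pynacl). Key distribution is hand-waved.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Each side builds a shared box from its own secret key and the peer's public key.
alice_box = Box(alice_sk, bob_sk.public_key)
bob_box = Box(bob_sk, alice_sk.public_key)

plaintext = b"meet at noon"               # exists in the sender's app before encryption
ciphertext = alice_box.encrypt(plaintext)  # only this needs to cross the server

received = bob_box.decrypt(ciphertext)     # exists in the recipient's app after decryption
assert received == plaintext
# Anything the client does with `plaintext` or `received` (display, scan, upload)
# is outside the scope of the end-to-end encryption guarantee.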
And PRISM (Score:2)
When governments have all the keys, they got those keys from someone.
Show me... (Score:2)
...the source. If it's securely encrypted, publish the source code so experts can vet it. Until then, it's Signal or PGP for vetted end-to-end encryption.
We all know how much credibility FB has for protecting users privacy.
I believe them (Score:2)
You needn't plan what's already existing.
no plans (Score:2)
There are no plans to add any backdoor, because the backdoor(s?) are already built in, waiting to be (ab)used.