Facebook ‘reads and shares’ WhatsApp private messages: New report
ISLAMABAD, SEPT 9: Facebook’s encrypted messaging service WhatsApp isn’t as private as it claims, according to a new report.
The popular chat app, which touts
its privacy features, says parent Facebook can’t read messages sent between
users. But an extensive report by ProPublica on Tuesday claims that Facebook is
paying more than 1,000 contract workers around the world to read through and
moderate WhatsApp messages that are supposedly private or encrypted, reported
New York Post.
What’s more, the company
reportedly shares certain private data with law enforcement agencies, such as
the US Department of Justice.
The revelation comes despite
Facebook boss Mark Zuckerberg's repeated assertions that WhatsApp messages
are not seen by the company. "We don't see any of the content in WhatsApp,"
the CEO said during testimony before the US Senate in 2018.
Privacy is touted even when new
users sign up for the service, with the app emphasizing that “your messages and
calls are secured so only you and the person you’re communicating with can read
or listen to them, and nobody in between, not even WhatsApp.”
“Those assurances are not true,”
said the ProPublica report. “WhatsApp has more than 1,000 contract workers
filling floors of office buildings in Austin, Texas, Dublin and Singapore,
where they examine millions of pieces of users’ content.”
Facebook acknowledged that those
contractors spend their days sifting through content that WhatsApp users and
the service's own algorithms flag, content that often includes everything from
fraud and child pornography to potential terrorist plotting.
A WhatsApp spokeswoman told The
Post: “WhatsApp provides a way for people to report spam or abuse, which
includes sharing the most recent messages in a chat. This feature is important
for preventing the worst abuse on the internet. We strongly disagree with the
notion that accepting reports a user chooses to send us is incompatible with
end-to-end encryption.”
According to WhatsApp's FAQ
page, when a user reports abuse, WhatsApp moderators are sent “the most recent
messages sent to you by the reported user or group.” ProPublica explained that
because WhatsApp’s messages are encrypted, artificial intelligence systems
“can’t automatically scan all chats, images and videos, as they do on Facebook
and Instagram.”
Instead, the report revealed that
WhatsApp moderators gain access to private content when users hit the “report”
button on the app, identifying a message as allegedly violating the platform’s
terms of service.
This forwards five messages, including the allegedly offending one, along with the four previous ones in the exchange — plus any images or videos — to WhatsApp in unscrambled form, according to unnamed former WhatsApp engineers and moderators, who spoke to ProPublica.
Aside from the messages, the
workers see other unencrypted information such as the names and profile images
of a user's WhatsApp groups, as well as their phone number, profile photo,
status message, phone battery level, language and any related Facebook and
Instagram accounts.
Each reviewer handles upward of
600 complaints a day, which gives them less than a minute per case. Reviewers
can either do nothing, place the user on “watch” for further scrutiny or ban
the account.
ProPublica said WhatsApp shares
metadata, or unencrypted records that can reveal a lot about a user’s online
activity, with law enforcement agencies such as the Department of Justice.
The outlet claimed that WhatsApp user data
helped prosecutors build a high-profile case against a Treasury Department
employee who leaked confidential documents to BuzzFeed News that exposed how
dirty money allegedly flows through US banks.
Like other social media
platforms, WhatsApp is caught between users who expect privacy and law
enforcement agencies that demand that such platforms hand over information that
will help fight crime and online abuse.
WhatsApp head Will Cathcart said
in a recent interview that there’s no conflict of interest. “I think we
absolutely can have security and safety for people through end-to-end
encryption and work with law enforcement to solve crimes,” Cathcart said in a
YouTube interview with an Australian think tank in July.
But the privacy issue isn’t that
simple. Since Facebook bought WhatsApp in 2014 for $19 billion, Zuckerberg has
repeatedly assured users he would keep data private. Since then the company has
walked a tightrope when it comes to privacy and monetizing data it collects
from users of the free messaging app.
In 2016, WhatsApp disclosed it
would begin sharing user data with Facebook, a move that would allow it to
generate revenue. The plan included sharing information such as users’ phone
numbers, profile photos, status messages and IP addresses, so that Facebook
could offer better friend suggestions and serve up more relevant ads, among
other things.
Such actions put Facebook on the
radar of regulators, and in May 2017, European Union antitrust regulators fined
the company $122 million for falsely claiming three years earlier that it would
be impossible to link the user information between WhatsApp and the Facebook
family of apps. Facebook said its false statements in 2014 were not intentional
but it didn’t contest the fine.
Facebook continued to be the target of security and privacy issues over time. In July 2019, that culminated in an eye-popping $5 billion fine by the Federal Trade Commission for violating a previous agreement to protect user privacy.
The fine was almost 20 times greater than any previous privacy-related penalty, the FTC said at the time, and Facebook’s wrongdoing included “deceiving users about their ability to control the privacy of their personal information.”
Regardless, WhatsApp is still
trying to figure out a way to make money while guarding privacy. In 2019, the app
announced it would run ads inside the app, but those controversial plans were
abandoned days before the ads were set to launch.
Earlier this year, WhatsApp unveiled
a change in its privacy policy that included a one-month deadline to accept the
policy or get cut off from the app. The policy would allow users to directly
message businesses on its platform. It required users to agree to those
conversations being stored on Facebook servers, leading many users to think
that Facebook would have access to their private chats.
The concerns sparked massive
backlash, causing tens of millions of users to move to rival apps such as
Signal and Telegram. WhatsApp pressed forward with the change in February, but
assured users that messages would remain private.
“We’ve seen some of our competitors try to get away with claiming they can’t see people’s messages — if an app doesn’t offer end-to-end encryption by default that means they can read your messages,” WhatsApp said on its blog. “Other apps say they’re better because they know even less information than WhatsApp. We believe people are looking for apps to be both reliable and safe, even if that requires WhatsApp having some limited data.”
-----------------------------------------------------------
COURTESY 24newshd