Wednesday 28 March 2018

Turns out that Facebook Inc is the biggest baddie of all on the Internet


“The FTC is firmly and fully committed to using all of its tools to protect the privacy of consumers. Foremost among these tools is enforcement action against companies that fail to honor their privacy promises, including to comply with Privacy Shield, or that engage in unfair acts that cause substantial injury to consumers in violation of the FTC Act. Companies who have settled previous FTC actions must also comply with FTC order provisions imposing privacy and data security requirements. Accordingly, the FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook. Today, the FTC is confirming that it has an open non-public investigation into these practices.”  [US Federal Trade Commission (FTC), Statement, 26 March 2018]

It may have been the Cambridge Analytica-Facebook situation, as first set out by Carole Cadwalladr at The Guardian & The Observer (UK), that recently alerted the average Internet user to the issue of digital privacy on social media, and it was certainly the situation which caught the eye of the US Federal Trade Commission, which is now investigating.

The story of that data harvest so far.....

The Guardian UK, 25 March 2018:

The story of how those data made the journey from Facebook’s servers to Cambridge Analytica’s is now widely known. But it is also widely misunderstood. (Many people were puzzled, for example, by Facebook’s vehement insistence that the exfiltration of a huge trove of users’ data was not a “breach”.) The shorthand version of what happened – that “a slug of Facebook data on 50 million Americans was sucked down by a UK academic named Aleksandr Kogan, and wrongly sold to Cambridge Analytica” – misses an important point, which is that in acquiring the data in the first place Kogan was acting with Facebook’s full knowledge and approval.

In 2013, he wrote an app called “Thisisyourdigitallife”, which offered users an online personality test and described itself as “a research app used by psychologists”.
Approximately 270,000 people downloaded it and in doing so gave their consent for Kogan to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it. This drew more than 50 million unsuspecting Facebook users into Kogan’s net.

The key point is that all of this was allowed by the terms and conditions under which he was operating. Thousands of other Facebook apps were also operating under similar T&Cs – and had been since 2007, when the company turned its social networking service into an application platform.

So Kogan was only a bit player in the data-hoovering game: apps such as the insanely popular Candy Crush, for example, were also able to collect players’ public profiles, friends lists and email addresses. And Facebook seemed blissfully indifferent to this open door because it was central to its commercial strategy: the more apps there were on its platform the more powerful the network effects would be and the more personal data there would be to monetise.

That’s why the bigger story behind the current controversy is the fact that what Cambridge Analytica claimed to have accomplished would not have been possible without Facebook. Which means that, in the end, Facebook poses the problem that democracies will have to solve. [my yellow highlighting]

However, this is not the only way Facebook collects personal information to enrich Zuckerberg and his shareholders.

Now we find out that Facebook Inc is scraping information from Android devices such as mobile phones and adding phone logs to its Big Brother database.

Global News, 25 March 2018:

In the same week Facebook found itself in the middle of a massive data scandal, recent reports indicate that the social media giant has also scraped records of phone calls and SMS data from its users with Android devices without explicit permission.

New Zealand-based software developer Dylan McKay tweeted earlier this week that, upon downloading his Facebook data in a zip file (which is an option for all users), he discovered records of phone calls and historical data for every contact on his phone, including contacts he no longer had, from a period between 2016 and 2017.
After he made the discovery, McKay set up a Google poll to gather evidence from other users who’ve been affected.

So far, just under 900 people have responded to the poll, and more than 20 per cent confirmed they found call records and/or text metadata in their Facebook data archive. Another 74 people responded to the poll saying that MMS data was collected, 106 people responded saying that SMS data was collected, and 104 responded saying that cellular calls were collected.

The story was first published by the tech news website Ars Technica on Saturday, which interviewed several Facebook users and had a member of its staff download their Facebook data archive. Following this, the site could confirm that the data file downloaded by the staff member contained call logs from a device that individual used between 2015 and 2016, as well as SMS and MMS message data.

Several Global News staff members also requested their data archives in the preparation of this story, and some found that the contact lists from their mobile devices were recorded in the file. No one noted any text message or call logs in the data files they downloaded.
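
For readers who want to check their own download, below is a minimal sketch, in Python, of how one might scan an extracted Facebook data archive for call or message records. The folder name, keywords and file types are my own assumptions rather than anything documented by Facebook or the articles quoted above (the export layout has changed over the years), so treat any hits simply as files worth opening and reading yourself.

```python
# Minimal sketch: scan an extracted Facebook data export for possible call/SMS records.
# Assumptions (not taken from Facebook or the articles above): the export has already
# been unzipped into a local folder named "facebook-export", and any call or message
# metadata shows up in HTML/JSON/text files whose names or contents mention one of the
# keywords below. Export layouts vary, so hits are only leads to inspect by hand.
from pathlib import Path

KEYWORDS = ("call_history", "call log", "sms", "mms")
TEXT_SUFFIXES = {".html", ".json", ".txt"}

def find_possible_call_sms_files(export_dir: str) -> list:
    """Return files in the extracted export that mention call/SMS keywords."""
    hits = []
    for path in Path(export_dir).rglob("*"):
        if not path.is_file() or path.suffix.lower() not in TEXT_SUFFIXES:
            continue
        try:
            text = path.read_text(errors="ignore").lower()
        except OSError:
            continue
        if any(k in path.name.lower() or k in text for k in KEYWORDS):
            hits.append(path)
    return hits

if __name__ == "__main__":
    for hit in find_possible_call_sms_files("facebook-export"):
        print(hit)
```

Nothing in this script sends data anywhere; it only reads files already sitting on your own machine.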

Ars Technica reached out to Facebook for comment before the publication of its story; Facebook said that the practice was a common one among social networking and messaging apps.
“The most important part of apps and services that help you make connections is to make it easy to find the people you want to connect with. So, the first time you sign in on your phone to a messaging or social app, it’s a widely used practice to begin by uploading your phone contacts.”

Following McKay’s tweets, other users came out on social media expressing similar concerns about what they discovered after downloading their data archives.

In recent years, the company has updated this process to clarify that when requesting access to your contact list, it intends to access all call logs and SMS text messages as well, but Android users in the past may have unknowingly given Facebook access to this data. [my yellow highlighting]

It is also wise to remember that even Internet users who do not have a Facebook account have information about their PC or other digital device collected each time they click on a link to Facebook.



Facebook image via ZDNet, 3 January 2014

ZDNet on 3 January 2014: By "content" Facebook means “anything you or other users post on Facebook”. By "information" Facebook means “facts and other information about you, including actions taken by users and non-users who interact with Facebook”. [my yellow highlighting]

Nor should we ignore this report about Facebook's surreptitious activities.......

Law360 (March 2, 2018, 7:02 PM EST) -- A California federal judge held Friday that Facebook can’t shake a proposed class action over its allegedly unlawful collection and storage of non-users’ facial scans, declining to toss the matter for lack of standing, just as he recently did in a related suit involving users of the site.

U.S. District Judge James Donato rejected Facebook Inc.’s renewed motion to dismiss litigation led by Frederick William Gullen for lack of subject-matter jurisdiction, pointing to his Feb. 26 decision in a related proposed class action accusing the social media... 
[my yellow highlighting]

Then there is the lobbying to discourage federal regulation of Facebook.......

According to SOCIAL MEDIA CASEROUNDUP (selected cases) in April 2015, by 2013 Facebook Inc had spent more than US$1 million on lobbying efforts to water down the US Children's Online Privacy Protection Act (COPPA). It was particularly concerned about any change of status of third-party "add-ons"/"plug-ins" which might by default make platforms like Facebook legally liable for any harm to a minor which occurred, as well as being resistant to any increase in general protections for minors or any expanded definition of protected "personal information" being included in the Act.

Quartz, 22 March 2018:

Facebook CEO Mark Zuckerberg said yesterday that the company welcomes more regulation, particularly to bring transparency to political advertising online. But in recent months, Facebook has been quietly fighting lawmakers to keep them from passing an act that does exactly that, campaign transparency advocates and Congressional staff tell Quartz.

The Honest Ads Act was introduced last October to close a loophole that has existed since politicians started advertising on the internet, and was expected by many to sail through Congress. Coming as Congress investigated how Russia used tech companies to influence the 2016 election, it was considered by many in Washington DC to be the bare minimum lawmakers could do to address the problem.

The act introduces disclosure and disclaimer rules to online political advertising. Tech companies would have to keep copies of election ads, and make them available to the public. The ads would also have to contain disclaimers similar to those included in TV or print political ads, informing voters who paid for the ad, how much, and whom they targeted.

“The benefit of having disclaimers on all political ads [is] the more suspicious ads would be more identifiable,” said Brendan Fischer, the director of federal and Federal Election Commission reform at the Campaign Legal Center (CLC) in Washington.

In a vote of confidence from bitterly-divided Washington, the act was rolled out by a bipartisan group of senators—John McCain, the Republican from Arizona, and Democrats Amy Klobuchar from Minnesota and Mark Warner of Virginia—and it currently has the support of 18 senators. But it hasn’t moved from the committee on “Rules and Administration” since it was first introduced, thanks in part to Facebook’s lobbying efforts.

Fischer, who is a co-author of a CLC report on US vulnerabilities online after the 2016 election, accuses Facebook of “working behind the scenes using the levers of power to stop any legislation from moving forward.”

Facebook’s lobbying clout

Lobbyists for the company have been trying to dissuade senators from moving the Honest Ads Act forward, some Congressional aides say.

Facebook’s argument to Congress behind the scenes has been that they are “voluntarily complying” with most of what the Honest Ads Act asks, so why pass a law, said one Congressional staffer working on the bill. Facebook also doesn’t want to be responsible for maintaining the publicly accessible repository of political advertising, including funding information, that the act demands, the staffer said.

Facebook spent nearly $3.1 million lobbying Congress and other US federal government agencies in the last quarter of 2017, on issues including the Honest Ads Act, according to its latest federal disclosure form. It also signed on Blue Mountain Strategies, a lobbying firm founded by Warner’s former chief of staff, an Oct. 30, 2017 filing shows.

It’s part of a massive uptick in lobbying spending in recent years. [my yellow highlighting]

Despite all its lobbying, Facebook Inc is not immune from official censure for its deceptive business practices.

Take this analysis of a 2011 binding agreement between the US Federal Trade Commission and Facebook Inc.....


FEDERAL TRADE COMMISSION [File No. 092 3184], 2 December 2011:

The Federal Trade Commission has accepted, subject to final approval, a consent agreement from Facebook, Inc. (‘‘Facebook’’)……

The Commission’s complaint alleges eight violations of Section 5(a) of the FTC Act, which prohibits deceptive and unfair acts or practices in or affecting commerce, by Facebook:

* Facebook’s Deceptive Privacy Settings: Facebook communicated to users that they could restrict certain information they provided on the site to a limited audience, such as ‘‘Friends Only.’’ In fact, selecting these categories did not prevent users’ information from being shared with Apps that their Friends used.

* Facebook’s Deceptive and Unfair December 2009 Privacy Changes: In December 2009, Facebook changed its site so that certain information that users may have designated as private— such as a user’s Friend List —was made public, without adequate disclosure to users. This conduct was also unfair to users.

* Facebook’s Deception Regarding App Access: Facebook represented to users that whenever they authorized an App, the App would only access the information of the user that it needed to operate. In fact, the App could access nearly all of the user’s information, even if unrelated to the App’s operations. For example, an App that provided horoscopes for users could access the user’s photos or employment information, even though there is no need for a horoscope App to access such information. 

* Facebook’s Deception Regarding Sharing with Advertisers: Facebook promised users that it would not share their personal information with advertisers; in fact, Facebook did share this information with advertisers when a user clicked on a Facebook ad.

* Facebook’s Deception Regarding Its Verified Apps Program: Facebook had a ‘‘Verified Apps’’ program through which it represented that it had certified the security of certain Apps when, in fact, it had not. 

* Facebook’s Deception Regarding Photo and Video Deletion: Facebook stated to users that, when they deactivate or delete their accounts, their photos and videos would be inaccessible. In fact, Facebook continued to allow access to this content even after a user deactivated or deleted his or her account.

* Safe Harbor: Facebook deceptively stated that it complied with the U.S.-EU Safe Harbor Framework, a mechanism by which U.S. companies may transfer data from the European Union to the United States consistent with European law.

The proposed order contains provisions designed to prevent Facebook from engaging in practices in the future that are the same or similar to those alleged in the complaint.

Part I of the proposed order prohibits Facebook from misrepresenting the privacy or security of ‘‘covered information,’’ as well as the company’s compliance with any privacy, security, or other compliance program, including but not limited to the U.S.-EU Safe Harbor Framework. ‘‘Covered information’’ is defined broadly as ‘‘information from or about an individual consumer, including but not limited to: 
(a) A first or last name; 
(b) a home or other physical address, including street name and name of city or town; 
(c) an email address or other online contact information, such as an instant messaging user identifier or a screen name; 
(d) a mobile or other telephone number; 
(e) photos and videos; 
(f) Internet Protocol (‘‘IP’’) address, User ID, or other persistent identifier; 
(g) physical location; or 
(h) any information combined with any of (a) through (g) above.’’

Part II of the proposed order requires Facebook to give its users a clear and prominent notice and obtain their affirmative express consent before sharing their previously-collected information with third parties in any way that materially exceeds the restrictions imposed by their privacy settings. A ‘‘material . . . practice is one which is likely to affect a consumer’s choice of or conduct regarding a product.’’ FTC Policy Statement on Deception, Appended to Cliffdale Associates, Inc., 103 F.T.C. 110, 174 (1984).

Part III of the proposed order requires Facebook to implement procedures reasonably designed to ensure that a user’s covered information cannot be accessed from Facebook’s servers after a reasonable period of time, not to exceed thirty (30) days, following a user’s deletion of his or her account.

Part IV of the proposed order requires Facebook to establish and maintain a comprehensive privacy program that is reasonably designed to: 
(1) Address privacy risks related to the development and management of new and existing products and services, and 
(2) protect the privacy and confidentiality of covered information. The privacy program must be documented in writing and must contain controls and procedures appropriate to Facebook’s size and complexity, the nature and scope of its activities, and the sensitivity of covered information. Specifically, the order requires Facebook to:
* Designate an employee or employees to coordinate and be responsible for the privacy program;
* Identify reasonably-foreseeable, material risks, both internal and external, that could result in the unauthorized collection, use, or disclosure of covered information and assess the sufficiency of any safeguards in place to control these risks;
* Design and implement reasonable controls and procedures to address the risks identified through the privacy risk assessment and regularly test or monitor the effectiveness of these controls and procedures;
* Develop and use reasonable steps to select and retain service providers capable of appropriately protecting the privacy of covered information they receive from respondent, and require service providers by contract to implement and maintain appropriate privacy protections; and
* Evaluate and adjust its privacy program in light of the results of the testing and monitoring, any material changes to its operations or business arrangements, or any other circumstances that it knows or has reason to know may have a material impact on the effectiveness of its privacy program.

Part V of the proposed order requires that Facebook obtain within 180 days, and every other year thereafter for twenty (20) years, an assessment and report from a qualified, objective, independent third-party professional, certifying, among other things, that it has in place a privacy program that provides protections that meet or exceed the protections required by Part IV of the proposed order; and its privacy controls are operating with sufficient effectiveness to provide reasonable assurance that the privacy of covered information is protected.

Parts VI through X of the proposed order are reporting and compliance provisions.

Part VI requires that Facebook retain all ‘‘widely disseminated statements’’ that describe the extent to which respondent maintains and protects the privacy, security, and confidentiality of any covered information, along with all materials relied upon in making such statements, for a period of three (3) years. Part VI further requires Facebook to retain, for a period of six (6) months from the date received, all consumer complaints directed at Facebook, or forwarded to Facebook by a third party, that relate to the conduct prohibited by the proposed order, and any responses to such complaints. Part VI also requires Facebook to retain for a period of five (5) years from the date received, documents, prepared by or on behalf of Facebook, that contradict, qualify, or call into question its compliance with the proposed order. Part VI additionally requires Facebook to retain for a period of three (3) years, each materially different document relating to its attempt to obtain the affirmative express consent of users referred to in Part II, along with documents and information sufficient to show each user’s consent and documents sufficient to demonstrate, on an aggregate basis, the number of users for whom each such privacy setting was in effect at any time Facebook has attempted to obtain such consent. Finally, Part VI requires that Facebook retain all materials relied upon to prepare the third-party assessments for a period of three (3) years after the date that each assessment is prepared.

Part VII requires dissemination of the order now and in the future to principals, officers, directors, and managers, and to all current and future employees, agents, and representatives having supervisory responsibilities relating to the subject matter of the order. Part VIII ensures notification to the FTC of changes in corporate status. Part IX mandates that Facebook submit an initial compliance report to the FTC and make available to the FTC subsequent reports. Part X is a provision ‘‘sunsetting’’ the order after twenty (20) years, with certain exceptions.

The purpose of the analysis is to aid public comment on the proposed order. It is not intended to constitute an official interpretation of the complaint or proposed order, or to modify the proposed order’s terms in any way. 

By direction of the Commission. 
Donald S. Clark, Secretary. [FR Doc. 2011–31158 Filed 12–2–11; 8:45 am] [my yellow highlighting]
