Wednesday, 3 June 2020

For years Facebook Inc. has known that its algorithms encourage and amplify antisocial behaviour like hate speech and extreme political bias

It seems that Facebook Inc. executives shut down efforts to make the site less divisive, because social and political division increased company profits by keeping certain categories of users engaged.

One has to wonder to what degree the company's years of fostering poisonous online comment have contributed to the chaos that is American society in 2020.

Business Insider, 29 May 2020:
  • For years, Facebook has known that its algorithms encourage and amplify antisocial behaviour like hate speech and extreme political bias to keep users engaged, according to company documents reported in The Wall Street Journal.
  • When given proposals to make the platform better, executives often balked. They didn’t want to offend bad actors, and they didn’t want to release their hold on people’s attention. At Facebook, attention equals money.
  • So Facebook’s algorithms have been allowed to continue being sociopaths – pushing divisive content and exploiting people’s visceral reactions without a thought for the consequences or any remorse for their actions. 
  • Meanwhile, by letting bad actors on the platform do their thing, Facebook is feeding an inherent political bias into the algorithms themselves, and the company at large.
Facebook has always claimed that its mission is to bring people together, but a new report from The Wall Street Journal laid bare what many have suspected for some time: Its algorithms encourage and amplify harmful, antisocial behaviour for money. 

In other words, Facebook’s algorithms are by nature sociopaths. And company executives have been OK with that for some time. 

Here’s what we learned from Jeff Horwitz and Deepa Seetharaman at The Journal:
  • A 2016 internal Facebook report showed “64% of all extremist group joins are due to our recommendation tools.” 
  • A 2018 internal report found that Facebook’s “algorithms exploit the human brain’s attraction to divisiveness” and warned that if left unchecked they would simply get nastier and nastier to attract more attention (a toy simulation of this feedback loop appears just after this list).
  • An internal review also found that algorithms were amplifying users who spent 20 hours on the platform and posted the most inflammatory content (users who may not be people at all, but rather Russian bots, for example).
  • Facebook executives, especially Mark Zuckerberg, time and time again ignored or watered down recommendations to fix these problems. Executives were afraid of looking biased against Republicans – who, according to internal reports, were posting the highest volume of antisocial content. 
  • And of course executives had to protect the company’s moneymaking, attention-seeking, antisocial algorithms – regardless of the damage they may be doing to society as a whole. Politics played into that as well.
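
That 2018 finding describes a feedback loop: the ranker promotes whatever draws reactions, divisive posts reliably draw the most reactions, so divisive posts get promoted further. Here is a minimal toy simulation of that dynamic in Python. To be clear, this is not Facebook's code; the posts, the engagement model, and the boost rule are all invented for illustration.

```python
# Toy model of an engagement feedback loop. Entirely hypothetical:
# the posts, weights, and boost rule are invented, not Facebook's.

posts = {
    "bake sale photos":        {"provocation": 0.1, "weight": 1.0},
    "cute dog video":          {"provocation": 0.2, "weight": 1.0},
    "outrage-bait conspiracy": {"provocation": 0.9, "weight": 1.0},
}

def run_round(posts):
    """One ranking round: a post's share of the feed is proportional to
    its weight, engagement tracks provocation, and the ranker boosts
    whatever earned engagement last round."""
    total = sum(p["weight"] for p in posts.values())
    for p in posts.values():
        impressions = p["weight"] / total            # share of the feed
        engagement = impressions * p["provocation"]  # clicks, comments, rage
        p["weight"] *= 1.0 + engagement              # reward what "works"

for _ in range(10):
    run_round(posts)

total = sum(p["weight"] for p in posts.values())
for name, p in posts.items():
    print(f"{name:25s} {p['weight'] / total:6.1%} of the feed")
```

All three posts start with equal weight, yet after ten rounds the conspiracy post holds roughly 97% of the feed in this toy model. No one tuned the system to favour extremism; rewarding whatever maximizes engagement was enough.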
People who suffer from antisocial personality disorder – known in popular culture as “sociopaths” – engage in harmful, deceptive behaviour without regard for social norms. Sometimes this is done with superficial charm; other times this is done with violence and intimidation. These people never feel remorse for their behaviour, nor do they consider its long-term consequences. 

This is how Facebook’s algorithms behave. It’s how they hold on to users’ attention and how, ultimately, the company makes money. 

This runs contrary to what the company has been telling us about itself. After the bad rap it developed in the wake of the 2016 election, executives and the company’s marketing machine were telling us that Facebook was both financially and culturally committed to encouraging pro-social behaviour on the platform by doing things like removing violence and hate speech, making sure conspiracy theories and lies didn’t go viral, and cracking down on opioid sales. 

Now we know that commitment was limited. Facebook would not kill the algorithms that laid the golden eggs, despite their bias against these goals, or even clip their wings for that matter...

Read the full article here.
