What Facebook gets wrong about 'The Social Dilemma'

There's no doubt that the rise of social media has an effect on individuals and wider society, but to what extent is this a bad thing? Senior Lecturer in Future Media and Digital Communications, Mark Brill, investigates Facebook's response to 'The Social Dilemma'.

In response to the American docudrama The Social Dilemma (2020), Facebook wrote a briefing paper titled 'What 'The Social Dilemma' Gets Wrong'. The response appears to be aimed at clients using their advertising service rather than at the public. In dismissing the film, the briefing states that 'rather than offer a nuanced look at technology, it [The Social Dilemma] gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems'. It is certainly the case that social media, and technology more broadly, are often blamed for problems that are largely societal or economic. However, Facebook's response is not especially detailed either...

ADDICTION: 'Facebook builds its products to create value, not to be addictive'

It is true that Facebook creates products that build value, but it is value that benefits Facebook and their advertisers. Keeping users engaged through compelling content is key to retention, and retention is what attracts advertisers. The crux of the debate, however, is how Facebook creates value for their users.

The engagement, especially through the 'Like' button, fosters addiction in some people. There is plenty of research demonstrating that social media, and the 'Like' button in particular, trigger the reward centres of the brain, working in a similar way to recreational drugs. Given this research, you might say that Facebook is building addiction, regardless of their actual intent. That addiction may be an unintended consequence of building a compelling platform, but it is a consequence that is detrimental to users. The changes they have made to their algorithms appear designed to appease concerned advertisers or governments; any benefit to their users is incidental. Whilst Facebook are not entirely forthcoming about the objectives of their algorithms, if they were really trying to prevent addiction, surely they would remove the 'Like' button?

YOU ARE NOT THE PRODUCT: 'Facebook is funded by advertising so that it remains free for people'

This doesn't really address the point raised in The Social Dilemma. Firstly, there is no choice for the user in the value exchange that Facebook has created. Is there a non-ad, subscription version of Facebook? No! By signing up to the platform, users must also share an unspecified amount of data. That might include information such as ethnicity and political affiliations, as well as their Facebook connections. Whilst user data is technically anonymous*, that information is then packaged and sold to advertisers. Facebook does this by creating 'shadow profiles' for all of their users, including information not displayed on profiles, such as phone numbers or email addresses. Think of it as a hidden avatar of each user. Facebook then sells this to advertisers in a product called Custom Audience. Whilst brands who buy into this do not have access to specific personal data, it is still a means of productising the user.

Worryingly, Facebook also collects shadow profile data on non-users through the use of cookies on other websites. The irony is that you can only ask Facebook to remove this information if you join the site. In other words, you are still the product even if you are not on Facebook! The key issue is that users are not aware of how their profiles or behaviours are used in the value exchange, nor can they realistically opt out of it. Just because the platform is free, it does not give Facebook a licence to do whatever they want.

*Only technically anonymous, because it has been demonstrated that a very small number of data points, typically about three, is enough to re-identify a user.
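The re-identification risk described in the footnote can be illustrated with a toy sketch. The records below are invented for this example (they are not Facebook data): even with names stripped, a handful of everyday attributes is often enough to make a record unique in a dataset.

```python
from collections import Counter

# Toy "anonymised" dataset: no names, just three everyday attributes
# (birth year, postcode district, occupation). Records are invented.
records = [
    ("1984", "B31", "teacher"),
    ("1984", "B31", "nurse"),
    ("1991", "M4",  "teacher"),
    ("1991", "M4",  "teacher"),
    ("1975", "EH1", "engineer"),
]

def unique_fraction(rows):
    """Fraction of rows uniquely identified by their attribute combination."""
    counts = Counter(rows)
    unique = sum(1 for row in rows if counts[row] == 1)
    return unique / len(rows)

print(unique_fraction(records))  # 0.6 -> 3 of these 5 records are unique
```

Anyone who already knows those three attributes about a person (from a loyalty card, a data broker, or a public profile) can pick their record out of the 'anonymous' set, which is why technical anonymity offers weaker protection than it sounds.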

ALGORITHMS: 'Facebook's algorithm is not 'mad'. It keeps the platform relevant and useful'

It is true that algorithms and machine learning can be both useful and relevant. For example, many companies, including Netflix, use recommendation algorithms to suggest programmes based on past viewing. But they can also be 'mad'. The problem is that all algorithms are inherently biased in their design. That bias comes both from the data they are given in the first place and from the way that data is interpreted by the algorithm, so they reinforce existing behaviours and biases. There are many examples of this, from Microsoft's chatbot that had to be withdrawn after producing racist output, to the algorithm that 'predicted' UK school exam results in the summer of 2020, preferencing students from private schools and wealthier areas.

The biases in any algorithm are further amplified by the context in which it is used. Making a recommendation based on previous viewing or listening history has a limited impact on the user. It is entirely different when Facebook base those recommendations on a range of unidentified behaviours from each user and their connections. The issues with Facebook go further still. Unlike Netflix or Spotify, the content is user-generated, and where opinion or falsehoods are concerned it can propagate quickly, often unchallenged. That creates a much greater risk in the choice of information presented to the user. By preferencing content similar to previously liked posts, the algorithm can amplify users' 'confirmation bias'*. Ultimately it is humans who structure the algorithms and feed them the data, so the recommendations will always carry these biases, both current and historical.

*Confirmation bias is the idea that we tend to preference information or opinions that support our own.
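The feedback loop described above can be sketched in a few lines. This is a toy model, not Facebook's actual algorithm: the topic names and the 80% weighting towards the user's favourite topic are invented for illustration. The point is that once a feed favours whatever a user engaged with before, their diet of content collapses towards a single viewpoint.

```python
import random

random.seed(1)  # fixed seed so the toy run is repeatable

TOPICS = ["left", "right", "sport", "science"]

def recommend(history):
    """Toy engagement-driven feed: serve the user's most-seen topic
    80% of the time, otherwise pick a topic at random."""
    if not history:
        return random.choice(TOPICS)
    favourite = max(set(history), key=history.count)
    return favourite if random.random() < 0.8 else random.choice(TOPICS)

# Simulate 200 items appearing in one user's feed.
history = []
for _ in range(200):
    history.append(recommend(history))

dominant = max(set(history), key=history.count)
share = history.count(dominant) / len(history)
print(f"share of feed from dominant topic: {share:.0%}")
```

With four topics, a neutral feed would show each about 25% of the time; here the loop drives one topic to a large majority of the feed, which is the 'confirmation bias' amplification in miniature.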

DATA: 'Facebook has made improvements across the company to protect people's privacy'

Although there have been some changes to protect the privacy of users, the main driver has been to protect Facebook's advertising revenue and to ward off government regulation. They have largely resisted regulation, making changes only when threatened with sanctions such as breaking up the company, or with advertiser boycotts. Mark Zuckerberg has continually refused to testify to UK parliamentary committees on the subject, thereby avoiding oversight outside of the US. He has also been quoted many times saying that 'the age of privacy is over' or that 'privacy is a social norm of the past'. His past statements lead me to question Facebook's commitment to privacy.

POLARISATION: 'We take steps to reduce content that could drive polarisation'

It is true that Facebook is not responsible for populism or the polarisation it brings. However, it provides the platform, and above all the scale, on which such theories can be incubated and propagated. The fact that most content comes from friends and family is part of the problem: 46% of US users get their news via the platform, yet fake news and factual inaccuracies cannot be checked in this context, and falsehoods pass between these groups without the evidence or oversight of a traditional news channel. The algorithms amplify confirmation bias, which in turn drives greater polarisation.

ELECTIONS: 'Facebook has made investments to protect the integrity of elections'

Unlike Twitter, Facebook has continued to allow political advertising, including false advertising, making it difficult to realistically claim that they protect the integrity of elections. Facebook's Ad Library is a good attempt at political transparency, but it doesn't work well: it is difficult to navigate and hard to find specific ad content. Furthermore, Facebook have said that they will not fact-check political adverts.

"Between March and May this year alone, we removed more than 100,000 pieces of Facebook and Instagram content for violating our voter interference policies"

That sounds like a lot, but it's a drop in the ocean compared to the total number of posts! For context, around 350 million photos are uploaded to Facebook each day, let alone Instagram.

MISINFORMATION: 'We fight fake news, misinformation, and harmful content using a global network of fact-checking partners'

Yet the Trump campaign used Facebook for voter suppression in 2016. Through Cambridge Analytica, pro-Leave groups broke UK election spending rules to deliver an unknown quantity of adverts. These made false claims, such as the suggestion that 76 million Turkish citizens would be allowed to come to the UK. Even though Facebook has since developed a fact-checking service, its effectiveness has been questioned.

"We removed over 22 million pieces of hate speech in the second quarter of 2020"

Facebook only made a concerted effort after a number of major advertisers pulled their ads!

References:

Castillo, S. (2015) 'Facebook Addiction Activates Same Brain Areas As Drugs; How Social Media Sites Hook You In'. Available at: https://www.medicaldaily.com/facebook-addiction-activates-same-brain-areas-drugs-how-social-media-sites-hook-you-320252. Accessed February 2021.

Lieu, J. (2018) 'Facebook Allows Advertisers to Target Based on Your Shadow Profile'. Available at: https://mashable.com/article/facebook-advertisers-shadow-profile/?europe=true. Accessed February 2021.

Cyphers, B. (2019) 'A Guided Tour of the Data Facebook Uses to Target Ads'. Available at: https://www.eff.org/deeplinks/2019/01/guided-tour-data-facebook-uses-target-ads. Accessed February 2021.

Wagner, K. (2018) 'Congress Doesn't Know How Facebook Works and Other Things We Learned From Mark Zuckerberg's Testimony'. Available at: https://www.vox.com/2018/4/11/17226742/congress-senate-house-facebook-ceo-zuckerberg-testimony-hearing. Accessed February 2021.

Hern, A. & Sabbagh, D. (2018) 'Zuckerberg's Refusal to Testify Before UK MPs 'Absolutely Astonishing''. Available at: https://www.theguardian.com/technology/2018/mar/27/facebook-mark-zuckerberg-declines-to-appear-before-uk-fake-news-inquiry-mps. Accessed February 2021.

Pew Research Center (2018) 'Nearly Half of Americans Get News Through Facebook'. Available at: https://www.pewresearch.org/fact-tank/2019/05/16/facts-about-americans-and-facebook/ft_18-04-06_facebooknews/. Accessed February 2021.

Smith, R. (2020) 'The UK Election Showed Just How Unreliable Facebook's Security System for Elections Really Is'. Available at: https://www.buzzfeednews.com/article/rorysmith/the-uk-election-showed-just-how-unreliable-facebooks. Accessed February 2021.

Smith, C. (2013) 'Facebook Users Are Uploading 350 Million New Photos Each Day'. Available at: https://www.businessinsider.com/facebook-350-million-photos-each-day-2013-9?IR=T. Accessed February 2021.

Channel 4 News Investigations Team (2020) 'Revealed: Trump Campaign Strategy to Deter Millions of Black Americans from Voting in 2016'. Available at: https://www.channel4.com/news/revealed-trump-campaign-strategy-to-deter-millions-of-black-americans-from-voting-in-2016. Accessed February 2021.

Cadwalladr, C. (2019) 'Facebook's Role in Brexit - and the Threat to Democracy'. Available at: https://www.youtube.com/watch?v=OQSMr-3GGvQ. Accessed February 2021.

Hsu, T. & Friedman, G. (2020) 'CVS, Dunkin', Lego: The Brands Pulling Ads from Facebook Over Hate Speech'. Available at: https://www.nytimes.com/2020/06/26/business/media/Facebook-advertising-boycott.html. Accessed February 2021.

For more information on the issues raised in The Social Dilemma, please see:

Zuboff, S. (2019) 'The Age of Surveillance Capitalism'
Criado-Perez, C. (2020) 'Invisible Women'
Umoja Noble, S. (2018) 'Algorithms of Oppression'
