The Facebook Failure

In the move-fast-and-break-things era, a whole generation of users got used. Now it is time to ask: What about the people broken in the process?

At Facebook’s Compassion Research Day in February 2015, the company announced a new ‘suicide feature’ - a feature meant to direct potentially suicidal individuals towards treatment. A participant at the conference asked whether the company was worried that data from the suicide programme would be misused. Not an unimportant question, bearing in mind the 2014 scandal in which the news feeds of 700,000 users were manipulated. The aim of that experiment was to examine whether it was possible to affect a Facebook user’s mood. Although it was not intended as an experiment in mind control, it was perceived as a frightening example of what Big Data can also be used for. Sheryl Sandberg, Chief Operating Officer of Facebook, subsequently told The Wall Street Journal: “this was part of ongoing research that companies do to test different products, and that was what it was; it was poorly communicated ... and for that communication we apologise. It was not our intention to worry you.”

At the conference in Menlo Park, one of the researchers responded to the question about the risks of tracking people’s mental state by saying that she had been sceptical at first, but saw great potential in suicide prevention: “Facebook is an obvious place to start a social movement against suicide (...) With such a large audience comes the possibility of reaching a lot of people. The enormous problem that is suicide demands action so as to make a difference”. No one can disagree with an initiative that aims to prevent suicide. It makes sense, and is commendable, that a company like Facebook takes responsibility - or at least it made sense in 2015. It doesn’t really anymore… Users, apparently, get used.

The people and places broken in the process

In 2015, the most influential tech narrative was still driven by tech optimism and a belief in the neutrality of technology - the idea that tech can fix everything. We have since learned that this was only half of the story: it was also a billion-dollar marketing trick that let us sleepwalk into a new century. In Professor Ruha Benjamin’s words (Race After Technology, 2019): “In that light, I think the dominant ethos in this arena is best expressed by Facebook’s original motto: ‘Move Fast and Break Things.’ To which we should ask: What about the people and places broken in the process? (…) I think we should stop calling ourselves ‘users’. Users get used.”

Fast-forward to September 2021, when The Wall Street Journal launched a bombshell investigative series called The Facebook Files. As Georgia Wells and Jeff Horwitz state: “Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show. It has investigated how to engage young users in response to competition from Snapchat, TikTok; ‘Exploring playdates as a growth lever’”

Facebook isn’t the only technology company confronting legal or regulatory problems related to children’s use of its platforms - the same applies to ByteDance Ltd.’s TikTok and to YouTube. And competition from these rivals is, according to Wells and Horwitz, a motivating factor behind Facebook’s work.

The company’s approach to young users is expected to be addressed at a Senate subcommittee hearing on Thursday, which will probe the effects of Facebook’s Instagram platform on mental health.

The Facebook Files

The files reveal that Facebook knew “in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands”. The files cover a wide range of topics, from teen mental health and algorithm-driven outrage to human trafficking and vaccine misinformation. As the Center for Humane Technology states in its article “The Facebook Files: Tech's Big Tobacco Moment”, the most remarkable part of the investigation is its evidence: internal Facebook documents “including research reports, online employee discussions, and drafts of presentations to senior management” prove that Facebook knew of alarming, far-reaching harms and failed to take action - often against the direct recommendations of its own researchers.

SOME OF THE REVELATIONS:

  • 32% of teen girls said that when they felt bad about their bodies, Instagram (which is owned by Facebook) made them feel worse

  • 13% of British and 6% of American teens who reported suicidal thoughts traced the desire to kill themselves to Instagram

  • 5.8 million VIPs are protected or excluded from having their policy-violating content removed through a system called "XCheck"

  • Only 13% of moderator time taken to label or remove false or misleading information is spent on content from outside the US, yet 90% of users live outside the US and Canada

  • One political party's team shifted their content from 50% negative to 80% negative because 2018 algorithm changes rewarded outrage

    Source: Center for Humane Technology

Sources:

The Wall Street Journal, “The Facebook Files - A Wall Street Journal Investigation”, https://www.wsj.com/articles/the-facebook-files-11631713039

Meyer, Robinson, “Everything We Know About Facebook’s Secret Mood Manipulation Experiment”, The Atlantic, 28-06-2014.

Goldman, David, “Facebook treats you like a lab rat”, CNN/Money, 30-06-2014.

Krishna, R. Jai, “Sandberg: Facebook Study Was ‘Poorly Communicated’”, The Wall Street Journal, 02-07-2014.

Compassion Research Day, Facebook HQ, Menlo Park, California, 25-02-2015. Jennifer Stuber of the Forefront organisation, Seattle, spoke at the conference about addressing suicide on social media.

Mayer-Schönberger, Viktor & Cukier, Kenneth, Big Data: A Revolution That Will Transform How We Live, Work, and Think, Boston, Houghton Mifflin Harcourt, 2013.

Pedersen, Katrine, Phono Sapiens, 2016.
