The Wall Street Journal has published a series of investigations into Facebook’s deep-seated problems with its platforms. Over the last few weeks, “The Facebook Files” detailed instances in which the social media conglomerate internally researched and acknowledged its faulty algorithms, its preferential treatment of high-profile users, and its negative effects on the public.
The Journal has accused Facebook and Instagram of being “riddled” with flaws that “cause harm, often in ways only the company fully understands,” and has reported that chief executive Mark Zuckerberg is aware of these deleterious issues. The outlet has documented both the presence and severity of the faults and the social media company’s apparent lack of interest in fixing them.
Strengthening the Divide
According to the reports, Facebook changed its algorithms three years ago to improve user connectivity. Internal documentation appears to show that staffers warned of adverse effects taking hold, but Zuckerberg rejected the fixes his team proposed, fearing they would discourage people from interacting with the social media platform.
The company’s intentions were far from achieved; the changes instead created conflict, as divisive content made the user base and platform “angrier.” An article such as Buzzfeed’s “21 Things That Almost All White People Are Guilty of Saying” is one example of a post that reached tens of thousands of people, triggering online arguments and aggression.
The Journal also unearthed an internal platform list known as “cross check” or “XCheck” that exempts some high-profile users from the rules applied to typical users. Those who make it onto the list are shielded from sanctions that come when posted material breaks Facebook’s terms of service. One example of a violation is a post inciting violence.
Zuckerberg has boasted that Facebook allows all users to speak on an equal basis and that no one is exempt from its standards. According to internal documentation, the cross check program was designed to maintain quality control for high-profile accounts, but in reality many “abuse the privilege, posting material including harassment and incitement to violence that would typically lead to sanctions.”
Mental Health Effects
Several studies have found that time spent on social media platforms, including Instagram, can interrupt sleep patterns and expose teenagers to bullying and unrealistic displays of other people’s lives. Instagram’s internal research has shown this to be most prevalent among girls. Researchers from Instagram have found the app to be “harmful” and “toxic” for young users. According to WSJ, 32% of teen girls “said that when they felt bad about their bodies, Instagram made them feel worse.” Many teens also attributed their increased anxiety and depression to Instagram.
In 2018, Instagram assembled a “well-being” team that made a few adjustments to the software, including flagging posts that reference self-harm and mental illness. It also created a “help center” page with useful information regarding suicide prevention, abuse, mental health, and eating disorders.
These additions are a step in the right direction, but the bottom line is that people scrolling through the Instagram app are still consuming unrealistic, unachievable displays and experiencing cyberbullying. Instagram has done nothing concrete to effectively combat or prevent the app’s negative effects on sleep and mental health. More must be done on the front end to prevent troubling posts.
Lack of Problem Solving
Every social media app, like any sharing platform, has issues. Identifying problems is proactive and commendable; refusing to solve destructive developments is not. The purpose of The Journal’s exposé is to hold Facebook accountable for its lack of care for the well-being of its users. Facebook wields tremendous power, enough to affect the health and safety of its users and to foster political divisions more toxic than ever. Will Facebook accept responsibility?
The second article in this two-part series covers Facebook’s response to The Wall Street Journal’s exposé, a response that alleges “deliberate mischaracterizations.” Will this report force Facebook to make changes and become more transparent?
Read more from Keelin Ferris.