Free Speech and Censorship on Facebook

When scrolling through the comments on Facebook, users can see either a display of critical discussion or a plethora of spiteful comments intended to attack and demean people. In the 21st century, social media gives everyone with internet access and an account a platform to voice their opinions. This becomes controversial when people abuse their voices to inflict harm on individuals or groups in specific contexts. In his article “Facebook’s Free Speech Balancing Act,” Brett Johnson describes Facebook’s philosophical shift toward promoting the mere ability to communicate ideas, rather than the content of the ideas (Johnson, 2016). Facebook’s free speech platform is fiercely advocated by its CEO, but the trade-offs regarding offensive speech and censorship are problematic and stem from the self-motivated interest of Facebook as a privately owned entity.

The balance between free speech and hate speech is riddled with trade-offs that can both benefit and harm users. Facebook’s founder and CEO, Mark Zuckerberg, has been one of the championing voices in the fight for freedom of speech on social media networks. In his speech at Georgetown University, Zuckerberg defended his position by explaining how Facebook has minimized authority to create an equal playing field where people can express themselves freely, and to empower everyday and marginalized voices (Romm, 2019). At its core, Facebook exists because people want to share and obtain information: it allows users to connect with one another and converse. A user can share an opinion via Facebook posts, converse with like-minded people in groups, and engage in critical discussions with opposing views through the comments. Consequently, Facebook’s ideology is built on the number of voices, with the aim of giving everyone an equal voice. One of the most significant trade-offs of Facebook’s promotion of free speech is that people can abuse their voices in ways that perpetuate hatred or violence towards others in the community. In contrast to Zuckerberg’s perspective, Brett Johnson argues that “… the quantity of online voices does not translate to quality discussions” (Johnson, 2016). When everyone can voice their opinions, there will inevitably be confrontations of opposing ideals. While these can sometimes result in critical discussions, they can also venture into dangerous behaviour such as slander, hatred, and violence. Zuckerberg’s overly optimistic view of Facebook’s free speech fails to recognize the ramifications of users’ content. What people are saying is just as important as allowing them a say. By focusing on providing users with a platform of expression, Facebook underestimates the ramifications of the content and the quality of discourse it facilitates. Furthermore, the minimization of authority that comes with online discussion on Facebook allows a personal disassociation and a newfound courage for users to perpetuate offensive behaviour while simultaneously avoiding responsibility.

In response to criticism about Facebook’s responsibility to protect the users on its platform, community guidelines were put in place to restrict the ability to vocalize offensive behaviour. Zuckerberg defended the 2015 update of the community standards, which contrasts with his more recent advocacy of free speech. In a Facebook post, he said that “… threats of violence and bullying will be taken down” because they “are examples where one person exercising their voice may unfairly limit the voices of many others” (Zuckerberg, 2015). Considering this surface-level commitment to maintaining a safe community, Facebook alters its notion of free speech to be free only within a set of ambiguous parameters. This poses a critical question in the debate over free speech: when is it acceptable to limit the speech of others? Facebook takes the majority approach, where action is taken if the content harms more people than it helps. Johnson emphasizes the importance of having clear instructions for “… allowable and unallowable discourse…” because it “… is healthier for public discourse than a policy in which users have important speech removed arbitrarily and capriciously because it falls in a gray area of Facebook’s community standards” (Johnson, 2016). While the governance of user-generated content is meant to provide greater protection for the disenfranchised, the means of removal can be unclear to many users and often vary case by case depending on the review committee. The issue of quality arises again, as users become more forgiving of how Facebook governs content in exchange for the greater good that Facebook provides. The potential for discourse remains the primary appeal of Facebook as a global platform for communication.

Facebook’s positions on freedom of speech and censorship shift not because the company has the public’s best interest at heart but because it is looking out for its self-motivated interests. It is imperative to remember that Facebook is a privately owned entity, and thus has no state or legal authority over the freedom or censorship of user content. Consequently, it is nearly impossible to separate Facebook’s self-interest in the success of its platform from any concern for human decency. Facebook attempts to have the best of both worlds by providing enough free speech that users do not feel stifled, but enough moderation that users do not feel alienated or offended by the community. Furthermore, Johnson insists that appealing to both sides has little to do with concern for human decency, but rather with preventing a mass exodus of Facebook users that might lead to a loss of revenue (Johnson, 2016). In this sense, the corporate agenda of Facebook does not serve a democratic process, but rather the self-preservation of the company. The Pew Research Center conducted a survey asking people whether they believed people should be able to make statements that are offensive to minority groups, or whether the government should be able to prevent people from saying such things (Poushter, 2020). The results provide some surprising insight into Facebook’s stance, as 4-in-10 American millennials said there should be some preventative measures in place (Poushter, 2020). The American outcome illustrates how fiercely the right to freedom of expression is advocated and protected under the First Amendment; still, a substantial share of Americans favour stricter censorship, a number that is not reflected in Facebook’s current policies. Moreover, popular consensus across the globe diverges drastically from Facebook’s non-committal stance on removing offensive behaviour. The median share in favour was 21 percentage points higher across the European countries surveyed, with 7-in-10 Germans and 62% of Italians agreeing with stricter interventions against offensive speech (Poushter, 2020). This research highlights that popular demand across continents is at odds with Facebook’s interests. It could not be clearer that Facebook cares less about users’ rights than about its own preservation and profit.

The discussions that Facebook hosts raise critical questions regarding the right to freedom of expression, the censorship of user-generated content, and the ultimate intentions of the corporation. In his article “Facebook’s Free Speech Balancing Act,” Brett Johnson argues that the balance between giving everyone a voice and censoring content to create a safe space rests on the ability to communicate rather than on the content of the discussion (Johnson, 2016). Furthermore, Facebook’s self-motivated interests have created a divide, where offensive speech and censorship are trade-offs of Facebook’s pursuit of free speech. It is ultimately up to the users to set the norms of online social discourse and to maintain the respectful courtesy that is expected in person. If people want to engage in respectful conversation, they must act respectfully.

References:

Johnson, B. J. (2016). Facebook’s Free Speech Balancing Act: Corporate Social Responsibility and Norms of Online Discourse. U. Balt. J. Media L. & Ethics, 5, 19.

Poushter, J. (2020, July 31). 40% of Millennials OK with limiting speech offensive to minorities. Pew Research Center. Retrieved October 21, 2020, from https://www.pewresearch.org/fact-tank/2015/11/20/40-of-millennials-ok-with-limiting-speech-offensive-to-minorities/

Romm, T. (2019, October 17). Zuckerberg: Standing For Voice and Free Expression. The Washington Post. Retrieved October 21, 2020, from https://www.washingtonpost.com/technology/2019/10/17/zuckerberg-standing-voice-free-expression/

Zuckerberg, M. (2015, March 15). Today we released our latest Global Government Requests Report and updated our Community Standards. Retrieved October 20, 2020, from https://www.facebook.com/zuck/posts/10101974380267911?reply_comment_id=10101974442079041&total_comments=36&__fns&hash=Ac2U13_rwzPmNUEQ