
Essay: Exploring Negative Impacts of Facebook: IS Propaganda, Radicalisation and Prejudice




Social media platforms such as Facebook have experienced, and continue to experience, an enormous boom in popularity. As of September 2018, Facebook averaged 1.49 billion daily active users, compared with 1 million total users in December 2004 (“Company Info: Facebook Newsroom” 2018).

Such growth allows users to share opinions and ideas (including videos and pictures) that can draw in a huge audience, and to gain insight into the views of people they would otherwise never encounter. For instance, a person can create a public post or group which can be shared, commented on or ‘liked’ by any other user on Facebook, creating a potential reach as vast as the platform itself. This alone creates issues, because a single person could hypothetically convince more than a million people that their opinion is fact.

This is a problem because the capacity to share ideas and opinions so widely, coupled with the capacity to share almost anything a person wants to, allows for the spread of misinformation, unfavourable content, fearmongering, prejudice and general hate speech.

This essay will focus on explicit content that is used by Islamic terror organisations and Islamophobic communities as propaganda and as a means of fearmongering. When discussing whether Facebook is ‘good for us’, I will hold to the hedonistic view that the only intrinsic good is pleasure and the only intrinsic bad is pain or harm.

In 2013 Facebook caused an outcry by changing its community standards policy to allow the spread of content which is seen as explicit and violent. The company claims to ‘allow graphic content (with some limitations) to help people raise awareness about issues’ because ‘people value the ability to discuss important issues like human rights abuses or acts of terrorism’ (“Facebook: Community Standards” 2018). One commenter (we’ll call her Ann) expressed that she was ‘nervous to scroll through facebook’ after accidentally viewing an Islamic terror organisation’s execution video that a friend had shared, and stated that ‘There is absolutely no excuse for providing something so graphic and disturbing to anyone, let alone on a site which age limit is 13’ (“Beheading videos to be allowed” 2013). Countless other users like Ann were appalled by this decision and, it seems, rightly so.

Research shows that exposure to violence in the media (including video violence) can cause an adolescent (a young person about to develop into an adult) to show aggressive and antisocial behaviour which can stay with them throughout adulthood (Huesmann 2018). Huesmann also found that children who engaged heavily with mass media displaced other activities which were vital to their intellectual development, such as reading.

If a child begins to substitute learning through reading with learning through mass media, they are at high risk of taking information found on social media as fact. For example, a teenager who uses Facebook instead of reading published, unbiased books is more likely to accept the views expressed on Facebook as fact. I will focus on an adolescent’s response to two different types of propaganda video sharing.

Firstly, a person may share a beheading video while expressing outrage at it. In this case, an adolescent watches the video and reads the comments made by whoever shared it (the stimulus), and experiences an emotional reaction of aggression. This increases the likelihood of them sharing it themselves with the same expression of outrage (repeating aggression through observational learning). Exposure to such a video may shape their schema (prejudices and stereotypes), which later becomes ‘crystallised’ if repeated exposure occurs, for example if the video appears on the adolescent’s Facebook newsfeed multiple times. In this example (a person who imitates a negative expression of outrage at the videos), the adolescent is vulnerable to being ‘recruited’ by an Islamophobic online community, which will in turn cause harm both to the individual and to the Muslim people that community could target. One example of this is a video shared on the Facebook page ‘Behead Murderous Fanatical Musl1ms’, where the page administrator attached the caption “This is not a mental illness this is Islam !!!” to a video depicting a Muslim lady carrying the head of a child through Moscow.

Having said that, if the adolescent sees the video being shared on Facebook by somebody who supports the beheading, it may instead lead to the crystallisation of radical beliefs and support of the violence witnessed, through the same processes described above. This mechanism, known as social learning theory, is exploited by ISIS to recruit vulnerable people and promote violence on social media (Freiburger and Crane 2008). In this case, the adolescent is at risk of being ‘recruited’ by a radical Islamic terror community.

Worryingly, when others agree with the opinions of the adolescent (which primarily came from the original poster), those opinions are likely to become concrete, affecting the adolescent’s view of the world even into adulthood. Repeated exposure to such beheading videos also leads to desensitisation to violence, making the adolescent more likely to mimic the violence they have seen, because repeated exposure allows immorality to seem normal. Similarly, adults with little knowledge of Islam and its values may take these videos as concrete facts about what Islam encourages. The end result can be widespread misinformation, radicalisation and/or long-standing prejudice and discrimination.

These two scenarios illustrate how beliefs and opinions spread through Facebook without moderation. Considering these two situations alone, the ways in which young people, and even adults, are negatively affected by using Facebook are very clear. At this point, I see no reason to say that Facebook is good for us because, although Facebook is not directly to blame for incidents such as radicalisation and the development of prejudice, it does help to facilitate the spread of misinformation which can lead to radicalisation and prejudice.

Although I was unable to find a specific Facebook account related to IS propaganda, this does not mean that Facebook’s response to terrorist organisations on the site is satisfactory. In fact, the company has been accused of encouraging connections between terrorists on Facebook by tailoring advertisements and suggesting friends to appeal to a person’s interests (Evans 2018). Research shows that Facebook does attempt to remove IS propaganda – 1.9 million pieces of IS-related content were removed in the first months of 2018 – but the propagandists have found numerous ways around removal. These include maintaining backup accounts in case of removal, and copying and pasting IS content instead of posting the original source in the hope of bypassing Facebook’s automatic security software. Of twenty-eight accounts monitored by Waters and Postings, eight IS-affiliated Facebook accounts had at least four linked backup accounts ready to be used in the event that the original account was removed.

Further to this, Facebook has previously reinstated the accounts of IS propagandists after removing them. For instance, Abdulrahman Alcharbati had his account reinstated nine times after being removed for posting statuses which encouraged terrorism and showed graphic violence.

In one case, an American man with an interest in Islam was targeted by an Islamic extremist who, over the space of six months, ‘went from having no clear religion to becoming a radicalised Muslim supporting Isil’ (Postings 2018). This pattern has been repeated with numerous other Facebook users and continues even after Waters and Postings’ report was published on the counter-extremism website. Clearly, the ongoing trend of propagandists being recommended as ‘friends’ to users with a curiosity about the religion of Islam is a large-scale problem that the site has catalysed and not stopped.

I stated above that Facebook does little to nothing to stop the spread of misinformation, so I must elaborate on what exactly it does do. In its community standards, Facebook claims to allow violent and graphic content ‘with some limitations’. The limitations it refers to concern videos or pictures depicting violence where the poster has clearly expressed enjoyment of violence, suffering or humiliation. However, photos or videos involving dismemberment, throat slitting or visible internal organs (to name only a few) are allowed, but will include ‘a warning screen so that people are aware the content may be disturbing’ (“Facebook: Community Standards” 2018). Having said that, the warning screens placed on distressing images are often applied by a team of Facebook moderators, as the automated security is unsatisfactory. These moderators are “bombarded with thousands of images depicting child sexual abuse, torture, bestiality and beheadings”, and one moderator recently attempted to sue Facebook after claiming to have developed PTSD from watching videos such as these beheadings. Clearly, the company is not doing enough to protect either its users or its moderators from harm.

Facebook also attempts to make this content available only to those over the age of eighteen. However, as discussed previously, even adults who view violent content can be negatively affected by it. Moreover, as expressed by Ann’s comments about the policy change, such gruesome content is not welcomed by adults either.

Facebook’s attempts to control unfavourable content are extremely poor, and such content can harm users who accidentally stumble upon it. Since posters are prohibited from stating the nature of content that may carry an automatic warning, users may watch a video without really knowing what it is. Facebook cannot be good for us if it allows content that makes users uncomfortable to be posted so freely. Regardless of whether the post has a warning attached to it, it is all too easy for unsuspecting users to accidentally click on the video or photo and expose themselves to it.

Further, research has found that twenty-six percent of nine- to ten-year-olds and forty-nine percent of eleven- to twelve-year-olds use social media, with one in five nine- to twelve-year-olds using Facebook (“EU Kids Online: UK Key Findings” 2018). Clearly, Facebook is not enforcing its over-thirteens policy strictly enough, allowing children to access the site with a false date of birth. This could result in children under the age of eighteen, or even under the age of thirteen, viewing explicit, gruesome or violent content. Although Facebook does give the option to report an underage user, the company itself seems to do very little in the way of checking the age of users and removing them automatically.

The poor security surrounding explicit content, and the negative effects that such content causes, open up more issues for users than those already discussed. Notably, they allow propaganda videos to be spread by terrorist groups to encourage recruitment and participation, and they also allow the demonisation of the religious groups associated with such organisations.

To conclude, without proper education, users who take unmoderated posts on Facebook as fact buy into stereotypes, create conflict and expose themselves to violent or graphic content and its effects. They also leave themselves open to potential recruitment by Islamophobic communities and to radicalisation. Although Facebook has attempted to put restrictions on the sharing of content such as Islamic extremist beheading videos, it seems to have done very little to actually stop such material from appearing on the site at all. Thus, Facebook cannot be deemed inherently good for us because it enables harm to come to its users. Harm is caused regardless of whether it is triggered by the immediate consequences of viewing the videos, such as users feeling disturbed or nervous, the short-term consequence of encouraging aggressive responses, or the potential long-term consequence of leaving a person vulnerable to recruitment (whether by the terrorist community or the Islamophobic community). Facebook has refrained, and continues to refrain, from fully censoring malicious, violent and explicit content on its site, and cannot be deemed inherently good for its users until it does so.

