One can only imagine what it must have been like for Facebook executives on 15 March 2019, when a far-right extremist used Facebook to live stream himself systematically gunning down Muslim worshippers attending Friday prayers at a Christchurch mosque with a semi-automatic rifle (ABC, 2019a). The possibilities of connected, networked publics, built on the affordances of digital social media platforms, suddenly seemed far less benign. The networked publics created by digitally mediated communities like Facebook have now allowed some very dangerous private worlds to enter the public sphere.
The challenge for both social media corporations and government regulators to mediate and regulate what people share, post, and live stream has never been greater. The New Zealand Government has already prosecuted one person for sharing footage of the Christchurch massacre (ABC, 2019b), Facebook's own CEO Mark Zuckerberg has called for social media regulation (Gillespie, 2019), and there has been an increasing number of concerns about breaches of privacy (ABC, 2018a), of which Facebook's Cambridge Analytica scandal in 2018 (ABC, 2018b) is one of the most high-profile examples.
The live stream of the Christchurch massacre has prompted calls for greater monitoring of social media corporations, but will this in turn justify those same corporations monitoring us? The Cambridge Analytica scandal has already shown the potential for mismanagement of our data. Social media corporations such as Facebook already curate:
“the content and [police] the activity of their users: not simply to meet legal requirement, or to avoid having additional policies imposed, but also to avoid losing offended or harassed users, to placate advertisers eager to associate their brands with a healthy online community, to protect their corporate image …” (Gillespie, 2018, p. 255).
The success of social media platforms relies on users sharing information. Users openly share their personal details online, and we are only now starting to see the potential fallout from this practice. As private spaces are shared in public places via the affordances of social media, through both content creation and content curation, questions of ownership and responsibility are now being raised.
Social media sites promote themselves as ‘impartial platforms’; however, the Christchurch massacre showed that they must start regulating who is able to broadcast a live stream. Australian Prime Minister Scott Morrison weighed in on the debate, declaring that changes to legislation could be made in response to the mosque attacks in Christchurch (Wynne, 2019).
RN podcast: ‘Social media executives could face criminal charges over failure to remove violent content’ (Wynne, 2019)
There are, however, strong arguments against the regulation of social media platforms, based on the question of who owns the content. Telecommunications legislation drafted in the USA allows a ‘safe harbour’ for social media platforms on the following grounds:
- Social media corporations are not publishers. They do not create the content and therefore cannot be held responsible for content created by their users.
- Social media corporations are distributors who merely circulate the information of others (Gillespie, 2018, p. 258).
Indeed, the term ‘platform’ is now being contested, with ‘internet intermediary’ offered as an alternative. This term is deemed more appropriate because it highlights an important distinction: social media corporations manage content rather than produce it themselves. They are, in fact, online intermediaries that come between and facilitate the connection of others, transmitting content that is produced by others (Gillespie, 2018, p. 256).
Regulation of online content in the wake of the Christchurch massacre live stream seems inevitable, but how to enforce it is an entirely different story. If social media corporations are capable of self-regulation, will they also be capable of restraining themselves from excessive intervention in, or curation of, users’ news feeds? This remains to be seen, and the scandalous misuse of Facebook user data by Cambridge Analytica is one reason to suggest that government intervention may be necessary. The New Zealand Government has already imposed a prison sentence on one person caught sharing the Facebook live stream footage of the Christchurch mosque shootings. Contemporary social media corporations may technically be regarded as ‘internet intermediaries’, but the situation is complicated. Further government regulation of social media platforms may be just as unwanted as governments not doing enough to ensure that Facebook live streaming is never again used to broadcast cold-blooded murder.
ABC. (2018a). Aussies caught up in Cambridge Analytica data breach feel ‘unsafe’ and ‘angry’. Retrieved from https://www.abc.net.au/news/2018-04-11/australians-caught-up-cambridge-analytica-data-breach/9640998
ABC. (2018b). Facebook founder Mark Zuckerberg faces Congress over Cambridge Analytica scandal — World News with Matt Bevan. Retrieved from https://www.abc.net.au/radionational/programs/breakfast/world-news-with-matt-bevan/9640004
ABC. (2019a). Why you should think twice about watching the Christchurch shooting live stream. Retrieved from https://www.abc.net.au/news/2019-03-15/christchurch-shooting-live-stream-think-twice-about-watching-it/10907258
ABC. (2019b). Man jailed for sharing Christchurch massacre video with 30 ‘associates’. Retrieved from https://www.abc.net.au/news/2019-06-18/new-zealand-man-who-shared-christchurch-massacre-video-jailed/11221444
Gillespie, N. (2019). Mark Zuckerberg Is Calling for Regulation of Social Media To Lock in Facebook’s Position. Retrieved from https://reason.com/2018/03/22/the-real-reason-mark-zuckerberg-calling
Gillespie, T. (2018). Regulation of and by platforms. In J. Burgess et al. (Eds.), The SAGE Handbook of Social Media. SAGE Publications. Retrieved from http://ebookcentral.proquest.com/lib/qut/detail.action?docID=5151795
Wynne, E. (2019). Social media bosses could face charges if violent content isn’t removed. Retrieved from https://www.abc.net.au/radio/programs/worldtoday/social-media-face-charges-if-violent-content-isnt-removed/10939994