
Who Can Turn a Blind Eye to Cyber Hate? : Council of Europe’s evolving practices and new challenges


dc.date.accessioned 2015-03-31T07:32:32Z
dc.date.available 2015-03-31T07:32:32Z
dc.identifier.uri http://hdl.handle.net/123456789/842
dc.title Who Can Turn a Blind Eye to Cyber Hate? : Council of Europe’s evolving practices and new challenges en
ethesis.faculty Faculty of Law en
ethesis.faculty Juridiska fakulteten sv
ethesis.faculty Oikeustieteellinen tiedekunta fi
ethesis.faculty.URI http://data.hulib.helsinki.fi/id/5a29ad0e-46f3-4834-92a6-4a95699cc1e8
ethesis.university.URI http://data.hulib.helsinki.fi/id/50ae46d8-7ba9-4821-877c-c994c78b0d97
ethesis.university Helsingfors universitet sv
ethesis.university University of Helsinki en
ethesis.university Helsingin yliopisto fi
dct.creator Majlander, Sini
dct.issued 2015
dct.language.ISO639-2 eng
dct.abstract Liability of online operators for user-generated content is a topical issue in Europe. The case of Delfi AS v. Estonia, currently pending before the Grand Chamber of the European Court of Human Rights, is one example of a situation where questions related to this issue are analysed. The First Section judgment finding the online news portal operator, Delfi, liable for defamatory user-generated comments was widely criticized. Another topical issue in Europe concerns the dissemination of cyber hate. This thesis combines these two elements and seeks to answer the following question: Which online entities, if any, are liable for the dissemination of user-generated cyber hate? The analysis is limited to the approach of the Council of Europe, although the rules and principles adopted in the United States are referred to because of their trendsetter status. Freedom of expression is protected by Article 10 of the European Convention on Human Rights. According to the rules and principles adopted within the framework of the Council, this Article does not protect ‘hate speech’ or its online version, ‘cyber hate’. However, there is no clear definition of ‘hate speech’. In the strategies adopted by different Council bodies and in the case-law of the Court, several categories of speech have been considered ‘hate speech’, but this practice has been neither clear nor consistent. This is especially regrettable given that, according to the Court’s case-law, ‘hate speech’ can be categorically excluded from the protection of Article 10 by means of the prohibition of the abuse of rights clause provided for in Article 17 of the Convention. In the course of this research, I come to oppose the application of said Article because of the unnecessary risks it poses to the enjoyment of freedom of expression. Moreover, I strongly endorse the adoption of a legally binding definition of the central notion. Concerning liability issues, in the case-law of the Court the media has been afforded special protection under Article 10. This protection, however, is coupled with responsibilities, and professional journalists have therefore been held liable even for the dissemination of third-party content. The central elements analysed by the Court when imposing such liability in printed media cases have been the degree of editorial control and the intent of the journalist. Owing to their functions, some online operators have been assimilated to these traditional media actors; they are considered content providers. So-called Internet service providers, by contrast, are a category of online operators regarded as intermediary or auxiliary entities, and they enjoy a limited liability regime. Again, the key to the distinction between content providers and ISPs is the degree of editorial control over the content the respective entity hosts, transmits or provides access to: extensive control over information is coupled with wider liability. In the case of Delfi, the First Section of the Court concluded that, because of the degree of control exercised by Delfi, it was to be regarded as a content provider. I agree with the main parts of the Court’s analysis. Furthermore, the liability-related principles adopted by the Court in this case can be applied to ‘cyber hate’ cases, although the criminal nature of these cases allows primary liability to be imposed also on the actual authors of the content.
I consider that, in order for the Council’s fight against ‘hate speech’ to be effective, additional liability should be imposed on content providers and, in specific circumstances, even on ISPs. I endorse the mobilization of, and co-operation with, the relevant private sector actors to form guidelines on self-regulatory measures they could apply in order to comply with their duties. Accordingly, the suggested answer to the question posed at the beginning of this research is that all online operators can be liable for user-generated ‘cyber hate’ if they neglect their respective responsibilities. In the future, the aim of the Council should be to hinder any attempts by these entities to rely on so-called wilful blindness. However, any liability imposed must be assessed on a case-by-case basis, taking the circumstances of the specific case into account and respecting the inherent principles of Article 10 of the Convention. en
dct.language en
ethesis.language.URI http://data.hulib.helsinki.fi/id/languages/eng
ethesis.language English en
ethesis.language englanti fi
ethesis.language engelska sv
ethesis.thesistype pro gradu-avhandlingar sv
ethesis.thesistype pro gradu -tutkielmat fi
ethesis.thesistype master's thesis en
ethesis.thesistype.URI http://data.hulib.helsinki.fi/id/thesistypes/mastersthesis
dct.identifier.ethesis E-thesisID:7249d4d4-cd97-4bf8-b852-82078ca43dd3
ethesis-internal.timestamp.reviewStep 2015-03-09 12:57:45:524
dct.identifier.urn URN:NBN:fi:hulib-201508073425
dc.type.dcmitype Text

Files in this item

File Size Format
Master Thesis_Sini Majlander.pdf 1.416 MB PDF
