In April 2016, a Facebook user posted on his personal page an article from the Austrian e-magazine oe24.at entitled “Greens: Minimum income for refugees should stay”. The post generated a ‘thumbnail’ containing the title and a summary of the article, as well as a photograph of Ms Eva Glawischnig-Piesczek, a member of the Nationalrat (National Council, Austria), chair of the parliamentary party ‘die Grünen’ (the Greens) and federal spokesperson for that party. The post was accompanied by a disparaging comment published by the user, which the Austrian court found to be harmful, insulting and defamatory to Ms Glawischnig-Piesczek. Any Facebook user could access the post.
In July 2016, Ms Glawischnig-Piesczek wrote to Facebook, asking it to delete the harmful comment. As Facebook did not comply with her request, she brought proceedings against it before the Austrian courts and obtained an order prohibiting Facebook from publishing and/or disseminating the harmful comment and/or equivalent content. The proceedings reached the Austrian Supreme Court, which referred questions on the interpretation of Directive 2000/31/EC on Electronic Commerce (the ‘E-Commerce Directive’) to the Court of Justice of the European Union (‘CJEU’) for a preliminary ruling.
The E-Commerce Directive provides that a hosting platform is not liable for user-generated content, provided that it has no knowledge of any illegality and acts expeditiously to remove or disable access to the content upon obtaining such knowledge (Article 14(1)). This exemption is without prejudice to the power of national courts to require the hosting platform to terminate or prevent an infringement (Article 14(3)). However, national courts cannot subject hosting platforms to a general monitoring obligation or impose a general obligation to seek facts or circumstances indicating illegal activity (Article 15(1)).
In light of the above, the CJEU was asked to decide whether Article 15(1) of the E-Commerce Directive precluded national courts from requiring a hosting platform to remove content identical to content previously declared illegal, as well as any content with an equivalent meaning, and, if so, whether the national court could extend the injunction so that it had effect worldwide.
The CJEU answered the questions on identical content, equivalent content and the worldwide effect of the injunction as follows:
Recital 47 of the E-Commerce Directive clarifies that Article 15(1) prohibits monitoring obligations of a general nature only, whereas monitoring obligations “in a specific case”1 are permitted. On this basis, the CJEU considered that requiring hosting platforms to continue to identify and remove content identical to a specific piece of information declared illegal by a national court is not contrary to Article 15(1). According to the CJEU, this is the case irrespective of who uploads the information.2
The CJEU described “equivalent content” as conveying a message that is “essentially unchanged” and therefore diverges very little from the content which gave rise to the finding of illegality but may be “worded slightly differently”.3 The court considered that requiring platforms to remove equivalent content may be necessary to ensure that victims of illegal content do not need to bring multiple legal actions. However, the E-Commerce Directive aims to balance victims’ interests against those of hosting platforms, which should be shielded from any general monitoring obligations.4 The CJEU found that identifying “specific elements” that “equivalent information” should contain in the relevant injunction (e.g., the victim’s name) is necessary to strike the appropriate balance and ensure that no general monitoring obligation arises contrary to Article 15(1). Importantly, the CJEU clarified that differences between the illegal and equivalent content must not be such as to warrant “an independent assessment” of the content by the hosting platform. According to the CJEU, this would enable host providers to identify and remove equivalent content using automated search tools and technologies alone.5
Pursuant to Article 18(1) of the E-Commerce Directive, member states must ensure that their national courts can adopt rapid measures to terminate or prevent any infringement on online platforms. The CJEU viewed this requirement as particularly broad and not subject to a territorial limitation. Therefore, it ruled that injunctions requiring host providers to remove identical or equivalent content can have worldwide effect, provided that any applicable international rules are also observed by the national courts.
The CJEU’s decision has attracted criticism around key questions regarding freedom of expression. The issues that have caused the most controversy are how ‘equivalent content’ is to be defined, the court’s perceived ignorance of the technology involved in content monitoring, and the court’s decision to open the door to injunctions relating to illegal online content issued by member states’ national courts having worldwide effect.
Definition of ‘equivalent content’
Despite the CJEU’s attempts to define ‘equivalent’ precisely, critics argue that the concept remains inherently vague, as it is described by reference to questions that are themselves open to interpretation – for example, whether differences in wording are “slight”. Given the increasing pressure on tech companies from regulators and the media alike, there are concerns that platforms may interpret ‘equivalent content’ broadly, thereby compromising free speech.
The CJEU’s claim that hosting platforms can identify equivalent content relying on automated tools alone has been described as “highly erroneous”6 and “deeply flawed”.7 The court is seen as “presuming a level of technological sophistication and degree of specificity that simply do not, and likely never will, exist”.8
The CJEU stressed that illegality does not “stem from the use of certain terms combined in a certain way”, but from “the message conveyed”.9 Critics of the judgment argue that human assessment will be needed to interpret the conveyed message.
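The disagreement between the court and its critics is, at bottom, a technical one, and can be sketched in a few lines of Python. All names, posts and filter rules below are hypothetical illustrations (not anything the court, the injunction or any platform actually specifies); the sketch only shows why matching identical content is mechanical while detecting an “essentially unchanged” message is not:

```python
# Illustrative sketch only: a hypothetical moderation filter contrasting
# identical-content matching with "specific elements" matching.
import hashlib

# Hypothetical post previously declared illegal by a national court.
ILLEGAL_POST = "X is a corrupt traitor"

def is_identical(candidate: str, original: str = ILLEGAL_POST) -> bool:
    """Identical content: a byte-for-byte comparison (here via a hash) suffices."""
    digest = lambda s: hashlib.sha256(s.encode()).hexdigest()
    return digest(candidate) == digest(original)

# Hypothetical "specific elements" from an injunction (e.g. the victim's
# name plus the defamatory terms), as the CJEU envisages for automation.
SPECIFIC_ELEMENTS = {"x", "corrupt", "traitor"}

def contains_specific_elements(candidate: str) -> bool:
    """'Equivalent' content under the injunction: all specified elements present."""
    words = set(candidate.lower().split())
    return SPECIFIC_ELEMENTS.issubset(words)

# An exact repost is caught mechanically, whoever uploads it.
assert is_identical("X is a corrupt traitor")

# A paraphrase conveying the same message slips past both automated tests -
# the critics' point that the "message conveyed" needs human assessment.
paraphrase = "Everyone knows X sold out and betrayed the country"
assert not is_identical(paraphrase)
assert not contains_specific_elements(paraphrase)

# Conversely, an innocuous sentence containing the same elements is flagged,
# illustrating the over-blocking concern raised by free speech advocates.
report = "X denies being a corrupt traitor"
assert contains_specific_elements(report)
```

The sketch illustrates both halves of the criticism: purely element-based matching under-captures paraphrases of the illegal message and over-captures lawful speech that happens to contain the listed terms.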
The CJEU has been accused of paving the way to global censorship by enabling EU member states with censor-prone legislation to set global speech standards through their courts. It is also argued that the decision contradicts the CJEU’s recent ruling in CNIL v Google10 that Google does not have to apply the right to be forgotten globally, which confirmed that EU laws end at the EU’s borders.
The decision in Glawischnig-Piesczek is significant in that it appears to allow the national courts of EU member states to require hosting platforms to filter content worldwide – not generally, but potentially in a very wide range of circumstances given the obligation to remove identical content and the uncertainty surrounding the definition of equivalent.
The decision in Glawischnig-Piesczek has wider implications in that it informs the debate among member states about the extent to which tech companies should be liable for user-generated content. Member states like Germany, for example, have introduced strict hate speech laws subjecting tech companies to fines for failing to police illegal content, whereas other member states have taken a more relaxed approach. The ruling may also influence debates regarding the EU’s proposal for a Digital Services Act, which is intended to reform EU rules on the liability and safety of digital platforms.11
Client Alert 2019-266
- Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, Recital (47).
- Eva Glawischnig-Piesczek v Facebook Ireland Limited, Case C‑18/18 ECLI:EU:C:2019:821, paras. 37 and 53.
- Eva Glawischnig-Piesczek v Facebook Ireland Limited, Case C‑18/18 ECLI:EU:C:2019:821, paras. 39 and 41.
- Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, Recital (41).
- Eva Glawischnig-Piesczek v Facebook Ireland Limited, Case C‑18/18 ECLI:EU:C:2019:821, paras. 46 and 53.
- Jennifer Daskal, “A European Court decision may usher in global censorship” Slate Magazine (3 October 2019).
- “ECJ ruling has risks for freedom of speech online”, The Financial Times (3 October 2019).
- See fn 6 above.
- Eva Glawischnig-Piesczek v Facebook Ireland Limited, Case C‑18/18 ECLI:EU:C:2019:821, para. 40.
- Commission nationale de l’informatique et des libertés (CNIL) v Google LLC, Case C-507/17 ECLI:EU:C:2019:772.
- Ursula von der Leyen, Political Guidelines for the Next European Commission 2019-2024.