Upload Filtering Mandate Would Shred European Copyright Safe Harbor


After months of study, the European Commission has finally released its full and final proposal for a Directive on Copyright in the Digital Single Market, and unfortunately it’s full of ideas that will hurt users and the platforms on which they rely, in Europe and around the world. The Electronic Frontier Foundation has already written a fair bit about a leaked version of this proposal, but it’s worth taking a deeper dive into one particular provision, euphemistically described as the “sharing of value.” This provision, Article 13 of the Directive, requires platforms for user-generated content to divert some of their revenue to copyright holders who, the Commission claims, otherwise have a hard time monetizing their content online. I strongly support balanced and sensible mechanisms that help ensure that artists get paid for their work. But this proposal is neither balanced nor sensible.

Article 13 is short enough that its key first paragraph can be reproduced in its entirety here:

Information society service providers that store and provide to the public access to large amounts of works or other subject-matter uploaded by their users shall, in cooperation with rightholders, take measures to ensure the functioning of agreements concluded with rightholders for the use of their works or other subject-matter or to prevent the availability on their services of works or other subject-matter identified by rightholders through the cooperation with the service providers. Those measures, such as the use of effective content recognition technologies, shall be appropriate and proportionate. The service providers shall provide rightholders with adequate information on the functioning and the deployment of the measures, as well as, when relevant, adequate reporting on the recognition and use of the works and other subject-matter.

The essence of this paragraph is that it requires large user-generated content platforms to reach agreements with copyright holders and to adopt automated technologies that scan the content users upload, either blocking that content or paying royalties for it.

This is Not Content ID

The automated scanning mandate described above may sound similar to what YouTube’s Content ID technology does, but there are some key differences. Perhaps the biggest is that whereas Content ID only scans music and video uploads, there is no such limitation in Article 13. As such, the provision anticipates that works of every other kind, including text, photographs, and so on, will also have to be scanned and filtered. With one stroke of a pen, this would turn the Internet from a zone of mostly permissionless free expression into a moderated walled garden. Such a general imposition on freedom of expression online is quite unprecedented in a democratic society.

Another difference from Content ID is that many additional parties would be pulled into these cooperative arrangements, both on the platforms’ side and the copyright holders’ side. On the platforms’ side, Article 13 applies to any service provider that hosts “large amounts” of user-uploaded content. What counts as “large amounts”? No one has any way of knowing for sure, but it’s easy to think of many hundreds of websites that might qualify, including commercial sites such as Tumblr and DeviantArt, but also non-profit and community websites such as Wikipedia and the Internet Archive.

On the copyright holders’ side, which rightsholders are platforms required to negotiate with? Article 13 doesn’t specify. Unless further regulations or industry agreements fill in this gap, we face the prospect of platforms having to negotiate with hundreds or even thousands of claimants, all seeking their own share of the platform’s revenue. That’s the worst-case scenario. The best-case scenario is that collecting societies will step in as intermediaries, but further cementing their role in the value chain isn’t an attractive proposition either, since most European collecting societies are national monopolies that have been known to abuse their market power.

Incompatibility with European Law and Human Rights

A law that requires Internet platforms to reach “voluntary” agreements with copyright holders is, of course, the essence of Orwellian doublethink, and a hallmark of the kind of policymaking by proxy that one might call “Shadow Regulation.” The Commission is likely taking that approach because it knows that it can’t directly require Internet platforms to scan content that users upload: an existing law, Directive 2000/31 on electronic commerce (the E-commerce Directive), expressly rules out any such requirement in its Articles 14 and 15.

Those provisions, which are roughly equivalent to the safe harbor in Section 512 of the DMCA, give Internet platforms conditional immunity for user-uploaded content (Article 14) and prohibit the imposition of any general obligation to monitor that content (Article 15). The Court of Justice of the European Union (CJEU) ruled in two separate cases, in 2011 and 2012, that this prohibition on general monitoring derives directly from Articles 8 and 11 of the European Charter of Fundamental Rights, which safeguard personal data and freedom of expression and information.

If the European Commission proposed to directly rescind the Article 14 safe harbor, that would be a clear infringement of Europe’s bill of rights. Yet the Commission proposes to get around this through the sham arrangement of forcing companies into private agreements. Convinced? Neither am I. It is well established that a government can’t get around its own human rights obligations by delegating the infringement of those rights to the corporate sector.

A mandate for Internet platforms to scan and filter users’ content is exactly the kind of general monitoring obligation that Article 15 prohibits. Rather than face a challenge to the Digital Single Market Directive in the CJEU, it would behoove the European Commission to abandon its attempt to rewrite the E-commerce Directive through Shadow Regulation before the proposal goes any further.

At this stage of the labyrinthine European legislative process, the proposal is out of the European Commission’s hands and awaits review by the other EU institutions, the Council of the EU and the European Parliament. That review will offer an opportunity for users to weigh in, so get ready. EFF is working with its European partners to fight back against this repressive proposal, and it will be asking for your help. Stay tuned.

For more, check out the Electronic Frontier Foundation’s website. Thanks to Deeplinks.
