The European Parliament has just approved the new text of the Copyright Directive, which will now go to the Council for a final vote on 15 April 2019. This legislation not only modifies the copyright framework set out in the Information Society Directive (Directive 2001/29/EC) but will also modify the liability regime enshrined in the E-Commerce Directive (Directive 2000/31/EC) for online content-sharing service providers. Against this backdrop, a team of researchers decided to investigate the impact of automated copyright enforcement mechanisms on cultural diversity.

The infamous Article 13 of the proposed new Copyright Directive (COM(2016)0593 – C8-0383/2016 – 2016/0280(COD)), now Article 17 of the final text, renders content-sharing platforms liable for the content uploaded by their users if the platform gives public access to copyright-protected material. Whilst it is understandable why these platforms should obtain a licence for the copyright-protected content shared, the new obligation to make ‘best efforts to ensure the unavailability of […] works and other subject matter […]; and in any event, [act] expeditiously, upon [notification] from the right-holders, to disable access to, or to remove from, their websites the notified works or other subject matter, and [make] best efforts to prevent their future uploads‘ has proven much more problematic to explain. It indirectly refers to so-called ‘upload filters’: complex algorithms that identify alleged copyright infringements and decide to block access to content, even prior to upload and without human oversight. It is precisely the impact of such automated filtering on cultural diversity that the researchers set out to examine.

To counter this, the text attempts to offer some relief by adding that copyright exceptions (namely the quotation and parody exceptions) must be respected and that platforms should introduce an ‘effective and expeditious complaint and redress mechanism’ allowing users to challenge the decision of an algorithm.