Europe’s Big Tech Law Is Approved. Now Comes the Hard Part


The potential gold standard for online content governance in the EU—the Digital Services Act—is now a reality after the European Parliament voted overwhelmingly for the legislation earlier this week. The final hurdle, a mere formality, is for the Council of the European Union to sign off on the text in September.

The good news is that the landmark legislation includes some of the most extensive transparency and platform accountability obligations to date. It will give users real control over, and insight into, the content they engage with and offer protections from some of the most pervasive and harmful aspects of our online spaces.

The focus now turns to implementation of the vast law, as the European Commission begins in earnest to develop the enforcement mechanisms. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, known in this case as Digital Services Coordinators (DSCs). It will rely heavily on the creation of new roles, the expansion of existing responsibilities, and seamless cooperation across borders. What's clear is that, as of now, the institutional capacity to enforce this legislation effectively simply doesn't exist.

In a “sneak peek,” the Commission has provided a glimpse into how it proposes to overcome some of the more obvious challenges to implementation—like how it plans to supervise large online platforms, and how it will attempt to avoid the problems that plague the GDPR, such as out-of-sync national regulators and selective enforcement—but the proposal only raises new questions. A huge number of new staff will need to be hired, and a new European Centre for Algorithmic Transparency will need to attract world-class data scientists and experts to aid in enforcing the expansive new algorithmic transparency and data accessibility obligations. The Commission's preliminary vision is to organize its regulatory responsibilities by thematic areas, including a societal issues team that will be tasked with oversight of some of the novel due diligence obligations. Insufficient resourcing here is a cause for concern and would ultimately risk turning these hard-won obligations into empty tick-box exercises.

One critical example is the platforms' obligation to conduct assessments addressing systemic risks on their services. This is a complex process that will need to take into account all the fundamental rights protected under the EU Charter. To do so, the tech companies will have to develop human rights impact assessments (HRIAs)—an evaluation process meant to identify and mitigate potential human rights risks stemming from a service or business, or in this case a platform—something civil society urged them to do throughout the negotiations. It will, however, be up to the Board, made up of the DSCs and chaired by the Commission, to annually assess the most prominent systemic risks identified and to outline best practices for mitigation measures. As someone who has contributed to developing and assessing HRIAs, I know that this will be no easy feat, even with independent auditors and researchers feeding into the process.

If they are to make an impact, the assessments need to establish comprehensive baselines, concrete impact analyses, evaluation procedures, and stakeholder engagement strategies. The very best HRIAs embed a gender-sensitive approach and pay specific attention to systemic risks that will disproportionately impact people from historically marginalized communities. This is the most concrete method for ensuring that all potential rights violations are captured.

Luckily, the international human rights framework, such as the UN Guiding Principles on Business and Human Rights, offers guidance on how best to develop these assessments. Nonetheless, the success of the provision will depend on how platforms interpret and invest in these assessments, and even more so on how well the Commission and national regulators enforce these obligations. At current capacity, however, the institutions' ability to develop the guidelines and best practices and to evaluate mitigation strategies is nowhere near the scale the DSA will require.
