Technology Magazine
News

Europe’s Big Tech Law Is Approved. Now Comes the Hard Part

Press room
Published July 8, 2022
Last updated: 2022/07/27 at 11:43 AM

The potential gold standard for online content governance in the EU—the Digital Services Act—is now a reality after the European Parliament voted overwhelmingly for the legislation earlier this week. The final hurdle, a mere formality, is for the Council of the European Union to sign off on the text in September.

The good news is that the landmark legislation includes some of the most extensive transparency and platform accountability obligations to date. It will give users real control over and insight into the content they engage with, and offer protections from some of the most pervasive and harmful aspects of our online spaces.

The focus now turns to implementation, as the European Commission begins in earnest to develop the enforcement mechanisms. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, in this case known as Digital Services Coordinators (DSCs). It will rely heavily on the creation of new roles, expansion of existing responsibilities, and seamless cooperation across borders. What’s clear is that as of now, there simply isn’t the institutional capacity to enact this legislation effectively.

In a “sneak peek,” the commission has provided a glimpse into how it proposes to overcome some of the more obvious challenges to implementation—like how it plans to supervise large online platforms and how it will attempt to avoid the problems that plague the General Data Protection Regulation (GDPR), such as out-of-sync national regulators and selective enforcement. But its proposal only raises new questions. A huge number of new staff will need to be hired, and a new European Centre for Algorithmic Transparency will need to attract world-class data scientists and experts to aid in the enforcement of the new algorithmic transparency and data accessibility obligations. The commission’s preliminary vision is to organize its regulatory responsibilities by thematic areas, including a societal issues team, which will be tasked with oversight of some of the novel due diligence obligations. Insufficient resourcing here is a cause for concern and would ultimately risk turning these hard-won obligations into empty tick-box exercises.

One critical example is the platforms’ obligation to conduct assessments to address systemic risks on their services. This is a complex process that will need to take into account all the fundamental rights protected under the EU Charter. In order to do this, tech companies will have to develop human rights impact assessments (HRIAs)—an evaluation process meant to identify and mitigate potential human rights risks stemming from a service or business, in this case a platform—something civil society urged them to do throughout the negotiations. It will, however, be up to the board, made up of the DSCs and chaired by the commission, to annually assess the most prominent systemic risks identified and outline best practices for mitigation measures. As someone who has contributed to developing and assessing HRIAs, I know that this will be no easy feat, even with independent auditors and researchers feeding into the process.

If they are to make an impact, the assessments need to establish comprehensive baselines, concrete impact analyses, evaluation procedures, and stakeholder engagement strategies. The very best HRIAs embed a gender-sensitive approach and pay specific attention to systemic risks that will disproportionately impact those from historically marginalized communities.

This is the most concrete method for ensuring all potential rights violations are included.

Luckily, the international human rights framework, such as the UN Guiding Principles on Business and Human Rights, offers guidance on how best to develop these assessments. Nonetheless, the success of the provision will depend on how platforms interpret and invest in these assessments, and even more so on how well the commission and national regulators enforce these obligations. But at current capacity, the institutions’ ability to develop guidelines and best practices and to evaluate mitigation strategies is nowhere near the scale the DSA will require.
