17 Nov Regulate Big Tech
Heard a lot about Big Tech lately… what are they up to?
________________________________
Source – https://www.techregister.co.uk/the-eus-push-to-regulate-big-tech-platforms-is-already-getting-messy/
“The EU’s Push to Regulate ‘Big Tech’ Platforms Is Already Getting Messy
August 3, 2021
Regulating digital content and platforms was never going to be easy. As the European Union continues what is expected to be a multi-year process to turn its draft Digital Services Act into law, France appears to have jumped the gun and enacted its own version of the proposed regulations. In a 12-page rebuke couched as “observations,” the European Commission warned that France’s law “poses a risk to the single market in digital services and to Europe’s prosperity.” Just as it prepares to assume the EU’s rotating six-month presidency next January, France seems set on a collision course with the institutions charged with regulating digital content within the bloc.
At the heart of this legislative spat are tensions between EU member states and the commission on the relationship between national and EU-wide regulation of content and enforcement mechanisms, as well as policy differences on approaches to regulating online speech and Big Tech platforms.
The EU approach is set out in its flagship Digital Services Package, which was published in draft form in December 2020 and is now going through the lengthy consultation processes prior to becoming law. One of the package’s two pillars, the Digital Markets Act, or DMA, aims to foster innovation and competition on a level playing field within the European Single Market. The other, the Digital Services Act, or DSA, proposes a new regime to provide clarity on the liability of digital platform providers for the user-generated content they host, updating rules that are now more than 20 years old.
The draft DSA contains a lot of commendable ideas. Rather than taking a “one size fits all” stance, the proposal sets out an asymmetric approach, reserving the strictest obligations for the small number of “very large online platforms”—defined as those with more than 45 million users in the EU—while exempting micro- and small businesses entirely from the regime. And instead of being drawn into defining what is meant by illegal content, the EU leaves such definitions to member states or to already established EU rules. The aim is to introduce greater transparency throughout the digital environment for intermediaries, hosting services, online platforms and Big Tech companies. Where a hosting provider decides to remove content, the DSA introduces obligations requiring it to inform the user and provide clear reasons for its decision.
The EU also aims to clarify the exemptions from liability for providers and to create incentives for them to take proactive measures against illegal content. These proposals would update EU rules that have been in place for “mere conduits,” or intermediaries, since the early 2000s and that are broadly equivalent to Section 230 of the United States’ Communications Decency Act.
In what is becoming the regulatory norm for EU laws, the DSA will have extraterritorial scope, applying not only to businesses established within the EU but also to those that offer services within the union, determined by factors such as having a significant number of users or targeting activities toward one or more member states. In this way, the EU plans to export its approach to regulating online speech to U.S. Big Tech firms, much as it did with its General Data Protection Regulation, or GDPR, which those firms have almost universally adopted in order to continue operating in the EU market. This might be more complicated when it comes to free speech, though, as the marked differences in how the U.S. and the EU interpret the right to freedom of expression are likely to come to the fore on implementation. In fact, the draft regulation seems keen to avoid these potentially thorny disputes by refraining from defining illegality and, unlike a proposed U.K. law, by excluding the controversial category of legal but harmful speech, such as content that encourages self-harm, anorexia or suicide.
At the heart of the current tensions between France and the commission is the enforcement regime. The DSA will have a hierarchy of enforcement bodies that start at the level of the member state, with each designating a Digital Services Coordinator. The sticking point is that, as with the GDPR, rather than leaving enforcement within the control of the member state, Article 40 of the draft DSA makes the “member state in which the main establishment of the provider … is located” the ultimate competent authority. In practice, this means Ireland and Luxembourg, where almost all the Big Tech companies have their European headquarters. The French law would enable France to investigate and punish Big Tech platforms itself, rather than having to rely on the responsiveness of other countries’ regulators.
“France is moving faster and more aggressively than the DSA, with stricter limits for takedowns and higher liabilities for platforms,” Konstantinos Komaitis, the senior director of policy strategy and development at the Internet Society, told me. Although the French law has a virtual self-destruct button, which will remove it from national law at the point when the DSA comes into force, Komaitis questions how that would work in practice. “How easy is it to declare a law void once it has been integrated into your legal and social environment?”
Instead, the French law seems designed to signal to domestic audiences that France is serious about going after content and online platforms that condone or promote terrorism. Through this unusual and legally questionable route, France may also be seeking to impose its will on European lawmakers, hoping to shape amendments to the DSA.
France is not alone in pursuing and adopting national legislation on digital content. Poland has praised the French approach, noting that it intends to enact domestic legislation with similar provisions. Germany’s controversial NetzDG law, which came into force in 2018 in response to widespread concerns about disinformation and social divisions, compels social media platforms to remove “unlawful” digital content and has drawn criticism from human rights organizations. And the U.K., now out of the EU, is also pressing forward with its controversial Online Harms Bill, which sets out a regime for the oversight of a wide range of speech, from content promoting terrorism and portraying child sexual abuse at one end of the spectrum to legal but harmful content at the other, all under the auspices of a single regulator.
Thierry Breton, the French businessman turned European commissioner for the internal market, last week pushed back against the argument that the DSA will usurp the prerogatives of member states, arguing that “cross-border digital services need cross-border rules.” He’s right, of course, and his intervention highlights the risks of an ever more fragmented regulatory approach to online platforms.
The differences between France and the EU illustrate how localized the interpretations of harmful or illegal digital content ultimately are. In line with EU ambitions to shape the world’s digital regulation through strategic application of extraterritorial provisions, the proposed Digital Services Act aims to create a single, rational governance system for regulating online content. The reality is likely to be messier. As Big Tech and other digital providers try to make sense of a plethora of conflicting, confusing laws from multiple jurisdictions, the risks of inconsistent legal outcomes, impediments to international data flows and regulatory arbitrage will only increase.”