The European Union has continued its campaign to regulate large multinational tech companies’ activities in Europe. The Digital Services Act (DSA), which was approved by the European Parliament last Saturday, follows the Digital Markets Act (DMA), which was approved last month.
According to the EU, the DSA and DMA have two big goals: “create a safer digital space in which the fundamental rights of all users of digital services are protected” and “establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.”
In practice, this means overseeing how large social networks, search engines, and other tech companies do business, and limiting how they use consumer data.
The DSA in particular has rules targeted at online services like Facebook, Instagram, Google, and TikTok. It bans targeted advertising aimed at children, or based on sensitive data like religion, gender, race, sexual orientation, or political affiliation. It also bans "dark patterns," or deceptive design elements that can trick you into buying or signing up for something unintentionally. For example, websites will have to present the buttons to opt in and out of targeted ads equally; the option to opt out can't be tucked away behind a text link on the second page of settings, written in a small font colored to match the background. Unless US tech companies create separate page and app designs just for EU customers, this will hopefully improve the web user experience around the world.
The strictest rules will only apply to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): services with at least 45 million monthly EU users. The DSA will require VLOPs to analyze the systemic risks they face and file a report every year. These companies will also be responsible for preventing illegal content, like hate speech, child sexual abuse material, and scams, from appearing on their platforms. For marketplaces like Amazon, this includes provisions forcing them to vet third-party traders to ensure that they aren't selling counterfeit or dangerous goods.
As part of this process, the EU is also requiring that platforms, in the name of accountability and transparency, reveal details about how their algorithms recommend content. What specifics they will be forced to turn over remains to be seen, but it is likely to be more than Facebook or Twitter want to reveal.
Controversially, one provision of the DSA, pushed through in response to the Russian invasion of Ukraine and the resulting online misinformation campaign, is the so-called "crisis mechanism." The European Commission can activate it in response to an emerging crisis, allowing it to analyze and control how VLOPs and VLOSEs react to risky information and to implement additional safeguards if needed. For example, it could theoretically force Facebook to ban posts from Russian state agencies, or Covid-19 misinformation.
Failure to comply with the DSA carries potentially huge fines: up to 6 percent of annual turnover. Alphabet, Google's parent company, had $258 billion in annual revenue last year. If it committed a serious breach of the regulations, it could be on the hook for more than $15 billion.
Because of how the European Union works, the DSA itself isn't exactly a law yet. The final text of the document hasn't even been confirmed; what was approved was a political agreement around the general principles. Instead, over the next year or two, all the countries in the European Union will pass laws to align with the DSA. It will then take effect on January 1, 2024, or 15 months after the full text of the DSA is published, whichever comes later.