The European Parliament has passed a series of amendments to the Digital Services Act (DSA) that make the proposed legislation even more privacy-protective and require companies to pay greater attention to advertising technology, user consent, and web design.
The DSA, put forward by the European Commission at the end of 2020, aims to regulate online services and platforms by creating “a safer digital space where the fundamental rights of users are protected and to establish a level playing field for companies”. It is a set of rules aimed at limiting illegal content and misinformation online and making digital advertising more accountable and transparent.
It complements the Digital Markets Act (DMA), which focuses on regulating big tech “gatekeepers” like Amazon, Apple, Google, Meta (Facebook), and Microsoft.
Both sets of rules – the DSA and the DMA – are expected to enter into force in 2023 or later, subject to final approval by the European Parliament and Council.
On Tuesday, MEPs voted 530 to 78, with 80 abstentions, to approve the DSA text, which will now be negotiated with member states.
“Online platforms have become increasingly important in our daily lives, bringing new opportunities, but also new risks,” said Christel Schaldemose, a Danish MEP, in a statement. “It is our duty to ensure that what is illegal offline is illegal online. We must ensure that we have digital rules in place for the benefit of consumers and citizens.”
The revised DSA rules [PDF] are in some respects even stricter than the initial proposal, privacy researcher and consultant Dr Lukasz Olejnik told The Register in an email. As examples, he pointed to the limits on targeted ads and the requirement that deepfakes be labelled.
Recital 52 prohibits advertising targeted at minors and bars the use of sensitive data (e.g. religion) to target adults. The rules also now require the ad repositories run by very large platforms to include, alongside each archived ad, data about the advertiser “and, if different, the natural or legal person who paid for the ad”.
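To make the repository requirement concrete, here is a minimal sketch of what an archived ad record might look like. The field names and helper function are purely illustrative assumptions, not taken from the regulation or any platform's actual API.

```python
# Hypothetical ad-repository record for a very large platform under the
# amended DSA rules. All field names here are illustrative assumptions.
archived_ad = {
    "ad_id": "2022-0117-000042",
    "creative_url": "https://ads.example.com/creative/42",
    "advertiser": "Example Widgets GmbH",
    # The amendments require naming the payer when it differs
    # from the advertiser:
    "paid_by": "Example Holdings SA",
    "period_shown": {"start": "2022-01-10", "end": "2022-01-17"},
}

def payer_of(record: dict) -> str:
    """Return the natural or legal person who paid for the ad,
    falling back to the advertiser when no separate payer is listed."""
    return record.get("paid_by") or record["advertiser"]
```

The fallback mirrors the “if different” wording: a separate payer field only needs to be present when the payer is not the advertiser.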
In addition, dark patterns are now explicitly prohibited: “Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionality of an online interface or a part thereof (‘dark patterns’)”.
Olejnik observes that the DSA rules formalize a strict interpretation of user consent, described more generally in Europe's General Data Protection Regulation (GDPR).
The amended rules state that service providers must refrain “from urging a recipient of the service to change any setting or configuration of the service after the recipient has already made a choice.” In other words, a browser setting that rejects tracking must be respected.
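As a minimal sketch of what honoring such a browser-level choice could look like server-side: the helper below checks the real Global Privacy Control (`Sec-GPC`) header and the older `DNT` header. The DSA text itself does not name any specific signal, and the function name is an assumption for illustration.

```python
def tracking_allowed(headers: dict) -> bool:
    """Return False when the request carries a browser-level opt-out signal.

    Checks the Global Privacy Control header (Sec-GPC: 1) and the older
    Do Not Track header (DNT: 1). Under the amended rules, a service that
    honors the user's prior choice would skip any further consent nagging
    when such a signal is present.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    if normalized.get("sec-gpc") == "1":
        return False
    if normalized.get("dnt") == "1":
        return False
    return True
```

The point of the sketch is the order of operations: the signal is consulted before any consent prompt is shown, rather than the prompt being used to override it.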
“The new and precise stipulations on granting, refusing, and withdrawing consent in a separate article are very relevant,” Olejnik said.
“This severely limits Big Tech’s leeway when it comes to consent prompts. However, despite the fact that the article claims it is ‘without prejudice’ to the GDPR, it is in fact with prejudice to the GDPR. I suspect this particular change may be very controversial.”
Olejnik expects European and American companies will have to make substantial changes to adapt.
“Technological risk assessments will need to be developed,” he said. “This means a new need for technology and technology-policy analysis teams. There is a good chance that the privacy impact assessment process will be adapted to this broader risk assessment process. As with the GDPR, companies that deploy resources sooner rather than later will be better prepared. It will also be a competitive advantage. Companies will start to expect certain safeguards.”
Olejnik said, however, that such a risk assessment would be futile if it were not forward-looking enough to consider future risks. And he notes that the tech giants haven’t shown much ability to anticipate the problems they’ve created.
“It is clear that Big Tech could have anticipated that their infrastructure would be used for microtargeting and misinformation,” he said, pointing to his 2016 warning post on the subject as evidence that these issues were openly discussed at the time.
“But they didn’t want to dedicate business cycles to that scope. Now they will be forced to. That’s very fortunate because technology will continue to impact companies in the future.”
Finalizing the DSA will not be without challenges, and there is a risk that these rules will end up doing more harm than good. Consider recent efforts in the United States to tame social media platforms with legislation like Texas’ HB 20, which legal scholars say violates First Amendment speech protections.
“The big issue will be weighing and balancing each of the fundamental rights and freedoms, such as privacy, security, freedom of expression,” Olejnik said. “It would be shameful if DSA enforcement ended in legalized censorship.” ®