A landmark judgment from the Court of Justice of the European Union (CJEU) has fundamentally reshaped the data protection landscape for online marketplaces, establishing that these operators can be deemed “joint controllers” of personal data posted by their users and are therefore subject to a demanding new set of proactive responsibilities. The ruling, delivered late last year, effectively dismantles the long-standing reactive “notice-and-takedown” model that platforms have relied on for decades. It ushers in a new era of accountability, mandating a proactive “verify-and-protect” framework that places the onus squarely on platform operators to safeguard user data, particularly when sensitive personal information is involved. This decision signals a significant turning point in digital regulation, with far-reaching consequences for the operational and legal strategies of countless companies within the digital platform economy.
The Genesis of a Landmark Decision
The case that prompted this seismic shift originated in Romania, where a woman became the victim of a malicious online advertisement. In 2018, an unknown individual posted a fraudulent ad on a marketplace operated by Russmedia Digital SRL, falsely implying the woman offered sexual services and unlawfully publishing her personal photographs and phone number. Although Russmedia acted swiftly to remove the advertisement within an hour of being notified, the damage had already been done. The content was scraped by other websites, leading to continued harassment and distress for the victim. Her subsequent legal action in Romania raised complex questions about the intersection of data protection law and the liability exemptions granted to online intermediaries. Faced with this novel legal challenge, the Romanian court referred the case to the CJEU, seeking a definitive ruling on whether the liability shield provided by the E-Commerce Directive could absolve a platform of its direct obligations under the General Data Protection Regulation (GDPR).
The referral presented the CJEU with two pivotal questions that cut to the heart of digital platform liability. First, the court was asked to determine the relationship between two cornerstone pieces of EU legislation: the GDPR, with its stringent data protection requirements, and the E-Commerce Directive, which offers a “safe harbor” liability exemption for hosting providers that act as neutral conduits for user-generated content. The central issue was whether a platform could claim protection under the E-Commerce Directive to avoid its responsibilities under the GDPR when personal data was involved. Second, the CJEU was asked to outline the specific, practical duties an online marketplace operator would have if the GDPR were indeed found to apply directly to its operations. This second question sought to translate the broad principles of the GDPR into a concrete set of technical and organizational measures that platforms would be legally required to implement, moving beyond abstract legal theory into tangible operational mandates for the entire industry.
Redefining Responsibility as Joint Controllers
Before directly answering the referred questions, the CJEU laid a critical legal foundation by interpreting the facts of the case through the lens of the GDPR. The Court first affirmed that any information pertaining to an individual’s sex life or sexual orientation constitutes “special categories of personal data,” more commonly known as sensitive data, under Article 9 of the regulation. Crucially, it clarified that the truthfulness of this information is irrelevant; the mere suggestion or allegation is enough to trigger the highest level of data protection. Consequently, the false advertisement was deemed to involve the processing of sensitive personal data, immediately raising the legal stakes for the platform operator. This interpretation closed a potential loophole where platforms might argue that false information does not fall under data protection rules, confirming that the potential for harm is the key consideration.
The most consequential part of the judgment was the Court’s determination that the platform operator, Russmedia, was not a mere passive intermediary but a “joint controller” of the personal data published in the fraudulent advertisement, sharing this legal status with the anonymous user who posted it. The CJEU applied its consistently broad interpretation of the “controller” concept, which includes any entity that influences the purpose and means of data processing. The Court reasoned that while the user determined the ad’s content, the platform played an essential and decisive role in its publication. This was based on several factors: the platform provided the necessary technological infrastructure, set the operational rules for publishing content, and ran its service for its own commercial objectives, profiting from user-generated ads. The most compelling evidence of this “decisive influence,” according to the Court, was found in the platform’s own terms and conditions, which granted the company a general right to reuse the content, demonstrating it was an active participant influencing the data processing for its own benefit.
A New Mandate for Proactive Protection
Having established the platform’s status as a joint controller, the CJEU proceeded to detail the significant, proactive obligations that this status imposes under the GDPR. The Court emphasized that operators must now integrate data protection principles by design and by default into their systems, ensuring their platforms can actively demonstrate compliance with core tenets like lawfulness, accuracy, and accountability. A primary requirement outlined by the Court is the implementation of mechanisms to systematically identify advertisements that potentially contain sensitive data. This forces platforms to develop and deploy automated or manual systems to flag high-risk content for review before it is published, enabling them to verify compliance with the stringent conditions for processing sensitive data under Article 9 of the GDPR. This marks a fundamental shift from waiting for user complaints to actively policing content as it is submitted.
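In engineering terms, such a screening mechanism could take many forms; the minimal sketch below is purely illustrative and is not drawn from the judgment itself. It assumes a keyword-based first pass (the pattern list and function name are hypothetical), whereas a production system would likely pair trained classifiers with human review:

```python
import re

# Hypothetical patterns hinting at Article 9 "special category" data.
# Purely illustrative; a real platform would use trained classifiers
# and human reviewers, not a fixed keyword list.
SENSITIVE_PATTERNS = [
    r"\bsexual\b", r"\bescort\b", r"\bhealth condition\b",
    r"\breligio(n|us)\b", r"\bethnic\b", r"\bpolitical\b",
]

def flag_for_review(ad_text: str) -> bool:
    """Return True if the ad should be held for compliance review
    before publication, rather than published automatically."""
    text = ad_text.lower()
    return any(re.search(pattern, text) for pattern in SENSITIVE_PATTERNS)
```

The key design point the ruling implies is the ordering: flagging happens before publication, so a positive match routes the ad into a review queue instead of onto the live site.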
The ruling further translated the GDPR’s prohibition on processing sensitive data without a valid legal basis into a direct operational duty for platforms. When an ad contains sensitive information about another person, the operator must now take reasonable steps to verify that the user has obtained that person’s explicit consent or can rely on another legal justification. This effectively requires platforms to build and implement identity and consent verification procedures for any high-risk postings, a complex and resource-intensive task. As a direct consequence of this verification duty, if a user cannot provide a valid legal basis for publishing another individual’s sensitive data, the platform has an affirmative obligation to refuse or block the publication of the advertisement. This moves platforms from the role of a neutral bulletin board to that of a gatekeeper responsible for upholding the data rights of individuals who may not even be users of the service.
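The gatekeeping logic described above can be reduced to a simple decision rule: third-party sensitive data is publishable only if a verified legal basis exists. The sketch below illustrates that rule under stated assumptions (the data model, field names, and boolean simplification of Article 9 conditions are all hypothetical; verifying consent in practice is far harder than checking a flag):

```python
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    PUBLISH = "publish"
    BLOCK = "block"

@dataclass
class AdSubmission:
    # Does the ad include sensitive data about someone other than the poster?
    third_party_sensitive_data: bool
    # Has explicit consent from that person been verified? (Art. 9(2)(a))
    explicit_consent_verified: bool
    # Does any other Article 9(2) condition apply?
    other_legal_basis: bool

def gatekeep(ad: AdSubmission) -> Decision:
    """Block publication of third-party sensitive data absent a
    verified legal basis; publish otherwise."""
    if not ad.third_party_sensitive_data:
        return Decision.PUBLISH
    if ad.explicit_consent_verified or ad.other_legal_basis:
        return Decision.PUBLISH
    return Decision.BLOCK
```

Note that the default for sensitive third-party data is refusal: absent affirmative verification, the ad never goes live, which is exactly the inversion of the old notice-and-takedown posture.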
No Hiding Behind Old Liability Shields
Addressing the high-risk nature of publishing sensitive data, the CJEU also invoked a controller’s obligations under Article 32 of the GDPR, which mandates the implementation of security measures appropriate to the risk. The Court specified that an operator’s duty extends to taking active steps to prevent the unauthorized dissemination of this data beyond its own platform. This includes implementing technical measures, “as far as technically possible,” to block or hinder the copying and reproduction of sensitive information, such as images or text, by third-party scrapers and bots. This part of the ruling directly tackles the problem of viral proliferation, where content removed from one site continues to cause harm after being republished elsewhere. It places a new burden on platforms to not only manage content on their own site but also to actively work to contain it, recognizing the borderless nature of the modern internet.
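One concrete, widely used measure in this category is serving sensitive listings with crawler directives that forbid indexing, caching, and image reuse. The helper below is a hypothetical sketch (the function name and structure are invented for illustration); the header values themselves are standard directives under the Robots Exclusion conventions, though they bind only compliant crawlers, so they would sit alongside rate limiting, watermarking, and bot detection rather than replace them:

```python
def protective_headers(ad_is_sensitive: bool) -> dict:
    """Assemble HTTP response headers for a listing page. Sensitive
    listings additionally signal crawlers not to index, archive,
    snippet, or index images from the page."""
    headers = {"Cache-Control": "no-store"}  # discourage intermediary caching
    if ad_is_sensitive:
        headers["X-Robots-Tag"] = "noindex, noarchive, noimageindex, nosnippet"
    return headers
```

Because such directives are honoured voluntarily, they address the "as far as technically possible" qualifier in the ruling: they hinder lawful republication and search-engine propagation, but determined scrapers require additional, active countermeasures.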
In its final and most definitive clarification, the CJEU ruled on the interplay between the two key legislative frameworks at the heart of the case. The Court held unequivocally that the liability exemption for hosting providers under the E-Commerce Directive cannot be invoked to shield an online marketplace operator from its direct responsibilities as a data controller under the GDPR. It explained that the two legal frameworks operate in parallel and serve different, though complementary, purposes. While the E-Commerce Directive may limit an intermediary’s liability for certain types of illegal content, such as defamation or copyright infringement, the GDPR imposes a distinct and more demanding set of positive obligations on any entity that qualifies as a controller of personal data. Therefore, even if a platform might benefit from the liability shield in other contexts, it remains fully subject to the requirements, responsibilities, and potential liabilities of the GDPR whenever the content it hosts includes personal data. This judgment closed a significant legal gray area and affirmed the primacy of data protection law in the digital sphere.
