Policymakers are prioritizing the rapid adoption of the Digital Markets Act (DMA). The DMA introduces new obligations intended to promote competition in markets where technology firms, often foreign, are perceived as dominant and accused of stifling innovation.
Data collection powers the market dominance of these alleged “gatekeepers.” In response, a key DMA provision will require gatekeepers to share their data: if data circulates, European policymakers believe, innovation will be stimulated.
Yet crucial questions remain unanswered about the DMA’s consistency and compatibility with other European legislation. In particular, a potential conflict looms with the continent’s landmark privacy standard, the General Data Protection Regulation (GDPR).
The DMA forces gatekeepers to obtain consent before combining data. The GDPR, however, provides other legal bases for such processing and recognizes legitimate reasons for combining data, for example, to prevent fraud or bolster security. These positive use cases do not fit well with consent. Gatekeepers could be prevented from analyzing data from multiple sources to detect attacks, weakening the overall security of platforms’ digital ecosystems.
The DMA also requires gatekeepers to offer customers “continuous and real-time access” to their “aggregated and non-aggregated data,” free of charge. There is no requirement, nor often even the ability, for gatekeepers to know how these customers plan to use the data. GDPR violations could proliferate.
When users’ personal data is shared, the GDPR requires companies to tell them exactly who are the “recipients” of their data. But how can this happen if the recipients are unknown at the time that the platform initially collects the data?
Similarly, GDPR rules state that personal data must be collected for a specific use, and not be processed subsequently for another “incompatible” purpose. If this obligation is breached, it is a criminal offense under French law. How can a company receiving data from a gatekeeper platform ensure that its use remains “compatible”?
The DMA is designed to help small and medium-sized companies. But few have the legal resources to carry out this complex compatibility analysis. If a recipient of data uses it for an incompatible purpose, will the platform be held liable as an accomplice?
The GDPR requires all companies that collect personal data to ensure the security of that data throughout its entire life cycle. Platforms must build data protection into their services and interfaces by design. This obligation sits uneasily with the requirement to give customers access to their data “in real-time.”
What will happen when a security breach caused by a cyberattack affects a data recipient? Will that recipient company be the only one held liable, or will the platform be held jointly liable? And, since these are potentially criminal offenses, will we end up sending tech employees to prison? Paradoxically, these uncertainties increase the risk that small and medium-sized companies receiving data, the ones that the DMA aims to support, will fail to comply with the rules and commit criminal offenses.
An even larger-scale risk could arise if this “aggregated and non-aggregated data” is obtained by malicious actors, such as a foreign troll farm or a government-sponsored cyberattack. The DMA does foresee safeguards to prevent malicious apps from being installed on an operating system, but it includes no such safeguards when it comes to data relating to EU users or the prevention of fraud and cyberattacks.
These legal uncertainties damage trust, which is critical for both data sharing and innovation.
Some solutions could be contemplated. Gatekeepers could be authorized to defer or suspend access to data (whether personal or non-personal) in case of a serious and documented doubt about the level of security provided by the business partner. To ensure proper regulatory oversight, such a suspension would be followed by a notification to the Commission.
Another solution could be codes of conduct or data-sharing frameworks specifying the rules applicable to data sharing and codifying the privacy and security obligations of gatekeepers and recipient organizations. Such a code of conduct would help small and medium-sized companies access data without bearing the cost of determining the proper level of security and carrying out the compatibility analysis.
These safeguards could bring much-needed clarity as governments move to regulate Big Tech.
Yann Padova is an attorney at Baker McKenzie and former secretary-general of CNIL, France’s data protection authority.