When Europe put in place the world’s strictest rules on Internet privacy in 2018, the General Data Protection Regulation, known as the GDPR, spread around the globe and became a gold standard for protecting personal data. The US, by comparison, has no federal data protection law.
Today, Europe’s new emphasis is on clearing away overregulation and spurring competitiveness. Its recently proposed Digital Omnibus includes a haircut to GDPR. Rather than treating nearly all data-related activity as a privacy issue, the revisions attempt to separate harmless data processing from true privacy harm.
The reforms have triggered familiar reactions. Industry calls them overdue relief. Civil society derides them as a dangerous rollback. The European Commission insists they represent simplification, not deregulation, and points to an estimated €5 billion in reduced compliance costs by 2029.
GDPR governs data processing, and data processing is everywhere. Payroll systems, customer support logs, spam filters, authentication tokens, billing records, fraud detection, security monitoring — the daily mechanics of a digital economy.
Privacy harms are different. They arise from specific dangerous practices such as surveillance, profiling, behavioral manipulation, biometric identification, and location tracking.
The proposed reforms separate the two categories. They do so in limited but deliberate ways, by easing some requirements where data use is low-risk and technical, without touching the core rules that apply to tracking, profiling, or decisions about people. Companies would still have to act responsibly and transparently, but some routine operations would no longer automatically trigger the heaviest privacy protections.
That means less pressure to treat every technical activity as a consent event, and fewer ritual consent requests for tasks that do not raise real privacy concerns but currently generate a disproportionate share of red tape. Much of GDPR compliance has become less about solving genuine privacy problems than about pushing paper.
Consider cookies. Whenever a European navigates to a website, a cookie banner appears. To clear the screen, users click Accept All. A click made out of consent fatigue is not a meaningful choice.
The reforms would take a first step toward ending the endless banner treadmill. Privacy advocates who complain about weak enforcement of the GDPR treat any reduction in consent friction as suspect, as if the banner ritual itself represented the core of privacy protection.
One of the most sensitive changes concerns how the law decides when data counts as “personal.” Under the GDPR, data is protected if it can be linked to a person — but the rules have never been clear. In practice, almost any data can be treated as personal if someone, somewhere, could theoretically identify an individual.
This stretch has pulled a vast amount of routine technical information into the GDPR’s scope. The proposed reforms narrow that uncertainty by asking a simpler, more practical question: can the company actually use the data it holds to identify a real person? If not, the data would no longer automatically trigger the full weight of privacy rules.
For the largest tech companies, these changes are unlikely to feel dramatic. They already have teams, processes, and workarounds in place. The reform may remove some paperwork and clarify a few grey areas, but it does not rewrite the rules of the game. In practice, this means incremental adjustment rather than transformation.
The reform will not make the pop-ups, notices, and paperwork disappear overnight. Many of the forces that generate them remain, and exceptions and carve-outs limit how far simplification can go. Companies will still find ways to ask for consent, and users will still be asked to click through notices – even if fewer of them. The package trims the edges without a real redesign.
Europe will not fix its competitiveness problem by trimming compliance costs around cookie clicks. But the reform matters. It nudges Europe in the right direction. For the first time, Europe admits that data processing is not synonymous with privacy harm.
Dr. Anda Bologa is a Senior Researcher with the Tech Policy Program at the Center for European Policy Analysis (CEPA).