The “free” app on your phone may be the most expensive thing you use. In Europe, three scholars argue the law already has the tools to call that deal unfair.
Their claim is simple and provocative: stop treating consumer law and data protection as separate worlds. In data-driven markets, many deals hinge on personal data, so the two legal toolkits should work together to police the whole bargain. That marriage—what they call “data consumer law”—could change how “free” services are judged.
Two toolkits, one market
Consumer law is about fair contracting: the balance of rights and duties in a transaction. Data protection law (anchored in the GDPR) is about fair processing: when, how, and why personal data—any information that relates to an identifiable person—can be collected and used.
The GDPR lays down pillars for fair processing: lawfulness, fairness, transparency, purpose limitation, data minimization, accuracy, storage limits, security, and accountability. Companies need a legal basis (often consent, “necessary for contract,” or “legitimate interests”) and must explain clearly what they do.
Consumer law brings a different kind of muscle. It tackles unfair terms, unfair commercial practices, and information duties; it also offers contract remedies when sellers don't deliver. That is the authors' core move: let data protection set the floor for data use, and let consumer law judge the fairness of the overall deal.
When “free” isn’t free
A key target is the “free” label. EU rules on unfair practices say it’s misleading to describe something as free if the consumer pays with anything beyond unavoidable costs. The Commission has warned that calling a product free while failing to say how preferences, personal data, or user content are used can be a misleading practice. In short: if data fuel the business, say so.
The paper spotlights a then‑draft Digital Content Directive that treats personal data as a “counter‑performance”—the thing you give in exchange for a service. That move pulls many data-funded services into consumer law’s orbit. But it also creates friction. GDPR consent must be freely given; bundling non‑essential data as a condition of access risks invalid consent. And the draft spoke of data “actively provided,” which misses the reality of passive tracking by cookies and SDKs.
One red line remains: acknowledging data as counter‑performance doesn't turn personal data into "money." Personal data are also the stuff of fundamental rights, and any synthesis must keep that boundary intact.
Making disclosures bite
The GDPR demands clear, plain-language transparency—ideally with visuals or icons—about what data is collected, why, and for how long. But disclosures alone rarely move people; most don’t read, don’t understand, or don’t act. Consumer law makes transparency consequential.
Pre‑contract information becomes part of the bargain. If a smartwatch seller promises not to share health data with third parties and then shares them anyway, that's not just a privacy problem—it's a breach of contract. Under the draft Digital Content Directive, security and data‑handling practices can be part of "conformity": fall short, and consumers can seek repair or a refund. Information duties become levers, not fine print.
Unfair terms and unfair tactics
Standard privacy terms can be tested for fairness. Courts in Berlin treated Apple’s privacy policy as standard terms and struck clauses that forced broad data sharing or used vague purposes—an imbalance “to the detriment of the consumer.” The test maps neatly onto GDPR benchmarks such as data minimization, purpose limitation, and privacy by default.
The unfair commercial practices rules also reach consent flows and targeting. Agreeing to personal data use is a "transactional decision." German courts found Facebook's "Find Friends" feature breached data protection rules on consent and, for that reason, amounted to an unfair practice. Professional diligence includes complying with data protection rules, so a single breach can be both a data protection violation and an unfair commercial practice.
Profiling raises a deeper worry: “knowledge is power; and so is knowledge about consumers.” Personalized persuasion can exploit moments of frailty or tailor messages to hidden biases. EU law lets enforcers assess practices from the standpoint of vulnerable groups. In a world of mood‑targeted ads and timing tricks, vulnerability isn’t just age or disability; it can be created by profiling itself.
Mind the gaps
This integration isn’t painless. How should courts value data for remedies when no money changed hands? What happens if a user withdraws consent mid‑contract—can the service degrade, or was consent never “free” if withdrawal carries a penalty? The draft’s use of “data” (not just “personal data”) also blurs scope. And disclosure that a service is “free but we monetize your data” won’t suffice; consumers need meaningful specifics about data types, purposes, and sharing to make a real decision.
Still, the upside is clear. Pair data protection’s substantive limits with consumer law’s fairness lens and remedies, and “free” stops being a blank check.
Why this matters now
Connected devices, social feeds, and “free” apps fold products, services, and data collection into a single click. The authors show that Europe doesn’t need to invent a new right to tackle that bundle. It can braid existing ones: use the GDPR to set strict terms for data use, then use consumer law to test the fairness of the deal, curb manipulative design, and give people remedies that bite.
That combined approach makes the opening claim real: the app may be “free,” but the law can still insist on a fair price.