Every single time I see a cookie banner, I click reject/deny/close/X... whatever I can do to make that heinous popup get out of my way without accepting. Sometimes, when I can't deny it (which is illegal in Europe), I leave the website because I immediately distrust it and respect it less.
And, as someone who runs multiple websites, I know how frustrating this is for the other side of this equation as well.
Content creators, marketers, data analysts, ad people, and everybody else who owns and maintains a site need this data to produce ROI, validate attribution, and show that what they do has value at all.
And as such, I know that my data is worth something.
It's worth something to people like me, who want to use it to know who reads my stuff, where they live, what they like or don't like, so I can make better content.
So then why do I do it? Why do I deny hard-working marketers my data, even though I know they're probably just using it to serve me ads or stick my email address in a different contact segment so they can try to convince me to buy stuff?
Simple: my data is a currency that I have, and that you want. Why would I give it to you for free when I don't feel compelled to do so? Why do you expect that I do so without questioning it? Because I always have in the past?
Is marketing personalization the enemy?
Personalization is on the tip of every marketer's tongue these days - including ours - but consumers won't accept personalization without a bit of quid pro quo.
As website visitors and social media users, we have a lot of quid, but not a lot of quo.
The tension here is obvious. People say they care about privacy, and they do. At the same time, they expect relevance. They want content, offers, and experiences that reflect their needs.
That’s not contradictory. It’s conditional. It's transactional. They’re willing to share information when the return is clear, immediate, and meaningful, in the same way that they would exchange cash.
What they reject is asymmetry, and a fair exchange is something companies often fail to deliver.
We talk about tailoring experiences, but in practice, a lot of personalization shows up as slightly improved targeting or marginally more relevant recommendations. Mildly useful, but not transformative. Certainly not enough to justify handing over behavioral data to a brand you just met ten seconds ago.
And users know that.
We know that when you say "personalized experience" that you mean “more efficient advertising.”
And nobody wants this to begin with. Personally, I use an ad blocker, so I never see this personalization.
So for marketers, and the people providing marketing technology to those marketers, the question is whether the experience we're offering is actually worth what we're asking the user to trade for it.
That’s a much harder problem, because it forces a shift in how we think about data.
Instead of starting with the data you want to collect, you have to start with the experience you want to create. Then you work backward and ask what information is genuinely required to deliver that experience. Not what would be nice to have, not what your tools can process, but what is essential.
That distinction naturally limits data collection to what can be justified in user-facing terms. If you can’t explain why a piece of data improves the experience in a way the user can perceive, it becomes very difficult to argue for collecting it at all.
This is where the idea of reciprocity becomes more than a nice principle. It becomes a design constraint.
Is this a content design problem?
If data is a currency, then every request for it should come with a clear exchange. It cannot be buried in a privacy policy, nor implied through generic language, but made explicit in the interaction itself. “Give us this, get that.” And the “that” has to be something the user actually cares about.
Content design is how we convey the value of our exchange to the user in a way that makes them comfortable with the decision to trade us their data.
And it has to be served to them in an ethical, legal (as in, follow the GDPR rules!), and most importantly, immediately recognizable way.
In practice, that could mean gating certain features behind consent in a way that feels fair rather than coercive. It could mean offering genuinely improved functionality, not just improved tracking.
It is my opinion that ad personalization is not a good form of reciprocity because nobody wants ads to begin with.
When given the option, users will always choose their own experience over your optimization. Companies that choose to optimize around their own metrics instead will find users making that choice for them.
This is also where a lot of “first-party data strategy” conversations miss the mark.
Shifting away from third-party cookies is often framed as a technical or regulatory necessity. And it is. But the deeper change is behavioral. You no longer get data by default. You get it by contract.
That agreement is fragile.
It depends on how every interaction feels to the user, starting with the first banner and continuing through every touchpoint that follows. If any part of that experience suggests that the exchange is one-sided, the entire strategy starts to erode. People disengage, provide false information, or simply opt out whenever possible.
Which brings us back to the original question.
What is the user actually missing out on when they click “deny”?
If the honest answer is “not much,” then the problem isn’t user behavior. It’s the content experience we’ve designed. We're not shipping an experience sufficiently worth trading for, whether that comes in the form of content quality, perks, or simply the way that our banners convey value.
And that has consequences beyond consent rates.
Email signups, account creation, preference centers, loyalty programs. Every one of these is a moment where you ask for data in exchange for something. If the value is unclear or underwhelming, the interaction becomes transactional in the worst way. Users give the minimum, or they walk away.
Over time, that erodes the quality of the data you do collect. It becomes incomplete, outdated, or inaccurate. Which then feeds back into weaker personalization, creating a cycle that’s hard to break.
The alternative is more demanding, but also more durable.
Treat user data like a currency with a real price, not a free resource with a legal disclaimer. Treat it like money your users have, and that you want.
If you believe the data you’re asking for has value, you need to “pay” for it in a way the user can see and feel immediately. Consumers are waking up to this fact at an increasing rate, and AI personalization is adding a creepiness factor that accelerates both data collection and consumer apprehension. To continue down this road, we need to change tactics.
Otherwise, it’s coercion. You've fabricated a hostage situation and tried to convince the hostage that it's all good. You've indicated to them that they have no leverage, an assumption companies have relied upon for years: the willingness or obliviousness of customers to fork over data without any notion of what that means.
But as their trust dries up, so too does your data.
The only way users can judge whether the exchange is fair is through the experience in front of them.
“Better experience” usually translates to slightly improved targeting or recommendations. That’s not compelling. It’s not enough to justify giving up data, especially in the first interaction.
The payoff has to be immediate, visible, and meaningful. Immediate means the benefit shows up right away. Visible means I can tell what changed. Meaningful means it affects something I actually care about.
That can mean the experience becomes more streamlined, or "premium". It can mean access to something I couldn’t get otherwise, in a way that feels fair right away. Or it can mean real control over what I see and how I engage with the content on the page.
What it can’t be is ads alone.
Ad personalization is a weak form of reciprocity because it benefits you first. If I don’t care about ads or actively block them, your entire value proposition collapses.
Which leads to the uncomfortable part: the customer experience you build has to work without data first.
This is where most teams get stuck, so let’s make it concrete.
What to do instead
If you ask for an email, you don’t promise “updates.” You immediately tailor what I see next (categories, frequency, format) based on what I just told you. This isn't an unfamiliar concept: plenty of companies let contacts choose the content they're interested in receiving, but the customer often doesn't know that until they go to unsubscribe from a mailing list, at which point you've already lost them.
If you ask about preferences, you reflect them on the very next screen, not in some future campaign. Show me fewer irrelevant options. Reorder the page. Remove friction I would otherwise have to deal with.
Make concrete UX promises. Tell them in your cookie popup that they won't see further popups, and that their settings and reading progress will be remembered.
None of this is complicated or requires new mechanics or tools. It just requires you to tie every data request to an immediate, observable change in the experience.
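To make the "immediate, observable change" idea concrete, here's a minimal TypeScript sketch of one such exchange: a reading-progress perk that only works when the user grants functional consent. All names here (`Consent`, `saveReadingProgress`, `resumePosition`) are hypothetical illustrations, not any real consent library's API, and an in-memory map stands in for `localStorage` so the sketch runs anywhere.

```typescript
// Hypothetical sketch: gate a visible perk behind consent, so the user
// gets an observable benefit the moment they opt in.
type Consent = { functional: boolean };

// In-memory stand-in for the browser's localStorage.
const store = new Map<string, string>();

function saveReadingProgress(consent: Consent, articleId: string, paragraph: number): boolean {
  if (!consent.functional) return false; // no consent, no storage, and no perk
  store.set(`progress:${articleId}`, String(paragraph));
  return true;
}

function resumePosition(consent: Consent, articleId: string): number {
  if (!consent.functional) return 0; // denied users simply start at the top
  return Number(store.get(`progress:${articleId}`) ?? 0);
}

// The exchange is explicit: consent buys "pick up where you left off".
const granted: Consent = { functional: true };
saveReadingProgress(granted, "data-as-currency", 12);
console.log(resumePosition(granted, "data-as-currency")); // 12

const denied: Consent = { functional: false };
console.log(resumePosition(denied, "data-as-currency")); // 0
```

The design point is that the banner copy can promise exactly what this code does ("accept, and we'll remember where you stopped reading"), and the user can verify the claim on their very next visit.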
Before asking for data, you should be able to answer clearly what the user gets in return, right now.
Because if your site still works perfectly when I click “deny,” then your data strategy is optional.
As users, and as content creators and marketers, we must acknowledge that data is a bartering tool, that we are right to guard it carefully, and that we should spend it wisely.