Søren H. Dinesen, CEO of Digiseg, delves into the privacy dilemma as cookie deprecation raises new concerns about consumer expectations. From the early days of contextual ads to the rise of identity resolution graphs, Dinesen unpacks how the ad tech industry continues to track users despite privacy regulations. Are we truly anonymous, or is it all just a myth?
In the introduction to this series, I raised the concern that the targeting, measurement and attribution methods arising in the wake of cookie deprecation won’t meet consumers’ expectations of privacy. It’s a critically important issue, and one worth exploring in depth. This article does just that.
The Rise of the New Tracking Cookie
In the early days of digital advertising, nearly all ads were contextual; Google AdSense assessed a web page’s content, and if it matched the topic of an ad creative, Google would fill the impression. The challenge was that contextual targeting back then was rudimentary, leading to horribly embarrassing and often brand-unsafe placements. A few memorable ones include:
- A “put your feet up” ad for a travel company appeared next to an article titled “Sixth Severed Foot Appears Off Canadian Coast” on CNN.
- A VacationsToGo.com banner ad appeared over a photo of a cruise ship that had sunk in Italy.
- An ad for Aflac, the insurance company whose mascot is a duck and whose tagline is “We’ve got you under our wing,” appeared next to a story about anatidaephobia, the fear that one is somehow being watched by a duck.
Marketers naturally wanted better tools for targeting, and deservedly so. By the mid-2000s, Web 2.0 was in full swing, with consumers increasing the amount of time they spent online and on social media, generating vast amounts of data. For marketers, it was the start of the data-driven revolution.
That revolution was powered by private signals: any and all signals tied to an individual that allow the industry to follow consumers as they go about their digital lives, whether that’s surfing the web, using apps on their mobile devices, or streaming content via their smart TVs or radios.
Initially, the main tracking device was the third-party cookie: small identifiers dropped into the browser, unbeknownst to the user, so their every move could be logged and their future behavior monetized.
Ad tech companies and agencies retrieved that data from consumers’ browsers and used it to make assumptions about people: users who visited a parenting site were women aged 25 to 35; users who read about new automobile models were actively in-market for a new car.
Here’s a true story about an American on the Digiseg team: She signed onto her health insurance account to check on something. Later, she saw an ad on Facebook that said something like, “Dr. Smith is in your healthcare network, schedule an appointment today.” This was far from a unique event.
For everyday citizens, the message was clear: We’re watching you. For many, installing ad blocking software was an act of desperation. Such software didn’t end tracking, but at least users weren’t reminded of how closely they were under the microscope of entities they didn’t know.
Consumers complained, of course. More importantly, they demanded that regulators in their home states end tracking. For the industry, that meant finding a replacement for third-party cookies, but not for tracking users.
But — and it’s a big one — blocking cookies and ceasing the tracking of users in this industry seem to be two different things, though why that is the case is beyond us. Users still emit private signals as they go online, and the industry is still collecting them. Consumers still have no control over the matter, which means brands and ad tech companies still follow them around, whether they like it or not.
The new crop of tracking signals stems from the user’s device or from the individual consumer, such as hashed emails and CTV device IDs. Worse, these signals make it more difficult for users to protect their identity from prying eyes.
Identity Resolution Graphs
Identity resolution graphs are seen as an important step forward in consumer privacy protection, but whether they respect a user’s desire for anonymity is up for debate. These databases are built from a vast array of identity signals: email, device ID, cookie data, CTV ID, work computer, home computer, and even physical address. An identity resolution graph connects all known signals to a single ID that typically represents an individual consumer.
The benefit of ID graphs is that they allow marketers and data companies to “recognize” users across multiple IDs. Let’s say a site invites users to register for a free account and collects the user’s email address (i.e. first-party data). Next, the site purchases an ID resolution graph so it can recognize those users when they visit via a mobile device or a work computer.
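To make the mechanics concrete, here is a minimal, hypothetical sketch of how such a graph links signals to a single person ID. The class name, signal values and person IDs are invented for illustration; this is not any vendor’s actual implementation.

```python
# Illustrative identity resolution graph. All identifiers are made up;
# real graphs are assembled from billions of observed signal pairs
# (email <-> device ID, cookie <-> CTV ID, and so on).

class IdentityGraph:
    def __init__(self):
        self._signal_to_person = {}  # e.g. "hashed_email:1a2b3c" -> "person_0000"

    def link(self, signal_a: str, signal_b: str) -> None:
        """Record that two signals were observed belonging to the same person."""
        person = (self._signal_to_person.get(signal_a)
                  or self._signal_to_person.get(signal_b)
                  or f"person_{len(self._signal_to_person):04d}")
        self._signal_to_person[signal_a] = person
        self._signal_to_person[signal_b] = person
        # (A production graph would also merge clusters when two known
        #  persons turn out to be the same individual.)

    def resolve(self, signal: str):
        """Return the single person ID a signal maps to, if known."""
        return self._signal_to_person.get(signal)


graph = IdentityGraph()
graph.link("hashed_email:1a2b3c", "cookie:web_4d5e")     # home laptop
graph.link("hashed_email:1a2b3c", "device_id:ios_6f7g")  # mobile phone
graph.link("device_id:ios_6f7g", "ctv_id:tv_8h9i")       # living-room TV

# A site that only ever collected the email can now "recognize" the same
# user arriving from the phone or the TV, without asking again.
print(graph.resolve("ctv_id:tv_8h9i"))  # same person ID as the hashed email
```

The point of the sketch is the asymmetry it exposes: the linking happens entirely on the vendor’s side, so the user never sees, approves, or can undo any of these connections.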
Are there benefits for the user? Yes, because it allows the site to know the user and display content of interest. But wouldn’t it be better to ask the user to sign in or register on each device? Or, during the initial registration process, to ask permission to recognize them on other devices? Recognizing people without asking is exactly the type of behavior that got the industry in trouble before. How hard is it to request permission?
In worst-case scenarios, the site allows advertisers or partners to target those users across their devices — without the user’s permission or input.
The Myth of Anonymity
Signals can be anonymized: emails can be hashed, and device IDs can be hidden in data clean rooms. But how relevant is that anonymity if the signal can still be used to track users without their permission and for purposes they never agreed to? We forget that cookie data was also “anonymized,” yet consumers still complained vociferously about being tracked.
The new private signals don’t even guarantee anonymity. Take hashed emails, which aren’t so private when every company applies the same hash function. The resulting value becomes a shared identifier that lets anyone recognize a hashed email as the consumer who, say, purchased this dog food or subscribes to that streaming service.
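To see why, consider that a hash function is deterministic: every company that hashes the same normalized address gets the same output, so the “anonymous” value doubles as a stable cross-company identifier. A minimal sketch, assuming SHA-256 and a made-up address:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize and hash an email address the way matching pipelines typically do."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Two unrelated companies hash the same (hypothetical) address independently...
pet_store_record = hash_email("Jane.Doe@example.com")    # bought dog food
streaming_record = hash_email(" jane.doe@example.com ")  # has a subscription

# ...and get identical values, so the "anonymous" records join perfectly.
print(pet_store_record == streaming_record)  # True: same person, linkable across datasets
```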
As I mentioned in the first article in this series, this level of tracking is all in pursuit of one-to-one marketing, which itself is a bit of a myth.
Digital as Mass Media
We’re pursuing a find-and-replace option for cookies, and in doing so, we are ignoring effective and truly privacy-compliant options in front of us: one-to-many ad campaigns. Two such options:
Contextual targeting, which has come a long way since the days of Google AdSense. We have numerous AI solutions to help avoid brand-unsafe placements, including natural language processing, sentiment analysis and computer vision, which can assess the true content of an article and place ads accordingly. This segmentation method is inherently anonymous, eschews every form of tracking, and can achieve massive scale with the right approach.
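As a rough, hypothetical illustration of the principle rather than any specific product: a contextual engine scores the page itself and never touches a user identifier. The category keywords and the unsafe-term list below are invented for the example.

```python
# Illustrative keyword-based contextual classifier. Production systems use
# natural language processing, sentiment analysis and computer vision; the
# principle is the same: only the page content is examined, never the reader.

CATEGORY_KEYWORDS = {
    "travel":     {"cruise", "vacation", "flight", "resort", "itinerary"},
    "automotive": {"sedan", "horsepower", "dealership", "ev", "mileage"},
}

UNSAFE_TERMS = {"disaster", "sank", "severed", "fatal"}  # hypothetical blocklist


def classify_article(text):
    """Return a matching ad category, or None if the page is brand-unsafe or off-topic."""
    words = set(text.lower().split())
    if words & UNSAFE_TERMS:
        return None  # skip the impression entirely
    scores = {cat: len(words & kws) for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None


print(classify_article("Our cruise itinerary includes three resort stops"))  # "travel"
print(classify_article("The cruise ship sank off the Italian coast"))        # None
```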
Another option is using offline demographic data that is collected, verified and anonymized by national statistics offices, ensuring it is both accurate and privacy-compliant. Going further, with modern modeling and methodology, entire countries can be segmented into neighborhoods of as few as 100 households.
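A minimal sketch of what one-to-many targeting on such data could look like; the area codes and attributes below are hypothetical stand-ins for the aggregated, statistics-office-derived segments described above.

```python
# Hypothetical neighborhood-level segments: each key is an area of roughly
# 100 households, described only by aggregated attributes; no individual IDs.

NEIGHBORHOOD_SEGMENTS = {
    "DK-2100-017": {"households": 112, "dominant_income": "high",   "children_share": 0.41},
    "DK-8000-203": {"households":  97, "dominant_income": "medium", "children_share": 0.18},
}


def audience_for_campaign(min_children_share):
    """Select neighborhoods (not people) whose aggregate profile fits the campaign."""
    return [
        area
        for area, stats in NEIGHBORHOOD_SEGMENTS.items()
        if stats["children_share"] >= min_children_share
    ]


# A family-oriented campaign is delivered to matching areas as a group,
# so no single household or person is ever singled out.
print(audience_for_campaign(min_children_share=0.30))  # ['DK-2100-017']
```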
Ultimately, the evolution of digital ad tracking reflects the ongoing tension between technological advancements and privacy concerns. As the ad tech industry continues to innovate, the challenge lies in balancing effective marketing strategies with the imperative to respect user privacy. By embracing more privacy-compliant options such as advanced contextual targeting and offline demographic data, the industry can pave the way for a future where digital advertising is both effective and ethical. As we navigate this new era of surveillance capitalism, the need for transparency, user consent, and robust privacy protections has never been more critical.