It’s a harrowing time.
First, we were hit with the COVID pandemic, and in recent weeks, protests and discussions about racial inequities in America have taken center stage.
Businesses are questioning their operating practices (or sometimes being called out for them and canceled) and uncovering how unconscious biases might play a role in everything they do—from recruitment to product development to marketing. The world of digital media and advertising has long been fraught with its own set of diversity challenges, but maybe now is a time when actionable plans and strategies can be put in place to bring about real change.
Earlier this week, in our Wrapper newsletter, I brought up the topic of ad targeting bias, nudging the industry to start thinking about a solution. On a visit to Twitter later that day, I discovered that I wasn’t alone.
Right in my feed, Gizmodo’s enterprise reporter, Shoshana Wodinsky, asked the question: “are there any concrete digital ad rules wrt racially targeting/tracking a given user? outside of fb’s 2018 settlement, i’m not seeing…………….. much of anything.” To which she added, “genuinely asking here, bc after doing 20 minutes of digging into the ways data orgs infer (or explicitly pull) racial data from……………………. all of us, i’m kinda getting genuinely concerned.”
Her question prompted deep conversations about weblining and digital redlining, terms for the way personalization can produce discrimination and bias in exactly which advertising is presented to an individual based on demographics such as race, gender or location. Weblining itself isn’t illegal, but it’s closely related to the practice of redlining—explicitly denying services or varying prices based on race or neighborhood—which most certainly is.
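To see how ordinary-looking personalization logic shades into weblining, consider a deliberately simplified sketch in Python. Everything in it is hypothetical (the field names, the ZIP codes, the offers), but the branching pattern is the point: no line mentions race, yet the inferred segments it keys on can act as proxies for it.

```python
# Hypothetical sketch of "weblining." No real ad server works this way
# verbatim; the field names and values below are invented for illustration.

LOAN_OFFERS = {
    "prime":    {"apr": 4.9,  "creative": "low-rate-refi"},
    "subprime": {"apr": 19.9, "creative": "high-fee-loan"},
}

# Invented ZIP codes standing in for historically redlined neighborhoods.
DISFAVORED_ZIPS = {"00001", "00002"}

def select_offer(profile: dict) -> dict:
    """Pick which loan ad a user sees based on inferred traits.

    Fields like inferred_income_band and zip_code are the kinds of
    segments data brokers sell. Branching on them never names race,
    but the outcome can track it anyway.
    """
    if profile.get("inferred_income_band") == "high":
        return LOAN_OFFERS["prime"]
    if profile.get("zip_code") in DISFAVORED_ZIPS:
        return LOAN_OFFERS["subprime"]
    return LOAN_OFFERS["prime"]

print(select_offer({"inferred_income_band": "low", "zip_code": "00001"}))
# -> {'apr': 19.9, 'creative': 'high-fee-loan'}
```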
These are the kinds of instances where personalization can go terribly wrong.
AdMonsters friend Aram Zucker-Scharff, Ad Engineering Director for RED at the Washington Post, wrote one of the most informed and thoughtful responses to Wodinsky’s question. It was so good that we asked him if we could post his Twitter thread in its entirety. Here it is:
Like the obvious ones are redlining of specific discount offers for loans against geographic regions, something Facebook was called on to deal with, but is available via the rest of the advertising ecosystem. Job offers are the other obvious one…
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
I mean if you don’t think that some jobs are targeted exclusively against age specifically or representative demographic categories that proxy for age generally then you have never talked with a recruiter for SV.
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
But specific user category targeting that drills down to the ad tech level can have multiplier effects on bad behavior that marketing does and has done for a long time in terms of assumed demographics. That’s a real complex sentence so it is worth explaining broadly:
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
Marketers have always sought and designed against demographics. The idea in the past was they could find broad proxies and they were often right, but the difference was when you design an ad to say proxy for targeting men by showing up in a sports mag, that is a permeable barrier
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
You could be a lady who liked sports, or you could know that there were coupons for buying programming kits in Sports Illustrated and go pick one up, or you could stumble upon it…
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
(Newspapers made those types of assumptions when they counted their readership based on multiple reads per household–different readers per one house–though that didn’t always translate to advertisers beyond raw numbers.)
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
But in the world of modern demographically targeted ad technology this type of user-focused targeting is a barrier that isn’t easy to break through. If a marketer decides gaming is a dude thing and targets ads to dudes, it is a lot harder to build a profile that fits as a user.
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
In her article, Anjin Anhut talks about how even the basic demographic targeting of the print era creates cycles of reinforcement both in the marketing campaign and in the way products are designed. https://t.co/JVIsoTvMsy
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
The cycle of marketing leading to more exclusive marketing leading to more stratified products is one that is both accelerated and tightened for the modern era of digital ad tech, and that can have proxy discriminatory results we don’t think about at first.
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
I don’t think anyone would disagree if I were to assume that a lot of gamers found their way into lucrative programming and computer science fields because of their start in the world of gaming…
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
Now we have some nasty proxy logic building when we follow that thinking to its conclusion… how many mundane demographics act as gateways to opportunity? You can likely think of quite a few. Now those mundane demographics that are being marketed to are harder to see than ever.
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
Now think of how marketers portray women and people of color in their campaigns. Now think of how those campaigns are targeted against seemingly mundane topics, now think about how those mundane topics might lead to less mundane work, opportunities, career connections…
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
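That proxy logic is worth pausing on. Here’s a hedged, toy illustration of the idea: the interest segments and the percentages below are invented, not real platform data, but they show how a campaign that never targets gender can still reach a sharply skewed audience.

```python
# Toy illustration of proxy targeting. The interest segments and the
# male-share figures are invented for the example, not platform data.

INTEREST_SEGMENTS = {
    "pc_gaming":       0.80,  # assumed share of segment that is male
    "diy_electronics": 0.75,
    "knitting":        0.10,
}

def estimated_male_share(targeted_tags: list) -> float:
    """Rough male share of the audience a campaign actually reaches,
    averaged across the interest segments it targets."""
    shares = [INTEREST_SEGMENTS[tag] for tag in targeted_tags]
    return sum(shares) / len(shares)

# A campaign for, say, a coding bootcamp targeted at gaming segments never
# mentions gender, yet under these assumed correlations it reaches a
# roughly 78% male audience -- and the opportunity it advertises skews too.
print(estimated_male_share(["pc_gaming", "diy_electronics"]))  # 0.775
```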
Too much of the world of ad tech allows explicitly race-based demographic targeting. That logic inevitably leads to bad behavior and redlining. But that doesn’t even touch upon how higher barriers to even seeing some marketing due to demographic proxies might have an impact.
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
Housing is, unfortunately, only the most obvious horror here. https://t.co/ItzIU0VtGV
— Aram Zucker-Scharff (@Chronotope) June 9, 2020
And Facebook is only the most obvious offender in this process. https://t.co/WylqwLVqQq
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
The ad categories formed from demographics get ever more precise with user targeting on the web, and even worse they inform the algorithms that present what they think you are interested in. The greater marketing’s precision, the more dangerous it becomes to equal opportunity.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
The truly frightening thing that Facebook has brought into the ad tech marketplace is not just its precision, which every ad tech targeting firm seeks to duplicate, but its pricing model around precision.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
Facebook has created a world where targeting is made more precise as a *savings measure*. This is a big change because if you wanted to buy a specific audience in the print era (say hobby knitters) you would have to pay big bucks for their specialty publication…
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
But if you’ve ever run a campaign on Facebook you’ve been reminded that the more *precise* you can make your targeting, the *cheaper* it is for you to reach your audience. It makes discriminatory targeting available to even the smallest budget. That’s not good.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
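The arithmetic behind “precision as a savings measure” is simple enough to sketch. The numbers below are made up, and the model ignores real-world complications such as higher CPMs on premium segments; the point is the shape of the incentive.

```python
# Back-of-the-envelope math on precision targeting. All numbers invented;
# real auctions are messier, but the incentive points the same direction.

CPM = 5.00           # assumed flat cost per 1,000 impressions, in dollars
TARGET_SHARE = 0.02  # assume your buyers are 2% of the general population

def cost_per_1k_in_target(match_rate: float) -> float:
    """Cost to reach 1,000 in-target users when match_rate is the
    fraction of purchased impressions that actually hit the target."""
    impressions_needed = 1000 / match_rate
    return (impressions_needed / 1000) * CPM

print(f"${cost_per_1k_in_target(TARGET_SHARE):.2f}")  # broad buy:   $250.00
print(f"${cost_per_1k_in_target(0.90):.2f}")          # precise buy:   $5.56
```

Under these assumptions, sharpening targeting from a 2% match to a 90% match cuts the effective cost of reaching the audience by a factor of about 45, which is exactly why buyers of every size are pushed toward ever-narrower segments.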
This is the core of demographically-based precision web targeting. It inherently asks you to look at groups, measure their value, & decide 1 group is more valuable than another.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
That inevitably boils down to discrimination in some form when ad tech makes it hard to penetrate into a different group, by design.
The ethics of this are obviously bad when advertising jobs. But they’re questionable in any campaign w/these technical tools (all of them)
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
Just as bad, this standard means that either the entire world changes or the people who try to stay anonymous on the web get punished. W/o a data fingerprint you won’t get anything but scams. A small opportunity to enter the life-changing algo bucket turns into a 0 opportunity.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
Browsing in incognito mode? Behind a VPN? With anonymity software? Unless the entire set of technological assumptions ad tech is built upon changes then you’re going to be screwed when it comes to what ads and algos decide you should be presented with. Also, extra malvertising.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
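A toy auction makes that anonymity penalty concrete. The bidder names and bids below are fabricated; the mechanism is what matters: when quality demand requires a profile, an unprofiled user’s impression clears to whatever is left.

```python
# Hypothetical sketch of the "anonymity penalty." Bidder names and bids
# are fabricated; the mechanism they illustrate: quality advertisers only
# bid on users they can profile, so anonymous users get the leftovers.

BIDDERS = [
    {"name": "national_bank",      "bid": 8.00, "requires_profile": True},
    {"name": "big_retailer",       "bid": 6.50, "requires_profile": True},
    {"name": "sketchy_flash_deal", "bid": 0.40, "requires_profile": False},
]

def run_auction(user_has_profile: bool) -> dict:
    """Return the winning bidder for one impression."""
    eligible = [b for b in BIDDERS
                if user_has_profile or not b["requires_profile"]]
    return max(eligible, key=lambda b: b["bid"])

print(run_auction(user_has_profile=True)["name"])   # -> national_bank
print(run_auction(user_has_profile=False)["name"])  # -> sketchy_flash_deal
```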
When I talk about ad tech, this is why I am most concerned about the middlemen, the ecosystem. The problems of ad tech aren’t a single site or vendor, no matter how large they are; it is an entire landscape that needs regulation.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
There are over 7,000 products in the ad tech ecosystem. When the ad for the job you should have is explicitly targeted to not reach you, every single one of them is culpable. Regulation is needed b/c otherwise you could spend a lifetime in lawsuits and not make a dent.
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
And until that happens we are in an ever narrowing spiral of demographically specific targeting. pic.twitter.com/NrhAXjIrX9
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
It’s pretty gross if you put even the smallest amount of thinking into the inevitable aftereffects of this as a system. Too bad most ad tech companies and most engineers don’t bother. pic.twitter.com/rHLBjeOP5C
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
This is the dangerous mindset of most in ad tech, and we live in the desolation of their lack of care or ethical consideration. (From: https://t.co/kxmPYU2jL0 ) pic.twitter.com/8cZORKQwFL
— Aram Zucker-Scharff (@Chronotope) June 10, 2020
Hopefully, you made it all the way through Zucker-Scharff’s full thread. It’s quite the worthy read.
Wodinsky and I also had another conversation that sprang from her original question about racial ad targeting. It’s clear that targeting the right person at the right time can enable some terribly bad practices, and although they’re not illegal, they can produce some of the worst cases of injustice.
Here are a couple of follow-up questions for the ad-tech community at large: Can we have targeted advertising that is also fair and unbiased? And will it require a set of regulations to get us there?