Editor's Note: This is the second installment of the Living with Data series exploring how our online data is tracked, collected and used. Do you have questions about how your personal data is being used? Curious to learn more about your daily encounters with algorithms? Email The Decoder at thedecoder@aljazeera.net or submit your question via the form here. Screen shots and links are helpful clues!
I have owned a truck and am interested in buying another, according to my Acxiom AboutTheData.com profile. But neither of these data points is true.
Acxiom, one of the largest marketing data brokers, pools data about individuals from sources like direct-mail responses, consumer surveys, public records and browsing history. My AboutTheData.com profile incorrectly listed my parents’ address, and I wasn’t able to correct that even after contacting support directly. My guess is that the truck detail was pulled from old public motor vehicle records tied to that address, but my parents haven’t owned a truck since the 1990s.
Acxiom lets me edit or remove this data point from my profile so that I won’t be targeted on the basis of it. But I still don’t have a clear sense of how editing my profile changes my experience. How does it influence the ads I see or the segments I might be grouped into (Truckin’ & Stylin’)? How might Acxiom’s third-party customers use this information about me?
The biggest commercial Internet platforms and social Web companies are starting to take users’ data concerns seriously, offering us a peek into our data profiles. More often than not, though, these steps toward opening up the data only raise more questions.
Follow the blue arrow
In The Decoder last week, we explored the inner workings of retargeted ads that follow us around. When we come across these ads, digging a little deeper can tell us more about our personal data profiles.
Some of the retargeted ads I saw last week had a little blue arrow symbol in the corner.
That’s the AdChoices advertising option icon. The Digital Advertising Alliance (DAA) has developed this as the standard symbol to mark where we have options about the ads we see. Following this link from an ad leads to a website that looks like it was designed with Web 1.0 aesthetics: http://www.aboutads.info.
The DAA runs one of the largest consortiums in the industry to coordinate opting out of targeted data marketing. When I went to the DAA opt-out page, 102 out of 115 participating companies had left their mark on my browser.
Depending on where you encounter the arrow, clicking will take you to settings pages on the given platform, like Facebook or Google. Or it might tell you more about where the ad is coming from or what advertising technology company is behind it.
These companies are starting to reflect our data profiles back to us, giving us the opportunity to set the record straight when our browsing history doesn’t reflect our actual interests. The standard practice has been for data marketers to make all kinds of assumptions about why we do what we do. That won’t change without our input about what we really want.
So let’s take a closer look at some of the biggest and most common places where we can not just opt out but also interact with our profile data.
Interests: flowers and gangs?
Google offers a number of ways to influence the ads I see on Google and ads across the Web through Google’s ad network. The most interesting ad setting page exposes my account history and the interest categories I’ve accrued over time.
Looking at my interest profile, some entries didn’t surprise me, given my recent browsing history: an interest in ISPs, for example, because I had been looking at my Internet bill and comparing prices. (Spoiler alert: There’s no competition.) But I was a little concerned to find that I’m apparently interested in nuclear energy and in gangs and organized crime. Nothing in my browsing history explains that, not even a news article.
We get to see what our browsing history says about us to advertisers, and we get a chance to correct the record when it doesn’t match what we really want.
But I still have no idea where certain categories of interest are coming from or how they might affect my personalized ads. And I really don’t have a sense of how these interests might affect other personalized Google experiences.
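To get a feel for how a stray page visit could translate into an unexpected interest category, here is a deliberately crude sketch in Python. It is a toy illustration built on my own assumptions, not Google’s actual system, and every category name and keyword in it is invented for the example:

```python
# Toy illustration only -- not Google's actual system.
# A hypothetical keyword-to-category lookup showing how visited pages
# might get mapped to interest categories.
INTEREST_KEYWORDS = {
    "Internet service providers": ["broadband", "internet bill", "fiber plan"],
    "Nuclear energy": ["reactor", "uranium", "nuclear plant"],
    "Gangs & organized crime": ["cartel", "racketeering", "organized crime"],
}

def infer_interests(page_text: str) -> set:
    """Return every category whose keywords appear in the page text."""
    text = page_text.lower()
    return {
        category
        for category, keywords in INTEREST_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    }

# A single page mentioning cartel finances would be enough to tag the
# reader with "Gangs & organized crime" -- and the profile never says why.
print(infer_interests("Feature: how cartel money moves through shell companies"))
```

The point of the sketch is that one keyword match can quietly add a category, and nothing in the resulting profile tells you which page triggered it.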
Why am I seeing this ad?
Facebook just took a big step toward articulating the causal links between our actions and the ads we see. This summer Facebook introduced an ad preferences feature to answer the question “Why am I seeing this?”
The next time you click on the arrow or the X in the corner of a Facebook ad, a few options will appear. When I was doing research on retargeted ads, Facebook explained that one of the reasons I was seeing one ad was that Ringly wanted to reach people who had visited its website.
There are options to adjust my settings and preferences and to remove myself from the interest or profile categories that advertisers were targeting. I could tell Facebook that I didn’t want to keep seeing ads for the particular dress I had already bought. Facebook let me give my reason, “I keep seeing this,” which is effectively telling Facebook that retargeted ads aren’t working for me.
Facebook also has a pretty thorough explainer page about ads that says advertisers sometimes have broad categories or might have used overlapping interests to target audiences, so my interest input might not block those ads completely.
But I’m still seeing only one of the reasons behind an ad. So while the move toward transparency is commendable, partial transparency risks oversimplifying a very complex interaction. We change one thing, but we don’t know the effects elsewhere in our digital experience.
About the data
Last fall Acxiom opened up its trove of consumer behavior and marketing data to the public with profiles accessible on AboutTheData.com. It was an industry-leading step toward self-regulation and greater transparency to consumers, but as a beta experiment, it has its limits too.
Acxiom’s data is just one of the inputs that could go into the ads I see on Facebook, for example. Some of the data is core data, which is listed as fact, and some of it is inferred data, based on modeled insights. Both types are exposed in consumer profiles on the site, in sections including my household, vehicle and purchasing behaviors.
The website also gives consumers the opportunity to edit individual items. For example, it said that I had completed only high school, so I changed that because it seemed significant. When the site launched, Acxiom CEO Scott Howe looked at his own profile and corrected details about his mortgage. He reports that the most popular edits visitors make concern political party, income, education, marital status and occupation — all categories that go to our sense of self. It also means Acxiom was getting these things wrong somehow.
As consumers, we can begin to guess how education and mortgage details might shape the profile marketers see, but it’s hard to trace how changing them here plays out elsewhere. Marketing becomes more relevant to me with my input. But what are the effects of these individual data points in aggregate, once they are combined with other data and put to new uses in the future?
Acxiom and other marketing data companies offer the ability to opt out. But Acxiom doesn’t make it easy. The link to opt out leads to a page with at least three places to opt out of different data types. I can also choose individual elements in my profile that I don’t want to be targeted on by clicking on “Remove this data about me for online marketing purposes,” as I did with the truck information.
The Catch-22 is that telling advertisers about our preferences ensures that they know more about us, not less. Better experiences require more exposure, not less. So while the PR effect is that these companies appear more transparent, they are also enlisting us to make their advertising more effective. That’s why Facebook keeps asking us to “tell us more about what you like” and to fill out our interest profiles in greater detail. Before we offer up more to them, we need to keep prodding them for more details too.
What's the full story?
All these profiles show us our data, but they rarely describe where it comes from or how it is being used. And that means we’re still not getting the full story.
Acxiom pulls from thousands of data sources, but it doesn’t show consumers exactly where an individual data point — like my listed vehicle or my education — came from, so we can’t correct it at the source. On the Google interest pages, I can’t see how categories match up with particular searches or websites I have visited. Few connections are made, and we’re left speculating about where the information came from and what it all means.
In Acxiom’s case, the data is used to model consumer segments for advertisers and marketers. But as consumers, we have no insight into how we fit into those segments. Many data broker segmentations can be alarmingly sensitive when they deal with health — AIDS/HIV, for example — or socioeconomic status.
Most of the data we’ve looked at here is being used for marketing purposes. History shows we ought to keep a closer eye on these uses. There aren’t yet many limits in place to prevent the data from being used for more insidious purposes such as insurance underwriting, proxy-based discrimination or predatory loan targeting. Even if you are not targeted on the basis of your race, things like the cellphone plan you use, your ZIP code and your Twitter usage can stand in for race. That will change as policymakers and the industry begin to explore ways to regulate inappropriate uses of this data. But we’re still figuring out what “inappropriate” means.
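To make the proxy problem concrete, here is a tiny synthetic sketch in Python. Every ZIP code, person and demographic label in it is invented; the point is only that a rule which never mentions race can still sort people by race when the variable it does use is tightly correlated with race:

```python
# Synthetic illustration of proxy-based discrimination.
# All ZIP codes, people and demographics below are invented for the example.
people = [
    {"name": "A", "zip": "11111", "race": "Black"},
    {"name": "B", "zip": "11111", "race": "Black"},
    {"name": "C", "zip": "22222", "race": "white"},
    {"name": "D", "zip": "22222", "race": "white"},
]

# A "race-blind" targeting rule: offer high-cost loans to one ZIP code.
HIGH_COST_LOAN_ZIPS = {"11111"}
targeted = [p for p in people if p["zip"] in HIGH_COST_LOAN_ZIPS]

# Race was never an input, yet the targeted group is entirely Black,
# because in a segregated market the ZIP code stands in for race.
print([p["race"] for p in targeted])  # ['Black', 'Black']
```

Swap ZIP code for a cellphone plan or a social network and the mechanics are the same: the protected attribute never appears in the rule, but the outcome tracks it anyway.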
Howe acknowledges the problem, saying, “I think we as consumers would feel better knowing that data about us is being used responsibly.” That’s why he is arguing for the data broker industry to self-regulate by excluding the use of its data for credit or insurance purposes. “Companies must commit that marketing data be used only for marketing purposes,” he says. But it’s not always clear to consumers how these data brokers will hold their customers accountable for appropriate use.
If any of this profile data is to be meaningful to us, these companies need to take transparency and accountability a step further. Maybe that means more interactive feedback and follow-through to show us where our data comes from, where it goes and how it is used. But we have to ask for it first.
Beyond defaults
It may be clear now that investigating our data profiles and providing feedback is a bit of work. If these ads are just creepy or funny to you, it may not seem worth your time, attention and energy. Who is really going to go out of his or her way to tell Facebook “This ad is useful”? It’s good that these options exist, but they aren’t practical for most of us going about our everyday lives.
But I encourage you to start playing around with these profiles. Give your feedback. See what your data is saying about you. Toggle the default settings. The way advertising works with our data won’t change without our input; it’s simple supply and demand. The good news is that companies seem to be listening more carefully these days.
That’s the only way to get at the toughest issue: not how the data can be used but how it should be used. The last couple of years have been an exploration of what’s possible. If we’re all going to have a stake in the evolving negotiations about how data can be used in the future, we have to start having more of these conversations out in the open.
The Living with Data series explores how our data is tracked, collected and used online. Do you have any questions about how your personal data is being used? Curious to learn more about your daily encounters with algorithms? Email The Decoder at thedecoder@aljazeera.net, tweet me @smwat, or submit your issue or question via the form here. Screen shots and links are helpful clues!