Editor's Note: This is the fifth installment of the Living With Data series exploring how our online data is tracked, collected and used. Do you have questions about how your personal data is being used? Curious to learn more about your daily encounters with algorithms? Email the Decoder at thedecoder@aljazeera.net or submit your question via the form here. Screen shots and links are helpful clues!
As this Living With Data series develops, you might start to wonder, why focus on the advertising and commercial uses of personal data? Why not just use Adblock Plus and be done with it?
Ads and interactions on the commercial Web are perhaps some of the clearest signals we have to reveal the impact of algorithms in our everyday lives. The connection between our clicks, likes and searches and the algorithms that process and spit back outputs in ads or filtered feeds is most visible there.
Perhaps without thinking about it, you encounter algorithms every day. Google spam filters incorporate the wisdom of the crowd with your spam-training inputs. Netflix suggests movies based on increasingly specific parameters. Facebook's algorithms determine your news feed. We at least get to see the effects of these algorithms and sometimes interact with them. And we also get the opportunity to correct them when their models make faulty assumptions.
When these commercial Web algorithms get us wrong, it seems the worst that can happen is the annoyance or comedy of a mistargeted ad or frustration when sites don’t work as seamlessly as we expect. But that’s not always the case.
The use of our personal data becomes more harmful when we aren't privy to the inputs and outputs, when we know neither which of our actions the algorithm is incorporating nor the effect those processed decisions have on us. Commercial algorithms can lead to things like price discrimination, judging our willingness to pay for a Swingline at Staples on the basis of our socioeconomic status or proximity to a brick-and-mortar store.
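To make that concrete, here is a rough, entirely hypothetical sketch of how location-aware pricing logic could work. The shopper fields, the 20-kilometer threshold and the 10 percent discount are all invented for illustration; this is not any retailer's actual model, just the general shape of the practice.

```typescript
// Hypothetical sketch of location-based price discrimination.
// The shopper fields, thresholds and discount are invented for illustration;
// they do not reflect any retailer's actual logic.

interface Shopper {
  distanceToCompetitorKm: number;               // inferred from IP geolocation
  estimatedIncomeBand: "low" | "mid" | "high";  // inferred from ZIP-code data
}

const BASE_PRICE = 19.99; // list price for the stapler

function quotePrice(shopper: Shopper): number {
  let price = BASE_PRICE;

  // Shoppers near a competing brick-and-mortar store can comparison-shop
  // easily, so the model offers them a discount to keep the sale.
  if (shopper.distanceToCompetitorKm < 20) {
    price *= 0.9;
  }

  // Shoppers the model judges less price-sensitive see the full price.
  if (shopper.estimatedIncomeBand === "high") {
    price = BASE_PRICE;
  }

  return Math.round(price * 100) / 100;
}

// Two visitors, same product, different prices -- and neither ever sees the inputs.
console.log(quotePrice({ distanceToCompetitorKm: 5, estimatedIncomeBand: "mid" }));  // 17.99
console.log(quotePrice({ distanceToCompetitorKm: 80, estimatedIncomeBand: "mid" })); // 19.99
```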
Web tracking is the basis of all other forms of tracking. Advertising and consumer algorithms are the tip of the iceberg in that they are the only visible part of the system. When we watch for strange things happening with our data in the places we can see, we can imagine similar effects in the places that are more obscured. This hidden harm applies to our experience not only as consumers but also as citizens.
We aren’t told, until after the fact, that we are in a social experiment testing how much we like someone OkCupid says we are well matched with. We don’t get to see exactly what data point was the deciding factor in the loan we didn’t get. Or where we fall in the spectrum of algorithmically defined citizenship in the National Security Agency’s 51 percent foreignness model that legally justifies targeted surveillance of people within the United States. Or how governments track mobile phones to send text alerts stating that “you are registered as a participant in a mass disturbance.” Or what we did to get double-checked by the Transportation Security Administration every time we travel.
What we learn from paying attention to data’s uses in our consumer context will be important training for our lives as digital citizens. Increasingly, the data that’s used in these different contexts is one and the same. Your relationship with Verizon or AT&T is a commercial one, but the bulk collection of metadata from your calls is of interest to the NSA. The Prism revelations showed just how many commercial Internet companies were targets for data collection. Our data has just as much bearing on our autonomy as consumers as it does on our autonomy as citizens.
Some of the best tools we have for surfacing and interacting with our data profiles right now focus on advertising because it’s the most visible data we have access to. That includes ones I’ve already profiled, like Floodwatch, a browser-based ad-monitoring tool that captures the ads we’re served across the Web and plays our advertising profile back to us.
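For a sense of what that kind of monitoring involves, here is a minimal sketch of logging the ads a page serves you. This is in the spirit of an ad-monitoring extension, not Floodwatch's actual code; the domain list and record format are my own assumptions.

```typescript
// Hypothetical sketch of ad monitoring: watch the pages you visit, spot
// resources served from known ad domains and log them into a local history
// you can inspect later. Domain list and record shape are invented.

interface AdRecord {
  page: string;    // page the ad appeared on
  adHost: string;  // domain that served the creative
  seenAt: string;  // ISO timestamp
}

const KNOWN_AD_HOSTS = ["doubleclick.net", "adnxs.com", "googlesyndication.com"];

const adHistory: AdRecord[] = [];

function recordAdsOnPage(pageUrl: string, resourceUrls: string[]): void {
  for (const url of resourceUrls) {
    const host = new URL(url).hostname;
    if (KNOWN_AD_HOSTS.some((adHost) => host.endsWith(adHost))) {
      adHistory.push({ page: pageUrl, adHost: host, seenAt: new Date().toISOString() });
    }
  }
}

// In a real extension the resource list would come from the browser's
// webRequest API; here we just simulate one page load.
recordAdsOnPage("https://example.com/article", [
  "https://securepubads.g.doubleclick.net/ad.js",
  "https://example.com/styles.css",
]);

console.log(adHistory); // one AdRecord, for the doubleclick resource
```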
Ads may seem innocuous, but tools like this can be used to illustrate the liberties that systems or individuals can take in retelling the story of our identities. When Floodwatch launched, Jer Thorp of the Office for Creative Research paid Mechanical Turkers to write biographies of him based on just his Floodwatch history. They got a lot of things right, but a lot of the stories drew on “algorithmic leaps of logic, evidence of the blunt-instrument approaches to profiling that are used in Web-based advertising,” he wrote in an article on Medium.
Helen Nissenbaum, a New York University professor and the author of “Privacy in Context,” recently discussed another new ad-monitoring and obfuscation tool, AdNauseam. Like Floodwatch, AdNauseam attends to the ads that are served to you, but here the plug-in clicks on all of them in an effort to overload the system with false feedback. The tool plays with the idea of hiding in plain sight, burying your intended activity under lots of other meaningless signals. By clicking on everything, AdNauseam builds the profile of an “omnivorous click-stream,” rendering targeted ads “futile,” according to the website.
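Here is a rough sketch of that obfuscation idea, assuming a browser content-script context. The ad-link selector and the use of background requests to stand in for "clicks" are my simplifications, not how AdNauseam actually identifies or clicks ads.

```typescript
// Hypothetical sketch of click obfuscation: find the ad links on a page and
// "visit" every one of them, drowning your real interests in meaningless
// clicks. Not AdNauseam's implementation; the selector and fetch() stand-in
// are simplifications for illustration.

const AD_LINK_SELECTOR = 'a[href*="doubleclick.net"], a[href*="adclick"]'; // assumed pattern

async function clickEverything(): Promise<void> {
  const adLinks = Array.from(
    document.querySelectorAll<HTMLAnchorElement>(AD_LINK_SELECTOR)
  );

  for (const link of adLinks) {
    try {
      // Fire the click-tracking request in the background instead of
      // navigating, so the noise is injected without disrupting browsing.
      await fetch(link.href, { mode: "no-cors", credentials: "include" });
    } catch {
      // A failed request is fine; the point is volume, not accuracy.
    }
  }
}

// Run once the page has finished loading, as a content script might.
window.addEventListener("load", () => {
  void clickEverything();
});
```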
But even as we play with the inputs and outputs, we have no way of following through on how that misdirecting, obfuscating data might affect us in the future. Where does all that automated ad clicking lead? Could it spiral into a binge of clicks on porn sites? In an effort to combat the tyranny of ads, we’ve introduced new signals into the system on which it could arbitrarily judge us. And most concerning, we probably won’t know when or how those judgments happen.
Imagine what a tool like Floodwatch could do for surfacing the deeply hidden uses of data we encounter every day. For now, we can start interrogating the ads in plain sight.
The Living With Data series explores how our data is tracked, collected and used online. Do you have any questions about how your personal data is being used? Curious to learn more about your daily encounters with algorithms? Email The Decoder at thedecoder@aljazeera.net, tweet me @smwat, or submit your issue or question via the form here. Screen shots and links are helpful clues!