Opinion

Time to regulate data brokers

The lucrative information industry can't be trusted without external oversight

January 23, 2014 8:45AM ET

Last week, a man in a Chicago suburb received a letter from OfficeMax. It was addressed to Mike Seay; the line below his name read “Daughter Killed in Car Crash.” OfficeMax blamed an unnamed third-party data broker for mistakenly printing the information on the envelope. A distraught Seay responded with some important questions: “Why would they have that type of information? Why would they need that?”

The answer is that for data brokers, any piece of personal information is potentially valuable. Relevance is only an algorithm — or a sales pitch — away. Data brokers are supposed to be the unseen cogs in the surveillance economy, collecting vast amounts of information on hundreds of millions of people and analyzing it for patterns and likely outcomes. They are devoted to extracting value from the raw information of our lives for their own gain; they sell this information to stores, insurers, banks, tech companies, HR departments and basically anyone who comes calling. Government agencies are part of the trade as well, with the DMV selling to data brokers and the TSA passing on information to debt collectors.

Some brokers know everything there is to know about you, including your shopping habits, medical history and income, how much debt you have, what you read, which charities you support and where you worship. One such broker, Equifax, maintains up to 75,000 data elements per individual. The brokers categorize consumers into various groups — “Ethnic Second-City Strugglers” or “Elderly Opportunity Seekers” — that allow the buyers of this information to be more selective.

Selling personal data is massively lucrative. In 2012, the data-broker industry produced $150 billion in revenue. One major player, IMS Health Holdings, claims it generated $2 billion in sales in nine months last year and has “over 85 petabytes of unique data,” including 400 million patient records from more than 100 countries. Drawing on its records of “45 billion healthcare transactions,” IMS can help pharmaceutical sales reps know which doctors are prescribing their products and which need to be hit with some salesmanship.

While IMS claims that its patient records are anonymous, a number of studies have shown that it’s possible to de-anonymize data, whether by seizing on a few key data points or combining disparate data sets.

Veil of secrecy

Last month, the Senate Committee on Commerce, Science, and Transportation, led by Sen. Jay Rockefeller, D-W.Va., published a report on the data-broker industry. “Data brokers operate behind a veil of secrecy,” read the document. “Many of their practices lie outside the ambit of federal consumer protection law.” One data broker, InfoUSA, “routinely ignored rules about selling data to known fraudsters.”

The data-broker industry explains its role as greasing the wheels of digital commerce — in other words, helping subsidize free services such as Gmail and Facebook, providing startups with access to new customers and connecting people with more personalized ads. (This latter point is always presented as a kind of public service, as if Internet users are clamoring for more targeted advertising.) Representatives also contend that self-regulation has worked. “Responsible data sharing” is great for economic growth and for consumers, Tony Hadley, Experian’s VP of government affairs and public policy, told the Senate committee.

But “responsible data sharing” is a conveniently imprecise and flexible term. Companies don’t just use data to get shoppers to more easily part with their money. Personal data are important for any field concerned with risk management, from airport security to electrical utilities. Insurers might buy up customer information to improve actuarial models, determine coverage levels and monitor clients for adverse behavior. In the process, your personal data are turned against you, used by corporations to help influence your habits. (“We noticed that you recently bought some bigger pants and are drinking more. Is everything all right?”)

Personal data can therefore be put to pernicious uses while bypassing laws, such as the Fair Credit Reporting Act or the Health Insurance Portability and Accountability Act (HIPAA), which are supposed to protect consumers by regulating banking, insurance, health care and other industries that make use of our private information. Much of the information circulated between brokers and clients comes from third parties, so that a list of people suffering from Alzheimer’s or alcoholism isn’t covered by HIPAA. Similarly, companies can use personal data — arrest records, medical prescriptions, political contributions — to discriminate against customers and potential employees, especially the most vulnerable.

These laws now appear antiquated and in need of amending, a conclusion supported by a recent Government Accountability Office report, which said, “Congress should consider strengthening the consumer privacy framework to reflect the effects of changes in technology and the increased market for consumer information.”
The rise of personal data as a market force carries some alluring possibilities. It can make aspects of our lives more convenient, as big retailers seem to anticipate our needs with timely recommendations and coupons.

But in the last few weeks, some of the industry’s problems have come into view. The tens of millions of consumer records stolen from Target; Facebook’s sharing of user data with Yandex, Russia’s largest search engine; Google’s purchase of Nest, a smart-thermostat company that, as The New York Times described it, is “interested in how people behave inside their houses” — all of these reflect the ways in which personal data have come to shape our lives, often beyond our control.

Companies collecting and trading such information should be required to keep it highly secure, submit to government audits and allow individuals to easily opt out of all tracking. Limits must also be placed on which kinds of data can be collected and sold and how they can be used.

Consumer education and industry pressure can help. After the Senate hearing last month, The Wall Street Journal contacted Medbase2000, an Illinois-based company that offered a “rape sufferers list” for sale on its website. The cost was 7.9 cents per victim. The president of Medbase2000’s parent company claimed that the list was “hypothetical,” a test for possible products. (Why this sort of list was used as an in-house hypothetical was never explained.) The company then removed the page, along with a list of HIV/AIDS patients, from its site.

But without significant regulatory reforms, the situation is only going to get worse for the millions of people who, whether they like it or not, are this industry’s product. That President Barack Obama, in his speech last week laying out mild reforms to the National Security Agency, invoked corporate surveillance to justify governmental surveillance offers a reminder that the data trade is a public-private partnership. The U.S. government relies on the information produced by this corporate surveillance infrastructure, and one proposed intelligence reform — to have telecoms hold on to more data on behalf of intelligence customers — would only cement this codependency. (AT&T, for instance, has reportedly been paid $10 million for its cooperation with intelligence agencies.)

How much data, one might ask, is enough? Companies ranging from Microsoft to Acxiom are now working to track people across their various electronic devices and into the physical world, providing a seamless view of their subjects’ lives. Brick-and-mortar stores are using sensors to track customers’ movements via their smartphones and selling that information to advertisers. It is corporate surveillance on the granular level — molecular, even. A new batch of genomic startups, from the embattled testing service 23andMe to a company called DNAnexus, is now collecting DNA from individuals under the promise of advancing medical research or revealing genetic abnormalities.

Crucially, these companies position themselves not as health care providers bound by an oath to do no harm but as tech firms, with their cavalier attitudes toward the sale of customer data. Here those data are your genetic material. And what kind of information could be more personal, more sensitive, more essentially yours than that?

Jacob Silverman's book about social media and digital culture will be published by HarperCollins later this year.

The views expressed in this article are the author's own and do not necessarily reflect Al Jazeera America's editorial policy.