You’ve probably never heard of Acxiom, but it likely knows you: The Arkansas firm claims to have data on 2.5 billion people around the world. And in the US, if someone’s interested in that information, there are virtually no restrictions on their ability to buy and then use it.

Enter the data brokerage industry, the multibillion-dollar economy of selling consumers’ and citizens’ intimate details. Much of the privacy discourse has rightly pointed fingers at Facebook, Twitter, YouTube, and TikTok, which collect users’ information directly. But a far broader ecosystem of buying up, licensing, selling, and sharing data exists around those platforms. Data brokerage firms are the middlemen of surveillance capitalism—purchasing, aggregating, and repackaging data from a variety of other companies, all with the aim of selling or further distributing it.

Data brokerage is a threat to democracy. Without robust national privacy safeguards, entire databases of citizen information are ready for purchase, whether to predatory loan companies, law enforcement agencies, or even malicious foreign actors. Federal privacy bills that don’t give sufficient attention to data brokerage will therefore fail to tackle an enormous portion of the data surveillance economy, and will leave civil rights, national security, and public-private boundaries vulnerable in the process.

Large data brokers—like Acxiom, CoreLogic, and Epsilon—tout the detail of their data on millions or even billions of people. CoreLogic, for instance, advertises its real estate and property information on 99.9 percent of the US population. Acxiom promotes 11,000-plus “data attributes,” from auto loan information to travel preferences, on 2.5 billion people (all to help brands connect with people “ethically,” it adds). This level of data collection and aggregation enables remarkably specific profiling.

Need to run ads targeting poor families in rural areas? Check out one data broker’s “Rural and Barely Making It” data set. Or how about racially profiling financial vulnerability? Buy another company’s “Ethnic Second-City Strugglers” data set. These are just some of the disturbing titles captured in a 2013 Senate report on the industry’s data products, which have only expanded since. Many other brokers advertise their ability to identify subgroups upon subgroups of individuals through criteria like race, gender, marital status, and income level—all sensitive characteristics that citizens likely didn’t know would end up in a database, let alone up for sale.

These companies often acquire the information through purchase, licensing, or other sharing agreements with third parties. Oracle, for example, “owns and works with” over 80 data brokers, according to a 2019 Financial Times report, aggregating information on everything from consumer shopping to internet behavior. However, many companies also scrape data that is publicly viewable on the internet and then aggregate it for sale or sharing. “People search” websites often fall into this latter category—compiling public records (property filings, court documents, voting registrations, etc.) on individuals and then letting anyone on the internet search for their information.

All of these unchecked practices undermine civil rights. Companies that boast of holding thousands of data points on millions or billions of people—all for sale to whoever is buying—themselves represent the aggregation of unrestrained surveillance power. This is particularly dangerous to the less powerful. As centuries of surveillance in the United States have made undeniably clear, the impact of stockpiling individuals’ personal information falls hardest on the already oppressed or marginalized: the poor, Black and brown communities, Indigenous populations, LGBTQ+ individuals, undocumented immigrants. “People search” websites in particular can publicize addresses and thus enable intimate partner violence or doxing. The strong financial incentives to sell data, combined with virtually nonexistent limitations, give these companies every reason to share their data with others, including those who use it for harm.
