Everyone loves collecting data, nobody loves analyzing it later
Friday October 22, 2021
This is an expansion and adaptation of a MetaFilter post I wrote a couple months ago. As such, it’s not so much a post about my ideas as it is a post arranging things other people have said in a way that I find insightful. Primarily, I am pulling from Bad News by Joseph Bernstein, Forget privacy: you’re terrible at targeting anyway by apenwarr, and The Real Source of America’s Rising Rage by Kevin Drum. I recommend reading all three if you would like a full understanding of the ideas I’m talking about.
A big theme in the Discourse in the west for the past half-decade or so has been the amount of power that large tech companies have. When the term “techlash” was first coined in 2013, it mostly focused on the sheer amount of wealth being accumulated by the people at the top of the tech industry, and the ostentatiousness arising from that. By the end of 2016, criticism sharply shifted away from simply looking at wealth, and instead focused on societal power: the power to shift perspectives, sway elections, and control communications. This criticism is essentially bipartisan: The political left in the US will bring up Facebook’s role in the 2016 election, while the right will focus on Twitter banning conservatives.
It’s important to critically examine this narrative, though. For years, companies like Facebook attempted to downplay their influence on the world — Mark Zuckerberg famously called the idea that Facebook influenced the 2016 election “a pretty crazy idea”. As Joseph Bernstein points out, this is an untenable position in the long term:
Facebook’s basic business pitch made denial impossible. Zuckerberg’s company profits by convincing advertisers that it can standardize its audience for commercial persuasion. How could it simultaneously claim that people aren’t persuaded by its content?
So which is it? Is Facebook a bumbling giant, incapable of exerting influence on the world? Or is it a finely tuned, omnipotent, AI- and data-driven machine capable of building detailed models of individual people to determine exactly how to sell ads?
When you talk to people who actually have a high-level understanding of the internet advertising industry, companies like Facebook and Google seem more bumbling and less omnipotent. Avery Pennarun (who previously did “strategic analysis” at Google) writes about how none of the tracking these companies are doing actually works:
The advertiser has a tracker that it places on multiple sites and tracks me around. So it doesn’t know what I bought, but it does know what I looked at, probably over a long period of time, across many sites.
Using this information, its painstakingly trained AI makes conclusions about which other things I might want to look at, based on…
…well, based on what? People similar to me? Things my Facebook friends like to look at? Some complicated matrix-driven formula humans can’t possibly comprehend, but which is 10% better?
Probably not. Probably what it does is infer my gender, age, income level, and marital status. After that, it sells me cars and gadgets if I’m a guy, and fashion if I’m a woman. Not because all guys like cars and gadgets, but because some very uncreative human got into the loop and said “please sell my car mostly to men” and “please sell my fashion items mostly to women.” Maybe the AI infers the wrong demographic information (I know Google has mine wrong) but it doesn’t really matter, because it’s usually mostly right, which is better than 0% right, and advertisers get some mostly demographically targeted ads, which is better than 0% targeted ads.
That’s a lot about profiling for ad targeting, which obviously doesn’t work, if anyone would just stop and look at it. But there are way too many people incentivized to believe otherwise. Meanwhile, if you care about your privacy, all that matters is they’re still collecting your personal information whether it works or not.
Facebook, for its part, is aware that its targeting doesn’t work as well as advertised:
Online ads tend to produce clicks among people who are already loyal customers. This is, as Hwang puts it, “an expensive way of attracting users who would have purchased anyway.” Mistaking correlation for causation has given ad buyers a wildly exaggerated sense of their ability to persuade.
So too has the all-important consumer data on which targeted advertising is based, and which research has exposed as frequently shoddy or overstated. In recently unsealed court documents, Facebook managers disparaged the quality of their own ad targeting for just this reason. An internal Facebook email suggests that COO Sheryl Sandberg knew for years that the company was overstating the reach of its ads.
It’s not just the ad platforms that realize this either — Uber recently realized that they were being defrauded out of $100 million of their ad spend.
If omnipotence isn’t the problem with Facebook, what is? Kevin Drum (I can’t generally recommend Drum: I think a lot of his thinking is sloppy, and he seems to have the typical pundit brainworms, but in this particular case I think there’s a lot of merit to his idea) somewhat-convincingly argues that we have Fox News to blame for many of our misinformation problems:
As we all know, trust in government plummeted during the ’60s and ’70s thanks to Vietnam and Watergate, and then flattened out for the next few decades.
From 1980 to 2001, trust stayed at roughly 40 percent except for a brief dip, during Bill Clinton’s first term, that was quickly regained. Then, right after 2001, trust began to plummet permanently. By 2019 it was down to 20 percent.
What accounts for this? It’s here that our popular explanations run aground. It can’t be all about a rise in conspiracy theories, since they’ve been around for decades. It can’t be social media, since Facebook and Twitter have become popular in the political arena only over the past few years. It can’t be a decline in material comfort, since incomes and employment have steadily improved over the past couple of decades. It can’t really be social trends, since most of them have improved too. And most of the specific issues that might cause alarm—immigration, racism, and more—are unlikely candidates on their own. They may be highly polarizing, but in a concrete sense they haven’t gotten worse since 2000. In fact, they’ve mostly gotten better.
To find an answer, then, we need to look for things that (a) are politically salient and (b) have changed dramatically over the past two to three decades. The most obvious one is Fox News.
When it debuted in 1996, Fox News was an afterthought in Republican politics. But after switching to a more hardline conservatism in the late ’90s it quickly attracted viewership from more than a third of all Republicans by the early 2000s. And as anyone who’s watched Fox knows, its fundamental message is rage at what liberals are doing to our country. Over the years the specific message has changed with the times—from terrorism to open borders to Benghazi to Christian cake bakers to critical race theory—but it’s always about what liberal politicians are doing to cripple America, usually with a large dose of thinly veiled racism to give it emotional heft. If you listen to this on a daily basis, is it any wonder that your trust in government would plummet? And on the flip side, if you’re a progressive watching what conservatives are doing in response to Fox News, is it any wonder that your trust in government might plummet as well?
Interestingly, it’s pretty easy to blame both Fox News and Facebook, given that Facebook’s top news stories consistently come from Fox News, people with shows on Fox News, and similar right-wing media outlets. Even so, Facebook isn’t my biggest worry here: they are certainly evil, and have a worrying amount of power, but they’re evil and dumb. Fox News, on the other hand, is evil and clever, having finely honed its rhetoric to produce as much profit as possible, with rage and destruction as a side effect.
I don’t mean to downplay the risk of large technology companies centralizing power — it’s certainly worth worrying about the implications of ubiquitous tracking, centralization of communications infrastructure, and other trends among tech giants. But when we consider these things, we should remember that the stories tech companies tell do not reflect the ground truth: ad companies lie about how effective their ads are, because they do not have as much power as they want people to believe. That is, I think, comforting.