Somehow everyone is surprised that Cambridge Analytica scraped a hundred million Faceboot profiles and engineered an election.

Why is this news? I feel like I’m seeing “MAN BITES DOG” on every website, TV show and broadsheet. It’s absurd. The same news companies that profess such astonishment have been bending over backward to get that same data-targeting service to save their ailing businesses. Faceboop and Goople control access to information, and it’s a two-way street. Just as we puny humans jam on that “Like” button to get the most personalized “feed”, with all of our favorite news articles and pictures of our exes being happier than us, the “content producers” have to chase the ever-moving goalposts set by Silicon Valley if they want to get access to our eyes. So when Zuck says “jump,” they say “how many words can we put before the jump sir?”

It’s obvious that social media companies steal your data. It’s their business model. If you get “personalized” services in the 21C – from recommendation engines like Amazon and Spotify to social media feeds to customer-loyalty programs at supermarkets – then you’re being tracked. Sometimes there’s a nasty surprise, like the revelation that Faceboek collects your call and SMS history on Android, but those are just details. The general pattern is clear: if you ever log in to anything, you will be watched. Forever.

The great blue monster even keeps ghost records to represent non-users. They literally have a database of every potential person in the world, cross-referenced from all the little hints dropped by the people who signed their ToS. In the same way that 270,000 users who took the Cambridge Analytica survey agreed to let it scrape data about their friends, anyone who keeps a Fabecook profile is snitching on their friends who don’t.

So how do we deal with the track-and-target method of surveillance capitalism? We can learn from the original Cybermen: Norbert Wiener and John von Neumann, the inventors of cybernetics. Cybernetics – from the Greek kybernetes, the person who steers the boat – was a reaction to the World War-era innovations in war machines. If a rocket is already on its way to bomb your city, how do you shoot it down before it’s too late? How do you measure its speed and its location at the same time? It’s the classic moving-target problem.

The cybernetic approach was to use a feedback loop: put radar in the nose of your rocket, and have it adjust its course based on the updates from the radar signal. This is the guided missile, the grandfather both of the Predator drone and of the online advertising machine.
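The feedback loop is easy to see in miniature. Here’s a toy pursuit simulation – all the numbers are illustrative, and this is simple pursuit guidance rather than anything a real missile uses – showing how re-measuring the target every tick closes the gap that a fire-and-forget shot would miss:

```python
# A minimal sketch of the cybernetic feedback loop: each tick, the
# "missile" measures the target's current position (the radar update)
# and re-aims its velocity toward it. Illustrative numbers only.

def pursue(missile, target, target_velocity, speed=2.0, steps=100):
    """Simple pursuit guidance: steer toward the target's latest position."""
    for _ in range(steps):
        dx = target[0] - missile[0]
        dy = target[1] - missile[1]
        dist = (dx * dx + dy * dy) ** 0.5
        if dist < speed:  # within one step of the target: intercept
            return True
        # Feedback: correct course using the newest measurement.
        missile[0] += speed * dx / dist
        missile[1] += speed * dy / dist
        # Meanwhile the target keeps moving; a pre-aimed shot would miss.
        target[0] += target_velocity[0]
        target[1] += target_velocity[1]
    return False

print(pursue([0.0, 0.0], [50.0, 20.0], [1.0, 0.0]))  # True: the loop closes
```

The ad machine works the same way: measure, correct, measure again, forever.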

How do you dodge a guided missile? You drop chaff. You scatter a cloud of tiny radar-reflecting garbage, confusing the tracking system so that it can’t properly target you.

We need to chaff social media. Not just anonymizing our traffic data, like with Tor, but actively creating millions of false profiles in the databases of the surveillocrats.
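What would chaff look like in practice? A sketch, with invented page names and a made-up profile shape (no real platform’s schema or API): each fake account emits a uniformly random set of “likes,” so the swarm carries no pairwise signal at all.

```python
import random

# A hedged sketch of "chaff" profiles: each fake account likes a random,
# internally uncorrelated subset of pages, so aggregate statistics over
# the swarm drift toward noise. Page names are invented for illustration.

PAGES = ["Coldplay", "Wonder Bread", "Knitting", "MMA",
         "Esperanto", "Tax Law", "Vaporwave", "Birdwatching"]

def make_chaff_profile(rng, n_likes=4):
    """One fake profile: a uniformly random subset of pages."""
    return set(rng.sample(PAGES, n_likes))

rng = random.Random(42)
swarm = [make_chaff_profile(rng) for _ in range(10_000)]

# Every page ends up liked at roughly the same base rate (4 of 8 = 0.5),
# so no "like" predicts any other -- there is nothing to target.
rate = sum("Coldplay" in p for p in swarm) / len(swarm)
print(round(rate, 2))
```

Radar-reflecting garbage, at scale.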

Cambridge Analytica’s software runs on a co-occurrence model: these hundred users “like” Coldplay, and ninety of them “like” Wonder Bread, so we can assume that any given user who likes Coldplay has a ninety percent chance of liking Wonder Bread. All the social platforms and recommendation engines do some version of this analysis. It’s the underlying trick of the current boom in Artificial Intelligence: if you have enough data, and enough processing power, you can find all kinds of correlations that humans couldn’t see. But this is a sword of n edges: the more bad data you have, the more false correlations you’ll find. We can poison the well for these spyware barons, and take back our privacy.
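The Coldplay-and-Wonder-Bread example above fits in a few lines, and so does the poisoning. This is a toy reconstruction of the co-occurrence trick, not anyone’s actual code; the counts mirror the text, and the chaff profiles are invented:

```python
import random

# Toy co-occurrence model: estimate P(likes `then` | likes `given`)
# from a pile of profiles, then watch the estimate collapse when
# random "chaff" profiles are mixed in. Illustrative only.

def conditional_like(profiles, given, then):
    """Fraction of `given`-likers who also like `then`."""
    base = [p for p in profiles if given in p]
    if not base:
        return 0.0
    return sum(1 for p in base if then in p) / len(base)

# 100 real users like Coldplay; 90 of them also like Wonder Bread.
real = [{"Coldplay", "Wonder Bread"} for _ in range(90)] + \
       [{"Coldplay"} for _ in range(10)]
print(conditional_like(real, "Coldplay", "Wonder Bread"))  # 0.9

# Chaff: fake users who like Coldplay plus one uniformly random page.
rng = random.Random(0)
pages = ["Wonder Bread", "Knitting", "MMA", "Esperanto", "Tax Law"]
chaff = [{"Coldplay", rng.choice(pages)} for _ in range(400)]

# With 400 noisy profiles mixed in, the inferred probability degrades
# badly -- the correlation is still there, but buried in false signal.
poisoned = real + chaff
print(conditional_like(poisoned, "Coldplay", "Wonder Bread"))
```

The more chaff in the database, the less any inferred “ninety percent” can be trusted.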

The internet is already full of fake news. We might as well litter it with fake readers, too.

Thanks for reading,

Max


Bonus round: Enjoy this excerpt from Neal Stephenson’s 2008 epic, Anathem, where a world much like ours had the same problem a long time ago…

“Early in the Reticulum—thousands of years ago—it became almost useless because it was cluttered with faulty, obsolete, or downright misleading information,” Sammann said.

“Crap, you once called it,” I reminded him.

“Yes—a technical term. So crap filtering became important. Businesses were built around it. Some of those businesses came up with a clever plan to make more money: they poisoned the well. They began to put crap on the Reticulum deliberately, forcing people to use their products to filter that crap back out. They created syndevs whose sole purpose was to spew crap into the Reticulum. But it had to be good crap.”

“What is good crap?” Arsibalt asked in a politely incredulous tone.

“Well, bad crap would be an unformatted document consisting of random letters. Good crap would be a beautifully typeset, well-written document that contained a hundred correct, verifiable sentences and one that was subtly false. It’s a lot harder to generate good crap. At first they had to hire humans to churn it out. They mostly did it by taking legitimate documents and inserting errors—swapping one name for another, say. But it didn’t really take off until the military got interested.”

“As a tactic for planting misinformation in the enemy’s reticules, you mean,” Osa said. “This I know about. You are referring to the Artificial Inanity programs of the mid–First Millennium A.R.”

“Exactly!” Sammann said. “Artificial Inanity systems of enormous sophistication and power were built for exactly the purpose Fraa Osa has mentioned. In no time at all, the praxis leaked to the commercial sector and spread to the Rampant Orphan Botnet Ecologies. Never mind. The point is that there was a sort of Dark Age on the Reticulum that lasted until my Ita forerunners were able to bring matters in hand.”

“So, are Artificial Inanity systems still active in the Rampant Orphan Botnet Ecologies?” asked Arsibalt, utterly fascinated.

“The ROBE evolved into something totally different early in the Second Millennium,” Sammann said dismissively.

“What did it evolve into?” Jesry asked.

“No one is sure,” Sammann said. “We only get hints when it finds ways to physically instantiate itself, which, fortunately, does not happen that often. But we digress. The functionality of Artificial Inanity still exists. You might say that those Ita who brought the Ret out of the Dark Age could only defeat it by co-opting it. So, to make a long story short, for every legitimate document floating around on the Reticulum, there are hundreds or thousands of bogus versions—bogons, as we call them.”

“The only way to preserve the integrity of the defenses is to subject them to unceasing assault,” Osa said, and any idiot could guess he was quoting some old Vale aphorism.

“Yes,” Sammann said, “and it works so well that, most of the time, the users of the Reticulum don’t know it’s there. Just as you are not aware of the millions of germs trying and failing to attack your body every moment of every day.”


SCIOPS is a weekly newsletter about cognitive security. Feel free to forward it to anyone you think would like it, or share it on your social-tracking profile. If you have thoughts, questions, or criticism, just respond to this email.

If you’re seeing this for the first time, make sure to sign up at tinyletter.com/sciops for more cyberpunk weirdness in your inbox every week.