The Cambridge Analytica scandal is less a revelation than a reminder that Facebook has built one of the biggest surveillance systems in the world with unprecedented powers of persuasion. Now that the data of some 87 million Facebook users has been compromised, the public is finally confronting that surveillance reality, along with another less-visible trend: private companies developing and deploying government-funded psy-ops, behavioral technologies, and campaigns of influence against civilians (see WILTW April 5, 2018). From mass political profiling to “gamified obedience”, these projects can weaponize big data in ways that are mostly hidden from the public. They’re also on the frontlines of bigger ideological battles, both domestically and abroad. That this trend is developing at a moment when tribalism, authoritarianism, and cyberwarfare are intersecting across the globe is no coincidence. It is a cold war world, reprogrammed for the 21st century.

Writing for The New York Review of Books, Tamsin Shaw, professor of European Studies and Philosophy at NYU, contextualized the big tech-government economy:

Apparently, the age of the old-fashioned spook is in decline. What is emerging instead is an obscure world of mysterious boutique companies specializing in data analysis and online influence that contract with government agencies.

Much of the classic, foundational research on personality, conformity, obedience, group polarization, and other such determinants of social dynamics—while ostensibly civilian—was funded during the cold war by the military and the CIA. The cold war was an ideological battle, so, naturally, research on techniques for controlling belief was considered a national security priority. This psychological research laid the groundwork for propaganda wars and for experiments in individual “mind control.”

The recent revival of this cold war approach has taken place in the setting of the war on terror.

With the rise of tech giants and their unprecedented tools of persuasion, experiments in mind control kicked into a higher gear. Smartphones, Google Earth, and the internet itself are not just commercial technologies we use every day. They are top national security assets, and indeed would not be possible without military funding. The study of social media platforms—of how we can be “nudged” and influenced, and how we are likely to behave—is now “the cutting edge work of the American intelligence community,” Shaw writes:

Government agencies have mitigated risk and even helped to create markets for companies whose products, while ostensibly strictly civilian and commercial, satisfy their own needs. The driverless car industry will incorporate, test, and improve technologies devised for missile guidance systems and unmanned drones. Facial recognition software developed by intelligence agencies and the military for surveillance and identity verification (in drone strikes, for example) is now assuming a friendly guise on our iPhones and being tested by millions of users.

The US government has supported the monopolies of the Big Five companies partly for the sake of the “soft power” they can generate globally.

The Defense Department spent $7.4 billion on data analytics, AI, and cloud computing in 2017, a 32% increase over five years earlier, creating what Lt. Gen. Michael Flynn once called a “gold rush” for contractors. When Flynn said that in 2014, he was speaking as the head of the Defense Intelligence Agency. A few weeks later, he left the DIA to establish his own lobbying group. He later went on to consult for SCL, the British military contractor and parent company of Cambridge Analytica, whose data-harvesting activities may have played a pivotal role in the outcome of the 2016 U.S. presidential election and Brexit.

The gold rush Flynn was speaking of is evident today, as tech giants snap up lucrative government contracts that will power the country’s national security innovations. Amazon recently rolled out a new cloud-computing service for the U.S. intelligence community that can run workloads up to the “secret” classification level and will complement its existing $600 million contract with the CIA. Amazon, Google, Oracle, and Microsoft are racing to win a $10 billion contract with the Pentagon. Google is working with a Pentagon program that uses AI to interpret video imagery and could be used to improve the targeting of drone strikes.

Palantir Technologies, whose earliest outside backer was the CIA’s venture arm, In-Q-Tel, has emerged as another shadowy player in this world. The company, with its historically close ties to Facebook, just won an $876 million contract with the Pentagon. Its connections to Cambridge Analytica and SCL have been coming to light in real time, with reports circulating that a Palantir employee may have helped Cambridge Analytica harvest Facebook data. And the company has secretly been using New Orleans to test its predictive-policing technology, a move that not even City Council members knew about. We quote The Verge, which recently broke the story:

The company provided software to a secretive NOPD program that traced people’s ties to other gang members, outlined criminal histories, analyzed social media, and predicted the likelihood that individuals would commit violence or become a victim.

The program escaped public notice, partly because Palantir established it as a philanthropic relationship with the city through Mayor Mitch Landrieu’s signature NOLA For Life program. Thanks to its philanthropic status, as well as New Orleans’ “strong mayor” model of government, the agreement never passed through a public procurement process.

All of which raises important questions about the growing “military industrial complex of big data psy-ops,” as Shaw refers to it.

What kind of society are we building when a company can secretly set up shop in a city and start doing prediction work that could potentially sweep up innocent people? We are beginning to understand what happens when big data psy-ops infiltrate the democratic process. But what happens when they infiltrate schools and homes, our personal lives and minds?

This future is already knocking. As a recent Twitter thread revealed, Facebook’s growing list of 11,000 patents includes “systems and methods of eye tracking control”, “user influence scores”, and “image object recognition based on location”, which suggests it could know you’re about to photograph your family before you even snap the shutter. How’s that for modern spyware?

In a surprisingly frank op-ed in The Washington Post, Mitch Daniels, president of Purdue University, shared his concerns about how data analytics is used on his campus. The aim is to help students make “better” decisions, but Daniels also foresees the potential for abuse and backlash:

When does a nudge become a shove?… Somewhere between connecting a struggling student with a tutor and penalizing for life a person insufficiently enthusiastic about a reigning regime, judgment calls will be required and lines of self-restraint drawn. People serene in their assurance that they know what is best for others will have to stop and ask themselves, or be asked by the rest of us, on what authority they became the Nudgers and the Great Approvers.

Will these students, Daniels wonders, revolt against the nudgers and approvers? Employees at Google already have. Just last week, thousands of Googlers, including dozens of senior engineers, signed a letter protesting their company’s involvement with the Pentagon. “Dear Sundar,” it began, “We believe that Google should not be in the business of war.”

Now that Facebook has been weaponized by armies of bots, Russian propagandists, and homegrown politicians, one thing is clear: the U.S. government’s most essential national security assets, the smartphones and platforms its citizens use every day, are also its greatest liabilities. Having the data from those assets concentrated in the hands of a few powerful companies represents another vulnerability, as the chip flaws disclosed in January demonstrated.

Tech giants will no doubt argue otherwise, using national security as a defense to keep their monopolies intact while developing ever more subtle forms of influence. How regulators thread this needle—curbing Big Tech without handing the innovation lead to China—remains an open question. Shaw offered them a word of advice:

A science that is oriented toward the development of behavioral technologies is bound to view us narrowly as manipulable subjects rather than rational agents…It is clearly time to ask whether this hybrid Silicon Valley economy has been a good national security investment.