The defamation lawsuits against Infowars founder Alex Jones and his subsequent removal ("de-platforming") from Apple, Facebook, Spotify and YouTube vividly illustrate what is at stake in the battles over who gets to determine the tone and content of America's public sphere. The tech giants' long-overdue decision to address Jones's hateful assertions and conspiracy theories – which they have helped to distribute, profitably – suggests their vulnerability to legal and regulatory action. At the same time, however, public resistance to the perceived muzzling – downloads of the Infowars app immediately surged in Google's and Apple's online stores – shows how deeply politicized the issue has become, and how likely it is to further inflame debate over the necessary limits of free speech on campus, in digital forums and in public life.
Most skirmishes in what might be called the social media wars have centred on similar flashpoints. Each new incident has shown that the convergence of media and entertainment has distorted the public sphere in ways that may prove irreparable. As Philip Howard, a professor of internet studies at Oxford, notes in a recent Foreign Policy article, "Social media platforms are designed to deliberately exploit the common predilection for selective exposure — the tendency to favour information that confirms pre-existing views — to reinforce messaging from advertising clients, lobbyists, political campaign managers, and even foreign governments." Immured in our digital echo chambers, we have learned to embrace political opinions of every stripe, including the most marginal and fantastical, with religious intensity, while the give-and-take that once characterized democratic discourse has all but disappeared.
Consider, for example, the aftermath of the Parkland school shooting earlier this year. Immediately, Russian-controlled Twitter bots began to promote hashtags on both sides of the gun control debate – polarizing the discussion before it could move beyond entrenched positions. Shortly afterwards – echoing assertions about Sandy Hook – conspiracy theories even circulated claiming that one of the Parkland survivors was a "crisis actor" playing the role of a victim.
Similar provocations have been used to amplify the digital noise surrounding Black Lives Matter and other social movements. This sort of interference not only skews public debate away from productive discussion, it undermines democracy itself. The age of Big Data has enabled social media companies and data-miners – the line between them is blurred and fading – to weaponise news through the manipulation of our personal data. As Howard observes, this "has helped heighten ethnic tensions, revive nationalism, intensify political conflict, and even produce new political crises in countries around the world — all while weakening public trust in journalism, voting systems, and electoral outcomes."
The influence of conspiracy theories on public opinion should not be underestimated. Three years after the 9/11 attacks, a Zogby poll found that 49 percent of respondents believed the US government "knew in advance that attacks were planned …[and] consciously failed to act." Two years later, 36 percent of those polled in a Scripps Howard survey believed that "federal officials either participated in the attacks on the World Trade Center or took no action to stop them". As the legal scholar Cass Sunstein has noted, when a population no longer shares a repository of agreed facts on such key issues, it can easily succumb to anti-democratic levels of disinformation and distrust.
Cambridge Analytica's questionable past in the Caribbean has already shown the dangers of granting social media companies and data-miners access to our personal information. The ubiquity of similar scandals elsewhere suggests that other political actors in the Caribbean are likely to manipulate public opinion in comparable ways. If, like US regulators and politicians, we fail to recognize this threat and take appropriate countermeasures, it would be naive to expect our public sphere to fare any better.