A new report by Elon University's Imagining the Internet Center and the Pew Research Center surveys the opinions of more than 1,500 technologists, futurists and scholars.
Uncivil acts and manipulative behaviors online will continue and possibly expand over the next decade, according to a canvassing of more than 1,500 technologists, futurists and scholars by Pew Research Center and Elon University’s Imagining the Internet Center. More than 80 percent said they expect the tone of online discourse to stay the same or get worse between now and 2026.
A total of 1,537 respondents answered the question: In the next decade, will public discourse online become more or less shaped by bad actors, harassment, trolls, and an overall tone of griping, distrust, and disgust? In response, 42 percent of respondents said they expect “no major change” in the online social climate in the coming decade, 39 percent said they expect the online future to be “more shaped” by negative activities, and 19 percent said they expect the internet to be “less shaped” by harassment, trolling and distrust. This report, part of a series on the future of the internet, is based on a non-random canvassing of experts conducted from July 1 to Aug. 12, 2016.
Media stories on this report:
The Atlantic
Nieman Lab
Mashable
AdWeek
Broadcasting & Cable
Network World
CIO
MediaPost
WRAL TechWire
The Stack
New York Post
SiliconBeat
Participants were also asked to consider the following prompts: 1) How do you expect social media and digital commentary will evolve in the coming decade? 2) Do you think we will see a widespread demand for technological systems or solutions that encourage more inclusive online interactions? 3) What do you think will happen to free speech? and 4) What might be the consequences for anonymity and privacy? The answers are compiled in a detailed 82-page report that identified the following themes:
Things will stay bad because to troll is human; anonymity abets bad behavior; social, political and economic inequities motivate inflammatory dialogue; and the growing scale and complexity of internet discourse make this difficult to overcome.
- Trolling and other destructive behaviors result from a disconnect between anonymous actions online and consequences that might flow from those actions
- Inequities drive at least some of the inflammatory dialogue
- The ever-expanding scale of internet discourse and its accelerating complexity make it difficult to deal with problematic content and contributors
Things will stay bad because tangible and intangible economic and political incentives support trolling. Participation = power and profits.
- ‘Hate, anxiety and anger drive up participation,’ which equals profits and power, so online social platforms and mainstream media support and even promote uncivil acts
- Technology companies have little incentive to rein in uncivil discourse, and traditional news organizations—which used to shape discussions—have diminished in importance
- Terrorists and other political actors are benefiting from the weaponization of online narratives, deploying human- and bot-based misinformation and persuasion tactics
Things will get better because technical and human solutions will arise as the online world splinters into segmented, controlled social zones with the help of artificial intelligence (AI).
- AI sentiment analysis and other tools will detect inappropriate behavior, and many trolls will be caught in the filter; human oversight by moderators might catch others (a brief illustrative sketch follows this list)
- There will be partitioning, exclusion, and division of online outlets, social platforms, and open spaces
- Trolls and other actors will fight back, innovating around any barriers they face
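To make the first bullet above concrete, here is a minimal, purely illustrative sketch of the kind of automated filter-plus-human-review pipeline the respondents describe. It is not drawn from the report: the hostile-word list and flagging threshold are hypothetical placeholders standing in for the trained AI sentiment models a real platform would use.

```python
# Illustrative only: a toy stand-in for the "AI sentiment analysis" filter the
# respondents predict. The lexicon and threshold below are hypothetical.

HOSTILE_TERMS = {"idiot", "moron", "hate", "stupid", "loser"}  # hypothetical lexicon
FLAG_THRESHOLD = 2  # hypothetical: auto-flag once two or more hostile terms appear


def score_comment(text: str) -> int:
    """Count hostile-lexicon hits in a comment (a crude proxy for a sentiment score)."""
    words = text.lower().split()
    return sum(1 for w in words if w.strip(".,!?") in HOSTILE_TERMS)


def triage(comments: list) -> dict:
    """Route comments: auto-flag obvious cases, queue the rest for human moderators."""
    flagged, for_review = [], []
    for comment in comments:
        (flagged if score_comment(comment) >= FLAG_THRESHOLD else for_review).append(comment)
    return {"flagged": flagged, "human_review": for_review}


if __name__ == "__main__":
    sample = [
        "Great point, thanks for sharing the data.",
        "You idiot, only a moron would hate facts this much.",
    ]
    print(triage(sample))
```

The point of the sketch is the division of labor the experts predict: obvious cases are caught automatically in the filter, while everything else is routed to human moderators.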
Oversight and community moderation come with a cost. Some solutions could further change the nature of the internet because surveillance will rise, the state may regulate debate, and these changes will polarize people and limit access to information and free speech.
- Pervasive surveillance will become even more prevalent
- Dealing with hostile behavior and addressing violence and hate speech will become the responsibility of the state instead of the platform or service providers
- Increased monitoring, regulation, and enforcement will shape content to such an extent that the public will not gain access to important information and possibly lose free-speech rights
“The vast majority of these experts believe the online environment will continue to be shaped by trolling and other anti-social behaviors and struggles over phony or semi-phony information sometimes presented in ‘weaponized’ forms,” said Lee Rainie, director of internet, science and technology research at Pew Research Center and a co-author of this report. “They predict that human and technological fixes will be implemented, but that an arms race with bad actors will persist that could fundamentally hurt the open internet that many of these experts helped create.”
Elon University professor Janna Anderson, a co-author of the report, said the expert respondents worry: “They said messages of hate and discord and the political manipulation being accomplished via fake news and the fanning of flames of fear are magnified by the ease of replication and distribution of information online. The impact of all of this, they say, is compounded by the fact that the firms that operate online platforms see audience attention and profits rise when they redistribute these negative or false messages.”
The report’s third co-author, Jonathan Albright, an assistant professor at Elon University, added, “The experts point out that most of the likely solutions to solve these issues of uncivil discourse raise their own problems because they are likely to involve corporate or government controls over free speech that also raise the potential for surveillance and remove the opportunity for anonymity online.”
Following is a brief sample of thoughts shared by participants in the survey:
Predictions about the future of the internet’s social climate
Vint Cerf, Google VP and co-inventor of the Internet Protocol: “People feel free to make unsupported claims, assertions, and accusations in online media… As things now stand, people are attracted to forums that align with their thinking, leading to an echo effect. This self-reinforcement has some of the elements of mob (flash-crowd) behavior. Bad behavior is somehow condoned because ‘everyone’ is doing it… Social media bring every bad event to our attention, making us feel as if they all happened in our backyards—leading to an overall sense of unease. The combination of bias-reinforcing enclaves and global access to bad actions seems like a toxic mix. It is not clear whether there is a way to counter-balance their socially harmful effects.”
Andrew Nachison, founder at We Media, said, “It’s a brawl, a forum for rage and outrage.”
Michael Kleeman, formerly with Boston Consulting and Sprint, now senior fellow at the Institute on Global Conflict and Cooperation at the University of California-San Diego: “Increased anonymity coupled with an increase in less-than-informed input, with no responsibility by the actors, has tended and will continue to create less open and honest conversations and more one-sided and negative activities.”
Baratunde Thurston, a Director’s Fellow at the MIT Media Lab, Fast Company columnist, and former digital director of The Onion: “To quote everyone ever, things will get worse before they get better. We’ve built a system in which access and connectivity are easy, the cost of publishing is near zero, and accountability and consequences for bad action are difficult to impose or toothless when they do. Plus consider that more people are getting online everyday with no norm-setting for their behavior and the systems that prevail now reward attention-grabbing and extended time online.”
Cory Doctorow, writer and co-owner of Boing Boing: “The internet is the natural battleground for whatever breaking point we reach to play out, and it’s also a useful surveillance, control, and propaganda tool for monied people hoping to forestall a redistributive future. The Chinese Internet playbook—the 50c army, masses of astroturfers, libel campaigns against ‘enemies of the state,’ paranoid war-on-terror rhetoric—has become the playbook of all states, to some extent… That will create even more inflammatory dialogue, flamewars, polarized debates, etc.”
Karen Blackmore, a lecturer in IT at the University of Newcastle: “Misinformation and anti-social networking are degrading our ability to debate and engage in online discourse. When opinions based on misinformation are given the same weight as those of experts and propelled to create online activity, we tread a dangerous path… In particular, social online communities such as Facebook also function as marketing tools where sensationalism is widely employed, and community members who view this dialogue as their news source, gain a very distorted view of current events and community views.”
Frank Pasquale, professor of law at the University of Maryland and author of The Black Box Society: “The major Internet platforms are driven by a profit motive. Very often, hate, anxiety, and anger drive participation with the platform. Whatever behavior increases ad revenue will not only be permitted, but encouraged, excepting of course some egregious cases.”
John Anderson, director of journalism and media studies at Brooklyn College, wrote, “The continuing diminution of what Cass Sunstein once called ‘general-interest intermediaries’ such as newspapers, network television, etc., means we have reached a point in our society where wildly different versions of ‘reality’ can be chosen and customized by people to fit their existing ideological and other biases. In such an environment there is little hope for collaborative dialogue and consensus.”
David Durant, a business analyst at UK Government Digital Service, argued, “It is in the interest of the paid-for media and most political groups to continue to encourage ‘echo-chamber’ thinking and to consider pragmatism and compromise as things to be discouraged. While this trend continues, the ability for serious civilized conversations about many topics will remain very hard to achieve.”
Laurent Schüpbach, a neuropsychologist at University Hospital in Zurich, Switzerland, pointed out burgeoning acts of economic and political manipulation, writing: “The reason it will probably get worse is that companies and governments are starting to realise that they can influence people’s opinions that way. And these entities sure know how to circumvent any protection in place. Russian troll armies are a good example of something that will become more and more common in the future.”
Bryan Alexander, futurist and president of Bryan Alexander Consulting: “The number of venues will rise with the expansion of the Internet of Things and when consumer-production tools become available for virtual and mixed reality.”
David Wuertele, a software engineer at Tesla Motors: “Unfortunately, most people are easily manipulated by fear… Negative activities on the internet will exploit those fears, and disproportionate responses will also attempt to exploit those fears. Soon, everyone will have to take off their shoes and endure a cavity search before boarding the internet.”
About possible solutions for bad online behavior
Galen Hunt, a research manager at Microsoft Research NExT: “As language-processing technology develops, technology will help us identify and remove bad actors, harassment and trolls from accredited public discourse.”
Marc Rotenberg, executive director of the Electronic Privacy Information Center (EPIC): “The regulation of online communications is a natural response to the identification of real problems, the maturing of the industry and the increasing expertise of government regulators.”
David Clark, a senior research scientist at MIT and Internet Hall of Famer: “It is possible, with attention to the details of design that lead to good social behavior, to produce applications that better regulate negative behavior. However, it is not clear what actor has the motivation to design and introduce such tools. The application space on the internet today is shaped by large commercial actors, and their goals are profit-seeking, not the creation of a better commons.”
Read the full report: https://www.elon.edu/u/imagining/surveys/vii-2016/social-future-of-the-internet/