
The 2017 Survey: 
The Future of Truth and Misinformation Online, Part 3 of 6

What are the societal consequences of a polluted information environment?

Technologists, scholars, practitioners, strategic thinkers and others were asked by Elon University and the Pew Research Internet, Science and Technology Project in summer 2017 to share their answers to the following query; respondents were evenly split, 51-49, on the question:

What is the future of trusted, verified information online? In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially-destabilizing ideas? 

This page holds a full analysis of the answers to the second of five follow-up questions:

What are the consequences for society as a whole if it is not possible to prevent the co-opting of public information by bad actors?

Among the key themes emerging from 1,116 respondents' answers were:

- We could be turning a corner into an extremely dangerous time in human history.
- Democracy is damaged when people cannot trust in information; some are likely to be overwhelmed and simply give up on participating in civic life.
- Social, economic and political inequities are seen by some as a root cause.
- A lack of 'common knowledge' hinders finding common ground and common solutions.
- An inability to trust is damaging to all human relationships and systems.
- Societies must support and turn to credentialed sources in the future - to 'trusted actors.'
- This is the human condition - misinformation lives - yet some choose to expect or at least be optimistic that 'truth wins out.'
- The jury is out on whether any actions taken will have net-positive results.

If you wish to read survey participants' credited responses with no analysis, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_credit.xhtml

If you wish to read anonymous survey participants' responses with no analysis, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_anon.xhtml

Summary of Key Findings: Follow-Up Question 2

Democracy is damaged when people cannot trust in information. A lack of 'common knowledge' hinders finding common ground and common solutions. Misinformation lives; will truth win out? 

Most of the expert respondents in this canvassing said people have to adjust well and quickly to this new information environment or there could be extremely serious effects.

Among their predictions are larger, seemingly insurmountable divides between social groups and the further fragmentation of society; people being overwhelmed by an avalanche of information in which they cannot discern fact from fiction; deepening distrust damaging the development of problem-solving policies; and the decline of democracy that results from an uninformed voting public and – possibly – a withdrawal by the public from civic engagement.

Some worry we could be turning a corner into
an extremely dangerous time in human history

An internet pioneer and principal architect in computing science replied, “Governments claiming to live in a ‘post-truth’ age slaughtered tens of millions during World War II, so we’ve seen the movie before. It didn’t end well.”

Jerry Michalski, futurist and founder of REX, replied, “My greatest fear at the moment is that the post-factual world isn’t a three-year aberration but is instead a 300-year era, much like the 400 years after Gutenberg transformed and traumatized Europe.”

A professor of law at a major U.S. state university commented, “America will look more and more like Russia. There will be more extremes of rich and poor, more corruption, more distrust.”

A project manager for the U.S. government responded, “We will fight amongst ourselves and really do some serious damage to our way of life.”

Joshua Hatch, president of the Online News Association, noted, “Honestly, I think it has the potential to lead to civil war. I don’t think that will happen, but I do think those are the stakes.”

A research scientist for the Computer Science and Artificial Intelligence Laboratory at MIT said, “In the most-extreme case, it could mean the dissolution of liberal democracies and replacement by chaos or tyranny. Any increase in risk of either should concern everyone, even if the probability is deemed small.”

A senior global policy analyst for an online citizen advocacy group simply sent the link to a photo representing Dante’s Inferno: https://media.npr.org/assets/artslife/arts/2010/02/dantesinferno/inferno_archive-becf91cdf5140a602f77b5514b518ba7db4db4f6.jpg?s=1400

Fredric Litto, professor emeritus, University of São Paulo, Brazil, wrote, “As a longtime student of information technology, I have long believed that a new ‘sickness’ would arise in modern society, one of increasing complexity in everyday life causing many people to resist leaving the security of their beds each morning, preferring to remain under the covers so as not to have to deal with decision-making and the exhausting need to constantly be learning new ways of working. If, in addition, we add to this situation the question of the insecurity arising from the inability – to never be able to entirely trust the information one encounters – there will inevitably be massive evidences of stress-provoked illness.”

Stephen Bounds, information and knowledge management consultant, KnowQuestion, said, “This is the new reality of warfare today. Wars are going to be literally fought over ‘information supply lines,’ much as food supply lines were critical in wars of years gone by.”

John Wilbanks, chief commons officer, Sage Bionetworks, replied, “We’re there already. Polarized sub-groups refusing to acknowledge truth of anything they disagree with weaponized through intentional manipulation to lower trust in institutions and intermediaries. What’s next is like this, but moreso.”

Amber Case, research fellow at Harvard Berkman Klein Center for Internet & Society, replied, “The consequences for society are similar to what we’ve already seen: Greater rifts between political parties and social classes. Crime due to misunderstandings. A waste of advertising revenue on knee-jerk reactions instead of reflective ones. So much of our online meeting is about speed and reaction that we can’t pause to take a breath before reacting.”

Tom Valovic, Technoskeptic magazine, noted, “Consequences are discussed in my book ‘Digital Mythologies’: Postmodern chaos and moral relativism along the lines of exactly what we’re seeing now.”

David Wood, a UK-based futurist at Delta Wisdom, said, “We might blunder into World War III, a new Dark Age, or another existential disaster – driven there by foolish judgment and bad feelings stirred up by bad information.”

Deirdre Williams, an internet activist, replied, “We will become a population of credulous slaves until the pendulum begins to swing in the opposite direction.”

Mercy Mutemi, legislative advisor for the Kenya Private Sector Alliance, observed, “Manipulation. And not just during elections or referenda. I imagine a world in which simple policy decisions may be manipulated to the benefit of a few… Weaponized fake news could turn us all into robots and the worst part is we wouldn’t even know we are being manipulated.”

Michael Rogers, principal at the Practical Futurist, wrote, “False information is toxic for just about every level of decision-making, from personal (say, vaccination) to public (elections, say, or mobilizing a nation for war). Often one can’t know information is toxic until it has been acted on, and then it’s too late.”

Morihiro Ogasahara, associate professor at Kansai University, said, “Democracies will be corrupt because of lack of credible information for decision making.”

Charlie Firestone, executive director, Aspen Institute Communications and Society Program, commented, “The result is the road to authoritarian society where the people are left with the word and power of the state to give them the truth. Not a desirable state.”

An executive consultant based in North America wrote, “The speed and broad spread of bad information could have devastating results – it has already impacted the outcome of a presidential election. It could also impact the markets and the economy by tainting corporate and personal reputations.”

Esther Dyson, a former journalist and founding chair at ICANN, now a technology entrepreneur, nonprofit founder and philanthropist, said, “The consequences would be increased cynicism and despair; general breakdown over the long term.”

Howard Rheingold, pioneer researcher of virtual communities, longtime professor and author of “Net Smart: How to Thrive Online,” said, “People will lose their liberties (we are already seeing that in the U.S.) because of political info-manipulation; people will lose their lives because of bad medical information.”

Bob Frankston, internet pioneer and software innovator, said he hopes for the best, writing, “Faith may trump understanding. I think longer-term understanding will prevail because the ideas are more powerful. But it won’t be a simple process of refinement.”

Jamais Cascio, distinguished fellow at the Institute for the Future, noted, “Multiple scenarios are possible. We could see this as triggering a memetic immune system, where we develop both technological and cultural tools and methods to favor veracity over stimulation; we could see the further hardening of ideological barriers, with entire communities/regions becoming controlled in ways that bar wrong-think; we could see the disproportionate success of a particular worldview causing the followers of bad actors to fail. Two broad scenarios: 1) civil conflict, where ideological and cultural divisions metastasize into violence; or 2) our cognitive immune systems of skepticism and verification become stronger and society becomes healthier as a result."

An anonymous internet pioneer and longtime leader in ICANN said, “Consequences include wholesale fraud and other malicious behavior. We haven’t been able to prevent it in real life, so why should we assume that we can do it in cyberspace?”

Democracy is damaged when people cannot trust
in information: some are likely to be overwhelmed
and simply give up on participating in civic life

Many respondents discussed the impacts of a misinformed, divided, confused populace on democracy, some of them noting how a lack of faith in a political system impacts every aspect of the life of a society.

Marc Rotenberg, president, Electronic Privacy Information Center, wrote, “The consequence is the diminishment of democratic institutions.”

Thomas Frey, executive director and senior futurist at the DaVinci Institute, replied, “Failure to find a solution will mean rapidly deteriorating confidence in all systems including the stock market, financial systems and even voting/elections at the heart of a democratic society.”

danah boyd, principal researcher, Microsoft Research and founder, Data & Society, wrote, “Democracies depend on functioning information ecosystems. If we don’t address the fundamental issues at play, we could risk the collapse of democracies.”

Evan Selinger, professor of philosophy, Rochester Institute of Technology, wrote, “A well-functioning democracy is inherently incompatible with a post-fact society. Full stop.”

Ari Ezra Waldman, associate professor of law and New York Law School, wrote, “Democracy dies in a post-fact world. When the public square is so crowded and corroded by misinformation, misleading information, and outright lies, a number of things happen. First, trust falls. Trust in institutions, including the media, falls, spawning denialism and conspiracy theories. Second, knowledge falls. That is, society becomes less educated. When nothing is verifiable and everything is an opinion, people can start believing the Earth is flat and have no reason to doubt themselves because, as Mr. Trump has said about Russian hacking, ‘no one really knows.’ In a world where ‘no one really knows’ anything, anyone can say anything. Rationality, then, ceases to exist.”

Dan Gillmor, professor at the Cronkite School of Journalism and Communication, Arizona State University, commented, “It would mean the end of democratic self-rule. If all information is equal, no matter whether it’s true or false, benign or malign, we can’t make decisions based on reality.”

David Manz, a cybersecurity scientist, replied, “The result will be ‘1984’: dystopian authoritarians and powers of all types can more easily manipulate the feelings of the people.”

Charles Ess, a professor of media studies at the University of Oslo, wrote, “The consequences will be apocalyptic – at least for those of us who still believe in the basic notions of human freedom, sociability, rational discourse and affiliated rights to privacy and freedom of expression that ground democratic polity. The alternatives are already visible in Putin’s Russia and elsewhere: societies masquerading as ‘democratic’ that are in fact thinly disguised electronic oligarchies that render most of us into feudal chattel.”

Scott Shamp, an interim dean at Florida State University, commented, “We will be ruled by those who view accuracy as secondary to gaining advantage.”

Rick Forno, senior lecturer in computer science and electrical engineering at the University of Maryland-Baltimore County, said, “The consequences are an uninformed, misinformed electorate that elects similar-minded people.”

Serge Marelli, an IT professional who works on and with the Net, wrote, “The consequences are lots of Donald Trumps (or Putins, Erdogans, or...) in many countries.”

Mike Meyer, chief information officer at University of Hawaii, wrote, “The current governmental and social structure will collapse.”

A professor of education policy commented, “For some, it will mean the continued (or new) loss of the fundamental right to vote. For others, it will mean increasing precarity with fewer governmental safety nets to help cushion blows. For others it will mean loss of life, of property, of basic rights. For others, it will mean greater wealth, increased capacity to pillage natural resources, et cetera. And importantly, this is not U.S. society – this is global.”

Jan Schaffer, executive director of J-Lab, said, “People will not be able to make informed choices, and will likely opt out of political participation. Bad actors will continue to be elected. Progress will be hampered by drama and a focus on maintaining power. Civil society will decline.”

Uta Russmann, a professor whose research is concentrated on political communication via digital methods, noted, “Society as a whole will increasingly rely on software such as IBM Watson. Moreover, even though probably not in the next 10 years but in the long run, computers will become more intelligent than humans. In the next 10 years, around the world, almost everyone will have a smartphone and hence almost the same access to information and education. But this development will cause less trust between people.”

A professor of rhetoric and communication wrote, “Our democracy depends on the presumption of shared facts, rational behaviors, and a kind of regularity and trust in the laws. Online, all information, even bald-faced lies, can be made to look equal to the actual facts. The consequences are that we swing from one extreme to another, that people stop talking to anyone they disagree with, that we move away from building trust based on a common set of values and truths.”

Adam Gismondi, a researcher at the Institute for Democracy & Higher Education, Tufts University, observed, “The consequences would be enormous, with many problems emerging that we are currently unable to foresee. A problem we can predict, however, is major damage to our electorate, who will be increasingly distrustful of our major institutions (including media, scholars and government officials), and as a result less informed on the very issues that drive our democracy.”

Kenneth Sherrill, professor emeritus of political science, Hunter College, City University of New York, said, “As people increasingly become unable to determine which information sources are trustworthy, they will decide that it is not worth their time and energy to do the work of figuring out what’s true. Partisans will remain wedded to their trusted partisan sources. Others will drop out. Society will become more fragmented and civic engagement will decline.”

Laurel Felt, lecturer at the University of Southern California, commented, “The consequences are dire because it disincentivizes following the news and/or acting on objectionable behaviors (e.g., miscarriages of justice, corruption, et cetera) because citizens may doubt that such flagrant offenses are being committed OR they may get the distorted sense that they happen all the time.”

Peng Hwa Ang, an academic researching this topic at Nanyang Technological University, observed, “Interestingly, some research suggests that it is not the burst of fake news (also known as public information by bad actors) that is the problem. It is the corrosive drip of such news that erodes confidence in democracy. The very bad outcome is the distrust in the democratic process. If we look at the U.S. and UK, the winners of both elections are those who want to erode such confidence.”

Many said that when the public loses trust or when they are overwhelmed by too much contradictory input they will lose faith in the process and ‘opt out’ of participation.

C.W. Anderson, professor at the University of Leeds, wrote, “More and more people will opt out of civic engagement, and civic life, generally, as they become overwhelmed with separating signal from noise. It will simply become too much work. Which will leave people in charge who do not have the best interests of the general population at heart. Most likely, the polarization of American society will only spread further.”

Justin Reich, assistant professor of comparative media studies, MIT, noted, “Autocratic societies have recognized that the most effective way to manage societal information is less about countering particular perspectives and more about flooding people’s information channels with distraction, misdirection and falsehoods. In the absence of reliable arbiters of truth and falsehood, people are more likely to defer to tribal and ideological loyalties. Gary King has excellent research on state-sponsored social media propagation in China, showing that one of the central aims is flooding social media with noise and distraction.”

A professor and researcher of American public affairs at a major university replied, “Perhaps the most troubling consequences are a general lowering of trust in institutions and withdrawal from public life. This creates a self-reinforcing spiral in which the least scrupulous people gain outsize influence on the political system.”

A professor of law based in North America replied, “Many people will not trust the information, which makes it possible for rumor and innuendo to play an even bigger role than today. Policy making, already difficult, will become even harder. People will not trust.”

David Brake, a researcher and journalist, replied, “Political debate will become increasingly difficult if both sides of the debate proceed from very different premises about the state of the world.”

William Anderson, adjunct professor, School of Information, University of Texas-Austin, replied, “The primary consequence is that there will not be any large-scale society that shares common goods and common goals.”

A professor of information science at a large U.S. state university wrote, “Consequences include a loss of trust in other people, in political powers and in countries."

An author/editor/journalist wrote, “There are forces both in the political spheres of the world and the natural spheres that are ongoing and untreatable without the collective strong-minded action of people. They require a determined and relentless response from humans to be overcome, and when humans are too busy fighting each other, there is scant energy left.”

Social, economic and political inequities
seen by some as a root cause

Societal inequities are seen by a share of the respondents as an underlying cause of much of the misinformation, the disagreements, the shrinkage of what might have been considered “common knowledge.”

A professor at MIT commented, “We can look forward to intensified tribalism and a breakdown of a commonly held social norms… Some people may be dupes, but most are simply navigating the world in ways that they think make sense. Media don’t ‘do’ things – people do. So let’s push for economic and social equity, for inclusion, for critical thinking and solve the problem that way. Once we have systems that vet ‘the truth,’ you can be sure that power will find a way to use it to its own advantage. My ‘truth’ is invariably another man’s ‘lie,’ so while we can and should have a pitched debate over truth claims, nothing beats an empowered, thoughtful and critical populace.”

Jon Lebkowsky, web consultant/developer, author and activist, commented, “We’re already seeing the consequences: deep polarization, suspicion, manipulation of public sentiment, erosion of rational civil discourse, widespread confusion and increasingly chaotic public/political spheres.”

A research scientist who works at Google said, “The consequence could be a two-class society where the competent minority is ruled by an ignorant majority.”

A partner in a services and development company based in Switzerland predicted that using censorship as a tool will make things worse, writing, “Ultimately [the result of the expansion of misinformation will be] bellum omnium contra omnes – the war of all against all – as described by Hobbes. Of course society automatically creates new centers of power, making do with disparate inadequate tools if no orderly adequate ones are available. One such inadequate tool is censorship; it is likely to spread and further compound the problem of trust."

Sahana Udupa, professor of media anthropology at Ludwig Maximilian University of Munich, explained how the popularly used term “bad actors” would be better stated as “extreme speech” and then noted how inequities have eliminated some people. She wrote, “‘Bad actors’ is a highly political term and one cannot have a blanket approach to digital information based on a strict binary of bad and good. We have developed the concept of ‘extreme speech’ to understand how digital speech as a situated practice pushes the boundaries of legitimate speech along the twin axes of truth-falsity and civility-incivility. This means we remain aware of how ‘bad actors’ and ‘hate speech’ are framed in the first place. With such a situated understanding, digital speech practices should be mapped, and if found harmful for contextually rooted, historically sensitive reasons, one must raise strong barriers. The reason is such harmful content can render it so vulnerable groups and individuals (along the axes of caste, gender, religion, ethnicity, numerical majority/minority etc., and their intersections) are not prevented from participating in the public debate. The rise of right-wing populism and demonic majoritarianism have a lot to do with negative forms of extreme speech forced into the channels of digital media. Digital media consumers are not gullible. But trust networks are radically reconfigured in the digital age. What comes on Whatsapp is rarely disputed because messages are individualized and laced with multimedia enhancements which make it difficult to recognize and bust false and hateful narratives.”

A lack of 'common knowledge' hinders finding
common ground and common solutions

Larry Diamond, senior fellow at the Hoover Institution and FSI, Stanford University, observed, “If we fail in this task, social and political polarization will deepen because there will be no basis for common knowledge and understanding about any issue. Every political orientation will be living in its own information bubble, even more so than today. Distrust in institutions of all kinds will proliferate, and the ability of authoritarian regimes to subvert democratic practices, as Russia did with the 2016 elections in the U.S., will surge.”

Maja Vujovic, senior copywriter for the Comtrade Group, noted, “The very fabric of society would unravel if the public could never trust public information. Worse yet, spurious agents could control large groups by serving them simplified or distorted information. People have proven severely susceptible to propaganda, over and over. Democracy gets deformed without independent information to guard it.”

Steve McDowell, professor of communication and information at Florida State University, replied, “Without a common set of verifiable facts it is hard to have an informed public debate on problems and challenges confronting society, and to move from there to identifying priorities and crafting social or policy responses.”

Alan D. Mutter, media consultant and faculty at graduate school of journalism, University of California-Berkeley, replied, “Loss of trust starts with diminished confidence in the truth of empirically indisputable facts like climate change or the brutal consequences of underfunding Medicaid. If we cannot agree on the facts, then we cannot have the sort of dialogue that produces good public policy and a healthy society. When truth is denied and trust is devalued, our democracy is at peril.”

A political science and policy scholar and professor said, “The result could be the end of a shared reality, which makes it much harder to work together as a society.”

Jeff Jarvis, professor at the City University of New York Graduate School of Journalism, commented, “The consequences are dire indeed: a failure of the public conversation and deliberation that is the engine of democracy. But beware the temptation to cry as journalists do, ‘You’ll miss us when we’re gone.’ Many would not. We must reinvent journalism for this new reality to truly serve diverse communities in society and to convene them into informed and civil conversation, not just to produce a product we call news that we insist the public must trust, or else!”

Geoff Scott, CEO of Hackerati, commented, “That depends on whether people choose to think critically and independently, pursue information from outside of their information bubble, and are open to having their minds changed. This is primarily an education and social challenge. If we can’t address it at this level, bad actors will simply need to assert a narrative that serves their purpose and people will believe it.”

Glenn Edens, CTO for Technology Reserve at Xerox PARC, commented, “The continued fragmentation of information consumers is a serious issue. Society, of course, works best when there are a set of shared values, transparency and verifiable sources – we will have to adapt to this new world and the means are not clear yet.”

Veronika Valdova, managing partner at Arete-Zoe, noted, “Disintegration of information environment to the point that there is no universally accepted picture of reality would have profound consequences on the functioning of the entire society. People act on information they have, they buy property, pursue careers, move around the country and the world, conduct business, undergo treatments, dedicate their time and resources to college degrees based on expectations and picture of the world they have. If there is no consensus on the basic understanding of reality, people cannot make rational informed choices relevant to their everyday life. At individual level, a prolific disinformation results in shattered reputation and loss of status, money, opportunity and life. Alternative information pipelines that develop in response to overwhelming amounts of falsehood in major media may be equally damaging.”

Mark Glaser, publisher and founder, MediaShift.org, observed, “The consequences are terrible. Without having trusted information, it’s difficult for many organizations to function properly including governments. We rely on factual information as a cornerstone of a functioning democracy.”

Scott Spangler, principal data scientist, IBM Watson Health, wrote, “The loss of a common basis for fact and perceived reality will lead to greater and greater societal fragmentation and conflict.”

Henning Schulzrinne, professor and chief technology officer for Columbia University, said, “It further degrades the notion of a common basis of fact for discussions of public policy and governance, as all information will be seen as equally suspect and untrustworthy, or as just an opinion, a matter of taste.”

Paul Hyland, principal consultant for product management and user experience at Higher Digital, observed, “Civil discourse and political debate will be increasingly difficult or impossible."

Laurie Rice, associate professor at Southern Illinois University-Edwardsville, said, “When misinformation is common and widespread, trust in all sorts of sources of information – both good and bad – weakens. In addition, those lacking the tools, time, or skill to weigh and assess information from various sources can become more easily manipulated by bad actors.”

An inability to trust is damaging
to all human relationships and systems

Many respondents were most concerned with the loss of trust – not only in information but in other humans and in human systems.

Alejandro Pisanty, a professor at UNAM, the National University of Mexico, and longtime internet policy leader, observed, “The consequences of public information being coopted by bad actors are dire; a downward spiral of reinforced ignorance-bad judgment-generalized disbelief preys on the prejudices, bias and sheer lack of knowledge of large sectors and allows bad actors either to coopt them, or to be allowed free rein to govern and prey.”

David Conrad, a chief technology officer, replied, “Trust in systems, processes, and institutions will continue to degrade, potentially leading to increases in demagoguery, extremism and civil strife.”

Richard D. Titus, CEO for Andronik and advisor to many technology projects, wrote, “All societies are built upon trust. The Magna Carta, the Declaration of Independence – trust is critical to civil society. One need only look where the deterioration of trust occurs – Somalia, Syria, Venezuela and now even the U.S. – to see rising militancy and civil unrest as the direct descendants of a loss of trust.”

Erhardt Graeff, a sociologist doing research on technology and civic engagement at the MIT Media Lab, said, “Although mistrust can be used constructively to buttress democracy, when it is allowed to fester into chaos it poses an existential threat to the institution. The creation of chaos by seeding doubt and conspiracy theories – a highly refined strategy in Russian propaganda – allows authoritarianism to rise. When all facts are dubious and you cannot trust anyone, a broad power vacuum can emerge which entrenched powers and charismatic leaders are poised to fill by promising nostalgic visions of strong-armed stability.”

Sharon Haleva-Amir, lecturer in the School of Communication, Bar Ilan University, Israel, said, “The consequences depict a rather dark future of a more-polarized world; a world in which words, facts or truth have no meaning where people will not be able to cooperate for worthy causes as they will not trust one another and in which governments control citizens by spreading whatever news they want to. With no trustworthy media to count on, no democratic society could live on and prosper.”

Sonia Livingstone, professor of social psychology, London School of Economics and Political Science, replied, “I have two kinds of anxieties: 1) Bad actors in the sense of hackers, purveyors of misinformation and criminals will get dealt with by state-led systems of criminal justice and law enforcement. I’m more worried about 2) the commercial reshaping of ‘information’ in the interests of powerful private sector actors with a lot of lawyers and lobbyists. People already can’t tell what’s trustworthy or true, so probably those with knowledge will withdraw and find a niche alternative, while those without knowledge will withdraw and just become yet more disaffected, which will exacerbate inequality and disruption.”

Hazel Henderson, futurist and CEO of Ethical Markets Media Certified B. Corporation, said, “As we have seen, whole societies can be disrupted, citizens mistrusting and informing on each other – these are the classic tools of takeovers by totalitarian leaders and authoritarian governments. George Orwell was right!”

Brooke Binkowski, managing editor of online fact-checking site Snopes.com, replied, “Destabilization; chaos; bad faith; lack of national unity, which in turn further weakens the state; lack of trust in institutions within and without the United States; violence toward the most vulnerable (i.e., migrants and impoverished citizens).”

Larry Keeley, founder of innovation consultancy Doblin, observed, “In a word: Devastating. It will erode the basic check on power (governmental, corporate, institutional) that depends on trusted information systems. George Orwell’s dystopian world becomes a reality – but far worse, since there will be many more ways for anyone with an agenda to find fanatical adherents. After a while, citizens, as in Russia or North Korea, will trust no one.”

George Siemens, professor at LINK Research Lab at the University of Texas-Arlington, commented, “Effective discourse across political parties and other belief systems is at risk if we are not able to create a trusted digital information ecosystem. The consequences for society will be a loss of many of the social and connected aspects of the internet, as confidence will be stripped from these spaces. Dialogue will return to closed, trusted spaces with intermediaries (as with traditional media).”

Kenneth R. Fleischmann, associate professor at the University of Texas-Austin School of Information, wrote, “Certainly, there are egregious cases of introducing fake news to attempt to change the outcome of, for example, the 2016 election, particularly when it is apparent that much of this was done by or with the assistance of Russian hackers. However, any efforts to clamp down on such fake news might constitute a form of censorship that might undermine trust – there is a fine line between ensuring trustworthiness of public information and leading to a centralized government information system where there is only one trusted set of information (which is not necessarily trusted, on a deep level, by a large chunk of the population).”

Paul Saffo, longtime Silicon Valley-based technology forecaster, commented, “People focus on dramatic, extreme outcomes, but the greater danger comes from the insidious effects of cynicism and distrust engendered by an unreliable medium. The consequence will be increasing friction in social systems that in turn will retard the effective functioning of global civil society in the face of ever-growing complexity.”

Societies must support and turn to credentialed
sources in the future - to 'trusted actors'

Many respondents pointed out that the overwhelming amount of information – “mis” and otherwise – about any topic today is part of the problem. They note that when people feel overwhelmed by information they often turn to social media echo chambers or other spaces where they can find familiar information with which they agree. They may often be misinformed or manipulated or just not get the chance to see other viewpoints fairly shared. 

Irene Wu, adjunct professor of communications, culture and technology, Georgetown University, said, “Information and communication are the weft and weave of the social fabric. If they are unreliable, society cannot hold together. However, decentralization of power over information while allowing trusted institutions to emerge is a better path forward than centralizing the responsibility for good information with security services or with the government.”

An anonymous respondent from the Berkman Klein Center at Harvard University noted, “There have always been and will always be bad actors. The danger is in the unwillingness of the general population to accept responsibility for assessing the intentions, actions and information of those people around them. When the people won’t accept that responsibility, a society can slip easily into group thinking or absolute rule by the minority who control the channels of information.”

Alan Inouye, director of public policy for the American Library Association, commented, “One implication is that the general public will need to be increasingly more sophisticated about information and communication – being able to critically examine not only the content of messages but also the context in which they are communicated.”

Sam Punnett, research officer, TableRock Media, replied, “The consequences are dire, leading to an erosion of social institutions. The outcome is a choice: to be educated or to be manipulated. Media/information literacy is essential in a society that generates so much media. Put glibly, you are what you view. Information flow to the individual will only get increasingly customizable in the future. If one lacks awareness of one’s media diet then one gives little thought to the potential hazards of a ‘junk’ diet that feeds exclusively entertainment over that which feeds an informed citizen.”

A number of respondents expect the reaction by some people to the recognition that there is a lot of misinformation out there will be a return to neutral information sources.

A researcher at Karlsruhe Institute of Technology replied, “Societies need to develop strategies – and can – to identify and tackle unwanted actions. One effect might be that trust in established authorities will increase because they will be regarded as the only ‘safe zones’ in the otherwise wild information landscape. This obviously requires that such authorities meet the expectations; otherwise alternative (‘bad’) actors might prevail.”

An associate professor at a major Canadian university wrote, “We may see a shift back to notions of authority that drove pre-internet print publishing and journalism.”

Leah Lievrouw, professor in the department of information studies at the University of California-Los Angeles, observed, “Bad actors have always attempted to capture and shape knowledge for their own purposes and gain, whether we call it revisionism, propaganda, spin, public relations, ‘curation,’ what have you. The question is whether, and which, information sources can be perceived as impartial, thorough, dispassionate, and fair. That’s a cultural question: do we even want fair, impartial – modern – information sources? If so, we’ll find or create them. If not, no amount of technological intervention will magically ensure information quality.”

Andrew Odlyzko, professor of math and former head of the University of Minnesota’s Supercomputing Institute, said, “We live in an environment of information and misinformation. We will simply have to learn how to choose the webs of trust we rely on.”

Geoff Scott, CEO of Hackerati, commented, “That depends on whether people choose to think critically and independently, pursue information from outside of their information bubble, and are open to having their minds changed. This is primarily an education and social challenge. If we can’t address it at this level, bad actors will simply need to assert a narrative that serves their purpose and people will believe it.”

Jim Hendler, professor of computing sciences at Rensselaer Polytechnic Institute, commented, “We will need to develop social and community mechanisms to try to deal with this – the problem is not new to digital media, propaganda, for example, has been around a long time – the bigger issue is whether newer and faster ways of distorting information can be countered by newer and faster ways of providing balance.”

Liam Quin, an information specialist with the World Wide Web Consortium, noted, “What we need is not to prevent lies, but to make it easier to find out whether something is true.”

The assistant director of a digital media and learning group at a major U.S. university said, “It’s critical to have journalism organizations such as the Washington Post continue to fight to provide deep and critical political coverage. The consequences of not supporting an independent press are a lack of a public record, the inability to provide a coherent approach and narrative to a variety of issues, as well as the broader enterprise to keep in check immoral and incompetent government entities, corporations and bad actors.”

Jeff Jarvis, professor at the City University of New York Graduate School of Journalism, commented, “The consequences are dire indeed: a failure of the public conversation and deliberation that is the engine of democracy. But beware the temptation to cry as journalists do, ‘You’ll miss us when we’re gone.’ Many would not. We must reinvent journalism for this new reality to truly serve diverse communities in society and to convene them into informed and civil conversation, not just to produce a product we call news that we insist the public must trust, or else!”

This is the human condition - misinformation lives.
Some choose to expect or at least
remain optimistic that 'truth wins out'

A share of respondents said while the information environment today is distressing it exhibits typical behavior that has been made more visible due to technological change.

Stephen Downes, researcher with the National Research Council of Canada, commented, “The co-opting of public information by bad actors has already taken place, and has been the case for a long time. The internet did not create this phenomenon. We know what happens – wars are started based on fake stories, mass persecution of racial, religious or political groups takes place, the environmental and health impacts of dangerous chemicals and processes are ignored, etc. The history of information before the internet is full of such cases. What is new is 1) the means to create disinformation have become democratized, but 2) so have the means to detect it.”

Matt Armstrong, an independent research fellow working with King’s College, formerly executive director of the U.S. Advisory Commission on Public Diplomacy, replied, “The society will be co-opted. The naive, the angry and the shortsighted will be enlisted in support of bad actors, and any wake-up or regret will come too late. This is not new, despite our attempts to frame this issue as unusual and unique to our age.”

Francois Nel, director of the Journalism Leaders Programme, University of Central Lancashire, noted, “(With apologies to the Bible) like the poor, bad actors and taxes will always be with us. And we have, and will, continue to cope.”

David C. Lawrence, a software architect for a major content delivery and cloud services provider whose work is focused on standards development, said, “It maintains the status quo of human social life throughout our entire existence as a species, and that’s probably for the better. Knowing that you have to be prepared for the possibility that someone is misleading you is better than trusting everything and then ultimately being taken advantage of when the bad actors find a way to get their message out.”

John Lazzaro, a retired electrical engineering and computing sciences professor from the University of California-Berkeley, wrote, “Society will develop antibodies: skepticism, common sense, ‘extraordinary claims require extraordinary proof.’ For a real-world analogy, the rise of the three-card Monte con in the New York subway in the 1970s did not lead to an economic catastrophe.”

Scott Fahlman, professor emeritus of AI and language technologies, Carnegie Mellon University, pointed out that media-based information has always been manipulable but the explosion of information has complicated things, writing, “We have moved from a world with just a few news outlets, trusted by some and able to manipulate public opinion, to a more diverse and democratic system in which everyone can spread their opinions (and deliberate lies if they choose to). We have seen that it is now much harder for people to decide what to believe, and many people no longer make the effort, just living in their own echo chambers.”

Brian Harvey, teaching professor emeritus at the University of California - Berkeley, said, “There’s nothing new about ‘coopting of public information’ except, maybe, that the internet lets non-rich people play. The best vaccination I know of against fake news is to study Marx and understand the class structure of society, but this is pretty much a non-starter in the U.S.”

Tom Rosenstiel, author, director of the American Press Institute, senior non-resident fellow at the Brookings Institution, commented, “Over the stretch of history, I think the signals are fairly clear. It requires political leadership, and cooperation by political leaders, to blunt the effect of misinformation. It is not going to be something that technology can solve, though it is something technology can make worse. Will the political parties and political leaders exploit the innate potential of citizens to believe false rumors, or will they try to rise above it?”

A selection of additional remarks on historical context:

• An anonymous professor of media and communications commented, “It will continue as it always has. Truth is not a necessary condition for social function.”
• The dean of one of the top 10 journalism and communications schools in the U.S. replied, “Thinking that today’s ‘fake news’ is different from other major points in history is silly. We’ve always had enormous sources of fake news passed along one-to-one, by bad sources, through misleading PR efforts, by phone, by fax, by penny press, by party press, by McCarthyism and similar efforts, even in religious venues. We always believed that ‘truth and falsehood grapple,’ only that we are optimistic that truth eventually wins out.”
• And an anonymous CEO and consultant based in North America said, “This has been a problem since Man started communicating in a physical medium. We have yet to figure out how to deal with bad actors.”
• An independent journalist and longtime Washington correspondent for leading news outlets said, “The consequence? It’s called life.”

The jury's out on whether any actions
taken could have net-positive results

Whether they predicted that it won’t improve or it will improve in the next decade, the vast majority of respondents wish for a better information environment in the future but are unsure if it can be achieved.

Sandro Hawke, a member of the technical staff for the World Wide Web Consortium, shared a common point of view in writing, “If we let bots, people with psychosis and enemy agents (whoever that might be in the given context) speak with the same voice as members of our communities, the damage will be incalculable. Literally, the end of civilization, I expect. We have to maintain strong attribution and reputation elements in public discourse. We have to be clear who is saying what, and how it fits in with how they’ve behaved in the past.”

The majority of respondents indicated they would like to see more people actively thinking through the possible ways to enhance the information environment, but some expressed doubts that much can be done without causing other negative effects, and some said human nature is such and the advances in the weaponization of technology are such that the current information atmosphere may not be something that can be improved upon.

For instance, in their answers to this and all questions in this canvassing many expressed doubts in the public’s willingness to pursue and consume a more fact-based information diet, no matter how much information literacy is emphasized in society.

Fred Davis, a futurist based in North America, predicted that if the current information environment is extended, “Mass media becomes even more propaganda-oriented. Social media creates the potential for ‘society hacking.’ One of the big problems of fake news is that it also casts doubt on real news.”

Giacomo Mazzone, head of institutional relations for the World Broadcasting Union, was pessimistic about people pursuing the most-reliable facts, writing, “The problem is that today’s society is hungry for fake news. It’s not looking to stop it.”

Some expressed concerns that regulatory fixes, legal changes, filtering, flagging and other suggested remedies will have their own drawbacks while possibly not having enough impact on extreme actors’ manipulation of the information environment.

Isto Huvila, professor of information studies, Uppsala University, replied, “The consequences [of misinformation in the next decade] will be disruptive but at the same time, authoritarian measures to prevent them can be equally bad.”

Micah Altman, director of research for the Program on Information Science at MIT, commented, “Judge Brandeis said, ‘If there be time to expose through discussion the falsehood and fallacies, to avert the evil by the processes of education, the remedy to be applied is more speech, not enforced silence.’ The results of either dominance of false information or silencing of ‘bad’ speech would be to severely impair liberal democracy.”

So, once again, the question is:
What are the consequences for society if
misinformation continues to expand?


Susan Etlinger, industry analyst, Altimeter Research, said, “To be blunt, I think we’re living it right now in many places in the world. And we’ve seen this before: erosion of human rights, weakening of the rule of law, suppression of free speech; all of these are symptoms of weakened public institutions and values.”

Andrew Nachison, author, futurist and founder of WeMedia, noted, “The result can be civic collapse, dark ages, widespread distrust of the other.”

Mohamed Elbashir, senior manager for internet regulatory policy, Packet Clearing House, noted, “Fake news/information could lead to catastrophic outcomes, spread hate speech, incite violence and undermine the core values of the peaceful exchange of conflicting ideas and opinions.”

Robert W. Glover, assistant professor of political science, University of Maine, wrote, “The complete breakdown of our political system, not to mention vulnerability in our basic social structures and economic system.”

Tony Smith, boundary crosser for Meme Media, commented, “Accelerating backslide towards authoritarians by noisy idealists of all flavours.”

Tim Bray, senior principal technologist for Amazon.com, wrote, “Damaging political ideas having influence over enacted policies.”

Joseph Turow, professor of communication, University of Pennsylvania, commented, “The escalated distribution of concocted facts casting opponents in a negative light will often mean that various actors will be able to use social media, off-the-beaten-track sites and apps, and email to reinforce silos of opinions regarding issues of concern to them. They will also at times be able to short-circuit political compromises across sectors of a society by planting widespread, reasonably plausible stories that sow distrust.”

Wendell Wallach, a transdisciplinary scholar focused on the ethics and governance of emerging technologies, The Hastings Center, wrote, “Power will remain in the hands of the powerful, or those best able to exploit the technology in pursuit of their own goals and ideology.”

Amy Webb, author and founder of the Future Today Institute, wrote, “AI agents will help to create stories, distribute content and eventually personalize it for each individual news consumer. Without action today, in the future more of our own opinions will be reflected right back at us, making it ever more difficult to confront contrary beliefs and ideologies. Extreme viewpoints will feel like the norm, not the outliers they actually are. Leaders – and everyday people – should base their decisions in fact, but we’re still human. Emotion gets in the way. I foresee extreme viewpoints and personal duress shaping our future world, and that should be a grave concern to us all.”

Jim Warren, an internet pioneer and open-government/open-records/open-meetings advocate, said, “If so, the results have been, are, and will be Orwellian until citizens overthrow the perpetrator(s) – which can take generations. The consequences of citizens – individually and collectively – being unable to prevent co-opting of information by bad actors would (or will?) be what it has always been: tyranny and citizen ‘helplessness’ until they (we!) revolt and redress the abuses.”

A global telecommunications leader based in Central America commented, “Societies need to put in practice ways and means, mainly through education, formation and training, to lessen the impact of bad consequences. Societies need to find appropriate incentives. Defense and attack will gain in sophistication over time.”

Susan Price, lead experience strategist at Firecat Studio, noted, “Anything created by humans can be hacked; protection must be based on the cooperation of motivated humans.”

Johanna Drucker, professor of information studies, University of California-Los Angeles, commented, “We are already seeing those consequences – a form of affective fascism that works through a phantasmatic force. How to counteract this? We can only hope that some recognition that human survival depends on responsible cooperation – among humans, but also between humanity and the ecology on which it depends – will have sufficient persuasive force to prevail.”

A selection of additional comments by anonymous respondents:

• “Economic terrorism generally leads to dictatorship.”
• “The dream for the internet was that it would give a wider range of viewpoints but instead it has created an echo chamber for false stories.”
• “Making decisions based on bad information is bad for everyone. The consequences are incalculable.”
• “As long as information systems are hiding their selection and computing mechanisms, it will be impossible to prevent the coopting of public information by bad actors.”
• “It’s the rise of ‘personal truth’ over verifiable truth.”
• “Parts of the e-economy may slow or regress.”
• “Disinformation fatigue may lead to new forms of post-information politics.”
• “[There will be] more opportunities for demagogues and corruption in general.”
• “The result is a society that is based on fear, rather than trust. Where no accurate measure of public opinion is possible because of propaganda, echo chamber and spiral of silence effects.”
• “Powerful people with strong financial interests will control our democracy even more and hate groups will take advantage more to increase polarization.”
• “Chaos.” (this was the response of several respondents)
• “A race to the bottom in government and big industry, where anything goes.”
• “Low-information voters will continue to vote for liars. Society will suffer. There will be more division between information-have and information-want-nots.”
• “Manipulations will be more sophisticated than most of us can imagine, or detect.”
• “Cheap, intolerable interference by foreign governments.”
• “Echo chambers, fear mongering, prejudices, racism, social disintegration, mistrust in fellow citizens and political institutions.”
• “Election of demagogues. Isolationism and conflict. Decline of science.”
• “A breakdown of democratic norms and values, and increased power for those who don’t care what ‘truth’ is.”
• “The danger is that this becomes a self-perpetuating cycle of ignorance.”
• “It will take years for a social norm or technology to adjust to the current climate.”
• “We live with it, just as we do with crime.”
• “It’s business as usual.”
• “This is no worse than the impact of bad actors throughout history. In other words, sometimes terrible, but generally just a drag on the system.”
• “The coopting of public information will be less effective over time. Shock value is not something that you retain over repeated abuses.”
• “Were things ever so much better? Some features of the technology lead us to misjudge and exaggerate, change and overreact.”
• “The cure is worse than the disease. Unless you want a ‘Pravda’ future, let democracy and freedom of speech work.”
• “Society has to cultivate responsible and thoughtful people.”
• “We are going to have to redefine how democracy can work.”

To read the next section of the report - What Civil Liberties May Be Curtailed? -  please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/
future_of_the_information_environment_F3.xhtml

To return to the survey homepage, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_the_information_environment.xhtml

To read anonymous responses to this survey question with no analysis, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_anon.xhtml

To read credited responses to the report with no analysis, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_credit.xhtml

About this Canvassing of Experts

The expert predictions reported here about the impact of the internet over the next 10 years came in response to a question asked by Pew Research Center and Elon University’s Imagining the Internet Center in an online canvassing conducted between July 2 and August 7, 2017. This is the eighth “Future of the Internet” study the two organizations have conducted together. For this project, we invited more than 8,000 experts and members of the interested public to share their opinions on the likely future of the Internet and received 1,116 responses; 777 participants also wrote an elaborate explanation to at least one of the six follow-up questions to the primary question, which was:

The rise of “fake news” and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation. The question: In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially-destabilizing ideas?

Respondents were then asked to choose one of the following answers and follow up by answering a series of six questions allowing them to elaborate on their thinking:

The information environment will improve – In the next 10 years, on balance, the information environment will be IMPROVED by changes that reduce the spread of lies and other misinformation online

The information environment will NOT improve - In the next 10 years, on balance, the information environment will NOT BE improved by changes designed to reduce the spread of lies and other misinformation online

The six follow-up questions to the WILL/WILL NOT query were:

  • Briefly explain why the information environment will improve/not improve.
  • Is there a way to create reliable, trusted, unhackable verification systems? If not, why not, and if so what might they consist of?
  • What are the consequences for society as a whole if it is not possible to prevent the coopting of public information by bad actors?
  • If changes can be made to reduce fake and misleading information, can this be done in a way that preserves civil liberties? What rights might be curtailed?
  • What do you think the penalties should be for those who are found to have created or knowingly spread false information with the intent of causing harmful effects? What role, if any, should government play in taking steps to prevent the distribution of false information?
  • What do you think will happen to trust in information online by 2027?

The Web-based instrument was first sent directly to a list of targeted experts identified and accumulated by Pew Research Center and Elon University during the previous seven “Future of the Internet” studies, as well as those identified across 12 years of studying the internet realm during its formative years. Among those invited were people who are active in the global internet policy community and internet research activities, such as the Internet Engineering Task Force (IETF), Internet Corporation for Assigned Names and Numbers (ICANN), Internet Society (ISOC), International Telecommunications Union (ITU), Association of Internet Researchers (AoIR) and Organization for Economic Cooperation and Development (OECD).

We also invited a large number of professionals, innovators and policy people from technology businesses; government, including the National Science Foundation, Federal Communications Commission and European Union; the media and media-watchdog organizations; and think tanks and interest networks (for instance, those that include professionals and academics in anthropology, sociology, psychology, law, political science and communications), as well as globally located people working with communications technologies in government positions; top universities’ engineering/computer science departments, business/entrepreneurship faculty, and graduate students and postgraduate researchers; plus many who are active in civil society organizations such as the Association for Progressive Communications (APC), the Electronic Privacy Information Center (EPIC), the Electronic Frontier Foundation (EFF) and Access Now; and those affiliated with newly emerging nonprofits and other research units examining ethics and the digital age. Invitees were encouraged to share the canvassing questionnaire link with others they believed would have an interest in participating, thus there was a “snowball” effect as the invitees were joined by those they invited to weigh in.

Since the data are based on a nonrandom sample, the results are not projectable to any population other than the individuals expressing their points of view in this sample.

The respondents’ remarks reflect their personal positions and are not the positions of their employers; the descriptions of their leadership roles help identify their background and the locus of their expertise.

About 74% of respondents identified themselves as being based in North America; the others hail from all corners of the world. When asked about their “primary area of internet interest,” 39% identified themselves as research scientists; 7% as entrepreneurs or business leaders; 10% as authors, editors or journalists; 10% as advocates or activist users; 11% as futurists or consultants; 3% as legislators, politicians or lawyers; and 4% as pioneers or originators. An additional 22% specified their primary area of interest as “other.”

More than half the expert respondents elected to remain anonymous. Because people’s level of expertise is an important element of their participation in the conversation, anonymous respondents were given the opportunity to share a description of their Internet expertise or background, and this was noted where relevant in this report.

Here are some of the key respondents in this report (note that position titles and organization names were provided by respondents at the time of the canvassing and may not be current):

Bill Adair, Knight Professor of Journalism and Public Policy at Duke University; Daniel Alpert, managing partner at Westwood Capital; Micah Altman, director of research for the Program on Information Science at MIT; Robert Atkinson, president of the Information Technology and Innovation Foundation; Patricia Aufderheide, professor of communications, American University; Mark Bench, former executive director of World Press Freedom Committee; Walter Bender, senior research scientist with MIT/Sugar Labs; danah boyd, founder of Data & Society; Stowe Boyd, futurist, publisher and editor-in-chief of Work Futures; Tim Bray, senior principal technologist for Amazon.com; Marcel Bullinga, trend watcher and keynote speaker; Eric Burger, research professor of computer science and director of the Georgetown Center for Secure Communication; Jamais Cascio, distinguished fellow at the Institute for the Future; Barry Chudakov, founder and principal at Sertain Research and StreamFuzion Corp.; David Conrad, well-known CTO; Larry Diamond, senior fellow at the Hoover Institution and FSI, Stanford University; Judith Donath, Harvard University’s Berkman Klein Center for Internet & Society; Stephen Downes, researcher at the National Research Council of Canada; Johanna Drucker, professor of information studies, University of California-Los Angeles; Andrew Dwyer, expert in cybersecurity and malware at the University of Oxford; Esther Dyson, entrepreneur, former journalist and founding chair at ICANN; Glenn Edens, CTO for Technology Reserve at Xerox/PARC; Paul N. 
Edwards, fellow in international security, Stanford University; Mohamed Elbashir, senior manager for internet regulatory policy, Packet Clearing House; Susan Etlinger, industry analyst, Altimeter Research; Bob Frankston, internet pioneer and software innovator; Oscar Gandy, professor emeritus of communication at the University of Pennsylvania; Mark Glaser, publisher and founder, MediaShift.org; Marina Gorbis, executive director at the Institute for the Future; Jonathan Grudin, principal design researcher, Microsoft; Seth Finkelstein, consulting programmer and EFF Pioneer Award winner; Susan Hares, a pioneer with the NSFNet and longtime internet engineering strategist; Jim Hendler, professor of computing sciences at Rensselaer Polytechnic Institute; Starr Roxanne Hiltz, author of “Network Nation” and distinguished professor of information systems; Helen Holder, distinguished technologist for HP; Jason Hong, associate professor, School of Computer Science, Carnegie Mellon University; Christian H. 
Huitema, past president of the Internet Architecture Board; Alan Inouye, director of public policy for the American Library Association; Larry Irving, CEO of The Irving Group; Brooks Jackson of FactCheck.org; Jeff Jarvis, a professor at the City University of New York Graduate School of Journalism; Christopher Jencks, a professor emeritus at Harvard University; Bart Knijnenburg, researcher on decision-making and recommender systems, Clemson University; James LaRue, director of the Office for Intellectual Freedom of the American Library Association; Jon Lebkowsky, Web consultant, developer and activist; Mark Lemley, professor of law, Stanford University; Peter Levine, professor and associate dean for research at Tisch College of Civic Life; Mike Liebhold, senior researcher and distinguished fellow at the Institute for the Future; Sonia Livingstone, professor of social psychology, London School of Economics; Alexios Mantzarlis, director of the International Fact-Checking Network; John Markoff, retired senior technology writer at The New York Times; Andrea Matwyshyn, a professor of law at Northeastern University; Giacomo Mazzone, head of institutional relations for the World Broadcasting Union; Jerry Michalski, founder at REX; Riel Miller, team leader in futures literacy for UNESCO; Andrew Nachison, founder at We Media; Gina Neff, professor, Oxford Internet Institute; Alex ‘Sandy’ Pentland, member US National Academies and World Economic Forum Councils; Ian Peter, internet pioneer, historian and activist; Justin Reich, executive director at the MIT Teaching Systems Lab; Howard Rheingold, pioneer researcher of virtual communities and author of “Net Smart”; Mike Roberts, Internet Hall of Fame member and first president and CEO of ICANN; Michael Rogers, author and futurist at Practical Futurist; Tom Rosenstiel, director of the American Press Institute; Marc Rotenberg, executive director of EPIC; Paul Saffo, longtime Silicon Valley-based technology forecaster; David 
Sarokin, author of “Missed Information”; Henning Schulzrinne, Internet Hall of Fame member and professor at Columbia University; Jack Schofield, longtime technology editor now a columnist at The Guardian; Clay Shirky, vice provost for educational technology at New York University; Ben Shneiderman, professor of computer science at the University of Maryland; Ludwig Siegele, technology editor, The Economist; Evan Selinger, professor of philosophy, Rochester Institute of Technology; Scott Spangler, principal data scientist, IBM Watson Health; Brad Templeton, chair emeritus for the Electronic Frontier Foundation; Richard D. Titus, CEO for Andronik; Joseph Turow, professor of communication, University of Pennsylvania; Stuart A. Umpleby, professor emeritus, George Washington University; Siva Vaidhyanathan, professor of media studies and director of the Center for Media and Citizenship, University of Virginia; Tom Valovic, Technoskeptic magazine; Hal Varian, chief economist for Google; Jim Warren, longtime technology entrepreneur and activist; Amy Webb, futurist and CEO at the Future Today Institute; David Weinberger, senior researcher at Harvard University’s Berkman Klein Center for Internet & Society; Kevin Werbach, professor of legal studies and business ethics, the Wharton School, University of Pennsylvania; John Wilbanks, chief commons officer, Sage Bionetworks; and Irene Wu, adjunct professor of communications, culture and technology at George Washington University.

Here is a selection of institutions at which respondents work or have affiliations:

Adroit Technologic, Altimeter Group, Amazon, American Press Institute, APNIC, AT&T, BrainPOP, Brown University, BuzzFeed, Carnegie Mellon University, Center for Advanced Communications Policy, Center for Civic Design, Center for Democracy/Development/Rule of Law, Center for Media Literacy, Cesidian Root, Cisco, City University of New York Graduate School of Journalism, Cloudflare, CNRS, Columbia University, comScore, Comtrade Group, Craigslist, Data & Society, Deloitte, DiploFoundation, Electronic Frontier Foundation, Electronic Privacy Information Center, Farpoint Group, Federal Communications Commission, Fundacion REDES, Future Today Institute, George Washington University, Google, Hackerati, Harvard University’s Berkman Klein Center for Internet & Society, Harvard Business School, Hewlett Packard, Hyperloop, IBM Research, IBM Watson Health, ICANN, Ignite Social Media, Institute for the Future, International Fact-Checking Network, Internet Engineering Task Force, Internet Society, International Telecommunication Union, Karlsruhe Institute of Technology, Kenya Private Sector Alliance, KMP Global, LearnLaunch, LMU Munich, Massachusetts Institute of Technology, Mathematica Policy Research, MCNC, MediaShift.org, Meme Media, Microsoft, Mimecast, Nanyang Technological University, National Academies of Sciences/Engineering/Medicine, National Research Council of Canada, National Science Foundation, Netapp, NetLab Network, Network Science Group of Indiana University, Neural Archives Foundation, New York Law School, New York University, OpenMedia, Oxford University, Packet Clearing House, Plugged Research, Princeton University, Privacy International, Qlik, Quinnovation, RAND Corporation, Rensselaer Polytechnic Institute, Rochester Institute of Technology, Rose-Hulman Institute of Technology, Sage Bionetworks, Snopes.com, Social Strategy Network, Softarmor Systems, Stanford University, Straits Knowledge, Syracuse University, Tablerock Network, Telecommunities Canada, Terebium Labs, Tetherless Access, UNESCO, U.S. Department of Defense, University of California (Berkeley, Davis, Irvine and Los Angeles campuses), University of Michigan, University of Milan, University of Pennsylvania, University of Toronto, Way to Wellville, We Media, Wikimedia Foundation, Worcester Polytechnic Institute, World Broadcasting Union, W3C, Xerox PARC, Yale Law.

To return to the survey homepage, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_the_information_environment.xhtml

To read anonymous responses to this survey question with no analysis, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_anon.xhtml

To read credited responses to the report with no analysis, please click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_credit.xhtml