Elon University

The 2017 Survey: The Future of Truth and Misinformation Online (Q3 Credited Responses)

Credited responses to the second follow-up question:
What are the societal impacts of prevalent misinformation?

Internet technologists, scholars, practitioners, strategic thinkers and others were asked by Elon University and the Pew Research Internet, Science and Technology Project in summer 2017 to share their answer to the following query:

What is the future of trusted, verified information online? The rise of “fake news” and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation. The question: In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially-destabilizing ideas?

About 49% of these respondents said the information environment WILL improve in the next decade.
About 51% of these respondents said the information environment WILL NOT improve in the next decade.

Follow-up Question #2 was:
What are the consequences for society as a whole if it is not possible to prevent the coopting of public information by bad actors?

Some key themes emerging from among the responses:
– We could be turning a corner into an extremely dangerous time in human history.
– Democracy is damaged when people cannot trust in information; some are likely to be overwhelmed and simply give up on participating in civic life.
– Social, economic and political inequities are seen by some as a root cause.
– A lack of ‘common knowledge’ hinders finding common ground and common solutions.
– An inability to trust is damaging to all human relationships and systems.
– Societies must support and turn to credentialed sources in the future – to ‘trusted actors.’
– This is the human condition – misinformation lives – yet some choose to expect or at least be optimistic that ‘truth wins out.’
– The jury is out on whether any actions taken will have net-positive results.

Written elaborations by for-credit respondents

Following are full responses to Follow-Up Question #2 of the six survey questions, made by study participants who chose to take credit when making remarks. Some people chose not to provide a written elaboration. About half of respondents chose to remain anonymous when providing their elaborations to one or more of the survey questions. Respondents were given the opportunity to answer any questions of their choice and to take credit or remain anonymous on a question-by-question basis. Some of these are the longer versions of expert responses that are contained in shorter form in the official survey report. These responses were collected in an opt-in invitation to about 8,000 people.

Their predictions:

Micah Altman, director of research for the Program on Information Science at MIT, commented, “Our goal should be to prevent bad actors from *dominating* the public sphere. Judge Brandeis said, ‘If there be time to expose through discussion the falsehood and fallacies, to avert the evil by the processes of education, the remedy to be applied is more speech, not enforced silence.’ The results of either dominance of false information or silencing of ‘bad’ speech would be to severely impair liberal democracy.”

Brian Cute, longtime internet executive and ICANN participant, said, “Fake news is not a new phenomenon. The internet has introduced open communications and sharing of information at a scale and speed that is unprecedented in human history. The speed and scale aspect means there are serious consequences for society. Elements of this problem are playing out in the question about whether Russia attempted to interfere in the US elections. It should be noted that this issue is not limited to one or a few governments. Many actors have incentives to promote fake news.”

Steve McDowell, professor of communication and information at Florida State University, replied, “We are already seeing some of these consequences. Some actors are public and transparent about mixing some verifiable facts with expansive and one-sided interpretations. Others are deliberate misrepresentations of authors, media outlets, and stories. Without a common set of verifiable facts it is hard to have an informed public debate on problems and challenges confronting society, and to move from there to identifying priorities and crafting social or policy responses.”

Bart Knijnenburg, researcher on decision-making and recommender systems and assistant professor of computer science at Clemson University, said, “Eventually society will have to grow up. And I think we will, because a new generation will deal with this stuff their entire lives. Teaching kids (digital) information literacy is important, though.”

Adam Gismondi, a researcher at the Institute for Democracy & Higher Education, Tufts University, observed, “The consequences would be enormous, with many problems emerging that we are currently unable to foresee. A problem that we can predict, however, is major damage to our electorate, who will be increasingly distrustful of our major institutions (including media, scholars, and government officials), and as a result less informed on the very issues that drive our democracy.”

Laurel Felt, lecturer at the University of Southern California, commented, “The consequences are dire because it disincentivizes following the news and/or acting on objectionable behaviors (e.g., miscarriages of justice, corruption, et cetera) because citizens may doubt that such flagrant offenses are being committed OR they may get the distorted sense that they happen all the time.”

Kenneth Sherrill, professor emeritus of political science, Hunter College, City University of New York, said, “As people increasingly become unable to determine which information sources are trustworthy, they will decide that it is not worth their time and energy to do the work of figuring out what’s true. Partisans will remain wedded to their trusted partisan sources. Others will drop out. Society will become more fragmented and civic engagement will decline.”

Scott Fahlman, professor emeritus of AI and language technologies, Carnegie Mellon University, noted, “We have moved from a world with just a few news outlets, trusted by some and able to manipulate public opinion, to a more diverse and democratic system in which everyone can spread their opinions (and deliberate lies if they choose to). We have seen that it is now much harder for people to decide what to believe, and many people no longer make the effort, just living in their own echo chambers.”

Fredric Litto, professor emeritus, University of São Paulo, Brazil, wrote, “As a long-time student of information technology, I have long believed that a new ‘sickness’ would arise in modern society, one of increasing complexity in everyday life causing many people to resist leaving the security of their beds each morning, preferring to remain under the covers so as not to have to deal with decision-making and the exhausting need to constantly be learning new ways of working. If, in addition, we add to this situation the insecurity arising from never being able to entirely trust the information one encounters, there will inevitably be massive evidence of stress-provoked illness.”

Sam Punnett, research officer, TableRock Media, replied, “The consequences are dire, leading to an erosion of social institutions. The outcome is a choice of choosing to be educated or choosing to be manipulated. Media/information literacy is essential in a society that generates so much of it. Put glibly, you are what you view. Information flow to the individual will only get increasingly customizable in the future. If one lacks the awareness of one’s media diet then one gives little thought to the potential hazards of a ‘junk’ diet that feeds exclusively entertainment over that which feeds an informed citizen.”

Stephen Bounds, information and knowledge management consultant, KnowQuestion, said, “This is the new reality of warfare today. Wars are going to be literally fought over ‘information supply lines,’ much as food supply lines were critical in wars of years gone by.”

John Wilbanks, chief commons officer, Sage Bionetworks, replied, “We’re there already: polarized sub-groups refusing to acknowledge the truth of anything they disagree with, weaponized through intentional manipulation to lower trust in institutions and intermediaries. What’s next is like this, but more so.”

Charlie Firestone, executive director, Aspen Institute Communications and Society Program, commented, “The result is the road to authoritarian society where the people are left with the word and power of the state to give them the truth. Not a desirable state.”

David Brake, a researcher and journalist, replied, “Political debate will become increasingly difficult if both sides of the debate proceed from very different premises about the state of the world.”

William Anderson, adjunct professor, School of Information, University of Texas-Austin, replied, “The primary consequence is that there will not be any large-scale society that shares common goods and common goals.”

Jerry Michalski, futurist and founder of REX, replied, “My greatest fear at the moment is that the post-factual world isn’t a three-year aberration but is instead a 300-year era, much like the 400 years after Gutenberg transformed and traumatized Europe.”

Esther Dyson, a former journalist and founding chair at ICANN, now a technology entrepreneur, nonprofit founder and philanthropist, said, “The consequences would be increased cynicism and despair; general breakdown over the long term.”

Howard Rheingold, pioneer researcher of virtual communities, longtime professor and author of “Net Smart: How to Thrive Online,” said, “People will lose their liberties (we are already seeing that in the US) because of political info-manipulation; people will lose their lives because of bad medical information.”

Bob Frankston, internet pioneer and software innovator, said, “Faith may trump understanding. I think longer-term understanding will prevail because the ideas are more powerful. But it won’t be a simple process of refinement.”

Alan D. Mutter, media consultant and faculty at graduate school of journalism, University of California-Berkeley, replied, “Loss of trust starts with diminished confidence in the truth of empirically indisputable facts like climate change or the brutal consequences of underfunding Medicaid. If we cannot agree on the facts, then we cannot have the sort of dialogue that produces good public policy and a healthy society. When truth is denied and trust is devalued, our democracy is at peril.”

Jon Lebkowsky, web consultant/developer, author and activist, commented, “We’re already seeing the consequences: deep polarization, suspicion, manipulation of public sentiment, erosion of rational civil discourse, widespread confusion and increasingly chaotic public/political spheres.”

Leah Lievrouw, professor in the department of information studies at the University of California-Los Angeles, observed, “We have to be historically modest here. Bad actors have always attempted to capture and shape knowledge for their own purposes and gain, whether we call it revisionism, propaganda, spin, public relations, ‘curation,’ what have you. The question is whether, and which, information sources can be perceived as impartial, thorough, dispassionate, and fair. That’s a cultural question: do we even want fair, impartial – modern – information sources? If so, we’ll find or create them. If not, no amount of technological intervention will magically insure information quality.”

James LaRue, director of the Office for Intellectual Freedom of the American Library Association, commented, “Most immediately, erosion of trust in news sources, and uncertainty about what is really happening.”

Jamais Cascio, distinguished fellow at the Institute for the Future, noted, “Multiple scenarios are possible. We could see this as triggering a memetic immune system, where we develop both technological and cultural tools and methods to favor veracity over stimulation; we could see the further hardening of ideological barriers, with entire communities/regions becoming controlled in ways that bar wrong-think; we could see the disproportionate success of a particular worldview causing the followers of bad actors to fail. Two broad scenarios: 1) civil conflict, where ideological and cultural divisions metastasize into violence; or 2) our cognitive immune systems of skepticism and verification become stronger and society becomes healthier as a result.”

Jim Hendler, professor of computing sciences at Rensselaer Polytechnic Institute, commented, “We will need to develop social and community mechanisms to try to deal with this – the problem is not new to digital media, propaganda, for example, has been around a long time – the bigger issue is whether newer and faster ways of distorting information can be countered by newer and faster ways of providing balance.”

Liam Quin, an information specialist with the World Wide Web Consortium, noted, “What we need is not to prevent lies, but to make it easier to find out whether something is true.”

Jonathan Brewer, consulting engineer for Telco2, commented, “A definition of bad actors might vary from one country to the next, so it will never be possible to prevent the co-opting of public information. The consequences can be seen already in America vs. Russia today, or the Gulf states vs. Al Jazeera.”

Michael R. Nelson, public policy executive with Cloudflare, replied, “We are already on a trend towards more transparency. Bad actors who steal data from OPM or whistleblowers who release the Panama Papers get a lot of attention, but data brokers are buying databases of driver’s license records and other public data for commercial use. Google makes Street View imagery available for almost every residence. Citizens are demanding more and more data and convenience. That trend will continue.”

Andrew Odlyzko, professor of math and former head of the University of Minnesota’s Supercomputing Institute, observed, “We will have to accept something that has been true already in the past, namely that we live in an environment of information and misinformation. We will simply have to learn how to choose the webs of trust we rely on.”

Glenn Edens, CTO for Technology Reserve at Xerox/PARC, commented, “With a declining accessible quality education system the impacts of incorrect, misleading or false information are a detriment to free society. The continued fragmentation of information consumers is a serious issue. Society, of course, works best when there are a set of shared values, transparency and verifiable sources – we will have to adapt to this new world and the means are not clear yet.”

Tom Rosenstiel, author, director of the American Press Institute, senior non-resident fellow at the Brookings Institution, commented, “Over the stretch of history, I think the signals are fairly clear. It requires political leadership, and cooperation by political leaders, to blunt the effect of misinformation. It is not going to be something that technology can solve, though it is something technology can make worse. Will the political parties and political leaders exploit the innate potential of citizens to believe false rumors, or will they try to rise above it? It’s the same question we have faced in earlier eras.”

David Wood, a UK-based futurist at Delta Wisdom, said, “We might blunder into World War III, a new Dark Age, or another existential disaster – driven there by foolish judgment and bad feelings stirred up by bad information.”

Deirdre Williams, an internet activist, replied, “We will become a population of credulous slaves until the pendulum begins to swing in the opposite direction.”

Serge Marelli, an IT professional who works on and with the Net, wrote, “The consequences are lots of Donald Trumps (or Putins, Erdogans, or…) in many countries.”

Paul Saffo, longtime Silicon Valley-based technology forecaster, commented, “People focus on dramatic, extreme outcomes, but the greater danger comes from the insidious effects of cynicism and distrust engendered by an unreliable medium. The consequence will be increasing friction in social systems that in turn will retard the effective functioning of global civil society in the face of ever-growing complexity.”

Geoff Scott, CEO of Hackerati, commented, “That depends on whether people choose to think critically and independently, pursue information from outside of their information bubble, and are open to having their minds changed. This is primarily an education and social challenge. If we can’t address it at this level, bad actors will simply need to assert a narrative that serves their purpose and people will believe it.”

Garth Graham, an advocate for community-owned broadband with Telecommunities Canada, explained, “Community (i.e., self-organization) has the innate capacity to control deviant behaviour. We haven’t yet learned how to rely on this capacity online and in the absence of place.”

Alejandro Pisanty, a professor at UNAM, the National University of Mexico, and longtime internet policy leader, observed, “The consequences of public information being coopted by bad actors are dire; a downward spiral of reinforced ignorance-bad judgment-generalized disbelief preys on the prejudices, bias and sheer lack of knowledge of large sectors and allows bad actors either to coopt them, or to be allowed free rein to govern and prey.”

Erhardt Graeff, a sociologist doing research on technology and civic engagement at the MIT Media Lab, said, “Although mistrust can be used constructively to buttress democracy, when it is allowed to fester into chaos it poses an existential threat to the institution. The creation of chaos by seeding doubt and conspiracy theories – a highly refined strategy in Russian propaganda – allows authoritarianism to rise. When all facts are dubious and you cannot trust anyone, a broad power vacuum can emerge which entrenched powers and charismatic leaders are poised to fill by promising nostalgic visions of strong-armed stability.”

Brian Harvey, teaching professor emeritus at the University of California – Berkeley, said, “There’s nothing new about ‘coopting of public information’ except, maybe, that the internet lets non-rich people play. The best vaccination I know of against fake news is to study Marx and understand the class structure of society, but this is pretty much a non-starter in the US.”

David Conrad, a chief technology officer, replied, “Trust in systems, processes, and institutions will continue to degrade, potentially leading to increases in demagoguery, extremism and civil strife.”

Mercy Mutemi, legislative advisor for the Kenya Private Sector Alliance, observed, “Manipulation. And not just during elections or referenda. I imagine a world in which simple policy decisions may be manipulated to the benefit of a few. Fake news could contribute to general misinformation of large groups of people and armed with such misinformation, they could easily make poor choices in leadership and selecting what policy they prefer. Weaponized fake news could turn us all into robots and the worst part is we wouldn’t even know we are being manipulated.”

Larry Keeley, founder of innovation consultancy Doblin, observed, “In a word: Devastating. It will erode the basic check on power (governmental, corporate, institutional) that depends on trusted information systems. George Orwell’s dystopian world becomes a reality – but far worse, since there will be many more ways for anyone with an agenda to find fanatical adherents. After a while, citizens, like those in Russia or North Korea, will trust no one.”

Jan Schaffer, executive director of J-Lab, said, “People will not be able to make informed choices, and will likely opt out of political participation. Bad actors will continue to be elected. Progress will be hampered by drama and a focus on maintaining power. Civil society will decline.”

Nick Ashton-Hart, a public policy professional based in Europe, commented, “Some information will be compromised but what is important or influential will get protected. There is a danger of ossification, though, as trust systems that use context rely upon fewer authoritative voices and reduce the ability of new voices to rise to being highly visible.”

Michael Rogers, principal at the Practical Futurist, wrote, “False information is toxic for just about every level of decision-making, from personal (say, vaccination) to public (elections, say, or mobilizing a nation for war). Often one can’t know information is toxic until it has been acted on, and then it’s too late.”

Scott MacLeod, founder and president of World University and School, replied, “A Russia or a China or a Saudi Arabia or a North Korea as examples? (I’m writing as an American, and an appreciator of the US Constitution).”

Jack Schofield, longtime technology editor at The Guardian, now a columnist for The Guardian and ZDNet, commented, “We’ve already seen some disastrous results from the manipulation of information by bad actors. The obvious examples are the US election of Donald Trump and the UK’s Brexit vote, but these are by no means the only examples.”

Morihiro Ogasahara, associate professor at Kansai University, said, “Democracies will corrupt because of lack of credible information for decision making.”

Sahana Udupa, professor of media anthropology at Ludwig Maximilian University of Munich, wrote, “‘Bad actors’ is a highly political term and one cannot have a blanket approach to digital information based on a strict binary of bad and good. We have developed the concept of ‘extreme speech’ to understand how digital speech as a situated practice pushes the boundaries of legitimate speech along the twin axes of truth-falsity and civility-incivility. This means we remain aware of how ‘bad actors’ and ‘hate speech’ are framed in the first place. With such a situated understanding, digital speech practices should be mapped, and if found harmful for contextually rooted, historically sensitive reasons, one must raise strong barriers. The reason is that such harmful content can prevent vulnerable groups and individuals (along the axes of caste, gender, religion, ethnicity, numerical majority/minority, etc., and their intersections) from participating in the public debate. The rise of right-wing populism and demonic majoritarianism have a lot to do with negative forms of extreme speech forced into the channels of digital media. Digital media consumers are not gullible. But trust networks are radically reconfigured in the digital age. What comes on WhatsApp is rarely disputed because messages are individualized and laced with multimedia enhancements, which make it difficult to recognize and bust false and hateful narratives.”

Uta Russmann, a professor whose research is concentrated on political communication via digital methods, noted, “Society as a whole will increasingly rely on software such as IBM Watson. Moreover, even though probably not in the next ten years but in the long run, computers will become more intelligent than humans. In the next ten years, around the world, almost everyone will have a smartphone and hence almost the same access to information and education. But this development will cause less trust between people.”

Stuart Elliott, visiting scholar at the US National Academies of Sciences, Engineering and Medicine, observed, “The consequence of losing access to reliable information would be serious difficulties in making many types of decisions. Since there are social and behavioral solutions to this problem, they will be used to prevent this outcome.”

Tanya Berger-Wolf, professor at the University of Illinois-Chicago, wrote, “Rumors and disinformation are nothing new, just the medium and the speed change. Society will survive just as it always did, occasionally making bad decisions based on the wrong information.”

Michael J. Oghia, an author, editor and journalist based in Europe, said, “1) The spread of misinformation and hate; 2) Inflammation, socio-cultural conflict and violence; 3) The breakdown of socially accepted/agreed-upon knowledge and what constitutes ‘fact.’ 4) A new digital divide of those subscribed (and ultimately controlled) by misinformation and those who are ‘enlightened’ by information based on reason, logic, scientific inquiry and critical thinking. 5) Further divides between communities, so that as we are more connected we are farther apart. And many others.”

Veronika Valdova, managing partner at Arete-Zoe, noted, “People form their opinions based on information they receive or choose to receive, and tend to act on their opinions and beliefs. Disintegration of the information environment to the point that there is no universally accepted picture of reality would have profound consequences for the functioning of the entire society. People act on the information they have: they buy property, pursue careers, move around the country and the world, conduct business, undergo treatments and dedicate their time and resources to college degrees based on expectations and the picture of the world they have. If there is no consensus on a basic understanding of reality, people cannot make rational, informed choices relevant to their everyday lives. At the individual level, prolific disinformation results in shattered reputations and loss of status, money, opportunity and life. Alternative information pipelines that develop in response to the overwhelming amount of falsehood in major media may be equally damaging.”

Kenneth R. Fleischmann, associate professor at the University of Texas- Austin School of Information, wrote, “Certainly, there are egregious cases of introducing fake news to attempt to change the outcome of, for example, the 2016 election, particularly when it is apparent that much of this was done by or with the assistance of Russian hackers. However, any efforts to clamp down on such fake news might constitute a form of censorship that might undermine trust – there is a fine line between ensuring trustworthiness of public information and leading to a centralized government information system where there is only one trusted set of information (which is not necessarily trusted, on a deep level, by a large chunk of the population).”

Laurie Rice, associate professor at Southern Illinois University-Edwardsville, said, “When misinformation is common and widespread, trust in all sorts of sources of information – both good and bad – weakens. In addition, those lacking the tools, time, or skill to weigh and assess information from various sources can become more easily manipulated by bad actors.”

John Lazzaro, a retired electrical engineering and computing sciences professor from the University of California-Berkeley, wrote, “Society will develop antibodies: skepticism, common sense, ‘extraordinary claims require extraordinary proof.’ For a real-world analogy, the rise of the three-card Monte con in the New York subway in the 1970s did not lead to an economic catastrophe.”

Paul Hyland, principal consultant for product management and user experience at Higher Digital, observed, “Civil discourse and political debate will be increasingly difficult or impossible.”

Scott Shamp, interim dean of the College of Fine Arts at Florida State University, commented, “We will be ruled by those who view accuracy as secondary to gaining advantage.”

Mike Meyer, chief information officer at University of Hawaii, wrote, “The current governmental and social structure will collapse.”

Amber Case, research fellow at Harvard Berkman Klein Center for Internet & Society, replied, “The consequences for society are similar to what we’ve already seen: Greater rifts between political parties and social classes. Crime due to misunderstandings. A waste of advertising revenue on knee-jerk reactions instead of reflective ones. So much of our online meeting is about speed and reaction that we can’t pause to take a breath before reacting.”

George Siemens, professor at LINK Research Lab at the University of Texas-Arlington, commented, “Effective discourse across political parties and other belief systems is at risk if we are not able to create a trusted digital information ecosystem. The consequence for society will be a loss of many of the social and connected aspects of the internet, as confidence will be stripped from these spaces. Dialogue will return to closed, trusted spaces with intermediaries (as with traditional media).”

Charles Ess, a professor of media studies at the University of Oslo, wrote, “The consequences will be apocalyptic – at least for those of us who still believe in the basic notions of human freedom, sociability, rational discourse, and affiliated rights to privacy and freedom of expression that ground democratic polity. The alternatives are already visible in Putin’s Russia and elsewhere: societies masquerading as ‘democratic’ that are in fact thinly disguised electronic oligarchies that render most of us into feudal chattel.”

Brooke Binkowski, managing editor of online fact-checking site Snopes.com replied, “Destabilization; chaos; bad faith; lack of national unity, which in turn further weakens the state; lack of trust in institutions within and without the United States; violence toward the most vulnerable (i.e., migrants and impoverished citizens).”

Rick Forno, senior lecturer in computer science and electrical engineering at the University of Maryland-Baltimore County, said, “The consequences are an uninformed, misinformed electorate that elects similar-minded people.”

John Anderson, director of Journalism and Media Studies at Brooklyn College, City University of New York, wrote, “We’re already living it: Very distinct frames of reference now exist for what constitutes ‘reality’ throughout everyday life.”

Stephen Downes, researcher with the National Research Council of Canada, commented, “The co-opting of public information by bad actors has already taken place, and has been the case for a long time. The internet did not create this phenomenon. We know what happens – wars are started based on fake stories, mass persecution of racial, religious or political groups takes place, the environmental and health impacts of dangerous chemicals and processes are ignored, and so on. The history of information before the internet is full of such cases. What is new is 1) the means to create disinformation have become democratized, but 2) so have the means to detect it.”

Tom Valovic, Technoskeptic magazine, noted, “Consequences are discussed in my book ‘Digital Mythologies’: postmodern chaos and moral relativism along the lines of exactly what we’re seeing now.”

Philip J. Nickel, lecturer at Eindhoven University of Technology, said, “The immediate consequence will be the further erosion of the Fourth Estate, the news media, as a check on executive power.”

Sharon Haleva-Amir, lecturer in the School of Communication, Bar Ilan University, Israel, said, “The consequences depict a rather dark future of a more-polarized world; a world in which words, facts or truth have no meaning where people will not be able to cooperate for worthy causes as they will not trust one another and in which governments control citizens by spreading whatever news they want to. With no trustworthy media to count on, no democratic society could live on and prosper.”

Fred Davis, a futurist based in North America, wrote, “Mass media becomes even more propaganda-oriented. Social media creates the potential for ‘society hacking.’ One of the big problems of fake news is that it also casts doubt on real news.”

Giacomo Mazzone, head of institutional relations for the World Broadcasting Union, replied, “The problem is that today’s society is hungry for fake news. It’s not looking to stop it.”

Sandro Hawke, technical staff, World Wide Web Consortium, noted, “If we let bots, people with psychosis, and enemy agents (whoever that might be in the given context) speak with the same voice as members of our communities, the damage will be incalculable. Literally, the end of civilization, I expect. We have to maintain strong attribution and reputation elements in public discourse. We have to be clear who is saying what, and how it fits in with how they’ve behaved in the past.”

Jesse Drew, professor of cinema and digital media, University of California-Davis, commented, “The result will be dictatorship.”

Sean Goggins, an associate professor and sociotechnical data scientist, wrote, “The coopting of public information by ‘bad actors’ poses a clear and present danger to free, democratic society.”

Isto Huvila, professor of information studies, Uppsala University, replied, “The consequences will be disruptive but at the same time, authoritarian measures to prevent them can be equally bad.”

Paul N. Edwards, Perry Fellow in International Security, Stanford University, commented, “Political divisions will be exacerbated and splintered into many factions, each with its own preferred sources of information, verified or not. As at present, wealthy and powerful actors will learn to shape public opinion through disinformation in order to promote their views and values.”

danah boyd, principal researcher, Microsoft Research and founder, Data & Society, wrote, “Democracies depend on functioning information ecosystems. If we don’t address the fundamental issues at play, we could risk the collapse of democracies.”

Mark Glaser, publisher and founder, MediaShift.org, observed, “The consequences are terrible. Without having trusted information, it’s difficult for many organizations to function properly including governments. We rely on factual information as a cornerstone of a functioning democracy.”

Susan Etlinger, industry analyst, Altimeter Research, said, “To be blunt, I think we’re living it right now in many places in the world. And we’ve seen this before: erosion of human rights, weakening of the rule of law, suppression of free speech; all of these are symptoms of weakened public institutions and values.”

Scott Spangler, principal data scientist, IBM Watson Health, wrote, “The loss of a common basis for fact and perceived reality will lead to greater and greater societal fragmentation and conflict.”

Marc Rotenberg, president, Electronic Privacy Information Center, wrote, “The consequence is the diminishment of democratic institutions.”

Andrew Nachison, author, futurist and founder of WeMedia, noted, “The result can be civic collapse, dark ages, widespread distrust of the other.”

Henning Schulzrinne, professor and chief technology officer for Columbia University, said, “It further degrades the notion of a common basis of fact for discussions of public policy and governance, as all information will be seen as equally suspect and untrustworthy, or as just an opinion, a matter of taste.”

David C. Lawrence, a software architect for a major content delivery and cloud services provider whose work is focused on standards development, said, “It maintains the status quo of human social life throughout our entire existence as a species, and that’s probably for the better. Knowing that you have to be prepared for the possibility that someone is misleading you is better than trusting everything and then ultimately being taken advantage of when the bad actors find a way to get their message out.”

Francois Nel, director of the Journalism Leaders Programme, University of Central Lancashire, noted, “(With apologies to the Bible) like the poor, bad actors and taxes will always be with us. And we have, and will, continue to cope.”

Richard D. Titus, CEO for Andronik and advisor to many technology projects, wrote, “All societies are built upon trust. The Magna Carta, the Declaration of Independence – trust is critical to civil society. One need only look where the deterioration of trust occurs – Somalia, Syria, Venezuela and now even the US – to see rising militancy and civil unrest as the direct descendants of a loss of trust.”

Mohamed Elbashir, senior manager for internet regulatory policy, Packet Clearing House, noted, “Fake news/information could lead to catastrophic outcomes, spread hate speech, incite violence and undermine the core values of the peaceful exchange of conflicting ideas and opinions.”

G. Hite, a researcher, replied, “The consequences as a whole will be dire as long as people trust what they see on their monitor/device at face value. There are some things that should not be trusted to the internet if a visit or phone call will suffice.”

Sonia Livingstone, professor of social psychology, London School of Economics and Political Science, replied, “I have two kinds of anxieties: 1) Bad actors in the sense of hackers, purveyors of misinformation and criminals will get dealt with by state-led systems of criminal justice and law enforcement; I’m more worried about 2) the commercial reshaping of ‘information’ in the interests of powerful private sector actors with a lot of lawyers and lobbyists. People already can’t tell what’s trustworthy or true, so probably those with knowledge will withdraw and find a niche alternative, while those without knowledge will withdraw and just become yet more disaffected, which will exacerbate inequality and disruption.”

Robert W. Glover, assistant professor of political science, University of Maine, wrote, “The complete breakdown of our political system, not to mention vulnerability in our basic social structures and economic system.”

Tony Smith, boundary crosser for Meme Media, commented, “Accelerating backslide towards authoritarians by noisy idealists of all flavours.”

Tim Bray, senior principal technologist for Amazon.com, observed, “Damaging political ideas having influence over enacted policies.”

Joseph Turow, professor of communication, University of Pennsylvania, commented, “The escalated distribution of concocted facts casting opponents in a negative light will often mean that various actors will be able to use social media, off-the-beaten-track sites and apps, and email to reinforce silos of opinions regarding issues of concern to them. They will also at times be able to short-circuit political compromises across sectors of a society by planting widespread, reasonably plausible stories that sow distrust.”

Johanna Drucker, professor of information studies, University of California – Los Angeles, commented, “We are already seeing those consequences – a form of affective fascism that works through a phantasmatic force. How to counteract this? We can only hope that some recognition that human survival depends on responsible cooperation – among humans, but also between humanity and the ecology on which it depends – will have sufficient persuasive force to prevail.”

Hazel Henderson, futurist and CEO of Ethical Markets Media Certified B. Corporation, said, “As we have seen, whole societies can be disrupted, with citizens mistrusting and informing on each other; these are the classic tools of takeovers by totalitarian leaders and authoritarian governments. George Orwell was right!”

Peng Hwa Ang, an academic researching this topic at Nanyang Technological University, observed, “Interestingly, some research done suggests that it is not the burst of fake news (also known as public information by bad actors) that is the problem. It is the corrosive drip of such news that erodes confidence in democracy. The very bad outcome is the distrust in the democratic process. If we look at the US and UK, the winners of both elections are those who want to erode such confidence.”

Susan Price, lead experience strategist at Firecat Studio, noted, “The arms race of technology enablement and hacking is itself a driver of innovation. Anything created by humans can be hacked; protection must be based on the cooperation of motivated humans.”

Tom Wolzien, chairman of The Video Center and Wolzien LLC, said, “The result, at best, will be decisions based on inaccurate data; at worst, demagoguery and dictatorship.”

Thomas Frey, executive director and senior futurist at the DaVinci Institute, replied, “Failure to find a solution will mean rapidly deteriorating confidence in all systems including the stock market, financial systems, and even voting/elections at the heart of a democratic society.”

Wendell Wallach, a transdisciplinary scholar focused on the ethics and governance of emerging technologies, The Hastings Center, wrote, “Power will remain in the hands of the powerful, or those best able to exploit the technology in pursuit of their own goals and ideology.”

Amy Webb, author and founder of the Future Today Institute, wrote, “Those who envisioned and built the internet never mapped out the possible catastrophic scenarios that might arise as it matured. That’s what makes the proliferation of fake news so acute right now, and why there is no easy way to stop the global threat misinformation poses. We are now standing on the precipice of the next transformative evolution of our information ecosystem, from the internet as we know it today to artificially intelligent agents in the near future. Those AI agents will help to create stories, distribute content, and eventually personalize it for each individual news consumer. Without action today, in the future more of our own opinions will be reflected right back at us, making it ever more difficult to confront contrary beliefs and ideologies. Extreme viewpoints will feel like the norm, not the outliers they actually are. Leaders – and everyday people – should base their decisions in fact, but we’re still human. Emotion gets in the way. I foresee extreme viewpoints and personal duress shaping our future world, and that should be a grave concern to us all.”

Axel Bender, a group leader for Defence Science and Technology (DST) Australia, said, “The value of veracity of information will degrade further. This is the continuation of a trend that is not necessarily correlated with the proliferation of social media and internet technology. It is the continuation of a trend that has seen the decoupling of statements/claims and evidence underpinning the statements/claims. Modern media’s requirement to be fast and first in the production of news has come at the expense of substantiation of the claims made.”

Peter Lunenfeld, a professor at UCLA, commented, “We have entered into a moment when Gresham’s Law from economics – where bad money drives out good – collides with Godwin’s Law of social network practices – where any online thread that goes on long enough will eventually devolve into comparisons to Adolf Hitler and the Nazi party. The consequences of the anti-rom-com ‘When Gresham Met Godwin’ are evident all around us in the base confusion over what constitutes a fact versus an opinion. Whole sectors of online culture treat truth as incidental to ‘lulz’ or the quest for power. This will not be an easy fix.”

Jason Hong, associate professor, School of Computer Science, Carnegie Mellon University, said, “More discontent, more radicalization by people who are prone to it, more ways for malcontents to find each other and to organize, distrust and blaming of minorities, a growing lack of trust in organizations and institutions, and perhaps most importantly, really bad public policy decisions. We’ve seen the beginning of these effects with Gamergate, men’s rights movement (which in theory is not a bad thing, but in practice has been really awful), white supremacists, and more. We’ve also seen the effects of misinformation by pro-smoking groups, by people spreading misinformation about global climate change, and by anti-vaccination groups, and they will only get worse as people are exposed to limited and highly biased sources of information.”

Matt Stempeck, a director of civic technology, noted, “Those weaponizing disinformation will realize their goals at the expense of the goals shared by the rest of us who trust that information wins the day on its veracity alone.”

Dave Burstein, editor of FastNet.news, said, “I’m much more worried by well-funded or government misinformation than bad actors’ fraud.”

David Manz, a cybersecurity scientist, replied, “The result will be ‘1984’: dystopian authoritarians and powers of all types can more easily manipulate the feelings of the people.”

Hjalmar Gislason, vice president of data for Qlik, noted, “Democracy is at risk – no less. Democracy relies on informed voters, or at least as informed as they want to be, and certainly not voters being manipulated with false information.”

C.W. Anderson, professor at the University of Leeds, wrote, “Institutions that are currently trusted will succumb to tactics similar to those who are subverting information channels now. Media, especially, will feel the pressure. More and more people will opt out of civic engagement, and civic life, generally, as they become overwhelmed with separating signal from noise. It will simply become too much work. Which will leave people in charge who do not have the best interests of the general population at heart. Most likely, the polarization of American society will only spread further.”

Wendy Seltzer, strategy lead and counsel for the World Wide Web Consortium, replied, “We’ll need to work more with uncertainties and smaller-scale engagement – where we can personally verify information.”

Joshua Hatch, president of the Online News Association, noted, “Honestly, I think it has the potential to lead to civil war. I don’t think that will happen, but I do think those are the stakes.”

Troy Swanson, a teaching and learning librarian, replied, “The challenge comes if credibility and authority in information sources are undermined. Meaning (truth) may always be debatable, but the validity of the measurements of reality (data) should not be. This is the primary danger.”

Nathaniel Borenstein, chief scientist at Mimecast, commented, “Democracy will effectively end, in favor of rule by those who tell the most appealing stories.”

Irene Wu, adjunct professor of communications, culture and technology, Georgetown University, said, “Information and communication are the weft and weave of the social fabric. If they are unreliable, society cannot hold together. However, decentralization of power over information while allowing trusted institutions to emerge is a better path forward than centralizing the responsibility for good information with security services or with the government.”

Ari Ezra Waldman, associate professor of law at New York Law School, wrote, “Democracy dies in a post-fact world. When the public square is so crowded and corroded by misinformation, misleading information, and outright lies, a number of things happen. First, trust falls. Trust in institutions, including the media, falls, spawning denialism and conspiracy theories. Second, knowledge falls. That is, society becomes less educated. When nothing is verifiable and everything is an opinion, people can start believing the Earth is flat and have no reason to doubt themselves because, as Mr. Trump has said about Russian hacking, ‘no one really knows.’ In a world where ‘no one really knows’ anything, anyone can say anything. Rationality, then, ceases to exist.”

Alan Inouye, director of public policy for the American Library Association, commented, “I don’t expect coopting by bad actors. Just that it will remain a complex environment. One implication is that the general public will need to be increasingly more sophisticated about information and communication – being able to critically examine not only the content of messages but also the context in which it is communicated.”

Scott Amyx, managing partner of Amyx Ventures and Amyx+, wrote, “This endangers the very premise of a democracy. History has ample examples of consequences.”

Andreas Vlachos, lecturer in artificial intelligence at the University of Sheffield, commented, “The consequences are that the public will be misled to wrong conclusions.”

Jim Warren, an internet pioneer and open-government/open-records/open-meetings advocate, said, “IF so, the results have been, are, and will be Orwellian until citizens overthrow the perpetrator(s) – which can take generations. The consequences of citizens – individually and collectively – being unable to prevent coopting of information by bad actors would (or will?) be what it has always been: tyranny and citizen ‘helplessness’ until they (we!) revolt and redress the abuses.”

Allan Shearer, associate professor at the University of Texas-Austin, commented, “There would be a paralysis of decision-making processes and increasing tension between or among groups.”

Evan Selinger, professor of philosophy, Rochester Institute of Technology, wrote, “A well-functioning democracy is inherently incompatible with a post-fact society. Full stop.”

Zbigniew Lukasiak, a business leader based in Europe, wrote, “We’ll go back to the more tribal ways – people will distrust strangers.”

Maja Vujovic, senior copywriter for the Comtrade Group, noted, “Governments, businesses, schools, scientific bodies, sports and entertainment entities et cetera – everything and everyone is represented by information. Even our personal brands have become the result of careful curating. The very fabric of society would unravel if the public could never trust public information. Worse yet, spurious agents could control large groups by serving them simplified or distorted information. People have proven severely susceptible to propaganda, over and over. Democracy gets deformed without independent information to guard it.”

Matt Armstrong, an independent research fellow working with King’s College, formerly executive director of the US Advisory Commission on Public Diplomacy, replied, “The society will be co-opted. The naive, the angry and the shortsighted will be enlisted in support of bad actors, and any wake-up or regret will come too late. This is not new, despite our attempts to frame this issue as unusual and unique to our age.”

Dan Gillmor, professor at the Cronkite School of Journalism and Communication, Arizona State University, commented, “It would mean the end of democratic self-rule. If all information is equal, no matter whether it’s true or false, benign or malign, we can’t make decisions based on reality.”

Justin Reich, assistant professor of comparative media studies, MIT, noted, “Autocratic societies have recognized that the most effective way to manage societal information is less about countering particular perspectives and more about flooding people’s information channels with distraction, misdirection, and falsehoods. In the absence of reliable arbiters of truth and falsehood, people are more likely to defer to tribal and ideological loyalties. Gary King has excellent research on state-sponsored social media propagation in China, showing that one of the central aims is flooding social media with noise and distraction.”

Larry Diamond, senior fellow at the Hoover Institution and FSI, Stanford University, observed, “If we fail in this task, social and political polarization will deepen because there will be no basis for common knowledge and understanding about any issue. Every political orientation will be living in its own information bubble, even more so than today. Distrust in institutions of all kinds will proliferate, and the ability of authoritarian regimes to subvert democratic practices, as Russia did with the 2016 elections in the US, will surge.”

Vince Alcazar, business owner and retired US military officer, wrote, “We are already seeing the largest consequence: The death of critical thinking; and second, the legitimization of ‘alternate facts.’ We passed the tipping point for these notions in the middle of the last decade. Now, many Western societies are split into multiple societies with a widening chasm of ideological distance in between.”

Anne Mayhew, retired chief academic officer and professor emerita, University of Tennessee, replied, “Cyberwarfare and perhaps worse among competing groups, groups that will probably not be nation states. This will be made worse by climate change and competition for livable space and water.”

Greg Lloyd, president and co-founder of Traction Software, wrote, “A tragedy of the commons reduces but does not eliminate the value of public services. People will be more selective and rely more on walled services.”

Adam Holland, a lawyer and project manager at Harvard’s Berkman Klein Center for Internet & Society, noted, “A likely amplification of ‘tribal’ or thematic societal divisions and an inability to find consensus on solutions to collective action problems; increased costs in general due to inefficiencies of discussion, and retarded progress.”

Shawn Otto, author of “The War on Science,” predicted, “Democracy will be destroyed and policy will be controlled by the people with the biggest propaganda/PR budgets. Journalism’s original intent in a free society was/is to serve as the rudder for the public to steer the ship of state, providing reliable feedback to citizens so they could make decisions informed by the evidence and hold their elected leaders to account. Without a reliable rudder, the ship of state may founder on the shoals of authoritarianism. Everything, but especially freedom, rests on the embrace of the enlightenment values of science, objectivity, knowledge over ‘but faith, or opinion,’ as John Locke put it, equality, and separation of church and state, that gave birth to the idea of modern democracy and serve as the foundation upon which it stands.”

Danny Rogers, founder and CEO of Terbium Labs, replied, “Oh, a full-on ‘1984’ scenario would result. And one need only look as far as authoritarian governments in places like North Korea to see the consequences of information dominance by those unswayed by truth. The question is not theoretical, and the consequences are apocalyptic.”

Susan Hares, a pioneer with the NSFNet and longtime internet engineering strategist, now a consultant, said, “A society that bases its decisions on false information operates at the whim of those who create the false information. Some people have a political agenda for creating bad information or allowing reduced standards for protecting information. Reading President Obama’s and President Trump’s pre-election books can provide you with two different attitudes on public information. Some people simply want chaos to reign. Bad information and chaos are at the heart of civil unrest, racial hatred, strife and wars.”

Louisa Heinrich, founder of Superhuman Ltd, commented, “If we don’t figure out how to identify, share, respect and distribute the truth, then things look pretty grim.”

Jeff Jarvis, professor at the City University of New York Graduate School of Journalism, commented, “The consequences are dire indeed: a failure of the public conversation and deliberation that is the engine of democracy. But beware the temptation to cry as journalists do, ‘You’ll miss us when we’re gone.’ Many would not. We must reinvent journalism for this new reality to truly serve diverse communities in society and to convene them into informed and civil conversation, not just to produce a product we call news that we insist the public must trust, or else!”

Greg Wood, director of communications planning and operations for the Internet Society, replied, “The bar need not be active coopting; there would be severe consequences in merely sowing uncertainty about important information. Public markets depend – both theoretically and in practice – on widely available reliable information. A simple example: imagine what false, but seemingly authoritative, information about US Federal Reserve actions would do to stock and bond prices.”

R. Lee Mulberry, managing partner, Northern Star Consulting, said, “There will always be a few bad actors in all parts of life. I believe the media has begun and will continue to improve their self-policing policies and methods. Their credibility is closely tied to their revenue, this alone will help improve the process.”

Ed Terpening, an industry analyst with the Altimeter Group, replied, “This seems unthinkable today, but I suppose it is best answered by an historian. Before the free flow of information (perhaps going back to the Gutenberg press), what happened to society when confronting biased information? Or, perhaps a country like North Korea provides a model for what can happen.”

Paul Jones, director of ibiblio.org, University of North Carolina-Chapel Hill, observed, “Vernor Vinge described the ‘Net of a Million Lies’ in his novel ‘A Fire Upon the Deep.’ In 1993, he saw how the Net could participate in a complicated context that included genocide, betrayal and space battles. He had it nailed.”

Helen Holder, distinguished technologist for HP, said, “The deterioration of a common understanding of objective reality is corrosive to society because it opens institutions to ever more arbitrary decision-making, leading to a range of bad outcomes for society. For example, if a government department’s charter requires that it monitor and protect the integrity of a body of water (assuming that charter is an expressed ‘will of the people,’ enacted through legislation) but different people or groups cannot agree on the objective measurement of the contaminants, it becomes impossible for the government department to fulfill its function. Note that this is different from disagreements on moral questions, such as should human life be privileged over aquatic life. Without a commonly accepted objective reality, all decisions become arbitrary. In such a situation, a few will find ways to benefit and take advantage, but probably at the expense of the majority of society.”

David Weinberger, writer and senior researcher at Harvard’s Berkman Klein Center for Internet & Society, noted, “It will be the same as always. We’ve never been able to drive out all bad ideas from all bad actors. That’s why we have complex systems of authentication and complex systems of governance for resolving policy even when opinions cannot be fully harmonized. That people believe false things, sometimes because they are fooled by bad actors, is not an aberration caused by the internet. It is the human condition.”

Peter Eckart, director of health and information technology, Illinois Public Health Institute, replied, “It’s not hyperbole to suggest that democracy is at stake. If voters are unable to distinguish truth from untruth, then truth cannot be uplifted to prominence over untruth. We’ve always stated that ‘the truth will set you free,’ so the converse might also be true: the lack of truth will enslave us.”

Peter Levine, associate dean and professor, Tisch College of Civic Life, Tufts University, observed, “Of course, public information has always been coopted pretty frequently by bad actors. This is not an entirely new phenomenon. However, the consequences are potentially disastrous, going as far as accidental nuclear war.”

Adam Powell, project manager, Internet of Things Emergency Response Initiative, University of Southern California Annenberg Center, said, “Consider the reverse: If you ‘coopt’ the ‘bad actors’ you are just creating a mechanism by which totalitarian regimes flourish (see Firewall, Great, of China).”

Bernie Hogan, senior research fellow, University of Oxford, noted, “Bad actors undermine our faith in institutions and our ability to even conceive of these institutions as functional parts of society. The most likely consequence is greater inequality of information, especially quality information, and the concomitant consequence of polarising outcomes. Bad actors prey on the weak and the naive. These might be politicians, policy makers for platforms or intelligence agencies. It’s not fair to call Russians the bad actors and the US as the good ones. There are many people operating within the law for selfish and contemptible reasons.”

Joanna Bryson, associate professor and reader at University of Bath and affiliate with the Center for Information Technology Policy at Princeton University, said, “I don’t mean to be alarmist, but basically the consequence is disintegration. We will need to scale back our transactions to those we can trust, which implies far smaller societies. In the process to getting there, probably war.”

Adrian Schofield, an applied research manager based in Africa, commented, “Society will distrust the information presented to them unless they can verify it first-hand. Everything will be regarded as propaganda, published by vested interests. Communities will form alternative information flows (similar to the Dark Web).”

Alf Rehn, chair of management and organization studies, Åbo Akademi University, predicted, “Increased polarization, radicalization and so on would become the order of the day. We already see that there are fringe groups who distrust public information on sight and on principle.”

Marcel Bullinga, futurist with Futurecheck, based in the Netherlands, said, “If we cannot prevent bad actors from twisting public information, it will carry us to our graves, since society revolves around data and trust. But we WILL come to a solution because we must.”

Vian Bakir, professor in political communication and journalism, Bangor University, Wales, commented, “The consequences are bad: e.g., badly informed citizens reinforcing their unfounded beliefs in digitally created echo chambers; increased polarisation of society; increased discontent from those who lose in democratic elections where mis- and dis-information prevail.”

Jens Ambsdorf, CEO at The Lighthouse Foundation, based in Germany, replied, “Society will be even more manipulated in opinion and decision-making. Although this is not new, now it is possible on a global scale and much faster.”

David J. Krieger, director of the Institute for Communication and Leadership, Lucerne, Switzerland, commented, “Society will turn away from public information sources to private ones.”

Ella Taylor-Smith, senior research fellow, School of Computing, Edinburgh Napier University, noted, “It is not possible to prevent this, but it’s not a binary thing. Public information has always been coopted by people who favour their own interests, rather than society’s. We need to adjust the balance away from them – through creating content and advancing literacies.”

Ian Peter, an internet pioneer, historian, activist user and futurist, replied, “The consequences are substantial. Clickbait might be new, but the problem predates the internet – particularly in specific areas such as propaganda. It is not new for societies to be told what they should think and believe; however, the internet does provide many new opportunities for propaganda and fact manipulation to occur. The great freedoms we once envisaged via the internet may not materialise in an environment where most internet users draw their information from “bubbles” in particular social media applications.”

Rich Ling, professor of media technology, School of Communication and Information, Nanyang Technological University, said, “This will undercut the legitimacy of both the press (people’s source of information about policy and governance) as well as the legitimacy of government in itself. In short, it is the end of democracy.”

Julian Sefton-Green, professor of new media education at Deakin University, Australia, replied, “The result will be as it has always been: The continuation of inequality and exploitation without possibilities for redress. Ultimately, it could lead to the decline of public space.”

Marina Gorbis, executive director of the Institute for the Future, said, “There will be increased tribalism, i.e., people relying on close-knit social networks to help filter and process information, and widely held distrust of public sources of information.”

Patrick Lambe, principal consultant, Straits Knowledge, predicted, “The result will be war, chronic low-grade conflict, poverty and the loss of social cohesion.”

Vivienne Waller, senior lecturer, Swinburne University of Technology, replied, “Although the coopting of public information by bad actors cannot be prevented, there will never be 100% co-option. In other words, at any time there will always be some ‘good’ information, and consumers will work out ways of knowing what they can trust. There is a parallel with the use of email. Although it is not possible to prevent spam and phishing, email users are educated to recognise both of these types of email. Similarly, the key regarding public information is education of the consumer.”

Eric Burger, research professor of computer science and director of the Georgetown Center for Secure Communications in Washington, DC, replied, “We have always had this problem. It is the reputation problem, which is why even in the digital age we have trusted news sources and we have fringe sources.”

Barry Wellman, internet sociology and virtual communities expert and co-director of the NetLab Network, noted, “We’ve had such cooptation before, and people – even the American people – have a means for crap detection.”

Seth Finkelstein, consulting programmer with Seth Finkelstein Consulting, commented, “We move further away from democracy and toward oligarchy. Now, the way the question is phrased suggests to me a ‘barbarians at the gates’ framework. Something along the lines of ‘What happens if the savages invade our city?’ The stress on ‘bad actors’ emphasizes the promotion of falsehood. But that directs attention away from the decimation of support for truth. I think the issue is more akin to ‘What happens if the idea of an educated population is abandoned?’ While I don’t want to overstate the ideal, there are large differences in how much societies value trying to have an informed public. Even from a cynical perspective, there’s a vast gulf between at least nominally aspiring to rely on reason (despite being flawed with errors and rationalizations), versus raw appeals to fear and hatred. This is a key issue of political philosophy, the theory of government. As an analogy, we’ve never had perfect economic equality. But there’s a huge range between having a strong middle class, versus Third World stereotypical inequality levels with a tiny sliver of ultrawealthy and everyone else in poverty. We seem to be going through something similar in terms of an informed populace. We’ve never had everyone be an intellectual (and not everybody wants to be one!). But there’s a big difference between having civic efforts to bring knowledge to the masses, and a tiny sliver of ultrainformed specialists (who don’t matter much anyway) and the rest of the populace just reacts to whatever propaganda is put out on an issue.”

Andrea Matwyshyn, a professor of law at Northeastern University who researches innovation and law, particularly information security, observed, “Our approach to information security needs to shift from information sharing and deterrence to building security vigilance infrastructures and defense primacy.”

Bryan Alexander, futurist and president of Bryan Alexander Consulting, replied, “Consequences include dwindling faith in institutions, public disengagement from civic life, increased political polarization and loss of support for the commons.”

Howard Greenstein, adjunct professor of management studies at Columbia University, said, “With different versions of public information, or ‘facts,’ we spend more time arguing about what the actual facts or truths are, and less time actually addressing the situations. Bad actors exacerbate this situation. Additionally, with opinions at a reasonable place and an extreme, people will move toward a ‘center,’ and therefore continually be pushed more towards the extreme opinion.”

Alexander Halavais, associate professor of social technologies, Arizona State University, said, “New wine, old bottles. Coopting of public information by bad actors of all sorts has always been the norm. There may have been a recent acceleration of that, thanks to new technologies that are not as familiar to some, or it may just be that what was once less visible is now more so. In practice, I suspect the consequences are not substantial: media literacy and critical thinking still matters. Who knew?”

David Schultz, professor of political science, Hamline University, said, “It would lead to the continued erosion of political and cultural legitimacy and consensus.”

Mark Lemley, professor of law, Stanford University, observed, “People will become better at filtering out fake news. Indeed, they already have. But people who want to believe fake news rather than to learn the truth will find it easy to do so.”

Nigel Cameron, technology and futures editor at UnHerd.com and president of the Center for Policy on Emerging Technologies, said, “Is this a question? I mean, it’s all over. Liberal democracy, freedom, et al.”

Bill Woodcock, executive director of the Packet Clearing House, wrote, “The main danger is a continuing degradation of public trust in online information sources, a long slow slide we’ve been on for forty years now. The principal cost of this is to education, which suffers when people cannot trust the veracity of the things they read, and find falsehoods more aggressively supported and corroborated than truths (because the truth rarely has a deep-pocketed advocate).”

Greg Swanson, media consultant with Itzontarget, noted, “The current climate does raise the spectre of a media-savvy oligarchy pummeling the electorate into some new form of government. The optimistic hope would be that in the broad cacophony of the voices of political spin, outright lies, the special interest megaphones and the rants on talk shows there will emerge a set of shared conversations and a collection of trusted voices that will reach and influence enough people that our democracy will continue to exist.”

Mike Roberts, pioneer leader of ICANN and Internet Hall of Fame member, replied, “Media (for that matter, all forms of social discourse) have always been plagued by inaccuracies and special pleading. The notable aspects of the current situation are the far greater reach of the Net as a distribution medium, and the pathological and possibly criminal motivations of those sourcing the erroneous info. Society is diminished if discourse cannot be trusted.”

Robin James, an associate professor of philosophy at a North American university, wrote, “We see these every day – derogatory stereotypes about minorities are an old example of this. Think about decades-old arguments about so-called ‘black-on-black crime.’”

Tom Birkland, professor of public policy, North Carolina State University, noted, “The consequences will include very poor public policy, as demands for policies become driven by bad ideas and unrealistic expectations.”

Jeff MacKie-Mason, university librarian and professor of information science, professor of economics, University of California-Berkeley, replied, “If we do not greatly raise the information literacy of people, and provide better reputation and evaluation tools, the quality of civic discourse will continue to decline, and I fear our democratic system is at risk. Fascism triumphs when it gains control of civic information.”

David Sarokin of Sarokin Consulting, author of “Missed Information,” said, “’Coopting public information’ is what Orwell called ‘doublespeak’ and ‘1984’ describes the consequences quite nicely, I think.”

Paul Gardner-Stephen, senior lecturer, College of Science & Engineering, Flinders University, noted, “In my view, we are seeing many of the problems that authors prophesied in works like ‘1984.’ By manipulating the perception of truth, and combining this with powerful governmental machines, the potential for the ‘capture’ of democracy by authoritarianism and indeed neofascism are very real risks.”

Jonathan Grudin, principal design researcher, Microsoft, said, “The results are plainly evident in historical consequences of cooptation of public information by actors subsequently found to be bad. These caused damage but were overcome, and I am optimistic this will happen again.”

Robert Watts, director of new media, Federation of Jewish Men’s Clubs, said, “Instead of an ‘information society’ we will have a ‘disinformation society’ in which there is so much confusion about fact and fiction that trust in public information will erode further.”

Richard Rothenberg, professor and associate dean, School of Public Health, Georgia State University, noted, “We will lose an instrument for advancement, and will need to discover a new one.”

Virginia Paque, lecturer and researcher of internet governance, DiploFoundation, predicted, “For better or for worse, we will lose the internet as a source of information. OR we might have walled gardens, inside of which we can control information more carefully.”

Tatiana Tosi, netnographer at Plugged Research, commented, “The consequences have already taken place in terror attacks and community extremism, but mostly the lack of information in a world of too much information brings ignorance to most of the population. To reverse this we need to put all the available filters at society’s disposal.”

Pamela Rutledge, director of the Media Psychology Research Center, noted, “Controlling information controls freedom. Information is the narrative that each of us uses to construct reality. However, just as it will be a challenge to prevent assaults on systems by bad guys, it will also be impossible to prevent a resistance of ‘counter-hacking’ by those who want to restore access, connectivity and information flows.”

Richard Lachmann, professor of sociology, State University of New York-Albany, predicted, “There would be a distrust of knowledge and of expertise, the breakdown of national community and success for demagogues.”

Diana Ascher, information scholar at the University of California-Los Angeles, observed, “The consequences for society as a whole are, as always, bad for marginalized people and good for those with the power to control public opinion. What’s disturbing to me in the present situation is the blatant disregard for public trust, which is a strategic undermining of democratic accountability. The public is being desensitized, and politicians seem to be waiting it out.”

Dariusz Jemielniak, professor of organization studies in the department of Management In Networked and Digital Societies (MiNDS), Kozminski University, predicted, “There will be a further decline of reason, the second coming of the Dark Ages.”

Siva Vaidhyanathan, professor of media studies and director of the Center for Media and Citizenship, University of Virginia, wrote, “Authoritarianism will be easy to establish and maintain if there is no way to maintain credibility or a reputation for reliability.”

Alexis Rachel, user researcher and consultant, said, “Truth becomes subjective, and as a result we could end up in a dangerous relativism.”

Daniel Kreiss, associate professor of communication, University of North Carolina-Chapel Hill, commented, “The bigger issue is the willingness and even eagerness of members of the public to believe in misinformation. The issue is not bad actors – they have always been there and have attempted to parlay bad information strategically. The issue is the credibility that these actors have among the public, who judges information and sources through the lens of identity. The consequences of this, to me, are contingent upon where parties as institutions and networks take this public.”

Steven Miller, vice provost for research, Singapore Management University, wrote, “We are seeing it now, so one answer is – just look around now – and figure out what is happening. But situations like this do not stay static. The situation now is different than it was 20 years ago, or even 10 years ago. Similarly, it is hard to know how the current environment will change 5, 10 and 20 years from now. But it will be dynamic, and we cannot assume it will stay as it currently is.”

Mike DeVito, graduate researcher, Northwestern University, wrote, “If we don’t increase information literacy, and probably civics knowledge as well, the consequences are clear: bad actors taking over. A good narrative is always going to be more powerful than a detailed explanation; that’s well verified in both communication and cognitive science literature. Good narratives are what the bad actors have; not having to worry about truth really opens up the creative possibilities, after all. All the good actors have are facts and explanations, and those don’t do much good if the populace doesn’t know the basics of how government works (civics education) and how to check/verify information and the values/biases of information systems (information literacy). Like in a libel case, the best defense is truth; if no one can consistently identify the truth, though, you really don’t have many options.”

Eduardo Villanueva-Mansilla, associate professor, department of communications, Pontificia Universidad Católica del Perú, said, “Political norms will continue to degrade as long as it is effective and worthwhile for those involved in politics and society. It may be difficult to contain, and only societies with strong normative traditions in their political activities may be able to withstand the force of disruption.”

Jane Elizabeth, senior manager for the American Press Institute, said, “Living in a democracy means, thankfully, having the freedom to make a variety of choices in everyday life. Without reliable information to inform those choices, decision-making is poor or non-existent and we cease to live democratically.”

Andee Baker, a retired professor, said, “People will come to believe more and more false information, leading to trusting the wrong people and electing a greater proportion of incompetent leaders.”

Jeff Johnson, professor of computer science, University of San Francisco, replied, “Bad actors coopted public information in the former Yugoslavia and in Rwanda. In both cases the public media was radio. In both cases the result was civil war and genocide.”

Federico Pistono, entrepreneur, angel Investor and researcher with Hyperlook TT, commented, “The consequences could be disastrous, potentially fatal.”

Noah Grand, a sociology Ph.D., wrote, “There is a long history of politicians trying to co-opt public information. The White House Office of Communications has grown under each president since Nixon made it a full-time organization after his campaign. We need to define bad actors more broadly and see ‘fake news’ as the next step in a 50-year process instead of thinking it is completely foreign. The main consequence in a society where it is easier and easier for bad actors to co-opt public information is people can believe what they want to believe. Someone deeply committed to seeing the world a certain way can always find reinforcement. Many politics readers want emotional and ideological reinforcement more than they want facts. This is the highly engaged audience today. What worries me most is the possibility that audiences who don’t want to consume that much news – but prefer their news is factual – will run away from all these bad actors shouting at each other.”

Meg Mott, professor of politics at Marlboro College, commented, “I’m concerned about the framing of this issue: Better to educate the population on how to ask critical questions, such as ‘Who benefits from this way of framing the story?’ rather than assume we can identify malevolence with a rough rubric like ‘bad actor.’”

Jack Park, CEO, TopicQuests Foundation, noted, “Are we not seeing that already? You can focus on anything you like, from alleged Russian interference in our recent elections, etc. But, face it; if people were not so easily agitated by unsupported ‘facts,’ things would be different. Looking for ways to reduce fake news is a bit like looking for lost keys under the street lamp because that’s where the lights are on; it may have little to do with problem resolution. We have met the enemy, and he is us. The ‘us’ in that needs to improve.”

Steve Newcomb of Coolheads Consulting replied, “There won’t be ‘public information’ in the sense understood by the generation that grew up with Walter Cronkite (legendary TV news broadcaster of the mid-20th century).”

Kelly Garrett, associate professor in the School of Communication at Ohio State University, said, “There have always been bad actors, individuals who lie and deceive for personal gain. When these individuals succeed, society is worse off. Fortunately, individuals and societies learn from their mistakes. Having been fooled, we search for better ways to protect ourselves. This includes creating penalties for those caught in deception and creating strategies to avoid being deceived again in the future.”

Daniel Berleant, author of the book “The Human Race to the Future,” commented, “Decisions that affect society at large will be made without regard for their benefits or harm to society at large, which will damage society and retard or reverse progress.”

Stephan Adelson, an entrepreneur and business leader, said, “Continuation of an escalating divisiveness.”

Mike O’Connor, a self-employed entrepreneur, wrote, “The same as the consequences of bad actors coopting information at any scale (relationship, marriage, family, neighborhood, town, state, country) or organism (cell up to ecosystem) – cancer followed by death.”

Willie Currie, a longtime expert in global communications diffusion, wrote, “We would see a turn towards fascism. Timothy Snyder’s book ‘On Tyranny’ is the textbook here.”

Katim S. Toray, an international development consultant currently writing a book on fake news, noted, “Fake news can have a detrimental impact on democracy, businesses and world peace. The 2016 US presidential elections left a sour taste in the mouths of many because of allegations of the role of fake news in the process, many businesses spend considerable sums of money to combat fake news, and we all know the devastating impact of the fake news spread by the US government that Saddam Hussein had weapons of mass destruction.”

Ian O’Byrne, assistant professor at the College of Charleston, replied, “We’re seeing the implications of this in the current elections and social tone around the globe. Individuals have more of a connection with their online tribes and affinity spaces than they do with their next-door neighbor. The system is primed to allow bad actors to co-opt these spaces and streams and spread their message. As the spaces/streams become more salacious, and the groups want to hide their true motive, the bad actors targeting them will get even more clandestine as they work to spread misinformation there.”

David A. Bernstein, a marketing research professional, said, “Society is likely to become even more fractured with each group strongly holding to their ‘truths.’ We only have to look at Syria to see groups that will kill and die for their particular version of the truth. It is unlikely that either side will move off their position and what the truth is then only becomes question of whose interpretation of the truth you subscribe to.”

Michael Wollowski, associate professor at the Rose-Hulman Institute of Technology, commented, “The fundamental problem is that in the past two or three decades several organizations, primarily those associated with the tobacco and oil industries, have worked tirelessly to devalue scientific/objective information. We are seeing these effects permeate all of society. As such, the damage has been done. As a society, we need to learn to identify and be willing to trust experts. I think that people by and large are all too willing to hear what they want to hear. If we are willing to change this, the school system would have to fill this role.”

Monica Murero, a professor and researcher based in Europe, wrote, “I foresee that society as a whole might undergo unforeseen changes. We are observing some consequences already today (recent elections). For example, ‘fake’ news alarmism may affect attitudes and behaviours (reinforce political attitudes, consumption, etc.). I foresee an augmented lack of trust from citizens towards political institutions as they are considered responsible for guaranteeing public security and reliability in public information.”

Ned Rossiter, professor of communication, Western Sydney University, replied, “Society has always lived with the consequences of so-called ‘bad actors’ coopting public information. That’s how power, commerce and social governance work.”

Andrew Feldstein, an assistant provost, noted, “At one extreme, loss of trust, the inability to tell truth from fiction, incivility, and chaos. At the other extreme a totalitarian state that becomes the only source of trusted information since people might opt for structure over chaos.”

Sandra Garcia-Rivadulla, a librarian based in Latin America, replied, “It will be possible to prevent or mitigate this, to some extent. Information users will need to become more information literate. The coopting by bad actors will be considered a reflection of behaviour people in society already show offline, but new laws and resources will be in place to respond to these situations.”

Hanane Boujemi, a senior expert in technology policy based in Europe, commented, “The consequences could involve making the wrong decisions about critical issues.”

Avery Holton, professor at the University of Utah, wrote, “As a more utopian view, the public is forced to become accountable for the information it engages with. This applies, then, to the information that the public chooses to create and share. A more accountable public can, at the very least, identify such coopting when it occurs.”

Carol Chetkovich, professor emerita of public policy, Mills College, commented, “We’ve already seen some of the consequences in the election of the most incompetent president of my lifetime, Donald Trump. Continued coopting of information by bad actors will continue to lead to more demagogues being elected.”

Daniel Menasce, professor of computer science, George Mason University, replied, “The consequences to society are extremely grave! A misinformed society can be easily led to make the wrong choices.”

Emmanuel Edet, head of legal services, National Information Technology Development Agency of Nigeria, observed, “The result will be insecurity. The truth is, these issues have always been there. For example, the mysteries surrounding the assassination of JFK, and the 9/11 conspiracy theorists. Information Technology simply amplifies the challenge.”

Constance Kampf, a researcher in computer science and mathematics, said, “Your focus is on the sender, not the receiver. It would be more productive to look at sender/receiver systems of interaction in a knowledge-management-inspired perspective (try Seely Brown’s ‘Social Life of Information,’ or any of the great knowledge-management work in the late 1990s/early 2000s addressing the interplay of social and technical aspects of knowledge). The co-opting of information by bad actors is not the core issue; the naïveté of people who do not have the skills to critically engage with information is a much more interesting question. My high school history teacher’s definition of history has caught up with the present: History (public information available on the internet) is man’s feeble attempt to isolate the unique gestalten [patterns or configurations], dress them in tenuous rationale, and include them in times yet to come (internet information sources), hoping that the unsuspecting innocents will view this prestidigitation as truth, or more hopefully, revelation. So, I see the future needs and our responsibilities in terms of asking questions like, ‘How can we develop more-sophisticated forms of social literacy that include critical perspective and knowledge of the potential technical and social pitfalls of relying on public information alone,’ and building an understanding of how and why public information can be co-opted by bad actors. This requires teaching critical thinking, ethics and a deep understanding of rhetoric and how it works in society. Note that I am using rhetoric in the academic sense with reference to Aristotle, Isocrates and the ancient Greeks and Sophists, and rhetoricians throughout history, not in a popular sense.”

Stuart A. Umpleby, professor emeritus, George Washington University, wrote, “Truthful information is essential for wise decision-making. Regularly asking, ‘What is the evidence for that view?’ and ‘What is the source of that interpretation?’ will help.”

Antoinette Pole, associate professor, Montclair State University, noted, “There will be lower trust in government institutions and actors, accompanied by decreased legitimacy.”

Gina Neff, professor, Oxford Internet Institute, said, “We are already seeing the rapid decline in the trust in journalism and the trust in public institutions in the US which will only weaken our democracy.”

William L. Schrader, a former CEO with PSINet Inc., observed, “The consequences are the same as they have been since the beginning of the written word. Look at religious documents of all kinds, they somewhat agree and somewhat disagree, but at all times are used to justify horrible wars and killing. People, mankind, is what it is. The internet is simply one more communications tool that man will use to manipulate others to commit to their beliefs. That is my view.”

Don Kettl, professor of public policy at the University of Maryland, said, “This would be very bad. It would diminish trust in political discourse and in political institutions.”

James Schlaffer, an assistant professor of economics, commented, “Nothing. Eventually, people who believe only one source, or who don’t see that multiple sources all quoting each other are really just one source, will come to be regarded with caution. People who get their news from Facebook will eventually become a lost minority.”

Stowe Boyd, futurist, publisher and editor in chief of Work Futures, said, “I believe it will be possible to identify and tag fake information, so we will head that off.”

Collette Sosnowy, a respondent who shared no additional personal details, wrote, “The consequences are continued divisiveness and a lack of interest in verified information by many people.”

Kevin Werbach, professor of legal studies and business ethics, the Wharton School, University of Pennsylvania, said, “Our democracy will continue to be weakened.”

Filippo Menczer, professor of informatics and computing, Indiana University, noted, “As news become less reliable and trusted, public opinions become easier to manipulate and democracy is weakened, giving way to de-facto dictatorships.”

Garland McCoy, president, Technology Education Institute, commented, “Given the product coming out of most public education systems, there is not much anyone can do. For those fortunate enough to have been well-educated there won’t be a problem, as a well-educated individual is the best defense against bad actors in the public information space.”

Luis Martínez, president of the Internet Society’s Mexico chapter, observed, “Any poorly-informed society becomes weak in an age of sociopolitical change.”

Mike Gaudreau, a retired IT and telecommunications executive, commented, “There would be further polarization of opinions and nasty rhetoric.”

Ryan Sweeney, director of analytics, Ignite Social Media, wrote, “We will become the authors of our own destruction. The United States turned its back to the world by pulling out of the Paris Agreement. Our planet is dying, but we refuse to take steps to remedy it. Vaccine-related fear is leading to outbreaks. We are afraid of each other because of how aspects like skin color and religion are condemned by our politicians and covered relentlessly by the media; we are tearing each other apart because of it. That’s just the tip of the melting iceberg; what’s next?”

Daniel Alpert, managing partner at Westwood Capital, a fellow in economics with The Century Foundation, observed, “Difficulties in sorting fact from fiction have been present in society since Gutenberg (at least). The issue we are facing is the rapid dissemination of fiction labeled as fact, and the silo-ization of information networks. While you may not be able to induce a reader to read ‘both sides’ of an issue, many people would utilize a passive ‘bullshit detector’ just as we use malware and spam detectors today. Not to censor, but rather to point out (in an automated fashion) facts that are not supported by source-level information.”

Andreas Birkbak, assistant professor, Aalborg University, Copenhagen, said, “There has never been unbiased public information, so the consequences are not out of the ordinary. The question is whether a culture that cares about facts prevails.”

John King, professor, University of Michigan School of Information Science, noted, “People will work around it. We have had these problems for thousands of years.”

John Laprise, consultant with the Association of Internet Users, wrote, “Society will develop new cultural norms to deal with it.”

Ray Schroeder, associate vice chancellor for online learning, University of Illinois-Springfield, replied, “The consequences include a further widening of the gap between political poles. Eventually, we will see one pole suffer enormously as they fall victim (lack of health care, higher taxes, fewer services) to deceit from the very leaders they trusted.”

Davide Beraldo, postdoctoral researcher, University of Amsterdam, noted, “The manipulation of public opinion towards specific political agendas and the exploitation of the public’s cognitive/affective labour.”

Tom Worthington, honorary lecturer in the Research School of Computer Science at Australian National University, commented, “Presidents Reagan and Trump are examples of the coopting of public information by bad actors. ;-)”

John McNutt, professor, School of Public Policy and Administration, University of Delaware, wrote, “This would be very bad and would destroy democracy. I am not willing to concede that it is actually possible.”

Greg Shatan, partner, Bortstein Legal Group, based in New York, replied, “The consequences are that trust in online information and communication will erode, even when it appears to come from trusted sources. The promise of the internet will be stunted if it cannot be seen as a trusted source.”

Amali De Silva-Mitchell, a futurist, replied, “A messy situation for society and potentially harmful, and the only counter is educating people to apply the benefit of the doubt. Given an educated population this should be a key teaching for students – the curiosity to dig deeper for truth and benevolent generosity.”

Ayaovi Olevie Kouami, chief technology officer for the Free and Open Source Software Foundation for Africa, said, “The society as a whole is under a permanent threat.”

Mark Patenaude, vice president for innovation, cloud and self-service technology, ePRINTit Cloud Technology, replied, “We see today what unfettered language and content is doing to society. Regardless of a country’s leader, there must be stopgaps that prevent this dissemination. The political and banking systems as well as our war machine will be unstoppable. We need to curtail the political agendas of all sides. Please no anarchy though!”

Paul Kyzivat, retired software engineer and Internet standards contributor, noted, “There would be factionalism and anarchy.”

Flynn Ross, associate professor of teacher education, University of Southern Maine, said, “The populace becomes more uncertain as they are hurt by the results. The great hope in the US is that the younger voters are better informed than the older generation.”

Kevin J. Payne, founder and research scientist, Chronic Cow, commented, “The widespread pollution of our cultural environment with disinformation plays to a host of our cognitive biases. We’re not rational, unless highly motivated – we’re rationalizers.”

Steven Polunsky, writer with the Social Strategy Network, replied, “Remember that some people want to see the misinformation due to confirmation bias. Otherwise, we will be no worse off than we were previously in our history when some newspaper publishers slanted their papers’ work and prejudiced commentators commanded the radio waves. Eternal vigilance.”

Timothy Herbst, senior vice president of ICF International, noted, “There will always be some level of misinformation. But too much misinformation has the very real potential to further discredit the norms, institutions and belief systems that are the foundations of our society.”

Carol Wolinsky, a self-employed marketing researcher, replied, “Those most inclined to believe the ‘fake’ news will find continuing support for their views. Mistrust of mainstream media will worsen and society will become more polarized as each side accepts its own version of the news as the ‘truth’ and decries any information to the contrary as ‘fake.’”

Eileen Rudden, co-founder of LearnLaunch, wrote, “There will be bad consequences. Isn’t this what Hitler did?”

Joel Reidenberg, chair and professor of law, Fordham University, wrote, “The network infrastructure will require new governance mechanisms suited for the digital age.”

Michele Walfred, a communications specialist at the University of Delaware, said, “Consequences are significant. Anyone can publish today, as well they should, but bad actors go to great effort to look authentic, present themselves as news organizations, share partial truths to appear legitimate, but also invent content. From my observations, audiences are gullible and often make choices based on false reports, such as health treatments. The bogus material I see shared on the internet astonishes me. It is increasingly being shared. Everyone has become his or her own expert. This dilutes real facts. Science and research have become mistrusted.”

Ken O’Grady, a futurist/consultant, said, “Credible news is critical so as to enable a free society to analyze and form opinions.”

David Goldstein, researcher and author of the Goldstein Report, noted, “The consequences would be losing trust in government and society.”

Gianluca Demartini, a senior lecturer in data science, observed, “The main possible consequence will most likely be going back to a system like that of 20 years ago, when content producers were far fewer than content consumers, in contrast to now, when anyone can be a content creator as well.”

Richard Jones, a self-employed business owner based in Europe, said, “This would be disastrous unless education is able to amplify the old message ‘don’t believe it just because it’s in black-and-white print.’ I expect proliferation of minority pressure groups. I expect fragmentation of societies.”

Iain MacLaren, director of the Centre for Excellence in Learning & Teaching, National University of Ireland-Galway, commented, “Fragmentation, distrust and the likelihood of political disengagement by a significant fraction (if not the majority) of the population.”

Frank Kaufmann, founder and director of several international projects for peace activism and media and information, commented, “This problem will not be any better or any worse than it always has been. It just will be writ in the complexities of the time. It is nothing new. The manipulation of news is not the problem.”

Bradford W. Hesse, chief of the health communication and informatics research branch of the US National Cancer Institute, said, “In the absence of complete prevention, society will need to evolve better risk-management systems to detect misuse when it does occur and then to enable public conversation over appropriate societal response through credible channels (trusted advocacy, fourth estate).”

Glenn Grossman, a consultant in the financial services industry, replied, “Society is harmed when it lives by lies. It destroys trust everywhere. Without trust our democracy is weakened.”

Peter Dambier, DNS guru for Cesidian Root, commented, “Corruption of information is not new; all historic nations did it. It helps to keep your brain alive; no other consequences feared.”

Rob Lerman, a retired librarian, commented, “A functioning democracy is near impossible without an informed citizenry. ‘Bad actors’ will be successful unless and until citizens have the will and capability to consume information critically. We will have more Trumps at all levels of government.”

Cliff Cook, planning information manager for the City of Cambridge, Massachusetts, noted, “I think we are already seeing that – a breakdown of the social contract among Americans, exacerbation and exaggeration of differences to foment public anger, election of fundamentally dishonest and/or incapable leaders.”

Megan Knight, associate dean, University of Hertfordshire, said, “Society is just going to get worse. Increasing polarisation and the conflict that results is inevitable.”

Dave Kissoondoyal, CEO, KMP Global, replied, “I am for the availability of public information freely. However this information should be available to authenticated users only. In this way, it is ensured that the publicly available information does not go to bad actors. On the other hand, if a society is not able to prevent the coopting of public information by bad actors, the consequences will be really disastrous, as information would be used for all sort of wrong activities.”

Matt Moore, a business leader, predicted, “There would be growing political anarchy and tribalism. If the mass media (including the internet) is untrustworthy then we fall back on those we know around us. And these tend to be people with similar views to ours.”

David Harries, associate executive director for Foresight Canada, replied, “The more coopting there is the more ‘competition’ there will be to win the veracity stakes, and therefore the more that ‘society as a whole’ will become fragmented.”

Michael Marien, senior principal, The Security & Sustainability Guide and former editor of The Future Survey, wrote, “There will be more bad and costly decisions and more Trump-like leaders.”

Paul M.A. Baker, senior director of research for the Center for Advanced Communications Policy, observed, “That is a complex question which is difficult to anticipate. It might be possible to model such outcomes by looking at existing crowdsourced rating systems and scaling up. Or conversely, I could imagine parallel news systems, and social and political decision makers negotiating which to use in a given case (similar to agreeing to use an independent arbitration process, perhaps?).”

Deborah Stewart, an internet activist/user, wrote, “There will always be gullible people who will believe anything.”

Sasa M. Milasinovic, information and communication technology consultant with Yutro.com, replied, “The misleading of people about public opinion and the abuse of society.”

Jonathan Ssembajwe, executive director for the Rights of Young Foundation, Uganda, commented, “Society will fear using the internet for their businesses and day-to-day activities because its information will not be trusted.”

Shirley Willett, CEO, Shirley Willett Inc., said, “A small revolution between people believing in opposing information, hopefully bad enough to birth a change.”

Bill Jones, chairman of Global Village Ltd., observed, “It has always been thus, except the scale and effect are larger. Society will survive but will have to do so by dealing with large temporary information challenges.”

Andrew McStay, professor of digital life at Bangor University, Wales, wrote, “Much depends on context (who, what, when, why) but there is scope to influence, manipulate and alter behaviour.”

Dan Ryan, professor of arts, technology, and the business of design at the University of Southern California, said, “In the extreme, the undermining of information credibility can yield a breakdown in what I call ‘the information order.’ It’s an interlinked system of norms, shared/common knowledge, information generating and verifying processes and institutions. Its complex interdependencies make it robust in the face of strategic and inept perturbation, but it can also collapse in a cascade. The result is not pretty, as it turns out the information order is a sort of ultimate infrastructure for social order.”

Rajnesh Singh, Asia-Pacific director for an internet policy and standards organization, observed, “People will be determining the accuracy of information and the actions/decisions they take based on what they think is accurate.”

Adam Nelson, a developer for Amazon, replied, “Society can be swayed by bad actors; nothing is different.”

Eleni Panagou, cultural informatics and communication scientist at Arwen Lannel Labs in Greece, wrote, “It would be a tremendous [blow] to new chances and an open challenge to humankind.”

Mark P. Hahn, a chief technology officer, wrote, “Same as for the past two millennia. People will adapt and small grass roots organizations will dissent, but they may be decentralized, connected only by communications technologies.”

Taina Bucher, associate professor in the Centre for Communication and Computing at the University of Copenhagen, commented, “I think we are talking about differences in degree, not kind. We have always had bad actors trying to coopt public information. As such, ‘fake news’ is nothing new. Now, more than ever, we need classical source-criticism techniques combined with computational literacy.”

Frank Odasz, president, Lone Eagle Consulting, observed, “If it takes only a few bad actors to undo what good folks have built, like Trump undoing Obama’s legacy, or trolls inhibiting online discussions and civic participation, then we need societal policing to address the negatives, and positive social recognition for those serving as models of good behavior. The microcosm of a small town, or an Alaska Native village, impacted by abuses of social media resulting from nearly everyone having a smartphone and following Facebook, really suggests addressing training for responsible use and consequences for diminishing community capacity-building efforts instead of contributing. New metrics are needed; public, visual, daily community feedback is an area ripe for innovation.”

O’Brien Uzoechi, a business development professional based in Africa, replied, “The consequences can be clearly disastrous. Just look at what happened in the last US general election; apparently the election result was somewhat compromised, given the whole intrigue of Russian meddling through the email-hacking episodes. Those are clear instances of spreading damaging information to discredit individuals’ personal integrity.”

Scott Guthrey, publisher for Docent Press, said, “First, ‘bad’ is in the eye of the beholder. Second, coopting public information presumes the truth of public information (whatever that may be). Public archives are routinely polluted by bureaucrats with agendas. (See ‘Resist’.) The consequence is that each individual will operate according to their own set of truths. In fact, there is no particular need to have a set of universal truths.”

Andrew Dwyer, an expert in cybersecurity and malware at the University of Oxford, commented, “There has always been a misappropriation of information by actors. This is nothing new. Yet the scale and challenge of doing this without conventional, mass consensus readership is the issue. Thus we can look to history when people do not trust their governments, or if particular narratives gain traction within a society that can be extremely destructive.”

Dean Willis, consultant for Softarmor Systems, commented, “This is just another political arms race. See ‘movable type.’”

Axel Bruns, professor at the Digital Media Research Centre, Queensland University of Technology, commented, “History is full of examples for where unbridled propaganda and misinformation can lead. We will continue to see agitation by extremist political actors, emboldened by the ‘alternative facts’ circulating in their networks, and these groups have the potential to do real damage to the fabric of society. But we will also see the emergence of counter-movements that seek to capture the political centre and stand for civic cohesion – at least in countries where politics has not already been poisoned by years and decades of mindless tribal partisanship.”

Sharon Tettegah, professor at the University of Nevada, commented, “Consequences for society could include chaos and erroneous decisions that are based on false information. We should have specific sites for opinions.”

Steve Axler, a user-experience researcher, replied, “Increased use of the web for certain people’s own agendas, such as terrorism, criminal activity, political beliefs, etc.”

Philip Rhoades, retired IT consultant and biomedical researcher with Neural Archives Foundation, said, “The consequences will be very bad but environmental problems will make them seem trivial.”

Sean Justice, assistant professor at Texas State University – San Marcos, said, “This is a historical question. Look to history for an answer. I realize that this survey is meant to be painless, to some extent, but the enmeshed assumptions of the language (‘whole’ vs ‘public’ vs ‘bad actors’) invite simplistic causal analyses that cannot begin to illuminate the larger situation.”

Edward Kozel, an entrepreneur and investor, replied, “Destruction of the ‘public square’ as a place for reasoned, balanced discussion (for personal or political aims) will encourage anarchy or individual extremism.”

Neville Brownlee, associate professor of computer science at the University of Auckland, said, “The consequences will be the ongoing exploitation by the few who can afford to hire those who control the targeting of information delivery.”

Janet Kornblum, a writer/journalist, investigator and media trainer, replied, “Um, President Trump. It’s already happened.”
