
The 2017 Survey: 
The Future of Truth and Misinformation Online

Credited responses to the fifth of five follow-up questions:
What do you think will happen to trust in information online by 2027?

Internet technologists, scholars, practitioners, strategic thinkers and others were asked by Elon University and the Pew Research Internet, Science and Technology Project in summer 2017 to share their answer to the following query:

What is the future of trusted, verified information online? The rise of "fake news" and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation. The question: In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially-destabilizing ideas?

About 49% of these respondents said the information environment WILL improve in the next decade.
About 51% of these respondents said the information environment WILL NOT improve in the next decade.

Follow-up Question #5 was:
What do you think will happen to trust in information online by 2027?

Some key themes emerging from among the responses:
- People have differing notions of 'trust,' 'facts' and 'truth.'
- The rise of misinformation will continue and things will likely worsen.
- The next few years are crucial to the future of the information environment.
- Some people will be smarter in the future about finding and relying on trusted sources.
- There will be a divide between the savvy and not-so-savvy, and noisy, manipulative attention-grabbers may drown out the voices of veracity.
- New and old approaches to improving the information environment will be successful.
- Methods adopted to potentially improve things will cut free speech and elevate surveillance, changing the nature of the internet; the actors responsible for enabling change profit from control.
- Despite some work to improve things, there won't be much change.

Written elaborations by for-credit respondents

Following are full responses to Follow-Up Question #5 of the six survey questions, made by study participants who chose to take credit when making remarks. Some people chose not to provide a written elaboration. About half of respondents chose to remain anonymous when providing their elaborations to one or more of the survey questions. Respondents were given the opportunity to answer any questions of their choice and to take credit or remain anonymous on a question-by-question basis. Some of these are the longer versions of expert responses that are contained in shorter form in the official survey report. These responses were collected in an opt-in invitation to about 8,000 people.

Their predictions:

Bart Knijnenburg, researcher on decision-making and recommender systems and assistant professor of computer science at Clemson University, said, "Given the mechanisms likely to emerge, I think that mainstream media will have fewer problems with trust. The question remains how many people can separate themselves from the ‘mainstream’ and function in an echo chamber of ‘alternative facts.’ I'm optimistic though: Information already serves a crucial role in one's general ability to function in society, and this will only become more prominent. A steady diet of misinformation will disqualify a person from participating in society.”

Glenn Edens, CTO for Technology Reserve at Xerox/PARC, said, “Truth now seems ‘optional.’ The root of these issues is in publishing and consumption as well as education. We may get to a point where ‘media’ is largely ignored, especially in an environment where the boundaries between business and editorial barely exist anymore. With any luck society will self-regulate and it will be cool again to verify sources and fact check.”

David Sarokin of Sarokin Consulting, author of "Missed Information," said, "'Online' is not the issue; this is a much broader societal concern (including print, TV, etc.). Continued deterioration will set back science, journalism and liberty, but hopefully, we're smarter than that.”

Paul Gardner-Stephen, senior lecturer, College of Science & Engineering, Flinders University, said, "I fear that I am rather pessimistic at the moment: Fake news is simply too easy to create, the general population too easy to influence, and the potential benefits of its application too great for power-hungry entities to ignore. Only if we find ways to defuse these factors will we see a long-term improvement in the situation. But I hold onto hope that the populace will become more immune and questioning, as for example has occurred in the recent French elections, where the application of fake news probably backfired. However, this is an arms race, just as with spam, malware and other digital blights. Battles will be won and lost, and although the war currently shows no sign of ending, the increasing awareness of manipulation will likely mitigate the overall impact of fake news over time.”

Bryan Alexander, futurist and president of Bryan Alexander Consulting, replied, "In a decade we should have roughly the same amount of trust.”

Howard Greenstein, adjunct professor of management studies at Columbia University, said, "Systems will develop where facts and origins will be sourced, so readers know where the information originated. This will exceed hyperlinks and become more like a line-by-line ‘pedigree’ for articles. Hopefully these will create incentives to work with the most accurate sources.”
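One way to picture Greenstein's line-by-line ‘pedigree’ is a per-claim provenance record attached to an article. The structure below is a hypothetical illustration of that idea, not a description of any deployed system; the class name, fields, and example sources are all assumptions made for demonstration:

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """A single statement in an article, with its chain of origin."""
    text: str
    # Ordered provenance, from the original source to the citing article.
    pedigree: list = field(default_factory=list)

    def sourced(self) -> bool:
        """A claim counts as sourced only if its pedigree is non-empty."""
        return len(self.pedigree) > 0

article = [
    Claim("Unemployment fell to 4.3% in May.",
          pedigree=["Bureau of Labor Statistics release",
                    "Reuters wire report",
                    "This article"]),
    Claim("Insiders say the policy will change soon."),  # no pedigree
]

# A reader (or platform) could flag every claim that lacks an origin.
unsourced = [c.text for c in article if not c.sourced()]
print(unsourced)  # → ['Insiders say the policy will change soon.']
```

In this sketch, the incentive Greenstein describes would come from surfacing the unsourced claims to readers, making thin provenance visible at a glance.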

Alexander Halavais, associate professor of social technologies, Arizona State University, said, "I suspect we will see the development of metrics for determining the validity of news and information sources. This is a problem that we have already approached in search, with the need to filter ‘real’ responsive search results from attempts at spam or other misleading information. There is value in finding trusted information, and I suspect that people will seek ways of extracting that value, by certifying or rating the validity of claims. Unfortunately, as we have seen with Politifact and Snopes, not everyone will agree about who those certifying authorities should be.”

Tom Rosenstiel, author, director of the American Press Institute, senior non-resident fellow at the Brookings Institution, commented, "I hate to say it, but the last 30 years suggest that the forces of declining trust will likely continue. Three trends are merging here. As technology expands, the audience fragments further into its own channels by subject and point of view. And as that happens, political leaders, particularly those who feel the traditional media are against them, will continue to exploit that to inflame audiences for their own purposes.”

David Wood, a UK-based futurist at Delta Wisdom, said, "In line with my earlier answer, it's about 65% likely that trust in online information will increase by 2027, and 35% likely that it will decline.”

Serge Marelli, an IT professional who works on and with the Net, wrote, "It will remain pretty much the same. Educated people will know where and what to trust. Stupid people will believe in what they want to believe: alternate facts, lies, ‘alternate media,’ populist propaganda.”

Geoff Scott, CEO of Hackerati, commented, "I hope parents and educators will begin teaching their children the critical thinking and investigative skills needed to render fake information harmless, but it will take several more decades before enough people think independently enough to have an impact.”

Garth Graham, an advocate for community-owned broadband with Telecommunities Canada, explained, "We will begin to realize that truth/lie is a false dichotomy, and that ‘information’ is a verb, not a noun. Also that narrative is an illusion. We are discovering that mind/consciousness depends on context. The internet increases our awareness that reality is a construct. It accelerates our capacity to apply that awareness. If we are lucky, by 2027, we will be able to practice that capacity as a learned artifice. We will be more conscious of the nature of consciousness. As we do this, our trust in ‘society’ as an organizing principle dependent on external authority will disappear. To be replaced by a reliance on self-organizing community as the primary principle of structural relationship and organization.”

Philip Rhoades, retired IT consultant and biomedical researcher with Neural Archives Foundation, said, "I will be surprised if it is still an issue without a functioning biological environment.”

Sean Justice, assistant professor at Texas State University-San Marcos, said, “This is an ecosystem question if ‘trust’ is held as an open, relational term. In that sense, ‘trust’ will continue to be commodified in capitalist systems. But another question needs to be asked simultaneously: how is capitalism changing? Changes to the materiality of the ecology have yet to be theorized in a coherent way. Questions that rely on anachronistic (black-boxed) terminology might actually work against a sustainable dialog that might prove useful, however. In the end it might not matter too much that we understand what we're doing; practice often (perhaps always) leads theory.”

Edward Kozel, an entrepreneur and investor, replied, "Fragmentation of ‘trusted ecosystems’ as National Interests (countries) all struggle with the issues in different ways.”

Alejandro Pisanty, a professor at UNAM, the National University of Mexico, and longtime internet policy leader, wrote, "The landscape of trust in information online by 2027 will continue to be mixed. There are reasons to project into the future that alkaline diets, science denial, conspiracy theories, hate and ignorance will not be abated in ten years. On the other hand, a better understanding of biases and a decade more of the internet's life may begin to create information resources whose trustworthiness is better established and more easily identified, as it has taken the press more than 500 years to, somewhat, achieve.”

Erhardt Graeff, a sociologist doing research on technology and civic engagement at the MIT Media Lab, said, "The issue of trust in information is related to but not the same as the issue of trusted methods of addressing misinformation. The crux of this will play out through the fact that the overall information ecosystem actually comprises a multitude of smaller, unequal information ecosystems. Certainly, identifying, developing, and deploying trustworthy social, legal, and technical methods to combat misinformation must precede any reconstitution of our trust in information online. However, changes in media literacy and wider processes of socializing new norms of perception around information online will evolve more slowly and asynchronously. Different people within different ecosystems will have different experiences. Most likely, between 2017 and 2027, we will see increased inequality when it comes to trust in information online and the ability of certain people to leverage the information ecosystem to serve their needs and to make change in the world. There will be elite classes who are structurally positioned online and offline to comprehend and to access the most reliable nodes in the overall information ecosystem, benefiting from existing social and cultural capital and resources like money, education, and advanced tools. And there will be underclasses whose information ecosystems lack connections to diverse, trustworthy people and news sources, and/or who have simply been left behind in their understanding of improvements to their information ecosystems – their lack of trust will mean they cannot exploit this new landscape as fully empowered citizens.”

David Conrad, a chief technology officer, replied, "It will continue to decline, particularly as technology evolves for modifying and/or generating fake video, audio, and text that is essentially indistinguishable from real information.”

Judith Donath, fellow at Harvard's Berkman Klein Center, and founder of the Sociable Media Group at the MIT Media Lab, commented, “There will be an arms race of fakeness, especially in audio and video, as the tools to make convincing artificial videos of people and events become commonplace and believable.”

Esther Dyson, a former journalist and founding chair at ICANN, now a technology entrepreneur, nonprofit founder and philanthropist, said, "I'm optimistic because I'm an optimist. However, there is not a lot of evidence right now.”

Jerry Michalski, futurist and founder of REX, replied, "I'm afraid we won't make much progress in a decade. It's too early. The possibilities for havoc haven't yet been played out, believe it or not.”

David Brake, a researcher and journalist, replied, "A distinction needs to be made between ‘formal’ trust in information (which is already low and dropping) and ‘in effect’ trust (which does not seem to have changed much). In other words, people seem not to trust what they read much when they think about it but that doesn't stop them behaving as if it were true (e.g., by sharing it).”

Bob Frankston, internet pioneer and software innovator, said, "Ideally there will be a more-aware public less apt to accept ‘the internet says.’ Or will there be more acceptance of one’s tribe as authority?”

Alan D. Mutter, media consultant and faculty at graduate school of journalism, University of California-Berkeley, replied, "I am terrified to contemplate the subject. While I fear for the worst, we should be mindful that the pendulum swings both ways. We now are at a point of extreme distrust and discontent. It's hard to imagine a country that elected Barack Obama could turn around and elect Donald Trump. But it did. So, the obverse most certainly could occur. Even as I cling to the hope that the arc of history will right itself, I doubt anyone will ever again be regarded, Walter Cronkite-like, as ‘the most trusted’ person in America.”

Mark Lemley, professor of law, Stanford University, wrote, "People will trust information from identified sources and from clusters of others they know, but will put less trust in information merely because it is online.”

Nigel Cameron, technology and futures editor at UnHerd.com and president of the Center for Policy on Emerging Technologies, said, "There will have been much clarification of branded/trusted sources vs. unreliable, so there should be an increasingly healthy situation.”

Bill Woodcock, executive director of the Packet Clearing House, wrote, "I’m afraid the trolls will continue to ascend over the next decade, with national sponsors and a growing sense that it can be hip to be reactionary if you can play the left for rubes. Which leaves the schoolchildren of tomorrow unable to trust either textbooks or the Internet.”

Tiziano Bonini, lecturer in media studies at the department of social, political and cognitive sciences, University of Siena, said, "Information online will be extremely polarized. Most of the information will be under the real-time review of millions of skilled users, while bad actors will continue to proliferate in subcultural contexts or specific clusters of people (those less skilled in media literacy). Authoritarian governments will centralize and control information online, maybe producing fake news by themselves. Trust will more and more rely on single persons (journalists and gatekeepers with a high reputation, maybe measured through new ranking systems) instead of single institutions.”

Jane Elizabeth, senior manager American Press Institute, said, "The current downhill trajectory will reach rock-bottom soon and prompt more serious efforts to reverse the trend. In 10 years, we can and should be able to restore some of the trust that's been eroded.”

J. Nathan Matias, a postdoctoral researcher at Princeton University, previously a visiting scholar at MIT Center for Civic Media, wrote, “In our time, people already take billions of actions every month to manage and filter trusted information. By 2027, citizen behavioral scientists will routinely test the effects of these actions at scale, developing adaptive knowledge on effective ways to support public understanding in the face of rapidly-evolving misinformation.”

Barry Chudakov, founder and principal, Sertain Research and StreamFuzion Corp., commented, “Trust in information online will erode if media outlets do not position themselves and their media vehicles to build trust-measures into their content. Just by generating information we will not, magically, generate tools to better regulate that content – any more than driving your car down a road would magically generate road signs and traffic signals along that road. Keeping in mind the need for open access, transparency, and protection of privacy, online information sources will have to cooperatively generate new ‘rules of the road’ for online information.
User-generated content will continue to explode in the next decade. Virginia Tech’s ‘Evaluating Internet Information’ uses five criteria to determine the trustworthiness of online information:
1. Authority (Who is this person? How is he or she qualified?)
2. Coverage (How relevant is this information? Does it fully address the significant issues associated with the topic?)
3. Objectivity (Does the information show minimum bias? Are there links or ads that show the author’s agenda?)
4. Accuracy (Is the information reliable and error-free? Is there some kind of fact-checking confirmation of the information?)
5. Currency (How recent is the information? When was the page last updated?)
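The five criteria above can be pictured as a simple scoring checklist. The sketch below is purely illustrative: the criterion names come from the rubric, but the boolean field names and the equal weighting are assumptions made for demonstration, not part of Virginia Tech's guidance:

```python
# Toy sketch: score a source against the five criteria above.
# Field names and equal weighting are illustrative assumptions.

def trust_score(source: dict) -> float:
    """Return a 0.0-1.0 trust score from five yes/no checks."""
    checks = [
        source.get("author_qualified", False),   # 1. Authority
        source.get("covers_key_issues", False),  # 2. Coverage
        source.get("minimal_bias", False),       # 3. Objectivity
        source.get("fact_checked", False),       # 4. Accuracy
        source.get("recently_updated", False),   # 5. Currency
    ]
    return sum(checks) / len(checks)

example = {
    "author_qualified": True,
    "covers_key_issues": True,
    "minimal_bias": False,
    "fact_checked": True,
    "recently_updated": True,
}
print(trust_score(example))  # → 0.8
```

In practice each criterion is a judgment call rather than a boolean, but even a crude checklist like this makes the evaluation explicit instead of implicit.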
Readers need guidance, filters, standards. The information flood is here, and with it come truly positive outcomes and opportunities. But it also brings consequences, foremost of which is the need to manage that information – give the reader perspective and tools to coordinate the information with other information and ultimately evaluate its worthiness.
For example, users can upload social media posts, links, images, or other content to Check, an open web–based verification tool developed by Meedan, as part of their verification process: ‘Once an item is uploaded, it can be color-coded and tagged by subject matter. Users can regularly update the status of their reporting, add notes, and include other details that might be useful.’ (Nieman Lab: http://bit.ly/2tDB2yi)
By 2027, hackers and mischief-makers will use technology advances to create more confusion and work to obfuscate or distort the truth. Now is the time to build vigilance and standards into our information.
Magical thinking or wishing this to get better is foolish. We must get to work now or by 2027 the nonsense one hears today – you can’t trust any information anymore – may, like Orwell’s doublespeak, distort reality enough that people will assume it is true.
As William D. Lutz has written: ‘All who use language should be concerned whether statements and facts agree, whether language is, in Orwell's words, “largely the defense of the indefensible” and whether language “is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”’
By 2027 online information can puncture illusions, but only with vigorous attention to building confirmation tools that underline facts and foster truth-telling.”

Mercy Mutemi, legislative advisor for the Kenya Private Sector Alliance, wrote, "It will disintegrate.”

Larry Keeley, founder of innovation consultancy Doblin, wrote, "Parts of it will get worse. But most of it will get much better.”

Jan Schaffer, executive director of J-Lab, said, "I fear that people will draw tighter information circles of those they trust, but this may exclude alternative views and ideas.”

Steve Axler, a user-experience researcher, replied, "It will be no different than today. Intelligent people will filter, others won't.”

Nick Ashton-Hart, a public policy professional based in Europe, commented, "Trust will increase, but the processes that increase it will also reduce the ability of new forms of information dissemination to become publicly accessible as the costs of compliance reduce the ability of the private sector, especially SMEs, to innovate.”

Michael Rogers, principal at the Practical Futurist, wrote, "With the right tools and more public education, in 10 years we may return to the same level of trust that we had at the turn of this century. That would be a victory.”

Jack Schofield, longtime technology editor at The Guardian, now a columnist for The Guardian and ZDNet, commented, "News sources that distribute false information have a vested interest in discrediting more honest news sources – for example, Fox News benefits by discrediting CNN and the New York Times. I envisage more and more sources appearing over the next decade, each putting its own distinctive spin on the news, while trying to discredit rivals in similar niches. The result could be more sources catering to fewer people, with less agreement between sources about even basic facts. Once you've discredited the old ‘gatekeepers’ like the New York Times, the Washington Post and the Wall Street Journal, anything goes.”

Ian Peter, internet pioneer, historian and activist, wrote, "It is likely to deteriorate, and we all should become far more critical of what we read and are told, but I am not sure whether that will eventuate. We are more likely to carry on trusting information as if nothing had happened and just believe whatever we are told we should believe.”

Leah Lievrouw, professor in the department of information studies at the University of California-Los Angeles, wrote, "My guess is that the more popular, click-bait-y, online sources and streams will continue to have audiences (as tabloid or sensationalist, celebrity culture outlets always have). But the great online ‘pool’ of information will increasingly be distrusted by opinion leaders, decision-makers, institutions, and experts, who may need to create a separate ‘ecosystem’ of high-status – elite, if you will – and reliable sources for creating, sharing and debating information away from the populist ‘roar.’ Perhaps it will look a bit more like book publishing and libraries (with the ‘curation’ that implies), perhaps enclosed by paywalls (like academic publications?). But without an arena for trusted information to be created, circulated and debated in a fair way, there is little chance that a pluralist society can succeed into the future.”

Michael J. Oghia, an author, editor and journalist based in Europe, said, "If Wikipedia can be used as a benchmark, I've witnessed how it went from being laughable to practically a first-stop for legitimate and respectable information gathering in less than a decade. The fact is, while there is more content available to muddy the water between fact and fiction, new technologies, policies, education and human resources are being allocated to address this issue, so I'm optimistic it will improve.”

Jamais Cascio, distinguished fellow at the Institute for the Future, said, "There are multiple scenarios. We could be so mistrustful of online information that we look for alternative media of communication for trustworthy material, each potentially worse than the last; we could successfully develop tools and norms to push back against falsehoods (e.g., reliance on general public camera swarms as verification of video). We could be so polarized that people will trust information that comes from ideologically aligned sources and everything else is garbage. I suspect trust in information will be greater by 2027, largely because it will be easier to block out information and information sources that we don't like.”

Kenneth R. Fleischmann, associate professor at the University of Texas-Austin School of Information, wrote, "ICTs will continue to evolve and multiply. Fora for sharing and receiving information will continue to multiply. Fragmentation of discourse and development of filter bubbles will likely continue to increase. It's never safe or a wise idea to predict the future, but I see no reason (apart from some kind of nationwide or global catastrophe) that our political and information environments would become less fractured and polarized over the coming decade.”

Jim Hendler, professor of computing sciences at Rensselaer Polytechnic Institute, commented, "We will see considerable change as new technologies become available, online communities become both more diverse (as in growing Facebook groups) but also more point to point, as in messaging and direct communication groups. Trust is, and will always be, a social effect – and there will need to be consensus on the harms of purposely manipulating information for gain. There will always be grey areas, but current systems that reward people who make up stories from whole cloth for political effect (via site hits and advertising, for example) must become illegal acts – stating an opinion is protected speech and should continue to be – spreading lies as truth has always been regarded as an unethical act, and current systems that reward, rather than punish, such acts are clearly eroding trust.”

Taina Bucher, associate professor in the Centre for Communication and Computing at the University of Copenhagen, commented, "The next decade will see an increase in public awareness and debate over issues of trust and information online. This is not to say, however, that we are not living in a delicate moment; we are. Everything is not relative and there are not endless alternatives to the truth. We all have a job to do, the public, the politicians, the technologists and the journalists alike. There has not been a better time for the humanist, social scientist and the software developer to meet.”

Andrew Dwyer, an expert in cybersecurity and malware at the University of Oxford, commented, "We will have developed frameworks of trust recognition, with some sort of verification body that attests that content has been 'fact-checked,' in ways similar to emerging organisations now. These will be plural, due to the multiple perspectives required in democracies, yet some may verify one another, and so ecologies of trust will emerge that individuals and societies can subscribe to.”

Michael P. Cohen, a principal statistician, replied, "I hope people will trust information that has been checked and learn to check first.”

Dean Willis, consultant for Softarmor Systems, commented, "By 2027, online information will be as trusted as (the Russian news service) Pravda.”

Axel Bruns, professor at the Digital Media Research Centre, Queensland University of Technology, commented, “I would expect people to have formed a considerably more sophisticated, differentiated understanding of the relative trustworthiness of different (online as well as offline) information sources.”

Amber Case, research fellow at Harvard Berkman Klein Center for Internet & Society, replied, "Fake news is not new. Social networks are simply a new way to broadcast it. We've seen fake news in the 1700s, 1800s, etc., and as we become more accustomed to it we will probably learn how to spot it better. Right now we're in a serious emotional time with information. It can trigger intense feelings and reactions that make it difficult to make sober choices or take a step back. We'll probably learn a bit more about this and become accustomed to it over time. In the same way that a fake news story from 1860 might look ridiculous to us now, we'll probably feel the same way about news stories posted in 2017 when we look back on them in 10 years. The trick is to be able to have that perspective during the moment we read the news story.”

George Siemens, professor at LINK Research Lab at the University of Texas-Arlington, commented, "Within a decade, the amount of misinformation will increase due to bots and propaganda, but so will mechanisms to intentionally identify and isolate false information.”

Matt Mathis, a research scientist who works at Google, said, "We will get smarter at separating facts from alternate facts.”

Micah Altman, director of research for the Program on Information Science at MIT, commented, "Reaching an equilibrium by 2027 is unlikely, and advances in technology will yield cycles in information trustworthiness as technologies for manipulating and verifying (respectively) information advance, and society reacts to them. In the mid-term, distributed ledger technologies (e.g., blockchain) will provide a powerful tool for establishing verifiable information in some scenarios. In addition, as a result of trends in information privacy in Europe, trust in the management of personal information online may be improved.”
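The distributed-ledger idea Altman raises rests on hash-chaining: each record's hash covers both its content and its predecessor, so any later edit is detectable. The following is a toy illustration of that mechanism under assumed names (`make_block`, `verify_chain`), not the API of any particular blockchain:

```python
import hashlib
import json

def make_block(statement: str, prev_hash: str) -> dict:
    """Create a record whose hash covers its content and its predecessor."""
    body = {"statement": statement, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify_chain(chain: list) -> bool:
    """Recompute every hash; editing an earlier block breaks all later links."""
    prev = "0" * 64  # genesis sentinel
    for block in chain:
        body = {"statement": block["statement"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
prev = "0" * 64
for claim in ["Report published 2017-06-01", "Correction issued 2017-06-03"]:
    block = make_block(claim, prev)
    chain.append(block)
    prev = block["hash"]

print(verify_chain(chain))           # True
chain[0]["statement"] = "tampered"   # retroactive edit...
print(verify_chain(chain))           # False: the chain no longer verifies
```

The tamper-evidence comes entirely from the hashes; a real ledger adds distribution and consensus on top, so no single party can quietly rewrite the chain.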

Brian Cute, longtime internet executive and ICANN participant, said, "Users will have more tools that offer trust in information online. At the same time new techniques to deceive or promote fake news in new forms will be developed. It will continue to be a ‘mixed bag’ of trust and deception with individual responsibility being the most important element to protect the user.”

Giacomo Mazzone, head of institutional relations for the World Broadcasting Union, replied, "The world will be divided in do-knows and don’t-knows. Only the first ones will be able to find trusted sources.”

Ryan Sweeney, director of analytics, Ignite Social Media, wrote, "Trust in information 10 years from now relies on our actions today. If we can curb these negative trends and rebuild the marketplace of ideas, our trust in information – and each other – will vastly improve. However, if we continue our current trajectory, the film ‘Idiocracy' will be reclassified as non-fiction.”

danah boyd, principal researcher, Microsoft Research and founder, Data & Society, wrote, "We are going to see more and more adversarial attempts to gaslight the public. This is not specifically about the internet. This is about who trusts what.”

Frank Odasz, president, Lone Eagle Consulting, wrote, “By 2027, we'll have learned the public internet has been soiled, and walled gardens are necessary to separate those who desire to build trust and a better world from those who seek to destroy what others have built, and/or seek to profit at the expense of others. A reputation economy is evolving where it matters what you put online (and then can't delete). But history teaches us that civilization has cycles, and we're seeing a seeming loss in America of decency, ethics and honesty and the world sees mercenary interests are in control that threaten civil society at all levels.”

O'Brien Uzoechi, a business development professional based in Africa, replied, "If misinformation continues to go on unchecked, trust will become a trash word in 2027. But, with appropriate laws and the right application of development through technological commitments there could be a turnaround in our trust in information dissemination by 2027.”

Mark Glaser, publisher and founder, MediaShift.org, wrote, "Hopefully by 2027 systems will be in place to help people better understand digital information and sources, with improved digital media literacy and public education. Trusted sources will be more valued, and people will gravitate to them.”

Susan Etlinger, industry analyst, Altimeter Research, said, "It all depends on the state of the world and the political, health and economic impact on the individual. Technology has created an information arms race that is very similar to what we see with cybercrime and hackers. My guess would be that information ecosystems will behave similarly: periods of relative apathy punctuated by panic and outrage.”

Andreas Birkbak, assistant professor, Aalborg University, Copenhagen, said, "There will be more online brokers of information who rely on a reputation of trustworthiness to attract an audience.”

Rob Atkinson, president, Information Technology and Innovation Foundation, wrote, "Trust will increase by 2027, as technology improves and as more people are better able to differentiate real from fake information.”

John King, professor, University of Michigan School of Information Science, said, "Caveat User: We'll learn a lot about trust, which we think we understand now, but we don't.”

Marc Rotenberg, president, Electronic Privacy Information Center, wrote, "Brands will continue to determine the public's perception of trust. But perception and reality often diverge.”

Iain MacLaren, director of the Centre for Excellence in Learning & Teaching, National University of Ireland-Galway, commented, "The default position, which is taking shape even now, will be that of not taking seriously information that is not backed up by evidence or which is part of an obvious 'high shock' deluge. Just as we have developed the ability to screen out many of the ads that plaster websites, so too will we see much of this type of 'information' as electronic noise.”

Greg Wood, director of communications planning and operations for the Internet Society, replied, "I am hopeful that systems and practices will be developed and deployed that improve the ability of internet users to better verify online information and its sources. However, it is not clear if the economic and other drivers to do this exist. And, practical incentives to spread false information will remain. I'm optimistic, but the history of the internet has shown that projecting its future is fraught with uncertainty!”

Stephen Bounds, information and knowledge management consultant, KnowQuestion, said, "By 2027, trust in science and journalism without a known personal endorsement will have continued to erode. Governments and commercial organisations will all either own or lease access to significant aggregations of on-demand media. Traditional media advertising will be all but obsolete. Instead, the 'influencers' that star in these channels will be paid to pass on information to their followers. However, since this is common knowledge, their views will be treated with suspicion (thus repeating the cycle of increasing media-savviness seen in the previous iteration of advertising through mass media). A small but increasingly influential band of information providers known as 'patronus' will rely exclusively on no-strings-attached support from patrons. They will pride themselves on their fierce independence and champion issues of political and social importance that receive intense focus from their followers. Their success rate will be higher than the most highly-paid political lobbyists. Patronus will often be subject to information warfare attacks and lawsuits from disgruntled parties, and will be forced to invest in countermeasures as part of the cost of doing business. The most successful will have a staff to vet requests for coverage by governments, scientists and commercial organisations. Only a small percentage of these requests will be covered on the ‘main channels,’ but additional ‘side channels’ for niche topics of interest will be curated and published by their staff. Five years in, a patronus will suffer a damaging hit to their reputation when a second-in-command is bribed into publishing side channel content beneficial to The Walt Disney Company. In countries that outlaw or fail to develop a patronus culture, the shift towards authoritarianism will be marked. 
In the absence of reputable sources of information, citizens will tend to find a single outlet for information and consume it unquestioningly, reasoning that ‘they are all as bad as each other anyway.’ This will make government and corporate manipulation of sentiment easy to achieve.”

John Wilbanks, chief commons officer, Sage Bionetworks, replied, "We will find this question quaint and outdated by then. Trust is a word that gets redefined by new generations with new access to information. So this isn't about ‘trust’ but about ‘what we thought trust was before it got subsumed in an information flood.’”

Charlie Firestone, executive director, Aspen Institute Communications and Society Program, commented, "People will be skeptical of information online, but most (or at least many) will have the skills to determine the truthful sources if they care to.”

Sebastian Benthall, junior research scientist, New York University Steinhardt, responded, "People will trust particular sources in ways that are correlated with their demographics, education, and involvement in offline institutions and will not trust information online in general.”

Darel Preble, president and executive director, Space Solar Power Institute, commented, “The future depends on how well we all fight for accuracy, trust and fair play and develop superior and smarter communication skills.”

David Weinberger, writer and senior researcher at Harvard’s Berkman Klein Center for Internet & Society, said, "At best, we will have learned that while the Net looks like a publishing medium, it is not. It is a conversational medium in which ideas are promulgated without always having been vetted. We will become more ‘meta’ in our approach and recognize that we have a responsibility to question the truth and validity of what we see. That's always been our obligation but we have spent centuries outsourcing it to authorities. By 2027, perhaps we will recognize that it's up to us. It is the most basic and urgent of collaborative tasks the Net requires from us. Taking this meta step would be a significant achievement in the history of civilization. Maybe we'll get there.”

Bernie Hogan, senior research fellow, University of Oxford, said, "I'm sure we want to believe it will get better, but I assume that instead it will get more effectively manipulated. Those on the right are increasingly suspicious of institutions and those on the left are suspicious of many actors that do not pander to their specific cause. Personalised, demographically appropriate celebrities will be increasingly available to appeal to specific groups. A cataclysmic event such as a pandemic or world war might disrupt this trend, wherein we reevaluate the overall state of information distribution. Barring that, I imagine it will be business as usual, with people trusting what they believe in, in the most convenient, smallest doses possible. I mean we would much rather buy an intelligent agent that tells us what we want to hear than one that tells us what we should hear to engage in politics beyond the local level.”

Ian Peter, an internet pioneer, historian, activist user and futurist, replied, "Trust should deteriorate, and we all should become far more critical of what we read and are told, but I am not sure whether that will eventuate. We are more likely to carry on trusting information as if nothing had happened and just believe whatever we are told we should believe.”

Rich Ling, professor of media technology, School of Communication and Information, Nanyang Technological University, said, "I am somewhat hopeful that our current trials will help us to understand the dimensions of the issue and develop measures to support both the press and democracy. Society faced somewhat similar issues with the development of the printing press. In that case, there was the development of mechanisms that worked to enhance the positive sides of the development while hindering the negative effects. That interaction took many decades (and perhaps centuries) to work out. Hopefully we will be able to address this issue in a reasonable way on a shorter time-scale.”

Rajnesh Singh, Asia-Pacific director for an internet policy and standards organization, wrote, "We could go two ways. One is a complete breakdown of trust in information online which would then put into question anything online. The other could be that society and technology work hand in hand to differentiate fact from fiction and work to self-correct. I'm hopeful it will be the latter.”

Henning Schulzrinne, professor and chief technology officer for Columbia University, said, "There will be two worlds – one world of people and institutions that value factual accuracy, with correction and reputation mechanisms, and the other where anything goes. The hard part is not distinguishing truth from malicious fiction but choosing to ignore the latter.”

Sonia Livingstone, professor of social psychology, London School of Economics and Political Science, replied, "I really hope that the public will become much more discerning, sceptical and mindful of information quality, source and intent. Will it? Yes, for the mindful – the educated, the politicised, the angry, probably not for everyone else. I can't foretell which will be in the majority – that depends on some other things like the economy, geopolitics, etc.”

Eric Burger, research professor of computer science and director of the Georgetown Center for Secure Communications in Washington, DC, replied, "One could expect to see reputation brokers, be they private enterprise (e.g., in the US) or the state (e.g., in China).”

Barry Wellman, internet sociology and virtual communities expert and co-director of the NetLab Network, said, "We will have better means for verifying information.”

Tom Worthington, honorary lecturer in the Research School of Computer Science at Australian National University, commented, “We may see subscriber-based services for information verification replace ‘news’ services. There is a risk that governments will try to regulate and force their neighbors to also do so, as Saudi Arabia is currently doing to Qatar.”

Greg Shatan, partner, Bortstein Legal Group, based in New York, replied, "At best, no material change. At worst, a significant degradation.”

Seth Finkelstein, consulting programmer with Seth Finkelstein Consulting, commented, "When people are bombarded with contradictory and confusing information, they often fall back on a strategy of just going with their gut feelings. While that's an entirely reasonable and understandable reaction, it's also good for manipulators. When there's much noise, only what's loud and simple gets heard. That's not necessarily what's right. Thus in the absence of dramatic changes reining in laissez-faire capitalism, I expect trust in information overall will continue to worsen.”

Virginia Paque, lecturer and researcher of internet governance, DiploFoundation, wrote, "We are entering into a period of open and serious skepticism of any information, online or offline. I hope that this will quickly be followed by implementation of tools to address this both on and offline, and we will have recovered before 2027. The internet will require a quick response time to maintain its usefulness.”

Tatiana Tosi, netnographer at Plugged Research, commented, "The information online in 2027 mostly will be verified, and at the same time you will still have independent channels and citizen journals. It will be a combination of major trusted information channels, verified each year and daily by AI and a board of advisors.”

Barry Parr, owner of Media Savvy, replied, "Generally, we will have a better idea of who the trusted sources are, but there will be less variety in points of view from trusted sources.”

Pamela Rutledge, director of the Media Psychology Research Center, said, "Trust in information depends on individuals taking action and responsibility on their own behalf. If we try to offload responsibility, we will give away freedom.”

Richard Lachmann, professor of sociology, State University of New York-Albany, replied, "Trust will increase for both true and false online information.”

Mark Bunting, visiting academic at Oxford Internet Institute, a senior digital strategy and public policy advisor with 16 years' experience at the BBC, Ofcom and as a digital consultant, wrote, "Trust in information online will be largely what it is today – that is, most people have trust in most of what they consume, but they trust some sources more than others, and can occasionally be fooled. The big question is whether trust in information from public institutions will have improved or declined – if the latter I fear our polities will be in an even direr state than they are today.”

Tim Bray, senior principal technologist for Amazon.com, wrote, "I believe that the people pushing the lying stories also have an explicit political agenda, and once that agenda is discredited, the effect on lying-as-a-strategy will be salutary.”

Kelly Garrett, associate professor in the School of Communication at Ohio State University, said, "In 10 years’ time, there will be victories on both sides. Individuals and societies will have developed new strategies to keep online deception in check, and Americans will have found renewed faith in some type of authority (because believing nothing and no one is untenable). Unfortunately, some bad behaviors will have been normalized, and new threats to our ability to know what to believe will emerge. Ideological divisions will remain sharp, and beliefs will continue to fall along party lines. Foreign powers’ attempts at political manipulation via disinformation will be more commonplace. And technologies for fabricating audio and video recordings of events that never happened will be widely known, and regularly abused.”

Thomas Frey, executive director and senior futurist at the DaVinci Institute, replied, "We're working our way through a challenging transition with most proposals taking a piecemeal approach to finding a solution. Some of it can be handled through advanced technology, some through better systems and techniques, but it will not be a smooth journey. Overall, trust levels will climb, but it will come with some unfortunate gaffes along the way.”

Daniel Berleant, author of the book "The Human Race to the Future," commented, "Trust will decline, as society becomes more polarized and more segmented into parochial special interests. If and when society turns a corner and prevailing values begin to favor the common good, trust may begin to increase, but there is no particular reason to believe this will occur soon.”

Willie Currie, a longtime expert in global communications diffusion, wrote, "It is not as if propaganda has just been invented with social networks and hackers. The scale and degree of intimacy is different. Authoritarian states already exert controls over online information. PR firms are specializing in tactics similar to those used by Russia in the US elections. In South Africa, PR firm Bell Pottinger developed a communications strategy based on fake news and disinformation for the Gupta family, which has effectively captured the government of President Jacob Zuma and the African National Congress. Outraged South Africans have reacted to this by besieging Bell Pottinger's Twitter account and forcing them to make it private. The revelations (through email leaks of Gupta family correspondence) of Bell Pottinger's role in this concerted attack on the citizenry of South Africa have resulted in a grovelling apology and potential sanctions by PR regulators in Britain. The point is that there is no one-way road to increasing manipulation of populations by the purveyors of fake news for nefarious purposes. Citizens and institutions do fight back. I expect to see increasing regulatory and legal steps internationally to curtail the practice over the next ten years. The Bell Pottinger/Gupta family/Zuma/fake news affair makes a good case study in a developing democracy. I imagine as the Robert Mueller investigation reaches its conclusions there may, depending on what happens in the next US elections, be steps taken to curtail the practice. If not, we're on the road to fascism.”

Katim S. Toray, an international development consultant currently writing a book on fake news, said, "I expect that, on the whole, there will be slight improvement in trust in information online as a result of increased user awareness about the fake news menace and increased efforts to curb it in the first place.”

Wendell Wallach, a transdisciplinary scholar focused on the ethics and governance of emerging technologies, The Hastings Center, wrote, "The trustworthiness of online information will remain mixed, so we will need to rely more upon the discernment of individuals and their digital education in discriminating what information is reliable. Hopefully, that discernment can be improved through education, experience and the recognition of the value of information that is worthy of our trust.”

Amy Webb, author and founder of the Future Today Institute, wrote, "Today, people – not algorithms – are still the most important drivers of fake news. We’ve become conditioned to share before we read all the way through a story, or before our common sense kicks in. We’re also slaves to our amygdalas, and this moment in human history is rife with economic uncertainty, geopolitical anxiety and wild stories about the future of transformative technologies like artificial intelligence and genomic editing. Given what we know to be true today, it’s clear that we’re on a dangerous path towards the future. Without significant changes, the public trust of quality news will continue to erode, which inevitably contributes to the financial demise of our once-lauded news organizations. Without trained investigative reporters, copy desks, producers and editors, we’ll find ourselves drowning in information but without any sense of which paddle or tree branch to grasp onto for help. Around 2027, people – and the artificially intelligent systems that work alongside and augment them – could have to make decisions based on a cesspool of misinformation, misleading statistics, rumour, innuendo and whatever’s left of our trusted news organizations. It’s a bleak outlook, but here’s something important to keep in mind: that future hasn’t happened yet. The future has always been our shared responsibility in the present. When you stop to think of the critical role that you, personally, play in what’s over the horizon, it can be very empowering. And, by the way, that’s a good way to keep your amygdala in check.”

Ian O'Byrne, assistant professor at the College of Charleston, replied, "Complacency will set in and some bad actors will be prosecuted to make us believe something has happened to address these issues. Business, governments, and organizations will continue to spread these digital texts and tools and play it ‘fast and loose’ with our rights and liberties. Our online tribes and affinity spaces will continue to fracture and solidify as we find more in common with the collection of friends we have online than we do with the people on our street, state or country. Trust and truth will be different commodities for different individuals in and across these spaces. Everyone will have trust and truth. It will just mean different things for different people.”

Giovanni Luca Ciampaglia, a research scientist at the Network Science Institute, Indiana University, wrote, "Different sectors of society will have to work together; this includes the press and the social media companies whose platforms connect society with information. And we will need to improve our understanding of these digital information networks to make this happen.”

Jason Hong, associate professor, School of Computer Science, Carnegie Mellon University, said, "There will be less trust overall by 2027 for two reasons. First, fake news is like Gresham's law: the bad drives out the good. As there is more of it, it becomes harder and more time-consuming to differentiate between the good and the bad. The second is the growing criticisms (some justified, much of it not) of mainstream media. In both cases, it is important to remember that there are specific groups that benefit from both of the above (fake news and criticizing mainstream media), and thus have strong incentives to keep doing more. Unless we can find ways of undercutting those incentives, fundamentally changing the cost-benefit, we'll just keep seeing more and more fake news and misinformation.”

Daniel Menasce, professor of computer science, George Mason University, replied, "The answer will depend on how the level of education of our society evolves in the next decade.”

Dave Burstein, editor of FastNet.news, said, "The best but unlikely outcome would be for people to learn to be less trusting.”

David Manz, a cybersecurity scientist, replied, "Trust is an attribute of a relationship between two human beings. You don’t trust a chair to hold you. You trust the maker, the installer, the last user, etc. Similarly in the use of computers we might anthropomorphize them but at the end of the day it is trust between the human content creator, the distributor, the echo chamber, your peers and finally you the consumer.”

Hjalmar Gislason, vice president of data for Qlik, said, "Online information in 2027 will be more reliable than in 2017, mainly through availability of reliable information contextual to the content we are consuming (meta-data) and tools that make this context available to online citizens.”

Wendy Seltzer, strategy lead and counsel for the World Wide Web Consortium, replied, "‘Trust in information online’ will continue to be a near-meaningless concept. We'll be able to trust some information, whether its immediate source is online or off-, and distrust others. More important will be the end-to-end nature of trust: can we add enough source-to-reader indicia that enable readers to determine whether to trust the source and its reliability?”

Emmanuel Edet, head of legal services, National Information Technology Development Agency of Nigeria, wrote, "There will be designated sources of information which can be trusted and other sources would be considered suspect.”

Joshua Hatch, president of the Online News Association, said, "Trust will be improved, as there will be more-sophisticated consumers and more social awareness, but the problem won't be completely solved.”

Ari Ezra Waldman, associate professor of law and New York Law School, wrote, "Like today, people will trust information that confirms their biases. They will not trust information that challenges those biases.”

Alexios Mantzarlis, director of the International Fact-Checking Network based at Poynter Institute for Media Studies, commented, "It is impossible to know. To give but one number: 10 years ago Facebook had 58 million monthly users; it now has 2 billion. Shouldn't we expect an equally dramatic evolution in our online information landscape in the next 10 years?”

Alan Inouye, director of public policy for the American Library Association, commented, "There will be some net deterioration in trust by 2027. It is easier to play offense than defense in this arena. I am more concerned about differential impacts. More affluent people with graduate education will continue to access systems that are mostly trustworthy. Other socio-economic groups could be subjected to less robust systems, and importantly, the gap between the haves and have-nots grows – it is a new kind of digital divide – the trust divide.”

Jim Warren, an internet pioneer and open-government/open-records/open-meetings advocate, said, "Trust will improve. It already has – e.g., the recent election results in England and France. Most people adjust fairly quickly to discounting false and misleading information once they recognize it as such. The trustworthiness of information will be judged in the future, as it has always been – by the reputability (in the eye of the beholder) and competency of the source.”

Evan Selinger, professor of philosophy, Rochester Institute of Technology, wrote, "How much trust is given to online information in 2027 will be determined, to a large extent, by whether society comes to its senses and recognizes: 1) That democracy requires quality investigative journalism; and, 2) That this, in turn, requires financially supporting the organizations and companies that can provide it. Algorithmic policing of content and generation of content shouldn’t be fetishized as forms of solutionism.”

Eugene H. Spafford, internet pioneer in cybersecurity and professor at Purdue University, commented, "Trust will become more bimodal – some sources will be more trusted as correct by the majority but a significant percentage of people will continue to view dark conspiracies and fringe theories, thus disbelieving the better sources. This will be unevenly distributed globally, with some countries more prone to such fringe beliefs.”

Michel Grossetti, research director, CNRS (French National Center for Scientific Research), commented, "There will be a competition between the true and the false, as always.”

Maja Vujovic, senior copywriter for the Comtrade Group, said, "Trust will gradually diminish in the short and medium terms, necessitating that new filtering mechanisms be devised, tested and applied. The solutions will not come from governments, but from technology and mass human effort, akin to Wikipedia. Many people – those who can afford to – will opt to pay for access to reliable information. But the sheer number of those who cannot, coupled with ethical considerations, will spawn technological solutions and new standards in information quality control. The whole society will need to step up and this will result in a new norm of what it means to be literate.”

Stuart A. Umpleby, professor emeritus, George Washington University, wrote, "Trust in information online will improve by 2027.”

William L. Schrader, a former CEO with PSINet Inc., wrote, "Much like HTTPS helped provide perceived improved security for financial and other information, I suspect other technologies and organizations will be created which validate that the ‘publisher’ is of very high or very low repute. That report can also be hacked, but it will be noticed, and published. In short, there is so little trust in online information now that trust may actually go up.”

Dan Gillmor, professor at the Cronkite School of Journalism and Communication, Arizona State University, commented, "If we do this right, people will be better able to sort things out for themselves, using critical thinking skills and new tools that will be developed to help.”

James Schlaffer, an assistant professor of economics, commented, "People will adjust to the amount of available information better. Also, the people who only want news from their worldview will double down on their own narratives.”

Stowe Boyd, futurist, publisher and editor in chief of Work Futures, said, "I predict a rapid increase in ‘information trust’ online that will directly track the rise in capabilities in AI. Of course, we have to trust the AIs too. Quis custodiet ipsos custodes?”

Larry Diamond, senior fellow at the Hoover Institution and FSI, Stanford University, wrote, "I have given up predicting things like this. Social scientists don't have a great track record of prediction. I will only predict, given the speed at which things are moving technologically, that by 2027 cyber technical means and consequent social and political challenges will have emerged that we haven't even imagined today.”

Fredric Litto, professor emeritus, University of São Paulo, Brazil, wrote, "It will be increasingly diminished, much as we currently witness, dividing every community into ‘tribes’ organized by ideological, economic, religious and other cultural criteria, thereby augmenting extreme stress in everyday life, with unpredictable, long-range consequences.”

Kevin Werbach, professor of legal studies and business ethics, the Wharton School, University of Pennsylvania, said, "There will be no difference between trust online and offline. There will be more online tools for those who care about having a verified and curated information feed.”

Filippo Menczer, professor of informatics and computing, Indiana University, said, "There will be a continuous arms race between increasingly sophisticated abuses and countermeasures. Trust will not be completely restored nor completely lost.”

Adam Holland, a lawyer and project manager at Harvard's Berkman Klein Center for Internet & Society, said, "It will in general decrease, as the sheer amount of what is available proliferates. Alongside this trend, information consumers will also *increase* trust in information from certain people or outlets. This trust will sometimes be warranted, but it will also sometimes be the result of avoiding cognitive dissonance or of (virtue) signaling tribal allegiance.”

Susan Hares, a pioneer with the NSFNet and longtime internet engineering strategist, now a consultant, said, "The origin of the internet was guided by the US National Science Foundation (NSF), which had strict rules on what information could be added to the internet. This created an atmosphere of public trust. For example, pornography was not allowed. Since 1995, the US government has not taken control and the quality of information has gone downward. For example, pornography websites have created a national problem with online pornography. Today, families pay for ‘clean’ sources and blocks for pornography. News information used to be tested by being distributed from key news channels. Now, these news channels are no longer trusted. Many individuals are ready for unbiased news on the internet, and are willing to pay for it. Two futures exist for the internet. One option is that internet service providers decide that they will no longer offer the ‘free unfiltered service’ and provide only clean data. This level of service will clean out many bots, attacks, and pornography. In the second, the internet service providers will continue to have two services: trusted and ‘anything goes’ internet. Businesses and individuals will desire information that is trusted – so a portion of the internet will have the ‘high-trust’ information.”

Matt Armstrong, an independent research fellow working with King's College, formerly executive director of the U.S. Advisory Commission on Public Diplomacy, replied, "2027 is too far out to consider. We may have had one, two, three or more virtual sine waves toward and against false information by then. At present, I am pessimistic that we will collectively come together to fix the education system or come together behind a national identity (which is not exclusive of international engagement).”

Alfred Hermida, an associate professor and journalist, commented, "It depends on what you mean by ‘information.’ From whom? How? When? Anxieties about new forms of information come with every technological revolution, leading to social and cultural changes in information practices.”

Charles Ess, a professor of media studies at the University of Oslo, wrote, "This will very much depend on what happens in the next few years – and it will likely vary widely from places such as Norway (with trust levels at 71%) vs. the United States (with trust levels at 38%). As is well known and documented in the relevant literatures, once trust is broken, it is very, very difficult to restore. Once trust in institutions such as the government or the media is broken, it will be very difficult to restore – more so in the U.S. than in Norway, for example. Broadly, if measures to restore trust are not successful over the next few years, I am pessimistic about the possibilities of doing so in the long run.”

Rick Forno, senior lecturer in computer science and electrical engineering at the University of Maryland-Baltimore County, said, "Trust will be improved, but it's still up to the end-user reader to determine what's ‘fact’ vs. ‘fiction’ vs. ‘fake.’”

John Anderson, director of Journalism and Media Studies at Brooklyn College, City University of New York, wrote, "Will we still have a functional and definable society in 2027? I am not so sure.”

Stephen Downes, researcher with the National Research Council of Canada, commented, "Nothing will happen. There will definitely not be a technological change that guarantees the accuracy of online information. There will not likely be a political change, unless perhaps it is in the direction of getting worse (though given the current environment, who could tell whether it has actually gotten worse?).”

Philip J. Nickel, lecturer at Eindhoven University of Technology, said, "I doubt whether we will find ways to sustain a strong culture of journalism and reporting, since the business model has collapsed. Some innovation is needed.”

Miguel Alcaine, International Telecommunication Union Area Representative for Central America, commented, "There will be trusted information from trusted sources and misinformation as well. Technology will evolve for good guys and bad guys to use. The key is education, so people can better decide for themselves whether or not to trust. For example, why do people share personal information in reply to emails allegedly sent by a bank?”

Laurel Felt, lecturer at the University of Southern California, wrote, “There will be less trust in information online by 2027 because some of our naïveté will be gone, but we will have better mechanisms for flagging suspicious information.”

Isto Huvila, professor of information studies, Uppsala University, replied, "People will learn new strategies for coping with untrustworthy information and finding reliable information sources. The question will probably not be whether to trust information online so much as whether to trust information in a specific place or service online.”

James LaRue, director of the Office for Intellectual Freedom of the American Library Association, commented, "There are so many factors to consider thinking ahead to 2027! Are we still a democracy, or a tyranny? Do we still have technological infrastructure, or have we bombed ourselves back to the pre-industrial age? Have we developed enduring systems to support the collection and reporting of reliable information? Few trends follow a straight line from the present. A less-partisan society, a non-cynical investment in institutions like newspapers, libraries, and advisory boards (scientific and budget, for instance) would give us a very different outlook from my currently bleak projections.”

Stuart Elliott, visiting scholar at the US National Academies of Sciences, Engineering and Medicine, wrote, "People will have sufficient trust in the information sources they use online to feel comfortable using them.”

Tanya Berger-Wolf, professor at the University of Illinois-Chicago, wrote, "It will not be very much different than it is today. People will have different trusted sources they believe provide reliable information, and there will always be sources of misinformation with a percentage of people believing them.”

Veronika Valdova, managing partner at Arete-Zoe, said, "In the next decade, it is likely that some parts of the internet will become more regulated because of enforcement against certain types of behavior in response to the outcomes of a series of new landmark cases. Unregulated parts of the web will likely get worse.”

Scott MacLeod, founder and president of World University and School, replied, "It will build historically in the USA on the trust of the media ecosystem of today in 2017 – so, similar, since the USA (with its laws) is generating much of the coding of today's information ecosystem. To the degree that, with a distributed internet, each of us individuals around the world can continue to be an information producer (while continuing to be an information consumer) – creating our own URLs – this will increase trust.”

Peter Jones, associate professor in strategic foresight and innovation at OCAD University, Toronto, commented, "Trust in mainstream news is so low that its validity is nearly irrecoverable. They can't afford to repair trust through content or editorial; their business models are broken. The trends of social media are terrifying – we have created massive echo chambers for confirmation-bias seeking. Rather than seeking a single truth regime, we should use more of a people's ‘trial of the facts.’ Online source pools such as WikiLeaks and independent journalists are our best hope.”

Pete Cranston, knowledge management and digital media consultant, replied, "People will find ways and means of accessing information they trust, in a network of trusted sources.”

Ben Justice, professor and chair in Rutgers University's department of educational theory, policy and administration, wrote, "‘Online’ will not exist as we know it today by 2027. I'd be safer speculating about the next season of ‘Game of Thrones.’”

Morihiro Ogasahara, associate professor at Kansai University, said, "Trust in information via online social networks will increase but remain lower than trust in mainstream media, which will mainly be delivered online.”

Mark P. Hahn, a chief technology officer, wrote, "It's hard to imagine. I *hope* we see more ways to understand where information originates and how it is synthesized and presented. A better understanding of the chain of custody for information, and contact with local sources, will allow individuals to adjust their own trust models rather than simply adopting one from a centralized organization or power broker. People will still defer to a trusted advisor, but ideally that advisor is more local, and tools will exist for that advisor to better evaluate the chain of custody of sourced material.”

Scott Guthrey, publisher for Docent Press, said, "Information – online or offline – is always a signal in noise. We all apply different filters to reduce the noise and then look for the signal we want. Going on-line may have changed quantity and accessibility of signals but there has been no change in kind since some caveman whispered, ‘I love you.’”

Paul N. Edwards, Perry Fellow in International Security, Stanford University, commented, "Trust in many sources will be even further eroded.”

Philippa Smith, research manager and senior lecturer in new media, Auckland University of Technology, said, "It will be most important for people to develop key skills and awareness when it comes to deciding what they should or should not trust online by 2027. It comes down to education.”

Steve McDowell, professor of communication and information at Florida State University, replied, "We will need to have higher levels of information literacy, and be encouraged to be skeptical consumers and users of news and information. As in other sectors, such as consumer goods and services, our trust may need to be gained or built by using a network of references.”

Jonathan Brewer, consulting engineer for Telco2, commented, "Trust in all information will be diminished by 2027, whether it comes via the internet, broadcast, or print.”

Fred Davis, a futurist based in North America, wrote, "You will have friends you trust and hope they have the right information. Ultimately we need a better-educated society that has a better ability to discern things for itself. Since the educational level of the US is far below that of many other developed countries, it is a hotbed for misinformation. Given the current state of education, and the high cost of college, I doubt the country will be smart enough to make the kind of investment it would take to improve our educational systems. Teachers get inadequate pay, which attracts fewer capable teachers.”

Philipp Müller, postdoctoral researcher at the University of Mainz, Germany, replied, "I would assume that individuals will have become more experienced in judging which information to trust and that single instances of (mis)information will not be able to cause an uproar as they can nowadays. As a consequence, I hope we all will have become more placid about online information, which will hopefully help to support trust.”

Daniel Alpert, managing partner at Westwood Capital, a fellow in economics with The Century Foundation, wrote, “The problem with this question is that people DO trust their self-selected information sources, regardless of their veracity.”

Isabel Walcott Draves, president of Crowdfest Inc., wrote, "In 10 years there will no longer be an ‘online’ as people see it today. Everything we do will be internet-enabled; we won't go to a specific phone or laptop or television to ‘get online,’ connection will just be ambient and everywhere. People will be immersed in information environments they trust, shaped by individuals, communities, organizations and corporations they trust. However, people will all be getting the same news, the same facts or the same opinions the same way they always have, based on whom they surround themselves with. That will never change, and we'll just have to learn to work out our differences or die.”

Denise N. Rall, adjunct research fellow, Southern Cross University, Australia, said, "I suspect that intelligent people will manage to filter out the dross and the remainder will continue to believe any piece of conspiracy, bigotry and rubbish that they come across. Exactly what happens today, only with fewer scientists and more reality-deniers in place.”

Robert W. Glover, assistant professor of political science, University of Maine, wrote, "Trust will decline precipitously and this will result in at least one catastrophic event resulting in loss of life (military action in response to disinformation, breakdown of social order, etc.).”

Tony Smith, boundary crosser for Meme Media, commented, "Islands of sufficient trust will grow unless the next great failure of capitalism is triggered or, less likely in that time frame, ecological or climate systems collapse become so visible they can't be ignored.”

Steven Polunsky, writer with the Social Strategy Network, replied, "We will have more information with which to make better-informed judgments.”

Timothy Herbst, senior vice president of ICF International, said, "I hope it will improve. Perhaps trusted third-party verification entities will become part of the solution to rate or verify information.”

Carol Wolinsky, a self-employed marketing researcher, replied, "Each side will be certain its version of the information is ‘honest’ and that of the opposition is ‘fake’; I don't see potential for a coming together.”

Joel Reidenberg, chair and professor of law, Fordham University, wrote, "The need for trust is likely to increase ‘gated communities’ online.”

Jack Park, CEO, TopicQuests Foundation, said, "Aspirationally thinking, trust will increase. Realistically thinking, not so much. That's because, if one reads the way the questions are posed in this survey, it's unlikely that anything will be done beyond criminalizing fake news the same way we criminalized drugs; how's that worked out for us?”

Johanna Drucker, professor of information studies, University of California-Los Angeles, commented, "We can hope that an information ecology will emerge that allows user/readers to distinguish among the different ethical values on which different sources operate.”

Tom Wolzien, chairman of The Video Center and Wolzien LLC, said, "I think marketplace economics will require verified responsibility, which will improve trust. The first marketplace evidence is in digital subscriptions to the New York Times, Washington Post, etc.”

Stephan Adelson, an entrepreneur and business leader, said, "There will be sources of truthful information among sources of untruthful information.”

Mike O'Connor, a self-employed entrepreneur, wrote, "I can only work toward a cure, not predict our success.”

Don Kettl, professor of public policy at the University of Maryland, said, "I fear it is likely to diminish.”

Antoinette Pole, associate professor, Montclair State University, said, "People will continue to rely upon trusted sources most notably the mainstream media.”

Shawn Otto, author of "The War on Science," wrote, "On the whole trust will further erode.”

David A. Bernstein, a marketing research professional, said, "Unless things change, I foresee a continued erosion of trust of all kinds of information. If you cannot trust what you are told, hear, or read regardless of the source, why would anyone follow any rules? After all, how do I know that rule or government order is real? Do I have to obey a rule or statute if it's likely to be bogus? Perhaps we will need a truth police force to let us know what is real or not. I see a continued breakdown of civility in our society.”

Michael Wollowski, associate professor at the Rose-Hulman Institute of Technology, commented, "Information will be tagged by degree of reliability. People will still disagree with the ratings, but they have been doing this for a long time.”

Axel Bender, a group leader for Defence Science and Technology (DST) Australia, said, "As is the case today, there will be different degrees of trustworthiness in information, which will remain strongly correlated with who is providing the information (i.e., the source of information). However, trustworthy information sources will become sparser in an ocean of random/uneducated/irrelevant/trivial/false/fake data. That's a continuation of a gradual trend rather than a disruption.”

Monica Murero, a professor and researcher based in Europe, wrote, "By 2027 trusted online information (fake or not fake) will mainly circulate through individual networks and perhaps via ‘high-quality brand’ official websites/sources.”

Ned Rossiter, professor of communication, Western Sydney University, replied, "Trust will be dependent on the social aggregation of affirmation, whatever scale that might be. There will be no option for information offline, since the intersection or overlap between on- and offline worlds is already largely consolidated. Trust will remain an uneven and contested landscape in 2027.”

Andrew Feldstein, an assistant provost, said, "Most likely something I can't anticipate. Perhaps we will no longer get our information from secondary sources. We will all have access to some virtual reality playback of actual events. Then we'll be able to make our own judgments of truth. Of course, this might not lead to better outcomes than our current, mediated access to information. In 2027, there might be a whole new generation of people wishing for the good old days when Fox News and PBS could tell them what and how to think.”

Danny Rogers, founder and CEO of Terbium Labs, replied, "If the internet is even going to be around and be a dominant economic force in 2027, trust in online information – and in the online experience as a whole – must improve drastically by then.”

Dane Smith, president of the public policy research and equity advocacy group Growth & Justice, said, "Things will improve. This is just a hunch, partly because I'm an optimist. Truth finds a way and prevails.”

Garland McCoy, president, Technology Education Institute, commented, "By 2027 the Earth will be toxic to humans and so our species will no longer be on the planet. This is a fact (trusted information) supported by the US government and many others and by all members of the scientific community worldwide. I trust this information and so know I will be dead by 2027. Obviously you are a non-believer to have asked this question. I will be reporting this to the appropriate authorities and they will take you to a proper government run re-education camp for your heresy! You must never question the government's ‘trusted information.’”

Meamya Christie, user-experience designer with Style Maven Linx, replied, "Online experiences will change completely by 2027. The engineering of algorithms and keywords is going to force this change. In one portal, we will continue lifting and praising false success, illusion, and gossip agendas as truth. Here, trust in information online will continue to decline, it will be a joke, and will produce chaos. In the other portal, there will be a more transparent and trustworthy experience.”

Sandra Garcia-Rivadulla, a librarian based in Latin America, replied, "Better mechanisms will be in place to ensure the trustworthiness of information. Also, people will be more information-literate and will know which sources they can trust and which ones are fake.”

Peter Lunenfeld, a professor at UCLA, commented, "I am hopeful that the long arc of history includes information technology as well, and that we will be able to reestablish trust in institutions and bring a majority (never a totality) of the populace back to the notion that verified and verifiable content is what they should be consuming and voting on, even if they don't always live up to this ideal.”

Vince Alcazar, business owner and retired US military officer, wrote, "Trust has surface area and a penetration coefficient. Propaganda, demagogues, and lie peddlers will seek to operate where laws cannot reach them, or in front of laws that have yet to catch up. I shudder to think that 2027's low points may be even lower than today’s.”

Sam Punnett, research officer, TableRock Media, replied, “Trust in information is a personal affair. There will be a continuing trend in the ability for individuals to tailor their information flow to suit their tastes. Developments such as those in artificial intelligence and voice and gesture interfaces will make tailored feeds increasingly customizable. There will be a continuing gravitation to moving images away from text. Trust will be afforded by credibility and credibility determined by the individual's information/media literacy. The ratio of credible versus misleading information will remain constant.”

John Laprise, consultant with the Association of Internet Users, wrote, "People will become more skeptical/circumspect.”

Ray Schroeder, associate vice chancellor for online learning, University of Illinois-Springfield, replied, "Most colleges and universities, and many libraries, now offer mini-courses on assessing quality and truth in materials disseminated by Internet sources. I believe that an educated public will help to overcome the proliferation of misleading information. Perhaps something like the creative commons license bug will be generated for information that has been vetted by an approved entity.”

Davide Beraldo, postdoctoral researcher, University of Amsterdam, said, "Trust will generally increase. Social media oligopolists will find their way to accredit themselves as champions of 'true information' spreading.”

David C. Lawrence, a software architect for a major content delivery and cloud services provider whose work is focused on standards development, said, "As the internet becomes more and more a part of the everyday lives of more and more people, its normalcy would tend to mean that trust in information online will be about the same as trust in information anywhere else.”

Francois Nel, director of the Journalism Leaders Programme, University of Central Lancashire, said, "Trust in information, like beauty, will continue to be in the eyes of the beholders.”

G. Hite, a researcher, replied, "As a person who knows the difference between online and offline information, I see the trend being toward more trust in information online by 2027, as more and more agencies delegate their procedures to the internet – i.e., school grades and assignments, paying bills, submitting various applications and looking into accounts of all types (utilities, payroll, academic, medical, etc.).”

Kenneth Sherrill, professor emeritus of political science, Hunter College, City University of New York, said, "Trust in online information will decline. So will trust in information in print.”

Scott Fahlman, professor emeritus of AI and language technologies, Carnegie Mellon University, said, "For a given audience, there will always be trusted and un-trusted sources. With more advanced AI and natural-language understanding, it will be much easier to detect statements that seem to contradict solid evidence or the consensus of trusted agencies, and to make that info available to readers who care to see this. However, there will always be some readers who choose to ignore such information and believe what they want to believe.”

Anne Mayhew, retired chief academic officer and professor emerita, University of Tennessee, replied, "Most people will be better equipped to be intelligently skeptical.”

Greg Lloyd, president and co-founder of Traction Software, wrote, "It will be much easier in 2027 to reliably identify people and sources and make judgments based on personal and social reputation.”

Luis Martínez, president of the Internet Society's Mexico chapter, wrote, "Certainly trust will have improved by 2027.”

John Sniadowski, a director for a technology company, said, "It will probably take this next decade to make progress towards building systems that can properly attribute information to source and its score for authenticity.”

Mike Gaudreau, a retired IT and telecommunications executive, commented, "All the fake news sites need to be closed and Google, Facebook, et cetera, need to step up policing what is out there.”

Louisa Heinrich, founder of Superhuman Ltd, commented, "The way we measure and think of trust in the digital world will have fundamentally changed.”

Michele Walfred, a communications specialist at the University of Delaware, said, "I fear it will be a free-for-all, a giant mess of competing/conflicting information. I hope there is a way to not censor material, but fully disclose or make transparent who publishers are or how they are funded. Newspapers did that.”

Jeff Jarvis, professor at the City University of New York Graduate School of Journalism, commented, "It is impossible to tell. We know that the situation will get worse unless we actively move to change it by reinventing journalism and its institutions and by developing business models that reward relevance and reliability over reach.”

Ken O'Grady, a futurist/consultant, said, "By 2027 trust in information will sadly drive people further into the like-minded groups that are polarizing our country today. This is something we need to address with education (critical thinking and dialogue).”

Gianluca Demartini, a senior lecturer in data science, wrote, "We will have quantifiable trust measures attached to each piece of data. This will allow content consumers to personally decide what sources to trust when they decide what information to access.”

Richard Jones, a self-employed business owner based in Europe, said, "Hopefully, information will be treated with circumspection.”

Steve Farnsworth, chief marketing officer, The Steveology Group, wrote, "In many ways it will fundamentally be the same, though it may look different. While I think we will be more sophisticated in ferreting out fake news, there will always be bad actors. When money or power is at stake, people find a way to subvert the truth to their advantage. However, we as a society can work to make that harder, and that will help in fighting this threat.”

Ed Terpening, an industry analyst with the Altimeter Group, replied, "I don't have high hopes for upholding trust by 2027, when all signs point to further polarization.”

Paul Jones, director of ibiblio.org, University of North Carolina-Chapel Hill, said, "I expect further chaos before we as a culture (national and global) begin to set things right. But by 2027, we should be on the paths to correction.”

Frank Kaufmann, founder and director of several international projects for peace activism and media and information, commented, "It depends on if corporate journalism can reform itself. Presently its inner rot, infantilism, hypocrisy, irrelevance and ethics-less environments have been exposed, and the industry is in crisis. If news reporting can awaken and reform, people's trust in information will improve by 2027.”

Sam Lehman-Wilzig, associate professor and former chair of the School of Communication, Bar-Ilan University, Israel, wrote, "Significant improvement by then for two reasons: 1) Verification systems will get very good; 2) Serious mistakes by leaders based on misinformation will bring home just how harmful is the phenomenon of ‘alternative facts.’”

Bradford W. Hesse, chief of the health communication and informatics research branch of the US National Cancer Institute, said, "The same thing that will happen to trust in information offline: that is, societal efforts will continue to drive for greater accountability for misinformation. The public will turn to trusted sources to help them interpret a morass of conflicting messages. In health, for example, data suggest that people's trust in their doctors has continued to grow as they work together to disentangle credible from specious health messages.”

Adam Lella, senior analyst for marketing insights, comScore Inc., replied, "People will become smarter about how to identify false information online. The biggest question for me is whether people will care, or if they will still knowingly seek out false information that supports their preexisting beliefs.”

Clifford Lynch, director of the Coalition for Networked Information, said, "I hope more people will be more skeptical and thoughtful about what they believe.”

Helen Holder, distinguished technologist for HP, said, "1) A minority of people will still value high-reliability information. It's not clear whether that group will be sufficient to support professionally verified information from an economic or market perspective. 2) There will be an increase in ‘crowdsourcing’ information, which will give mixed results. On one hand, multiple eyewitness sources will give credibility to witnessed facts and events. On the other hand, no context will be provided, potentially leading to divergence due to confirmation bias. (If people like what they see, they will believe it. If they don't, they won't.) 3) There will be an expansion in for-profit ‘fact makers’ who will be hired to inject any narrative or idea into the public consciousness or into targeted groups or individuals. These organizations and methods already exist and need only to mature in order to make it virtually impossible for even well-meaning people to believe reported information as objective reality. People will only really believe what they have personally witnessed. In some ways, this change will be a return to the way people evaluated information prior to the mass communication enabled by movable type.”

Glenn Grossman, a consultant in the financial services industry, replied, "I believe we may need to rely on trusted institutions. We may need more auditing too. Self-imposed fines could result if you break your code of ethics.”

Peter Dambier, DNS guru for Cesidian Root, commented, "Intelligent people will learn how to treat information and sources. Others will buy dehydrated soup flavoured with their preferred poison.”

Peter Eckart, director of health and information technology, Illinois Public Health Institute, replied, "Information will be hotly debated, and the extremists on both sides will be the loudest.”

Peter Levine, associate dean and professor, Tisch College of Civic Life, Tufts University, wrote, "We could see an online environment even more dominated than it is today by a few big corporations. If it evolves that way, we'd better hope they curtail bad information. Or we could see a more open environment, in which case responses to bad information will have to be more distributed.”

Rob Lerman, a retired librarian, commented, "It's not looking good. There are few trends that point to a public more able to make good decisions about what media to trust.”

Su Sonia Herring, an editor and translator, commented, "People need critical thinking skills and media literacy. If they possessed these skills, all the misinformation in the world would have minimal impact. Trust in the next 10 years will depend on the skills of users by 2027.”

Megan Knight, associate dean, University of Hertfordshire, said, "It will be massively reduced, and people will rely on the word of individuals they trust.”

Dave Kissoondoyal, CEO, KMP Global, replied, "Bad actors on the internet came with viruses and malware, and, in reaction, companies came with anti-viruses and anti-malware. Similarly, due to the spread of untrusted information on the Internet, companies and other agencies will come up with the concept of clearing houses whereby they provide the service of verification of information. People will then tend to trust information that has been verified.”

Matt Moore, a business leader, wrote, "I suspect people will have completely retreated to their own echo chambers and tribes by 2027. There will be no ‘public space,’ no agora. The world will be Hobbesian rather than Habermasian. Facebook groups with moats and drawbridges.”

Carl Ellison, an early internet developer and security consultant for Microsoft, now retired, commented, "It should have gone away by then, but that would require education of the consumer of information. As long as that consumer prefers to get info from an echo chamber, his trust will remain high and he will be in danger of being lied to.”

Adam Powell, project manager, Internet of Things Emergency Response Initiative, University of Southern California Annenberg Center, said, "Trust in information in 2027 will be exactly where it is today.”

David Harries, associate executive director for Foresight Canada, replied, "‘Trust in information’ on what subject, sector? Before 2027, the trend to trusting first and only who and what YOU KNOW will become the norm.”

Deborah Stewart, an internet activist/user, wrote, "As long as Republicans stay in power things will be slow to change for the better.”

Sasa M. Milasinovic, an information technology consultant with Yutro.com, replied, "Some rules will become effective but it will still not be sufficient.”

Troy Swanson, a teaching and learning librarian, replied, "I remain an optimist. I am not sure that a decade is enough to work all of this out, but we will be moving in better directions. I think we have a growing awareness of the potentials and realities for the manipulation of information. People do not want to be deceived. We will find new systems for discourse that will strengthen our trust in information. This may take longer than 2027. Maybe 2067?”

Jonathan Ssembajwe, executive director for the Rights of Young Foundation, Uganda, commented, "If people are sensitized to the proper use of the internet, the information will be trusted by 2027; if they are not sensitized, it will not be trusted.”

Joanna Bryson, associate professor and reader at University of Bath and affiliate with the Center for Information Technology Policy at Princeton University, said, "As now this will vary on both the side of the producers and of the consumers, but it will generally move in a direction that requires less education to attain more certainty.”

Adrian Schofield, an applied research manager based in Africa, commented, "Unless there is a major shift in the willingness of the passive majority to demand ethical accountability from their leadership, there will be a decline in the trust of published information from any source online. It will be read for its entertainment value (as is the case with scandal-sheet newspapers and magazines) and not for decision-making. Ten years after that may be a different story because we will see everything "live" and not filtered through a publication.”

Alf Rehn, chair of management and organization studies, Åbo Akademi University, commented, "We will almost certainly have better technologies for ascertaining the veracity of information, and hopefully have less ‘spews’ (spam-news), but I also foresee entire fringe information ecosystems – not merely fake news, but an entire parallel fake media.”

Shirley Willett, CEO, Shirley Willett Inc., said, "It will worsen in 10 years. Things will have to get bad enough before we make a change.”

Andrew McStay, professor of digital life at Bangor University, Wales, wrote, "While I have no reason to foresee an increase, I do not see trust plummeting either. In sum, little change.”

Marcel Bullinga, futurist with Futurecheck, based in the Netherlands, said, "General trust online will decline heavily. But trust in specific sites and proven AI methods will increase.”

Vian Bakir, a professor in political communication and journalism, Bangor University, Wales, commented, "It will probably decline, as it has been for years.”

Sharon Tettegah, professor at the University of Nevada, commented, "Trust will be improved if we have checks and balances.”

John Lazzaro, a retired electrical engineering and computing sciences professor from the University of California-Berkeley, wrote, "No change from 1997, 2007 or 2017. Common sense and skepticism have always been needed, since the Usenet days.”

Larry Irving, CEO of The Irving Group, wrote, “I hope trust in information online will be restored to pre-internet levels by 2027. I fear we will have to maintain vigilance.”

Paul Hyland, principal consultant for product management and user experience at Higher Digital, wrote, "It will evolve. There will be better tools to help you decide what to trust.”

Scott Shamp, interim dean of the College of Fine Arts at Florida State University, commented, "Trust will decline.”

Mike Meyer, chief information officer at University of Hawaii, wrote, "Online information that is confirmed as trustworthy will be trustworthy.”

Dan Ryan, professor of arts, technology, and the business of design at the University of Southern California, said, "I would expect bi-(or tri-)furcation into tiers of credibility, possibly not universal but "spatially particular" (the way different fields and different cultures have a sense of the more and less authoritative sources). I might guess we will see a topography of "information" in which there are zones of established and accepted bits and equally consensual uncertain bits, and then contested bits where groups take different things as given. One version of this might be a sorting out of information that is known from believed, from suspected, etc.; maybe a new vocabulary of meta-informational characterizations. Likely we'll see evolution of norms around misrepresentation and trafficking in untruth. You can find interesting examples of this in different religions' prohibitions of idle talk and gossip (see, for example, http://soc-of-info.blogspot.de/2014/02/religion-and-information-behavior.html).”

David J. Krieger, director of the Institute for Communication & Leadership, Lucerne, Switzerland, commented, "Trust will become the key to successful networks in all areas of society. Trust by design will restore faith in our institutions.”

Vivienne Waller, senior lecturer, Swinburne University of Technology, replied, "By 2027, consumers will be able to verify the source of their online information. Just as consumers have always chosen which newspaper or news service to trust, consumers will choose which source of online information they trust.”

Daniel Kreiss, associate professor of communication, University of North Carolina-Chapel Hill, commented, "It depends what happens with US political parties as institutions. If the two parties began to punish the actors that spread misinformation and conspiracy theories, I think there would be a different dynamic in ten years. I am not sure that parties are strong enough to do that, unfortunately, in large part because they are too weak and there is too much public participation in electoral politics, which opens the door to extreme activists, interest groups, and donors.”

Alexander Furnas, Ph.D. candidate, University of Michigan, replied, "Trust in information from sources that are socially distant will be low. There will be few if any institutions or actors viewed to have a high degree of credibility by a wide swath of the population.”

Tomslin Samme-Nlar, technical lead, Dimension Data Australia, commented, "Trust will deteriorate if nothing is done, not just about fake news but about cybersecurity as a whole.”

Amali De Silva-Mitchell, a futurist, replied, "Trust will depend on social behaviour, which can change rapidly in any manner.”

Ayaovi Olevie Kouami, chief technology officer for the Free and Open Source Software Foundation for Africa, said, "For information online to be trusted by 2027, many signatures must be confirmed.”

Ed Tomchin, a retired writer and researcher, said, "Trust will increase greatly due to a change in the psyche of the American public.”

David Schultz, professor of political science, Hamline University, said, "I am not even sure if online information will be relevant by 2027.”

Mark Patenaude, vice president for Innovation, cloud and self-service technology, ePRINTit Cloud Technology, replied, "There will be new laws that govern the worldwide internet and content providers. They cannot be governed by any political agenda but will be governed by the people and a separate arm of the government that does not pander to its leader, but reports to a world body.”

Patricia Aufderheide, professor of communications, American University, said, "It really depends on whether we even have a stable society and one in which the powers that control state violence protect expression.”

Greg Swanson, media consultant with Itzontarget, said, "I hope leaders will emerge who respect the need for a shared national conversation, as opposed to leaders who see media as merely tools of propaganda. But I fear it is going to take many more than 10 years for us to adjust to social media and the ubiquitous web.”

Mike Roberts, pioneer leader of ICANN and Internet Hall of Fame member, replied, "Trust in information will generally be higher, but perhaps not viewed as high enough.”

Paul Kyzivat, retired software engineer and Internet standards contributor, said, "It will be stratified. There may not be much information that everyone agrees to trust.”

Clark Quinn, consultant with Quinnovation, said, "It will still be necessary to be an informed information consumer. We will likely need to ensure such skills are deliberately developed.”

Flynn Ross, associate professor of teacher education, University of South Maine, said, "The echo chambers will continue to have people trust sources they personally value - the need is to talk across these echo chambers.”

William Anderson, adjunct professor, School of Information, University of Texas-Austin, replied, "Trust will be both better defined in practice and under constant review.”

K.G. Schneider, dean at a public university library, replied, "I would like to think we will go full circle by 2027. I suspect we're going to have to hit bottom with some national crisis before we get there.”

Jeff MacKie-Mason, University librarian and professor of information science, professor of economics, University of California-Berkeley, replied, "I'm an optimist. Trust in our ability to *discern* trustworthy information will have increased, and our tools to evaluate and filter information will have improved. We won't trust all information online – in fact, probably not a lot of it. But we will have more trust in our ability to find and utilize information that *is* trustworthy.”

Nate Cardozo, senior staff attorney, Electronic Frontier Foundation, wrote, "Hopefully media literacy will increase greatly in the next decade.”

Andee Baker, a retired professor, said, "If more safeguards are put in and more education of the public occurs, information in 2027 will be more trusted than it is today.”

Jeff Johnson, professor of computer science, University of San Francisco, replied, "There will be a significant drop in trust in information.”

Federico Pistono, entrepreneur, angel investor and researcher with Hyperlook TT, commented, "Trust will be improved through a joint effort of private companies, governments, and civil liberties organizations.”

Richard Rothenberg, professor and associate dean, School of Public Health, Georgia State University, said, "The restoration of the American presidency would go a long way toward restoring trust. The invocation of moral responsibility as a legitimate tool of social policy is a sine qua non for trust in information. I can only hope the downward spiral our country is suffering will reignite this concept.”

Diana Ascher, information scholar at the University of California - Los Angeles, wrote, "Information will reach us through new and more ubiquitous systems. We will be dealing with the new yellow journalism for the foreseeable future. Defunding research into critical information literacy threatens our ability to live in a democratic society. By 2027, we’ll either place little stock in the information we encounter, or we’ll succumb to the Borg.”

Dariusz Jemielniak, professor of organization studies in the department of Management In Networked and Digital Societies (MiNDS), Kozminski University, wrote, “It is difficult to say, as it is actually increasing now in spite of the fake news environment.”

Siva Vaidhyanathan, professor of media studies and director of the Center for Media and Citizenship, University of Virginia, wrote, "No one has any idea. Anyone who pretends to be able to predict the information environment of 2022 is a fool – let alone 2027.”

Alexis Rachel, user researcher and consultant, said, “Trust in information will continue to decrease.”

Jennifer Urban, professor of law and director of the Samuelson Law, Technology & Public Policy Clinic at the University of California Berkeley, wrote, “I would like to think that we will have become better at discerning what is false or misleading and what is not. However, humans have always been taken in by frauds, scams and misinformation. Fundamentally, it seems unlikely that we will get much better at discernment on an individual level. But online services may become better at sifting some information out, and we may have come up with a way to better scale our legal system's protections against false information.”

Michael R. Nelson, public policy executive with Cloudflare, replied, "’Will there be online information you can trust in 2027?’ Definitely! New business models and new techniques that harness AI, digital watermarking, and more powerful forms of crowdsourcing will mean more information is verified and reliable. But there will also continue to be deep oceans of misinformation, doctored images and even computer-generated video that portray things that never happened in very convincing and realistic ways.”

Andrew Odlyzko, professor of math and former head of the University of Minnesota's Supercomputing Institute, wrote, "We are likely to learn that all online information is suspect to some extent, and will learn to associate varying degrees of trustworthiness to different parts of it.”

Janet Kornblum, a writer/journalist, investigator and media trainer, replied, "By 2027 people will be a lot more savvy. Maybe that's wishful thinking, but look at Trump voters: they definitely skewed older. People who were reared online are smarter about sourcing. Again, that may just be wishful thinking.”

Sharon Haleva-Amir, lecturer in the School of Communication, Bar Ilan University, Israel, said, "Unless steps are taken to reduce the level of false information dispersion, people's trust in online information will be minimized.”

To return to the survey's for-credit responses home page, with links to all sets, click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_credit.xhtml

If you wish to read the full survey report with analysis, click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_the_information_environment.xhtml

To read anonymous survey participants' responses with no analysis, click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_anon.xhtml