
The 2017 Survey: 
The Future of Truth and Misinformation Online

Credited responses to the fourth follow-up question:
What penalties should there be for harmful misinformation?

Internet technologists, scholars, practitioners, strategic thinkers and others were asked by Elon University and the Pew Research Internet, Science and Technology Project in summer 2017 to share their answer to the following query:

What is the future of trusted, verified information online? The rise of "fake news" and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation. The question: In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially-destabilizing ideas?

About 49% of these respondents said the information environment WILL improve in the next decade.
About 51% of these respondents said the information environment WILL NOT improve in the next decade.

Follow-up Question #4 was:
What do you think the penalties should be for those who are found to have created or knowingly spread false information causing harmful effects? What role, if any, should government play in taking steps to prevent this?

Some key themes emerging from among the responses:

- Corporate actors profiting from information platforms should assist in improving the information environment.
- Individuals and cultures must do a better job of policing themselves; it's best to generally avoid any sort of added regulatory apparatus.
- Governments should not be allowed to take any sort of oversight role.
- Some sort of regulation should be applied, updated or adapted to help somewhat ameliorate the problem of misinformation.
- While legal remedies may work locally at times, the global nature of the internet and the variability of the application of law negate their efficacy.
- Further legal approaches are not likely to be workable, nor are they likely to be effective.
- Free speech is a pivot point: Regulatory mechanisms may stifle unpopular but important speech just as much as or more than they stifle harmful speech.
- The misinformation conundrum presents too many complexities to be solvable.

Written elaborations by for-credit respondents

Following are full responses to Follow-Up Question #4 of the six survey questions, made by study participants who chose to take credit when making remarks. Some people chose not to provide a written elaboration. About half of respondents chose to remain anonymous when providing their elaborations to one or more of the survey questions. Respondents were given the opportunity to answer any questions of their choice and to take credit or remain anonymous on a question-by-question basis. Some of these are the longer versions of expert responses that are contained in shorter form in the official survey report. These responses were collected in an opt-in invitation to about 8,000 people.

Their predictions:

Micah Altman, director of research for the Program on Information Science at MIT, commented, “The government should be supporting an independent media, and robust information systems that are open, transparent and traceable to evidence and not focused on suppressing false information.”

Rick Forno, senior lecturer in computer science and electrical engineering at the University of Maryland-Baltimore County, said, "This is a hard issue to enforce, since in the US, First Amendment protections prevent prosecution of even the most moronic 'fake news' items.”

John Anderson, director of Journalism and Media Studies at Brooklyn College, City University of New York, wrote, "We have existing legal mechanisms to combat the spread of false information with the intent to do harm, but our legal system works about two generations behind where communications technology is. Things are not helped by the increased politicization of the judiciary itself.”

Stephen Downes, researcher with the National Research Council of Canada, commented, "Using existing laws, we can assess penalties based on actual damages caused, in those few cases where actual prosecution is possible. But given that government and large corporations profit the most from spreading false information, it seems unlikely they can be trusted to take any steps to prevent it. There is probably no legal remedy, because the people who benefit from misinformation have been the ones to write the laws.”

Steve McDowell, professor of communication and information at Florida State University, replied, "It will be easier for private-sector actors to proceed as they already do, and make commercial decisions about their policies. Civil law remedies for defamation, libel, and privacy protection already are in place, and if other types of harm can be identified, there may be civil law approaches that can be followed. Since many stories may originate outside the country, this approach may have significant limitations. It will be more difficult for the government in the United States to be involved in such efforts, given the strong First Amendment traditions limiting government actions concerning speech and expression.”

Bart Knijnenburg, researcher on decision-making and recommender systems and assistant professor of computer science at Clemson University, said, "Harmful speech should not be protected as free speech. I believe that the European anti-hate speech laws make a lot of sense: if there is an intent to harm others (or move one's followers to harm others), it should be punishable by law.”

Adam Gismondi, a researcher at the Institute for Democracy & Higher Education, Tufts University, observed, "I think that it is hard to overstate how delicate the approach must be on these questions. If this problem isn't approached in a way that transcends partisan politics, it will forever be dragged down by polarized perspectives. Until there is a collective recognition of facts around false information and the harm that it causes, the idea of penalties and a role for government in the matter is a non-starter.”

Matt Moore, a business leader, observed, "I am not sure that there is a public appetite to enforce penalties for doing this. We need people to take public responsibility for both what they say and what they consume. This needs to come from the top. And it is manifestly not happening.”

Carl Ellison, an early internet developer and security consultant for Microsoft, now retired, commented, "Seventy years ago, such a source would be denied air time. We no longer have limited channels. We can apply economic sanctions against Russia but what power do/should we have against Breitbart or The National Enquirer?”

Jason Hong, associate professor, School of Computer Science, Carnegie Mellon University, said, "Applying penalties to people who spread false information is a good idea in theory but incredibly hard to do in practice. For domestic cases, how can we differentiate stupidly but innocently jumping to conclusions from deliberate misinformation? How to differentiate between legitimate grassroots campaigns vs. organized groups deliberately aiming to spread misinformation? How to identify domestic groups or individuals vs. foreign ones that local governments have little leverage over? There aren't many good options here, especially when the bulk of misinformation today is substantially benefiting one of the two main political parties in the United States. What the government can do is to hold hearings to do basic fact finding, and offer research funding/competitions to disincentivize and/or block the most widely agreed upon and egregious cases.”

Adam Powell, project manager, Internet of Things Emergency Response Initiative, University of Southern California Annenberg Center, said, "No, and therefore none. Remember, ‘Congress shall make no law....’”

Bob Frankston, internet pioneer and software innovator, said, "It is dangerous to impose too much control, but maybe there should be a concept of public libel?”

Jane Elizabeth, senior manager, American Press Institute, said, "There already are penalties for hateful/dangerous speech and other communications. The penalties for malicious misinformation could work in a similar way.”

Jonathan Grudin, principal design researcher, Microsoft, said, "Ideally the government should distribute accurate information and help establish the provenance of misinformation. It is difficult to prove ‘intent of causing harmful effects.’ If I lie to elect a candidate I believe will be good, did I intend to cause harmful effects? Where intention to harm can be proven, remedies often exist.”

Siva Vaidhyanathan, professor of media studies and director of the Center for Media and Citizenship, University of Virginia, wrote, "We used to have such penalties: Social shaming; loss of credibility and status; exclusion from the public sphere. Government should play no role in such dynamics, but government plays an important role in certifying the dependability of much scientific, economic and demographic claims. That should be defended and maintained.”

Mark Lemley, professor of law, Stanford University, observed, "While false facts that injure people (inaccurate drug ingredient information, say) can and should be punished, the government should not be in the business of punishing fake news.”

Nigel Cameron, technology and futures editor at UnHerd.com and president of the Center for Policy on Emerging Technologies, said, "Governments should have no role, and false and misleading speech needs to remain free. But, for example, websites/social media companies have their own free speech rights and can excise/label as they choose.”

Alexis Rachel, user researcher and consultant, said, "There needs to be a cultural shift wherein spreading of false news is looked on as a heinous and dangerous act, versus the current ambivalence. I'm not sure what the government can or should do with regard to this, except lead by example.”

Jennifer Urban, professor of law and director of the Samuelson Law, Technology & Public Policy Clinic at the University of California Berkeley, wrote, “We already have laws against fraud, defamation, harassment, etc. Those are good models; we need to find a way to scale them. Government's role should be to pressure other state actors that support or engage in spreading misinformation, to enforce the law, and to avoid spreading misinformation itself. We could also consider reviving the Fairness Doctrine, which would require that multiple viewpoints be presented, though this only applied to broadcast license holders. Beyond measures like these lies a very slippery slope towards government censorship. We should also ask about the role of corporate actors - it is Google, Facebook, Twitter, et cetera, that actually make many of the relevant decisions today.”

J. Nathan Matias, a postdoctoral researcher at Princeton University, previously a visiting scholar at MIT Center for Civic Media, wrote, “The most powerful, enduring way to limit misinformation and expand the use of civil liberties is by growing our collective capacities for understanding. In my research with large news-discussion communities, for example, encouraging people toward critical thinking and fact-checking reduced the human and algorithmic spread of articles from unreliable sources.”

Barry Chudakov, founder and principal, Sertain Research and StreamFuzion Corp., said, “Since governments and administrations within governments can employ the strategic practice of knowingly spreading false information, we should institute laws with stiff penalties for spreading false information – but we should not make governments themselves our sole information watchdogs. Nor should the governments be charged with anything more, or less, than enforcing good laws that prohibit knowingly spreading false information. Just as electricity in the US is regulated at the federal, state, and local levels, information is now a force like electricity and needs independent oversight with checks and balances, of course, but with some recourse to signal the deliberate spread of false information and some power to stop it.

"Penalties for those found to have created or knowingly spread false information with the intent of causing harm should be at least as severe as a class B felony (punishable by up to 20 years in prison, a fine of up to $20,000, or both). News sources (i.e., CNN, The New York Times, Washington Post, The New Yorker, et cetera) should hold a news reliability summit and devise what might be termed a ‘reliability index.’ This could be as detailed as necessary, but should include a flag – a visible, graphic component – that can appear online, in print, and broadcast next to facts, quotes, or statements of policy. Like the American Constitution, there should be a means to amend or improve the reliability index.

"Once that is in place, each piece of information, or definitive statement, can be assigned a level of certainty or reliability. For example, when 17 government intelligence agencies have agreed that there was Russian interference in the US 2016 elections, this statement, as information, would have a reliability index of 99 or 100 (scale 0-100). In other words, we need to establish a standard, a common language, which incorporates shared goals: honesty, transparency, accuracy, truth. Talk and point-counterpoint discussion, while useful and interesting, is not a standard; we need something more definitive and concrete.

"We have standards of measurement in the food industries and commerce, we have standards of disease and wellness in healthcare, we have standards of tolerance and capacity in civil engineering and aerospace. We can establish standards for information. With a meaningful standard in place, we can establish penalties for violations of the standard(s). Those who create or knowingly spread false information will hopefully be caught in the reliability index; they will be ‘outed’ and notified that the best news sources in the world have determined their information is not honest, transparent, accurate, and truthful. If public notification is not sufficient to stop these sources from knowingly spreading false information with the intent of causing harmful effects, legal penalties can be brought to bear. These might include losing publication rights, losing their on-air license, or other sanctions.

"A free press should be able to govern itself without government interference, so the pillars of the press community should establish and jealously guard the integrity of a reliability index. If, however, there are instances where the free press is incapable of stopping the spread of false information – if there is a totalitarian-style attempt to crack down on free expression or certain types of data or information – we need to establish clear sanctions and penalties to deter any authority or other entity from designing and spreading misinformation, a trendy word for lies. Information is the lifeblood of democratic institutions. Without trustworthy, reliable information on which to base real-world decisions about crucial issues such as climate change, the spread of nuclear weapons and fissile material, or global biodiversity – democracy will die.”

Bill Woodcock, executive director of the Packet Clearing House, wrote, "This is the ‘crying fire in an opera house’ abuse, generally. The problem is in the transnational nature of the commission of the crime; in the country of origin, it may be a patriotic act, rather than a criminal one. And nations will never curtail their own Westphalian sovereign ‘rights’ to define what actions are criminal within their borders.”

Jerry Michalski, futurist and founder of REX, replied, "I am not a lawyer, but current laws regarding freedom of speech and harmful speech give us a lot to work with. The problem is the anonymity and superconductivity of the Net, along with the global trust implosion. Governments need to address trust more directly.”

David Brake, a researcher and journalist, replied, "It depends on the ‘harmful effect’ sought. If the intent is to incite hatred of others it should be dealt with through hate crime legislation (present already in most countries) or anti-bullying legislation. If ‘merely’ political then simple ridicule by a free press is the best we can hope for.”

Esther Dyson, a former journalist and founding chair at ICANN, now a technology entrepreneur, said, "There should be some application of legal penalties, but very carefully. The government should run the courts; the people should file lawsuits. There is also a regulatory role for the Federal Trade Commission and the like.”

Howard Rheingold, pioneer researcher of virtual communities, longtime professor and author of "Net Smart: How to Thrive Online," said, "Criminal penalties might infringe on free-speech rights, but identifying known liars and purveyors of false information and providing links to proofs that their information is false - enabling reputation to enter the equation - could be helpful. But then the smartest purveyors of false information will shift identities.”

Mark Bunting, visiting academic at Oxford Internet Institute, a senior digital strategy and public policy advisor with 16 years' experience at the BBC, Ofcom and as a digital consultant, wrote, "The role of government is to ensure that the intermediaries who operate our information environments do so responsibly, in a way that takes appropriate account of the competing interests they must balance, that provides opportunities for appeal and redress, that is driven by consumers' and citizens' rather than purely commercial interests. It is not governments' job to try to specify in micro-detail what content should and shouldn't be allowed.”

Christian H. Huitema, past president of the Internet Architecture Board, commented, "I would not like to have to write such laws.”

Seth Finkelstein, consulting programmer with Seth Finkelstein Consulting, commented, "[There is] a system of institutional incentives that promotes profitable misinformation over unprofitable but true information. The following sentence encapsulates the problem: There needs to be a business model for truth. I'm reminded of the legend, which is completely untrue, that Fox News is supposedly banned in Canada because ‘it's illegal in Canada to lie on airwaves.’ Are those proposing penalties for having ‘created or knowingly spread false information’ willing to apply them to a large amount of lobbying, campaigning, and, sadly, these days, many media organizations? If so, there are major problems, not the least that such a proposal would go against much of the legal protection for freedom of speech in the Western world. If it's proposed to apply narrowly, then by definition it's only making a few fringe players miserable. Consider Tom Paxton's 1964 song ‘Daily News’: ‘Don't try to make me change my mind with facts / To hell with the graduated income tax / How do I know? / I read it in the Daily News.’ It's tempting to dismiss the problem as always with us. But it's also distracting to focus only on scapegoat outliers who are safely removed from positions of power.”

Michael Marien, senior principal, The Security & Sustainability Guide and former editor of The Future Survey, wrote, "False information generally does not intend to harm, but to promote certain interests (e.g., the oil/gas/coal industries) and hard-right ideologies. I doubt that government can or should try to prevent false information, especially because the current administration shamefully dispenses so much of it. However, ‘crap detecting’ should be a major concern for education at all levels. And what about the pussycat press: why aren't they demanding evidence for questionable assertions and examples of so-called ‘fake news?’”

Paul M.A. Baker, senior director of research for the Center for Advanced Communications Policy, said, "The consequences and penalties of knowingly spreading false information are a tricky balance. In a private setting, operation of market mechanisms/self-regulation would seem to be a viable approach; in a public setting the balance between free speech and protection of vulnerable populations must be maintained. For willful promotion/distribution of dangerous or harmful material it would seem that the judicial process is appropriate. Use of regulatory mechanisms, while possible, runs the risk of stifling unpopular as well as dangerous speech. The latter could be a case of criticism of an administration which might be valid.”

Laurel Felt, lecturer at the University of Southern California, said, “It might be difficult to prove intent. But assuming that one could prove harmful intent, then the government would need to create some sort of policy that condemns such an action. The government body responsible for investigating and prosecuting such cases might be the Federal Communications Commission? Assuming prosecution that culminates in conviction, the penalty could be a fine, I suppose.”

Jonathan Brewer, consulting engineer for Telco2, commented, "Dangerous speech is not a phenomenon unique to the internet. Existing regulations and programs around the world may need to be updated or enhanced, but I don't think any new penalties need be established.”

Michael R. Nelson, public policy executive with Cloudflare, replied, "No penalties or fines (after all we have a First Amendment). But governments can encourage self-regulation like the codes of ethics that have guided journalists for more than 100 years. Attempts to ‘make the Internet safe and orderly,’ like the July 2017 German law on hate speech and ‘dangerous speech,’ are overly broad and would certainly be unconstitutional in the US.”

Jon Lebkowsky, web consultant/developer, author and activist, commented, "This question suggests a slippery slope we might want to avoid. The one thing the government has done before and might do again is a ‘fairness doctrine.’ However involving the government in managing information accuracy or quality invites the potentially greater problem of censorship.”

Leah Lievrouw, professor in the department of information studies at the University of California-Los Angeles, observed, "There are already legal sanctions, even in societies with strong free-speech traditions, on particular classes of information that cause harm: fraud, libel, slander, incitement and so on. These should be revisited and adapted to the online social context. However, I would be very cautious about establishing other, particularistic types of ‘harms’ that are invoked to restrict speech and information more broadly: blasphemy, disrupting ‘public order,’ lèse-majesté rules against insulting states or rulers, even some instances of hate speech. The difficulty is balancing individual sensitivities and the wider interest in a diverse, pluralistic, and sometimes disputatious, society.”

James LaRue, director of the Office for Intellectual Freedom of the American Library Association, commented, "I suppose libel/defamation laws provide some guidance: a finding of deliberate harm has financial penalties. The government role: rule of law, incorruptible courts. The adequate funding of public libraries to provide sufficient and timely resources to investigate claims.”

Tim Bray, senior principal technologist for Amazon.com, said, "Existing libel and slander laws are adequate. Canada has anti-hate legislation but its effectiveness has really yet to be established.”

Joseph Turow, professor of communication, University of Pennsylvania, commented, "If such penalties were created and enforced many public relations executives would arguably be liable to prosecution. And the notion that government officials would lord over decisions about the facticity of news, often about them or their parties, is laced with conflicts of interest and threats to democracy.”

Jack Park, CEO, TopicQuests Foundation, noted, "Penalties should fit the nature of the measured harm. Government playing roles in this context raises issues like: who gets to decide what is and is not ‘false’ information? In my view, if there is a role, it should be that the government funds, in the same way it funds biomedical research, ways in which to increase public engagement in civic activities, some of which include crowd-sourced, role-playing-game-based global sensemaking.”

Johanna Drucker, professor of information studies, University of California-Los Angeles, commented, "We have methods of meting out punishment for lying in financial, legal and medical realms (or used to, they are being quickly stripped away). Why not create similar laws and liability statutes for information? My concern about government controls comes from observation of current trends in the Trump administration to control discourse through intimidation, closed briefings, strategic release of misinformation as if it were official - or as official - statements. The checks and balances built into the relationships among the judicial, legislative, and administrative branches of American government are still essential. No single branch should have any exclusive powers over information or it will lead to abuse (of those powers and of information).”

Peng Hwa Ang, a global internet policy expert researching this topic at Nanyang Technological University, observed, "Much must depend on the level of harm intended. False advertising, for example, has led to fines. It will also depend on the benefit that the creator hoped to gain. If it is financial, then businesses should be alerted. If it is political, it should be pointed out. The main thing is a name-and-shame approach. The government should not have the final say because history suggests that such an approach has tended to lead to abuse of the power by governments.”

danah boyd, principal researcher, Microsoft Research and founder, Data & Society, wrote, "What kinds of harm? Which governments? What's at stake is far more complex than is implied here. We're talking about jokesters engaging in similar practices as nation-states, profiteers using the same techniques as ideologues. For example, all governments are engaged in these practices and one could argue that their information operations practices are harmful.”

Susan Etlinger, industry analyst, Altimeter Research, said, “It depends on the context. Are we talking about antibiotics? Children's toys? Or taking down a government? There already are guardrails in effect in many countries to protect the integrity of products, services and institutions. I don't believe we need to reinvent all of those institutions. Rather, organizations that protect public health – food and drugs, and the electoral process, among others – need to account for and guard against their specific vulnerabilities to misinformation.”

Daniel Alpert, managing partner at Westwood Capital, a fellow in economics with The Century Foundation, observed, "Government cybercrime-efforts should track down and confront malefactors and seek to shut them down or block them. But there has to be a transparent judicial process to oversee such efforts.”

Andreas Birkbak, assistant professor, Aalborg University, Copenhagen, said, "Governments should use carrot more than stick and try to cultivate a culture that cares about facts without expecting facts to be universal truth.”

Marc Rotenberg, president, Electronic Privacy Information Center, wrote, "As the problems are structural, the remedies must also be structural.”

Sebastian Benthall, junior research scientist, New York University Steinhardt, responded, "Companies should be subject to penalties for deceptive practices as under a strong FTC regime. Defamation should be punished under the relevant laws. And so on. There is ample precedent for the role and limits of government in confronting false information.”

Henning Schulzrinne, professor and chief technology officer for Columbia University, said, "I don't see how penalties can be administered except in cases of libel for non-public figures.”

John Laprise, consultant with the Association of Internet Users, wrote, "There will be reputational harm, there should be no civil/criminal penalties.”

Ray Schroeder, associate vice chancellor for online learning, University of Illinois-Springfield, replied, "We may need to interpret the libel and slander rules to include knowingly disseminating false information with the intent to wrongfully influence political and policy decision making for personal gain or profit. Media may choose to focus reporting on statements delivered through legislative venues in which contempt proceedings can be initiated for knowingly false and misleading statements.”

Sonia Livingstone, professor of social psychology, London School of Economics and Political Science, replied, "I'd treat it like we do the incitement to racial hatred. If there is intent to harm, then the penalty should reflect the intended or actual harm. This must be done by governments not companies, as government is (should be!) accountable to its people.”

Tanya Berger-Wolf, professor at the University of Illinois-Chicago, wrote, "We already have most of the penalty system for intentionally harming somebody, including with misinformation (libel, false advertising, identity theft, et cetera). The intention is very hard to prove. However, the punishment, as always, should be commensurate with the resulting harm.”

Angela Carr, a professor of law based in North America, replied, "I would like to see government step up efforts to enforce the laws against unfair or deceptive marketing practices. I also think more should be done to protect speakers that others try to silence through threats and intimidation. I would like to see Citizens United overturned and effective campaign reform legislation. Beyond these efforts (which certainly seem unlikely in the present environment) I think it is difficult for government to prevent distribution of false information. Not only is it difficult to know whether information is true or false, but it is even more difficult to determine the speaker's intent. Government can, however, encourage the dissemination of accurate information by supporting public broadcasting, and other non-profit organizations that seek to genuinely inform the public.”

Michael J. Oghia, an author, editor and journalist based in Europe, said, "There should be some form of criminal procedure for this, which includes a fine or other appropriate penalty. Depending on the intended effect, prison or internet restrictions could also be options, but governments and law enforcement would have to be involved in this process. I also fear that empowering these two stakeholder groups with such power could be used against, say, minorities and other disadvantaged groups.”

Veronika Valdova, managing partner at Arete-Zoe, noted, "False statements distributed to official authorities as a witness statement qualify as perjury. Adverse information identified during background checks hinders the individual’s ability to find gainful employment, get a security clearance, or obtain a visa. If such information turns out to be false, this may be the grounds for a civil suit. Currently, pursuing such suits is difficult because the victim is rarely able to prove the nature and origin of such information and prove a causal relationship between a specific piece of information and rejection. The spread of illegally obtained surveillance material, personal health records, and other sensitive material is illegal under specific laws in most jurisdictions. The right to due process may be the answer. Resolution of such disputes generally belongs to courts. The role of governments is to ensure the resilience of their systems and rigorous assessment of evidence and the prevention of abuse. The penalties can range from shutting down a single website to no-fly lists for specific individuals.”

Jamais Cascio, distinguished fellow at the Institute for the Future, noted, "The penalties should be essentially a ‘scarlet letter’ – a tag or flag or some kind of transparent labeling that identifies the person as an intentional purveyor of falsehoods. Government would likely have to play a role in universalizing a system, but you'd likely have multiple alternative bodies putting out tagging guidelines. A ‘scarlet letter’ of sorts identifies the perpetrators as purveyors of dangerous and false facts to any who might interact with them, along with cultural norms that shame the perpetrators, even if they are ideologically friendly.”

Andrew Odlyzko, professor of math and former head of the University of Minnesota's Supercomputing Institute, observed, "Some extensions of the current criminal law.”

Nathaniel Borenstein, chief scientist at Mimecast, commented, "Penalties should be severe, including substantial jail time and fines. But I expect this to be unenforceable across international boundaries.”

Alexios Mantzarlis, director of the International Fact-Checking Network based at Poynter Institute for Media Studies, commented, "I would be very very very wary of restrictive government intervention in this space. The media, tech companies, schools and the public all have a lot to do before we hand this over to governments. Governments should for the moment limit themselves to educational initiatives and encourage research/debate on this topic.”

Alan Inouye, director of public policy for the American Library Association, commented, "We already have some well-established laws, such as the rubrics of libel and slander. Perhaps these rubrics need to be revised.”

Jim Warren, an internet pioneer and open-government/open-records/open-meetings advocate, said, "Accuracy of information is NOT binary. It is a continuum. Additionally, proving intent makes legal or governmental penalties VERY difficult; even more so when government agents, themselves, are the perpetrators. If there is substantive harm from such disinformation, defined by law, then those same laws need to include penalties and enforcement procedures. We have ample precedent for this (difficult!) situation, in the form of slander and libel laws.”

Janet Kornblum, a writer/journalist, investigator and media trainer, replied, "We do have a pretty good First Amendment here and strong libel/slander laws. We should simply enforce them (for instance, you can already be prosecuted for causing a riot). Besides, many of the lies spread online come from outside the United States. I do wholeheartedly back sanctions against countries known to hack and spread misinformation.”

Glenn Edens, CTO for Technology Reserve at Xerox/PARC, said, "There should be penalties similar to libel or fraudulent transactions; government should play a role similar to the laws governing consumer protections.”

Deirdre Williams, retired internet activist, replied, "Being publicly shamed and then ostracised, but those penalties are very hard to exact.”

Serge Marelli, an IT professional who works on and with the Net, wrote, "They should be sentenced to prison for a limited time and be forced to publicly retract and correct any lies. Also, they should be barred from running for public offices. Nowadays, they get elected to be president.”

Geoff Scott, CEO of Hackerati, commented, "Knowingly spreading false information is a form of taking away people's right of self-determination and it is an extremely heinous act that should be severely punished. On the other hand, what constitutes ‘false information’? The First Amendment exists for a reason; but some forms of speech are not protected, and these have been clearly defined. How would ‘False Information’ be defined in the context of the First Amendment?”

Garth Graham, an advocate for community-owned broadband with Telecommunities Canada, said, "Since the governors (i.e., external authority) are primary users of public relations manipulation, giving them a role in regulating distribution is like giving the insane control of the asylum.”

Philip Rhoades, retired IT consultant and biomedical researcher with Neural Archives Foundation, said, "Separate, non-falsifiable networks need to be established as alternatives. It is not going to be possible to contain the powerful ‘bad actors’ who basically own the system.”

Edward Kozel, an entrepreneur and investor, replied, "Only changes to social behaviour will/can address the dire situation: any such changes will require a degree of social judgment or even shame (i.e., morality). A difficult subject for government indeed, but changes to our educational system (comprehensive) that include and are embraced by society can bring about such societal changes.”

Alejandro Pisanty, a professor at UNAM, the National University of Mexico, and longtime internet policy leader, observed, "The first and most important role of government in this respect is to promote education and support spaces for open, healthy, civil debate. Basics such as mandatory vaccinations and science and logic in schools have to be provided as an infrastructure of trust. Unequivocal support for science, and the prosecution of bad actors such as phony medical treatment providers, will help keep false information in check. That is, the action is on all fronts, not only on the news front.”

Brian Harvey, teaching professor emeritus at the University of California-Berkeley, said, "Throughout history, governments have been among the most prolific creators of fake news. If I could choose between eliminating Breitbart and eliminating the CIA, I'd definitely choose the latter. Not only does the CIA have a bigger budget, but they are better at creating plausible misinformation. When people believe things like that pizza parlor story last year, the biggest problem is not the story itself, but rather the social conditions that leave people so (rightly) mistrustful of social institutions that they find the story plausible. Trump wasn't elected by Breitbart; he was elected by the 2008 bank crash and the government's response to it.”

David Conrad, a chief technology officer, replied, "They should be similar to those for false advertising, perjury, and/or libel depending on context.”

Iain MacLaren, director of the Centre for Excellence in Learning & Teaching, National University of Ireland-Galway, commented, "The obvious issue here is that 'government' is not a neutral construct. Governments are run by political parties with a vested interest in retaining power. Of course, in many 'developed' democracies the means by which power is obtained and maintained are considerably more subtle than in more unstable states, but that does not mean that unreliable information, innuendo, smears, leaks, et cetera, are not part and parcel of such political systems. I value the fact that the blatant, extreme cases of misinformation/lies on the internet we are all discussing now raise the broader question about the reliability and veracity of mainstream information providers. This is not to say we should trust no one, but that the requirement to critically examine claims, to challenge statements, and to engage in debate is now even more recognised.”

Stephen Bounds, information and knowledge management consultant, KnowQuestion, noted, "I would support the establishment of on-the-spot fines for certain classes of information infractions. In a similar manner to speeding fines, grossly defamatory or insulting speech could be subject to an on-the-spot fine by a suitably constituted law enforcement body. Given the massive cultural change this would involve, I would recommend a graduated approach with either warnings or points used to encourage behavioural modification without immediate financial penalty. This could be used separately or in conjunction with 'disclose or remove' laws, where a person responsible for a post could be compelled to modify it to identify themselves and any financial incentives received in relation to that speech, or to remove it from publication. Both approaches encourage personal responsibility for the circulation of socially inappropriate information without outright censorship. The complications of anonymous speech are not insurmountable, since the most problematic free speech exists on highly trafficked platforms where there is a clear corporate body to engage with for assistance in enforcement of notifications and user identification.”

Charlie Firestone, executive director, Aspen Institute Communications and Society Program, commented, "We should guard against state censorship, or even corporate censorship that becomes equivalent to state censorship. As for knowingly spreading false info with the intent to create harm, there can be civil actions from those harmed, like libel allows someone to sue for damages.”

R. Lee Mulberry, managing partner, Northern Star Consulting, said, "Organizations should evaluate the situation and punish as appropriate. The government should be involved if a person feels they have been injured and files a lawsuit. If we use a government ‘watchdog’ method we risk suppression of free speech – a very slippery slope.”

Mercy Mutemi, legislative advisor for the Kenya Private Sector Alliance, wrote, “I don't think a legal approach is the best solution. While spreading lies is immoral, it is not illegal (unless one is under oath or under an obligation to tell the truth). Government regulating fake news is also quite dangerous as this power could be abused to silence critics and the opposition (the truth is relative when it comes to politics). A better approach would be to educate the public on what fake news is and how to spot it. If fake news has no consumers, it may die down.”

Larry Keeley, founder of innovation consultancy Doblin, observed, "We should spank them and send them to bed without any supper. ;-) The solution to bad information is always better information.”

Jan Schaffer, executive director of J-Lab, said, "Access to public airwaves should be rescinded for those who spread false information. Antitrust actions should be taken against companies aggregating scores of local television stations with an intent to broadcast false, political information. No one company should be allowed to own the majority of local television outlets. Owners must demonstrate diversity of viewpoints, fairness and balance in coverage. I'm even OK with tax credits for tech or other companies that develop viable verification or monitoring solutions. Or perhaps take a cue from NASA and organize national competitions for best ideas.”

Steve Axler, a user-experience researcher, replied, "The government should monitor for illegal activities but not beyond that. It is the price you pay for freedom of expression.”

Nick Ashton-Hart, a public policy professional based in Europe, commented, "Penalties should be proportional to the ability to harm, and government should step in only to the extent that civil or criminal action to redress harms done are appropriate and proportional.”

Scott MacLeod, founder and president of World University and School, replied, "In the context of the USA, build on developing US law in terms of penalties. In the context of war, and regarding the US government - same. Engage other democratic countries' governments, history, and laws in these regards - and build a developing generative conversation.”

Eleni Panagou, cultural informatics and communication scientist at Arwen Lannel Labs in Greece, wrote, "There should be judicial penalties.”

Jack Schofield, longtime technology editor at The Guardian, now a columnist for The Guardian and ZDNet, commented, "You can't fine or lock people up for spreading false information, because there's too much information to fact-check, and because it's sometimes quite hard to separate fact from opinion. Worse, giving government a role is potentially dangerous. The current US administration and the Republican Party, for example, distribute a lot of false information, so the problem isn't restricted to countries such as Russia, China and Turkey.”

Morihiro Ogasahara, associate professor at Kansai University, said, "Basically compensation or banning their accounts on platforms would be enough, except for intelligence. Government should NOT try to control online discourse.”

Ross Chandler, principal network architect for nir, said, "Public exposure. If the information is provably false in court they should face legal sanction for harm done.”

Sahana Udupa, professor of media anthropology at Ludwig Maximilian University of Munich, wrote, "Existing laws in the country, but with a strong global stakeholder oversight to avoid misuse.”

Uta Russmann, a professor whose research is concentrated on political communication via digital methods, wrote, "The problem is that we have and should have ‘free speech!’”

Kenneth R. Fleischmann, associate professor at the University of Texas- Austin School of Information, wrote, "I am not sure that I would trust any government to carry out such prosecution in a politically neutral manner, including the current US administration (or even the previous ones).”

Laurie Rice, associate professor at Southern Illinois University-Edwardsville, said, "Attempts at prior restraint would likely violate First Amendment rights; libel and slander case law could offer a model for fighting harmful effects.”

Jim Hendler, professor of computing sciences at Rensselaer Polytechnic Institute, commented, "The knowing creation and spread of information that is both provably wrong and done with malicious intent needs to have strong penalties via court of law. Government's role in this would amount to a censorship that would likely be unacceptable – the key phrase, ‘intent of causing harmful effects’ (illegal might be a better word than harmful), would be what needs to be enforced via civic mechanisms and courts.”

J. Cychosz, a content manager and curator for a scientific research organization, commented, "Fund research to understand how to mitigate false news and societal impact. If the false news makes one an accomplice to a crime or is the equivalent of screaming fire in a theatre, then current legal penalties would apply.”

Liam Quin, an information specialist with the World Wide Web Consortium, noted, "Take away their shoes! Seriously, though, hate speech, libel, slander, are already illegal here in Canada and that's a good thing, although only because we have precise and limited definitions of those crimes. Again, the responsibility to build community overrides the individual’s right to freedom of speech here.”

Pete Cranston, knowledge management and digital media consultant, replied, "As now, penalties within a national legal framework according to the seriousness of the offence and the consequences. Government must therefore be central to monitoring and management of information and communication channels.”

Mark P. Hahn, a chief technology officer, wrote, "The best penalty is to be exposed. There already exist legal and social frameworks for recourse and while they may be slow to adapt they are catching up. They will always trail but in aggregate people will figure it out and respond. A centralized, power-concentrating response will be knee-jerk and worse than the original problem. That does not mean we should not address and discuss the problem, just that envisioning a single plan of action will fail. The issue that this survey envisions needs a vast dispersed diffused response with lots of different approaches to be tested and transparently evaluated by individuals, not by centralized power brokers.”

Scott Guthrey, publisher for Docent Press, said, "Neither 'false' nor 'harmful' with respect to written words can be sufficiently well-defined to stand a legal challenge. In the era of micro-aggression even truth seems to be capable of causing harm.”

Andrew Dwyer, an expert in cybersecurity and malware at the University of Oxford, commented, "From the perspective of the UK/Europe – we already have systems in place that allow for misinformation to have penalties. Yet these have not been routinely applied online thus far. Developing a body of case law could be a productive way forward. Government roles are and should always be limited – and they should always be led by an independent judiciary – so that we could have an agency, but not one that hands out things without some form of judicial approval.”

Dean Willis, consultant for Softarmor Systems, commented, "I'm in favor of exposure and ridicule. You know, like they did to Darwin after he launched that ridiculous theory. Oh wait, he was right.”

Axel Bruns, professor at the Digital Media Research Centre, Queensland University of Technology, commented, "Possible penalties could range from temporary social media bans to imprisonment, but how these are applied is a matter for the judiciary. The fundamental principle, however, must be that legal penalties are designed to promote rehabilitation rather than exact revenge; simply locking up trolls and propagandists merely makes martyrs out of them. Government is clearly central here, as it is to all aspects of society: it must get better at sensibly regulating traditional, digital, and social media platforms and channels, rather than vainly believing in market self-regulation; it must develop a much better understanding of contemporary media platforms amongst policy-makers, law enforcement, and the judiciary; and most of all it must develop far more proactive means of promoting media literacy in society.”

Sharon Tettegah, professor at the University of Nevada, commented, "The government as a whole should not be allowed to play any role. We need a neutral advisory board that is not influenced by lobbyists and financial incentives to be in control of the information distribution.”

Alladi Venkatesh, professor at the University of California-Irvine, replied, "This is not easy. It should not become a legal issue.”

John Lazzaro, a retired electrical engineering and computing sciences professor from the University of California-Berkeley, wrote, "Laws presently on the books cover these issues well. After all, ‘yelling fire in a crowded theatre’ is knowingly spreading false information with the intent of causing harmful effects. The relevant issues are defining ‘false’ and ‘harmful,’ and that's the purview of a judge and jury.”

Paul Hyland, principal consultant for product management and user experience at Higher Digital, observed, "The only role for government is in the areas of threats, harassment, endangering, et cetera – with very few exceptions, simply enforcing existing rules applied to a new medium. So penalties would be to reputation, or in extreme cases access, and would be enforced by service provider according to terms of use of their services, not the law.”

Mike Meyer, chief information officer at University of Hawaii, wrote, "The role of government is critical to the preservation of valid public information based on federal criminal law.”

Amber Case, research fellow at Harvard Berkman Klein Center for Internet & Society, replied, "Governmental regulations might not be able to fully curtail the spread of fake news, as it relies on the emotional impulses of consumers. However, reducing payment incentives through advertising revenue could curtail the spread of fake news. Forcing a pause before spreading or reacting to content could also help. If an individual is found to spread false information and can be identified, then perhaps their ability to post and make revenue could be taken away, but this will not prevent them from operating anonymously. Some education for consumers could help, but this is not a problem that one government can solve. There are many nations and locations at play here, and there is not a "one size fits all" punishment or law that could be enacted to curtail behavior. It could be made less convenient or profitable for the original poster, or the social networks in question could send a follow up note to all who reposted or reacted to the message with a note that the message in question is fake news, educating the recipient and amplifier on why it was fake news. That way each piece of fake news shared could become an educational moment.”

George Siemens, professor at LINK Research Lab at the University of Texas-Arlington, commented, "Penalties should be social, not government mandated. For example, there are libel laws, but most gossip isn't handled through that legal model. Most is social awareness in networks that results in a softer pressure.”

Matt Mathis, a research scientist who works at Google, said, "It is only necessary to limit the scope of the distribution.”

Charles Ess, a professor of media studies at the University of Oslo, wrote, "The penalties should be severe. Rights to freedom of expression have always recognized that speech intended to generate harm is NOT protected speech. As has become manifest over the past two decades, the international corporations controlling most of our communication media have little incentive to regulate or control harmful speech: the more clicks, the better, etc. Democratically elected and responsible governments - i.e., ones that citizens constantly call into account – are the only institutions capable of policing and regulating harmful speech.”

Philip J. Nickel, lecturer at Eindhoven University of Technology, said, "This is a very difficult question and a model penal code would need to be drafted and discussed among relevant stakeholders and legal experts.”

Miguel Alcaine, International Telecommunication Union Area Representative for Central America, commented, "Penalties will be equivalent between online and offline. However, determining, minimizing and eliminating false information is much more difficult in the online world. And that is because there are usually only a few jurisdictions and legal frameworks applicable in a case.”

Shane Greenstein, professor at Harvard Business School, noted, "Repressive governments already perform these activities. I am skeptical that the actors who represent democratic governments can get into related activities without favoring their own viewpoints. Just imagine giving the present administration such rights; it would not turn out well.”

Sharon Haleva-Amir, lecturer in the School of Communication, Bar Ilan University, Israel, said, "These actions should be criminalized and people should stand trial for knowingly spreading false information that could even cause wars between states. BUT as for the governments' desired positions in these cases, it is not that simple, because governments may take an active part in it, as governments can sometimes serve as those dispersing the false information. So it is quite difficult to figure out what is the appropriate role for the governments in these situations. As a matter of fact, I believe the ecosystem should also include NGOs as well as international organizations to balance the powers of states and governments to spread false information.”

Giacomo Mazzone, head of institutional relations for the World Broadcasting Union, replied, "Governments and judiciary are the last resort. Algorithms, media and ad-spenders need to take on most of the job. And the school system comes before anybody else.”

Sandro Hawke, technical staff, World Wide Web Consortium, noted, "I don't know exactly why the existing rules concerning fraud and libel are failing us. It might be about anonymity. It might be about jurisdictional boundaries. It might be lack of training for law enforcement. It might just be society is reeling, trying to adapt to a new set of problems. I doubt we need more-severe penalties. We probably need to look at making traditional consequences still enforceable, even with the new technologies. Most of the time, it shouldn't need to rise to the level of law enforcement, though.”

Ryan Sweeney, director of analytics, Ignite Social Media, wrote, "The government should not work to actively stop or control the flow of false/damaging misinformation. While it might sound like a good idea in theory, that kind of power won’t always live under management of those with good intentions. We should never sacrifice our freedom or our values because we don’t know who will use it against us in the future. Additionally, the Internet is a global network and one government tackling misinformation would not be enough and could only serve to silo said country from the rest of the world (i.e., North Korea).”

Jesse Drew, professor of cinema and digital media, University of California-Davis, commented, "Shaming!”

Paul N. Edwards, Perry Fellow in International Security, Stanford University, commented, "Obviously the degree of harm is important here. Penalties for spreading false/misleading/deliberately provocative information should be harsh for anything leading to injury or death. It is not clear to me that government intervention in news reporting is ever a good idea; this capability is too easily turned in favor of those in power.”

Julia Koller, a learning solutions lead developer, replied, "The penalties for spreading misinformation should match the penalties for similar crimes. For example, the ‘yelling fire in a crowded theater’ scenario shouldn't matter if it's in a theater, or on the internet. The consequences for harming another person should be the same. I don't support a strong government role in information protection, beyond enforcing laws that are written.”

O'Brien Uzoechi, a business development professional based in Africa, replied, "Appropriate and commensurate penalty should be meted out to such individual(s) as could be decided by the recognizable governing body. But the person certainly should be made to pay or suffer for the pains and griefs he has wittingly and determinedly caused others. Government should enact the appropriate laws in that regard, and also set up a technology group to create and develop the right applications to monitor, evaluate and discover wrong, inciting information distribution.”

Mark Glaser, publisher and founder, MediaShift.org, observed, "The penalties for those who spread fake news knowingly should be the same as penalties for libel and slander and those people and organizations should be prosecuted for their behavior. Governments should play a role in stopping this behavior in the same way they prosecute libel and slander.”

Philipp Müller, postdoctoral researcher at the University of Mainz, Germany, replied, "It is impossible to judge whether the intended effects of spreading a certain piece of information are ‘harmful.’ Therefore, I see no way this could be punished. If false information violates existing laws (e.g., against insult or demagoguery) the source should be punished according to these existing laws. If false information is spread with the intention of, e.g., political campaigning without violating any existing law, we cannot begin to punish this. This would undermine freedom of expression. The consequences of this would be harmful to a much greater extent than any misinformation could be.”

Darel Preble, president and executive director, Space Solar Power Institute, commented, "Establishing intent is not always possible. Currently, free and open communication MUST be encouraged and supported publicly. The media is grossly biased.”

David C. Lawrence, a software architect for a major content delivery and cloud services provider whose work is focused on standards development, said, "Figuring appropriate punishment or remediation even for physical violations of the social contract is difficult enough for me to provide suitably well-informed opinions about, without adding in my misgivings about matters like intangible thought crimes. To be honest, though, I'd be immediately suspicious of any sort of government involvement to ‘prevent the distribution of false information,’ as various governments are already a source of a fair bit of false information themselves.”

Francois Nel, director of the Journalism Leaders Programme, University of Central Lancashire, noted, "Guidelines should come from existing penalties for hate crimes, which government agencies should (continue to) enforce in line with the law.”

G. Hite, a researcher, replied, "False information has always been spread – urban legends and hoaxes passed on verbally and then through emails – now on social media. The gullible can live with their unfounded fears, but the perpetrators should be subject to some type of consequence if it is slanderous against a business that could lose profits as a result. There should be a segment of the government that is watchful.”

Denise N. Rall, adjunct research fellow, Southern Cross University, Australia, said, "First, the policies would have to muzzle people like Rush Limbaugh and Andrew Bolt. I don't quite see how that could happen at present. Further, penalties in the case of Julian Assange and Edward Snowden would be difficult to assign, as their information was not false, but classified. In my opinion, the worst harm of the WWW is found in the propagation of pornography, child sex crimes, and terrorist sites. When the government includes persons like Donald Trump and Paul Ryan (et cetera), it is difficult to leave these issues in the hands of US government. It will have to be implemented by a United Nations type of agency and stringently policed for corruption.”

Tony Smith, boundary crosser for Meme Media, commented, "There should be no penalties except possibly for those who have intentionally compromised the integrity of a delivery service to bypass voluntary filters. There will continue to be a role for standards organisations whose existence is endorsed through representative processes, though not created at the behest of governments.”

Steven Polunsky, writer with the Social Strategy Network, replied, "It's just a tool. If used to further a crime, criminal action should follow. Otherwise, in and of itself no crimes are committed.”

Timothy Herbst, senior vice president of ICF International, noted, "It seems hard to regulate or criminalize the spread of false information. One can never be sure about the accuracy of some information. There is also the First Amendment to prevent such a thing. Transparency and public discourse need to rule the day here to help stem the spread of false information.” 

Carol Wolinsky, a self-employed marketing researcher, replied, “It seems to be settled law that government cannot prevent publication of information unless there is a clear and immediate public threat. Social media companies have difficulty because they cannot determine what will be written in advance. It's a perplexing problem and almost insoluble.”

Eileen Rudden, co-founder of LearnLaunch, wrote, "The FTC regulates advertising claims; perhaps other claims can also be tested (ProPublica does some of this today).”

Hazel Henderson, futurist and CEO of Ethical Markets Media Certified B. Corporation, said, "The US law may have to follow the lead of EU courts in firmer prosecution.”

Kelly Garrett, associate professor in the School of Communication at Ohio State University, said, "There are many existing penalties facing those who are caught knowingly spreading falsehood, from social ostracism to jail time. Perhaps a more useful question to consider is how we revise these penalties to account for new ways of spreading falsehoods, and new ways in which misinformation is being used. For example, there are important differences between falsehoods spread via social media bots versus those spread in the newspaper or water cooler gossip. It seems likely that our legal system needs to be updated to reflect these differences.”

Tom Wolzien, chairman of The Video Center and Wolzien LLC, said, "Generally civil, but a key part of the question is ‘intent of causing harmful effects.’ If harmful effects cause death, including going to war, then what's the difference from yelling fire in a theatre? Short of that, civil approaches along the lines of slander and libel laws should suffice.”

Thomas Frey, executive director and senior futurist at the DaVinci Institute, replied, "In a world that is transitioning from national systems to global systems, we are desperately in need of a new global watchdog, one that perhaps most nation states are members of, to oversee the creation of policies, rules and enforcement around the globe.”

Daniel Berleant, author of the book "The Human Race to the Future," commented, "Those who provide an oath or affirmation to tell the truth could be legally penalized for violating it, so the legal structure required is to specify the precise requirements of people making such oaths or affirmations, and the legal penalties for violating the oath or affirmation.”

Stephan Adelson, an entrepreneur and business leader, said, "The enforcement of this type of oversight sounds like an impossible task. Ideally, repercussions for spreading harmful untruths should exist, but the expense of monitoring and pursuing those guilty would be immense. I can't imagine the government being put in the position of determining what truth is, determining what harm is, and then pursuing those they have determined to be spreading harmful untruths under their own definitions.”

Brian Cute, longtime internet executive and ICANN participant, said, "The law could develop such penalties. Intent and harm would be key elements in developing such penalties. I am skeptical that governments could play an effective role in preventing the distribution of false information. The government could provide a role in educating the public about this issue. Even in that context I am dubious that governments could do so in a disinterested manner.”

Mike O'Connor, a self-employed entrepreneur, wrote, "Try to take positive action rather than penalize. Repairing DNA seems a better approach than killing cells when it comes to cancer. Restoring habitat for birds seems a better way to control bugs than spraying insecticides. Let government lead the way by curing itself first.”

Willie Currie, a longtime expert in global communications diffusion, wrote, "Prosecution in terms of the law governing fraud. If there are jurisdictions where such laws do not exist, they will need to be created. The rule of law within a constitutional framework. Fraud is fraud. Governments should ensure laws governing fraud are enforced; they should not get into a censorship role. If social networks distribute fake news and claim they are unable to stop it, then there are other remedies, including unbundling them into smaller, more manageable entities.”

Katim S. Toray, an international development consultant currently writing a book on fake news, noted, "Many countries already have laws against willingly providing or distributing false information. Given the global menace of fake news, efforts should be made to develop model laws for dealing with it.”

Wendell Wallach, a transdisciplinary scholar focused on the ethics and governance of emerging technologies, The Hastings Center, wrote, "Government provides one of the few means to punish wrongdoers, but when people elect those who exploit falsehoods, rationalize falsehoods, or declare falsehoods to be truth, relying on government to lead is foolhardy.”

Ian O'Byrne, assistant professor at the College of Charleston, replied, "The responsibility lies in the hands of government, businesses, and other organizations (some unseen) that work with these digital texts and tools. Ultimately, I know that (as I've stated in earlier questions) this will have no effect on changing the situation. Education is the only answer. This includes media and information literacy as well as emotional intelligence. Only if and when we actively make these literacies an integral part of our educational systems around the globe will we develop a global populace that can notice and name these instances when they see them.”

David A. Bernstein, a marketing research professional, said, “My thinking is that such actions should carry similar penalties to those we impose on someone who yells fire in a crowded theater and people get hurt as a result. The individual or entity responsible can be prosecuted by the local government and those injured can sue for damages. Clearly this will put a large burden on our courts and therefore we may need a separate system just for these types of claims. Yes, I think the government must be at the front of the line in preventing false information. We depend on the government to protect us through the FDA and other regulatory agencies. Why should we expect any less when it comes to false information?”

Axel Bender, a group leader for Defence Science and Technology (DST) Australia, said, “If it was possible to prove the misinformer's malicious intent and his/her understanding of the scale of harm s/he would be able to cause, then the penalties should be proportional to the harmful effect; i.e., if {intended effect = kill people} and if {real effect = intended effect} then {penalty = penalty for murder}. The government's role should be in equipping its people with the means to identify false information (i.e., education, awareness campaigns et cetera).”

Monica Murero, a professor and researcher based in Europe, wrote, “The creator of false/harmful information should undergo the same penalties that the law foresees in ‘analogic’ environments. Any? And how will penalties vary around the globe? There will always be the chance to bypass rules if part of the solution is NOT technical as well. The real problem is not creating false info but its large circulation among networks of trusted individuals who do not verify info and naively contribute to its circulation over time. Even when the most trusted sources (or the perfect and reliable information system of the future) declare a piece of information ‘fake,’ there will always be someone, or a parent who starts using WhatsApp, who circulates ‘funny’ info that is totally fake (we already see this phenomenon today).”

Ned Rossiter, professor of communication, Western Sydney University, replied, "Penalties will vary according to the form or type of institution or organization found responsible for the generation of informational harm. Liberal democratic governments are still, to some extent, accountable. The large tech companies are not. Government is no longer autonomous from tech companies and indeed overlaps with them in many instances. So there is no single measure for what the penalties should be. But governments might make a start by being more upfront in what they do with public data they collect across the many 'services' they provide to publics.”

Andrew Feldstein, an assistant provost, noted, "The first question might be ‘Whose truth?’ I'm not sure I trust the government or any organization in this role. If people have to be open and transparent about what they say and who they are then, potentially, market forces such as a person's credibility and reputation would be at stake.”

Sandra Garcia-Rivadulla, a librarian based in Latin America, replied, "If someone does something really harmful that constitutes a crime, the same laws should be applied as for those committing crimes in the ‘real’ world. It is not the medium that should have the central place in the debate, but the action and its consequences.”

Randall Mayes, a fellow with AAI Foresight, observed, "It would be similar to other intentional acts. Why would it be different?”

Hanane Boujemi, a senior expert in technology policy based in Europe, commented, "The press code should include some kind of reference to fake news.”

Agien Nyangkwe, a journalist based in Cameroon, said, "Repeated false information should face the same sanction as defamation.”

Daniel Menasce, professor of computer science, George Mason University, replied, "I am not sure if penalties can be applied based on intent if no harm has actually been done. I do not think that our government should be in the business of preventing the distribution of false information. This would be akin to censorship.”

Dave Burstein, editor of FastNet.news, said, "None. ‘False information’ is so hard to define the term could easily include ordinary and sometimes accidental comments. For example, FCC Chairman Pai recently claimed $200M in CAF funding would connect the unserved in New York State. A close look at the program finds most of the money will not go to reaching the unserved. AT&T CEO Randall Stephenson told Commissioner Clyburn that special access regulation would meaningfully reduce 5G and investment in extreme rural areas; highly unlikely.”

David Manz, a cybersecurity scientist, replied, "We already have laws that cover this, from libel to public safety to POTUS safety. I am not sure we need new laws to address this; we simply need reason and common sense to apply the ones we have. Being a sleazebag or liar is not a crime, and it should not be. Causing a stampede or bodily harm in a public place is, and it's already covered. Define harmful effects.”

Hjalmar Gislason, vice president of data for Qlik, noted, "People should be allowed to publish any nonsense they want without penalty. However, falsely attributing the information to trustworthy sources or in other ways disguising the misinformation to use the trust of third parties should be illegal. Making up misleading content is OK, making up misleading context is not.”

Lauren Wagner, journalist and experience strategist at Google, said, "It will be difficult to identify these individuals and punish them in the country where the harm was caused. For example, if fake news about American politicians was written and disseminated from computers in Estonia, can they be punished in the US for influencing an election? I believe it's the responsibility of consumers and larger media organizations to filter information.”

Emmanuel Edet, head of legal services, National Information Technology Development Agency of Nigeria, observed, "Propaganda is an old word that predates the Internet era. False information with intent to cause harm is weaponized information, and the consequences of using a weapon to cause harm should be applied. I think the best step government can take is to ensure that all information put out to the public is associated with the originator of such information.”

Joshua Hatch, president of the Online News Association, noted, "Making it socially unacceptable is one aspect, but ultimately the profit motive for lying and knowingly spreading false information needs to be removed or threatened with fines.”

Troy Swanson, a teaching and learning librarian, replied, "Hacking a computer network is very different from spreading false information. Free speech needs to remain paramount.”

Allan Shearer, associate professor at the University of Texas-Austin, observed, "This is difficult to answer because I believe there would need to be evidence of intent to deceive and that can be difficult.”

Evan Selinger, professor of philosophy, Rochester Institute of Technology, wrote, "This question can’t be answered adequately without thinking about the larger contexts where the intentional spread of disinformation has had profoundly negative consequences. A good place to start to think through this issue is Naomi Oreskes and Erik Conway’s co-authored book, ‘Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming.’”

Garrett A. Turner, a vice president for global engineering, noted, "It should be up to the people not the government to determine punishment.”

Michel Grossetti, research director, CNRS (French National Center for Scientific Research), commented, "The only solution is to disseminate true information.”

Maja Vujovic, senior copywriter for the Comtrade Group, noted, "We might need to charge high fees to known perpetrators who wittingly contaminate information, as a disincentive, provided we make it more expensive than whatever it is they earn by doing it. The government should reform education in such a way that all students understand the mechanisms and perils involved. Society will need to find ways to reduce the grip that politicians have over our institutions, given their loyalties are increasingly geared toward their respective ideologies and away from the public interest, making information the first casualty of their biases.”

Stuart A. Umpleby, professor emeritus, George Washington University, wrote, "I assume that proving intent will be a challenge. The cigarette companies and oil companies are good cases to consider. Trump's stand against the climate treaty could be regarded as disregard of information showing harmful effects, also statements opposing the use of vaccines. I think fines or imprisonment may be appropriate if the scientific evidence is overwhelming. Such punishments would lead to more debate and understanding regarding scientific knowledge. I think that would be a step forward. Acknowledging that scientific judgments do change would be part of the discussion. What laws provide examples? Laws against slander and libel? Fines for criminal negligence? Class action lawsuits, which corporations are trying to get rid of? There are helpful precedents.”

William L. Schrader, a former CEO with PSINet Inc., observed, "Society has shunned those it deems to be liars for hundreds of years. People just stop listening to them. Whether they are named Machiavelli or Hitler or more recent leaders, eventually, social norms return. Society and government have stoned women for being witches, shall we return to those days? No. Don't be silly. Government should do nothing about liars, even when the lies induce harmful effects.”

Matt Armstrong, an independent research fellow working with King's College, formerly executive director of the U.S. Advisory Commission on Public Diplomacy, replied, "Our modern view of the First Amendment is perhaps 100 years old. The malicious creation and spreading of information is an intent that can be pursued; however, this will not be a successful tactic unless society and government are unified behind this approach. At present, the creation and spread of intentionally false and harmful information plays into a divisiveness that must be addressed first. It is very close to a chicken-and-egg conundrum, but we have to start somewhere.”

Don Kettl, professor of public policy at the University of Maryland, said, "The penalty should be a diminishing in public role and power. Government should not play a formal sanctioning role.”

Dan Gillmor, professor at the Cronkite School of Journalism and Communication, Arizona State University, commented, "In a few cases this is already illegal. Expanding laws to punish speech is a step toward a police state. One key government role should be to give schools better incentives to teach critical thinking – specifically media/news literacy – as a core part of the curriculum.”

Raymond Hogler, professor of management, Colorado State University, replied, "The problem is that the proliferation of actors hinders the punishment of false dissemination of information.”

James Schlaffer, an assistant professor of economics, commented, "Nothing. It's freedom of speech. Although such instances of false information should require the offending organization to issue a ‘front page’ apology for a certain number of weeks to allow people to update their view of the organization's trustworthiness.”

Justin Reich, assistant professor of comparative media studies, MIT, noted, "The primary role of local government is developing school systems where students learn the information literacy skills needed to identify or verify fake news. The role of state and national governments will be to support curriculum development and research towards these ends. Sam Wineberg and colleagues at the Stanford History Education Group are doing important work towards these ends.”

Stowe Boyd, futurist, publisher and editor in chief of Work Futures, said, "What's the legal consequence of yelling 'fire' in a crowded theater? Or libeling or slandering others? We have laws in place that could be repurposed, reinterpreted for our modern times.”

Larry Diamond, senior fellow at the Hoover Institution and FSI, Stanford University, observed, "Digital platforms should take the lead in denying access or demoting in visibility sources that persistently, knowingly, and harmfully distribute demonstrably false information. Government intervention should be a last resort only when there is imminent threat to public safety.”

Kenneth Sherrill, professor emeritus of political science, Hunter College, City University of New York, said, “I'm a hard-core, empirical, quantitative scholar. We think that good information drives out bad information and that systematic liars are shunned. This is wishful thinking. I don't want the government to decide what information is false. Would you trust a government headed by Donald Trump to do this? The only answer is to be found in free speech and the marketplace of ideas. This is why I'm so pessimistic.”

Scott Fahlman, professor emeritus of AI and language technologies, Carnegie Mellon University, noted, "I would be reluctant to go much beyond the current libel laws. I think that shame and total loss of credibility are the best punishments, and still consistent with free speech for true dissenters.”

Fredric Litto, professor emeritus, University of São Paulo, Brazil, wrote, "Similar to the laws concerning murder and manslaughter, there is a need for new laws which define minor offences (those which cause minimal harmful effects) on a sliding scale up to major offences (those which cause major harmful effects), with the establishment of intent being an important factor in the degree of punishment. Government, federal, state and municipal, must take the leading role in creating and policing such new laws, since it is, by definition, impartial and has coercive powers.”

Kevin Werbach, professor of legal studies and business ethics, the Wharton School, University of Pennsylvania, said, "There is extensive First Amendment caselaw on actions equivalent to yelling ‘fire!’ in a crowded theater. The issue isn't false statements; it’s intent to harm. Again, that's a familiar principle in libel law, unfair competition, etc.”

Garland McCoy, president, Technology Education Institute, commented, "Who defines ‘false information’? Many argued at the time that Orson Welles should have been put in prison for his groundbreaking radio broadcast of ‘War of the Worlds.’ So should the government or mob rule hang a modern-day Orson Welles?”

Meamya Christie, user-experience designer with Style Maven Linx, replied, “Depends on each situation. I feel that people will continue to receive false information and process it as truth as long as their minds are fertile for deception.”

Vince Alcazar, business owner and retired US military officer, wrote, "First and foremost, we must reclaim our civility from the Silicon Valley tycoons and twenty-somethings who have not lived long enough, nor are informed enough, to comprehend the destructiveness of a hands-off approach to driving hate and propaganda off of their platforms. That they do not somehow see and know it in the moment is ludicrous. In connection with that proposal, America needs new laws that put the burden of accountability for hate and lies on two sets of shoulders: the perpetrator and the platform.”

Greg Lloyd, president and co-founder of Traction Software, wrote, "Corporations or groups may be subject to ‘false advertising’-style fines and limitations (as well as libel and criminal prosecution, as now). Preserve individual speech rights, with only libel or criminal prosecution, as now.”

Luis Martínez, president of the Internet Society's Mexico chapter, observed, "Such penalties must exist, but it is dangerous for them to be implemented by governments or civil society; the only feasible solution is to have such false informants publicly exposed.”

John Sniadowski, a director for a technology company, said, "Naming and shaming; for example, their social media pages could be marked up as untrustworthy, with some sort of score depending on the level of harm caused. Statute-book laws on this are potentially dangerous, especially for the suppression by governments of citizen rights. The rules should be governed internationally, with local country addendums that can be marked as such, so it becomes possible to see transgressions defined by international law vs. local law.”

Adam Holland, a lawyer and project manager at Harvard's Berkman Klein Center for Internet & Society, noted, "I suspect that existing law about willfully causing harm, whether physical, emotional, or reputational, will provide a useful template for the actual nature of any penalties. However, intent is extremely difficult to effectively prove, and ‘false information’ is going to be equally difficult to distinguish from fiction. Penalties, regardless of what they are, will be rare in application. Government should not be taking steps to prevent, since definitions of what is subject to any prevention may well change with the government. Government should empower the citizenry and enforce existing law equally.”

Shawn Otto, author of "The War on Science," replied, "Freedom ends when it reduces the equal freedom of others. Thus, one is not allowed to yell ‘fire’ in a crowded theater, to use the old cliché. Similarly, that's why we have regulations: to keep bad actors from taking from the freedom of others by dumping their pollution or other bad behavior on others. Similarly, tobacco companies used propaganda to create uncertainties about what the science was showing: that smoking causes cancer. Freedom of speech does not extend to defrauding the public. That's why, in certain cases currently being brought, energy companies that knowingly lied to shareholders and the public about the dangers of using their product should be held to account for those bad actions in a court of law. This is an analogous situation. Some remedy may be developed through legal theory and case law, and some may need to be accomplished through legislation. You are free to lie, but not when it becomes fraud with the intent of personal financial gain; the new twist that some may argue for is ideological gain.”

Danny Rogers, founder and CEO of Terbium Labs, replied, "We've seen examples recently of successful countermeasures in this information war, especially in France. People don't like to be manipulated, so calling out manipulation and discrediting the discreditors are key tactics in fighting this battle.”

Susan Hares, a pioneer with the NSFNet and longtime internet engineering strategist, now a consultant, said, "In the US, slander is treated as a virtual attack on a person. Individuals who plan a destructive riot rather than a peaceful demonstration are criminals. Purposely spreading false information about a company that impacts business can also be considered a crime. Based on these existing legal principles, the government can press for laws that set legal penalties for creating and knowingly spreading bad information. The difficult part of these laws will be the definition of "intent" to create and knowingly spread false information.”

Mike Gaudreau, a retired IT and telecommunications executive, commented, "They should be fined heavily and sent to jail.”

Louisa Heinrich, founder of Superhuman Ltd, commented, "Our legal system provides frameworks for punishing fraud that correspond to the effects of the crime. I think it should be possible to prosecute these kinds of fraud and punish them in a way befitting the severity of the effect. But I'm no legal expert and I don't really know what that should look like.”

Jeff Jarvis, professor at the City University of New York Graduate School of Journalism, commented, "The First Amendment protects the right to lie and to be wrong. Government should play *no* role in controlling public speech. The only penalty for knowingly spreading false information should be shame, which is why we need to encourage citizens to adapt their social norms to reward civility over incivility.”

Ken O'Grady, a futurist/consultant, said, "Fake news, in my opinion, is a form of libel or defamation and as such should be treated as a crime. Government, by way of the executive branch (police and legal authorities), should be the group responsible for enforcement.”

Shahab Khan, CEO for Planwel, Karachi, Pakistan, replied, "The government has a very important role to play through regulation and cyber laws.”

Gianluca Demartini, a senior lecturer in data science, observed, "I am not in favour of heavy punishment for creators of false information. On the other hand, technology should be adopted to fight against false information. Government should support such technological development to the benefit of the general public.”

Richard Jones, a self-employed business owner based in Europe, said, "There are laws of libel and treason and incitement to violence, for instance. History is written by the victors? Advertising standards authorities exist. Yet it is promulgation, particularly by nation-states such as North Korea, China or Russia seeking to create unrest, which promises the most serious effects, followed by activists who may pursue entryist methods.”

Robert Bell, co-founder of the Intelligent Community Forum, commented, "Difficult question, because false news travels fast and governments move slowly by design. Deterrence is best accomplished by modest penalties that are assured to take effect. So specific definitions of ‘false information’ and ‘intent to cause harm’ are needed, and ‘intent’ may be too slippery a slope compared with actual harm. Government can provide a consistent regulatory structure and let internet companies enforce it.”

Ed Terpening, an industry analyst with the Altimeter Group, replied, "I don't know if government can be trusted anymore as the arbiter of truth. In terms of penalties, it depends on impact - which could vary substantially.”

Paul Jones, director of ibiblio.org, University of North Carolina-Chapel Hill, noted, "There are a number of laws already in place that cover most of these concerns. They should be uniformly and fairly enforced.”

Frank Kaufmann, founder and director of several international projects for peace activism and media and information, commented, "I do not think there should be any penalties and my God PLEASE keep the government away from more meddling. I believe there are sufficient laws for criminal behavior. Existing laws already are sufficient to cover any criminality that might arise from distributing ‘false information.’"

Sam Lehman-Wilzig, associate professor and former chair of the School of Communication, Bar-Ilan University, Israel, wrote, "Unless the harmful effects transgress laws, the government should not get involved. Civic solutions like I mentioned earlier should suffice.”

Mark Johnson, chief technology strategist for MCNC, the technology non-profit that builds, owns and operates the infrastructure for North Carolina's community institutions, commented, "We already have legal remedies to deal with shouting 'fire' in a theater. Affirmative steps for establishing trust will be most effective and don't require new laws and regulation.”

Bradford W. Hesse, chief of the health communication and informatics research branch of the US National Cancer Institute, said, "The consequences for domestic abuses should be akin to those for actors who knowingly commit fraud (with intent included as part of the formula). International actions will need to be dealt with through treaties, collaborating police (e.g., Interpol), and trans-state organizations (e.g., the U.N.).”

Clifford Lynch, director of the Coalition for Networked Information, noted, "The government must play some part in this, particularly in cases of deliberate fraud or espionage/information operations.”

Peter Levine, associate dean and professor, Tisch College of Civic Life, Tufts University, observed, "Generally, lying and misleading people is a right. However, there may be some specific venues where penalties for misleading information should be raised. For example, there is no price for lying under oath to Congress unless a majority votes to hold you in contempt, and that protects anyone with a partisan majority. If there's a way to make lying to Congress a real risk, it would be good.”

Helen Holder, distinguished technologist for HP, said, "Penalties should be those for incitement, fraud, harassment, libel, slander, et cetera, rather than any additional or specific penalties. The government could make it easier to pursue these cases. For example, today it is very hard for a person who has been threatened online to take action against their harasser. Often law enforcement is unable or unwilling to investigate. Policy, training, and staffing adjustments could be made to better enforce existing laws and regulations.”

Peter Dambier, DNS guru for Cesidian Root, commented, "Disinformation can only survive in filter bubbles. So keep governments and censoring out of this to help the ecosystem to cure itself.”

Peter Eckart, director of health and information technology, Illinois Public Health Institute, replied, "If there are existing laws that are constitutionally sound that can be brought to bear on those who are intent on causing harmful effects, use them to the fullest extent possible.”

Rob Lerman, a retired librarian, commented, "The government's role should be to adequately fund our schools.”

Cliff Cook, planning information manager for the City of Cambridge, Massachusetts, noted, "How do you punish people who misinform and lie given First Amendment protections for political speech? I'm not at all sure that you can? Internet content hosts such as Twitter can be pressured to clamp down but this approach has a mixed record and could easily be weaponized itself given the wrong motivations.”

Su Sonia Herring, an editor and translator, commented, "Most of the time the purpose of spreading false info is not causing harm; it's strictly financial: outrageous news gets clicks, and clicks mean money. The best penalty would be to financially fine the website owners or require the return of advertisement revenues. This would not work for major platforms.”

Dave Kissoondoyal, CEO, KMP Global, replied, "The penalties should depend on the gravity of the offence of creating or knowingly spreading false information and it is upon each Government to bring the appropriate legislation. The Government has an important role to play.”

Deborah Stewart, an internet activist/user, wrote, "License revocation and eventual judicial action.”

Sasa M. Milasinovic, information and communication technology consultant with Yutro.com, replied, "The most severe punishments. Government should issue clear rules, understandable to any citizen.”

Jonathan Ssembajwe, executive director for the Rights of Young Foundation, Uganda, commented, "The governments of different countries should come up with penalties for those spreading false information according to the laws of the country. Governments should also invest in educating the public on the proper use of the internet.”

Alf Rehn, chair of management and organization studies, Åbo Akademi University, commented, "Well, this needs to be looked at case by case, but I do believe that the grossest offenses should be treated as vandalism and sabotage, and not merely as misdemeanors.”

Shirley Willett, CEO, Shirley Willett Inc., said, "There has to be a formal research set up to study the falsehoods and the connected harm - then set related penalties.”

Riel Miller, an international civil servant who works as team leader in futures literacy for UNESCO, commented, "Feedback systems are helpful – if there is a way of knowing if there is a fire in the theatre independent of the person who shouts ‘fire’ then all they do is diminish their credibility. Policing the mechanisms for ensuring assessment and transparency rather than policing content.”

Bill Jones, chairman of Global Village Ltd., observed, "False concepts are an important way of advancing knowledge and science. Great works of art are based on false narratives without questioning motivations. I don’t think governments should play a role in preventing distribution of false information. Governments are the servants of the people.”

Andrew McStay, professor of digital life at Bangor University, Wales, wrote, "So much depends on context. Newspapers, for example, have long spread misinformation with corrosive effects, but we tolerate this on grounds of free speech. Then there is the question of where the source is (Macedonia, Russia, etc.). Efforts are best placed on: a) technical solutions; b) education; c) legacy news institutions.”

Marcel Bullinga, futurist with Futurecheck, based in the Netherlands, said, "1) Don't think in terms of penalties for fake news creators. That is 19th-century thinking. The future is about *preventing* fake news distribution, through the use of AI. 2) Don't talk about ‘government.’ It is not specific enough. Many governments (even democratic ones) are themselves active fake news creators... We need scientific parties and NGOs and certain governments to solve it.”

Vian Bakir, professor in political communication and journalism, Bangor University, Wales, commented, "It is difficult to establish intent to cause harm at the level of individual people. Probably better to educate people to be suspicious of false information and know where to go to for trusted information.”

Jens Ambsdorf, CEO at The Lighthouse Foundation, based in Germany, replied, "In general there are already rules and laws in place and they should be applied.”

Dan Ryan, professor of arts, technology, and the business of design at the University of Southern California, said, "I am not sure we should be thinking about this as a separate and distinct category. But to the degree that we do, information behavior has potential to be self-correcting and self-reinforcing. One mechanism is the social currency of credibility. Should government be involved? If you think of government as law and regulation, sure. There are existing models in how we think about fraud, misrepresentation, fiduciary duty, principal-agent problems, libel and such that can be used as jumping-off points for careful analysis of the trade-offs in the design of the regulation of the information order.”

David J. Krieger, director of the Institute for Communication & Leadership, Lucerne, Switzerland, commented, "Deceit, fraud, and misinformation should be punishable by law. Governments must create and enforce sanctions against the misuse of information.”

Ella Taylor-Smith, senior research fellow, School of Computing, Edinburgh Napier University, noted, "The government should lead by example and stop spreading harmful disinformation. They should also increase transparency by keeping a public record of all their online information and posts, including adverts placed by political parties and their supporters on social media.”

Ian Peter, an internet pioneer, historian, activist user and futurist, replied, "If there is to be such a scheme, it needs to be in an area free from political interference and within the judicial system with penalties commensurate with the harm caused.”

Rich Ling, professor of media technology, School of Communication and Information, Nanyang Technological University, said, "The same rules of slander and libel should apply on the Net as in other information channels. In those cases where the information is spread with the intention of disrupting civic processes such as voting, the perpetrators should be held accountable for undermining the democratic process.”

Rajnesh Singh, Asia-Pacific director for an internet policy and standards organization, observed, "This would have to be weighed against the actual or perceived harm. That determination in itself will be very difficult in some circumstances.”

Patrick Lambe, principal consultant, Straits Knowledge, noted, "I'm not in a position to judge the penalties. Existing legal frameworks should be capable of determining where penalties should apply: harm or harmful intent, cheating, incitement, aiding and abetting criminal activity, slander, libel. The role of government is to ensure that policing and the courts have the necessary resources to investigate and prosecute.”

Daniel Kreiss, associate professor of communication, University of North Carolina-Chapel Hill, commented, "The issue to me is the willingness of the public to believe misinformation, not the presence of misinformation in the first place. Establishing 'facts' is very difficult, and there is more interpretation about what constitutes 'factual information' than these questions acknowledge. As such, I do not think that 'penalties' are realistic or feasible (who would judge the truth of claims?). And the government should continue to have specialized, non-partisan, bureaucratic, and expert institutions that produce knowledge. But I find the idea of the government preventing the distribution of false information to be neither feasible nor realistic. How would the government prevent the current president from lying?”

Steven Miller, vice provost for research, Singapore Management University, wrote, "In various aspects of law, it is unlawful, and therefore a crime, to mis-represent things, especially intentionally, and there are penalties for this. However, it is a laborious process (slow, expensive, cumbersome) to have these matters go through the legal proceedings. So there are means of setting laws, and enforcing laws. There are means of dealing with arbitration in support of the laws. We do not have to invent these institutional mechanisms. And YES, government would have to play a stronger role.”

Eric Burger, research professor of computer science and director of the Georgetown Center for Secure Communications in Washington, DC, replied, "What are the penalties for libel? Correspondingly, one would expect (hope?) there is an exclusion for humor. However, that is a grand loophole as well.”

Tom Worthington, honorary lecturer in the Research School of Computer Science at Australian National University, commented, "Having governments attempt to prevent fake information risks a totalitarian state.”

John McNutt, professor, School of Public Policy and Administration, University of Delaware, wrote, "This is a horrible idea. I can't imagine any reason we should be doing this. Who decides what information is false?”

Greg Shatan, partner, Bortstein Legal Group, based in New York, replied, "Penalties should be significant, but only when one can distinguish between false information and unpopular information. Governments should be involved, but as part of a larger public/private effort.”

Alexander Furnas, Ph.D. candidate, University of Michigan, replied, "I don't think the government should play a role in preventing the distribution of false information beyond libel and slander laws that currently exist. To do more than that would likely violate the First Amendment. When spreading false information does, in fact, cause harm, those who have knowingly created or spread it should be civilly liable. Demonstrating harm is likely to be hard to do, however.”

Tomslin Samme-Nlar, technical lead, Dimension Data Australia, commented, "I can't suggest a penalty but there should be one. Government should take an active role in educating its citizens about the dangers of fake news and put in place legal tools that should be used to help fight fake news.”

Andrea Matwyshyn, a professor of law at Northeastern University who researches innovation and law, particularly information security, observed, "The nature of the information matters. That said, if the action violates the terms of use of the platform/social media site, this type of contract breach provides basis for shutting down the user's account, in the discretion of the platform/site. Government should ensure that the information it provides to the public is itself fully accurate.”

Amali De Silva-Mitchell, a futurist, replied, "Government must educate the people to keep an open mind, take in all information in an unbiased manner, seek the comments of those impacted, etc. This is current good due-diligence practice, and the public should be educated in these processes, i.e., education in building good judgment.”

Ed Tomchin, a retired writer and researcher, said, "They should be outed and put on public display.”

Bryan Alexander, futurist and president of Bryan Alexander Consulting, replied, "Governments should not resort to censorship. Penalties should be light.”

Rick Hasen, professor of law and political science, University of California-Irvine, said, "Existing tort law should handle these things. For example, fraudulent conduct leading to damages compensable under current tort system.”

Alexander Halavais, associate professor of social technologies, Arizona State University, said, "There are already remedies for the spread of false information that injures an individual, though they are relatively weak. The government's role in determining the validity of information needs to be constrained to areas in which fact is relatively easily determined (e.g., truth in advertising, medical claims, et cetera). When the government seeks to police truth claims in the political sphere, there will be a problem. Caveat lector.”

Greg Swanson, media consultant with Itzontarget, noted, "The government is a leading source of false information. Even so, penalizing people who spread false information raises alarming questions about the difference between propaganda and opinion. Are climate deniers spreading false information or simply another point of view?”

Paul Kyzivat, retired software engineer and Internet standards contributor, noted, "Trying to criminalize this is a slippery slope and bad idea. And I don't think this should be a government function.”

Flynn Ross, associate professor of teacher education, University of South Maine, said, "The legal system is charged with the enforcement of laws, so we need to update slander laws to include internet-related materials; class-action suits could be brought on behalf of the public.”

William Anderson, adjunct professor, School of Information, University of Texas-Austin, replied, "Penalties should be similar to those for yelling ‘fire!’ in a crowded theatre or arena. Any penalties need to apply to the entire country, so federal legislation is needed.”

K.G. Schneider, dean at a public university library, replied, "Government should support the media and First Amendment advocates in their quest for addressing false information.”

Alan D. Mutter, media consultant and faculty at graduate school of journalism, University of California-Berkeley, replied, "I don't trust the government to regulate free expression. Once that happens, our democracy will die.”

Eduardo Villanueva-Mansilla, associate professor, department of communications, Pontificia Universidad Católica del Perú, said, "First, define ‘harmful.’ Then we will have to deal with the interconnectedness of the systems allowing the spread of such information, and the fact that there are no political mechanisms to punish actions taken by a citizen of one nation-state in another nation-state, even if s/he is identifiable. Sanctions between states are limited and dangerous beyond some very specific scope.”

Nate Cardozo, senior staff attorney, Electronic Frontier Foundation, observed, "The government must have no role in punishing the spreading of fake news. Those who spread fake news should face the social consequences of their actions.”

Andee Baker, a retired professor, said, "These people should be penalized with fines, and for serious cases, jail terms.”

Jeff Johnson, professor of computer science, University of San Francisco, replied, "Penalties for that should be loss of account with whatever online service was used to spread the misinformation.”

Federico Pistono, entrepreneur, angel investor and researcher with Hyperloop TT, commented, "It's a difficult balance to strike. Individuals' freedom of expression should be respected at all times, as should the freedom for publishers to ignore or filter false information, similar (but not equal) to the process we reserve for peer-reviewed literature.”

David Sarokin of Sarokin Consulting, author of "Missed Information," said, "There are too many different scenarios to address this, but in general, right to free speech is paramount and should always be protected.”

Paul Gardner-Stephen, senior lecturer, College of Science & Engineering, Flinders University, noted, "The role of governments is vexed, as it is often governments and political parties who are the ultimate source of much of the fake news. Governments should stand up for truth, and where appropriate enact legislation that provides proportionate penalty to the advantage sought through the peddling of fake news. This is very difficult in practice to define in truly reasonable terms, however.”

Richard Rothenberg, professor and associate dean, School of Public Health, Georgia State University, noted, "The social mechanisms are already in place: standing and harm are the criteria, and these can be used. The government should be a plaintiff, when appropriate.”

Virginia Paque, lecturer and researcher of internet governance, DiploFoundation, wrote, "The same laws that govern offline slander, libel, and spread of false information for different purposes (theft/personal/societal/political) should be imposed online and offline. In this case, the medium is not the message. It's just harder to control.”

Tatiana Tosi, netnographer at Plugged Research, commented, "To prevent the distribution of false information, there should be education first; society's rules should then be applied to those who are causing harmful effects.”

Barry Parr, owner of Media Savvy, replied, "There's no way to do this without limiting free inquiry and dissent. Government action would be disastrous to democracy.”

Pamela Rutledge, director of the Media Psychology Research Center, noted, "Penalties should be commensurate with damage. Laws should be updated to reflect online as an extension of offline as social space. Government regulation, however, is a slippery slope toward loss of freedom.”

Richard Lachmann, professor of sociology, State University of New York-Albany, replied, "This should not be a criminal offense. Such purveyors should be vulnerable to civil suits for libel, defamation and damages from the harm their lies cause.”

Noah Grand, a sociology Ph.D., wrote, "Between the fake news sites during the election and Donald Trump’s ‘alternative facts’ I understand why there is a lot of anger toward people who knowingly spread false information. Punishing these deceivers seems very appealing. Unfortunately, punishment won’t do anything about the people who want to be deceived. America’s ‘War on Drugs’ – with its emphasis on punishing suppliers – hasn’t been very effective. There’s always a new supplier who rushes in to fill the demand. Why would we expect something different from a ‘War on False Information’ that targets suppliers?”

Meg Mott, professor of politics at Marlboro College, commented, "We do not need to increase criminal penalties. I would be cautious about tort reforms that make it harder for plaintiffs to sue when they have been harmed by fraudulent speech. I do think government should greatly increase spending in schools, prisons, community centers, and libraries on processing information towards better decision-making.”

Dariusz Jemielniak, professor of organization studies in the department of Management In Networked and Digital Societies (MiNDS), Kozminski University, observed, “Libel and defamation regulations are a good paragon, but in general fake news – pretty much like misleading advertising or air pollution – is harmful to the society as a whole and should be treated with the severity it deserves.”

To return to the survey's for-credit responses home page, with links to all sets, click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_credit.xhtml

To advance to the next set of for-credit responses - those to survey Question 6 - click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_credit_Q6.xhtml

If you wish to read the full survey report with analysis, click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_the_information_environment.xhtml

To read anonymous survey participants' responses with no analysis, click here:
http://www.elon.edu/e-web/imagining/surveys/2017_survey/future_of_information_environment_anon.xhtml