Survey IX: 
The Future of Well-Being in a Tech-Saturated World

Credited responses to the second research question:
What actions might be taken to reduce or eradicate potential harms of digital life to individuals' mental and physical well-being? 

Results released in spring 2018 - To illuminate current attitudes about the likely impacts of digital life on individuals’ well-being in the next decade and to assess what interventions might emerge to help resolve potential challenges, Pew Research Center and Elon University’s Imagining the Internet Center conducted a large-scale canvassing of technology experts, scholars, corporate and public practitioners and other leaders, asking:

Digital life's impacts on well-being: People are using digital tools to solve problems, enhance their lives and improve their productivity. More advances are expected to emerge in the future that are likely to help people lead even better lives. However, there is increasing commentary and research about the effects digital technologies have on individuals’ well-being: their level of stress, their ability to perform well at work and in social settings, their capability to focus their attention, their capacity to modulate their level of connectivity and their general happiness. The questions: 1) Over the next decade, how will changes in digital life impact people’s overall well-being physically and mentally? 2) Do you think there are any actions that might be successfully taken to reduce or eradicate potential harms of digital life to individuals' well-being? Yes or no; what might be done? 3) Please share a brief personal anecdote about how digital life has changed your daily life, your family's life or your friends' lives in regard to well-being. 

About 93% of respondents answered question two affirmatively, saying there are actions that can be taken to reduce or eradicate potential harms of digital life. You will find the written responses to question two submitted by credited respondents listed below the following summary of the common themes found among all responses to the primary research question.

To put things into context, among the key themes emerging from all of the 1,150 respondents' answers to all three research questions were:

CONCERNS
• Digital Deficits: Cognitive abilities, including analytical thinking, memory, focus, processing speed and effectiveness, creativity and mental resilience, are undergoing change.
• Digital Addiction: Internet businesses working to earn attention-economy profits are organized around dopamine-dosing tools designed to hook the public.
• Digital Distrust/Divisiveness: Personal agency is reduced and emotions such as shock, fear, indignation and outrage are being weaponized online, driving divisions and doubts.
• Digital Duress: Information overload + declines in trust and face-to-face skills + poor interface design = rises in stress, anxiety, depression, inactivity and sleeplessness.
• Digital Dangers: The structure of the internet and pace of digital change invite ever-evolving threats to human interaction, security, democracy, jobs, privacy and more.

POTENTIAL REMEDIES
• Reimagine Systems: A revision and re-set of tech approaches and human institutions (their composition, design, goals and processes) will better serve long-term good.
• Reinvent Tech: A reconfiguration of hardware/software to improve human-centered performance can be paired with appropriate applications of emerging technologies such as AI, AR, VR and MR.
• Regulate: Governments and/or industries should effect reforms through agreement on standards, guidelines, codes of conduct, and passage of rules and laws.
• Redesign Media Literacy: Formally educate people of all ages about the impacts of digital life on well-being and the motivations underpinning tech systems, as well as encourage appropriate, healthy uses.
• Recalibrate Expectations: Human-technology coevolution comes at a price; digital life in the 2000s is no different; people must gradually evolve and adjust to these changes.
• Fated to Fail: A share of respondents say all of these remedies may help somewhat, but, mostly due to human nature, it is highly unlikely that these responses will be effective enough.

BENEFITS OF DIGITAL LIFE
• Connection: It links people to people, knowledge, education and entertainment anywhere globally at any time in a nearly frictionless manner.
• Commerce, Government, Society: It revolutionizes civic, business, consumer and personal logistics, opening up a world of opportunity and options.
• Crucial Intelligence: It is essential to tapping into an ever-widening array of health, safety and science resources, tools and services, in real time.
• Contentment: It empowers people to improve, advance or reinvent their lives, allowing them to self-actualize, meet soulmates and make a difference.
• Continuation Toward Quality: Emerging tools will continue to expand the quality and focus of digital life, and the big-picture results will continue to be perceived as a plus overall. 

To read the 86-page official survey report with analysis and find links to other raw data, click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_Home.xhtml

To read the for-credit responses to the main survey question, click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_credit.xhtml

To read a 272-page Expanded Version of the Digital Life and Well-Being report click here:
http://www.elon.edu/docs/e-web/imagining/surveys/2018_survey/Elon_Pew_Digital_Life_and_Well_Being_Report_2018_Expanded_Version.pdf

Written elaborations by credited respondents

Following are the full responses by study participants who chose to take credit for their remarks and write an elaboration in the survey to the secondary question, "Do you think there are any actions that might be successfully taken to reduce or eradicate potential harms of digital life to individuals' well-being? Explain." Some of these are the longer versions of expert responses that are contained in shorter form in the official survey report. This page includes many responses that were not in the official report.

Dana Klisanin, futurist and psychologist at Evolutionary Guidance Media R&D, wrote, "The science of the impact of digital life on our physical, emotional, mental, spiritual and communal lives is in its infancy. This is an area requiring interdisciplinary and transdisciplinary scholarship, and we need more of it. We will use what we learn to mitigate the harm and enhance the benefits. To think that we can do nothing to mitigate the potential harms of digital life is similar to saying that we can do nothing to mitigate the potential harm of pesticides. We can definitely do it; we just need the willingness to take the necessary actions.”

Salvatore Iaconesi, an entrepreneur and business leader based in Europe, said, "Cultural actions. Bringing in arts and design to work not only on providing information and skills, but also to work on the dynamics of desire, imagination and emotion, which are the real behavior-changers."

Sherry Turkle, one of the world’s foremost researchers into human-computer interaction, shared the following action steps: "1) Working with companies in terms of design – [these tools] should not be designed to engage people in the manner of slot machines. 2) [There should be] a movement on every level to make software transparent. This is a large-scale societal goal! 3) Working with companies to collaborate with consumer groups to end practices that are not in the best interests of the commons or of personal integrity. 4) A fundamental revisiting of the question of who owns your information. 5) A fundamental revisiting of the current practice of allowing any kind of advertisement to be placed online (for example, ads that are against legal norms, such as ageist, sexist, racist ads). 6) Far more regulation of political ads online. 7) An admission from online companies that they are not ‘just passive internet services.’ 8) Finding ways to work with them so that they are willing to accept that they can make a great deal of money even if they agree to be called what they are! This is the greatest business, political, social and economic challenge of our time: simply learning to call what we have created what it really is, and then to regulate and manage it accordingly, bringing it into the polity in the place it should really have."

Rich Miller, a practice leader and consultant for digital transformation at Telematica, Inc., wrote, "Thoughtful and pragmatic incorporation of legal mechanisms offers a number of opportunities to improve the situation. 1) Establishment of legal liability for software (particularly embedded software) that 'misbehaves' and cannot be updated in the field. This includes punitive measures for faulty software/systems that endanger life (e.g., medical instruments, healthcare systems). 2) Establishment of effective data privacy legislation, appropriate penalties for non-compliance and effective enforcement."

Louis Rossetto, founder and former editor-in-chief of Wired magazine, said, "The future is not pre-ordained. Of course, courses can be corrected. Will be corrected. It's part of human nature. Nothing is unalloyed good or bad. Indeed, the bad is an intrinsic part of the good. Digital technologies have, on net, been beneficial. But the negative consequences of digital technologies can, are being and will be dealt with. Specifically, new technologies will obviate old problems, create new industries, wipe away old ones. As problems are identified, ‘solutions’ will be proposed. Some will work, some won't. In extremis, political solutions will be applied. In all cases, unintended consequences will occur. In other words, evolution will continue, as it has, for billions of years."

David Wells, a CFO who lives and works in North America, said, "The current efforts underway to police the posting and spread of fake news, untrue accounts or stories will get better and be a positive for online information sharing. More nation states will be isolated and called out on cyberwar, espionage, sponsored hacking, as our corporations slowly get better at these efforts. Privacy stands to give way to security as younger people see the value of trading one liberty for another (freedom through better security and transparency). Internet 3.0 will be about less privacy and more accountability (perhaps an optimist's view)."

Nicholas Carr, well-known author of books and articles on technology and culture, said, "The advertising-based profit models of internet companies encourage design decisions that end up harming the users of the companies' products and services. The companies, therefore, are unlikely to be the source of beneficial changes in design and use patterns. Ultimately what's required – and what's possible – is a broad countercultural movement through which the public questions and rejects the cultural and social hegemony of digital media and the companies that control it."

Michael Kleeman, senior fellow at the University of California-San Diego and board member at the Institute for the Future, wrote, "We might begin by taking digital technology off its pedestal and portraying it as just another profit-driven part of commerce, albeit one that can separate us from those physically close and enable those at a distance to harm us. A focus on what contributes to health and happiness, literally health and literally happiness, as opposed to consumption might let us take advantage of the good and push down the negative impacts."

Jamila Michener, an assistant professor of government at Cornell University, wrote, "As far as mitigating potential harms, the most important steps are as follows: 1) Rigorous research (qualitative and quantitative) to identify harms. We cannot assume they exist or speculate about what they are. 2) Rigorous research to test the effectiveness of various interventions in reducing said harms. 3) Once we have identified real harms and useful interventions, educational institutions, government and others in positions of power need to disseminate information and resources to parents, educators and ordinary people so that they can implement those interventions to the extent possible."

Michael Rogers, a futurist based in North America, said, "We will certainly develop new ways to adapt to the digital environment. The key question: What is the balance of the real and the virtual that will keep us healthy in every sense? Example: I know one large company that now has a ‘remedial social skills course’ for certain new hires. Growing up with asynchronous communication methods like IM and texting means that some adolescents don’t have as much practice with real-time face-to-face communication as did their parents. Thus, for some, tips on how to start a conversation, and how to know a conversation is over, and a bit of practice are helpful. It’s not the fault of the technology; it’s rather that we didn’t realize this might now be a skill that needs to be taught and encouraged. I think we’ll ultimately develop and teach other ways to overcome negative personal and social impacts. The challenge for older people in this process will be to ask ourselves whether, in these interventions, are we protecting important human skills and values, or are we simply being old fogies?"

Gail Brown, an instructional designer based in Australia, wrote, "This is as much about critical thinking and reading as learning. From presidential elections to Facebook and Google unwittingly promoting fake news, things could change. Whether this happens or not depends on the resources put to these efforts and the knowledge of those involved – around both the nature of people and truthfulness, and how to more likely ensure these are inherent in the online world, rather than the reverse situation – where lies and ‘used car salesmen’ selling ‘snake oil’ can be promoted by the online algorithms of large internet organisations."

Jeff Jarvis, a professor at City University of New York Graduate School of Journalism, said, "Every single one of us has the opportunity to improve the Net and the society we build with it every time we share, every time we publish a thought, every time we comment. Those are the interventions that will matter most as we negotiate our norms of behavior in the Net. I have long valued the openness of the Net but I fear I have come to see that such openness inevitably also opens the door to spam, manipulation and trolling. So platforms that value their service and brands are put in the position of compensating for these forces and making decisions about quality and misuse. I prefer to have users and platforms attempt to compensate for bad behavior and regulate themselves, for I do not trust many governments with this role and I fear that a system architected for one benign or beneficent government to act will be used as a precedent for bad governments to intervene."

Vint Cerf, Internet Hall of Fame member and vice president and chief internet evangelist at Google, commented, "We need to help people think more critically about what they encounter in information space (film, radio, TV, newspapers, magazines, online sources, personal interactions...). This needs to be a normal response to information: Where did it come from? Who is providing it? Is there a motivation for the particular position taken? Is there corroborating evidence? We can't automatically filter or qualify all the data coming our way, but we can use our wetware (brains) to do part of that job."

Michel Grossetti, research director at the French National Research Center, said, "One can imagine policies to avoid the increase of social inequalities caused (among others) by the development of electronic communication. For example, to put human intermediaries back into the administrative services that are in contact with the poorest populations."

Yasmin Ibrahim, an associate professor of international business and communications at Queen Mary University of London, said, "The problem is that as digital technologies become seamlessly part of our everyday engagement and mode of living we may not question actions or decisions we make online. Making the internet a healthy space means analysing our modes of being and everyday engagements in the digital realm, and this itself can be stressful. But keeping the internet a space of ideals requires us to do precisely that; to question every action and think about the internet architecture and how our activities are connected to a wider digital ecology of producing and consuming.”

David Weinberger, a senior researcher at Harvard’s Berkman Klein Center for Internet & Society, said, "In addition to the technical affordances and ‘nudges,’ we need to teach our children to be kinder. We also need to learn to be more ‘meta,’ making explicit the norms that geographically local communities can take for granted are shared."

Oscar Gandy, emeritus professor of communication at the University of Pennsylvania, said, "If these many platforms that are providing assistance to people as they attempt to improve their own well-being are required by regulatory oversight to keep the well-being of their users the primary determinant of the recommendations, comparisons, warnings, etc., that they provide, nearly all of us can improve. I have suggested that the market needs an aide to self-management in the area of news and information, where ‘balanced diets’ can be evaluated and improved by a trusted agent. In my view, Facebook is not a trusted agent, and its influence over our information diets is not healthy, in part because of its conflict over whose interests are supposed to be served. In the absence of the emergence of a successful information platform, regulatory oversight that includes assessments of individual and collective harms will have to evaluate the performance of market leaders and exact compensatory payments to support the development of such agents/services."

Perry Hewitt, vice president of marketing and digital strategy at ITHAKA, said, "We're now living with a structural lag between the rapidly advancing technology and the means to regulate it – as a society and as individuals. As the risks become more quantifiable, both governments and individuals will take action."

Sasha Costanza-Chock, associate professor of civic media at MIT, said, "We absolutely need to take actions to mitigate digital harms. Actions are possible at every level, from the personal (adopting better digital security practices), to the interpersonal, to organizational shifts, as well as for entire communities, municipalities, governments, and so on. Harm mitigation can be accomplished through shifts in practice, regulation, policy, litigation, code and design, and norms. For example, there is the growth of the #designjustice approach: Design justice explores how the design of technological objects and systems influences the distribution of risks, harms, and benefits among various groups of people, or in other words how design both reproduces and is reproduced by the matrix of domination (white supremacy, heteropatriarchy, capitalism, and settler colonialism). Design justice is also a growing social movement that focuses on the fair distribution of design’s benefits and burdens; fair and meaningful participation in design decisions; and recognition of community based design traditions, knowledge, and practices."

Michael Roberts, an internet pioneer and Internet Hall of Fame member, commented, "Politics is a lagging indicator, and politicians are just beginning to grapple with the threats posed to democracy and quality of life by the misuse of powerful digital technology and globe-spanning networks. Most politicians are not well-equipped to deal with the task of translating rules and laws developed for an analog world to the emerging digital reality. Many jurisdictions are already well launched into defining behavioral norms for cyberspace and considering appropriate penalties for criminal acts. The social media giants have discovered their original 'hands off' approach doesn't fly when individual users have no ability to deal with the bad guys on their own. Bottom line – steps are already being taken."

James Galvin, a director of strategic relationships and technical standards, said, "I worry that as technology ‘replaces’ people, it will in fact ‘replace’ people. Technology is a tool and should be used as such. In all places where it is deployed, it should be the case that life is improved for people. It should never be the case that people are displaced. Businesses in particular need to embrace the use of technology, but they need to continue to support their employees in the process. This is not an easy problem. It's not as simple as retraining employees for another job, nor is it as simple as forcing employees to find another employer. It is a societal problem, not just the problem of any individual business that wants to improve its efficiency. Every business has a role, but so does every person, in the development of a long-term, mutually satisfying solution."

Craig J. Mathias, principal for the Farpoint Group, wrote, "Apart from education regarding personal responsibility (to use internet-based services in a responsible manner), nothing can be done. We cannot compromise freedom under any circumstances.”

Richard Lachmann, professor of sociology, State University of New York-Albany, said, "We can take actions on a personal and community level. For example, private schools in Silicon Valley keep computers out of elementary school classrooms, and many of those in the industry don't let their young children use such devices. We can do the same. We also can create community websites and publicly owned WiFi networks."

Claudia L'Amoreaux, a digital consultant, wrote, "We've passed through the naive phase of internet optimism and utopian thinking. Issues are on the table. That's a good thing. I am encouraged by the work of people like Tristan Harris, Eli Pariser, Ethan Zuckerman, Sherry Turkle, Yalda Uhls, Zeynep Tufekci to identify and present solutions to the potential harms of digital life facing us – harms to children and in the family, and harms to civil society and democracy. I do think more individuals are becoming aware of the challenges with 24/7 digital life. More people are calling for transparency – in particular, with algorithms. Some solid investigative reporting is happening (e.g., ProPublica's recent piece on discriminatory housing ads on Facebook). The fake-news crisis has sounded an alarm in education that young people today need critical digital literacy, not just digital literacy. And the hearings in Washington post-election with the leaders in the digital industry have exposed deep problems in the way business has been conducted."

Greg Shannon, chief scientist, CERT Division in the Software Engineering Institute at Carnegie Mellon University, commented, "Human nature isn't so fatalistic as to give up. Mileage may vary. Here are some education interventions that already show promise: *Digital literacy *Critical thinking in the digital age *Trust in a digital world. Society needs to demand a digital world that is more secure, private, resilient and accountable."

Stephen Abram, CEO of the Federation of Ontario Public Libraries, wrote, "The digital industry needs to invest in:

• Tools to label potential propaganda, fake news.
• Tools to address hate speech/distribution against any group.
• Tools to address 'fake' actors and accounts on social media.
• Better tools for addressing hacking, attacks, viruses and ad purchases (such as by the Kremlin, etc.) that disrupt life in the real world.

“All of this needs to be done in a way that allows the content to exist and remain findable and addressable. However, this content should not be search-engine-optimized to the top page or pushed, boosted or promoted over higher-quality information. Governments (in concert internationally through the UN or WIPO, etc.) need to invest in:

• A statement of principles and policies that are agreed to internationally, with consequences – for example, words should not be banned to disrupt search, and content should not be locked down – especially content that doesn't align with the political views of the government in power (e.g., climate change, abortion, civil rights, etc.).
• Laws and treaties in all countries protecting the right of access as a human right.”

John Markoff, a fellow at the Center for Advanced Study in the Behavioral Sciences at Stanford University and longtime New York Times technology writer, said, "Science fiction writers have done the best job of outlining the sociology of computer networks and their impact on society generally. Early on, Vernor Vinge wrote ‘True Names.’ It is still one of the best descriptions of the challenges that networks provide for identity and privacy. Reluctantly, I think that there must be a technical solution to the challenge of anonymity and trust. Perhaps an answer lies in blockchain technologies. Also, recently, Danny Hillis has proposed a semantic-knowledge tool that would allow the proving of ‘provenance’ if not truth. He describes this in a paper he is circulating as ‘The Underlay.’"

Annette Markham, professor of information studies and digital design, Aarhus University, Denmark, said, "We can help mitigate some of this stress and anxiety by engaging people to be more conscious of what's happening as well as – and this latter part is critical – more deliberate in establishing and maintaining better habits of digital media consumption. This means more work to develop effective media literacy (media, digital and data literacy), through strategic educational efforts or more informal consciousness raising, using feminist models of the women's liberation movements in the '60s and '70s. I've been wanting to figure out a way to have an international holiday called ‘memory day,’ where we spend time sorting through our own personal 'big data' to see what we've collected and generated throughout the year, to clean up our files and throw away junk, but to also more carefully curate what matters to us. This sort of regular reflection helps people recognize how much they click, store and share, which can in turn help them reflect on what those activities mean to them. Sorting through one's data to commemorate what matters is something that social media platforms like Facebook are happy to do, but are they the best curators for our memories? Tracing, remembering and commemorating can help us slow down, be more deliberative about our digital lives, and be more reflexive about the impact of the internet overall.”

Stowe Boyd, managing director at Work Futures, said, "One of my abiding beliefs is that we are better off when we take an active and intentional approach to living digitally. Rather than being just a passive 'consumer' of digital streams, I feel people are better off through activity: commenting, arguing, sharing and curating. Then, instead of being buffeted by the storms raging online, you can use the blowing winds to fill your sails and set a course."

Chris Morrow, a network security engineer, said, "I don't think that trying to 'intervene' is the right view. People need to realize that balance in their lives is important. Access and information at a wide scale enables people to see, hear, change many things, but at the end of the day they still need to interact with actual people and perform basic tasks in their lives. Trying to force this behavior will not work in the long term; people must realize that they need to balance their use of anything (digital access, food, exercise, etc.)."

Daniel Berleant, author of "The Human Race to the Future," commented, “It is pretty obvious that the legal environment can and does affect the form and degree of how digital technologies impact our lives."

David Myers, a professor of psychology at Hope College, wrote, "Much as humans flourish when living with an optimal work/life balance, so we will flourish when technology serves us without making us its slave. Thus we need creative minds that can help us intentionally manage our time and priorities accordingly."

Andy Williamson, CEO of Democratise, said, "We need better education in information literacy; our school curriculum isn't keeping pace with technology and that's to the detriment of all of us. Information is now constantly permeating so many aspects of our lives that it's too important to leave this to chance – we have to know how to qualify, filter, process and accept or dismiss what we're told. It might also be useful to remind people that it's possible to slow down and that a reply a day late is fine because sometimes it's the quality of the response, rather than the speed, that matters."

Jerry Michalski, founder of the Relationship Economy eXpedition, said, "User-experience design dictates most of what we do. Place a big source of addictive content in the focus of attention and most people will slip into that trap. If our UX designers wise up, they can just as easily design wellness, mindfulness, self-control and other features into the devices we use. It's possible, but the business models that fuel these companies make such steps unlikely."

Mario Morino, chairman at Morino Ventures, LLC, wrote, "There is promise in developing algorithmic and human-based countermeasures to detect, escalate awareness and even blunt or directly attack data pollution/polluters."

Jim Hendler, an artificial intelligence researcher and professor at Rensselaer Polytechnic Institute, wrote, "There is much discussion starting around the ethical issues in new technologies, especially artificial intelligence, and in ‘algorithm accountability.’ I believe that as more algorithms gain some measure of transparency and people's awareness grows, there will be a growing recognition that new technologies depend on the people who deploy them and the public response, not just on the technologies themselves."

Hal Varian, chief economist at Google, commented, "There are lots of experiments that can be tried, such as better forms of crowdsourcing, better validation tools and more attention to critical reading skills in high school and college. We don't know what combination of these experiments will work, so we should try them all."

David Ellis, Ph.D., course director of the Department of Communication Studies at York University-Toronto, said, "There certainly are actions that can be taken to mitigate harms in our digital lives. The challenge is takeup. The first and most important of these actions is: educate thyself. The less people know about the technologies they use, the more likely they are to be victimized in some fashion or constantly confused and frustrated trying to get what they want. The items needing some helpful explanation range from misguided beliefs about privacy, like ‘I’ve got nothing to hide,’ to why VPNs are useful and how they work, along with perspective adjustments about which actors pose a real threat to online welfare. Should hackers top everyone's threat-modeling list or should we leave room up there for Facebook and your ISP?

"Learning about any technology is tough. Digital technologies are especially so, not only because they’re mostly hidden from sight but also because of the industry’s big value proposition: ignorance is bliss, whether it be about privacy policies or the details of how services actually function. Consumers have become so accustomed to hearing that their digital life, indeed all of life, must be effortless in every way that little incentive is left to dig for details, even if doing so might improve their welfare.

"What will it take to make mitigating harms more appealing? For individual consumers, it’s going to take more than blaming our digital woes on the Silicon Valley crowd, however culpable they may be. It’s time to look in the mirror and decide for ourselves what we want from the digital life, now that escape is well nigh impossible. Some may stumble on the incentives they need to conduct their lives differently. But most people will need to be influenced by the trickle-down effects of broad social changes, some planned, others unplanned. In the planned category, one area ripe for change is higher education. On thousands of North American campuses, classroom learning has been radically disrupted by the unfettered use of smartphones and laptops to transport students away from the instructor and the course material.

"The campus takeover by digital and the ensuing plague of inattention has reached crisis proportions. One factor that may shine a cold, clear light on this problem is the discovery by parents of the extent to which their money and family resources are being wasted by their college-age kids. Any potentially reformist ideas will, however, have to face the entrenched assumption by administrators, vendors, students and many educators that more tech in the classroom is always good for business.

"In the unplanned category, a misguided regulatory decision taken in December 2017 shows how unintended consequences and lots of bad publicity can promote progressive change. That would be the Ajit Pai FCC’s repeal of the Open Internet Order, and with it the rejection of Net neutrality as part of the U.S. policy framework for broadband. With the ink barely dry, a storm of protest and threatened legal actions has erupted, suggesting the FCC order was politically shortsighted and likely to backfire on its intended beneficiaries. This war over internet gatekeeping, which promises to rage through 2018 and beyond, has had the desirable outcome of making millions of consumers aware of the harms that can be visited on them by their ISP and what’s at stake in their digital lives when the regulator sees the public interest exclusively through the eyes of the telecom industry. We can reasonably hope that what began as an arcane policy process will prompt lots of skeptical questioning about digital harms and mitigation, whether through advocacy efforts, political action or casual introspection about our digital future. Not an ideal way to promote public education, but definitely the silver lining in Pai’s perverse gesture to ‘internet freedom.’"

Doug Breitbart, co-founder and co-director of The Values Foundation, said, "Technology developed in service to human beings’ experiential generativity and collaboration holds the potential to materially enhance the quality and depth of human connection and mitigate the current isolation and antisocial behavioral imprinting currently reflected in our culture by its use today."

Charles Ess, professor, department of media and communication, University of Oslo, said, "As a humanist and as an educator, the central question is... us. That is, it seems very clear that as these technologies become more comprehensive and complex, they require ever greater conscious attention and reflection on our part in order to ascertain what uses and balances in fact best contribute to individual and social well-being and flourishing. In some ways, this is ancient wisdom – and specifically at the core of the Enlightenment: if we are to escape bondage, we must have the courage to critically think (and feel) and act out of our own (shared) agency. This is the virtue ethics approach taken up by Norbert Wiener at the beginning of computing and cybernetics – and may be enjoying a much-needed renaissance in recent years, not simply among philosophers of technology and designers, but more broadly among ‘users’ themselves. Fairly simply put: The more these technologies both enhance my capabilities and threaten my freedom (e.g., the infinite surveillance possible through the Internet of Things), the more I am required to be aware of their advantages and threats, and to adjust my usage of them accordingly, whether in terms of close attention to, e.g., privacy settings on social media platforms, software and software enhancements (such as browsers and browser extensions, PGP apps, etc.), and/or simple decisions as to whether or not some technological conveniences may simply not be worth the cost in terms of loss of privacy or ‘deskilling’, as in the case of offloading care to carebots. But as these examples suggest, such awareness and attention also require enormous resources of time, attention and some level of technical expertise.

"How to help ‘the many’ acquire these levels of awareness, insight, technical expertise? The Enlightenment answer is, of course, education. A version of this might be ‘media literacy’ – but what is needed is something far more robust than ‘how to use a spreadsheet’ (as important and useful as spreadsheets are). Rather, such a robust media literacy would include explicit attention to the ethical, social, and political dimensions that interweave through all of this – and highlight how such critical attention and conscious responsibility for our technological usages and choices is not just about being more savvy consumers, but, ultimately, engaged citizens in democratic polities and, most grandiosely, human beings pursuing good lives of flourishing in informed and conscious ways. All of that is obviously a lot to demand – both of educational systems and of human beings in general. The ancients – Plato in particular – argued that ‘the many’ were incapable of meeting such demands. The moderns, including Kant and the Enlightenment more broadly, were willing to bet that ‘the many’ could achieve these high demands with the help of suitable education: such achievements would not only foster good lives of flourishing but also specifically make democratic polity possible. To see who is right requires us to do our best to provide such education. Whether we can do so in the current climate is a very open question."

Alex Halavais, director of the MA in Social Technologies, Arizona State University, said, "The primary change needs to come in education. From a very early age, people need to understand how to interact with networked, digital technologies. They need to learn how to use social media, and learn how not to be used by it. They need to understand how to assemble reliable information, and how to detect crap. They need to be able to shape the media they are immersed in. They need to be aware of how algorithms and marketing – and the companies, governments and other organizations that produce them – help to shape the ways in which they see the world. Unfortunately, from preschool to grad school, there isn't a lot of consensus about how this is to be achieved."

Katharina Zweig, professor of computer science at TU Kaiserslautern, said, "We need to develop devices that learn from local information in a truly anonymized way. We also need regulation on how insurers can and cannot incentivize the use of health sensors. Of course, this is only one tiny aspect of the wide field of digital life and health. Other aspects will have to be analyzed in detail as well, e.g., benefits and potential risks of VR and other topics will be of great interest in the future. In general, I am a strong believer in the scientific method to firstly identify chances and risks and to secondly find meaningful ways to steer towards the chances and away from the risks. For me, this is the most promising approach to mitigate the potential harms of any kind of technology."

Peter Levine, associate dean of Tisch College at Tufts University, said, "The EU and the U.S. could take the lead in regulating the big social media platforms in the public interest."

Dewayne Hendricks, CEO of Tetherless Access, said, "Most folks forget that the internet is a ‘network of networks.’ Autonomous networks choose to peer with other such networks. I believe that it's time to do a reset on the global internet and move to a model where trust between peers can be achieved. That is NOT the case now. I personally am spending more time in much smaller peering networks, where you can choose to peer only with those whom you trust. The TCP/IP protocol suite makes it possible to create a multiverse of internets. There need not be only one. Time to explore just what a trust-based internet would look like. I don't believe that the current global internet is sustainable."

Stephen Downes, a senior research officer at the National Research Council Canada, commented, "We have to recognize that people can be harmed through technology and in particular through the exercise of what some call 'free speech' using technology. Just today there was a story of an innocent man being killed in a SWAT raid that resulted from a dispute between two people playing online games. One of them gave a fake address to another, and the other reported the address to police, which resulted in the raid, and the death. This is a tangible harm caused by ‘free speech.’ It was a deliberate act, and the speaker will be held accountable for the consequences. We know that speech harms. We know that spreading false beliefs will lead people to act on those beliefs, often to the point of harming themselves and others. We know that spreading hatred and incitement to violence result in hatred and violence. The right to a peaceful life and enjoyment of society are not superseded by the desire to engage in irresponsible use of free speech. There is a clear case for limits to expression online, and people violating those limits ought to be sanctioned. By the same token, though, people need to become more resilient to the effect of online speech and actions. Many of the calls to violence and racism fall on willing ears, and people who are unable to grow and develop through other more peaceful means bond together in hatred to enjoy community, to advance their position in society, and punish people for their imagined misdeeds. We need to give people more to hope for and, frankly, more to lose. If they have an investment in society they won't be so quick to destroy it. Inequality breeds racism and intolerance, and in turn, feeds on it. We need to reverse this."

Alice E. Marwick, an assistant professor of communication at the University of North Carolina-Chapel Hill and an expert on the impacts of commercial internet applications on identity and social interaction, said, "Generally, technologies tend to amplify pre-existing social conditions. If we invest in mental health care, affordable housing, education and access to health care, social conditions will improve and technology may aid that. If we do not, then people are more likely to have stress and difficulties stemming from their environment, health, workplace, and the like. Rather than focusing on technology as a possible cause of harms, we should work to improve factors that we know affect people's health and wellness."

Anita Salem, a human systems researcher based in North America, commented, "Potential risks can be mitigated by reframing the role of technology and reducing the power of corporations. Technology needs to focus on the whole system, minimize unintended consequences and support big lives rather than big corporations. In addition to marketability, technology should be valued by how well it strengthens human relationships, preserves our planet, bridges inequalities and provides a livable wage, gives voice to the marginalized, develops creativity, supports mental and physical health and increases opportunities for leading a meaningful life. This, however, requires a cataclysmic shift in our economic system."

Jason Hong, professor at the Human Computer Interaction Institute, Carnegie Mellon University, wrote, "There are three big things people can do to take back control of their time and their attention, and improve well-being. The first is to turn off notifications from apps and services. When I sign up for a new service or install a new app, the first thing I do is figure out how to minimize the number of notifications it sends. The default for most apps is to buzz and make loud sounds when it receives a notification. It turns out that you can often block these notifications or set them to arrive silently, so that you don't get interrupted all the time. The second is to decrease use of apps and services that try to monopolize your attention, in particular social media. Learn about the psychological strategies that they use to capture your attention. Put your smartphones away when dining with friends. Also, try reducing your usage of these apps. You'll find that you're not really missing that much if you reduce your use of Facebook or Snapchat to once a week or less. Focus on the here and now, on the people around you right now, rather than the virtual you. The third is to change how you use these apps. Social media is a lot like TV: you can watch it by yourself, or you can use it as an excuse to get friends to watch things together. In one case, TV is isolating, and in the other case, it is bonding. Instead of mindlessly browsing information about acquaintances, use social media to build or maintain strong relationships. Check in directly with close friends to see how they are doing, or use these social media platforms to coordinate meetups with friends."

Jennifer deWinter, associate professor of rhetoric and director of interactive media and game development at Worcester Polytechnic Institute, said, "This is one massive open box. Companies can create reasonable technology policies about communication technologies. Germany has just passed a law that requires platforms to remove hate speech from their sites. This is good. We need to seriously interrogate what we mean by digital democracy and create policies that support and nourish online democratic engagement – one that cannot be policed if that is what we think is valuable. We need to think through policies of hate and online harassment. These things have real health effects on people, yet our justice system doesn't really have a way to intervene in, investigate and prosecute such offenses. We need to think through privacy and data and be explicit when talking with people and educating them about what their rights are. They should have rights. We need to think through geographical power and access to these technologies so that power is not concentrated in certain areas but is dispersed. We need to give people input and control over the algorithms that overdetermine content. [We need to address] the issue of Net neutrality."

Ralph Droms, a technology developer/administrator based in North America, said, "Improve privacy technology to be more protective of personal information; improve education opportunities to give those displaced by technology a better chance for alternative employment."

Rich Ling, professor of media technology at Nanyang Technological University, said, "It is my hope that tools like AI will be able to address some of the abuses that we have seen in, for example, the Russian involvement in the U.S. elections."

Avery Holton, an associate professor of communication at the University of Utah, commented, "We're already seeing a push toward regulating and vanquishing mis- and disinformation and those who spread such discourse. That can greatly decrease levels of stress, confusion and anger that build around such information. We're also seeing a sort of overlap in available app technology, meaning that we'll likely be faced with fewer, not more, apps in the coming years. As Facebook and Instagram adopt the technology of Snapchat, and in some cases of themselves, there are fewer apps competing for our attention. They are also searching for new ways to enhance individual and collective engagement, moving ahead of the experimental curve we've all had to deal with. Sure, this might create a sort of monopolistic view of apps, but it also helps to streamline user experiences."

Sonia Jorge, executive director of the Alliance for Affordable Internet and head of the Web Foundation's Digital Inclusion Program, said, "There are many actions that can be taken to mitigate potential harms of digital life/interactions, and many organizations are working towards ensuring that those are designed thoughtfully and implemented correctly, including the Alliance for Affordable Internet, the Web Foundation, the Internet Society, the Association for Progressive Communications, some corporations and governments (with a number of Scandinavian countries and the European Union being good examples). Such actions include, for example, comprehensive data protection laws (the EU General Data Protection Regulation being a good example), or corporate transparency and accountability standards to increase consumer trust. Some examples include: 1) A4AI has published Suggested Policy Guidelines to Make Public WiFi Work for Users. 2) The Web Foundation has published a whitepaper series entitled ‘Opportunities and risks in emerging technologies,’ which addresses some of these issues and suggests some actions. Other areas of concern are around legal frameworks to ensure that internet-based violence against women is addressed by law enforcement and other agencies. Without such frameworks in place to increase privacy and protection, women will increasingly question the benefit of participating in digital life, as the costs of access may be far too high for many. This is unacceptable; therefore, leaders MUST develop policy solutions to address such situations."

Alan Tabor, an internet advocate based in North America, wrote, "First, we need something like credit reports for digital advertising so we can see what our profiles are on the various media and, second, who is using them and why. This may be a sufficient counter-measure since ‘many eyes make all bugs shallow,’ as it were."

Megan Gray, a regulatory attorney based in North America, said, "We can take steps, as a society, to decrease negative effects, but not eliminate them entirely. Overused statement, but it really is a brave new world."

Daphne Keller, a lawyer who once worked on liability and free-speech issues for a major global technology company, said, "I'm a lawyer, so I think laws matter. If EU law compels platforms to build online content filters, for example, that will a) foreseeably lead to lots of erroneous suppression of lawful information, b) speed the day when filtering technologies are easily available to oppressive regimes around the world, and c) entrench incumbent platforms at the expense of new market entrants. Interventions to shape the law can mitigate harms to digital life. So can pressures on private companies and other powerful actors in the space."

Sy Taffel, senior lecturer in media studies at Massey University, wrote, "Moving away from the corporate model of platform capitalism towards commons and public alternatives that are driven by a desire to build a more equitable and fair society rather than profiteering from the commodification of communication and systematic dataveillance would be a good start at addressing the systemic issues that currently exist. There are a huge number of areas where legislative activity to curb the behaviour of tech corporations can help, and the EU has recently taken a lead in doing this in numerous cases, ranging from prohibiting the use of toxic substances in digital devices to how personal data can be used. The social harm that results from tech corporations' pervasive tax avoidance cannot be overstated either."

Philip Gillingham, Australian Research Council Future Fellow, said, "We need to take a much more critical approach to the use of technology. Human need needs to take the lead in technology development and we need to think through what the unintended consequences of particular technologies might be. For example, do we really want highly paid senior academics trying to process their casual expenses (all afternoon) through an IS, or should we leave it to administrative staff, who are much quicker and more efficient? Just because we can do things with technology does not mean that we should. We need to stop and think."

Anne Collier, consultant and executive at The Net Safety Collaborative, said, "Yes, on the one hand, I think there needs to be greater transparency on media companies' part as to how users' data is being used. On the other hand, I feel regulators and governments need to show greater responsibility in three ways: 1) grow their understanding of how digital media work, of algorithms, machine learning and other tools of ‘big data,’ including the pace of change and innovation, 2) begin to acknowledge that, given the pace of innovation, regulation can't continue to be once and for all, but rather needs a ‘use by’ date, and 3) develop more of a multi-stakeholder rather than a top-down, hierarchical model for regulation. In fact, we all need to think about how regulation needs to be multi-dimensional (including self- and peer-to-peer) and how all the stakeholders need to collaborate rather than work from an adversarial approach."

Mike Liebhold, senior researcher and distinguished fellow at the Institute for the Future, wrote, "The most important civic actions to mitigate potential harms of digital life are: 1) Continuous education for citizens on critical thinking skills and cyber-secure behaviors. 2) Continuous education for well-being professionals and practitioners on effective application of technology, best practices for privacy and security. 3) Continuous education of technologists on design and operations for quality of care, privacy and security. 4) Government policies providing lifelong UBA (Universal Basic Access to health, education, livelihood)."

Bill Lehr, a research scientist and economist at MIT, wrote, "Anonymous commentary has done great damage, on balance, to the quality of public discourse. Things like cyber-bullying and fake news would be less of a problem if those who offer opinions were more often held accountable for their thoughts. I am a fan of First Amendment protections and recognize the importance of anonymity in protecting privacy, but I think we will have to give up on some of this. This is just one example of something immediate that could be done to improve digital life."

Bob Metcalfe, Internet Hall of Fame member, Ethernet innovator and professor of innovation at the University of Texas-Austin, wrote, "‘Interventions’ are not what's needed, but a competitive evolution of the tools, now ongoing, with Facebook and Twitter defending their flaws."

Daniel Weitzner, principal research scientist, MIT Internet Policy Research Initiative, commented, "When interacting online, we need to know whether we are dealing with real people, and those people need to be held accountable (sometimes socially, sometimes legally) for the truth and integrity of their words and actions. As an alternative to censoring speech or controlling individual associations, we should look to increasing accountability while recognizing that sometimes anonymity is necessary, too. And, when platform providers (i.e., advertisers and others) operate platforms for profit, we should consider what mix of social and legal controls can provide the right measure of accountability."

Amy Webb, futurist and professor of strategic foresight at the New York University Stern School of Business, wrote, "The exponential advancement of technology is now outpacing our ability to adapt. Most people haven't developed the digital street smarts required to be a good citizen of the internet. Anyone over the age of 30 knows what the world was like before we had ubiquitous internet, smart devices, social networks and cable news. Which means that at some point, we were slower consumers of information. But we've fetishized the future and forgotten our recent past.

"We can choose to improve the quality of our digital experiences by forcing ourselves to be more critical of the information we consume, whether that's a friend's enviable Instagram posts or incendiary political news. The world we see looking only through the lens of a single post never reveals all of the circumstances, context and detail. Schools must teach digital street smarts, which includes digital literacy, beginning in elementary school. We are well past teaching kids that Wikipedia isn't a reliable primary source of information. From an early age, kids should learn about bots and automatically generated content. They should have provocative ethics conversations – with their peers, not just their parents – about online content and about technology in general.

"Content distributors must stop asserting that they are merely platforms. Algorithms are continuously making decisions about what users see and when, and, while that process may be automated, the code is written by humans. As we enter the next era of computing – the Artificial Intelligence era – we must examine and make transparent how platforms make decisions on our behalf. Those decisions impact our health and well-being in the digital realm."

Bob Frankston, a technologist based in North America, said, "There is a constant tension between investors wanting to capture all the value and own the future and the enabling technologies that aren't highly profitable but which can improve lives and allow all of us to be not just participants but contributors."

Douglas Massey, a professor of sociology and public affairs at Princeton University, wrote, "I am not very optimistic that democratically elected governments will be able to regulate the internet and social media in ways that benefit the many rather than the few, given the vast amounts of money and power that are at stake and outside the control of any single government, and intergovernmental organizations are too weak at this point to have any hope of influence. The Trump Administration's repeal of Net neutrality is certainly not a good sign."

Mark Richmond, an internet pioneer and systems engineer for the U.S. government, wrote, "I'm concerned that the more people try to fix things, the more problems are caused. Regulation, deregulation, censorship, openness, filtering, verifying, no matter what you call it. With the best of intentions, people have proposed requiring real identification for online posters, for example. The downside is the risk of repression, censorship, discrimination and marginalization. To make it worse, overcoming such a requirement is a trivial matter for anyone determined. It just makes it harder on the honest. Protections against the misuse of the technology must continue to be developed – financial transactions, privacy concerns, all of those, of course. But that's a transactional change, not a foundational change. The foundation of the internet really must remain one of providing a billion soap boxes for a billion points of view."

Cliff Zukin, a professor and survey researcher at Rutgers University, commented, "Simply put, I believe the technology governs. It is a variant of McLuhan's ‘the medium is the message.’ It continues the argument of Neil Postman's in ‘Amusing Ourselves to Death.’ People send the pictures and go on Facebook because they can, not because there is any real content involved. Over time, that becomes the communication and a new normal evolves."

Thad Hall, research scientist and coauthor of the forthcoming book "Politics for a Connected American Public" (Oxford University Press), commented, "My concern is that the battle over digital life is a competition where one side is using addiction-psychology models to get people addicted to their devices and the apps on them and the ability of people to resist these temptations is questionable. In addition, the ability of people to use the technology for nefarious purposes – creating fake information, especially high-level information like video and audio – and the internet to spread this information is going to create ongoing problems that will be very difficult to address."

Colin Tredoux, a professor of psychology at the University of Cape Town, commented, "Digital technology is just about uncontrollable. There are myriad examples of this. The internet was designed to be robust to local disruption (or control), and the many examples of hacked banking, government, health and education sites show that it is not possible to provide meaningful control except at the cost of draconian measures as in Iran at the moment, or China, and even those will likely fail. Some military protocols now require computers to be offline. We will have to live with the bad while enjoying the good. It is not clear that we can do anything meaningful to ensure that the good outweighs the bad."

William J. Ward, president of DR4WARD, said, "There can be no actions taken to mitigate potential harms of digital lives that are mandated or imposed on people. Each individual must discover for themselves that digital is harming their physical/mental health and relationships. Only when they come to this personal realization will they be able to seek out solutions and make changes in their life to reduce the harm and mitigate the damage caused by digital."

Gianluca Demartini, a senior lecturer in data science at the University of Queensland, commented, "People will naturally adapt to the ubiquitous and growing presence of digital technology in daily life. This may take a generation to happen where the older generation may struggle to adapt and will perceive some of the potential harms and the younger generation will naturally benefit from the positive aspects."

Alice Tong, a writer based in North America, said, "We all have free will and if someone wants to do something, we cannot stop them, not digitally. What will be important is to promote the idea of non-digital life to people starting at a young age. Make it known that living a non-digital life, too, is a must for balance."

Heywood Sloane, partner and co-founder of HealthStyles.net, said, "Interventions, no! That implies someone believes they know where this is all headed. The risk of unintended consequences is higher than we can possibly understand or appreciate. Learning to use the best of it and avoid the worst of it – with experience over time – is quite possible."

Tom Massingham, a business owner based in North America, wrote, "I just can’t think of a possible intervention. It seems like a creature growing, and out of control."

Shahab Khan, CEO of PLANWEL, said, "This is a natural and evolutionary process and our world will get used to it. However, due to the changing paradigm we need to commence capacity-building programs for our youth to become ready for the requirements of the digital world."

Charlie Firestone, executive director of the Aspen Institute Communications and Society Program, said, "As tech companies get bigger and bigger it is really only government that can form an effective counterforce. At every level, government programs, ideally in partnership with business and civil society, have a role. 1) In cyberwarfare, government is our first level of protection against state-level (or equivalent) attacks. Hacking of the Internet of Things could shut cities down and have other disastrous consequences. In cybercrime (including identity theft), we need government to protect and enforce laws aimed at protecting citizens and businesses. We also need antitrust and regulatory enforcement against abuses in business such as anti-competitive behaviors, fraud, misrepresentation and discrimination. 2) Another major area to think about is advances in artificial intelligence, genetics and robotics. We need vigilance on how these technologies are advancing, but the right point for governmental intervention is very difficult to identify. We don't want to stifle innovation or investment, but we can't wait so long that a disastrous outcome becomes unavoidable. We need more attention to that issue. 3) The question of data ownership is extremely significant to both business models and individual autonomy. I am hopeful that blockchain technology or other means will enable a move towards more personal ownership of our own information, recognizing at the same time that we can't and perhaps even shouldn't control all public information about ourselves."

Brad Templeton, software architect, civil rights advocate, entrepreneur and internet pioneer, wrote, "The key action is to identify when things are not working well, do research, and then work to fix it in the design of the next generation of products. First generations will continue to tend to have unintended consequences. You can't have innovation without that."

Gus Hosein, executive director of Privacy International, a London-based nonprofit, wrote, "We can’t continue down this path because we can’t continue to be this stupid. I’m mostly speaking at a security and privacy level but I also hope it applies at a competition level too. We are building a very unresilient socio-technical infrastructure that we are coming to rely upon ever more. This is insanity by definition as we’ve seen all these problems before and somehow we still invest with the thought that what happened before won’t happen again: breaches due to inattention and lack of care of systems, domination by few companies who have vast access to insights into our lives and markets, governments intervening only to advance their own interests to gather intelligence."

Ethan Zuckerman, director of the Center for Civic Media at MIT, wrote, "The platforms we use are often actively hostile towards attempts to make them kinder and less harmful for users. A new category of innovators is starting to build complementary systems that allow users of these systems to improve how they use them. I see great promise in users taking responsibility for their health within the systems we use."

Paul Saffo, a leading Silicon Valley-based technological forecaster and consulting professor in the School of Engineering at Stanford University, said, "It is tempting to list the myriad specific steps we must take, such as changing the rules of anonymity on social media and fine-tuning human abilities to discriminate the artificial from the real. However, all of those steps are but footnotes in a more fundamental challenge. We are tuned to feel empathy for individuals, but empathy doesn't scale. As Stalin put it, ‘a single death is a tragedy; a million deaths is a statistic.’ We must find a way to scale empathy. We must find a way to use digital media to cause individual humans to have empathy for the multitude, and ultimately for the entire planet."

Jillian C. York, director for International Freedom of Expression at the Electronic Frontier Foundation, said, "Interventions to mitigate the harms of digital life are possible, but they require a commitment to holistic solutions. We can't simply rely on technology to mitigate the harms of technology; rather, we must look at our educational systems, our political and economic systems – therein lie the solutions."

James M. Hinton, an author, commented, "I strongly believe the de-commercialization of people's online presence could only be of benefit to the private individual. Having the freedom to know that your interests can be expressed without being collated by large, monetized interests with the intent of capitalizing on you either directly or by essentially selling you to other interests will enable people to connect more comfortably, with a sense of security in ‘exposing themselves’ to a community online that is there for them. Unfortunately, the trends throughout history run in directions quite opposite of this sort of thing. People's online lives, from data searches to chat activities to shopping histories, have been, and will continue to be, viewed as a valid commercial opportunity to be profited through, until eventually there will come a time that your only value online, much as it is in the offline world, is what you can make for others, and if you have little value to commercial interests you will be afforded little opportunity online. As evidence I offer you the debate over Net neutrality currently ongoing. So, are there things that can be done? Most certainly! Will they? No, I see no hope of that."

Seth Finkelstein, consulting programmer at Finkelstein Consulting, wrote, "We desperately need legal protections to redress the imbalance of power between large corporations and ordinary people. We're at a stage for ‘connectivity’ now comparable to the early days of ‘industrialization.’ Back then, there was the idea of ‘If you accept the job, you accept the risk.’ That meant if you were maimed or killed by factory machinery, too bad. It wasn't the problem of the ‘platform owner,’ err, company. And a corresponding type of establishment apologist would similarly offer tips and exhortations to always be watchful in dangerous areas, but eventually just wring their hands that nothing could be done about carelessness or bad luck, and anyway to even try would kill precious start-ups, err, industrial spirit.

"An illuminating example of not believing in technological determinism is the issue of copyright. Just as an observation, without taking a position myself on whether the ultimate result is true or not, big media company owners of copyrights believe very many things can be done to mitigate the effects of digital life on their business model. They are not simply throwing up their hands and saying nothing can be done, and we shouldn't even try for fear of consequences. Only the little people get that line. The fact that the United States has an extremely weak labor movement, and a press where there's little besides corporate interests, means that this discussion takes place in the U.S. in a very skewed way. It becomes a very financially-oriented framework, such as proposing property rights for individuals in their data, or viewing the harms as a market opportunity for other companies.

"Privacy and data protection laws run into the problem that fundamentally they restrict the ability of a large corporation to profit somehow, which is difficult when politics is dominated by money. But on the other hand, there are frequent calls now for monopoly media companies to use their immense power to directly marginalize fringe ideas. After years of hearing hucksters touting the internet as letting everyone have a voice, I find it darkly amusing that it's become a moral panic the instant such hucksters weren't the ones shouting loudest.

"There's not enough space to do a full analysis here. But briefly, I think that conflict is a symptom of a dysfunction in what's supported overall by the social system. If there's only an economy of attention-seeking outrage, that's the problem itself, not having someone pick the correct winner among all the outrage-mongers."

John Sniadowski, CEO of Riverside Internet, Wales, commented, "It should be possible using machine learning neural networks to provide personal digital assistants (PDAs) to individuals to help them cope with online interactions. Machine learning can help prevent the distribution of fake news and warn people of poor content. However, personal digital assistants themselves will need a large number of built-in safeguards to prevent personal information being disclosed to unauthorised third parties. How PDAs can be implemented is something of a challenge. They should probably be provided by not-for-profit companies that are either paid for by the individual or subsidised by ISPs or government support. There should be a global registry of companies providing such services and they must under no circumstances provide free services based on the individual concerned giving up any control of their personal information."
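A minimal sketch of the kind of content-warning assistant Sniadowski describes might look like the following. The low-trust domain list, the sensational-word heuristic and the two-word threshold are all invented for illustration; a real PDA would use trained models and far richer signals.

```python
# Hypothetical sketch of a personal digital assistant (PDA) content advisor.
# The source lists, thresholds and scoring rules below are illustrative
# assumptions, not any real product's behavior.
from urllib.parse import urlparse

class ContentAdvisor:
    def __init__(self, low_trust_domains, sensational_words):
        self.low_trust = set(low_trust_domains)
        self.sensational = set(sensational_words)

    def assess(self, url, headline):
        """Return a list of warnings for a shared link; empty means no flags."""
        warnings = []
        domain = urlparse(url).netloc.lower()
        if domain in self.low_trust:
            warnings.append(f"domain '{domain}' is on your low-trust list")
        hits = [w for w in headline.lower().split() if w in self.sensational]
        if len(hits) >= 2:  # assumed threshold for "poor content" warning
            warnings.append("headline uses sensational language: " + ", ".join(hits))
        return warnings

advisor = ContentAdvisor(
    low_trust_domains=["totally-real-news.example"],
    sensational_words={"shocking", "miracle", "exposed", "secret"},
)
print(advisor.assess("https://totally-real-news.example/story",
                     "SHOCKING secret cure EXPOSED"))
```

Keeping the lists user-supplied, as here, matches his safeguard that the individual rather than a monetizing third party controls the assistant's behavior.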

Ian Peter, an internet advocate and voice of the people, commented, "Interventions are particularly necessary in areas such as cybersecurity and cyberwarfare: the internet can definitely be made more secure. There are regulatory measures that can assist with many other problems, such as fake news, algorithmic injustices, privacy breaches and market domination via breakdowns in Net neutrality or unregulated market dominance. All these things can be improved by regulatory measures; whether they will be is another matter."

Diana L. Ascher, co-founder of the Information Ethics & Equity Institute, wrote, "Actions certainly can be taken to mitigate the potential harms of digital life, but to do so will require moving beyond partisanship and (re)defining the values by which we wish to live. Legislative moves that make it possible for powerful entities to limit the capabilities and opportunities of the powerless, such as repealing the common-carrier classification of internet service providers, have disproportionately negative effects on under-represented populations."

Douglas Rushkoff, a professor of media at City University of New York, said, "The companies would have to adopt different profit models, based on revenue rather than growth. They would have to decide whether the future of the species is important to them. Most see humans as the problem, and technology as the solution. They seek to replace humanity before the environment is destroyed, or to get off the planet before that happens. If, instead, they decided to align with humanity, our species could indeed survive another century or more. If they continue to see humans as the enemy, we don't have much longer."

Ginger Paque, a lecturer and researcher with DiploFoundation, wrote, "We must and we will learn to take individual and family responsibility for the kind of lives we want online, as traditionally we have done offline. We have to realise that we cannot cede our decision-making to our governments or even society. Online dangers (lack of privacy, cyber-bullying, harassment and so many more) have gotten worse, and have even exacerbated offline dangers, or catalysed terrible offline consequences. As campaigns like #MeToo gather force and are replicated in other areas, social, parental and personal responsibility will necessarily be re-learned and used for better personal responsibility and management. If governments don't respect or protect their individual citizen users, citizens will have to exercise their rights and protect themselves from evildoers, violators of our rights, business and sometimes our own governments."

Paul Jones, a professor of information science at the University of North Carolina-Chapel Hill and internet pioneer, wrote, "People are already seeking attention-management services; Freedom.to is my favorite. I expect that just saying ‘no calls’ to Assistant or Alexa or Siri is the equivalent of taking the phone off the hook in the 1960s."

Jamais Cascio, a distinguished fellow at the Institute for the Future, said, "We will find a combination of behavioral norms, regulation and technology that will help to minimize or mitigate potential harms of digital social media. I'm equally certain that these changes – alone or in combination – will in turn produce unintended results that could be seen as harmful. This shouldn't be seen as discouraging, but it should be recognized. Many of the problems we see with digital social media come from previously accepted (or at least tolerated) behaviors that have their impacts magnified by hyperconnectivity. The capacity for marginal groups to get together online has been remarkably beneficial for some oppressed communities but just as helpful for fringe pathologies. Statements that would once have been heard or seen by a few now can reach audiences of millions. In parallel to this, statements that would once have been heard or seen in passing now have digital persistence, even functional immortality, enabling them to have an influence lasting well beyond the moment they are stated."

Henning Schulzrinne, Internet Hall of Fame member and professor at Columbia University, commented, "Twitter and Facebook, as well as specialty social networks frequented by teenagers, are likely to have the largest potentially negative impact. All of them can foster an environment that discourages harmful effects such as personal attacks and, for content targeted at teenagers, avoids interaction models, such as ‘like’ counts, that are likely to increase negative affect and depression. For example, Twitter users should be able to limit their posts to verified accounts that have not received complaints about inappropriate content or personal attacks."
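The reply-gating rule Schulzrinne suggests can be sketched as a simple filter. The field names and the zero-complaint threshold are assumptions for illustration, not any platform's actual API.

```python
# Toy sketch of gating visible replies to verified accounts with no upheld
# complaints, per Schulzrinne's suggestion. Field names and the complaint
# threshold are illustrative assumptions.
def visible_replies(replies, require_verified=True, max_complaints=0):
    """Return the text of replies that pass the user's gating rules."""
    return [r["text"] for r in replies
            if (r["verified"] or not require_verified)
            and r["complaints"] <= max_complaints]

replies = [
    {"text": "thoughtful reply", "verified": True, "complaints": 0},
    {"text": "drive-by insult", "verified": False, "complaints": 3},
    {"text": "verified but abusive", "verified": True, "complaints": 2},
]
print(visible_replies(replies))
```

Note that the complaint check applies even to verified accounts, reflecting his point that verification alone does not prevent personal attacks.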

Marc Rotenberg, director of a major digital civil rights organization, wrote, "The initial hurdle in all such challenges will be to overcome technological determinism. This is the modern-day religion of acquiescence that stifles reason, choice and freedom."

Alejandro Pisanty, a professor at Universidad Nacional Autonoma de Mexico and longtime leading participant in the activities of the Internet Society, wrote, "An open, public, civil, rational discussion of principles guiding systems design and implementation will become critical. All stakeholders must be afforded a chance to participate meaningfully, in a timely and relevant manner. The most important intervention is to help, nudge or even force people to THINK, think before we click, think before we propagate news, think before we act. Some regulatory actions inviting information disclosure by corporations and government may be helpful but will fall on fallow ground if people are not awake and aware. Second: transparency to a reasonable extent will continue to be necessary, so the basis of decisions made by systems can be understood by people, and people and organizations can in turn test the systems and adjust their responses.”

Nathaniel Borenstein, chief scientist at Mimecast, said, "Most obviously, rigorously enforced Net neutrality would prevent many of the worst outcomes. More positively, I think we can develop spiritual and philosophical disciplines that will help people get the most out of these technologies, and will help people develop in ways that minimize the chances that they become cyberbullies or other cybermisfits."

Shel Israel, CEO of the Transformation Group, said, "The issue becomes one of public policy and government regulation. My concern is the quality of such policies is dependent upon the quality of government, which at this moment in time is pretty discouraging."

Marshall Kirkpatrick, product director, Influencer Marketing, said, "We can all help create a culture that celebrates thoughtfulness, appreciation of self and others and use of networked technologies for the benefit of ourselves and the network. We can create a culture that points away from the exploitive mercenary cynicism of ‘Hooked’ growth-hacking."

Bart Knijnenburg, assistant professor, Clemson University, said, "An important side-effect of our digital life is that it is observable and amenable to research. This aspect is slowly but steadily revolutionizing the fields of psychology, sociology and anthropology. The available data is so vast that we can now study subtle phenomena and small sub-populations (e.g., underserved minorities) in increasing detail. If insights from the ‘digital humanities’ can be fed back into the development of online technologies, this can help mitigate the potential harms of digital life."

David J. Krieger, director of the Institute for Communication & Leadership, Lucerne, Switzerland, observed, "Generally society and its organizations should proactively move away from the established solutions to problems as they were defined in the industrial age and try innovative forms of networking, sharing and management of information."

Daniel Schultz, senior creative technologist at the Internet Archive, commented, "Technology is built by humans, and in the best cases it is designed for humans. There are some areas where unintended consequences of certain design decisions have become so dramatic that the fabric of our society feels like it might unravel (e.g., social media/Twitter/bots and vitriolic interactions/etc.) but I feel confident that it is possible to correct these problems through changes to the technologies themselves to account for newly discovered needs as well as a newly recognized need for a more informed/trained user base. I imagine that people weren't driving 70 miles per hour when the Model T came out; society had time to adapt and evolve. We haven't had this luxury with the internet, but that doesn't mean it's too late for us to catch up with the pace of innovation."

Fred Davis, a futurist/consultant based in North America, wrote, "Digital life is still relatively new and we are just beginning to understand its effects. Awareness is growing quickly about the negative effects and as more is known there will be more known about how to combat or lessen these effects. People are already taking breaks from social media. Social media as it is today magnifies and intensifies moral outrage, which is unhealthy and has led to breaking up friendships and relationships. As people become more aware of this negative feedback loop they will find ways to intervene or at least recognize the phenomenon and take steps to put things in better perspective."

Steve Stroh, technology journalist, said, "It amazes me that in 2018, we are still subject to email scammers, phishing, etc. Our ‘systems’ ought to be better able to protect us. I happen to use Gmail as my primary email, and although I'm sure that Gmail is stopping a lot of scam email and I never see it, there's a lot that does get through. That said, I've seen other email systems that are very routinely hacked / email accounts taken over – AOL, Yahoo!, etc. In comparison, Gmail does a very, very good job. Organizations that choose to acquire personal information that is not voluntarily disclosed should be held liable if that information is leaked / stolen. I'm thinking of the many recent hacks of retailers, and especially the recent credit bureau debacle. I did not agree to have a retailer retain my (bank) credit card information. I did not individually disclose to a credit bureau my personal information (for them to retain). In most of the cases that I've heard of, the disclosure of personal information was due to negligence on the part of the organization – they were lazy, or cheap, or incompetent. If they were held liable – by regulatory agencies, or sued in a class action lawsuit, THEN they would start caring."

Michael R. Nelson, public policy expert with Cloudflare, said, "The most important government intervention is to avoid regulations or lawsuits that would lead to less competition in the IT and telecommunications sectors. Competition drives innovation and leads to more solutions to meet the varied needs of consumers. Too often governments try to pre-select a favored solution, when finding ways to encourage competitive markets that deliver competing solutions is a much better goal."

Micah Altman, head scientist for the program for information science at MIT, said, "Information technology is often disruptive and far faster than the evolution of markets, norms and law. This increases the uncertainty of predicting the effects of technological choices but doesn't render such predictions useless, nor prevent us from observing these effects and reacting to them. Furthermore, we know enough to effectively design substantial elements of privacy, security, individual control, explainability and auditability into technical systems if we choose to do so. How will specific technology choices affect individuals and society? We do not always know the answers to technology questions in advance. But we can choose now to design into our systems the ability for society and individuals to ask these questions and receive meaningful answers."

Christopher Richter, an associate professor at Hollins University, wrote, "Net neutrality was a good start – now gone. That said, ‘well meaning interventions can also have unintended consequences for good or ill.’"

Jeff Johnson, a professor of computer science at the University of San Francisco, said, "We can call on Congress to overturn the FCC's recent decision to eliminate Net neutrality. Net neutrality is a principle that is based on the long-standing concept of Common Carriers, in which certain parcel and information carriers are required to be impartial about the information that they transport. We can raise our children to not be addicted to their smart phones and the internet as the current generation of young people is. Just as many families limited their children's TV time during the 50s, 60s, 70s, and 80s, we can limit our children's internet time, texting time and video-game time."

James Scofield O'Rourke IV, professor of management at the University of Notre Dame, said, "If technology has placed us in danger it can remediate, obviate or eliminate that danger. I have great faith in technology, but somewhat less faith in the nature of the humans who employ it. I remain ever hopeful, though, that we can invent our way out of the dangers we have created."

Eelco Herder, an assistant professor of computer science whose focus is on personalization and privacy, Radboud Universiteit Nijmegen, the Netherlands, wrote, "The main intervention needed to mitigate potential harms of digital life is to prevent or limit current interventions that partially lock us in a filter bubble. Social media platforms have several strategies for maximizing interaction, including selecting mainly content that confirms a user's current opinions, negative content (which invites more discussion and comments than positive content), and updates from very close friends (with whom we already communicate a lot). Facebook now seems to have recognized that in order to remain relevant, they will need to provide more balanced and diverse feeds. In addition, I believe that users need to be more in control regarding content that is currently largely automatically selected for them. We do need information filtering, but each user needs to be able to influence how this is done."
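One way to read Herder's proposal is as a user-tunable re-ranking step: the user controls a single diversity weight that trades raw engagement against topic repetition. The greedy scheme, the engagement scores and the topic labels below are all invented for the sketch; production feeds use far richer signals.

```python
# Illustrative sketch of user-controllable feed diversification: a greedy
# re-ranker that penalizes topics already shown. Scores, topics and the
# diversity weight are assumptions made up for this example.
def rerank(items, diversity_weight=0.5, k=3):
    """Greedily pick k items, trading engagement against topic repetition."""
    chosen, seen_topics = [], {}
    pool = list(items)  # copy so the caller's feed is untouched
    for _ in range(min(k, len(pool))):
        def score(it):
            penalty = seen_topics.get(it["topic"], 0)
            return it["engagement"] - diversity_weight * penalty
        best = max(pool, key=score)
        pool.remove(best)
        chosen.append(best["id"])
        seen_topics[best["topic"]] = seen_topics.get(best["topic"], 0) + 1
    return chosen

feed = [
    {"id": "a", "topic": "politics", "engagement": 0.9},
    {"id": "b", "topic": "politics", "engagement": 0.8},
    {"id": "c", "topic": "science", "engagement": 0.6},
    {"id": "d", "topic": "local", "engagement": 0.5},
]
print(rerank(feed))  # with the default weight, a second politics item is demoted
```

Setting `diversity_weight=0` recovers a pure engagement ranking, which is exactly the influence over filtering that Herder argues each user should have.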

Ross Rader, vice president for customer experience, Tucows Inc, said, "We will see more and more social pressure employed on companies as to the secondary costs of their innovation, and companies – the good ones – will embrace this as a social responsibility and work to absorb those costs to the extent feasible. We – society on Earth – are developing an awareness of what secondary costs look like, why they can be negative and why they can't be left untended. As we are learning how to mitigate these costs in legacy markets like agriculture, energy and finance, I believe that we will apply those lessons in other sectors and avoid the huge sunk-costs problem that we've let develop over the last few hundred years as our population has ballooned."

Susan Price, lead experience strategist at USAA, commented, "Technology is disrupting interaction patterns that took millennia to evolve. We can use human-centered technology design to improve our experiences and outcomes, to better serve us. I have a vision for a human API that allows us to moderate and throttle what occupies our attention – guided by principles and rules in each user’s direct control, with a model and framework that prioritizes and categorizes content as it reaches our awareness – to reduce effort and cognitive load in line with our own expressed goals and objectives. Today we cede that power to an array of commercial vendors and providers."
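The "human API" Price envisions can be sketched as a gate that applies user-owned rules before anything reaches awareness. The rule set here (quiet hours, categories exempt from them) is a hypothetical illustration of the idea, not a description of any real system.

```python
# Hedged sketch of a user-controlled attention gate: rules owned by the user
# decide whether a notification is delivered now or deferred. The quiet-hours
# window and exempt categories are assumptions for illustration.
from datetime import time

class AttentionGate:
    def __init__(self, quiet_start=time(21, 0), quiet_end=time(8, 0),
                 allowed_in_quiet=frozenset({"family", "emergency"})):
        self.quiet_start, self.quiet_end = quiet_start, quiet_end
        self.allowed_in_quiet = allowed_in_quiet

    def in_quiet_hours(self, now):
        # the quiet window wraps past midnight (21:00 to 08:00)
        return now >= self.quiet_start or now < self.quiet_end

    def decide(self, category, now):
        """Return 'deliver' or 'defer' under the user's rules."""
        if self.in_quiet_hours(now) and category not in self.allowed_in_quiet:
            return "defer"
        return "deliver"

gate = AttentionGate()
print(gate.decide("marketing", time(22, 30)))  # during quiet hours
print(gate.decide("family", time(22, 30)))
```

The point of the design is where the rules live: in the user's direct control, rather than with the vendors Price says we currently cede that power to.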

Kevin J. Payne, founder of Chronic Cow LLC, said, "The internet is an environment – distributed and digital – but an environment, nonetheless. Ample research now supports the assertion that one of the surest ways to influence behavior is simply to alter the environment to one that rewards ‘positive’ behaviors and punishes ‘negative’ behaviors (although, of course, there's an endless debate surrounding which behaviors to label as positive or negative). There's also quite a bit of research yet to do to understand which factors we can manipulate and how in order to optimize success. Not to mention the ethical dilemmas that arise. However, with big data and intelligent, adaptive, prescriptive algorithms, we should technically be able to achieve the required, targeted nuance. The question remains as to whether we should. And, if so, how far we should go and who would oversee."

Jordan LaBouff, associate professor of psychology at the University of Maine, commented, "In short, the idea that we can't shape our behavior to be more helpful and less harmful is just wrong. We can always investigate a situation, recognize harms and work to reduce those – and we should."

Pete Cranston, a Europe-based trainer and consultant on digital technology and software applications, wrote, "Technology can and needs to be managed. People can and need to have access to learning and education about the benefits and risks of technology. The constraints are people's time and the huge resources at the disposal of malicious and criminal online behaviour."

David A. Bernstein, a retired market researcher and consultant, said, "Just as we have come up with interventions for alcoholism and smoking, we can certainly develop behavioral-modification strategies to handle internet addiction. However, just like any typical behavior modification program, if the person does not want to change there will be no change. Taking pre-emptive measures to protect society from potential harms associated with digital life crosses into the realm of civil rights and personal freedom. Yet, if science can provide sufficient evidence of a specific harm and its causes, the government has the right – just as it did when seat belts were introduced in the early 1960s – to work with companies (e.g., ISPs, manufacturers, software companies, etc.) to introduce protection mechanisms. Local, state and federal governments can also implement laws and regulations to help support such actions for public safety. I believe we are seeing the leading edge of this with local laws prohibiting the use of cell phones while driving."

Daureen Nesdill, research data management expert based at the University of Utah, said, "Research is already underway to understand the effects of technology on our well being. We need to translate the research into action."

Simeon Yates, professor of digital culture at the University of Liverpool, wrote, "I would have once argued vehemently against this, but we need to start looking at how we regulate the internet. It is not just the horrors of hate online, nor ‘fake news,’ but more importantly how we choose to solve social problems with (in part) technology. As I have argued repeatedly in my recent secondment and joint research with UK government and local government agencies – technologies are never the solution. Technologies embedded and developed with appropriate regulation are the key to delivering good outcomes. For example, how will we regulate aspects of automation? What will be our goal in regulating it or not (profit? well-being? risk?). Again, context is for kings – we need to know from a strong evidence base specific potential impacts and the contextual limitation of regulation and policy."

Olugbenga Adesida, founder and CEO of Bonako, wrote, "The key is for all countries to reflect on the emerging digital world and to formulate national strategies and policies to seize opportunities provided and minimize the risks. One critical response is capacity building and training on the uses and misuse of digital tools right from pre-school through university education. In the emerging digital world, in which everything will be connected, privacy and skills redundancy will be critical challenges for society. Preparedness will be needed at the supra-national and national levels. Similarly, individuals will also have to take responsibility.”

Richard Sambrook, professor of journalism at Cardiff University, UK, wrote, "It is hubris for technology companies or their evangelists to think they are beyond regulation. They have acquired huge market and social power – it is the place of politics and society to ensure that is managed for the collective good. I believe there will be regulation and other measures introduced to ensure the market power of these huge companies is not abused or misused (and currently it seems to me there are many examples of current misuse which are coming under scrutiny)."

Ana Cristina Amoroso das Neves, director of the Department for the Information Society at Fundação para a Ciência e a Tecnologia, said, "Digital competences are key to understanding the hyperconnected society. The awareness and the capacity to understand the evolution of digital life should increase together with digital technology evolution. That is why I am confident that actions can be taken to mitigate potential harms of digital life. But the investment in digital competences is key."

Maureen Hilyard, IT consultant and vice chair of the At-Large Advisory Committee of ICANN, wrote, "Actions that need to be taken to mitigate potential harms of digital life: 1) The internet needs to be made more accessible for those who would not normally be able to access it because it is so expensive. The government gets a lot of data downloaded from the satellite, and due to our small population who can afford to use it at the costs that the telecom company charges, we don't use that much. 2) If the government was more socially minded, they would provide those who are least able to pay for it with free internet or a highly subsidised rate, so that everyone could have a share of the broadband that is available. 3) People won't become aware of the value of digital life and the harms that prevent them from taking advantage of the positives of the internet, if they can't use it!"

Brenda M. Michelson, an executive-level technology architect based in North America, commented, "As mentioned previously, broad education on information literacy and critical thinking can help people discern the validity of information (content), view multiple sides/perspectives of an issue and consider the motivations of content creators/providers. We should also be developing and refining our individual habits: turning off notifications, giving ourselves digital breaks with other people, doing outdoor activity and so on. Essentially, regaining our attention. As well, we can choose devices and interfaces that augment our everyday experiences while being a present participant in social/work/family situations, rather than something physically isolating, such as today's virtual reality headsets."

Adrian Colyer, a business leader/entrepreneur based in Europe, said, "There are actions we can take, but they won’t be popular and I think they will need to come in the form of laws and regulations – nothing else will be strong enough to stand in the way of commerce. I’m thinking of privacy and security regulations for example (such as the forthcoming GDPR, and its successors) as well as things like requiring clear labelling or disclosure when media has been digitally manipulated."

Akah Harvey, co-founder, COO and IT engineer at Traveler Inc., said, "One thing I find useful telling people is that, ‘What we learn may be up to you, but what we do with what we learn is up to us.’ This means that there'll possibly be limited tools available for us to use, but when we learn how to use what's available, then we can build new tools and either use them for prosperity or harm. It's important to always educate people about the moral values they should hold as they experience new, changing digitalized environments.”

Richard Jones, an investor based in Europe, wrote, "Education is key, but education needs redefining. It must be continuous and location-independent, reserving playground-type interaction for the young to develop social skills and for the adults as a cafe culture. It must identify aptitudes and requirements and opportunities to deploy, and – because all this is surrounded by uncertainty – this must focus on flexibility to pivot, to reconfigure. This is a major change of attitude and education resources, and vitally must be accompanied by appropriate financing. The debt burden of the typical graduate is not acceptable or sustainable, especially in the light of the above considerations. Lifelong education is as much a necessity as free medical care. Without appropriate education, a person is as unlikely to lead a good life as if they are ill. Increasingly over 30 years we have moved from it being a stigma to move jobs, to not work in a location with peers or to be moonlighting, to non-hierarchical and gig-economy organisation in which, given the right tools, start-ups paradoxically thrive at the older end of society."

Rosanna Guadagno, a social psychologist with expertise in social influence, persuasion and digital communication and researcher at the Peace Innovation Lab at Stanford University, wrote, "There are many scholars examining these issues, and industry leaders should consider this scholarly work when making design decisions. Protecting people from cyberwarfare and disinformation campaigns would go a long way to reduce the harm caused by digital communication as it currently exists. There's also evidence that communicating via the spoken word is better for well-being than text-based communication. This is because people dehumanize others when communicating via text (see Schroder, Kardas, and Epley, 2017, http://journals.sagepub.com/doi/abs/10.1177/0956797617713798). So, digital technologies that function using voice communication may be one way to ameliorate the negative effects of social media."

Tom Wolzien, chairman at The Video Call Center LLC, said, "1) Provide plain and simple notice to the consumer of ultimate ownership responsibility for each site, app, stream or other material reaching that consumer on that web/app page or event. 2) This is a legal editorial responsibility for the content presented (consistent with current libel, slander, defamation and rights laws covering legacy print and mass media). 3) Application of anti-trust law to vertical and horizontal integration across all media, including all online media."

Kyle Rose, principal architect at Akamai Technologies, Inc. and active IETF participant, wrote, "Facebook, in particular, has a responsibility to both educate users on why they are seeing the posts they are seeing and to tweak those algorithms to reduce information siloing. Many people, especially younger folks, understand at a gut level how they are being manipulated, but at least anecdotally older folks in general don't seem to have the ability to use and react to social media and to external links of questionable provenance with the right level of skepticism. Social media unfortunately seems to thrive on and reinforce confirmation bias. This is probably its biggest challenge."

Marshall Kirk McKusick, an internet pioneer and computer scientist, said, "The internet is a very powerful tool. As with all powerful tools, it is capable of building great things but also of inflicting great harm. It is imperative that we learn how to use it effectively and also that we do our best to mitigate its harm. There must and will be actions taken to mitigate the potential harms of digital life. The only question is how successful they will be."

Eric Allman, research engineer at the University of California-Berkeley, commented, "I am heartened by some of the efforts to automatically ‘vet’ information sources. But the question was a bit vague: to be clear, I do think there exist actions that can (and will) be taken to mitigate problems, but I am not confident that those mitigations will be enough to solve the problems."

Joseph Turow, professor of communication at University of Pennsylvania's Annenberg School of Communication, wrote, "Changes can be made to mitigate potential harms of digital life but, depending on what those harms are, the responses will require a complex combination of public education, government activity and corporate agreement. Some of the harms – for example, those relating to issues of surveillance and privacy – unfortunately result from corporate and government activities in the political and business realms. Moreover, government and corporate actors often work together in these domains. Their vested interests will make it extremely difficult to address privacy and surveillance practices so that they match the public interest, but advocacy groups will keep trying and they may make some progress with increasing public awareness."

Daniel Pimienta, an internet advocate and activist from the Dominican Republic, commented, "The answer is simple: education! The answer is not only simple but also urgent. Without comprehensive information literacy programs, people are going to be more and more confused by information technologies. Without information literacy, the gap between information-literate and non-literate persons will keep increasing over time."

Erika McGinty, a research scientist based in North America, wrote, "Revolutions like the Internet of Things should be better explained, and by neutral or consumer-friendly parties, not just by vendors who dream up stuff to make a profit, whether it's useful or not. There needs to be more education from journalists, nonprofits and government as well as consumer watchdogs about the implications for social interaction, privacy, self-censorship, fear, isolation, safety, empathy, personal control, citizenship, etc., of installing things like Google Voice or Amazon Echo or surveillance cameras or smartphone apps that allow one to control one's heating, radio and so on. I believe more regulation is required to safeguard against privacy and security breaches and constitutional violations; vendors could include consumer-advocacy information with their products. And government applications should be evidence-based before being implemented. Yes, I believe in government regulations – of and by the government!"

Sam Lehman-Wilzig, retired chair, School of Communication and Department of Political Studies at Bar-Ilan University, Israel, wrote, "Social media will be forced by regulation, legislation and/or public pressure to limit some of the more deleterious elements within their platforms. This will involve artificial intelligence to aid in ‘surveying’ the constant, vast, flow of communication, a small part of which is harmful and even illegal."

Christopher Wilkinson, internet pioneer and longtime leader based in Europe, wrote, "I limit this reply to online banking. Quite negative: 1) Security concerns have given rise to over-complex identification procedures. 2) The basic security procedure should be standardised and implemented by all banks. Learning distinct procedures for each bank, which are periodically revised, is burdensome. 3) Increasingly, digital services assume GSM coverage for SMS and high-bandwidth connections – assumptions not justified in practice."

Giacomo Mazzone, head of institutional relations at the EBU/WBU Broadcasting Union, said, "Yes. Here are some examples. 1) New antitrust rules on a global scale need to be defined, and corporations that have reached far beyond their boundaries have to be broken up. The internet giants that immediately take over any innovation arriving into the market are becoming an obstacle to change and progress. 2) The open internet needs to be preserved at any price. If we have separate internets for the rich and the poor, the reasons we have granted special status and exceptional treatment to the internet revolution are gone. 3) Disruptive social impacts need to be addressed quickly – as the disruption process is identified and not afterward. Educational processes need to be redesigned, taking into account the notion of digital citizenship and the need for lifelong learning processes. 4) A brand new ‘social contract’ should be defined and signed between ruling classes, the business community and citizens; the notions of salaries, jobs, pensions and social security need to be redesigned from scratch."

Su Sonia Herring, an editor and project coordinator based in Europe, wrote, "Making business practices of technology companies more transparent and accountable is a must. The same transparency must apply to government’s use of technology, especially when related to privacy, access and security of big data. Cooperation and dialogue between all stakeholders is key as the technology and the internet are virtually borderless. This practice of information exchange on good practices and diverse experiences would help create useful and flexible policies. Dated laws, non-transparent decision-making and over-regulation are not the best way forward."

Eduardo Villanueva-Mansilla, an associate professor in the Department of Communications at Pontificia Universidad Católica del Perú, said, "It is indubitable that new developments will enhance quality of life for many, but that does not mean that a plurality of people, especially outside the developed world, will access these developments, not just because of costs, but due to design issues. In developing nations there are things that are still to be solved that do not require digital tech; and when digital tech may help, adaptations and localizations are required, too.”

Jon Lebkowsky, CEO of Polycot Associates, said, "It's a ‘training issue’ – our dependence on various technologies is way ahead of our comprehension. It'll probably take a generation or two to catch up with accelerating change (though some believe that technology will always accelerate faster than human comprehension – I don't share that view, at least I don't see it as inherent)."

Mary Ellen Bates, president and founder of Bates Information Services Inc., commented, "One of the simplest changes would be for all social media platforms to include an off switch – allowing users to be reminded after a certain amount of time that they have spent X minutes/hours on social media – rather than building better and better algorithms for keeping people hypnotized by a scrolling screen.”
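The "off switch" Bates describes is mechanically simple: a session timer that issues a one-time reminder once a chosen time budget is spent. The sketch below is purely illustrative (the class name, threshold and message are hypothetical, not any platform's actual feature):

```python
import time

class SessionTimer:
    """A minimal sketch of a usage 'off switch': remind the user once
    their self-chosen time budget for a session has been spent."""

    def __init__(self, limit_minutes):
        self.limit_seconds = limit_minutes * 60
        self.start = time.monotonic()  # monotonic clock: immune to wall-clock changes
        self.notified = False

    def elapsed_minutes(self):
        return (time.monotonic() - self.start) / 60

    def check(self):
        """Return a reminder string once the budget is exceeded, else None.
        The reminder fires only once per session."""
        if not self.notified and time.monotonic() - self.start >= self.limit_seconds:
            self.notified = True
            return f"You have spent {self.elapsed_minutes():.0f} minutes here today."
        return None
```

An app would call `check()` on each screen refresh; the point of the design is that the nudge comes from elapsed time, not from engagement-maximizing algorithms.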

Larry Roberts, Internet Hall of Famer, said, "I don't see any real harms of digital life, only benefits."

Fay Niker, postdoctoral fellow at Stanford’s Center for Ethics in Society, wrote, "As a political theorist, I think that there are grounds for the public regulation of our digital environments, based on the harm principle and/or on asserting and defending a freedom of attention. Governmental regulation is required, because we cannot trust the self-regulating efforts of the firms themselves and we should not be responsibilizing individuals when it comes to dealing with the harmful effects on their lives and society of systemic issues. That’s not to say that individuals have no role and responsibility in the management of their digital lives, but that the main burden should not be held by individuals within the current system."

Phill Hallam-Baker, an internationally recognized computer security specialist and principal scientist, commented, "Zuckerberg really has to fix Facebook before it dies. The chief problem with the digital economy is that network effects lead to concentrations of power. It is well understood that network effects mean that communications systems tend towards natural monopolies. What is less well understood is that the network effects collapse at the same rate. I have been on the Net long enough that I saw many digital communities collapse and die over the course of a few years. In 1994, USENET was the largest digital community in the world. By 1996 it was almost dead, killed by spam. Geocities went the same way, as did dozens of others. Right now, people value digital communities as if they are the record labels of the digital economy. What if that is wrong? What if they are actually merely a hot super-band and investors are buying the next ABBA or Rolling Stones? Facebook has been designed to be as addictive as possible and to this end the designers have intentionally encouraged conflict. It is only possible for people to approve other people's posts. Every reaction is interpreted as an endorsement of the post. There is no response for 'this is untrue' or 'this is Russian propaganda.’ Online communities have faced the problem of anti-social posters poisoning the debate for decades and there are many approaches that are proven to be effective in addressing it. But Facebook does not want to make use of them because the trolls drive their bottom line by increasing engagement. There is a whole literature that could be applied to fix Facebook but right now Zuckerberg doesn't want to because a dysfunctional digital community is more profitable."

Rich Salz, principal engineer at Akamai Technologies, said, "Intervention requires rigorous filtering of ‘facts’ and taking time away to make human connections. It is difficult."

Theodora Sutton, a Ph.D. candidate at the Oxford Internet Institute, wrote, "We do need guidelines for technology design to prevent companies from exploiting users in the realm of personal information and the attention economy. Implementation of guidelines like this is possible and likely to happen. Technology companies are already aware of their bad reputations, and at some point they will have to address the things they are accused of in order to keep their user base. I know that Facebook is turning its attention to well-being at the moment."

Richard Padilla, a retired system administrator, said, "Just as with anything in life, having too much doesn't improve life but hampers it; it is the same with tech. Too much abuse in its use doesn't help, and technology should instead be used to show how it can better life."

Sam Punnett, president of FAD Research, Inc., said, "Advances in monitoring, such as the ability to observe real-time brain activity, are leading to insights into the effects of media exposure upon brain function. The realization that digital media consumption is not benign will hopefully lead to greater awareness of the effects. Harm reduction in the form of distracted driving laws are a welcome measure. The effects of digital engagement are broad-ranging. They have changed the nature of interpersonal communications and social engagement. Excessive use by individuals appears to cause users to exhibit symptoms of both obsessive-compulsive disorder and substance dependence in some cases. Eventually the discussion will become a part of greater conversations related to mental health as we discover more."

Marcus Foth, professor of urban informatics at Queensland University of Technology, wrote, "Whether the design of blockchain and distributed-ledger technology – and robotics, and AI and other digital technologies – advances us toward dystopian or utopian futures will have a tremendous impact on people's well-being. Continuing to work just in our little square and not seeing the bigger picture can do harm. The bigger-picture disciplines such as humanities and especially axiology are called on to guide the way. I believe actions can be taken to mitigate potential harms of digital life. However, this depends on a number of factors, including political and ethical direction and framework. ‘Ethics can't be a side hustle’ – https://deardesignstudent.com/ethics-cant-be-a-side-hustle-b9e78c090aee. Take blockchain and distributed-ledger technology as an example: There are many downsides and challenges that, if not overcome, can be detrimental to people's well-being. The exponential energy use can accelerate fossil fuel use, depletion of rare earth metals, e-waste production, etc. The technology can produce dystopian futures (see ‘Black Mirror’). Money could become programmable, so the issuer of your salary or welfare cheque could determine how you can and cannot spend your income. On the other hand, the technology has the potential to do good, kill off the neoliberal nastiness of our current capitalist system through disintermediation, and bring about radical changes to society – universal basic income, direct/representationless governance and democracy (e.g., http://www.mivote.org.au)."

Mary Griffiths, associate professor in media at the University of Adelaide, said, "Developing and maintaining up-to-date digital and smart literacies is urgent. Coding education in primary schools could be routinely added to the curriculum. People can only appraise the benefits of technologies (and eventually contribute to the general well-being of humanity) if they know how they work."

Stuart Elliott, a visiting scholar at the National Academies of Sciences, Engineering and Medicine, said, "It would be bizarre to think there are no possible actions that can be taken to mitigate potential harms. Individuals and organizations can choose how and whether to use the technology for particular purposes. If the technology is perceived to have a net-negative effect, reduction in use by individuals and organizations will force companies to change the ways the technology works to reduce the negative effects. Governments can also regulate the technology in a variety of ways. This doesn't mean that the technology will be structured in a way that's optimal, but it does mean that the structure of the technology and the way it's used are likely to evolve in a way that provides a net benefit."

Brittany Smith, a digital marketing consultant based in North America, said, "Companies such as Facebook and Google need to be heavily regulated and turned into useful public utilities. Especially with the repeal of Net neutrality these companies and their services will further detract from the social good."

Jason Abbott, professor of political science at the University of Louisville, said, "At this juncture as an adult it is difficult to know what interventions will be created other than applications that encourage mindfulness and rest, but as a parent I am confident that there will be increased parental controls and even controls by media content providers to limit the access to potentially harmful websites and applications for children."

Joseph A. Konstan, distinguished professor of computer science and engineering at the University of Minnesota, expert in human-computer interaction, commented, "We need to restore a commitment to Net neutrality. We also need to think about re-architecting the internet to remove anonymity from public postings – let's consider what the internet would be like if all messaging were publicly traceable – how well would that help beat back bullying and hate groups? We need tools that allow individuals to see the variety of ‘possible digital spaces’ they might be in, recognizing the different products, news, commentary, etc., that are out there and prominent to others. We also need tools to help individuals and families set rules around availability and interruption – rules with the flexibility to support emergencies yet the automation to restore levels of human interaction."

Timothy Leffel, a research scientist at NORC at the University of Chicago, one of the largest independent social research organizations in the U.S., said, "I believe in science, and behavior change can be approached scientifically."

Bill Woodcock, executive director at Packet Clearing House, the research organization behind global network development, said, "The European General Data Protection Regulation is the first sign of regulators waking up to the need to protect the public interest in cyberspace. Privacy and control over personal information have been the worst victims of our rush to move everything into the cloud. I believe that the current conflict between state currencies and ‘cryptocurrencies’ will need to be resolved by regulators soon, and the role of privately mediated transactions will need to be clarified. I believe that one of the most insidious threats we face is the monetized exploitation of our own psychological weaknesses: the creation of AI and deep learning devoted to extracting money from people's needs for social acceptance, addictive behaviors, or insecurities is, essentially, the breeding of predators for whom we are the prey. Five years ago, this basically didn't exist. Now, such systems extract money from the very young, the very old, and the very credulous; but they're learning quickly, and five years from now, all of us will be within their reach, unable to determine whether we're talking to a real person or being scammed by an AI. This is an area that looks to me to be completely unregulated right now, and the area which most needs regulatory attention."

Silvia Majó-Vazquez, a research fellow at the Reuters Institute for the Study of Journalism, said, "Digital literacy should be a priority in education systems all over the world, to enhance people's skills to cope with potential threats of the digital domain but most of all to make the most out of the digital sphere."

Erin Valentine, a writer based in North America, commented, "People need to work on actively finding a balance in their lives regarding technology. I see this changing from person to person. Also, working on being present in the moment is important."

Dana Chisnell, co-director of the Center for Civic Design, wrote, "It's an arms race. As individuals find tools for coping and managing their digital lives, technology companies will find new, invasive ways to exploit data generated on the internet and in social media. And there will be more threats from more kinds of bad actors. Security and privacy will become a larger concern and people will feel more powerless in the face of technology that they don't or can't control."

Bouziane Zaid, an associate professor at Al Akhawayn University in Ifrane, Morocco, wrote, "We can educate people more on privacy issues, on how to protect their information and be aware of what they sign off on when they click ‘agree’ to terms and conditions. We can also pressure governments to be more judicious in their surveillance activity and pressure them to establish mechanisms of oversight to limit any potential abuse of power."

Morihiro Ogasahara, associate professor of sociology at Kansai University, said, "Because users of platforms (e.g., Google, Facebook) hope for these actions, platforms will have to respond to the huge demand. Of course, the definition of benefits/harms sometimes depends on people's habits or cultural context, and these have been shifting; therefore the actions will necessarily be temporary, symptomatic treatments. And yet, they will mitigate the ‘harms’ at that time to some extent."

Jonathan Irvin, a retail manager based in North America, said, "I think there is a possibility for actions that would mitigate harm caused by digital life but I am skeptical that those actions would be taken."

Mícheál Ó Foghlú, engineering director and developer, tools and signals at Google Munich, said, "More pro-active monitoring (automated and manual) of the abuse of online content generation – written, video and other – will help mitigate the negative impact of such technologies."

Lucretia Walker, a quality improvement associate for planning and evaluation in social services, said, "The main intervention that could help, by at least mitigating harm, would be directed toward companies, and that would be strong REGULATIONS for consumer protection, which, sadly, will never happen."

Eileen Rudden, co-founder of LearnLaunch, wrote, "Research is being done on managing digital distractions (Gazzaley and Rosen). Just as ‘digital literacy’ is now taught (for example, how to recognize biases in media), so can digital citizenship and digital friendship be."

Sa'ar Gershon, an online education administrator and statistics Ph.D. candidate based in Europe, wrote, "Technology is bigger than nations and politics. The internet, as the infrastructure enabling the reach of much tech to the population, should be maintained and handled by a global consortium, and all aspects that might affect it should be maintained in that forum. No one nation or country should be allowed to have control over it. The same goes for online identity issues. Email and identity on the internet are a global issue, hence all aspects of them should be handled globally. Systems security protocols that are consent-based, allowing only unanimous actions and distributing potential risk from 'black swan’-type events, are needed for all global and national systems and technology. Policy regarding the use of tech should not lag so far behind reality. More tech experts are needed in places of decision-making and policy."

Garland McCoy, president of the Technology Education Institute, said, "New private-sector tools continue to enter the market that help people manage their online experience and their interactions with smart devices to provide a better/safer ‘digital’ environment. I don't think government intervention will help. Additionally, with government interaction there is always the risk of unexpected collateral damage and the inevitable regulatory creep. If consumers/customers need something the private sector is usually quick to respond with a product."

Robert Stratton, cybersecurity entrepreneur, coach and investor, wrote, "There is a loss of online civility even on the part of otherwise decorous people. There are sound arguments for discussing and promulgating social norms of civility and due care in the consumption of online media. This is not to suggest that regulation is the right idea. To the extent that we request online service providers or the government to protect us from unpleasant speech, we are planting the seeds of our own repression and chilling effects. We need to explain just how important it is to verify information against known valid sources. Reputation systems, even when pseudonymous, can help. If ever there was a time to point out that the speech most deserving of our protection may well be the most unpleasant it is now."

John David Smith, coordinator at Shambhala Online, said, "We need new ways to educate people so that they understand the impact of their actions online. A lot of what's going on in the online world is hidden; people need to be able to see it and they need to be educated so that they can see it."

Jenny L. Davis, a lecturer at the Australian National University's School of Sociology, said, "Critical attention to design and evidence-based assessments of how technical design decisions affect diverse populations will be key to generating socially responsible technologies that attend to the potential benefits and harms of digitality."

Louis Schreier, a respondent who shared no additional identifying background, wrote, "Safety, security and simplicity. Those functional attributes must be built in to assure users, protect user information and make ease of use equal in importance to the underlying functionality of the technologies themselves."

Deborah Lupton, a professor at University of Canberra's News & Media Research Centre, said, "More information communicated to people about how best to protect their personal information online would help them to engage with fewer risks of having their data stolen or misused. It is also up to the internet empires to step up and take more responsibility and care for protecting people's personal data, an issue that they have often been far too cavalier about in the past."

Beth Kanter, an author, trainer, blogger and speaker based in North America, wrote, " 1) We can do a lot more education on the harm that uses of Facebook and our mobile phones can do to our mental and physical health. Folks like former Google design ethicist Tristan Harris and others from the tech industry have brought a lot of this to light. There are also scientists who are studying this, like Gary Small, an author of ‘iBrain: Surviving the Technological Alteration of the Modern Mind’ and his lab. In addition to educating people about the dangers of improper use of online tech, we need to understand when we're getting addicted and how it is impacting us and learn techniques in how to practice technology wellness. We also need to educate in the workplace as well as in homes and schools. More workplaces in America need to set better limits on employees’ after-hours communications. Maybe we should follow France's lead and make after-hours emails and messaging illegal. Schools also need to teach tech wellness to kids, especially today's screenagers. 2) We can’t just put this on the backs of individuals. The tech companies have to take responsibility, too – they are the tobacco industry of today. Harris has been a leading voice on the ways that technologies are being designed to create behavior addiction, and the motive is so they can sell our attention to the advertising buyer.”

Walt Howe, a retired internet consultant and U.S. Army education specialist, said, "Those who have trouble embracing change will still be helped by advances in medical technology and artificial intelligence. Siri and Alexa are just a bare beginning of major assistive technology to come."

John Dorrer, a consultant based in North America, wrote, "If the power of technology can be used to widely distribute information, then surely we can do better to identify sources and filter out untruths and damaging content."

Dan Ryan, professor of arts, technology and the business of innovation at the University of Southern California, wrote, "I would like to see a low-transaction-cost method for tagging ownership of personal information that would allow individuals to up-license use of their data (including the ability to withdraw the license) and potentially collect royalties on it. A blockchain-like technology that leaned in the direction of low transaction cost by design rather than trying to be a currency might allow this to work. Alternatively, third-party clearing houses that operate as consortia to control good/bad behavior of information users (e.g., if you continue to use personal info when the license has been revoked you will be denied access to further information) could make something like this possible. An extension of this to permanent transportable identity and credit ratings could make a big difference in parts of the world where those things are a challenge."
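At its core, the revocable up-licensing Ryan imagines reduces to a registry that records grants and revocations and is consulted before each use of the data. A toy in-memory sketch follows; all names are hypothetical, and a real system would need the distributed, low-transaction-cost substrate he describes rather than a single dictionary:

```python
class LicenseRegistry:
    """Toy sketch of a revocable personal-data license registry:
    an owner grants a licensee the right to use a piece of data,
    and can withdraw that right at any time."""

    def __init__(self):
        # (data_id, licensee) -> True while the license is active
        self._licenses = {}

    def grant(self, data_id, licensee):
        self._licenses[(data_id, licensee)] = True

    def revoke(self, data_id, licensee):
        # Revocation is recorded, not deleted, so misuse after
        # withdrawal is distinguishable from never having asked.
        self._licenses[(data_id, licensee)] = False

    def is_licensed(self, data_id, licensee):
        """Data users must call this before every use; default is no license."""
        return self._licenses.get((data_id, licensee), False)
```

The consortium enforcement Ryan mentions would then hang off `is_licensed`: a user caught consuming data for which the check returns `False` loses access to further information.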

David S. H. Rosenthal, retired chief scientist of the LOCKSS Program at Stanford University, said, "The only possibly effective intervention would be the aggressive use of anti-trust action to break up the oligopolies that dominate internet service and the applications that run on it. But, given the power of increasing returns to scale and network effects, even if undertaken it would likely have only temporary success (see AT&T). Given the lobbying power of the incumbents it is extremely unlikely to be undertaken."

Richard Bennett, a creator of the Wi-Fi MAC protocol and modern Ethernet, commented, "Highly-connected nations such as South Korea have had to develop treatment programs for internet addiction. Gamers in particular are subject to this malady, and Korea's broadband networks make gaming very attractive to socially isolated teens."

Frank Kaufmann, a scholar, educator, innovator and activist based in North America, commented, "This is common sense. Everything can be corrected and improved upon to remove or diminish its negative elements. Technology is no different."

Eric Royer, a professor based in North America, said, "Government action is necessary to help regulate and ensure equal access to digital life. More importantly, some type of regulation or government intervention is necessary to mitigate potential harms associated with digital life (e.g., hacking, data manipulation).”

John Skrentny, a professor of sociology at University of California, San Diego, wrote, "There are two key problems with digital life today. 1) Digital media. 2) Digital platforms for services. First, social media and search engines harvest data about users and monetize that data for advertising, insidiously destroying privacy. Even if we are aware of this, we forget about it in our daily lives. Social media and search engines (Facebook and Google) should have paid models where, for a subscription, users can have access to these sites but *not* have their data collected and monetized. I know many who would pay to use these services and protect their privacy. Second, platforms like Uber, Lyft and Taskrabbit should be public utilities that extract the minimum amount necessary to maintain themselves. These companies are exploiting people and I believe it is beneficial for all to make these public and non-profit.  Bonus suggestion: Net neutrality is a non-negotiable. It is appalling that we have lost this. Second bonus: Internet access should be a public good, like water, and we should not be at the mercy of monopolies to provide the internet."

Warren Yoder, an adjunct instructor based in North America, said, "Restoring Net neutrality is essential to creating the commons needed for social and economic innovation. We have only started to realize the potential to support digitally-mediated human-to-human links for friendship, support, education, creation and who knows what.”

James Blodgett, a respondent who shared no additional identifying background, wrote, "There are interventions that can be taken. However, I hope we think out intervention protocols very carefully, because intervention also can backfire. Witch hunts and brainwashing are examples of bad intervention. I think the U.S. Constitution is an example of enlightenment thinking regarding social control that usually does work fairly well."

Lisa Nielsen, director of digital learning at the New York City Department of Education, said, "There are plenty of programs now to address the potential harms of digital life. These are being implemented in schools with programs that address cyberbullying and mindfulness. They are also being addressed more and more in the mental health world. People are learning techniques for being upstanders when they see others not treating someone right. Online spaces are getting much better at setting ground rules.”

Thomas Streeter, a professor of sociology at the University of Vermont, said, "The protection of an individual's data should be defined strongly as a right, alongside the right to life, liberty, etc. Clearly voluntary ‘opt in’ to personal data sharing should be the only allowable way for commercial enterprises to gather data, and it should be required by law to be for limited times only (e.g., after three months, the permission to share automatically disappears and the data must be erased). This would change business models, and might cause some businesses (e.g., Facebook) to implode. That would be fine. And yes this would require an enormous amount of aggressive political intervention in the economy that might seem unlikely and would be completely politically unprecedented. The same was said about the likelihood of the election of Donald Trump."

Gary L. Kreps, distinguished professor and director of the Center for Health and Risk Communication at George Mason University, wrote, "Efforts are underway to improve digital health information tools to make them easier to use and more informative, adaptive, interactive, personalized, relationally sensitive, interesting, private and mobile. New digital health information systems are being built into societal infrastructure to provide automatic access to needed information and support in homes, cars, schools, stores, businesses, clinics, public transportation, clothing, roads, the human body and other parts of everyday life to provide easy access, automated delivery of information/support and specialized functions."

Isto Huvila, a professor at Uppsala University, said, "A major question is to think more broadly about how we would like to have our life as a whole, not just focus on solving individual hurdles. It is not a question of how I can do a specific thing, like make an appointment or buy something, but of how, and what kind of life, all human beings at large should have on this planet – for every one of us to think about how my doings and digital life affect the others, and vice versa, and to try to find a balance."

Kelly Quinn, a clinical assistant professor at the University of Illinois at Chicago, wrote, "Many digital tools are provided in exchange for personal data – both about the individual and about the individual's activities. More action can be taken to protect the privacy and integrity of this information, both in its collection and in how this data is used. Unfortunately, the proprietary nature of technology development and internet provision has shrouded the ways in which personal data is collected and used. The argument to ‘just don't use’ digital tools (or the internet) is not an effective means of regulating the power imbalance between providers and users."

Jan Schaffer, executive director at J-Lab, wrote, "I've judged enough SXSW Accelerator competitions to believe that engineers live to solve problems, especially if there is a financial reward at the end of the rainbow. It would be my hope that the tech giants will be moved to embrace problems that preserve civil society and democratic values."

Laurie Orlov, principal analyst at Aging in Place Technology Watch, said, "Boost investment by tech firms in protecting identity more effectively. Begin charging for access to technologies that are useful – and reduce dependency on advertising."

Adriana Labardini Inzunza, commissioner of Mexico's Federal Institute of Telecommunications, said, "The real challenge is reinventing education, learning and teaching programs, reinventing preschoolers' first encounters with IT, educating for the benefits of digital life but also for the risks and perils of an ill use of digital products, combining physical, artistic, athletic and manual skills and training millennials’ senses and sensitivity to keep their body, mind and spirit alert, active, receptive and skeptical; to balance online and offline lives and activities; to learn to produce and create works of art, science and technology rather than being consumers only; to innovate, to solve social or collective problems, to use digital products as tools, not as ends. New education programs, new skills and guidance for parents, employers, entrepreneurs and government officials should be designed and put in use in order to help humans bring out the best of humanity with the aid of technology, with ethics and empathy, with new golden rules of the digital era; to encourage critical analysis, time management, creativity and empathy; serious lectures on privacy and data protection; and new laws and regulations that may efficiently, if at all possible, create incentives for a healthy, lawful use of digital tools and deter harmful, unlawful and abusive use of them to the detriment of society. Education may prove more effective than law enforcement, but insufficient on its own, to persuade people, in their own best interest, to make a responsible use of IoT, AI and digital transactions, among others. But a 180-degree change in the law (torts, criminal, labor, copyright, procedures, class actions, antitrust law) and culture should be implemented to address the challenges and risks of an automated society and economy if humankind intends to remain human, free and civilized."

Barry Chudakov, founder and principal of Sertain Research and Streamfuzion Corp., commented, “Eric ‘Astro’ Teller, CEO of the Google X research and development lab, said, ‘Enhancing humanity’s adaptability is 90% optimizing for learning.’ The first action we must take to mitigate potential harms of digital life is to frame our present situation for learners in order to provide a context for how far we have come and the implications of the speed at which things have changed. Our emotional and limbic frameworks developed when we were hunter-gatherers; we then moved to an agrarian lifestyle and economy, then to a manufacturing and more urban perspective. In the last 100 years we left that purely mechanical world and moved to a digital-technical substrate, and, as Teller describes it, ‘our societal structures are failing to keep pace with the rate of change.’

“Well-being – the state of being comfortable, healthy or happy – is increasingly difficult when societal structures – family, church, government – cannot cope with the rate of change confronting us. Our world feels, and is, ‘out of control’ in Kevin Kelly’s famous phrase, because the distributed networks that now are the managing levers of modern life are not top-down control mechanisms, they are flows arising out of crowds and platforms managed by a host of evolving digital tools run by algorithms. Digital life now entails being present in some digital venue, presenting yourself in that digital venue and using that platform to further some goal such as your career or brand. While there are few people who are in a position to affect this, we must educate, educate, educate – young and old.

“We are in a new reality with new dimensions and new rules. So our first intervention should be education starting at the primary level and going through all further levels of education and instruction. The actions we should take are not like building a bridge or forming a government. These are actions of consciousness and awareness that are akin to an exploration because we do not have all the answers, nor are we likely to get all the answers anytime soon. We must insert the useful wedge of space when it comes to our tools. If we are to be fully present when using digital tools in our burgeoning digital life, we must begin to establish some distance of awareness, i.e., enough space between us and our tools to see what we are doing. Time outs and breaks are necessary but are not sufficient. We must become self-aware enough to look around the corners of our tools to see how they are affecting us and influencing us to change our behaviors.

“The first action required to mitigate potential harms of digital life is to build tools and protocols of awareness. Conscious tools – and that’s where we are headed – demand conscious awareness; intelligent tools challenge and demand more of our intelligence. This is a new ball game: there are new rules. ‘Actions’ here are different. The quandary of smart tools is that we must be aware of their new and unique effects. To be blind to this would be as foolish as being blind to the exposure effects from radioactive isotopes. The more intelligence we build into digital tools – things that think – the more it is incumbent upon us, especially as parents and educators, to prepare for (understand, outline, delineate) how these tools ‘use’ their users. We must face the ‘us-ness’ of our new tools. Increasingly they feel, think and look like us (often using our own image to stand for us). This affects, and will continue to affect our well-being, especially if external algorithms begin to ‘hack humanity’ and monitor us to get to know us, perhaps better than we know ourselves. Having a Metalife, as I have said before, is a full-time job.”

Janet Salmons, Ph.D., principal at Vision2Lead, commented, "1) Public education about digital literacy, as well as education in K-12 and at the college level. If users are more confident, they will venture out of social media platforms and look at a wider range of resources – including original source material. Also, they will think twice before posting personal information online. Civics in the online sphere should also be included. 2) Regulations, including updated Net neutrality rules, must be put into place to protect users and their data."

Meg Mott, a professor of politics at Marlboro College, said, "Political urgencies may force people out of their digital echo chambers. Away from picket lines and riot police, the face-to-face encounters required by retail politics open up the possibility for serious questioning. In terms of teaching and student anxiety: when I introduced a Renaissance technology into my classes – the commonplace book – students realized they had another source to go to when crafting an essay. Once they get the hang of the formalities of each entry, they report a tremendous sense of increased confidence. Instead of looking for answers online, they consult their hand-written, fully-indexed, accurately organized set of entries for interesting questions and possible answers. Both retail politics and commonplace books illustrate ways of engaging with different points of view in the animal/spiritual world. That feels very important."

Riel Miller, team leader of futures literacy at UNESCO, said, "We are in symbiosis with our tools and these tools and uses emerge as we evolve. If you or the tool are inadequate there is a breakdown – if there isn't a breakdown there is a process of adaptation, transformation – driven by values, hopes and power. ‘Mitigating the potential harms of digital life’ is the same as ‘mitigating the harms of life’ – not the right way to frame the issue. At issue here are a few metaphysical starting points – is it better to know than not, even if knowing can kill? Is it better to be open than closed? Is it better to seek diversity, to cultivate creativity? BETTER than what? No way of knowing – it's just a current preference. ‘Potential harms’ is imaginary – how do you construct this imaginary future and why? For me there is a strong danger of 'using-the-future' in the wrong way with these formulations because the way the future is 'used' is not based on an explicit or developed theory of what is the future."

Flynn Ross, associate professor of teacher education at the University of Southern Maine, wrote, "Other nations have regulated advertisements on television to limit marketing targeting our children; we need to create and enforce similar regulations for social media."

Uwe Hasebrink, a research scientist based in Europe, commented, "Digital life does not follow a law of nature. It follows ‘laws’ of economy, sociology, politics, psychology and other disciplines that share the basic assumption that reality is socially constructed. Thus digital life can be shaped along basic values and specific principles and objectives. We cannot not shape digital life."

Ebenezer Baldwin Bowles, author, editor and journalist, said, "Mitigation involves restriction because education isn't working. No number of clever videos or school assemblies can convince a sufficient number of drivers to pay more attention to the act of driving and less attention to the digital devices in their possession. A citizenry already trained to accept limits on freedom in the name of safety and security will eventually enter motor vehicles designed to block microwave signals and shut down their personal digital assistants. We are moving inexorably toward absolute control of digital life by global corporate entities, abetted by bought-and-paid-for public servants and government leaders co-opted by business. These controllers will define by decree the harms of digital life and then mitigate these harms through the extreme threat of blocking access to data delivery systems."

Peter and Trudy Johnson-Lenz, principals of Pathfinding Smarter Futures, wrote a comprehensive response with many details and related links: “Frankly, our global situation is far worse than we ever imagined. Humankind has organized to create civilization to exploit self-preservation instincts that shut down our thinking in favor of quick, automatic fight/flight/freeze reactions. Those reactions come from the deeper, older parts of our brains that kept our ancestors from death and destruction so they could survive. Now, increasing hyperconnectivity and constantly accelerating change are confronting us with threats that trigger fear and anxiety all the time. Our civilization is not smart enough to survive the mess we've created because it's making us stupid with uncertainty, fear and helplessness. To foster personal and societal well-being, we need to learn and practice how to be present to what is, with each other, without fear. It's a tall order, but what else is there to do? This will take patience, love and practice, practice, practice! There are three classes of actions that can be taken to mitigate potential harms of digital life.

“1) Highlight positive potentials and negative and sometimes unintended consequences of digital technologies and digital life as they evolve in order to provoke and foster societal conversations about how to orient their development. The media and many other information sources must act as ongoing observers, reflecting to us what digital technologies are doing to us individually and collectively. The evolution of digital life needs to be oriented toward personal and social well-being, rather than being shaped by market forces and profit. A few current examples of such observation and reflection include: the BBC's television series ‘Black Mirror,’ now in its fourth season; CNN's documentary series ‘Mostly Human’ with Laurie Segall; and Jean Twenge's research on smartphone use, reported in The Atlantic's ‘Has the Smartphone Destroyed a Generation?’ Scientists need to find ways of listening to and valuing more diverse forms of public knowledge and social intelligence. Only by opening up innovation processes at an early stage can we ensure that science contributes to the common good. Debates about risk are important. But the public also wants answers to the more fundamental questions at stake in any new technology: Who owns it? Who benefits from it? To what purposes will it be directed? See ‘See-Through Science: Why Public Engagement Needs to Move Upstream’ by James Wilsdon & Rebecca Willis: ‘Those advocating redesign and different ways of using these technologies must be given a platform to share their thinking so new products and services can be developed, tested, and adopted. Ultimately, we need to have more ‘see-through science,’ to involve the public upstream in the development process to make sure science and technology contributes to the common good.’

“2) Develop new apps and digital technologies for the express purposes of enhancing well-being in the five areas described in Gallup research [career, social, financial, physical and community well-being] and further developing and reinforcing the twelve well-being skills that Rick Hanson's Foundations of Well-Being teaches. A few current examples: ‘The Smartphone Psychiatrist,’ https://www.theatlantic.com/magazine/archive/2017/07/the-smartphone-psychiatrist/528726/

“3) Encourage and support education, training and practices of critical thinking, respectful engagement, mutual trust, collaboration, conflict resolution & transformation, and the like. Digital life is increasingly fragmented and polarized, further eroding trust, cheapening relationships, and shattering community. At the same time, there are more programs, courses, examples, and practices that point the way to a better future. Learning to skillfully manage our (scarce) attention and our thinking for our own well-being and that of our circles of influence is key. A few current examples include: University of Washington's Calling Bullshit in the Age of Big Data, Howard Rheingold's chapter on ‘Crap Detection 101’ in his book ‘NetSmart: How to Thrive Online’ and Jerry Michalski’s ‘What If We Trusted You?’”

Frank Feather, a business futurist and strategist with a focus on digital transformation, commented, "Yes, of course, common sense and thoughtful analysis tell us that we must use any technology wisely. Indeed, digital technology itself helps us to be more educated about its safe and productive use and application."

Rob Frieden, a professor of telecommunications and law at Penn State University, said, "Leaving technology introduction and integration to an unregulated marketplace diminishes the benefits, because most stakeholders do not operate as charities. If governments conscientiously embrace their consumer-protection and public-interest advocacy roles – a big if – society can integrate new technologies accruing measurable benefits."

Hanane Boujemi, a senior technology policy expert based in Europe, commented, "Time-management, digital detox, focus on how technology can be used for the good, be creative in using technology to identify practical solutions to daily life hurdles."

Sheizaf Rafaeli, a professor at the University of Haifa in Israel, wrote, "Digital is powerful. In the hands of ill-meaning people, corporations, governments or groups, it can be used to leverage crime, violence, oppression. Everything that can be done, including regulatory acts, citizen and social action, scientific and technical effort, should be put into reducing these ill effects. I do not believe that the market, left to its own devices, will suffice. So governments, NGOs, the public, have to take responsibility and intervene. On the other hand, I do think that progress is being made."

Jaime McCauley, an assistant professor of psychology at Coastal Carolina University, said, "Sure, as we become more familiar with the internet and the ways in which people use it we will continue to develop strategies to make it safer and more secure."

William Schrader, founder and CEO of PSINet, wrote, "There are several companies working on improving life for humankind worldwide. I am involved with one. [NEXT, a social-impact enterprise consisting of a team of performance experts and behavioral scientists working to enable people to overcome life’s biggest challenges.] I have seen what this approach can do over the past five years. Hundreds of people have been helped, permanently. This fact gives me some peace. They will thrive and others like them will join and we can possibly actually subdue the evil by allowing them to see the value of happiness and peace."

Adam Nelson, a technology developer/administrator based in North America said, "Good governance can be enabled by technology, and it is the heart of protecting people from its malicious use."

Clifford Lynch, executive director of the Coalition for Networked Information, commented, "This can be approached in two ways: one can forbid certain types of systems and services, or certain practices, or one can help people to learn more about the choices they can make, their implications, and how to better cope with the digital environment. Both are needed. As an example, it is becoming almost impossible to function without using the web and email at least occasionally, yet both have become utterly disrespectful of individual privacy, and this is getting worse. There are very significant limits to what an individual can do about this: regulation and law is clearly necessary. But we also do very very little to help individuals to cope: this should begin in elementary school and be a part of the educational experience at all levels. It's less clear how to reach adults, though certainly libraries and perhaps the health care system can play a part."

Vicki Davis, an IT director, teacher and podcaster based in North America, said, "If smartphone companies care about the health and wellness of humans, they will make these things easier. But until then humans must use the greatest software ever invented – their brain – to set healthy boundaries. Charge phones in the kitchen (not the bedroom). Kids are desperately sleep-deprived. We see it in dropping test scores. I believe much of the education crisis is due to children not getting sleep. Kids need cell phone-free hours, and the phones should not be in the bedroom. Get an alarm clock. Kids sleeping with phones under their pillows aren’t sleeping. How many of them are being deprived of valuable REM sleep because one teen with insomnia causes a massive 2 a.m. group chat that causes homeroom sleepy heads the next day? Fight this battle to raise grades and have healthier kids as a result.

"No phones at meal times; talk to each other. The stats on family mealtimes and reducing drug use and other risky behaviors are there. We must learn to respect the person in front of us, and that means the phones are put away. This means parents, too. Use the do-not-disturb mode at scheduled times. Everyone needs to learn to be strategic about what they let through. I also wish that phones had a work-only mode. For example, when I am at work I want to use certain apps and I want social media to stay away. At night I want social apps but not work apps. Right now, this isn’t possible.

"Our workplaces deserve our uninterrupted focus, just as we deserve personal time on weekends and at night. Our phones have become a maelstrom milkshake of work and personal, and it is increasingly difficult to focus on one or the other. Use ‘airplane mode’ to activate your camera only: When on a vacation, you want to take pics. However, it isn’t a great idea to let people know you are not at home. Turn the phone to airplane mode, take pictures and wait to share them when you are back. My pastor, Michael Catt, has a phone basket and when everyone arrives for vacation, the phones go in the basket. This is awesome and really helps families connect. Businesses need to set boundaries to preserve work/life balance. Admittedly, when you work for an international business the different time zones make it very difficult to keep balance. However, I remember my husband once had a boss who worked Sunday mornings and would email him and expect answers immediately, even if my husband was teaching a Sunday school lesson or was at church. Businesses that don’t respect boundaries will find the best talent goes places where those boundaries are more respected.

"Rest and sleep are vital needs, as is personal time. Human hamsters on an incessantly-turning wheel don’t make great employees. Setting reasonable expectations for email response time and delayed email delivery are things that can help mitigate the incessant barrage of work life on one’s personal life. People are used to getting instant answers now, but we must all have healthy boundaries. I will be an excellent employee, however, my family is even more important. Like Gandalf in ‘Lord of the Rings,’ I will turn things off so that messages ‘do not pass’ and I can have healthy time off work. Take a digital sabbath: Once a week I put up my phone for a day. I schedule social media updates ahead of time. I do have a worry about when smartphones move into our glasses and contacts. This is why doing these things has to be easier. If we want healthy human beings we need to establish boundaries."

José Estabil, CEO of a biotechnology startup, said, "It depends on what is meant by an ‘intervention.’ Society has decided – almost everywhere really – that the construction, maintenance and improvement of roads and bridges should be delegated to a government or a public trust. We have not (yet) achieved a similar kind of mechanism for technology. But I hope we do. And soon."

Miguel Alcaine, an ITU area representative based in Central America, said, "We can learn what and how to educate people, particularly children and youngsters. Also, consideration must be given to educate people entering their golden years."

Christian Huitema, a technology developer/administrator based in North America, said, "Your question is asking for a negative, to agree that nothing can be done. That would be extremely unlikely."

Kathryn Campbell, a digital experience design consultant, said, "Feedback is a critical component of bringing attention to a problem and effecting change. Simple tools that periodically alert you to how long you’ve been online, for example. Or perhaps a tool could show you how diverse and credible the news sources that you reviewed today were. Just as you might weigh yourself every day to make sure that additional weight doesn’t sneak up on you, we can and should have alerts and gauges that remind us when our digital behavior has become unhealthy."

Amali De Silva-Mitchell, a futurist based in North America, said, "There is the opportunity to mitigate risks of harm through active public and international engagement in the discussion of emerging issues, e.g., the UN Internet Governance Forum consultations. However, policy makers – whether in business, non-profits or government – need to open up to all forms of input from the public. ‘Street academics’ – information, analysis and observation articles/reports by knowledgeable individual advocates and members of the public with an ‘opinion,’ written pieces that are not blogs, newspaper articles or associated with an institution – draw on dynamic public information sources such as Wikipedia and Twitter, where verification may be difficult, slow, non-existent or inconclusive and content changes to reflect public opinion. This approach is going to be critical to source because it is very current, holds popular opinions not typically used by academics and provides access to those who may not typically be published or consulted due to issues of access, or due to their work as volunteers, independents, etc. These analytical pieces will lack academic referencing and so forth because they are about emerging issues, perhaps with no prior citations. The opinions of these influence-generating, pioneering citizens are going to be increasingly important. They are close to the ‘high street’ of the public and are writing in real time. They may set up think tanks, etc., but their opinions must be consulted. As we are all aware, fake news can impact newspaper articles, which will see a reduction in the quality of their verifiable content due to the use of street information."

Mark Glaser, founder and executive director of MediaShift.org, said, "The main action that people can take around harms of technology is to limit time with it, to take time in nature, to turn it off and take a break. This is difficult because of the way we are addicted to having it on and with us at all times. There are apps that remind people to take a break and workshops and retreats without technology. I think all of those things and more will be needed to keep people healthy in the future."

Jodi Dean, a professor of political science, said, "Internet giants (Google, Facebook, Apple, etc.) can be collectivized, turned into public utilities so that capitalist dynamics don't guide the way they develop."

Mike Silber, general counsel at Liquid Telecom South Africa, wrote, "We need partnerships to deal with content issues. No one entity can accept responsibility: there needs to be a form of co-regulation between content creators, content users, platforms and governments to ensure that the freedom and openness allowed by digitalisation is preserved, while malicious actions can be mitigated. Education is one of the most important issues and current education systems are not equipping learners with the critical thinking skills necessary to read broadly and form opinions. We run the risk of perpetuating digital echo chambers where independent thought will gradually disappear."

Glenn Grossman, consultant of banking analytics at FICO, wrote, "There are harmful uses of technology. Actions can be taken to rate or approve services. Underwriters Laboratories exists for products; we can have third-party certification for digital."

Narelle Clark, deputy CEO of the Australian Communications Consumer Action Network, said, "The booming industry of mental health apps illustrates the desperate need for broader availability of mental health care. Many of the current apps fail to contain appropriate attributions to their creators or to the evidence (if any) of their effectiveness, yet many make extraordinary claims. These apps also have the ability to prey upon vulnerable people through in-app purchases, inappropriate treatment and so forth. I welcome advances in apps that work, and in the efforts of health practitioners and regulators to act against the predatory ones. If we can promote the effective ones, these apps and related services have the potential to deliver real benefits to society."

K.G. Schneider, dean of the university library at Sonoma State University, wrote, "The big thing is anonymity. The idea that people can be vicious, can lie, can do all kinds of things in the virtual realm while avoiding accountability is a strange disconnect from how we organize our cultures in the analog realms."

Scott Johns, a high school teacher, commented, "There are definitely educative things which could be done. When children first use computers, they could simultaneously be exposed to nature in equal measure, be asked to draw things that they could snap with their mobile phones. Write stories that are read in the family. A thousand things that engage the mind in doing or contemplation. Apps could be devised to remind people to live a proper life of challenge. Remind us of how long we have been online and ask provocative questions. ‘Have you spent this long talking to your significant other today?’ Remind us to move, go look at a bug with a magnifying glass, feel wet sand between our toes or handwrite a letter. We have to be reminded to be full humans by the same technology which has the capacity to undo our human skills and deprive us of human experience."

Srinivasan Ramani, a retired research scientist and professor, said, "Making a city a healthier place to live does not require shutting down all the bars! We should concentrate on creating more gyms, parks and places serving healthy food. Similarly, I believe we should focus on creating healthy offerings over the Web that enrich people's lives. Schools should teach their students about exciting and valuable resources the Web offers. In many countries, students use smartphones only for entertainment and basic communication. Schools and even colleges respond to this by banning students from bringing smartphones to their premises."

Jillana Enteen, an associate professor at Northwestern University, said, "California put out guidelines for radiation and cell phone use this week. The American Academy of Pediatrics is re-examining its limitations on screen time, concluding that poor well-being is not always linked to more than two hours per day of screen time. I think that in a few decades, if not a few generations, more will be understood about the effects of digital life on well-being and this knowledge will be consumed and deployed eagerly."

Laura Young, an information science professional, wrote, "We can mitigate some of the potential harms of becoming dependent on our technology by creating social or economic consequences. In schools they still require kids to put their phones away; we can enforce this at home or in business. We could improve bureaucratic dealings by requiring government offices to employ human beings to answer the phone, but we as a society would have to pay sufficient taxes to afford it. We could choose to boycott companies that employ phone trees that do not ever lead a caller to a human being. Children can be taught to take tech timeouts for an hour or more a day, in the reverse of our early adoption of TV when we were restricted to an hour a day of television watching (this in the 60s and early 70s)."

L. MacDonald, CEO of Edison Innovations, wrote, "People are creative and entrepreneurial, so continuous change and improvement is a fact of life in a free society."

Philip J. Salem, a respondent who shared no additional personal information, wrote, "I am cautiously optimistic. I have taught classes about human communication technology for 35 years, and I have seen the differences. The goals should be greater mindfulness, empathy, and personal responsibility."

Gabriel Kahn, professor of journalism, University of Southern California, said, "Digital platforms control our communication ecosystem. Just as a supermarket is held responsible for the quality of the food on its shelves, social platforms should be responsible for the content they spread."

Allen G. Taylor, an author and SQL teacher with Pioneer Academy, said, "There is always a way to improve any situation."

Laura M. Haas, dean of the College of Information and Computer Sciences, University of Massachusetts-Amherst, wrote, "Research and innovation are constantly tackling the hard problems we discover. If we find a drug has a bad side effect, we look for new and better drugs, and often we succeed. Likewise, I have faith that we will find new solutions for the negative effects of digital life. Of course, policy and regulation can also play an important role: the question there is whether we have the will to set those policies, enforce regulations, and so on."

Georges Chapouthier, a retired research scientist who lives in France, wrote, "The use of the Net should be left free for adults."

Vincent Alcazar, director at Vincent Alcazar LLC, wrote, "We must absolutely remain vigilant for the unintended consequences wrought by technology. An example is Adobe's VoCo technology, which if commercially developed – as it most assuredly will be – will fully, perhaps violently, upend all that this society and civilization holds as truthful with regard to human voice and motion veracity."

Tom Barrett, president, EnCirca Inc., wrote, "Training for new occupations can be provided for those people, such as drivers, lawyers, etc., whose jobs are disrupted by new technologies, such as artificial intelligence and self-driving trucks and cars."

Michel Menou, a retired professor of information science based in France, commented, "Ensuring that the real identity of Net users is disclosed, or at least easily available, might reduce, if not stop, bullying."

David R. Brake, an independent scholar and journalist based in North America, said, "Tech giants must take responsibility for their newfound power, which is not just financial but social. They must openly and transparently deal with the negative consequences – for example when they facilitate online bullying and hate speech, and where they increasingly control flows of news and information they must adopt appropriate norms and values borrowed from, for example, responsible public service broadcasting organizations. If these organizations fail to act promptly, governments should have the courage to regulate them in the public interest – preferably in concert."

Mike Caprio, innovation consultant for Brainewave Consulting, said, "Internet access is a human right and steps must be taken to give every person everywhere unfettered access to networks. Democracy and social mobility will increase everywhere that digital life is allowed to flourish away from the negative influences of vast commercial monopolies and overreaching governments corrupted by corporations. Public funding must be applied to create infrastructure that is not owned and manipulated by corporations, and net neutrality must be the principle applied to all networks. Publicly funded alternatives to walled-garden digital services must also be implemented, with data freedom and portability for all users – people must control their own data at all times."

David Golumbia, an associate professor of digital studies at Virginia Commonwealth University, said, "Serious government regulation is needed at both the national and international level. It is possible, despite the many tools Silicon Valley – building on other industries like oil and gas and tobacco and finance – has developed to prevent it."

Bradford Hesse, chief of health communication and informatics research at The National Cancer Institute, NIH, said, "The National Academy of Sciences maintains a Board on Human System Integration, whose responsibility it is to leverage the capabilities of human factors specialists in monitoring the unanticipated consequences of complex technological systems. The Board also reviews and curates the methods that can be embedded within complex systems to promote safety and system improvements. In Silicon Valley, these methods are often referred to as ‘User-Centered Design’ or more colloquially ‘Design Thinking.’ Unfortunately, resources are not always allocated within the most vital sectors of the economy to self-correct when negative consequences are detected, or more importantly to embed the data-based signal processing systems needed to prevent negative consequences early in their life cycle. My hope is that resources will become more available as the negative consequences of not engaging in cybernetic, sociotechnical monitoring become apparent."

John McNutt, a professor at University of Delaware's School of Public Policy & Administration, commented, "We can almost always make things better. We have to want to do that."

Vittorio Veltroni, CEO of Art Caffè Torrefazione, commented, "Technology, at the end, goes where the money is, and it is important that while we spend extensively on digital improvements, as they can have enormous positive potential for our health, we do not stop also spending on real life. As an example, the oceans must be really preserved, air must really improve, water must really be cherished and protected. Only if we avoid pouring all the money into digital because it seems an easier solution (forget about the real river, feel this virtual one!) can we actually make healthier people. This is a political and cultural choice that cannot be left to the animal spirit of the market (financial or consumer) alone; they will flock to the easiest and cheapest solution available."

Joe Raimondo, digital CRM leader at Comcast and former CEO, said, "Eventually the need for fair and intelligent propagation of rule sets will take over – slowly and not in an organized fashion. But eventually."

Meredith P. Goins, a group manager at ORAU, wrote, "We need to teach children how to function with and without technology. No one should have their own personal phone before the age of 13. Children should not be allowed to play games all day long. One thing that has worked in my office is that we are not allowed to look at electronic devices during meetings (computer, iPad, phone, etc.), as they were distracting individuals from the content at hand. Thus, they were not paying attention, not participating, and not buying into the solution because they didn't hear it. This has HELPED shorten our meetings and HELPED us stay on track."

Fabian Szulanski, a professor at Instituto Tecnológico de Buenos Aires, said, "All is a matter of balance. Programmed digital detox with personalized prompts of digital personal assistants will avoid or dampen side effects such as isolation, nature-deficit disorder, eyesight issues, attention-deficit, anxiety and depression."

Meg Houston Maker, author/editor/journalist, wrote, "Digital tools themselves can be designed with a self-policing layer. The technology and AI are in place, but they must be implemented humanistically."

Robert Bell, co-founder of Intelligent Community Forum, wrote, "The interventions will not, for the most part, be technological but social and cultural. The plague of ‘phone zombies’ crossing streets without looking for traffic and bumping into us on the sidewalk is a marker I watch; I expect that over a matter of years, this behavior will decline because it is socially inept. Equally, we will slowly develop habits and ways of thinking that make us less susceptible to hackers."

Ian O'Byrne, an assistant professor of education at the College of Charleston, wrote, "For me this answer is both a yes and a no. I never thought I'd say this, but I think it might be based on the age of the individual. I think you're seeing a growing contingent of people who are actively examining or problematizing their use of technology. Possible interventions may include a growing focus on meditation and mindfulness practices. This may also include designating off time, ‘screen-free Saturdays,’ or making your displays grayscale. This may also include more reading of texts, including philosophy and Stoic-based texts. For some people there is a desire to find balance in these relationships with technology. In many ways it is like the discussions addicts have about their relationships with vices. I also believe that we (if I can lump adults into one box) don't entirely know what the best uses of these tools and platforms may entail. We also don't entirely know what is best for the children and future generations. As we've learned from work by danah boyd and the HOMAGO [Hanging Out, Messing Around and Geeking Out] group, and, as recent anecdotal research suggests, we do not know exactly what the future generations will want or need from these spaces. There is already anecdotal evidence that they do not see much value in the social media that monopolizes the lives of adults. We need to see what impact there is for the individuals that fully grow up in the soup that is this digitally connected space."

Lisa Lurie, a senior product strategist/designer, commented, "There can be actions taken to mitigate the potential harms of digital life. These actions will likely take the form of: 1) Controls by parents. 2) Control by lawmakers. 3) Self control. Perhaps it will be partially the responsibility of the creators of the technology to mitigate for potential harm as part of the design process. But like any problem, first the idea of harm has to be recognized."

Aram Sinnreich, an associate professor at American University's School of Communication, said, "The most important thing we can do to mitigate the negative social effects of the internet is to draw on social scientific and communication research to understand the multifaceted roles it plays in public and private lives, and to use both state and market regulatory measures to address these different dimensions separately, while maintaining a holistic understanding of its transformative potential overall. In practice, this means measures including but not limited to: 1) Holding algorithms, and the companies responsible for them, accountable for their role in shifting and shaping social and political power dynamics. 2) Developing a ‘digital bill of rights’ that privileges human dignity over the profit motive. 3) Involving multiple stakeholders on a global scale in internet governance. 4) Integrating digital media literacy more deeply into our educational systems. 5) Regulating internet communications in a way that privileges diversity of participation at every level and requires accountability and transparency to consumers and citizens. 6) Investing heavily in post-fossil fuel energy sources."

Stephen McDowell, professor and associate dean at Florida State University's College of Communication and Information, commented, "As with other social and economic transitions in the 19th and 20th centuries, concerted civil society, political and legal actions can change the terms and conditions of different actions and interactions in digital life. These include reinforcing basic human rights, and not allowing an unexamined technological optimism about the benefits of innovation to paper over legitimate concerns and the achievement of core social values."

Adam Montville, a vice president at the Center for Internet Security, said, "The only mitigation I see is raising awareness, but to do so responsibly (i.e., based on high-quality studies). We believe too much screen time is a bad thing, but how do we know for sure? Media needs to be responsible when they're saying such things."

Justin Reich, assistant professor of comparative media studies at MIT, wrote, "Education and regulation will both play essential roles in mitigating potential harms. Just as earlier generations of media literacy practices explained to students how advertising strategies work, we'll need similar education to folks about how consumer technologies are designed to capture and maintain attention, to surveil consumers and other network actors to harvest vast amounts of data, and how to organize that data for targeted advertising. As the largest communication platforms begin to function as monopolies, we may need to depend more on regulation than competition to curtail the most anti-consumer behaviors."

Ellen Detlefsen, associate professor emerita at the University of Pittsburgh School of Information Sciences, commented, "I look forward to the use of machine learning and artificial intelligence tools that have the potential to screen and remove destructive or harmful Internet activities."

Kathee Brewer, a technology journalist, said, "Science always finds ways to mitigate the damage new technologies impose on societies and individuals. Medicine presents an excellent example. Cancer treatments save lives, but at significant cost to comfort and quality of life, at least temporarily. Over time, medical researchers have discovered ways to mitigate those harms, and there is reason to hope on-the-horizon treatments will be even more effective and less disruptive."

Mike Maloney, manager of Web strategy at the University of Vermont Medical Center, said, "A lot can be done. It all comes down to resources and whether or not it becomes a sufficient priority. As we've already seen, the very technologies we use that are increasing sedentary lifestyles – when used creatively – can be excellent tools to aid in well-being. I can imagine wearables becoming ubiquitous as technology continues to advance, becoming less expensive and more intelligent. This means there is potential for having a real impact on getting people moving or to make healthier nutritional choices. I can see a ‘parent’ setting on mobile devices that turns the device off automatically until the child moves around actively for 20 minutes (while being tracked by the device). Even in the last three years I've seen a big change in the number of people in my office using standing desks now because they're more aware of the adverse effects of sitting at a computer all day. So I think leveraging technology for small, smart interventions coupled with sustained education could really make a difference."

Heidi Julien, a professor at State University of New York-Buffalo, wrote, "I strongly believe in the value of policy to mediate or mitigate technology's effects. For example, schools could limit the use of technologically-mediated learning, and ensure that children get unstructured, technology-free play time in outdoor settings, to encourage face-to-face social play and physical activity."

Llewellyn Kriel, CEO of TopEditor International, said, "1) A thorough knowledge of the most effective use of technology is indispensable. 2) People will need to find ways (as I have done) to identify what matters and what doesn't, what warrants their attention/concern and what can be neglected, and build their own coping mechanisms to deal with digitality. 3) There will be an explosion of methods/people to cope with discomgoogolation and digital overload, and a huge proportion of it will be bogus if not downright dangerous. Children will need to be taught from an early age how to cope with digitality. With each new development and new technology, fresh dangers will arise – and each will need to be addressed as it emerges."

Paul Rozin, a professor of psychology at the University of Pennsylvania, said, "We can mitigate negative effects only because humans have continually done this with new inventions."

Richard Chobot, a consultant and author, wrote, "Improved filters can be created to allow a person to winnow down a large number of hits to ensure better relevance of results. Use of natural language-based searching can be further popularized."

Thomas Viall, president of Rhode Island Interactive, commented, "The greatest risk of this new technology is becoming more disconnected from people as we become more connected to things. It is easy for us to withdraw from strangers – we don't have to go into a store, we don't have to ask for directions, we often interrupt the person in front of us to read a text from a person 200 miles away. Digital technology can make us antisocial and rude. I believe we can take action to mitigate these problems – to make sure children at a young age are exposed to etiquette, to game-ify interactions with people and more."

Mike Caprio, visitor services director for Virginia State Parks, said, "Social media sites are tightening their policies against hate speech, bullying and even misinformation, and I think that will continue to expand. I also think they should and will be called upon to encourage people to have more well-rounded lives and reduce overreliance on digital excesses."

Sarah Andrews, a program coordinator based in North America, said, "Additional research should help parents and childcare providers know how much access to technology and at which age are appropriate developmentally. Therapists are just starting to be able to treat people for internet addiction and nomophobia [the fear of being without a mobile phone]. Bullying and stalking will be more difficult to deter as they are underreported."

Kathleen Harper, an editor for HollywoodLife.com, said, "Technology can't really be stopped. There will always be people out there willing and able to take things to the next level."

Laurie L. Putnam, an educator, librarian, and communications consultant, wrote, "Here are two of many actions we can take as a society. Action #1: Educators and policymakers can make information literacy a core subject. Information literacy is taught in many schools and libraries already, but it is rarely given the financial and political support it deserves. As citizens and consumers, we are responsible for knowing how to use digital technology critically and responsibly. This is not just about spotting misinformation or ‘fake news’; it’s about learning to maintain our well-being when digital technologies are embedded in every aspect of life. We need to understand how to manage our personal data, to protect our privacy, to assess the information we encounter, to communicate effectively. When living offline is not an option, information literacy becomes a basic skill, and we need to build it into our education system. We need programs for adults, too, and public libraries are a natural fit for this task. It’s time to take digital information literacy seriously. Action #2: Tech companies, universities, governments and other influencers can broaden the scope of their thinking when it comes to digital technologies. Tech companies – the digital creators – need to think more broadly about the use and impact of their products. For the next phase of digital development, we need to understand not just the user, but the user in context. Digital tools don’t exist in isolation, especially if we’re talking about social media or the Internet of Things, and the impact of these tools can go far beyond individual users and user communities to permeate the very fabric of our society. We can’t fully understand the effects of our digital tools and toys just by looking at the technology. We need to think outside the technology.
While companies like Facebook and Twitter and Google can pack a meeting room with high-powered engineering talent, they need to balance the table with social scientists and anthropologists and futurists who can look around and behind and beyond the technology. There will always be unintended consequences, and we need more people watching for them. Digital creators need to pay more attention to the broader circles of impact their products have on society and the information ecosystem. Where hardware is involved, especially as the internet of things is embedded in our infrastructures, manufacturers need to think more about product lifecycles, including reliability, serviceability, and recyclability – all of which, ultimately, affect our daily life and well-being. At the same time, universities need to incorporate information issues more deeply into technology and business programs so that future creators will have more nuanced perspectives on the purpose and impact of their work. Policymakers, too, need to make space at the table for futurists and others who see the world from different angles. An effective democracy needs people who can think broadly, study potential scenarios, and inform policymakers before the big decisions are made – especially big decisions about technology, which is probably not their area of expertise."

Tom Barr, a respondent who shared no additional identifying background, wrote, "The monopolies in the digital realm might – just might – be forced to self-regulate on several fronts by market forces. However, in order for widespread benefits to accrue to the billions of ordinary people using these tools, excesses and abuses by these companies will likely have to be reined in by regulatory apparatuses."

Jack Hardy, an entrepreneur based in North America, wrote, "The use of technology is pervasive in daily life and can be a distraction to attention and sleep patterns. Acknowledging addictive behavior is the first step and then creating digital detox programs to reduce dependence and use will be critical."

Tiziana Dearing, a professor at the Boston College School of Social Work, said, "Interventions might include increasing our understanding of social empathy and including it in design. Working extremely hard to mitigate inherent bias in design. Setting out to develop our norms as carefully, thoroughly and rapidly as we develop the digital technologies that change them."

Leah Robin, a scientist based in North America, said, "There are a lot of vested interests in limiting the expense of dealing with fraud and privacy breaches on the internet. I would like to see online bullying and harassment decrease, but suspect that this will only be remedied by particular platforms and niches, not universally. There is not the same economic incentive to limit such harm."

Ruth Ann Barrett, an information curator at EarthSayers.tv, wrote, "Parents getting their nose out of work and paying attention to their children. Limit the use of devices, for heaven's sake."

Leo Klein, a web coordinator at a large academic institution in the U.S., said, "We need to guarantee Net neutrality with legislation so that it can never again be endangered. We also need to ensure privacy."

Michael Knowles, an entertainer and entrepreneur, said, "The actions that will be taken will be social in nature. Parents and peers will rediscover in-person connections and augment those in-person connections using social media rather than the other way around."

Robert Touro, an associate professor at Colorado Technical University, commented, "I truly believe that technology can and will address the perils and negative aspects of life online. Some advances will protect privacy and security, and filtering will enable a greater level of ‘opt in’ and ‘opt out’ capabilities for all. We are not there now, but I expect that, over time, improvements will be forthcoming."

Stephanie Mallak Olson, director at the Iosco-Arenac District Library in Michigan, wrote, "Just put it down and take a break from the attachment to the device. Continue to educate the public on how to determine accuracy and safety of online information. Get out and breathe some fresh air."

Kat Song, communications and digital strategy director at the American Association for the Advancement of Science, wrote, "I suspect that the spiral into digital dependence can be curbed through education and practice. Tell people about the potential harms. Give them ways to prevent or reduce their overuse of devices. Those methods can range from things like apps that shut down your device at certain times, to setting a ‘digital sabbath’ (times when you and your family are not permitted to use devices), or even asking other parents to bar or limit device use when kids get together."

Laura Guertin, a professor of earth sciences, said, "I do think there can be actions taken, and these actions do not need to be digital ones! With cyberbullying and an overall lack of civility on social media, parents, teachers and any adult or teenager can work with subsets of the population that are doing intentional harm to others."

Andrew Czernek, a former vice president of technology at a personal computer company, wrote, "Absolutely critical is the implementation of a digital key that can be tied to each individual to eliminate the shoddy security that email/password ‘soft’ security provides. In addition, Congress and our state legislatures need to pay much closer attention to protection of personal privacy. Indeed, at some point we may need a Bill of Rights for digital privacy."

Kirsten Brodbeck-Kenney, a director, said, "It's important to take action to treat internet access as an essential utility rather than as a luxury. Otherwise, as our lives become increasingly entwined with online activity, individuals who can't afford access will be even more isolated and disadvantaged."

Serge Marelli, an IT security analyst, wrote, "People need to learn how to use the tools. Some are willing (or equipped) to learn; some are not willing, or possibly ill-equipped to learn. It is a bit like reining in uses of television or tobacco: some won't try, some will try and give it up, some will become addicted and some will use ‘tobacco-addiction’ as an excuse. With television, some watch it a few moments a day, some watch it all the time, some select high-value programs, some watch trash (of course, we all define differently what ‘trash’ means). The same will happen with digital tech, just as with television."

Steven Polunsky, a research scientist at Texas A&M University, wrote, "As a society, we must make a concerted effort to increase dialogue, to have people meet other people who are not like them and share their personal stories. As individuals, we must become more open to hear from people with experiences outside of our own, and at the same time apply a greater measure of skepticism to new or unconfirmed information."

Paul Manning, a cybersecurity manager, commented, "In some locations the internet or connectivity should be shut down. While excuses would be made regarding emergency services or required connectivity, they are in the end excuses to continue behavior that is not helpful in building in-depth relationships, which require a give-and-take approach as well as focused communication."

Gina Neff, an associate professor and senior research fellow at the Oxford Internet Institute, said, "Technology did not create the vast economic inequality that is shredding the social fabric of American life, but it can amplify it. If we don't address inequality then the potential harms of digital life will only worsen."

Darlene Erhardt, senior information analyst at the University of Rochester, commented, "As with anything, if the driving force behind the latest/greatest developments in technology is based on creating things for the betterment of society, taking time to consider the implications and establishing/refining ‘Good Practices’ to go with them, then I think the outcomes may be more positive."

Adrian Schofield, program consultant at the Institute of Information Technology Professionals-South Africa, said, "Vulnerable people of all ages should be protected from harm perpetrated through digital systems. All people should be educated about how to protect themselves from such harm. Policing the digital world should be the same as policing the physical world – protecting the innocent, catching and punishing the criminals. The key is ethical and professional practice in the creation, construction and application of digital systems."

Jane Gould, Ph.D., an author and futurist, commented, "It is particularly important to educate young people to responsibly use smartphones and to be aware and mindful of their online activities."

Estee Beck, an assistant professor of technical and professional writing and digital humanities at The University of Texas-Arlington, said, "In the United States, only government regulation or legal protections for fair information practices in the collection of online data along with the advance of do-not-track technologies, and opt-out practices will help to mitigate the potential harms of digital life. Additionally, consumers need methods for knowing how data collection will be used beyond its original purpose and must have access to files when data is shared to third parties, along with how third parties make decisions on how to use the data for better or for worse."

Stuart Umpleby, a professor and director of the research program in social and organizational learning at George Washington University, commented, "Algorithms in social media show articles and ads similar to what readers have looked at before. In politics this means people live in different information universes. Algorithms could be used that would present other points of view, but will the ‘platforms’ find this to be in their interests? Some regulation will likely be necessary. Alt news sites do not present news. Also, some ‘science reporting’ is by corporations with an interest in a particular point of view. The sources of articles and ads should be clearly stated. How to do this in an environment of free speech and press will require experimentation and clever design. Lessons in understanding media should be offered at all levels of education to help people understand what they are reading and the intent behind it."

Ed Dodds, a digital strategist at Conmergence, said, "Educational sector, nonprofit organization and government investment in ransomware and virus awareness, data-driven democracy, citizen science, distance education."

Eric E Poehler, associate professor of classics at University of Massachusetts-Amherst, commented, "Digital life is part of life. Since the development of the pointed stick, we have been able to surround our technologies with an envelope of social acceptability. We protect ourselves from physical traffic by rules, signaling devices and dedicated spaces for different means of travel – cars on highways, bikes in bike lanes, pedestrians on sidewalks. All these came after a difficult learning experience of the dangers of automobiles. We created physical buffers, laws and norms of behavior (e.g., around drunk driving). We will in time learn to do the same with internet traffic. We can learn (both as individuals and as a society) to more fully control the speed and volume of information, divert its content onto particular paths, authorize and deauthorize some forms of access and establish sanctions for malpractice."

Karl M. van Meter, founding director of the bilingual scientific quarterly "Bulletin of Sociological Methodology," said, "Education is the necessary part of using the internet correctly, which means a better-educated public, and more integration of the internet and associated technologies in the system of education."

Ann Adams, a technical writer based in North America, said, "AI can be used to filter out hate speech."

Michele Walfred, a North American communications specialist, said, "The main one, and it has already started, is interventions to stop or prevent or curtail distracted driving. This is already being driven by the insurance industry, and I foresee stronger partnerships among auto manufacturers, smartphone designers and telecom companies. My newest iPhone defaults to a do-not-disturb mode when it detects I have connected to Bluetooth. Parents will be able to disable and lock their children’s phones when they are driving. I also see no reason why privacy laws, which haven’t caught up to the digital era, won't be improved."

Lori Laurent Smith, an entrepreneur based in North America, commented, "There will be solutions that arise. Especially once the rising generation become parents, since they will be able to relate to what their children are going through and demand changes. For example, as a Gen Xer, my childhood was the opposite of 'safe' as many online memes show, yet, as the first generation of 'latchkey' kids, safety has come to define us with solutions to ensure our kids are safe: mandatory car seats, bicycle helmets, walking/driving kids to/from school, sex offender registry, locks on everything, expiration dates on all food and so on. In the same way, I believe the rising generation will pass laws to restrict social media use until (hopefully) age 16, increase the understanding about mental health issues in general and make more treatment options available, be able to better identify those at risk of suicide and get them help, set expectations between online and private (offline) life, demand vehicles that require a driver (even if 'self-driving') and so forth. But it will be a very bumpy 20-30 years while our collective well-being will, I believe, continue to decline before it improves (assuming those changes are important enough to our future society)."

Larry Rosen, a professor emeritus of psychology at California State University-Dominguez Hills known as an international expert on the psychology of technology, wrote, "I have written seven books, each with strategies for healthy technology use. They are being used by many people who have heard me talk either at schools, to parent groups or to general audiences. Key is moderation plus frequent breaks to calm your brain."

David Klann, a technology consultant for the broadcast industry, said, "People need to be reminded to leave their devices on the desk or table, and simply go outside. Not necessarily to interact with other people in person, but to simply be in the elements and step away from the digital world. Here's an exercise I learned at a spiritual retreat many years ago: go outside and focus on a one-square-foot plot of land for 15 minutes. Note all the things you feel, see, hear, smell or taste in that single square foot. Then spend the next 15 minutes staring up at the sky above the square-foot plot and make a note of all the things you observe in the sky above. This action of leaving the digital world behind for even half an hour can mitigate and relieve some of the stresses imposed by our hyperconnected lives."

Adam Powell, manager, Internet of Things Emergency Response Initiative at the University of Southern California, wrote, "We have no idea – indeed my current project is inventing solutions – but technology (and law and public policy and good business) can be harnessed for good. The white hats can be creative, too."

Frank Odasz, president of Lone Eagle Consulting, commented, "If no one raises the flag on how good people can truly learn to specifically take action to counter all the current negatives online, then the world will continue in a downward spiral. I started in 1983 to win my freedom, successfully, and I now understand that even as one person I now have a global voice that ideally could go viral and change the lives of billions. Perhaps your question begs the question: Are there enough good people out there willing to act if given a roadmap forward? Who will lead the deep research of techno-social psychological impacts, both positive and negative, given Google, Facebook, Apple, Amazon and others have over one-half trillion dollars each? This isn't expensive, when low-cost, short-term pilot projects can quickly demonstrate what does and doesn't work for incentivizing individual and group positive outcomes. Herein lies the rub: most adults – particularly those in leadership positions – subconsciously avoid ‘learning anything they know nothing about,’ from Native elders up the ladder to legislators and Congresspersons. So, while it might be unlikely that solutions to help, not harm, will come from the top down, it is highly likely they will come from the bottom up. It is just a matter of who and when. I've posted online many grant templates for youth-led local pilot projects. I'm pitching a ‘Rogue Scholar’ online program – short e-learning lessons to teach others how to teach online for positive short-term measurable outcomes – based on my 33 years of continuous online teaching and innovation. For example, Native-American youth suicides are often due to individuals' perceived lack of a meaningful role in society and a support system for self-esteem. US West funded my Big Sky Telegraph for 10 years; $1.5 million, connecting over 100 one-room schools from 1988-1998, mainly for research and development on whether online markets were going to emerge or not."

Valerie Bock, principal consultant at VCB Consulting, wrote, "I like to remember the power of the off switch. Digital things are electrical things, and we can choose to power them down. Consumers can and should educate themselves about what information they are giving away, and also, on how they can choose to configure their devices to share more or less information. They should make purchases mindfully and refuse to buy products that do not offer them options to configure for more privacy. Of course, there isn't any grabbing back the private data one has mindfully or unwittingly shared, so we need regulations about what can be used in aggregate and what can be used individually. I think it's going to probably take some pretty egregious abuses of personal data before that regulation comes into being, and even then, digital technology often features secret backdoors through which people/organizations may avail themselves of our information illegally."

Michael Glover, a software engineer based in North America, said, "Fraud, theft, and harassment should have consequences. Also, I believe that the standard of behavior should be higher if internet activity is part of paid political or commercial activity, as distinct from individual activity."

Guy Levi, director of innovation at the Center for Educational Technology-Tel Aviv, said, "Privacy issues should be changed to adjust to the new reality. Behemoths like Google, Facebook and others should be limited and regulated. Education must change for more personalization, etc."

Ray Schroeder, associate vice chancellor for online learning at the University of Illinois-Springfield, wrote, "As we become more experienced with the network and its impact on individuals, we will be able to create effective interventions. Data collection carries with it a risk of loss of privacy, but also carries the opportunity to refine and hone our use of network resources to most efficiently and effectively utilize the network to serve our needs."

Matthew Tsilimigras, a research scientist at the University of North Carolina-Charlotte, said, "There is a huge personal and career-related cost to you if you are unable or unwilling to participate in digital life, and so one way of mitigating the potential harm of non-participation would be to make it more accessible to those facing economic or disability-related challenges that make it hard for them to participate in digital life. However, workplace protections need to be enforced so that employers do not feel like they have 24-hour access to employees, which many use as a crutch for their own poor management skills. It is also the responsibility of online forums themselves to moderate content produced and exchanged on their platforms so as to police bullying and other threatening behavior."

Marc Brenman, managing partner at IDARE LLC, wrote, "Ethical constructs can be introduced into artificial intelligence devices. But this is not likely to work well, since there is no unanimity on what ‘ethical constructs’ are. A ‘veracity application’ could be used as a filter to judge the truth of an internet posting. Prosecutors could charge those who threaten on the internet. Internet service providers and others could be required to provide much better service. The U.S. could introduce a ‘right to be forgotten,’ as Europe has. Net neutrality could be required."

Alf Rehn, a professor of innovation, design and management at the University of Southern Denmark, wrote, "As always, information and education are key. People need to understand their own usage of digital tools, not be nannied by it. Rather than building in limitations such as ‘maximum allowed screen time,’ digital tools should inform their users of good usage practices, allowing for considered choices."

Antoinette Pole, an associate professor at Montclair State University, said, "1) Alarms that go off when one is using a device for too long. 2) Recommended usage by the American Medical Association for adults. 3) Greater push for technology breaks, like those in France."

Scott McLeod, a professor at the University of Colorado-Denver, wrote, "Law, policy and social norms always have struggled to keep up with rapidly-changing technologies. As a history major, I will note that our societies generally seem to catch up with the needs and demands and affordances of each new era. There's a lot of pain and adjustment along the way but eventually the 'new' becomes the 'new normal.' The more forward-thinking we are, the sooner we'll be able to make these necessary adjustments. After all, we're not going to regress to a less-technological society..."

Edward Tomchin, a retiree, wrote, "Humankind has a quirk. When we discover something new, it seems the first thing we do is abuse ourselves with it. There's a long history of this behavior, so I expect it will be no different with AI or anything else we discover or create. But we always manage to rise above it. I don't see that changing much. In fact, right now we are in the midst of political chaos, but the picture we see around us at this moment is not a foretelling of the future. It is a portrait of what we are leaving behind. We are in the midst of one of the greatest changes in human life we've ever encountered and it's happening at an amazingly fast rate. Yes, there will be some losses because change is a fearsome thing, but we will survive it and like the Phoenix, rise from the ashes of our past."

Mark Patenaude, vice president and general manager of cloud technologies at ePRINTit, said, "I truly believe society will regulate the digital world. There will always be a part of society that does not want to belong to the group collective. That is good. Society should be different, but we should, as a society, band together and steer this big ship called Earth. There will always be waves and storms along the way, but, having good GLOBAL captains that understand the barriers of religion, ethnicity, regionality and language, we will almost always land safely."

George Strawn, director of the U.S. National Academies of Science, Engineering and Medicine Board on Research Data and Information, said, "'Interventions' will be among the new tools and services that will continue the evolution of the Internet."

Deborah Hensler, professor of law at Stanford University, wrote, "Policy decisions can equalize access to digital technology and free and equal use of the internet. Of course, they can do the opposite, as with the U.S. FCC's decision to overturn Net neutrality regulations."

Jacob Dankasa, a North American researcher, said, "Educate people to use technology in a way that adds value to their lives. This entails knowing how much time one can spend on technology and knowing when to drop your piece of technology and engage in everyday face-to-face human relationship."

Deborah Coe, a coordinator of research services based in the U.S., said, "This is one of those instances of cultural lag in which social change takes a few years (or maybe a generation) to catch up with technological change and make some necessary adaptations. When the automobile was first invented, people didn't cope well. This too shall pass, but only if we help it along with some interventions. Society must teach itself and its newer generations how to do this. We've already seen some good experiments in which people (and especially children) have voluntarily relinquished their cell phones, tablets and computers for a few days. Although they were very distressed at first, after a few days most participants said they were surprised at how much more relaxed and focused they felt."

Michael Everson, publisher at Evertype, commented, "The one intervention which is important is the guarantee of Net neutrality worldwide."

Yoram Kalman, an associate professor at the Open University of Israel, wrote, "Awareness, training and education are critical for people to understand the benefits and risks of digital life. In particular, I believe that understanding the forces that power digital innovation and adoption of innovations (commercial, economic, psychological, social, etc.) are key to empowering people. Furthermore, carefully considered regulation should be used to protect individuals and organizations from powerful players when market and/or social forces fail to do so."

Andie Diemer, journalist and activist user, wrote, "There are small but realistic steps we can take as a society to mitigate potential harms of digital life. There is research available to form an outline and guidelines for technology consumption by age. We can teach children healthy boundaries with devices. We can recommend tactics to change behavior and specific accessories to preserve our physical senses.  However, as a society we are also at the whim of private businesses that can deliver various forms of technology without studying how it impacts consumers."

Dan Rickershauser, senior account manager at StumbleUpon, said, "We're all still new to this whole internet thing; there is so much to be done to mitigate its damage. Young parents limit the amount of time their toddlers spend with iPads. People take time to reconnect with nature, unplug and retune. Social networks will realize users are pulling away from their services if they feel they're doing more harm than good. Cities will rewire themselves to account for services like Uber, then do it again to adapt for self-driving cars. Democracy will demand voters do better vetting of information. The information bubbles we insulate ourselves within will pop. We're adapting as humans always have and always will, but it'll be messy."

Christopher Bull, a university librarian, said, "Unless there is a complete censoring of materials, people will find harmful material. Moreover, they are inclined to reinterpret information to match their own views."

Chris Udochukwu, a CEO based in Africa, commented, "The greatest challenge in life is that it is impossible for the human mind to think about nothing! The internet is a total-transformation missionary and no one has the complete answer. The human mind cannot be controlled. The more you attempt to control it, the more it transforms. Yes, we may lose our privacy and humanity, but we will continue to walk in the shadows of our fears of the unknown, which further whets our appetite for infinite adventure to the universe. With the internet, humanity is at the threshold of a Mind-Social Continuum. All actions taken to attempt to control the mind lead to a direction, but in all directions awaits the unknown alley. The complexity of the human mind is infinite."

Adebisi Adekanmbi, a research student at Obafemi Awolowo University, commented, "I do not think there can be actions to mitigate potential harms of digital life. Digital life is a phase of the age as far as this generation is concerned, and it is expected to evolve. Man had to improve technology from gathering fruits to using stones to making tools; this movement cannot stop!"

Ian Rumbles, a technology support specialist at North Carolina State University, commented, "Digital technology is adopted by young people first. They traditionally do not listen to their parents. They develop patterns that are hard to break or train to be different."

Ian Fish, an enterprise manager, wrote, "I cannot believe that there are not things which could be done to mitigate the deleterious effects of digital life but I am hard-pushed to think what they might be. For example, the erosion of Net neutrality could lead to the dominance of commercial corporations to the disadvantage of those who do not have spending power. This could lead to the further erosion of human rights for large swathes of the global population."

David J. Wierz, senior principal of The OCI Group, commented, "How does a society define 'harm' in use and then act to reduce effects leading to such harm without overt monitoring? Perhaps the demise of Net neutrality and onset of associated volume-based costing for use may provide a positive unintended consequence."

Danny Gillane, librarian, Lafayette (LA) Public Library, said, "Freedom to choose what is easiest or cheapest, the very real spectre of addiction to our devices, these things cannot be solved with another technology, and Citizens United has pretty much convinced me that the corporations who stand to make money off these devices and services will not be working to lose eyeballs in the name of what may be better for us."

Mark Maben, a general manager at Seton Hall University, commented, "While I did answer ‘yes’ to ‘there are interventions that can be made in the coming years to improve the way people are affected by their use of technology,’ I am not sure they will be widely adopted. Already there are software programs and apps that can assist individuals with modulating their digital consumption. However, few people take advantage of them. As more people suffer the harm caused by 21st century digital life, we will as a society be forced to create more tools and strategies that help individuals unplug and keep their digital consumption at safe levels. It will become a necessity, and I am hopeful that humans will take advantage of the tools we develop to protect ourselves from the emotional and physical harm created by our digital lives."

To read the 86-page official survey report with analysis and find links to other raw data, click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_Home.xhtml

To read for-credit responses to the main survey question, please click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_credit.xhtml

To read a PDF with an expanded version of the full Digital Life report, please click here:
http://www.elon.edu/docs/e-web/imagining/surveys/2018_survey/Elon_Pew_Digital_Life_and_Well_Being_Report_2018_Expanded_Version.pdf