
Survey IX: 
The Future of Well-Being in a Tech-Saturated World

Anonymous responses to the second research question:
What actions might be taken to reduce or eradicate potential harms of digital life to individuals' mental and physical well-being?

Results released in spring 2018 - To illuminate current attitudes about the likely impacts of digital life on individuals’ well-being in the next decade and assess what interventions might possibly emerge to help resolve any potential challenges, Pew Research Center and Elon University’s Imagining the Internet Center conducted a large-scale canvassing of technology experts, scholars, corporate and public practitioners and other leaders, asking:

Digital life's impacts on well-being: People are using digital tools to solve problems, enhance their lives and improve their productivity. More advances are expected to emerge in the future that are likely to help people lead even better lives. However, there is increasing commentary and research about the effects digital technologies have on individuals’ well-being, their level of stress, their ability to perform well at work and in social settings, their capability to focus their attention, their capacity to modulate their level of connectivity and their general happiness. The questions: 1) Over the next decade, how will changes in digital life impact people’s overall well-being physically and mentally? 2) Do you think there are any actions that might be successfully taken to reduce or eradicate potential harms of digital life to individuals' well-being? If so, what might be done? 3) Please share a brief personal anecdote about how digital life has changed your daily life, your family's life or your friends' lives in regard to well-being.

About 93% said in answer to question two that there are actions that can be taken to reduce or eradicate potential harms of digital life. You will find the written responses to question two submitted by anonymous respondents listed below the following summary of the common themes found among all responses to the primary research question.

To put things into context, among the key themes emerging from all of the 1,150 respondents' answers to all research questions were:

* CONCERNS
- Digital Deficits: Cognitive abilities, including analytical thinking, memory, focus, processing speed and effectiveness, creativity and mental resilience, are undergoing change.
- Digital Addiction: Internet businesses working to earn attention-economy profits are organized around dopamine-dosing tools designed to hook the public.
- Digital Distrust/Divisiveness: Personal agency is reduced and emotions such as shock, fear, indignation and outrage are being weaponized online, driving divisions and doubts.
- Digital Duress: Information overload + declines in trust and face-to-face skills + poor interface design = rises in stress, anxiety, depression, inactivity and sleeplessness.
- Digital Dangers: The structure of the internet and pace of digital change invite ever-evolving threats to human interaction, security, democracy, jobs, privacy and more.

* POTENTIAL REMEDIES
- Reimagine Systems: A revision and re-set of tech approaches and human institutions (their composition, design, goals and processes) will better serve long-term good.
- Reinvent Tech: A reconfiguration of hardware/software to improve human-centered performance can be paired with appropriate applications of emerging technologies such as AI, AR, VR and MR.
- Regulate: Governments and/or industries should effect reforms through agreement on standards, guidelines, codes of conduct, and passage of rules and laws.
- Redesign Media Literacy: Formally educate people of all ages about the impacts of digital life on well-being and the motivations underpinning tech systems, as well as encourage appropriate, healthy uses.
- Recalibrate Expectations: Human-technology coevolution comes at a price; digital life in the 2000s is no different; people must gradually evolve and adjust to these changes.
- Fated to Fail: A share of respondents say all of these remedies may help somewhat, but, mostly due to human nature, it is highly unlikely that these responses will be effective enough.

* BENEFITS OF DIGITAL LIFE
- Connection: It links people to people, knowledge, education and entertainment anywhere globally at any time in a nearly frictionless manner.
- Commerce, Government, Society: It revolutionizes civic, business, consumer and personal logistics, opening up a world of opportunity and options.
- Crucial Intelligence: It is essential to tapping into an ever-widening array of health, safety and science resources, tools and services, in real time.
- Contentment: It empowers people to improve, advance or reinvent their lives, allowing them to self-actualize, meet soulmates and make a difference.
- Continuation Toward Quality: Emerging tools will continue to expand the quality and focus of digital life, and the big-picture results will continue to be perceived as a plus overall.

To read the 86-page official survey report with analysis and find links to other raw data, click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_Home.xhtml

To read the for-credit responses to the main survey question, click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_credit.xhtml

To read a 272-page Expanded Version of the Digital Life and Well-Being report, click here:
http://www.elon.edu/docs/e-web/imagining/surveys/2018_survey/Elon_Pew_Digital_Life_and_Well_Being_Report_2018_Expanded_Version.pdf

Written elaborations by anonymous respondents

Following are the full responses to the secondary question, "Do you think there are any actions that might be successfully taken to reduce or eradicate potential harms of digital life to individuals' well-being?" submitted by study participants who chose to remain anonymous and who included a written elaboration. Some of these may be longer versions of responses that appear in shorter form in the official survey report; also included are responses that were not in the official report.

The head of privacy for a major U.S. technology company commented, "[In the next decade] a few areas will be better addressed, mitigating the effects of digital technology use on eyesight, attention spans, addiction and stress."

A social scientist based in Europe commented, "Adapt or die."

A CEO wrote, "Standards, ethics."

An internet pioneer and longtime engineering innovator wrote, "As AI makes digital applications easier to learn, fix and adapt to us, it will greatly reduce the time spent learning how to use new applications."

An educator wrote, "12-step programs and services to help people cut the cord, so to speak, may help."

The directing manager of a tech-based business wrote, "Digital literacies should be taught as a part of children's educational development with a passing grade required. A comprehensive understanding of how it all ‘works’ is essential. Apps can time out when too much time has been spent on the screen. VR/MR/AR can be adapted as both teaching and wellness tools."

A professor wrote, "Employers should institute electronic communication vacations for the health of their employees."

A technology pioneer investor and philanthropist wrote, "Transparency will help. But transparency without people well-informed enough to understand the situation won't suffice. So we need better education and people (mentally) healthy enough to withstand the seductions of immediate gratification."

A professor of sociology located in Southeast Asia said, "Education is always good. People need to be taught how to use new techs responsibly and safely."

An internet advocate based in North America said, "Education of the end-user on safety and well-being in schools."

A senior research fellow at a university on the Pacific Rim wrote, "Since the number of problems and troubles will become huge, there will inevitably be measures placed in society at large to prevent or solve these problems and harms."

A professor of political science at a major university on the U.S. West Coast said, "Government intervention to place countervailing pressure on platform monopolists. A new model of education for our technologists and engineers should incorporate ethics and public policy. Better investigative journalism should be directed at the tech sector."

An adjunct professor and political leader commented, "Mainly screening for destructive postings and ensuring digital security."

A lecturer in sociology and director at a major U.S. university wrote, "1) The first thing is to reduce children's screen time. Kids should NOT be given screens until their abstract thinking has evolved, and even then, it needs to be limited. No computer in the bedroom. Phones could be programmed to limit who can call/text in and out. 2) Governments need to take seriously the risks of cyberwar by governments and terrorism by non-governmental agents. Invest. Research. Prosecute. 3) Public health messages on distractions should continue."

A longtime Internet Society and Internet Engineering Task Force leader said, "While there are interventions that can be made, most of them are likely to be worse than the disease, particularly putting more power into the hands of demagogues, those with no interest in listening to others, etc."

A research scientist said, "Revising education so it teaches norms about uses of technology, whether it be to prevent bullying, develop better social habits, or think about human and societal impacts when designing products."

A co-founder of a project for global good wrote, "Privacy policies will have to be changed."

An open-source technologist said, "Public awareness of the nature of the internet and communication technologies must be driven by public campaigns. Study of well-being in relation to technology must be done to determine the best actions to take."

The director of a national council for science and technology for a country located in the Global South said, "Two actors will be very important: Governments and consumer associations. Regulations should be one step forward on the effects of the potential harms of digital life. Consumer associations should be testing the market and the society in order to identify potential harms before they affect significant amounts of people."

A futurist based in the U.S. commented, "Like all market systems, digital technology-building has both positive and negative externalities. And, like all market systems, the negative externalities require either social or regulatory action to prevent unaccounted costs to society. We are just beginning to understand these, and are likely to see deeper global efforts to address them in the coming years."

A professor of computer science and expert in security at a major U.S. university wrote, "Better access for everyone, better education for everyone. Reinstitute something like the Fairness Doctrine. Or require labeling/standards for actual news."

A professor and associate dean at one of the largest state universities in the U.S. said, "In our roles as citizens, consumers, workers and community members we have the democratic power to build a more inclusive and egalitarian society."

A content consultant commented, "Education about the harms of the internet could be really helpful, similar to education about eating a healthy diet. Education and peer pressure got people to stop smoking; they could also get people to change internet habits. An international online code of conduct with some enforcement or rating scale would be useful, but that can of worms is so big, it almost breaks my brain."

A senior lecturer at a university in Southeast Asia said, "People should be mentally empowered in the face of internet media. Personal determination should be developed in educational curricula."

A research associate at a major university in Africa commented, "The most important action would be to learn from the mistakes of the current 'older generation.' Much more education regarding online security and safety is needed. More importantly though, early education regarding the effects of physical inactivity is required. A reward system that encourages more activity even while using the internet would be great!"

A futurist/consultant said, "Every innovation is a double-edged sword. It confers benefits, but there will be a downside. Technology takes us away from direct contact with nature, increases stress and tension. It's up to us and how we choose to use technology. We need to be in charge, but it is changing faster than we can keep up. It is creating unintended consequences we never imagined. All issues are intertwined: climate change, threat of nuclear war, AI, internet. Political power is the biggest impediment. We are ruled by a dysfunctional worldview that values profit over people; it skews what the internet does and what it can do. The internet has the power to be much more positive in people's lives but that requires a different political framework."

An information curator and communications specialist said, "Only time will tell us, just like the creators of the printing press and electronic broadcasting (or the telegraph for that matter) did not know what changes would come from them. But those after them made it better."

The director of a research organization commented, "The debate about Net neutrality is critical for the future of the internet."

The publisher of a privacy journal said, "Training for young people to ration the new media in their lives, to use it to keep in touch with family and friends in meaningful ways, to avoid being manipulated by commercial interests online. No one seems to be doing this."

A foundation director said, "As a K-12 student and again while studying research methodology in college, I was taught to be a discerning user of information; something much easier then than now, as I formerly only had to contend with libraries with a couple of hundred thousand titles. So much data (not the same as information) is available via the Internet that my previous skills aren't enough to support my various searches. We all need to be taught to be better consumers."

A retired professor commented, "Providers should be able to better control security and safety for users."

A professor based in Iceland wrote, "First, government and industry have to accept that services provided to the public need to be regulated and designed for safety. Companies can't be allowed to just shrug their shoulders and say that people's safety on the internet is not their concern. Second, people need to be educated about how the internet works so they can better understand the various digital services available to them."

A professor of philosophy at a major U.S. technological university wrote, "There’s a fundamental question that society needs to better confront: As technology advances and becomes ‘smarter,’ are we, human beings, being techno-socially engineered to behave increasingly like simple machines? If this question can be taken more seriously and its ethical-existential-political consequences can be better understood, perhaps society would be better equipped to create policy at the micro, macro and mezzo levels."

An internet pioneer and longtime contributing researcher wrote, “One of the major potential harms of digital life is exposure to inaccurate and/or misleading information – future technologies (e.g., AI, semantic technologies) have the potential to assure greater information/data provenance."

An expert on intelligence and defense policy commented, "There is always something that can be done."

A North American professor wrote, "New technologies are being developed that can mitigate the harmful effects of digital technology. For example, duo authentication can enhance security. That said, it is likely that good and evil will always be in a race."

A research fellow at a major university based in Northern Europe wrote, "I am positive that artificial intelligence can be put in service of humanity in the future. It may help us make wise decisions, understand the motives of other people, and increase awareness of our own biased thinking."

A research associate at a major university based in Southeast Asia said, "Legislation on digital behavior should be put on the agenda. Activities on the internet should be treated like activities offline, and people should be responsible for the consequences."

The director of a psychology research center said, "We are, as a society, woefully negligent in preparing people, especially young people, to manage technology. This is the equivalent of letting people drive without training or having them jump into the deep end of the pool without a swimming or lifesaving lesson in sight. Media and technological literacy and digital citizenship training need to be integrated across all grade levels. Media literacy is not just evaluating media content and digital citizenship is not just about cyberbullying. This training needs to be based on: 1) the psychology of human behavior, such as understanding how the brain reacts to virtual behaviors, the cognitive biases that interfere with critical thinking, and an emphasis on self-regulation and self-efficacy, and 2) understanding how technology works at practical and theoretical levels, from privacy settings to algorithms."

A lecturer in digital and social media commented, "A substantive rethinking of design principles and the true potential of these technologies, beyond the limiting visions of IoT and social media, is necessary."

A professor of public policy said, "I have to believe it will be possible for us to gain better control over digital technology. For the most part, our history suggests that we have been able to figure out how to tame our inventions before they destroyed us, though I'll admit it's not a sure thing."

An anonymous respondent said, "Remove Republican control of the government, pass legislation overturning Citizens United, and reinstate net neutrality."

An internet pioneer and longtime participant in internet engineering leadership wrote, "To the degree that life is enhanced as opposed to being interrupted, the right thing is happening."

A professor of information technology wrote, "Better ergonomic design of computers and other digital devices will make them easier to use and less harmful."

A research scientist based in North America said, "Moving beyond passive tracking and encouraging motivation for behavioral change."

A teen library specialist responded, "There are always actions that can be taken to mitigate harm. I believe that as an individual I can do things to mitigate personal harms to myself and those around me. And I believe as a society we can adjust norms or even make laws to mitigate potential harms. Never mind the way that digital futurists and engineers and designers can help mitigate potential harms."

A writer/editor based in North America, wrote, "We need greater competition in social media and other services online. There have always been trolls, but when the internet's content was less centralized it was a lot harder for them to cause chaos. Many different communities operating independently, preferably running on open source code, would solve many problems. I don't see it happening, though."

An associate professor of business at a major U.S. university said, "Intervention can happen, but I don't think that it's likely. We need to have some sort of public campaign that demonstrates the downsides of hyperconnectivity. Parents, especially, need to know the consequences and be willing to limit screen time."

An executive director predicted, "As people discover actual or potential harms, they will adjust either the technology itself or their use of the technology. I know a lot of people who have just abandoned social media because of the stress and discomfort it is bringing to their lives (adjusting their use of technology) while on the other hand Facebook, Google and Twitter are clearly having to change or are contemplating changes to algorithms, policies and technical affordances."

A writer and researcher replied, "The safety net must be expanded."

A futurist commented, "Awareness of the risks and negative externalities."

A research leader at one of the top five global technology companies said, "Although we can’t restore the world for which we were designed by evolution, we can certainly mitigate our painful transition to the new world and cushion the shocks that we have begun to experience. Restoring Net neutrality could be an example. The EU’s GDPR initiative is an experiment in pushing toothpaste back into the tube and bleeding the large tech companies that are rocketing us into the future; we will see how it plays out."

An internet pioneer and advocate based in Australia wrote, "Regulatory actions will be essential to continue to protect human rights online, as they are protected offline in many jurisdictions. This includes regulation of monopolies and of anti-competitive and anti-consumer behaviour."

A distinguished advocate for the World Wide Web and policy director based in Europe said, "There is a clear role for governments to appropriately regulate digital technologies in ways that allow them to function and support growth and innovation while ensuring that they promote the maximum social good and minimal social harm. This balance is currently not the case in most countries, particularly the U.S. Companies also have to do their part and take on corporate responsibility for any negative effects of the technologies that they are creating and implementing. Facebook's initial response to critiques about its role in affecting the U.S. election demonstrates the prevailing mentality among many tech companies regarding their roles and responsibilities. This has to change."

A technology developer/administrator based in Oceania said, "Society needs to adjust to technological changes; this will come with time and experience, and hopefully not through regulation or over-reaction."

A post-doctoral fellow at Stanford University commented, "Certainly people will need to consider some form of self-regulation or digital diet. I know my husband and I do our best to put our phones away when with the kids or at the dinner table. We keep screens away from the kids. More structurally, it would help for citizens to have access to some form of digital literacy, including the kind that protects people from falling victim to scam or disinformation, or else to learn how to keep their data secure."

A professor wrote, "There are known steps and actions that can be taken to limit the degree to which discrimination and hyper-surveillance can occur through emerging technologies."

A scholarly-communication librarian said, "There can be more sanctioned support at all levels for victims of online harassment, including punishing the harassers. Knowing that there is a system that can be appealed to in cases of harassment and violence – that actually BELIEVES the victims and ensures that their concerns are not brushed off as ‘just the internet’ – can be reassuring for all. Having consequences for harassers may prevent most of the 'easy' online harassment that can destroy lives now."

A postdoctoral fellow at Stanford University commented, "Create multiple, robust Webs so that we are not reliant on a Net that isn't neutral. Regulate surveillance of content and gathering of metadata beyond simply asking for consent. Encourage the development of communication tools by and for demographics that are generally not served or benefited by them – this needs to be a not-for-profit project and it needs to involve women, minorities, activists and the Global South (these projects exist and they need more funding). Support labor-organizing and rights for workers in the factories making the devices. There should be transparency and audits of hardware and software."

A medical and/or mental health professional based in North America wrote, “Now that we see the power and pervasiveness of technology and see the downsides, we can apply experience and knowledge to keep us grounded in the physical world and continue the advancement of technology. An essential component of this is how we maintain the inherent democratic nature of a non-hierarchical internet."

A distinguished internet advocate and policymaker wrote, "Compulsory education in Internet use and cyber security should become part of the primary school curriculum. To include: online safety, health and well-being."

A vice president at a major entertainment company in the United States commented, "For years internet firms took no action to address child sex abuse imagery, claiming it was not theirs to do. Once they (finally) turned to it they have been able to come up with things to do. Problems can always be solved once people put attention and money towards them."

The head of research and instruction at a major U.S. university wrote, "Understanding the strengths, drawbacks and underlying structures of technologies and their applications is crucial – especially for younger people, who might not have alternative mental models, and those less familiar with tech. But attentiveness to changing habits and to what kinds of spaces aren't being created is integral to interventions that will lessen harmful effects or help recapture kinds of attentiveness that might otherwise be lost."

A retired internet activist and advocate said, "Broad guidelines and policies need to be created to outline acceptable boundaries for communication. Such guidelines/policies should help audiences and providers address such issues as fake news, privacy invasion, human trafficking, surveillance abuse, tax evasion and money laundering."

An internet pioneer and social and digital marketing consultant commented, "There will be new filters and timers to help people who need assistance to manage their exposure. But there will also be a resurgence of people rejecting the overwhelming pervasiveness of digital in our day-to-day lives."

A research scientist based in Europe commented, "Policymakers need to be at the forefront of these innovations. Some of the new technology being created exists in loopholes and operates in areas where there is little policy oversight. There should be. Policy should work towards constructing the frame on which technology sits, rather than being reactive, after the harm has been done."

A professor of psychiatry at a major university in South America commented, "It is important to increase awareness about the harmful effects of technology. People should have the discipline to use it wisely. Employers should not force their employees to be connected all the time."

A research scientist said, "Media and critical literacy, i.e., more awareness of what the norms and circumstances of the online economy, environment, are. This is not so much about ‘teaching’ adults, which is highly unlikely to be effective, but more about the industry developing techniques to clearly distinguish, e.g., ads and editorial, sources of content, etc."

An anonymous respondent wrote, "I believe awareness is the first step. Education on sensible uses of technology and the acknowledgement of potential harm are needed."

A solutions consultant based in North America wrote, "Better training and education of parents and children on the harms and benefits of technology, focusing on its positive use. More visibility for parents of their children's and, for that matter, their own use of technology. Ways that you can measure your own ‘addiction’ readily and see that you may be overdoing it, maybe even setting goals so that you can tell when you've gone over the line or your children have. A technology self-limiter needs to be pervasive, not app by app or site by site, but rather something that's embedded in our culture."

An anonymous respondent commented, "The interventions would be aimed at countering the brain's intense confirmation bias. They would be school-based and would focus on media literacy and general education in critical thinking."

An assistant director of digital strategy at a top U.S. university wrote, "Allowing for better methods to openly flag fake news and block foreign infiltration in social media/political influence would improve digital life."

A college administrator based in North America said, "Individuals must develop necessary emotional intelligence in an effort to remain self-aware in making decisions on level and extent of online engagement. We must set limits, know when to take needed breaks and stop at times to refocus on other life-sustaining activities."

An anonymous respondent wrote, "Parents need to keep kids from getting addicted and lead by example. However, I don't see this happening. Adults are almost as bad as kids. I used to teach, and except for while the students were taking exams, there was no way to keep them off their devices. I also don't see how to get people to put down their phones in public. Even where there are laws about using devices in a car, people still use them. Maybe society will just adjust eventually."

A CTO and attorney based in North America wrote, "I have very little hope that remedial measures will occur, however they are possible. The greatest one would be for people to learn, from childhood, that electronic interactions need to be leavened with face-to-face interactions. Absent a comprehension that one is dealing with people who in many regards are just like one's self we will see a continuation of shallow, abusive interactions that fail to reach understandings and agreements."

A Ph.D. candidate and information science instructor said, "We need more research."

A deputy director at a nonprofit based in the United States wrote, "To date, ‘digital life’ has literally been built by human activity, across academic, government and commercial entities. That human agency means we can choose to make different apps, services, devices and approaches to applying technology in different sectors of society. None of this is foreordained or fate. The technology giants of today and tomorrow can and should recalibrate to encourage conscious consumption and intentional use that leads to meaningful, positive experiences and offline connections instead of incentivizing passive consumption of the curated feeds of others and demanding attention. Employers, schools and families will need to develop and encourage healthier social norms that integrate the use of phones and wearable computers into modern life in ways that bring people in from the cold."

A career coach at a major state university in the United States wrote, "Education is key. Teaching people in concrete ways about how to communicate effectively and how to understand using digital tech is the best tool I've seen and can think of. Beyond that, people will learn protocols that are appropriate to them only by messing up. Some of those messes will inevitably cause problems, but these are the same kinds of lessons we as a society have had to learn with every technology. I don't see digital technology as any different."

A chief data officer at a major university in Australia wrote, "There is a substantial role for government and regulation to ensure that we experience beneficial outcomes from technological advances. This is especially true in the area of privacy, where the EU's GDPR is leading the way. Areas like blockchain and YouTube would also benefit from regulation."

An attorney wrote, "As we learn the ways in which people are being harmed, we can work to try to eradicate those sources of harm. This is likely to be an ongoing cycle, and it's possible that not all the sources of harm can be eradicated and that maybe the harm will win in the end; maybe ‘fake news’ cannot be stopped, maybe cyber threats and security attacks will be the end of us all (not that most of us haven't already had our identities or passwords stolen), but I would like to think that there are ways to make technology work better for us and improve our way of life. Even if technology itself cannot be improved, the people who use it can band together to try to work as a counter-influence against the harm as well; for instance, users can form support groups more easily and communicate with others who may have a better understanding of their lives and problems than they ever could before."

A manager of digital and interactive strategy based in North America wrote, "In one of my master's classes in information science, the topic was ethics around tech. For instance, if there are two autonomous cars and an accident can’t be prevented and there will be injuries, which car should sustain less damage than the other? Answers are based on subject-matter experts in ethics, as well as the opinions of a panel polled on these types of questions. Ethics always comes into play, and this will help mitigate risk. Information professional societies have codes of ethics that members should abide by."

A CPA based in the U.S. commented, "As the technology gets better and better, will significant numbers of people become addicted to a digital life? Will content providers build in automatic shutoffs? Will an underground market develop for content that won't shut off? In the next decade, there will be unexpected actions that will reduce certain risky aspects of digital life and there will also be unexpected actions that will be extremely harmful and society won't be prepared. Virtual reality immediately comes to mind."

A technology developer/administrator based in North America said, "People can always get back to basics with regard to technology, be more critical of how they use it, be more aware of how much is too much."

A distinguished technologist at a major tech company in the U.S. wrote, "Privacy and security are incredibly important in mitigating harm in the digital world. Also, as AIs become more common and important, we need to have visibility to how algorithms are making decisions and what happens to our data. The U.S. could follow the EU and enact GDPR, or something similar (https://www.eugdpr.org)."

An educator at a major state university in the United States wrote, "We need to take more responsibility for our own security and safety online, but there are also things that will force us to do that, hence two-factor authentication and password managers. I love both of them; they help me think twice before acting."

A data analyst said, "To reach the point of policymaking requires society to have discussions within itself about how these new technologies are to be received and used, even if it is just teaching people (adults and children) how to behave. It's the human element that needs to be considered as we continue into the digital life. As life becomes more digital, a lot of potential harms can be mitigated or removed by having conversations about what the existing technology means and creating actions and policies from those conversations. One of the looming issues that will need to be dealt with is the sheer number of Wi-Fi-connected devices that have no built-in security. All it takes is one device to be hacked on a network to give someone access to everything. This is something that can be addressed BEFORE a device is put to market and can be reinforced by policies creating a standard level of security the product must meet."
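The pre-market security bar this respondent describes can be illustrated with a small sketch. The checks, field names and device spec below are purely hypothetical assumptions for illustration, not any real certification standard:

```python
# Common factory-default credentials that would fail a baseline check.
DEFAULT_PASSWORDS = {"admin", "password", "1234", ""}

def baseline_security_failures(device):
    """Return the list of hypothetical baseline checks a device spec fails."""
    failures = []
    if device.get("password") in DEFAULT_PASSWORDS:
        failures.append("ships with a default or empty password")
    if not device.get("supports_updates", False):
        failures.append("no mechanism for security updates")
    if device.get("open_ports"):
        failures.append("unnecessary open network ports")
    return failures

# A device like this would be exactly the weak link the respondent warns about.
print(baseline_security_failures(
    {"password": "admin", "supports_updates": False, "open_ports": [23]}
))
```

A policy-mandated standard would amount to requiring an empty failure list before sale.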

A futurist and consultant based in Europe commented, "Tech is both our best and worst friend. I can imagine many ways to make it our best friend. 1) Make it stop if over-used. 2) Initiate self-governing rules and self-learning AI rules to avoid things like bullying, etc. 3) Deep-learning fact-checking to avoid fake news. 4) Create social citizenship as part of any action relevance."

A professor wrote, "The internet and other electronic developments are already more intrusive than is desirable. We need less, not more, intrusiveness in our lives."

A technology developer/administrator said, "Education, starting in K-12, is a powerful tool to progressively familiarize (potential) users with the benefits of digital technologies. Nothing is more worrisome than the fear of the unknown."

A research scientist based in Oceania commented, "One reason the internet is so distracting is that the more that websites can get our attention the more money they can make from advertising and related forms of data mining. Having more effective adblocking will take away the motives to distract us, but will reduce revenue and content. My view is that widespread effective adblocking is justified and will be beneficial, because the alternative would be for AIs to be geared towards encouraging us to spend money and resources on things we do not need. This would increase adverse impacts on the planet and risks associated with that."

A technology developer/administrator based in Europe said, "In science, new barriers have been erected by companies competing in the market, led by the two world publishing leaders Elsevier and Springer. If you are the author of an article, you may be asked to pay 15€ to access your own work online! In other terms, personal intellectual property has been taken away from scientists and money made from it, with no fair sharing of the value with science and scientists! Fortunately there are alternate models such as researchgate.net, where researchers can share their results and publications."

A retired research scientist and university instructor wrote, "We need to develop legislation that integrates the reasons for which information is collected into systems that collect information indiscriminately. I also feel that information collection by corporate interests that integrates information across various sources is overly intrusive."

An anonymous respondent commented, "There are ways that can help mitigate potential harms of digital life. Facebook and Instagram are great examples in that they have implemented suicide prevention tools that help monitor and flag disturbing behavior. Twitter's implementation of verified check marks helps users confirm reliable sources."

An anonymous respondent replied, "Many of the harms are a subset of wider economic harms. The interventions, in that case, need to address the wider economic harms. The internet is a mirror of wider issues. It is challenging for people around the world who want to be aware of real-life issues when there is a risk that the information is not factual; there is also a risk of oversaturation that creates difficulties you cannot fix. There may be some opportunity for being effective, but it is very hit-and-miss. Stupidity as a distracting entertainment is masking serious issues in the United Kingdom, U.S., Australia and probably elsewhere. We need a different economics. If we can use the internet to generate an inclusive, ecologically-grounded and humane economics, it would underpin a shift in the mentality of the internet and the wider media as an information space. There seems to be a lack of responsibility for security by service or product providers regarding customer data. As examples, PayWave's lack of security, the hackable Internet of Things and massive password data breaches are all indicative. I am not sure how to shift those priorities in a context where governments are not as active in the public interest. There are challenges around identity, people pretending to be other real people in order to use a spoof account to defraud or otherwise con people. This happens on Facebook, for example, and there are some active spoof profiles people have been asking Facebook to delete for 10 years. Tensions between freedom of speech and hate, racism, bullying and false information are hard to curate effectively and fairly. This is a mirror of wider issues in society. 1) Youth – the freedom to express themselves and also be safe with each other as well as with the wider community. 2) Gender – the freedom to express themselves and also be safe with each other as well as the wider community. 3) Faith communities – the balance between rights to believe and tolerance of others.
There are some evidence-based, science-based ways of maintaining a level of trusted, objective truth. Wikipedia is a powerful project in this regard. Today we see shifts in the level of understanding of science – flat Earth, etc. Is education as a business [the online platforms sharing a wide range of information as fact that actually falls all over the true-to-false scale] replacing traditional education, which seemed to be a social infrastructure we can use to keep society coherent and civil? Governments have been defunding objective expert scientific opinion because the opinions they are responding to are those of donors, sponsors and multinational money, not the public interest or ecological interest. Funding becomes more directly commercial for science, which causes tensions for objective science. It takes more information literacy to navigate for facts. I do not know how to turn that around, but it is important for the way our digital life informs the future of the planet and its human communities. Communities that are about making things seem to be more cohesive… Negotiating a shift from capitalism into something with a real planetary equilibrium is the task for our generation. Perhaps the internet can help with that. Some governments have been co-opted and cannot deliver. It is probable that the internet as a tool for change is the reason why they are blocking net neutrality despite the fact it would be bad for business."

A research scientist said, "I, for one, am doubtful when Facebook says they are working on improving the algorithms to weed out false accounts that purvey false news. When I look at the work some non-profits, educational institutions and individuals are doing to easily root out fake accounts, the only conclusion I draw is that it is not in the interest of Facebook, Google and Twitter to stop, or even reduce, this revenue stream."

A multimedia journalist said, "It is up to each person to judge their screen time."

A professor of computer science at a major university in the United Kingdom wrote, "Action on data privacy, trust and security to ensure digital well-being for all."

A blogger and policy and budget analyst and advocate based in North America wrote, "If digital is to actively engage the masses, there must be efforts undertaken to mitigate potential harms. I cannot foresee how the internet will continue as a robust component of life for so many if harmful actions – identity theft, bullying, harassment, to name a few – continue and grow. As much as my professional life has benefited from digital, I would return to old-school if digital became the Wild West for users."

A professor wrote, "New interfaces based on voice and gesture commands will become more sophisticated so that reliance on keyboards and screens will decrease. Our digital ‘diet’ will become more apparent with new guidelines for healthy patterns of use. New apps will become more analytic, alerting us to the health of our financial affairs, personal health and well-being and in so doing liberate more time for personal enrichment, exercise, time with family and friends."

An IT systems engineer wrote, "There are things that can be done but this won't be easy, and it will require deliberate effort. I don't think our society will take the tough route. The lull of the easy road will lead them to harm."

A research scientist and professor said, "Regulatory interventions in areas such as privacy, hate speech, manipulative and deceiving practices by online services, etc."

A research scientist based in Southeast Asia commented, "Following the behavior of the users, the social scientists and computer scientists can come up with solutions for the betterment of society."

A social media manager wrote, "People can reduce the potential harms of digital life by unplugging more often and being mindful of their technology use. I have reorganized the apps on my smartphone so the fitness and education apps are within closest reach and the social media apps are more difficult to access, which encourages me to focus on things that will provide the greatest long-term benefit. We can also turn off social media notifications and use tools that lock apps at certain times or use software like f.lux that dims blue light in the evenings."
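The app-locking tools this respondent mentions come down to a simple time-window check. A minimal sketch, assuming a hypothetical overnight "focus window" and made-up app names:

```python
from datetime import time

# Hypothetical overnight focus window; the app names are illustrative.
BLOCK_START = time(21, 0)  # 9:00 p.m.
BLOCK_END = time(7, 0)     # 7:00 a.m.
BLOCKED_APPS = {"social_feed", "video_stream"}

def is_blocked(app_name, now):
    """True if a listed app falls inside the overnight block window."""
    if app_name not in BLOCKED_APPS:
        return False
    # The window wraps past midnight, so it is the union of two intervals.
    return now >= BLOCK_START or now < BLOCK_END

print(is_blocked("social_feed", time(22, 30)))  # late evening: blocked
print(is_blocked("social_feed", time(12, 0)))   # midday: allowed
```

Real app-locking software adds enforcement and override friction on top, but the scheduling logic is essentially this check.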

An executive director at a major university in California's Silicon Valley said, “Greater consciousness about how much attention we are giving to our digital devices, unnecessarily and unconsciously, would help mitigate the sense of distractedness from other people."

An anonymous respondent said, "A deeper understanding through additional research and scholarship of the socio-cultural and psychological effects of digital technology will inform our use of these technologies in the years to come. I believe this will lead to improved education and more effective and informed parenting. An increasing focus on the role of the Big-Five tech companies will shape how they behave in the years to come. We see this beginning as Facebook, for example, addresses issues of Russian intervention, fake news and more in private and in their public-facing posts. With increased pressure, I believe these companies will address their responsibility for the content on their platforms along with other critical issues such as privacy, access and the potentially addictive nature of product design."

A president and CEO of a company based in the United States wrote, "There are many ways to mitigate the potential harms of a digital life. Take, for example, the tension between our First Amendment rights and the impact facial-recognition technology has had on journalists and citizen reporters. These individuals, who play an important role in holding public officials and governments accountable, are being arrested after being identified with the use of technology. American citizens don't often know what dangers they face when they step out of the door of their home to join an assembly of like citizens to protest certain actions. Knowing your rights is an important aspect of our lives in today's environment; and education, not just awareness, will be important to help mitigate the potential harms of a digital life."

An internet pioneer and consultant who has been a leader in helping people globally become connected online wrote, "The fact that there are possible interventions for good does not guarantee that they will be effected or that they will not be countered by forces against good."

A professor wrote, "Simple awareness about digital effects is an important starting point."

A director of data scientists at a North American university wrote, "Finding ways to limit engrossment and pervasive impact from digital technologies will be vital to reduce the negative impacts."

An anonymous respondent commented, "Voting systems must produce paper copies of ballots, which can be verified by the voter before leaving the polling place; copies of these must be saved by the voting officials in order to be able to verify reported results."

A professor at a major university on the West Coast of the U.S. wrote, "There are interventions that can be taken (somewhat similar to various interventions available for gamblers, drinkers and smokers). This doesn't mean that people will use or accept them, or that they will have much of an effect. These can include software to limit amount of use, or block use during certain times (such as some college software that blocks use during class time), or better opt-in instead of opt-out features. A Fitbit-type device could be linked to digital media use to track and warn. But the essential nature of social and digital media is being embedded in diverse networks of other users and resources, so no individual action is going to help much. Research clearly shows that disconnecting, even for a short time, is way too painful for the individual, and for the other members of that individual's network."
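The "track and warn" device this respondent imagines reduces to accumulating usage against a budget. A toy sketch, where the class, budget and message wording are all illustrative assumptions:

```python
class UsageTracker:
    """Toy daily screen-time tracker that warns past a user-set budget."""

    def __init__(self, daily_budget_minutes=120):
        self.daily_budget = daily_budget_minutes
        self.minutes_used = 0

    def log_session(self, minutes):
        """Record a session; return a warning string once over budget, else None."""
        self.minutes_used += minutes
        if self.minutes_used > self.daily_budget:
            over = self.minutes_used - self.daily_budget
            return "Warning: %d min over your %d-min budget" % (over, self.daily_budget)
        return None

tracker = UsageTracker(daily_budget_minutes=60)
tracker.log_session(45)          # under budget, no warning yet
print(tracker.log_session(30))   # total reaches 75 min, so a warning fires
```

A wearable version would feed session lengths in automatically; the warning logic stays the same.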

An owner of a tech company based in North America said, "Education. From early childhood onwards."

An anonymous respondent said, "It is important that we actively think about how to mitigate potential harms of digital life. We need to address issues like ethics, privacy, security, and fairness."

A technology developer/administrator said, "More regulation and best practices can help reduce the impacts of cybersecurity events. Creating some set of liquidated damages regulations for industrial equipment and power grid operations should cause insurance companies for these entities to force them to more-secure systems. Social networks are a scourge, especially for teenagers, but it’s hard to see how they can be regulated, due to First Amendment rules. Perhaps limiting use to adults over age 18 would be useful but hard to enforce."

A professor at a major state university in the United States commented, "The number of people who wish to do harm or otherwise unfairly benefit from online activity is small in comparison to those who don't. Increased regulation and severity of punishment for those who break laws will help. Better education and tools for managing identity and security will help as well."

A futurist and researcher said, "Yes, but if and only if internet access is redefined as a public utility."

An anonymous respondent said, "I don't see clear solutions to current issues, such as hacking. But, I think/hope that people will be able to find solutions in the near future, although I feel that it won't be easy."

A leader who works at one of the top global internet administrative organizations wrote, "Security and authentication measures (e.g., fingerprint or facial recognition) will improve and be more-user friendly thus allowing stronger measures to be used without unduly burdening the user. Privacy protections should benefit from improved data-protection measures at the software and hardware levels. Legal privacy protection measures will add to a trend toward protecting users’ rights."

A professor wrote, "Online connectivity is not always a good thing. The responsibility is with us all to prevent misbehavior and bullying by individuals and groups. Deliberately spreading one-sided or totally wrong information should not be allowed, and online communities should have moderation for such acts. Sanctions should be made tougher."

A faculty member wrote, "Democratic governments need to create standing committees to monitor tech, and legislation should be kept up-to-date. Intelligence agencies should combat cyberwarfare, agitprop, etc."

A president at a company wrote, "More control over what shows up on my screen, less anonymity for those who do harm."

A technology developer/administrator based in Europe said, "We have the means to hold people responsible for misbehavior, even if that means regulating Facebook and the like.”

A director of state library services based in the United States commented, "Actions taken to mitigate potential harm include the reintroduction of library media specialists in schools (K-12), who can teach digital citizenship, digital and media literacy and information literacy. Libraries of all types have an essential role to play in mitigating potential harms of digital life if they are well-supported and recognized."

An anonymous respondent commented, "Like any education campaign, this one might only have a slight impact (and a hard-to-measure impact at that). Look for historical analogies in this area before pronouncing judgment. I think of anti-cigarette advertising and packaging warnings as a possible analogy, one that seems to have a positive impact but over a very long time period, one of many decades. Digital ‘hygiene,’ that is, doing all you can as an individual to protect your personally identifiable information, is just such a thing."

A research scientist said, "Educate the public and regulate use in school."

A professor wrote, "Education. Banning technology from some environments (like schools). And trying to find ways to disallow anonymous posting in online forums."

A principal engineer at a major global networking company wrote, "Every bit of change has an impact and leaves some of the people struggling with the change. Such folks will and should get help in dealing with the change."

A college professor based in Southeast Asia wrote, "I heard that some researchers are initiating a movement called ‘slow academy.’"

An anonymous respondent commented, "Better protection of personal data and privacy, better cybersecurity prevention, better clarification of sources of information."

A futures thinker who writes major reports for a top consulting firm commented, "See Zeynep Tufekci's book ‘Twitter & Tear Gas.’"

A business leader based in North America wrote, "Adherence to privacy laws, firewalls, role and context authentication, dual authentication and continuously updated encryption and data protection."

A research associate at a major university in Africa commented, "The most important action would be to learn from the mistakes of the current ‘older generation.’ Much more education regarding online security and safety is needed. More important though is early education regarding the effects of physical inactivity. A reward system that encourages more activity even while using the internet would be great!"

An entrepreneur and CEO based in North America wrote, "You could unplug, but at a cost."

A professor wrote, "Continued awareness of the negative effects of social media, smart phones, digital technology, etc., may produce innovations that address the concerns. Or they might not."

A professor emeritus said, "Ongoing development of AI."

A research scientist commented, "Proper education about posture and musculoskeletal positioning while using technology would help decrease harmful effects."

A professor said, "More regulation is needed to help govern data sharing. Ethical principles need to be incorporated into schooling and business guidelines."

A professor wrote, "The actions that are most important are government policies to address inequality and injustice."

A North American research scientist said, "Privacy protections should be improved. The risks of training the models on the existing data with its discriminatory biases must be identified and mitigated."

A research scientist and internet pioneer commented, "There are many things that could be done; the question is whether they can be achieved. For some people, usability is a major barrier. More attention to reducing complexity and other barriers to use should be a high priority, but I don't see the private-sector creators of the internet experience motivated to make this a priority; the addition of new features (which adds complexity) seems more important to them than ease of use. Disciplining online misbehavior will call for thoughtful reconsideration of how applications modulate the internet experience – a careful balance of accountability and freedom of action is required. I do not yet see a motivation for the private sector to give this priority, although this may change. Overall, I do not see the private-sector battle for market share aligned with the steps that might address some of the negative impacts on well-being."

A professor of information sciences wrote, "Increased regulation of privacy and security of personal information. Increased decentralization of technology. Increased average level of technological knowledge and facility. Increased incentives for diffusion of medical and health technologies."

A professor at a college in North America commented, "We need to teach young people better about how tech companies design devices to keep us tethered to them; we need to use government regulation to demand that tech companies mitigate the harm they cause by not regulating the activity on their sites; we need to propagate the idea that disconnecting, being more aware of one's use and balancing activities is a social value."

The executive director of a Canadian nonprofit organization wrote, "Despite my optimism about technology, I do recognize that there are harms related to digital life. Chief among these are the perpetuation of existing social inequities on a new platform. I don't know what actions can be taken to mitigate these issues, but movements like #MeToo give me some hope."

An anonymous respondent commented, "Media platforms need policies to assist in wise monitoring and guiding of the kind and volume of information pushed onto sites."

An anonymous respondent said, "In terms of security against identity theft, stalking, etc., I hope there is progress. I hope places that jam cell phones become popular, that unplugging gets to be a draw due to popular pressure. Not counting on it!"

A professor wrote, "People can be made more aware of the cost in time their engagement with social media is taking by considering concrete statistics of usage time. But this will only help those people who have the ability to control their habits and behavior. They will become more successful, while those who cannot control their habits will experience more failures."

An anonymous respondent said, "Eliminate anonymity and the use of aliases on the internet. Make sure that everybody is as visible and known as in real life. Uphold libel laws and hate laws in every country, similar to those of France and Germany."

A professor based in North America said, "We need to learn to balance learning from the internet and living in the real world."

An anonymous respondent said, "There are actions that can be taken to mitigate potential harms of digital life in place already, but internet users disregard them. Parents don't police their children as carefully as they should, and users of all ages still click on links taking them to harmful websites or misleading sites in spam email. I don't know if people will ever learn that if it sounds too good to be true, it's probably a scam."

An anonymous respondent commented, "A more measured approach to technology would mitigate potential harms, but I doubt most of us have the self-control."

An anonymous respondent said, "Public policy. Especially laws that make it harder (and therefore less profitable) to collect information about people based on their use of the internet and re-purpose that information in ways that the provider of the information did not intend (and probably would not have consented to, if the content and consequences of their ‘voluntary’ disclosure were made clear to them in context and in non-legalese terms)."

A North American research scientist said, "Focusing on the dangers of excessive use and misuse will help, just as it helps with any other technology or potentially addictive-like activities."

A chief of staff for a nonprofit organization wrote, "There is a fine line to walk between restricting freedom of speech and restricting hate/inflammatory speech. Digital has not caught up in how to do this on a mass level."

A North American businessman wrote, "There are DEFINITELY a number of actions that can be taken to reduce harm from digital connectivity technologies. For starters, the government can start treating platform companies as media organizations and hold them to similar standards of responsibility for the content they host, whether that involves extremist or exploitative content on YouTube, hate speech on Twitter or foreign propaganda on Facebook. Second, governments can further regulate data protection requirements along the lines of what GDPR is seeking to accomplish. Companies are collecting more and more user data, but are not held accountable for how they store and handle that data, leading to massive insecurity of personal information online. Governments can create *strong* incentives and liabilities to keep that data safe and secure. Finally, the web can generally move toward more human-centric designs that celebrate individuality rather than attempt to put people in pre-defined categories for ad targeting purposes. While this might represent a momentous shift from the current model, I think advertisers themselves can demand it, as it would reduce the propensity toward trolling and extremism that we see today."

A professor/teacher who lives and works in Europe commented, "Better education about digital technology, discussing tech in societies, interest of law makers in technology."

A professor of political science said, "It will be difficult both technologically and politically but, with the right commitment, appropriate steps could be taken."

An associate professor at Texas Christian University commented, "It's really up to education to teach people to look critically at information. The solution is not more technology, but the responsibility of the individual to navigate and decipher information and use it as a powerful tool to benefit themselves. Those who cannot do this will be left further behind."

An assistant professor wrote, "Find ways to reduce/police ‘trolling’ on social media and to remove bots."

A professor/teacher located in North America, wrote, "Keep Net neutrality. Make greater efforts to prevent hacking."

A professor wrote, "Internet giants have to be held responsible for algorithms that only have commercial intent (clicks and ads). Just like a newspaper is not a paper company, Facebook is not only a tech platform. They are not ready to sacrifice any of their huge ad earnings to assume responsibility in the widespread harm done by promoting fake news, bomb-building, etc. Legislation should apply a minimum journalistic standard to social media companies to force them to track and rein in the worst abuses, or social media as we know it has to collapse and be re-invented."

A futures thinker and consultant based in Spain wrote, "Cybersecurity should be improved at all levels."

A communications professional said, "Education about what digital technology is doing to us is getting better."

An employee at a major U.S. research lab wrote, "‘For every action, there is a reaction.’ There will be interventions to mitigate some of the harms."

A research scientist based in North America commented, "Care needs to be taken to improve the truthfulness of information that is posted and shared on online platforms. Additionally, advances in AI must be carefully tracked to catch biases that develop."

A professor based in Europe wrote, "Companies and organisations have to take the lead here. Trade Unions (such as they are today) should also take a stand. The erosion of leisure time and the bleeding of work into any and all aspects of life can only be halted by those who are in charge. We need programmes of well-being and self-care that teach us to unplug and walk away. We need workplace policies that define when we are NOT expected (or indeed allowed) to respond to digital communications (and other forms of communication)."

A North American researcher wrote, "Parents need to step in and limit their children's time online. Kids still need to physically go outside, and interact with each other and the larger community. It's time to break the cycle of the smartphone/tablet as ‘babysitter.’"

A chief technology officer at a major global telecommunications company observed, “The technologies that enable the rapid and pervasive introduction of socially impactful developments are, for the most part, open to all. The same machine learning that allows Facebook to target content at particular users (with both positive and negative effects) can also be used to analyze these actions and their consequences."

A research scientist based in North America commented, "Increased collaboration between technology firms, regulators and public-interest groups could help us better understand the pros and cons of new technologies and develop regulatory frameworks that support innovation while at the same time ensuring appropriate consumer protections."

A professor based in the United Kingdom commented, "There are a range of measures that could help such as employee well-being programmes and the provision of telehealth within supportive community care provision."

A professor based in North America wrote, "Similarities to behavioral health interventions are strong for internet interventions. They mainly revolve around individual responsibility and moderation. For those who cannot set personal limits on, say, entertainment activities at the expense of more meaningful interactions, the internet will ultimately lead to unfulfilling lives."

A transdisciplinary faculty member at a major research university said, "We need to rethink the policies and structures that are already in place at the systems level. Add-on identitarian inclusion doesn't work. The internet was/is designed by (mostly) white, Christian, cis-gendered, heterosexual males, giving it a particular flavor or world view. I don't see this happening in the U.S.'s current mode of reducing access via bad policy (Net neutrality), large tech companies' unwillingness to create better versions of themselves, etc."

A research scientist said, "Shut down Twitter, for one."

A professor based in North America wrote, "As Isaac Asimov once said, ‘Only entropy is inevitable.’ Many actions can and must be done. There are no panaceas, and there are various actions at the macro, meso and micro scales that need to be undertaken in concert. There isn't enough space here to elaborate. The final chapter in Frischmann and Selinger's forthcoming book ‘Re-Engineering Humanity’ outlines some actions, but it isn't really a comprehensive or sufficient plan."

A research scientist commented, "There could be actions taken at schools to educate young people about how Instagram and Facebook aren't real life."

A professor wrote, "Media literacy is now more important than ever. People could benefit from learning how to use technologies to their fullest advantage (e.g., the ins and outs of smartphones), how to actively search for information relevant to them, and to assess its veracity and credibility, and how to avoid some of the psychological pitfalls of the use of social media platforms (e.g., constant comparison to peers, bullying)."

A research scientist based in North America replied, "Honestly? I have no idea what steps to mitigate the harm of digital life would actually look like. Not the faintest clue. But, to say that there are no such steps seems like a gratuitous level of pessimism. I mean, I'm sure there is something that might be done. Good luck figuring it out."

A professor at a major university on the East Coast of the U.S. wrote, "1) Strict liability needs to be placed on the sources of cyber risk, not simply shifting responsibility for risk remediation to consumers. 2) Basic technologies should be developed to remediate software development processes that preserve vulnerabilities at the level of language primitives (e.g., require type safe languages be used to develop applications used in critical infrastructures), and create new oversight mechanisms allowing non-specialists to make more informed risk decisions. 3) Government must ensure that market incentives do not propagate vulnerability because of externalities and other misaligned incentives of both IP owners and computer equipment manufacturers (speed to market and features vs. security). 4) Governments need to ensure the security of critical infrastructures from deliberate cyber disruption. This means that they need to be informed and proactive in identifying risks, measurably mitigating them (or requiring that industry do so), and proactive in assigning intelligence assets to tracking state and non-state actors that seek to exploit cyber vulnerabilities. 5) USCYBERCOM and the Department of Homeland Security need to undertake better coordination for the cyber defense of the United States. The U.S. should seek collaboration with like-minded countries to internationalize these measures, defending an open internet from authoritarian states seeking to impose ‘sovereign control’ over data, IP and transport."

A professor wrote, "We need to be aware of the digital divide and ensure access is not a function of status or location. We need to provide strategies for disconnecting, which is as important as connecting."

A freelance writer based in Europe said, "I have been online since 1990, and am reasonably tolerant of the kinds of flame wars that erupt in every digital medium. I don't think we will ever eradicate all of that. But I do think that companies like Facebook, Google and even (despite its much smaller size and more precarious finances) Twitter need to recognize that with their power comes great social responsibility. This will be even more true as companies like Uber merge digital and physical worlds so that the risks people face are not just nasty messages but immediate physical danger. In this morning's news is a story that Facebook has enabled companies to recruit for jobs in a discriminatory way by specifying that the job ads will appear solely to people in younger age brackets. I know a company can't foresee all the uses to which its service will be put, but this one ought to have been predictable.  All that said, people being thrown off the mainstream platforms are busy building their own – this is one of the both good and bad things about the internet. We need to find a way to limit the global shaming and attacks faced by some people (who have not always done anything wrong, like Caroline Criado Perez or Kathy Sierra), but we also need to avoid creating martyrs."

A research manager based in North America wrote, "Because I'm on limited time, I will provide only two that are particularly critical: 1) Educating children to understand the value of in-person interaction and in-person learning. Though digital classrooms and technology-assisted education are very exciting and quite powerful, I have always believed that the most important part of elementary-level schooling is to help children to develop healthy and productive social skills, which they will not do nearly so well if every interaction is mediated by screens. 2) The adult work environment should be re-focused to reduce the speed at which life is expected to travel. When everyone is meant to be ‘on’ and in frantic motion 24 hours a day, there is little time to rest, recover and/or allow valuable free-form thought and brainstorming. Stress has myriad negative effects on human health, and when stress lives in your pocket with an expectation that you will respond to it 24 hours of the day and within minutes, health and well-being will not benefit."

A professor based in North America said, "Education about how to better use the internet so that it is more helpful and less harmful could help. I don't personally know how this could be done. Psychologists, teachers, and cognitive experts need to direct attention to problems."

A futurist commented, "Much like the development of seat belts for vehicles, as we learn of harms we can build in self-managed and self-aware features to help us stay safer and enjoy the benefits of digital technology as we use it. Perhaps the most important piece being awareness. Teaching kids not just how to use digital technology, but building an understanding that impacts of use can be both positive and negative encourages attention to personal outcomes. With that, we can all use digital technology with intention and navigate our own personal comfort zones and focus on healthy uses."

A blog editor based in North America wrote, "A lot of things need to happen at the level of business model, regulation, corporate organizational design and operation, and prioritization. One of the most important things we can do in the near term is come up with good ways of talking about the nature of the problem, because it’s harder to advocate for change without the right language. Sometimes it’s talked about in terms of distraction or attention, but we tend to associate that with more immediate types of attention, not longer-term life effects. I don’t think it will happen overnight, because a lot of it involves changing the way we talk about human nature and interaction. So much of the way we talk about it, especially in the U.S., is rooted in discussions of freedom of choice. My intuition, and this is just intuition, is the more we can get away from talking about it in terms of choice and start talking about it in terms of chance – which outcome was preferable and which actually happened – the better. Choice is such a messy thing to dive deep into, because then you realize that nobody knows what it means to choose. In terms of individuals working at these companies, I’m still heartened and optimistic, because everybody who’s a designer or engineer is also a user at the end of the day. Nobody goes into design because they want to make life worse. The challenges, generally, are structural, whether it’s about the existing business models of companies or the way in which certain forms of corporate legal structures don’t give people the space to balance some of these more petty, immediate goals with more noble kinds of things. It’s hard to say, in terms of the longer-term of tech evolution, whether we can be optimistic or not. I’m hoping that there will be a point where, if we don’t restrain things or turn the battleship around, we realize the unsustainability of it, from a business point of view but also in our own lives."

A professor of psychology at a major U.S. university said, "Technology that disables cell phones is already available and could reduce the substantial threat to well-being associated with device use while driving. Other laws, rules and norms will gradually emerge to address some of the problems brought on by mobile devices. Already, norms are developing around the use of technology at the dinner table. Rules, in some instances, disallow use of devices in certain spaces on university campuses. The software on mobile devices is increasingly sophisticated and personalizable, already making it possible for people to set limits on their own use."

A research scientist based in Europe commented, "If during breaks (school, work, etc.) people come together and do something that doesn't involve any digital device, I think that could be really meaningful.”

A professor wrote, "People can always change bad behaviours. Example is key."

An anonymous respondent said, "There are many actions that can be taken to mitigate potential harms [to our well-being] of digital life. A fundamental cause of the harm is that for both governments and corporations it can be useful to keep people in a state of anxiety, fear and dissatisfaction. The solutions, from the individual’s perspective, can be 1) immediate action (find ways of lessening the influence of the anxiety-inducing devices, apps and information) or 2) long-term strategies (how to make the goals of corporations and governments be in line with promoting people’s well-being rather than diminishing it). For example: 1) Immediate action – be aware of the source of your information, of your apps. Asking a salesperson if you should buy something will not yield an unbiased response – be aware of which messages and messengers are, effectively, salespeople. Be aware of the sources of your news. Some publications are committed to quality journalism – but many others are motivated by goals that conflict with that, whether it be pleasing advertisers or promoting a political agenda. It is easy, online, to lose track of the source of a story – pay attention to this important contextual information. Limit your exposure to advertising. Its purpose is to create an inchoate but pervasive sense of dissatisfaction, a nagging hunger for the next new thing. Beware of messages of fear. It’s good to be alert and aware. But we do not make our best or most generous decisions in a state of panic. 2) Longer term – make (or encourage the support and making of) non-commercial social and information-sharing platforms. Support companies that embrace a financial structure that does not put profit-making over any other consideration. Support politicians (or become one yourself) who do not use fear-mongering and bullying to gain supporters."

A principal research technologist who works for the U.S. government commented, "Mindful assessment of new technology and how we allow it into our lives will help us mitigate negative effects. It is easier to do this when thinking of someone else (e.g., deciding how much ‘screen time’ your child should have). We could apply the same judgment to our own consumption of media and technology. I recently read an article in which a mother wrote, ‘In honesty, my children have to compete with my phone for my attention.’ That kind of self-awareness could lead to developing new habits that help us achieve the relationships we want to have, rather than the ones we default to."

An education and outreach coordinator commented, "I would like to see better ways of self-limiting screen time. I know that I can waste a lot of time. And several friends have had their accounts hijacked. I am concerned about security."

A professor of mathematics wrote, "Perhaps there should be a reminder every 15 to 30 minutes for people on social media."

A post-doctoral fellow based in North America wrote, "Actions that can be taken to minimize harms start with those in charge of distributing the technology. For example, Facebook has supposedly good intentions by wanting to connect the world to each other, but they are taking advantage of basic human psychology and using attention metrics to determine how successful they are as a company. In the future we'll need to ensure that companies are not capitalizing on the flaws of the human mind to get people engaged and instead have those in charge focus on improving humankind. If all of those seeking to change digital life started with a positive, humanitarian goal (rather than a capitalistic one), there could be widespread benefits. By educating the public and ensuring that the drivers of digital life are abiding by a code of ethics that the majority of users can agree upon, we could definitely minimize the harms associated with digital life."

An associate professor at a major university on the East Coast of the U.S. wrote, "Media outlets can put processes in place to prevent the spreading of ‘fake news.’"

A director of technology at a U.S. public school district wrote, "An understanding of the tools that we have been provided is essential. Having a program on digital citizenship should be mandatory so that we can teach this understanding to young and experienced alike."

A Ph.D. student based in North America wrote, "The harms to peoples' well-being are being studied in detail. From these studies it is possible to draw potential interventions. However, the more pertinent question is whether there is financial incentive for companies to do so. Is Facebook really incentivized to keep me from feeling jealous about the material success of my peers (an oft-used example of harm)?"

A researcher and consulting statistician based in Africa commented, "Rather than censoring it, make people aware of harms from the internet. There are ways of setting up your own internet so as not to get to harmful pages."

A retired lawyer and academic wrote, "Children need to be outdoors. People need to engage in physical activity and interact in real time with others. We need to stress these very real needs."

A retired consultant and writer said, "In the 21st century the internet needs to be protected and implemented as a public infrastructure. New regulations around transparency; see David Brin's reciprocal accountability."

A business development director at a large law firm said, "There will be tools created to help with internet addiction and better educate people as to what that looks like."

A retired Web developer wrote, “Actions should be taken to mitigate potential harms of digital life. Whether they can be taken is up to Congress or some other level. The biggest/easiest action that can be taken is to require cell phones be turned off in all public places or have designated cell phone areas. If I can't have a cigarette, why should they be able to share their phone conversation with me? It would allow people to talk to each other in restaurants. To look up instead of down in parks. To be disconnected for just a short amount of time to enjoy the other things around them."

An anonymous respondent wrote, "1) Don’t cut taxes. We need more oversight, not less, in this era. 2) Social media companies need to hire editors, fact checkers and journalists to invest in truth that has been lost by newspapers/magazines that have gone out of business, leaving just a few media monopolies. 3) Find ways to financially support truth, journalism and verified information online. Sure, everyone can blog on Wordpress, but that isn’t information to view as verified truth. It is opinion. Advertising isn’t enough to support sites that require a large staff of people (not bots) to make decisions about what information is legit enough to publish. 4) Teach kids that digital is a partnership between humans and technology. *Not* a replacement of humans with technology. I know many people believe AI will save us. I’ve seen early models and they suck. AI will do what it is taught by humans or it teaches itself. AI poses all the same risks as humans with poor judgment (or possibly more risks) because there are no rules, morals or values to guide decisions with the software platform. 5) We need more laws about what happens on the internet; Net neutrality-type laws to protect individuals."

An entrepreneur commented, "Establish alternate methods of human contact."

A data quality analyst from North America said, "If there is overwhelming demand from the masses of U.S. citizens, the NSA's intrusive mass-surveillance in our lives can be mitigated from the state level (via nullification) and federal level (via constituents' bombardment of e-mails to Congress and president). It can be done, but will it? Countering the growing problem of the decay of interpersonal face-to-face interaction is something only we ourselves can address by actual practice, finding solutions to problems encountered, and implementing the solutions with even more practice. It can be done, but will we? Only we ourselves as individuals can answer that."

An entrepreneur based in North America wrote, "People will be more conscious of their digital choices. There are already ad blockers that ensure a private internet experience. So demand will generate supply. And with the reversal of Net neutrality, I feel an entire ecosystem will develop to throttle internet access according to your digital profile. It remains to be seen if those ‘throttles’ will act in the public's best interest or their private best interest."

A retired professor wrote, "Mistakes lead to success; with each step forward, harm is reduced. The less harm the better."

A professor at a major U.S. state university said, "Put the phone down. I began using an app that tracks the number of times I pick up my phone and use my phone. I was surprised at how much time I wasted online. I knew it to a degree but didn't realize how many times I pick up the phone out of habit."

A professor based in Oceania wrote, "Technology education must be upgraded, and people need to learn the tricks of scammers, hackers, fakers, and call-center, email and advertising scams. All people must have equal access to the same education in hardware, software, skills, knowledge and teachers. All people must have equal access to ISPs, computers, hardware, software, etc. Information technologies must become a human right, just like a living wage must become a human right. [It could be managed by a] worldwide, honest, unbribeable group who will be paid very well – a forensic-audit financial group not controlled by countries or vested interests (not a FIFA or UN) but with equal and diverse numbers of male and female members from across various disciplines. They each must have a high level of proven, honest knowledge in their specific area. They might effect recovery of unpaid taxes and stop scams, money laundering and all illegal/dishonest/unethical wealth creation and storage."

A research scientist said, "Perhaps wider implementation of self-imposed internet downtime – through the use of browser extensions or apps that limit our internet access and encourage us to engage in non-digital pursuits – would be helpful."

A chief marketing officer at a North American company commented, "Digital tool norms and society values will moderate emerging harms."

A professor of digital society based in Europe wrote, "We need to listen to ordinary citizens telling us in their own words the ways in which technological developments might benefit or harm their well-being, and adapt our practices and policies based on what they say. This is possible – we just need to have the will to do it."

An epidemiologist based in North America wrote, "We are learning more about how technology/digital life can negatively impact mental health and feelings of social isolation, especially among youth. The more we learn about this issue, the more we can intervene/mitigate these effects on a population-level through policy or design of digital apps/platforms."

An anonymous respondent said, "Software engineers need to communicate with the individuals who use their products, not with CEOs who know nothing about the daily work life of employees."

An anonymous respondent wrote, "There have to be ways of better securing our privacy. Surely they will be able to control 'hackers' and evil people who prey on the elderly, the uneducated and those not aware of the many pitfalls that surround us."

An anonymous respondent commented, "Social media, particularly Twitter, is literally contributing to the headlong downfall of democracy and the rise of demagogues."

A director of strategy and content marketing at a U.S. marketing agency said, "As experimental technologies continue to break our ‘body barriers’ and become more biologically invasive, tech will need to be held up to rigorous standards and testing for health implications."

An anonymous respondent wrote, "I believe that as a society we need to set up some kind of regulatory body to review and consider the impact of new technology on human and family development and interaction. We need empowered technology ethicists. Profit should not be the only driver for technology-driven change.”

An anonymous research scientist commented, "Privacy protections will prevent unanticipated harms."

A social science researcher commented, "Our government, computer developers and the business industry are failing communities and people by not doing more to protect individuals. Privacy protection from unwanted intrusion or sharing of personal data should be equal to making huge profits. These entities make the laws, develop the products, reap the profits, so the onus is on them collectively to do better."

A futurist based in North America wrote, "Personal responsibility, more information from social-behavior studies."

An anonymous respondent wrote, "We need to find a better balance between freedom of expression and the ability for people to be malicious because of the anonymity that frequently exists on social media."

A technology developer/administrator commented, "I believe they exist, but do not know what they might be."

A professor wrote, “Technical architectural decisions can minimize the adverse impact of technologies on well-being."

A professor in media studies at a Norwegian university commented, "We can design technology to help people, instead of just trying to entice them. Stopping gamification of everything is an obvious first step."

A professor based in North America wrote, "Medical professionals can do a better job of counseling the public and their patients regarding the mental and physical problems associated with media engagement and wearing devices. This would require a concerted effort."

A computer scientist based in North America wrote, "The biggest factor contributing to this harm is the free nature of social content. The content is free because the companies are harvesting a wealth of information about all of their customers and selling it to advertisers. If the companies were not allowed to sell their amalgamation of consumer information, it would lead many social platforms to change to a pay-for-use model. By adding a cost component to the consumer's decision-making process, the over-use of these platforms may be reduced."

A U.S. government statistician commented, "I am optimistic password technology will be superseded by an easier to use, more secure option to keep our personal information safe."

An internet activist from Europe said, "People can act together to become aware of the potential harms and devise mitigations for both individuals and groups. Mitigation may be as simple as using the technologies less (which is easy if you do other things more)."

A research scientist and lecturer based in Scotland wrote, "Education, education, education. And the need for privacy protection."

A professor based in Europe wrote, "It would be easy to introduce ‘do not track’ buttons or protect the user from unwanted interference, but this is not promoted as it is against the business model of big companies. Technically, it is possible, and I expect user power to grow to make it a standard feature. More choices can easily be enabled if there is a will."

An anonymous respondent wrote, "Appropriate regulation can minimise the employment impacts, and governments need to focus on education, training, etc. They also need to reassess regulatory frameworks to ensure that they do not favor digital services over more traditional services. Parents also need to engage with their children to manage digital media usage."

A technology developer/administrator based in Africa wrote, "By engaging all stakeholders in an elaborate effort to minimize the negative effects and potential harms of digital life. All stakeholders, and especially the owners and developers of the technologies, should take tangible steps to manage the effects of technology."

A research scientist based in Europe said, "In product development and marketing, the longevity of products of all ranges can be made a prime criterion in engineering and product development, including major installations as well as singular items. Knowledge about the long-term and overall impact of communication systems and devices, and of networking components integrated in different products – as compared to their short-term local effects – can be transferred on a much broader basis than is the case now to new generations of engineers and to today's and tomorrow's business people, industry managers, and policy and decision makers."

A senior researcher based in New Zealand wrote, "A colleague many years ago observed, ‘The good thing about the internet is anyone can publish, the bad thing about the internet is anyone can publish.’ The key thing is similar to an early observation about television, ‘Television allows into our home people we wouldn't allow into our homes.’ Similarly people need to be continually made aware that it is people who use the web and they are the same variety of people they deal with. I think high school is a good paradigm as I think the web accentuates the desire to be ‘one of the cool kids.’"

A senior information scientist wrote, "A key action is to improve critical thinking skills. K-12 education has been forced to de-emphasize this skill, but it could be brought back and aimed at internet media. I regard the major digital ‘platforms’ as ‘media’ companies, despite their claims to be otherwise. It's not clear that regulation eliminates outright false and biased reporting, nor should it. The reader must take responsibility for being educated enough to detect falsehoods and to recognize the inherent bias in reporting, and this comes back to the importance of critical thinking and the tools necessary to think critically."

An anonymous respondent said, "They can do it, but they won’t. I especially rue current government opposition to Net neutrality."

An entrepreneur commented, "Yes, through education. Society has to enable people to learn and educate themselves."

A professor said, "We need to do a better job protecting data. Put limits on what companies can and cannot do with what they know about you and your circle of acquaintances. We also as a society need to think about the boundary between work and life. While technology makes it possible to work remotely, it also makes it impossible to escape work. Both would require individuals, businesses, government and large aggregators of data to rethink work, life, privacy and work to a common good. I am doubtful that will happen as compromise and common good seem to be vilified today."

An anonymous respondent wrote, "Privacy controls, e.g., the ability to wipe out information or posts relating to you and to prevent them from being reposted."

An anonymous respondent said, "Cars that disable phone use. Clearer public expectations about technology use. Norms that say it's not okay to ignore your kids to use your phone, or that it's not okay to whip out your smartphone or tablet while watching a play, or eating at a restaurant, et cetera. But I'm doubtful these things will be successful."

A research scientist based in North America commented, "Some more detail around ‘the harms of digital life’ would be helpful. Broadly speaking, media literacy will play a big role in reducing the harms you are talking about. I also think services continue to improve on models that flag inappropriate behaviors and that will help as well."

A North American professor wrote, "New norms for digital interaction and self-preservation in a connected world will emerge. People will learn to adapt, as they have with all other new technologies."

An anonymous respondent said, "There is methodology to address harms, but it is not obvious whether it will be used, as some [measures] would impact revenue."

A futurist based in North America wrote, "There's a widespread sense of unease verging on panic concerning the digital world. This could lead us to panic and withdrawal, but we're too invested online to step back."

A professor wrote, "Some of these technologies are addictive, so personal vigilance is key. Insisting on not using your phone all the time, et cetera. On a grander scale, government regulation will be needed to guide the requirements of proprietary algorithms so that they behave ethically."

A European professor wrote, "Design stable, long-lived platforms that retain basic functionality. However, how the industry could adapt to remaining backwards compatible is itself a challenge. Also, how can content be designed in a way where it continues to be reliable as the user equipment ages? This is something that really is not being addressed – and for which there are at present no incentives for content providers (the natural driver is to deploy new features and keep the user interface vibrant to attract new ‘viewers’)."

A technology consultant and expert on attention and workflow previously with a top-five tech company wrote, "There can be actions that help mitigate. My answer is too long as an essay for a survey."

A professor based in North America said, "There is way too much commercial collection of personal data – we love sharing but don't want what is shared to be made an asset for a company to sell us stuff. We need more ability to really opt out of ads and more ability to block private data from being collected by third parties. Europe is way ahead of the U.S. in protecting privacy, and I think that is a terrible reflection on us – commercialism ahead of individual freedom!"

A North American entrepreneur wrote, "Limiting the use of technology is one great way to mitigate potential harms. I imagine there will be support groups that are formed to do just that. I think there will be information verification tools and research tools that will help mitigate harm caused by misinformation."

A professor based at a top university in the U.S. Northeast commented, "Hacking threats should be addressed, and I'm not really sure how this can be done. Limits to data collected on individuals; strong privacy laws. Net neutrality should be reinstated."

An anonymous respondent said, "Better privacy protections and reduced corporate control over the means of digital communications, such as through Net neutrality."

A retired systems designer commented, "How do we get agreement about individuals’ bad behaviors, like trolling? This is hard. However, we can come to agreement on clear models about how businesses data mine our personal information and share it. As a frivolous example, what if every business site had to send us an email when they shared our information, telling us what info was shared and with whom? This would serve as a stark education in the realities of digital privacy."

A professor said, "Research into harm will result in strategies to mitigate harm."

A research scientist said, "Electromagnetic fields safety measures."

A professor based in North America wrote, "Violent and sexist material should be restricted."

A research scientist said, "Obviously something can be done. I find it impossible to believe that there can be no improvements in the use of a new technology."

A professor based in Oceania commented, "There will be sociotechnical adaptation and changes in norms. People will adapt to protect themselves from FOMO and other negative behaviours; embrace its more positive aspects and introduce new checks and balances."

A user-experience researcher commented, "The interventions would involve creating social norms that say when you are with a group of others, it is not OK to spend lots of time ignoring those people while you are messing with your phone and text messages."

An assistant professor of political science at a North American university said, "Some of the solutions will be technical, like blocks or smart timers on phones; others will be old-fashioned boundaries."

A professor at a top university on the West Coast of the U.S. wrote, "Computer security will improve to protect my privacy."

A social justice advocate commented, "Do you remember when psychologists jumped on 'internet addiction'? The trends and changes are too immediate to know what is truly harmful, what needs mitigation. I do think more attention can be paid, now, to the impact of screen time on sleep, and the impact of cell towers and electromagnetism."

A North American entrepreneur wrote, "Not 100% sure what those actions are, but limits on bots, known trolls from Russia, etc., will help."

A professor of informatics and computing wrote, "Algorithms used to prioritize information to which we are exposed should be designed with trustworthiness and diversity of opinions as priorities over engagement, to mitigate the harm of misinformation and manipulation."

An entrepreneur and business leader from North America commented, "There are many education, work and literacy efforts we could lead to inform people about the effects and impact of digital media. Not black or white extremes but deep and nuanced understanding. Coming from the videogames sector, I can see how we could help parents understand much more about how videogames work, so they could better moderate and curate them for their young kids. Instead of being afraid or feeling that their kids are isolated in their own world, they could become part of it and interact with them."

A professor said, "Well, almost everything can be improved. Education, subsidies, regulation, all help improve safety of transportation for example."

A president and chief software architect wrote, "Improvements in technology will move in the directions of pure voice-command operation and this will simplify the interaction between end users and their technology."

A doctoral researcher in communication commented, "There is no such thing as technological determinism. If we can get designers, entrepreneurs, ethicists and humanists to work together, we might be able to produce technological advancements that avoid the worst harms and provide the most benefits. But it will take critical thought before, during and after the design and launch of new products and systems, as well as critical analysis of the infrastructures, regulatory regimes and educational contexts within which they are developed and implemented."

A professor from North America said, "At the moment messages come into my (mobile/cell/handy) phone unmoderated and this can be stressful. In the future, messages will be moderated by a system. The system will use environmental factors such as am I driving or being driven, what is my mood like, how fast I've been handling previous messages and the content and metadata of the message stream to determine when a message should be delivered. The content will be analysed using Reputation, Attention and Trust (RAT). What is the reputation of the sender in my circle of colleagues or industry or society? If Elon Musk sends me a personal message I'll want to see it straight away. My attention is valuable. Will the delivery of this message serve my current goals? Trust analysis is applied to the message and the sender. Sometimes my close friends play pranks."

A professor at a top university on the West Coast of the U.S. wrote, "There should be a systematic education of youth regarding how to use social media – how to question and investigate veracity. There should be more efforts by internet platforms to downgrade false and malicious content and vet ads and accounts for signs of bots and trolls. There should be broader standards to minimize anonymous accounts."

An anonymous respondent commented, "We can find ways to encourage less-addictive uses of technology. Moving away from incentive-based features that require constant check-ins is a good start. We can find ways to leverage new technological developments to add real value to communication and encourage connection rather than silos of experience."

A political science professor from North America said, "I admit that it's hard to imagine any government or corporation lifting a finger to improve people's well-being, on the internet or off it without some revenue or profit going the other direction. But it seems that some people are taking steps to change their own behavior. I believe that this is likely to remain a relatively small proportion of people; perhaps the data on physical health in U.S., where some people take care of their nutrition and exercise but most do not, could be considered a harbinger for the future of mental health in relation to the internet."

An assistant professor of technical communication at a major state university in the U.S. said, "I teach ethics in my user-experience classes. I have faith that the students who are entering the workplace will think and work with empathy and ethical standards."

The executive director of a tech innovation firm said, "There could be more awareness and removal of the filter bubble, even proactive connections to alternate viewpoints. There could be rigorous prosecution of information warfare and more transparency."

A professor of humanities commented, "Like all radical cultural changes, this one requires new patterns of living and changed social expectations. Experience will teach us most, closely followed by more formal forms of learning."

An information science professional and director wrote, "We will have available to us the results of current and future studies on our health and digital life. I'm sure that many of the results will be incorporated into our daily lives, just like what has been done with cigarette smoking. What frightens me is our possible inability to gain control of the growing incivility that we see all over the internet. The cause of that fear is the way our current leaders in this country conduct themselves. I am ashamed that our president still calls Ms. Clinton and others derogatory names and ‘speaks out’ on social media with such hatred. That is not good for our society."

An artist, writer and independent intellectual commented, "As a society we're still very new to digital life. We need more time to develop ways of being, both personal and societal, that will help us deal with the challenges it poses."

A retired healthcare executive said, “We need to build on good information and diminish or mitigate the effects of commercialization and straight misdeeds. Of course, we need to enhance security and also find ways to stop and punish the evildoers and the exploiters of the system. The promise is great, yet the threats to individual autonomy are real. Recent U.S. elections have demonstrated that the internet can be manipulated to advance nefarious agendas."

A retired senior systems analyst wrote, "Interventions are welcome to diminish the junk on the Web, on one side – for example unwanted advertisements, where Google is a definite negative player – and on the other side to restrain not opinions but incitements to act, perhaps unlawfully. The danger of interventions is that they restrain freedom of expression and introduce the most damaging censorship."

An anonymous respondent said, "Ironically, we may need to rely on technology to build in filters, prompts and planned disconnection from technology. More regulation of online companies also is needed to provide transparency into the algorithms that shape the information that we are fed."

An information science professional commented, "My experiences with monitoring my blood glucose levels have demonstrated that it is possible for online tools to have positive effects. We need to see that our personal health records aren't hacked, however."

A retired public opinion researcher wrote, "Mitigating potential harms caused by digital technology will require individuals to seek and participate in behaviors which result in positive social contracts. The trustworthiness of these contracts needs to be tested before inclusion."

A professor based at a top U.S. technology university wrote, "Policy can influence how digital technology can be used to benefit less fortunate members of society (e.g., physically and mentally disabled), or to just benefit a fortunate few."

An anonymous respondent commented, "While there are many things that could be done to limit the polarizing impact of our current systems of information technology, to restore civility and to restore the factual foundations of public debate, these would most likely require profound changes in the system of ownership, control and/or regulation of online media that are not, at least at present, politically feasible."

An engineer based in the South Pacific said, "We need better education, especially in the area of critical thinking and analysis. THAT would help balance the wall of facts we face."

An anonymous respondent said, "Privacy can be improved. It is possible that some actions, regulatory ones for instance, are more difficult but there are many other ways to adapt and react. In the long term, I think regulatory approaches and structures can be adapted. Learning does happen after all."

A professor at a major U.S. state university said, "Notification management is critical to allow for work-life balance, supporting well-being. Further measures to protect privacy are needed."

An executive for a major internet business wrote, "Greater education about information literacy would be helpful. But I do believe that there will soon come a time when people realize that the return on their investment in time and money into being constantly plugged into ‘information’ is flat or negative and people may become more discerning about the sorts of services they consume."

A professor of public policy at a major U.S. university wrote, "Some limits and regulation could help quite a bit to improve privacy, avoid the more extreme attacks, unmask deliberate manipulation, et cetera."

The director of a technology graduate program commented, "If the right governance systems and human values are put in place, technological development will result in positive social change."

An assistant research professor at a major state university in the United States wrote, "Similar to how people were worried about the railroad, telephone, radio, television, computer or any other type of mechanization, there is really no action that can be taken to mitigate harm. People have to be willing to take responsibility for their own actions and learn to harness the new innovations. Even with rules and laws that encourage safety (protective covering on lawn mower blades, for instance), it is still up to individuals to choose to engage with and manage the technology."

A professor at a major university based in Northern Europe wrote, "The actions are too much determined by corporate interests, not by the public good or citizens' needs."

A professor at a major university in Australia said, "Whilst I think there are actions we can take to mitigate the potential harm of emerging technologies I do not think we have the capacity to act as we need to. Ultimately this is not about what harm technology might represent to us but it is about what our capacity is for self-harm."

A retired professor and research scientist said, "All you could do is make access more difficult, slower or unpleasant."

A high school library media specialist wrote, "Internet access is like a body. We have come to rely on it for everything. Without it, we can't seem to function. Unless people stop using it, there will be no change in behavior."

An anonymous respondent wrote, "Researchers and developers are more focused on new functions, features and capabilities of products/systems.”

A professor based at a top university in the U.S. Upper Midwest commented, "The ship has left the harbor. Digital providers have too much power and control information. Technologists also naturally push capabilities without worries about negative impacts."

An internet pioneer and business leader said, "It will be difficult to unplug."

An anonymous respondent said, "At least in the U.S., there is no political motivation to make changes that would help the majority of people. The recent decision against Net neutrality is just one example. Short-term profit and stockholders' interests are driving policy-making, innovation and regulation."

A professor said, "The technolibertarian philosophy is the lens through which people make sense of issues, so that collective goods like a balanced democracy or a vibrant community simply don't make sense. When coupled to a political system in which tribal political loyalties and campaign contributions erode even policies that have vast political approval (like Network neutrality) there aren't many effective institutions that can counterbalance problems created by policies that generate profits. Google would like to believe it does no evil, but when tens of billions of dollars of revenue are at stake, the social and political problems resulting from reinforcing polarizing social divisions will be ignored by the company, government regulators and the media."

A futurist based in Europe commented, "There is a huge push from the economic side to use ever-more-digital tools in your life, and the means of regulators are really limited because of the global nature of such companies and activities. That is the biggest threat because needed regulation is extremely hard to enforce."

An information science professional wrote, "We are, in the United States, a people who believe in our free will to live as we choose. There would be incredible resistance to any large-scale attempt to help people moderate their use of technology. I can't imagine a government that would wish to try. Technology is so linked to commerce that suggesting people use it less would be decried as harmful to the economy. We are in a cycle where the ends justify the means that justify the end. We want what we want and from most appearances, personal risk or harm is not an acceptable reason to limit our access to what we want. Those who make money from our behavior are certainly not going to help us change it."

A researcher based in Europe wrote, "Preventive actions not only cannot be but also should not be taken at all. People have to learn and get used to digital life through their errors. Real learning always involves making errors, which is called ‘experience,’ and this must be allowed for people in the case of learning how to behave in digital life. Learning from our errors is very important for obtaining stable knowledge and developing genuine behaviour."

A technology developer/administrator said, "Each person's reaction to digital data is different and personal. If someone published an embarrassing picture of me online, I'd shrug. Other people may be near-suicidal if a bad hair picture was posted."

A manager commented, "There is the potential for new technology to help manage the impact of more connectivity. But then the issue is who is creating, why are they creating and how do you maintain neutrality in the management of said technology."

An anonymous respondent wrote, "The more useful interventions will be those people take for themselves as they grow more accustomed and confident with connectedness. In particular, social norms will push back trash talk, fake news and other click-bait into their own ghettos."

An entrepreneur based in North America commented, "I don’t see a lot being done or talked about regarding a balance of digital life and analog life. AI and VR are heavily promoted."

An anonymous respondent wrote, "There are things that can be put in place, but – outside of the internet stopping – people will find ways around anything others put in place for them. It’s like seatbelts in cars. Once, there were no seatbelts and people got hurt a lot. Then there WERE seat belts, but only a few people used them and a lot of people still got hurt. Then the industry had the car warn you when you didn't wear your seat belt, and people got around that too by buckling it before they sat down. Then states made it a law and more people used them, but some still didn't. Now kids are growing up who have never NOT used a seat belt and they think it’s stupid not to. That's pretty much the steps we have to go through with the internet, but we are still in the ‘no seat belts’ phase of the story. It’s going to take generations to have internet ‘seat belts’ become a common and accepted thing, and there are going to be millions injured in the meantime."

An anonymous respondent commented, "I don't think there's a digital utopia that's going to get us past the human error that comes with using technology."

A retired professor based in India wrote, "There should not be any interventions, by way of laws and practices; this goes against fundamental freedoms of expression."

A professor and researcher in the University of California system wrote, "Don’t make this sensational."

An online college adjunct instructor based in North America commented, "There might be short-term interventions, but ultimately the user is responsible for prudent technology use."

An anonymous respondent said, "There are actions that can be taken to address potential harms to society or community, such as ways to address fake news, but I don't envision any actions that can be taken to address potential harms to individual well-being."

An attorney based in North America wrote, "The only interventions that can work are personal and parental but society is overcommitted to digital life."

An information science professional wrote, "The internet is not all good or bad. Nothing one can do will change that fact. At some point, the responsibility for using a digital service in the right manner, with the right intent and in a reasonable way lies with the individual."

The CEO of a publishing house said, "Twitter needs to go away and Google and Facebook need to do a lot more to control who is using them.”

A technology developer/administrator based at a major U.S. research organization wrote, "Security could be fundamentally improved, sparing everyone a ton of annoyance. But it won't be, because that would require a fundamental change in the architecture of the internet."

The president and founder of a small internet software company said, "The internet is a near-universal, near real-time communication amplifier for good or ill. The fact that Facebook, Google and others depend on ‘engagement’ to make money turns them into an attractive menace for hate groups, propaganda and state organs up to no good. I believe that it will be possible to evolve enforceable norms and limits to reduce the damage these agents can do."

To read the 86-page official survey report with analysis and find links to other raw data, click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_Home.xhtml

To read for-credit responses to the main survey question, please click here:
http://www.elon.edu/e-web/imagining/surveys/2018_survey/Digital_Life_And_Well-Being_credit.xhtml

To read a PDF with an expanded version of the full Digital Life report, please click here:
http://www.elon.edu/docs/e-web/imagining/surveys/2018_survey/Elon_Pew_Digital_Life_and_Well_Being_Report_2018_Expanded_Version.pdf