Elon University

Workshop: Priorities for the Long-Term Stability of the Internet


Session description: This page includes a long print-news report and video-clip highlights. This workshop's discussion of the principles of Internet resilience and stability was an exchange among global experts and partners aimed at starting to identify priorities and principles that might enhance the long-term stability of the Internet. Participants expect the session's results to be shared with the technical community, business leaders and policy makers to help focus future work. The point was to identify some key issues, note what is being done to address them, and list any gaps or areas that may not have been addressed sufficiently so far. Ideally, this could lead to a mapping of issues and identification of the relevant forums where they could be addressed.

‘Fragmentation’ of Internet seen as one of many threats; raises discussion of principles

September 14, 2010 – The participants in this first-day workshop outlined threats to the long-term stability of the Internet and began to identify principles that might address them.

European Commission Vice President Neelie Kroes opened the session by outlining a few of the many threats to Internet stability: natural disasters; malicious interventions that target networks, such as the cutting of deep-sea cables in the Atlantic and the denial-of-service attacks in Estonia; disruptions of the physical infrastructure; and incidents affecting everything from the routing system to the domain name system.

“The Internet has proven until now to be remarkably robust and resilient,” she said. “That doesn’t mean, however, that there is no necessity for a continuous effort to address stability threats.”

Security threats generally get the most attention when it comes to stability discussions, but this workshop also identified threats raised by the politics of control and those tied to the implementation of various measures that are meant to improve the Internet.

Raul Echeberria, executive director of LACNIC and president of the board of trustees of the Internet Society, said he is most concerned about the integrity and resilience of the Internet because it is threatened by fragmentation due to business approaches, political regulation and the unintended negative consequences of techniques initially aimed at risk management.

“The Internet, by definition, is something that is not very stable, and it is in permanent evolution,” he said. “The debates over Net neutrality are one of the areas from which some of the risk to the integrity of the network could emerge. Not only this one, but the fragmentation of the network for political reasons. Neutrality of the network has been a risk for many years, before the debate for business reasons came up. The free flow of information from point to point is a serious problem in the world, and this is one of the major challenges we are facing in terms of integrity to the network.

“Another reason for Internet fragmentation is regulation. In trying to deal in good faith with cybersecurity, cybercrime and other problems in which we share objectives, decisions taken in order to achieve the best results could be wrong and create an overregulation that could produce some fragmentation in the network.”

Earlier in the session Max Senges of Google had raised a similar warning and he proposed principles to consider. “My fear would be that we get to some kind of a central gatekeeper that exercises more control over the Internet,” he said, adding later, “I’d like to point out four principles:

“No blocking or degrading of lawful Internet traffic, so basically everything besides spam and very specifically defined traffic that is legal should not be blocked. Second, no anti-competitive behavior, so no favoring of your own traffic and similar arrangements. Third, transparency of relevant information, so basically if traffic management takes place, the information should be very clear to the user what kind of traffic management is taking place when and for what reasons. And last, as a last resort, obviously, governments have the means to conduct oversight and enforce the rules that have been agreed upon.”

Senges noted that the Dynamic Coalition on Internet Rights and Principles is working to “make progress defining these principles that we want to see.”

Andrew Cushen, senior public policy advisor at Vodafone in New Zealand, responded to the net neutrality discussion by saying “there is a scarcity issue here, and whenever there’s a scarcity issue – and it’s quite pronounced in New Zealand when you get to international connectivity – you must make choices about how do you prioritize. Otherwise you run the risk that everything becomes horrible. So I like the point that indeed transparency could be a good start, but nevertheless there is a need for more principles to say, ‘What can you do in order to manage scarcity and overdemand on a scarce Internet resource?’”

Ram Mohan, executive vice president and chief technology officer of Afilias, added, “If you’re an operator you should still be able to provide some shaping to the traffic that’s coming through, so that you let good stuff through, but certainly when there’s a DDOS attack, it’s pretty clear that the end user is going to be impacted … We are asking ourselves, ‘Do we know what the infrastructure looks like as a whole?’ ‘Do we know how the network sites are connected?’ ‘Do we know what the critical loads in our infrastructure are?’ ‘Do we know how the security incidents that we do have most affect the stability of our networks?’ The answer is no, no, no, no, you know. So the main priority of our approach is to learn about ourselves, to learn about our national network infrastructure, just to be able to contribute to the stable operation of it.”

Avri Doria, a professor at Luleå University of Technology in Sweden and consultant to the IGF Secretariat, noted that technologists must work to scale the Internet to fit people’s needs and deal with threats, but they don’t always address policy and “the needs of society” in their decision-making processes.

“One of the gaps we still have is while we come to the IGF and we talk about it a little, we still aren’t really engaging in the work early enough when we’re looking at the problem, looking at the solutions,” she said. “Another gap I see is that we really don’t have a good way of evolving our systems … Sometimes people need to think about how one evolves to a new wheel. We don’t have a mechanism to do that. There’s no real place on a global aspect that those things are happening.

“The last thing I’d like to mention is another gap is the motivation. We heard several times, we’ve got the fixes, but there’s no business reason to do it. And where do we find, now, a method of sort of saying there are things that need to be done for the network from a stewardship perspective, and somehow or other the gap in how we convince people that they need to do things, even if it doesn’t help their bottom line, even if it actually hurts their bottom line in the short-term so that they make a little less profit. That is another gap we have to deal with.”

This workshop was one of many during IGF 2010 in which it was mentioned that there should be more focused discussion of Internet principles, rights or values.

Dealing with DNS attacks, malware and Internet evolution

Mohan also spoke about other threats to stability, drawing on his expertise managing 17 million domain names at Afilias and his work with the ICANN board of directors and the Security and Stability Advisory Committee. He is a leader of several international initiatives in Internet security and internationalization.

He noted that Afilias servers handle billions of queries daily, and these come with daily security challenges. “Communication and interaction inside of the DNS often presumes trust and sends sensitive data in a completely open manner,” he explained. “The authenticity of self-declared identities is taken for granted right now on the DNS. One of the biggest areas of concern for the Internet is the pervasive and malicious impact of distributed denial-of-service attacks, or DDOS attacks. A denial-of-service attack is an explicit attempt by the attackers to prevent legitimate users of the servers from using that service.”

Mohan said a well-planned DDOS attack might shut down major parts of the Internet’s core infrastructure by keeping it so busy answering false queries that it can’t take on legitimate requests.
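The saturation mechanism Mohan describes can be sketched as a toy capacity model: when bogus queries flood in and the server cannot distinguish them from real ones, legitimate traffic receives only a proportional share of capacity. The numbers below are hypothetical, purely for illustration.

```python
# Toy capacity model of a denial-of-service attack (illustration only;
# the capacities and rates are made up, not figures from the workshop).
def served_fraction(capacity_qps: float, legit_qps: float, attack_qps: float) -> float:
    """Fraction of legitimate queries a saturated server can still answer,
    assuming it cannot tell bogus queries apart from real ones."""
    total = legit_qps + attack_qps
    if total <= capacity_qps:
        return 1.0
    # Under overload, capacity is shared in proportion to arrival rate.
    return capacity_qps / total

# A resolver rated for 100,000 queries/sec, serving 10,000 legitimate qps:
for attack in (0, 100_000, 1_000_000):
    frac = served_fraction(100_000, 10_000, attack)
    print(f"attack={attack:>9} qps -> {frac:.1%} of legitimate queries answered")
```

The model also shows why an attacker need not "take down the entire Internet": driving one critical resolver's answered fraction toward zero is enough to make the services behind it unreachable.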

“The scale and size of DDOS attacks have increased dramatically in just the past five years,” he said. “Botnet operators have become sophisticated. They offer cloud-based botnet DDOS services, where you can pay as you go and, depending how much you pay, the size of DDOS attack can be increased, ramped up or taken down. In fact, some of the botnet operators even offer you guaranteed e-mail delivery. They provide service-level agreements, the kinds of things you expect from the legitimate part of the Internet; that’s happening on the bad part of the Internet as well.”

Mohan said a small botnet is enough to damage the workings of core infrastructure, and it could possibly influence significant elements of a nation’s or corporation’s infrastructure.

“You don’t have to take down the entire Internet,” he warned. “You simply have to disrupt a few important pieces of it and everybody, or a large number of people, gets impacted. Imagine if Gmail went down, or imagine if Twitter went down. Simply regulating it doesn’t solve the problem. It’s going to require a significant level of investment, involvement from private operators, along with coordination from the public sector. I worry that the provisioning gap, if not addressed appropriately, is going to cause a significant problem with the long-term stability of the Internet.”

Danny McPherson, vice president for network security research at VeriSign, talked about malware.

He first noted that in the Internet routing system today there is “no authoritative source for determining who holds what resources, and absent that source you can’t secure the routing system.” He said there is no way to stop a person from asserting reachability for anyone else’s address space.

“There’s no capability to stop that or verify and ideally prevent that capability,” he said, “and that’s a huge vulnerability. DNS and applications and even botnets won’t work if the routing system doesn’t work, so that’s a huge challenge. The DNS system is hierarchical and distributed but also prone to attacks, so countermeasures and mitigation controls are extremely important.”
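McPherson's warning follows from how forwarding actually works: routers pick the most specific prefix that matches a destination, regardless of who announced it. A minimal longest-prefix-match sketch makes the vulnerability concrete (illustrative only; the AS labels are placeholders and the documentation prefix 203.0.113.0/24 is used, not a real routing table):

```python
import ipaddress

# Toy routing table: prefix -> origin network (hypothetical labels).
routes = {
    ipaddress.ip_network("203.0.113.0/24"): "AS-LEGITIMATE",
}

def best_route(ip: str):
    """Longest-prefix match: the most specific covering prefix wins,
    with no check on whether the announcer actually holds the space."""
    addr = ipaddress.ip_address(ip)
    matches = [(net, origin) for net, origin in routes.items() if addr in net]
    return max(matches, key=lambda m: m[0].prefixlen)[1] if matches else None

print(best_route("203.0.113.10"))  # AS-LEGITIMATE

# A bogus, more-specific announcement silently captures the traffic:
routes[ipaddress.ip_network("203.0.113.0/25")] = "AS-HIJACKER"
print(best_route("203.0.113.10"))  # AS-HIJACKER
```

Resource-certification efforts such as RPKI, then in their early stages, aim to supply exactly the authoritative mapping of prefixes to holders that McPherson says is missing.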

McPherson said every two minutes a piece of malicious software is released on the Internet. “There’s no way that virus detection capabilities can be updated in time to protect against that. As a matter of fact they’re inherently reactive, so finding ways to balance proactive capabilities to protect consumers of this resource or systems on the Internet in a model where a completely patched system is protected from probably 80% of the unique threats on the Internet on a given day is a huge challenge. It requires global coordination, acknowledgment this is a global resource, and policy development. This is a shared global infrastructure and global medium and information- or data-sharing capabilities that don’t exist today are extremely important to enable end systems to be better protected against certain types of threats.”
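The reactivity McPherson describes is structural: signature-based detection can only flag samples it has already catalogued, so at one new sample every two minutes (roughly 720 a day) defenders are permanently a step behind. A toy hash-signature check illustrates the point (the sample bytes and catalogue are made up):

```python
import hashlib

# Hypothetical catalogue of known-bad sample hashes (a "signature database").
known_signatures = {hashlib.sha256(b"malware-v1").hexdigest()}

def flagged(sample: bytes) -> bool:
    """Signature matching: only previously catalogued samples are caught."""
    return hashlib.sha256(sample).hexdigest() in known_signatures

print(flagged(b"malware-v1"))  # True  - already in the catalogue
print(flagged(b"malware-v2"))  # False - a trivially modified variant evades
```

Any change to the sample yields a new hash, which is why detection that depends on cataloguing is, as McPherson puts it, inherently reactive.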

Alan Aina, special projects manager and engineer for AFRINIC, discussed the impacts of IPv6 and other new elements engineered by the IETF on Internet stability. He noted, as an example, that running IPv4 and IPv6 simultaneously brings “complexity to the Internet and may impact the stability of the Internet.”

Aina said other positive transitions and additions being made to improve and scale the Internet upward to serve more users with more security also add to its complexity and make maintaining stability a bigger challenge. Among the latest such improvements are the implementation of internationalized domain names (IDNs) and the addition of more security in the DNS, or DNSSEC. He added, however, that there’s also good news. “The Internet has managed to survive major attacks, it is being implemented in developing countries and with better coordination and collaboration among the players, I think it will work.”
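The dual-stack complexity Aina points to is visible even at the level of a single client connection: every application must now resolve and try addresses from two protocol families. A minimal fallback loop sketches this (a simplification, not a full "happy eyeballs" implementation; the host and port are whatever the caller supplies):

```python
import socket

def connect_dual_stack(host, port, timeout=3.0):
    """Try each resolved address (IPv6 and IPv4) in turn, returning the
    first socket that connects - the fallback logic that dual-stack
    operation obliges every client to carry."""
    last_err = None
    for family, socktype, proto, _, addr in socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM):
        s = socket.socket(family, socktype, proto)
        s.settimeout(timeout)
        try:
            s.connect(addr)
            return s
        except OSError as err:
            last_err = err
            s.close()
    raise last_err if last_err else OSError("no addresses resolved")
```

Each extra family multiplies the failure modes an operator must reason about, which is the sense in which the transition both scales the Internet up and makes its stability harder to maintain.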

Theresa Swinehart, director of global policy at Verizon, said one threat to Internet stability is if people do not effectively leverage initiatives aimed at mitigating dangers.

“I would look at this from the success side, policy side and scaling of information and resources,” she said. “On the technical side there are capacity-building initiatives, training, all sorts of other things. There was a workshop this morning on ccTLDs in Africa and the importance of strengthening ccTLD capacities to absorb attacks. The Internet Society has done work on peering arrangements … Verizon and many other companies are involved in initiatives to really strengthen the network and make sure there’s a preparedness factor.

“With the increased capacity that’s coming to different parts of the world, is there preparedness for what comes with it – the difference in attacks, the kinds of attacks? I think that’s one area that’s going to require some further awareness. Consensus-oriented policy discussions are the best way to ensure the future, the best way to find creative solutions that address issues as they come forward.”

She emphasized the importance of addressing the increasing threats of attack. “How do we best ensure that there’s better vertical and horizontal integration of information and awareness on a global level?” she asked.

Additional points made at this workshop:

Izumi Aizu, an Internet researcher and professor from Tama University in Japan, presented a visualization of global information security. “There’s no real operational standing mechanisms” for certain threats, he noted. “This is a real good opportunity to start to think more seriously about implementing policy activities as well as perhaps operational activities.”

Bill Graham, director of strategic global engagement for the Internet Society, agreed with presenters’ mentions of the need to create more of an “intersection between the policy and the technical worlds.” He noted that the Internet Society is trying to address this with its newly launched Next-Generation Leaders program, which he added is a “kind of a long-term or a medium-term solution to an immediate problem.”

Paul Vixie, chairman of the board for ARIN, talked about vulnerabilities caused by forged addresses.

Hillar Aarelaid from the Estonian Computer Emergency Response Team noted that every working group concentrates on keeping things running in its own territory.

Andrzej Bartosiewicz of YonConsulting in Poland noted that the growth of the Internet is a threat to its stability. Along with the positives come negatives – including crime. “The dark side of the Internet is growing, probably with the same speed as the valuable side of the Internet,” he said. He added that the rising complexity of the network of networks also raises serious challenges to stability, and that a next generation of automated tools can help. “In my opinion, the long-term stability of the Internet is based on modernization of the threat detection and, of course, global cooperation.”

The workshop was organized by the European Commission, the government of The Netherlands, the Republic of Lithuania, Tama University (Japan) and the Internet Society.

The UN’s video recording of this workshop can be found on this site.

The UN’s official transcript of this workshop can be found here.

– Video recorded from a remote location, captured from the live webstream during IGF-2010 sessions
– Senior segment producer, Janna Anderson
