‘Transparency’ of provider actions seen as key to open, accessible Internet
Workshop description: Network neutrality proponents contend it is crucial for maintaining content innovation, openness and diversity. Opponents, primarily Internet service and architecture providers, counter that rigid network neutrality rules are unnecessary, may reduce investment in broadband infrastructure, may reduce the incentive to develop future applications and may dampen investment in infrastructure and content for the developing world.
Participants in the discussion included: Robert Pepper, VP global technology policy, Cisco; Thomas Lenard, president, Technology Policy Institute; Ian Peter, co-coordinator, Internet Governance Caucus; Emmanuel Edet, national ICT Development Agency, Nigeria; Jacquelyn Ruff, VP global policy and regulatory affairs, Verizon.
November 18, 2009 – The net neutrality issue looms large right now in the U.S. because the Federal Communications Commission has moved policy toward the likelihood of formal regulation of the ways in which network providers and ISPs handle information. New FCC leadership has proposed to bar broadband providers from selectively blocking or slowing legal Web content, while still allowing them to practice some degree of network management. Some Internet stakeholders question the development of formal net neutrality rules, saying they will overcomplicate an already complex information landscape. Some network providers and online businesses, meanwhile, are calling for formalized FCC rules allowing network providers, for instance, to block the trading of copyrighted material and to charge high-bandwidth users more than low-bandwidth users.
On Oct. 22, the FCC voted to begin assessing how best to establish formal net neutrality rules to encourage “reasonable” network management and nondiscriminatory practices. The decision in the U.S. is likely to have a global effect, thus it was a topic discussed at the global IGF.
The IGF-Egypt network neutrality workshop was moderated by Vladimir Radunovic of DiploFoundation, and the first of three expected segments of discussion was chaired by former U.S. information technology ambassador David Gross.
Tom Lenard, president of the Technology Policy Institute, led things off by talking about economic aspects of net neutrality. “Keep in mind the recent admonition by Eric Schmidt of Google that it is possible for the government to screw the Internet up,” he said. “Broadband policy isn’t just a communications technology issue. ICT is a major driver of growth and productivity, and a source of higher wages and living standards. Broadband is a capital-intensive industry operating in a field burdened by regulatory risk.
“There’s pretty widespread agreement on the general goals of broadband policy. Preserve Internet choice, investment and innovation, promote competition and expand access. The debate is over how to achieve those goals and whether the open Internet is a problem that needs to be addressed. Net neutrality regulation would represent a pretty sharp departure from the status quo. Proponents claim they are trying to preserve the Internet as we know it, but it would represent a major change. Some proposals could resemble traditional public utility regulation.”
Lenard noted that the U.S. is getting close to possibly adopting some type of net neutrality regulation, and argued that the regulatory process should not be fuzzy or lack transparency, noting that “if you read the FCC notice you see the word ‘may’ 199 times.”
“The economics of broadband markets are extremely complicated,” he said, “and it really is possible to screw up the Internet. There are applications and platforms on one side and consumers on the other side. Optimal pricing for this type of market is very hard to determine. Traditionally broadband providers have charged consumers but not service providers. Network owners want to maximize the value of their asset, so it won’t be in the best interests of the providers to establish high prices.”
Robert Pepper, vice president for global technology policy at Cisco, noted that the FCC never used the term “net neutrality” in its proceeding.
“They talked about policies for an open Internet,” he said. “I think the way the FCC has teed things up is the right approach.”
He brought out some statistical data gathered as part of Cisco’s Visual Networking Index to inform the discussion.
“You have to understand what the trends are in networks and traffic and how that affects things,” he said, introducing a brief slideshow of data. “The traffic predicted in 1996 was grossly underestimated. The new predictions in 2001 were wrong, too. A lot of the predictions in this space – including mine – you have to take with a grain of salt. They generally underestimate usage.
“An exabyte is 10 to the 18th power bytes, a zettabyte is 10 to the 21st,” he said as he showed slides with graphics indicating “a 6-times increase from 2007 to 2012, with a 50 percent compounded growth rate. We project that 50 percent of traffic by 2012 is going to be video – 20,000 petabytes (20 exabytes) per month used by consumers. Mobile is completely leapfrogging and replacing fixed-line telephony. With 3G and 4G we are going to be seeing mobile and wireless networks replacing fixed lines. Broadband wireless will be driven by applications. There are no mobile networks other than satellite – the networks themselves are fixed, and that has huge implications for getting broadband to the antennas. You have to run broadband, probably fiber, to each one of those base stations. This is part of the architecture and ecosystem that has to be built out as we move into a broadband world in a very short period of time. There’s also a shift to Flash video taking off, and that is really driving demand. Today the average household on a global basis is watching 1.1 hours of video per day.”
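As a quick sanity check on the growth figures Pepper cited, a sixfold traffic increase over the five years from 2007 to 2012 corresponds to a compound annual growth rate of roughly 43 percent, in the same ballpark as the quoted 50 percent. A minimal sketch of the arithmetic (the figures come from the quote above; the function name is our own):

```python
def compound_annual_growth_rate(start, end, years):
    """Rate r such that start * (1 + r) ** years == end."""
    return (end / start) ** (1.0 / years) - 1.0

# A sixfold traffic increase from 2007 to 2012 (five years of growth):
rate = compound_annual_growth_rate(1.0, 6.0, 5)
print(f"{rate:.1%}")  # roughly 43% per year
```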
He noted that traffic is not even and spikes are difficult to predict. He said people don’t generally understand the intricacies involved in delivering information in these new modes. “The impact on networks and network performance is different,” he said. “We tend to think of broadband as unidimensional – speed in gigabits. But it is multidimensional – there’s downstream, upstream, latency, jitter. E-mail can tolerate latency. Voice over IP cannot tolerate any latency. You have two applications with very, very different needs. You need to match the application to the broadband – fit for purpose of the application. Not all bits are equal. We tend to oversimplify things and they are not simple. Streaming video, for instance, can lean on a buffer – but if I’m doing the high-definition teleconferencing called telepresence I can’t have any latency. It has to be super high-speed, probably 15 megabits per second, symmetric.

“You begin to think about a fit-for-purpose broadband Internet that can match and support different applications, and you begin to think about having networks that are different. One size doesn’t fit all. So the proper question the FCC is asking is, ‘How can you manage networks so they promote consumer benefits, consumer use and competition, and are not used in ways that become anti-competitive?’ That’s not net neutrality. Net neutrality is a bumper sticker, it’s a label, it doesn’t mean anything. You actually have to define the question. I think the FCC has actually done a superb job. When you are looking at networks and network growth where we’re seeing non-linear growth, you have to think about differentiated services, the fact that not all bits are created equal. We want a broadband network that’s fit for different applications, and it should be application-user-driven, with flexibility, managing for differentiation, not just for congestion.”
Jackie Ruff of Verizon noted that there has been some progress in defining the issue, saying, “if you had this conversation a couple of years ago we would have had completely polarized views of what net neutrality means.”
“It is nice to see people moving beyond that and putting things into context,” she added. “We believe in an open Internet. Users should have the final say in their communications experience. The future growth of the web requires investment and innovation. Flexibility in government policy is key. Providers should have the ability to manage their networks without discrimination that harms consumers. In October this was the subject of a joint statement by the CEOs of Verizon Wireless and Google. What this represents is a search for a common ground, if we can find it, that balances the interests here. An extreme government mandate would change all this – saying every bit should be the same, the network should be the dumb pipe. This misreads the way the networks have been operated. It understates what the networks actually do.
“Our security team monitors more than 5 billion security events every day, and we intercept them before they harm us or our customers. If we were asked not to distinguish between packets we couldn’t do that. Going forward there’s a desire for more reliance on quality of service and other positive business practices. Some other points to keep in mind in this discussion: There is very little evidence of problems with openness without strict rules. There is a risk of negative consequences if heavy-handed policy is put in place. What’s most important is to keep the focus on consumers with respect to networks, applications and the content side. The complexities of these issues and the lack of problems call for a case-by-case policy rather than a broad approach. What’s needed is a problem-solving approach that works toward norms – from a regulatory perspective, stepping back instead of risking those consequences.”
Ian Peter, a co-coordinator of the Internet Governance Caucus and one of the founders and directors of the Association for Progressive Communications, was the next speaker. “I do take some exception to the idea that charging content providers is the right way,” Peter said. “I find the term net neutrality confusing, almost useless, and I am glad to hear the FCC is taking on policies for an open Internet. I have a problem when people try to make public policy via some supposed architectural principle. People think of net neutrality sometimes as an architectural principle, as they think of end-to-end as an architectural principle. We do far better when we explore beyond these terms toward what we are really trying to say with them. I don’t think the network ever was neutral; if it was, it wasn’t for long. I’m certainly aware that in the era up to about 1992 – prior to any real use of the World Wide Web – a number of U.S. universities were introducing traffic-shaping code, giving priority to Telnet, an application for immediate access to databases on remote computers, over e-mail. Very early on, people were talking about traffic shaping. I like traffic shaping because I want to get to the sites and use the applications I want to use quickly. Some means of prioritizing those applications that require immediacy over those that don’t makes a lot of sense. However, I don’t like the experience of traffic shaping I have had with my ISP. It doesn’t like BitTorrent, so it slows it down – and the whole account along with it. There’s no transparency with this policy, there’s no requirement for transparency, and the company just continues to do it. I do have a problem with non-transparent, discriminatory traffic shaping.”
He said industrial-age institutions are finding the transition to the digital knowledge age troubling, and they are resisting. “Throughout the world at this point in time there is a great deal of regulatory confusion around the Internet. Leaders are asking, ‘What can we do with our old analog-era regulation, for either broadcast or telecommunications?’ and trying somehow to bring the Internet into one or the other, or a little bit of both. In the U.S. they decided that telecommunications equals carriage and the Internet equals content. I’m not sure that’s a sensible way to make a division. In a lot of countries there is confusion about what the Internet is, because it is a little bit of both. It is about content and about carriage. The analog age is dead, and a lot of the giants of the analog age in terms of industry are dying as well. The world is changing and they are unable to change quickly enough to adapt to the new world that is emerging in the digital age. This is changing distribution in a whole lot of areas – music, publishing, movies – the world is changing and a lot of these bodies aren’t going along with it.”
He showed a graphic with popular Internet applications and services divided up in a way similar to cable television’s tiered choices. “First,” he said, “I ought to be able to get the content of my choice. I ought to be able to use the applications of my choice. That’s a nice way to sell the Internet, isn’t it – in levels with different costs? I don’t think so. I believe this is a serious threat – that some people will see this as a sensible way to do it. And how will they package it? Guess what? We’re going to charge the content providers for this model. This is a way in which people can make more money and sort of package the Internet. This threat is emerging in two particular areas we need to look at. One is the ‘safe’ Internet. It’s only a matter of time before somebody’s going to package the ‘safe’ Internet. The ‘safe’ Internet isn’t going to visit most sites. In fact it’s going to block every site that doesn’t sign a little thing that says ‘we’re going to be nice’ and pay to be included in the service. So here’s a nice little walled garden. You get a partial Internet where people are not able to access all the Internet content they might choose. And here we are, we’ve solved the problem with pornography or something like that. That isn’t the right way to proceed with a very powerful medium and all the possibilities it offers us. The other area is the mobile space with its limited proprietary contracts. They will say it’s easier for us as consumers for them to offer the services in this limited way. ‘We’ll give ’em eBay and PayPal, we’ll give ’em Facebook but we won’t give ’em MySpace because Facebook happened to pay more for it,’ and you get these sorts of packages coming up. I do think you need to look at this carefully. You need to look at how, in a regulatory sense, you might handle this. I am totally opposed to charging content providers because it might open up all of these possibilities.”
He said corporate-driven decisions to monetize the Internet and control consumers are basically what the net neutrality battle is all about. “You don’t want to do that,” he urged. “Tell me how we are absolutely going to prevent this sort of Internet from emerging, because this is the breakdown of the Internet, this is what I think this argument called ‘net neutrality’ is all about, and I would like to see policy for an open Internet.”
Nathaniel James, director of OneWebDay, said he was on the panel to represent the users’ perspective. “What is the Internet?” he asked, then answered: “The Internet is a system connecting computers, networks and, most important, end users. Who does the Internet belong to? While AT&T and Verizon own the facilities, they don’t own the Internet. It is human connections. Who gets to decide it’s too big and too complicated for users to participate? We have an opportunity to take real leadership on the definition of the future of the Internet. The FCC and Congress will help define how the Internet is shaped and the architecture of it. I think we can have smart and adaptive regulation. The question of investment is extraordinarily important. If we don’t have regulation, we could handle cases on an individual basis, but we don’t know if violations will be discovered. My understanding of Internet history as it is developing is that networking platforms were developed because of network diversity. I would be a little concerned that we turn back to network proprietors, network managers.”
Emmanuel Edet, of the National Information Technology Development Agency, Nigeria, said his approach to the net neutrality debate is also that of a user. “We pay more for less, and now we are facing the possibility of paying more and being denied access,” he said. “If monetary prospects are allowed to push packets, multimedia will not be accessible. We are developing into a knowledge economy. Now people will pay you more for what you know than for what you can do. If developing countries are expected to compete adequately they should be given access to the content out there. We talk about bridging the digital divide; it won’t be possible without taking the necessary action. We try to assure communications services in Nigeria are accessible to all. How will the argument in the U.S. about network neutrality affect the people in my country? From an end-user’s perspective we may not be part of the debate, but we are really interested in how this plays out and how it affects us. If regulation protects us, we favor regulation. As things currently stand we are already paying more for less. If we have to pay more it will not be good for developing countries. We are already disadvantaged no matter how the debate plays out. We are really afraid that if developing countries are not protected we’ll come out losers at the end of the day.”
Opening the discussion up
Once all panel members had taken a few minutes to share their central thoughts, Gross asked them how much of the perceived problems might stem from “a lack of competition between providers?”
Pepper answered that transparency is one of the most important principles. “Everybody should be able to protect their own equipment, access what they want to access and have transparency in that process,” he said. “I think everybody agrees with that. But if you don’t have choices it makes it more difficult. Even if you accept these principles, if you’re not happy, do you have a choice? In the U.S. there are multiple broadband providers. I think it is one of the issues. We have the classic example of a two-sided market. Toll-free 800 numbers are an example: I can call the airline and it pays for the call. The airline likes my business and the phone company gets paid. Classic two-sided market. A basic Bell subscriber gets 768 kilobits per second and a premium subscriber gets 3 megabits. If you have an Internet with no ability to differentiate services – no bronze, silver, gold levels – then heavy users use more capacity and the other users wind up subsidizing the heavy users. We found that, across the globe, 20 percent of users consume 60 percent of the capacity and 1 percent consume 20 percent. You want entry-level users to pay less and you want the power users to pay for using more.”
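The subsidy arithmetic in Pepper’s figures can be made concrete: if 20 percent of users consume 60 percent of capacity, each user in that heavy group consumes about six times as much as each user in the remaining 80 percent, while under flat-rate pricing both pay the same. A small illustrative sketch (only the usage split comes from the quote):

```python
# Per-capita usage implied by Pepper's figures: 20% of users take
# 60% of capacity, leaving 40% for the other 80% of users.
heavy_users, heavy_share = 20, 0.60
light_users, light_share = 80, 0.40

per_heavy = heavy_share / heavy_users   # capacity share per heavy user
per_light = light_share / light_users   # capacity share per light user

# Under one flat price, a heavy user consumes about six times as much
# as a light user but pays the same amount, which is the sense in
# which light users end up subsidizing heavy ones.
print(round(per_heavy / per_light))  # → 6
```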
Peter answered that market situations matter. “Transparency is going to sit right up there,” he said. “I want my ISP to let us know what’s going on with traffic. It shouldn’t be some sort of trade secret. Volume-charging is a very simple way to deal with the heavier end users. I still have problems and concerns. I don’t know how you can be transparent enough.”
Among the statements from the floor, people in the audience briefly revisited the definition of net neutrality and also brought up problems with required “lock-in” by mobile phone companies and difficulties in switching between services due to contractual obligations. Common questions consumers want their providers to answer in regard to transparency are “how quickly can I change providers,” “how much does it cost,” “how long are the contracts” and “is there a gap in service during the switch?”
Edet said the big question from his point of view is: “If we do have net neutrality regulation, will it allow people in developing countries to have more access to information?”
Pepper said the question is, “What is the right ecosystem to build out the system on a competitive basis?”
“How can we replicate the mobile miracle?” he asked. “We have 4 billion mobile users in the world that nobody dreamed about 10 years ago. How do we replicate that with broadband? It’s a broader question of creating these incentives to get that competition. The first step is to get the competition, get the investment, get the networks built out. There are data centers that have been built by some providers, bringing data closer to the end user. Those are two different technology approaches; a third is prioritizing content in a way that follows the principles. Not allowing networks to improve performance is not technology-neutral; we have to be technology-neutral as well.”
Ruff said Verizon is primarily concerned that there might be a movement toward rigid rules. “As far as consumers being able to do what they want, we do support that, and we have for years,” she said. “Government-subsidized infrastructure projects have not been particularly successful. They have not been good deals for the taxpayers of those jurisdictions. Most content now is advertiser-supported – there has not been much experimentation with charging directly for access. Most content providers must still think that’s the way to go. We shouldn’t put rules in place that preclude experimenting with that business model.”
The final two segments of the session were expected to cover substructures of the Internet, the regulators’ role and the balance between hard regulation and soft regulation. Additional panelists were expected to participate in those segments, but the session was cut short due to the appearance of the first lady of Egypt, Suzanne Mubarak, earlier in the day.
– Senior segment producer, Janna Anderson
Additional reporting by Andie Diemer, Eugene Daniel,
Shelley Russell, Drew Smith and Dan Anderson