RFC 1251
I am interested in transition to a world-wide multi-protocol Internet. This requires scaling to several orders of magnitude larger than the current Internet, and also requires a greater emphasis on reliability and ease of use.
It is unproven that the current technology will survive in a competitive but unregulated environment; uncoordinated routing policies and global network management are just two of the major issues here. Furthermore, while it is frequently argued that the publicly available monthly increases in traffic figures would not justify moving to T3 or even gigabit-per-second networks, it should be pointed out that monthly figures offer a very macroscopic view. Much of the Internet traffic is very bursty, and over fairly short intervals of time we have frequently seen an onslaught of traffic toward backbone nodes. I am not sure whether the amount of research and development effort on the Internet has increased over time, let alone kept pace with general Internet growth (by whatever definition). I do not believe that the Internet is a finished product at this point in time, and there is a lot of room for further evolution.
For some years now we have been painfully aware of the scaling problems of the Internet, and since 1982 have lived through a series of mini-disasters as various limits have been exceeded. We have been saying that “getting big” is probably a more urgent (and perhaps more difficult) research problem than “getting fast,” but it seems difficult to persuade people of the importance of launching the kind of research program we think is necessary to learn how to deal with Internet growth. It is very hard to figure out when the exponential growth is likely to stop, or when, if ever, the fundamental architectural model of the Internet will be so out of kilter with reality that it will cease to be useful. Ask me again in 10 years.
The prevailing mode may be shifting towards competition, both commercial and academic. To develop protocols in a commercially competitive world, you need elaborate committee structures and rules. The action then shifts to the large companies, away from small companies and universities. In an academically competitive world, you don’t develop any (useful) protocols; you get six different protocols for the same objective, each with its research paper (which is the “real” output). This results in efficient production of research papers, but it may not result in the kind of intellectual consensus necessary to create good and useful communication protocols.
The key to all this incredible revolution happening is that the protocols are common. When we have all decided what is the next cool thing to do, we have to agree on a common way of doing it. We have a forum where all of the major players come together … We have a preliminary draft for a standard way of doing that so you’ll be able to mix different source objects within an HTML document … At the end of the day the importance is that it is a World Wide Web. When you come to a point where you are going through the Web, and it says, “I’m sorry, you cannot read this information unless you are using particular software,” then the World Wide Web is no longer worldwide. Everybody loses.
The next thing for the Web is the death of the concept of the killer application. It will be killer content. The idea of an application will disappear over time. There is one possibility … There will be a whole mingling of components of software which won’t be grouped into lumps like applications. Even the operating system will become less significant. What you will be interested in in your operating system is something which will be small and fast and get out of the way quickly.
What I see as interesting is the possibility that the Web will become something driven by its data rather than by its programs. What you see on your desktop won’t be a function of what you spend at the store for shrink-wrapped software. It will be a function of where you have been browsing. As you browse you will discover interesting objects and you will be able to download the code to make those objects come to life, and behave on your screen or in a 3-D space in a way that an author or artist intended.
The truth is I haven’t the faintest idea where it is going to be in five years’ time. When the Web as an information space becomes an assumption, then it will be time for the next revolution. In five years’ time the next revolution may have happened on top of the Web. It will happen within the Web. It may be mobile code. It may be robots working for you. It may be people finding ways of interacting politically.
The technology for a secure Web already exists. But we are manacled at the moment by U.S. export controls and encryption issues. I can’t speak for the whole W3 consortium, but when you look at designing a worldwide system, trying to limit the use of cryptography won’t work. I’m very much aware of the government’s worry about abuse of privacy. But my personal opinion is that the effort to prohibit [encryption] is becoming untenable.
I can’t see how government regulation can work. Regulation of content is the industry’s responsibility. The problem of giving kids access is easier to address than terrorism. I think the model that will work for kids is safe spaces, built either by a service provider or through software controls. Of course, it’s tough to block off the dangerous sites, because they just keep popping up.