Beyond PC
 
Wednesday, June 04, 2003  

I read yesterday that Microsoft will no longer develop Internet Explorer as a separate application. Instead, MS will embed the functionality directly into the operating system. I stared at the screen, aware that this news had a significance larger than the simple fact that the admission completely undermined their anti-trust representations (if it is only now being embedded, it wasn't before, right?). Then it clicked, and the pieces started falling into a coherent pattern.

I have been feeling in my bones that we are on the verge of a sea change. There are so many issues whirling around, both technical and political, which are interrelated yet are addressed in a piecemeal fashion rather than in a systematic way. I think we have gotten to the point where the issues are spilling out of their sandboxes, and the decisions made in one arena will have a direct impact on outcomes in another. The net result is that cluelessness will not be tolerated. Legislators and judges will have to understand the technical impact of their proposals and rulings. Technologists will have to accept that technology is not a panacea.

For me, the focus of all this is the architecture of personal computers and the networks that connect them. At pretty much the same time, there were two significant developments in computer technology. One was the invention of the personal computer by a bunch of hackers with the desire to wrest computing power from central processing and deliver it to the individual user. When IBM opened the design specs to the public, it allowed for the mass production of a basic commodity architecture. And while some elements of the architecture may remain proprietary, there are no secrets, and there is little third-party control of the equipment once it has been made. The net result is that consumers can open the box and change parts easily and at will, and that even the central servers have been ported to this commodity architecture.

The other was the invention of the Internet, financed by the Department of Defense. The driving force behind this effort was to create a network infrastructure that would enable the various proprietary computer systems to communicate with each other. The result was the end-to-end model, where the equipment used to run the network is "dumb" and the "intelligence" of the network is at the end nodes, and where data is transported as packets that do not need a dedicated circuit connection between nodes. In other words, the machines running the network are programmed only to pass the data packets from one point to the next, without "reading" the contents of each packet, without knowing what route the packets took to get there, and without knowing what route they will take to their end destination. On the network, all packets are treated as equal. It is the end nodes that take the data packets, determine how they are to be put together, figure out what they are supposed to do, and turn them into meaningful content.
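The division of labor described above can be sketched in a few lines of code. This is my own toy illustration, not any real protocol: the packet fields, the `dumb_router` and `smart_endpoint` functions, and the routing table are all hypothetical names invented for the sketch. The point is only that the "network" examines nothing but a destination address, while reordering and reassembly happen entirely at the end node.

```python
# Toy sketch of the end-to-end model (hypothetical names, not a real protocol):
# routers forward by destination address alone; endpoints reassemble content.

from dataclasses import dataclass

@dataclass
class Packet:
    dest: str      # the only field the network ever examines
    seq: int       # sequence number, meaningful only to the end node
    total: int     # total packets in the message, also endpoint-only
    payload: str   # opaque content as far as the network is concerned

def dumb_router(packet, routing_table):
    """Forward a packet to the next hop based solely on its destination.
    The router neither reads the payload nor tracks the overall route."""
    return routing_table[packet.dest]

def smart_endpoint(packets):
    """The 'intelligence' lives here: reorder packets by sequence number
    and reassemble them into meaningful content."""
    assert len(packets) == packets[0].total  # naive completeness check
    return "".join(p.payload for p in sorted(packets, key=lambda p: p.seq))

# Packets may arrive in any order; the network treats them all equally.
msg = [Packet("host-b", 1, 3, "to-"),
       Packet("host-b", 0, 3, "end-"),
       Packet("host-b", 2, 3, "end")]
routing_table = {"host-b": "next-hop-router-7"}

# Every packet gets the same treatment, regardless of what it carries.
assert all(dumb_router(p, routing_table) == "next-hop-router-7" for p in msg)
print(smart_endpoint(msg))  # prints "end-to-end"
```

The asymmetry is the whole design: the router function could not censor or prioritize by content even if it wanted to, because the payload never figures in its logic.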

The prime design imperatives for both the PC and the Internet were redundancy and dispersion. The positive results have been astounding, and during the run-up of the dot-com bubble, everyone pretty much kept hands off for fear of jinxing the growth. Now that the market bubble has burst, there is a widespread emphasis on addressing the negatives. Many of the supposed solutions were being developed during the bubble years but were kept in the background. They are now more prominent.

Next up: Imposition of Controls
