On Digital Imprimatur
Amos Satterlee -  October 17, 2003

The Digital Imprimatur is John Walker's recent analysis of how the Internet is moving toward being locked down by centralized authority. My take is that Walker lays out a scenario where everything on the Internet will have a unique identifier -- each machine, each user, and even each document. Massive registries will validate every message, determining whether software is allowed to run on a particular machine and whether a document is allowed to be viewed. Everything will necessarily be encrypted and carry a unique digital signature to prevent hijacking and alteration of traffic. This will return accountability to the open range.
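
To make the mechanics concrete, here is a minimal sketch of what that lookup-and-verify step might look like. Everything in it is hypothetical -- the registry, the document identifiers, the use of a SHA-256 digest as the "unique identifier" -- it only illustrates the shape of the check Walker describes:

    import hashlib

    # Hypothetical central registry: document identifier -> expected digest.
    # In Walker's scenario this would be a network service, not a local table.
    REGISTRY = {
        "doc-0001": hashlib.sha256(b"approved content").hexdigest(),
    }

    def may_view(doc_id: str, payload: bytes) -> bool:
        """Permit display only if the document is registered and unaltered."""
        expected = REGISTRY.get(doc_id)
        if expected is None:
            return False  # never registered, or unregistered by central authority
        return hashlib.sha256(payload).hexdigest() == expected

    print(may_view("doc-0001", b"approved content"))  # True
    print(may_view("doc-0001", b"altered content"))   # False: altered in transit
    print(may_view("doc-9999", b"approved content"))  # False: not in the registry

Note that revocation is just deletion: drop the entry and the document stops being viewable everywhere at once.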

Centralized firewalls will divide the Internet into publishers and consumers. Consumers will provide content only through third-party servers (this already happens), and their connections will be limited to a set of approved protocols -- web pages, email, etc. Publishers will register content and be responsible for its legal consequences, and the audience for content will be granularly controlled based on the identifiers presented by the consumer. Kids under 18 just won't be able to display adult content, period. Or central authority can block any content deemed subversive by unregistering the document in the registry.
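
Again as a sketch, with made-up identifiers and ratings: the gating implied here is nothing more exotic than a policy check run against the credentials the consumer presents:

    from dataclasses import dataclass

    @dataclass
    class ConsumerID:
        """Hypothetical identity token presented with every request."""
        user: str
        age: int

    # Hypothetical ratings attached when the publisher registers the content.
    CONTENT_RATINGS = {
        "doc-0001": "general",
        "doc-0002": "adult",
    }

    def may_serve(doc_id: str, who: ConsumerID) -> bool:
        rating = CONTENT_RATINGS.get(doc_id)
        if rating is None:
            return False  # unregistered content simply is not served
        if rating == "adult" and who.age < 18:
            return False  # the "kids under 18" rule, enforced at the firewall
        return True

    print(may_serve("doc-0002", ConsumerID("alice", 17)))  # False
    print(may_serve("doc-0002", ConsumerID("bob", 30)))    # True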

How the deployment of the technologies Walker identifies will actually play out and what kinds of trade-offs will be made is beyond the scope of his paper. The value of the paper is the breadth of technologies and issues that he brings under one roof. I highly recommend at least scanning the whole article.

Walker's article brings a certain clarity to the current intellectual property battles. For government to govern, it must define what is governable. One of the simplest ways to do this is to declare that something is property. Once defined as property, the thing becomes subject to the simple laws of ownership. In ages past, there was no easy way to bundle ideas into discrete, bounded packages. Control of ideas was a rather crude affair involving inquisitions, censorship, and propaganda.

In this new digital age, the technology now exists to package ideas into discrete pieces of property, as described by Walker, and, more importantly, the technology exists to examine and compare the universe of idea packages and discover whether an idea is being used improperly. Such is the importance of the concept of intellectual property. It defines the boundaries of legitimate idea packages, assigns rights to the packages, and determines the remedies for misuse. The US government has clearly established a bipartisan policy to extend itself into this realm. The Patent Office ignores the public realm of prior art so it can create tidy packages of methods and processes. Congress extends copyright terms to the point that they become almost perpetual. It seems that only economic constraints impede similar extensions of patent terms.

Lost in the confusion is the effect intellectual property rights will have on old-fashioned real property rights. What is the value of my real property rights to a hunk of metal and silicon when the rights to operate the machine are controlled by others? I don't own the software that runs my laptop; I have a license to use it. My ownership rights in the real property are subsumed by the ownership rights to the intellectual property. To a certain degree, my real property rights are extinguished by their intellectual property rights. As more of our world becomes digitized, the conflict between real property and intellectual property will grow. Already, my real property rights in a Lexmark printer are limited by Lexmark's intellectual property control of which toner cartridges I can use. I am still free to throw the printer out the window, but I am no longer free to operate it as I see fit.

Two endnotes: First, it is so very appropriate that the entertainment industry is the prime combatant in the current round of battle. Artists have always been the cannon fodder, the dispensable soldiers in the first wave of attack. They are the pioneers sent into the bad locales to gain a foothold so the gentry can follow without getting quite so dirty. Second, I am reminded of a proposal from years ago that, rather than buying a car, the consumer would license the use of the car. The manufacturer would retain ownership of the vehicle and therefore could be regulated on proper recycling and disposal.

The systems that Walker describes are all based on connection to the Internet. For a piece of software to run, it must first check in with the central registry through a network connection. In the business world, I imagine the registry would reside on the local network so that access to office machines can be controlled, with only the in-house registry server validating outward. But for the consumer, computer technology will require a connection to the Internet at all times.
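
Here is a sketch of that check-in, under loudly stated assumptions: the registry endpoint, the machine identifier, and the ALLOW response are all invented for illustration, and a real scheme would presumably sign the exchange as well:

    import hashlib
    import urllib.request

    REGISTRY_URL = "https://registry.example/validate"  # hypothetical endpoint

    def check_in(machine_id: str, program: bytes) -> bool:
        """Ask the central registry whether this program may run on this
        machine. No connection, no answer -- and so no execution."""
        fingerprint = hashlib.sha256(program).hexdigest()
        url = f"{REGISTRY_URL}?machine={machine_id}&program={fingerprint}"
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                return response.read() == b"ALLOW"
        except OSError:
            return False  # offline: in this scenario the software refuses to start

    program = b"...bytes of the program about to launch..."
    if check_in("machine-42", program):
        print("validated: launch permitted")
    else:
        print("no validation: refuse to run")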

This will impact infrastructure. It implies an always-on connection to a network. If the operating system has to validate before it fully loads, then the machine must establish a connection during boot-up. The cost, both to the consumer and to society, of providing this infrastructure is enormous. I guess the solution will be a class of devices with embedded software that can operate independently of a network connection.

It gives the term digital divide a broader meaning. There will still be the haves and the have-nots. Then, within the haves, there will be classes of access: those who have full direct access, those who have partial direct access, and those who have sporadic partial access. These gradients already exist, but in the systems that Walker describes they will be codified and prescriptive. A level of liability will be established for each level of access, and these levels of liability will then be distributed throughout the industry -- I've got a certain liability, so I expect your software to protect me to these limits.

Encryption poses another set of problems. Security wants two things -- tamper-proof data streams and full access to data. Unfortunately, the two cannot fully coexist, so trade-offs are inevitable. As tamper-resistance increases, access to the data decreases. The converse is equally true: as access to data increases, tamper-resistance decreases, and the consumer machine is a toy next to the super clusters sprouting up all over.

The current and past administrations (the Patriot Act was not pulled out of the rubble of the Twin Towers; it was first presented, and rejected, during the Clinton years) have focused on access, at least to consumer and commercial data. The problem remains of getting through layers of encryption. One can imagine a meta-registry of public/private keys which backdoors or overlays approved encryption methods. Central authority could then discern approved traffic from rogue traffic based on signature, easily decrypt the authorized traffic, and focus investigative attention on the rogue traffic.
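
A toy version of that triage -- the escrow table, the sender names, and the use of an HMAC as a stand-in for the signature are all simplifications of mine, not anything Walker specifies:

    import hashlib
    import hmac

    # Hypothetical meta-registry: every approved encryption key is escrowed here.
    ESCROW = {
        "sender-A": b"escrowed-shared-key-A",
    }

    def triage(sender: str, ciphertext: bytes, signature: bytes) -> str:
        """Verify the signature with the escrowed key, then sort the traffic."""
        key = ESCROW.get(sender)
        if key is None:
            return "rogue: unknown sender, route to investigation"
        expected = hmac.new(key, ciphertext, hashlib.sha256).digest()
        if hmac.compare_digest(expected, signature):
            return "approved: decrypt with escrowed key and wave it through"
        return "rogue: bad signature, route to investigation"

    message = b"opaque ciphertext"
    good_sig = hmac.new(ESCROW["sender-A"], message, hashlib.sha256).digest()
    print(triage("sender-A", message, good_sig))      # approved
    print(triage("sender-A", message, b"\x00" * 32))  # rogue: bad signature
    print(triage("sender-B", message, good_sig))      # rogue: unknown sender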

The overhead would be staggering. One solution is a dual net: one network for secure transactions and one for noise. The first would run on IPv6; the other would run behind NATed enclosures on IPv4, where strong encryption is forbidden.

This adds one more aspect to the digital divide.

26 Oct 2003: Intelligence in the Internet Era -- from the CIA

<rant> Could we put an end to mooning over the utopia of end-to-end? The end-to-end theory was a design theory. It was offered as an addition to the toolbag of system design models. The paper specifically notes that it is an informational theory and should not replace theories on mechanical design. To the point, the model assumes a very generalized, robust infrastructure. Further, it assumes high performance from the infrastructure, and finally it recuses itself from issues of performance and mechanical design.

Much of the current discussion about the net focuses on the hardware systems that the theory avoids. Issues of decryption, firewalling, and quality of service have more to do with the mechanical architecture of the Internet. End-to-end can still be valid, working with a robust infrastructure that is also intelligent. A system of intelligent nodes can co-exist with an intelligent core. A stupid network is not a precondition.

Finally, linking the accident of almost anonymous, direct access to the theory does an injustice to both. On one hand, the theory was not built to accommodate social aspects; it is a technical theory and should not be expected to handle socio-political ones. On the other, the case for institutionalizing an open information architecture is fraught with a fluidity of variables. It requires a more vigorous, textured advocacy than the end-to-end theory can provide. </rant>
