James Boyle's informative column on databases is right to point out the advantages of the free flow of basic information collected by government sources. But it is also critical to understand that the implicit trade-offs behind this calculation apply not only to data but to all forms of intellectual property, which can be either privately owned or placed in the public domain.
First, I think that it is wise to avoid the implicit, if striking, anthropomorphism of Boyle's title "Why (public) information wants to be free". The question here is how human beings should treat information in order to maximise its social value. The question is never how information "treats" itself.
It is also, I think, important to remember that a regime of public domain information is not a form of "socialism", benign or otherwise. Socialism champions the collective ownership of the means of production, which might describe the European control over its data. The public domain connotes no collective control over information or anything else. Each person can use what he or she will, no questions asked.
The hard question is: should information created by the government be put into the public domain? One argument in favour of this approach is that allowing reproduction at cost ensures greater dissemination of the information. The argument against it, which Boyle does not address, is that the taxes needed to fund the collection of information impose a burden on other sectors of the economy. The classical "marginal cost controversy" - do we price critical goods at marginal cost? - swirled over how to sort these conflicting forces out.
In this case, I think that Boyle has made the right call. One reason not to price at marginal cost is that it makes it difficult to decide whether it was worthwhile to invest in the production of the public resource in the first place. If there are no tolls on a $1m bridge, how do we know it was worth at least $1m to its free users? But whatever the situation with bridges and hard infrastructure, Boyle's numbers suggest that this is not a real issue here.
In other contexts, however, the public domain solution may be more difficult to defend. For example, the Bayh-Dole Act of 1980 consciously encourages universities and their inventors to patent inventions developed with government support. The theory behind the legislation is that inventions left in the public domain will languish for want of a champion to commercialise them.
Twenty-five years later it is still hard to tell whether Bayh-Dole made the right call. But no matter how that decision comes out, the case for putting information in the public domain seems a lot stronger. Data is of great value in other commercial endeavours. Open access allows individual firms to collate the data in ways that might command a premium, while leaving others access to the raw materials.
That is the approach taken with the human genome, and it seems to work there. It is nice to know that the United States has done something right. Let's hope that the European Union sees the light on this one.
The writer is the James Parker Hall Distinguished Service Professor of Law, The University of Chicago, and the Peter and Kirsten Senior Fellow at the Hoover Institution.