Well, Bruce Sterling, as usual, has an idea. It seems to me that we are walking a knife edge, nay, a ceramic blade edge of incredible sharpness, on one side of which is evolved consciousness, and on the other, dismal slavery. That blade hurts my feet.
This is what Larry Lessig (http://lessig.org/) has been yelling about for a few years now. In his 1999 book ‘Code and Other Laws of Cyberspace’ (http://www.code-is-law.org/) he literally argued that the architecture we devise for our information systems is like law that is directly enforceable. In the real world, leniency is built in because full enforcement would be too difficult, expensive, impractical, or unrealistic. This allows a certain degree of flexibility within which exceptions can have their space, thus avoiding suffocation by complete control over everything. In the architecture of information systems, however, we can ensure rules are followed whatever the situation. Such a level of enforceability is not necessarily desirable; an obvious example is in the domain of copyright and how computer systems can enforce it, leaving little or no space for fair use.
This is why the model of the commons is so crucial, especially where technology inserts itself in between people’s relationships. Social network software that cannot be changed by its users is simply a new form of totalitarianism (cybertotalitarianism?). How about online marketplaces where participants can’t have a say in how the market should be operated, or from which they can be excluded arbitrarily?
Let’s avoid bleeding our feet by holding the vision high for the new systems we are creating. Let’s give ourselves the tools we need in order to become responsible in our choices. But first, let’s give ourselves systems where choice is possible, otherwise how can we practice our responsibility?
François, this is a great comment. I’ve often noticed in projects I’ve worked on how the way we design the code ends up forcing human behavior, and not always in the happiest ways. One of the simplest, most obvious indicators of how important this issue is, is just to look at “skins” and how much people like the choice that they embody. The real issue is to make that choice not just be “skin” deep, but go as far down into the code as possible.