Free Culture by Lawrence Lessig
Consider the life of my Adobe eBook Reader.
An e-book is a book delivered in electronic form. An Adobe eBook is not a book that Adobe has published; Adobe simply produces the software that publishers use to deliver e-books. It provides the technology, and the publisher delivers the content by using the technology.
On the next page is a picture of an old version of my Adobe eBook Reader.
As you can see, I have a small collection of e-books within this e-book library. Some of these books reproduce content that is in the public domain: Middlemarch, for example, is in the public domain. Some of them reproduce content that is not in the public domain: My own book The Future of Ideas is not yet within the public domain.
Consider Middlemarch first. If you click on my e-book copy of Middlemarch, you'll see a fancy cover, and then a button at the bottom called Permissions.
If you click on the Permissions button, you'll see a list of the permissions that the publisher purports to grant with this book.
According to my eBook Reader, I have the permission to copy to the clipboard of the computer ten text selections every ten days. (So far, I've copied no text to the clipboard.) I also have the permission to print ten pages from the book every ten days. Lastly, I have the permission to use the Read Aloud button to hear Middlemarch read aloud through the computer.
Here's the e-book for another work in the public domain (including the translation): Aristotle's Politics.
According to its permissions, no printing or copying is permitted at all. But fortunately, you can use the Read Aloud button to hear the book.
Finally (and most embarrassingly), here are the permissions for the original e-book version of my last book, The Future of Ideas:
No copying, no printing, and don't you dare try to listen to this book! Now, the Adobe eBook Reader calls these controls "permissions"-- as if the publisher has the power to control how you use these works. For works under copyright, the copyright owner certainly does have the power--up to the limits of the copyright law. But for work not under copyright, there is no such copyright power.21 When my e-book of Middlemarch says I have the permission to copy only ten text selections into the memory every ten days, what that really means is that the eBook Reader has enabled the publisher to control how I use the book on my computer, far beyond the control that the law would enable.
The control comes instead from the code--from the technology within which the e-book "lives." Though the e-book says that these are permissions, they are not the sort of "permissions" that most of us deal with. When a teenager gets "permission" to stay out till midnight, she knows (unless she's Cinderella) that she can stay out till 2 A.M., but will suffer a punishment if she's caught. But when the Adobe eBook Reader says I have the permission to make ten copies of the text into the computer's memory, that means that after I've made ten copies, the computer will not make any more. The same with the printing restrictions: After ten pages, the eBook Reader will not print any more pages. It's the same with the silly restriction that says that you can't use the Read Aloud button to read my book aloud--it's not that the company will sue you if you do; instead, if you push the Read Aloud button with my book, the machine simply won't read aloud.
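To make that difference concrete, here is a minimal sketch, in Python, of how counter-based controls like these might be written. It is not Adobe's code; the class, its method names, and its limits are invented to mirror the restrictions described above.

```python
# A hypothetical sketch of code-enforced "permissions." Nothing here is
# Adobe's actual implementation; names and limits are invented for
# illustration only.

from datetime import datetime, timedelta


class EbookControls:
    """Usage limits enforced by software rather than by law."""

    def __init__(self, copies_per_period=10, pages_per_period=10,
                 period=timedelta(days=10), read_aloud_enabled=True):
        self.copies_per_period = copies_per_period
        self.pages_per_period = pages_per_period
        self.period = period
        self.read_aloud_enabled = read_aloud_enabled
        self._window_start = datetime.now()
        self._copies_used = 0
        self._pages_printed = 0

    def _reset_if_expired(self):
        # Start a fresh ten-day window once the old one has elapsed.
        if datetime.now() - self._window_start >= self.period:
            self._window_start = datetime.now()
            self._copies_used = 0
            self._pages_printed = 0

    def copy_to_clipboard(self, selection: str) -> bool:
        self._reset_if_expired()
        if self._copies_used >= self.copies_per_period:
            return False  # the software simply refuses; no judge is involved
        self._copies_used += 1
        return True

    def print_page(self) -> bool:
        self._reset_if_expired()
        if self._pages_printed >= self.pages_per_period:
            return False
        self._pages_printed += 1
        return True

    def read_aloud(self) -> bool:
        # With the button disabled, the machine simply won't read aloud.
        return self.read_aloud_enabled


# The Future of Ideas, as delivered: every control switched off.
locked_down = EbookControls(copies_per_period=0, pages_per_period=0,
                            read_aloud_enabled=False)
```

After the eleventh attempted copy, copy_to_clipboard returns False and nothing is copied: the limit is not a rule you might break, it is behavior the program will not perform.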
These are controls, not permissions. Imagine a world where the Marx Brothers sold word processing software that, when you tried to type "Warner Brothers," erased "Brothers" from the sentence.
This is the future of copyright law: not so much copyright law as copyright code. The controls over access to content will not be controls that are ratified by courts; the controls over access to content will be controls that are coded by programmers. And whereas the controls that are built into the law are always to be checked by a judge, the controls that are built into the technology have no similar built-in check.
How significant is this? Isn't it always possible to get around the controls built into the technology? Software used to be sold with technologies that limited the ability of users to copy the software, but those were trivial protections to defeat. Why won't it be trivial to defeat these protections as well?
We've only scratched the surface of this story. Return to the Adobe eBook Reader.
Early in the life of the Adobe eBook Reader, Adobe suffered a public relations nightmare. Among the books that you could download for free on the Adobe site was a copy of Alice's Adventures in Wonderland. This wonderful book is in the public domain. Yet when you clicked on Permissions for that book, you got the following report:
Here was a public domain children's book that you were not allowed to copy, not allowed to lend, not allowed to give, and, as the "permissions" indicated, not allowed to "read aloud"!
The public relations nightmare attached to that final permission. For the text did not say that you were not permitted to use the Read Aloud button; it said you did not have the permission to read the book aloud. That led some people to think that Adobe was restricting the right of parents, for example, to read the book to their children, which seemed, to say the least, absurd.
Adobe responded quickly that it was absurd to think that it was trying to restrict the right to read a book aloud. Obviously it was only restricting the ability to use the Read Aloud button to have the book read aloud. But the question Adobe never did answer is this: Would Adobe thus agree that a consumer was free to use software to hack around the restrictions built into the eBook Reader? If some company (call it Elcomsoft) developed a program to disable the technological protection built into an Adobe eBook so that a blind person, say, could use a computer to read the book aloud, would Adobe agree that such a use of an eBook Reader was fair? Adobe didn't answer because the answer, however absurd it might seem, is no.
The point is not to blame Adobe. Indeed, Adobe is among the most innovative companies developing strategies to balance open access to content with incentives for companies to innovate. But Adobe's technology enables control, and Adobe has an incentive to defend this control. That incentive is understandable, yet what it creates is often crazy.
To see the point in a particularly absurd context, consider a favorite story of mine that makes the same point.
Consider the robotic dog made by Sony named "Aibo." The Aibo learns tricks, cuddles, and follows you around. It eats only electricity, and that doesn't leave much of a mess (at least in your house).
The Aibo is expensive and popular. Fans from around the world have set up clubs to trade stories. One fan in particular set up a Web site to enable information about the Aibo dog to be shared. This fan set up aibopet.com (and aibohack.com, but that resolves to the same site), and on that site he provided information about how to teach an Aibo to do tricks in addition to the ones Sony had taught it.
"Teach" here has a special meaning. Aibos are just cute computers. You teach a computer how to do something by programming it differently. So to say that aibopet.com was giving information about how to teach the dog to do new tricks is just to say that aibopet.com was giving information to users of the Aibo pet about how to hack their computer "dog" to make it do new tricks (thus, aibohack.com).
If you're not a programmer or don't know many programmers, the word hack has a particularly unfriendly connotation. Nonprogrammers hack bushes or weeds. Nonprogrammers in horror movies do even worse. But to programmers, or coders, as I call them, hack is a much more positive term. Hack just means code that enables the program to do something it wasn't originally intended or enabled to do. If you buy a new printer for an old computer, you might find the old computer doesn't run, or "drive," the printer. If you discovered that, you'd later be happy to discover a hack on the Net by someone who has written a driver to enable the computer to drive the printer you just bought.
Some hacks are easy. Some are unbelievably hard. Hackers as a community like to challenge themselves and others with increasingly difficult tasks. There's a certain respect that goes with the talent to hack well. There's a well-deserved respect that goes with the talent to hack ethically.
The Aibo fan was displaying a bit of both when he hacked the program and offered to the world a bit of code that would enable the Aibo to dance jazz. The dog wasn't programmed to dance jazz. It was a clever bit of tinkering that turned the dog into a more talented creature than Sony had built.
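As a purely illustrative sketch, here is roughly what adding such a trick looks like in code. The Aibo class, its built-in tricks, and the dance_jazz routine below are invented for this example; Sony's actual Aibo software is not shown or implied.

```python
# A hypothetical sketch of "teaching" a robot dog a new trick: adding code
# the manufacturer never shipped. The class and methods are invented for
# illustration; Sony's real programming interface looked nothing like this.


class Aibo:
    """Stand-in for the factory firmware: only the built-in tricks."""

    def sit(self):
        print("Aibo sits.")

    def roll_over(self):
        print("Aibo rolls over.")


def dance_jazz(self):
    # The new trick: behavior the dog wasn't originally programmed to have.
    print("Aibo dances jazz.")


# The "hack": attach the new routine to the existing product.
Aibo.dance_jazz = dance_jazz

pet = Aibo()
pet.sit()
pet.dance_jazz()
```

The point of the sketch is only that the "teaching" happens entirely in code: the product does something new because someone wrote and shared a small addition to its program.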
I've told this story in many contexts, both inside and outside the United States. Once I was asked by a puzzled member of the audience, is it permissible for a dog to dance jazz in the United States? We forget that stories about the backcountry still flow across much of the world. So let's just be clear before we continue: It's not a crime anywhere (anymore) to dance jazz. Nor is it a crime to teach your dog to dance jazz. Nor should it be a crime (though we don't have a lot to go on here) to teach your robot dog to dance jazz. Dancing jazz is a completely legal activity. One imagines that the owner of aibopet.com thought, What possible problem could there be with teaching a robot dog to dance?
Let's put the dog to sleep for a minute, and turn to a pony show-- not literally a pony show, but rather a paper that a Princeton academic named Ed Felten prepared for a conference. This Princeton academic is well known and respected. He was hired by the government in the Microsoft case to test Microsoft's claims about what could and could not be done with its own code. In that trial, he demonstrated both his brilliance and his coolness. Under heavy badgering by Microsoft lawyers, Ed Felten stood his ground. He was not about to be bullied into being silent about something he knew very well.
But Felten's bravery was really tested in April 2001.22 He and a group of colleagues were working on a paper to be submitted at a conference. The paper was intended to describe the weakness in an encryption system being developed by the Secure Digital Music Initiative as a technique to control the distribution of music.
The SDMI coalition had as its goal a technology to enable content owners to exercise much better control over their content than the Internet, as it originally stood, granted them. Using encryption, SDMI hoped to develop a standard that would allow the content owner to say "this music cannot be copied," and have a computer respect that command. The technology was to be part of a "trusted system" of control that would get content owners to trust the system of the Internet much more.
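Here is a minimal sketch, in Python, of the kind of flag-respecting behavior a "trusted system" implies: the content carries the owner's rule, and compliant software obeys it. The field names and the copy_track function are assumptions made for illustration, not the actual SDMI design.

```python
# A hypothetical sketch of a "trusted" copier: the content carries a
# "no copy" marker, and a compliant device honors it. The format and field
# names are invented; this is not the SDMI specification.


def copy_track(track: dict, destination: list) -> bool:
    """Check the owner's rule before copying; refuse if copying is barred."""
    if track.get("no_copy", False):
        # The command "this music cannot be copied" is obeyed by the code
        # itself, not enforced after the fact by a court.
        return False
    destination.append(track)
    return True


library = []
song = {"title": "Example Song", "no_copy": True}
print(copy_track(song, library))   # False: the device refuses to copy
```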
When SDMI thought it was close to a standard, it set up a competition. In exchange for providing contestants with the code to an SDMI-encrypted bit of content, contestants were to try to crack it and, if they did, report the problems.