Internet Privacy and US Encryption Policy

This is the final written draft of a speech I gave in my Speech 111 class, Spring term of 1999; I probably tweaked it a little after transferring it to notecards. (I got an A on the speech too - woohoo!) Since then, the US government has made moves in a promising direction. This being the government we're talking about, no one I've heard from is entirely sure precisely what it did - but the Electronic Frontier Foundation posted this news item:

"Encryption Policy News: (Sept. 16, 1999)"

Oh, and I just have to tell this story. At the October 1998 general meeting of the Portland PC Users Group, a guy from Pacifier (a local ISP) came to present - and tried to talk about serious issues like internet taxation and other government regulations. And an older PPCUG member got upset because the guy wasn't trying to sell us anything! Here we get a nice, interesting discussion of important issues, and one guy gets upset because for once nobody's reaching into his wallet while he's not looking. Sheesh.

Jamey Sharp May 1999

You're sitting at home one evening, clicking away, browsing the wonders of the internet. Suddenly, a loud knock at your front door startles you. ``Police! Open up!'', you hear. You know you haven't done anything illegal. As your heart begins pounding, you wonder, ``Was it that e-mail I sent to Jim about Paramount's new movie 'bombing'?''

It seems unlikely that law-enforcement officials would bother to read anyone's e-mail, but supposedly they need access to it to find evidence of criminal activity, and a certain US law is written with that goal in mind. I want to give you an idea why I think this law is silly at best and dangerous at worst. The relevant part of this law simply restricts the export of encryption software. ``Encryption'' refers to the same concept as ``cryptography'' or ``encoding''. You know how to write a ``secret message'' by saying ``A is 1, B is 2, and so on''? Well, that's the simplest encryption ``algorithm'', or method, there is.
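To make that concrete, here is a minimal sketch of the ``A is 1, B is 2'' scheme. The choice of Python and the names encode and decode are mine, purely for illustration - they aren't part of the original speech.

    # A minimal sketch of the "A is 1, B is 2" scheme described above.
    # Function names and details are illustrative, not from the speech.

    def encode(message):
        """Map each letter to its position in the alphabet: A=1, B=2, ... Z=26."""
        return [ord(ch) - ord('A') + 1 for ch in message.upper() if ch.isalpha()]

    def decode(numbers):
        """Reverse the mapping: 1=A, 2=B, ... 26=Z."""
        return ''.join(chr(n + ord('A') - 1) for n in numbers)

    print(encode("Secret"))               # [19, 5, 3, 18, 5, 20]
    print(decode([19, 5, 3, 18, 5, 20]))  # SECRET

Of course, anyone who knows the scheme can decode the message, which is exactly why real cryptography relies on much stronger algorithms and secret keys - and why, as I'll argue below, cryptographers need to be able to publish and review each other's work.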

I know some of you don't have a computer at home, let alone an e-mail account. But these laws affect everybody, if perhaps indirectly. For one thing, they've hurt some major businesses. You might not use these businesses' products or hold their stock, but anything that affects the United States economy affects you. In addition, even if you managed to avoid ever using any kind of desktop computer or any products from affected companies, information about you is stored and transferred among computers all over the world, and if you don't care how easy it is to capture that data, I think you're crazy. The current trend is for more and more personal information to be processed by computers, while the US government, in the guise of protecting us, is trying to remove what few protections are currently available on that information. To be honest, I find that kind of scary.

Now, let's go through in more detail why government restriction of cryptographic exports is a bad thing, and after that I'll give you an idea what I think should be done about it.

The specific law which I believe should be changed is known as ITAR, the International Traffic in Arms Regulations, which governs movement of military-related stuff in and out of the country. The key part of this law is called the United States Munitions List. In order to export anything that's on the Munitions List from the US, you first have to register with the Department of State, which costs up to $1,000 for a 5-year registration, and then hope they'll grant you a license to export your product, though there's no guarantee that they will. In fact, even if you don't want to export anything but you make something that's on the list, you still have to register. This list has 21 categories of defense products and services, though for some reason categories 18 and 19 are marked ``reserved''. Anyway, the list covers everything from guns and ammunition to tanks and what they call ``vessels of war''. I have nothing against that, of course - I wouldn't want a US company sending missiles or anything else out of the country unless it was carefully controlled. But the problem is in category 13, part b, which adds ``Information Security Systems and equipment, cryptographic devices, [and] software'' to the list, including basically anything ``with the capability of maintaining secrecy or confidentiality of information''. In other words, US companies can't export anything which would provide anyone privacy!

One important effect of the ITAR export control of encryption is that it limits free speech, especially that related to scientific research. As the Electronic Frontier Foundation or EFF, arguably the most active group working to protect our digital civil liberties, puts it, ``When undue regulation burdens and even prevents worldwide discourse concerning cryptography, new encryption methods cannot be tested adequately, workable international encryption standards cannot be developed, and cryptographers unable to publish or obtain essential peer review without fear of prosecution cannot be persuaded to enter the field of cryptography at all.''

Several years ago, while one Daniel Bernstein was working on his Ph.D. at Berkeley, he wrote a program called Snuffle which apparently used a very good encryption algorithm. I say ``apparently'' because the State Department wouldn't let Mr. Bernstein post his program on the Internet so other cryptographers could examine it. After all, once it's on the Internet, there's nothing preventing it from being retrieved by someone outside of the country, which counts as exporting. With the EFF's help, Bernstein, now a professor teaching classes on cryptography, sued several government agencies on First Amendment grounds. He won, but the government appealed, and the case moved to the Ninth Circuit Court of Appeals, which covers Oregon and California, among other states. Just last month, the Circuit Court also ruled in favor of Bernstein. This case will probably go to the Supreme Court, and if they rule for Bernstein, it will be a victory for privacy and free speech and against insane rules.

Speaking of insane rules, here's a great example. In early 1994, Phil Karn, a cell phone technology researcher interested in both programming and politics, filed two separate requests with the Department of State: one asking whether he could export a book titled ``Applied Cryptography'', the other asking about a floppy disk containing nothing but the source code for the software printed in the book. The Department of State said that because of the First Amendment, exporting the book was legal, but exporting the floppy was not. As Mr. Karn commented [http://people.qualcomm.com/karn/export/], ``It's old news that the US Government believes only Americans... can write [software], but now they have apparently decided that foreigners can't type either!'' Not only is it easy to retype the code from a book, but Karn says the same software has been available from Internet sites outside the US for years.

Now that raises another important point. What are we trying to protect by restricting the export of this software? Presumably, if you're trying to keep anyone else from finding out about your technology, it's because your technology is better than anyone else's. That's why there's such an uproar over China stealing our nuclear technology secrets. But in the case of encryption software, the best software is already available outside the US! Pretty Good Privacy, possibly the best encryption software available, was simply published as a 900-page book and exported to Europe, where people ran the pages through a mostly automated scanning process and ended up with the source code in digital form again. In fact, most of the encryption software in use today, even by people in the US, came from outside the US.

So the real effect of ITAR is that researchers like Daniel Bernstein can't come up with better ways of protecting our privacy, while anyone who wants encryption software for their own use can legally get it from outside the US anyway. This is why ITAR is both dangerous and silly.

So what can be done about it? I think the answer is obvious: remove category 13, part b from the United States Munitions List. It's a very simple solution, because at heart this is a very simple problem. The US government's mistake of restricting cryptographic software exports has so many ramifications, but it's only a few sentences out of a huge document.

Removing these few paragraphs is a good idea because individual privacy is important, and deregulating cryptographic software will safeguard our personal information. Related to that is the fact that cryptographic software prevents overzealous law-enforcement agencies from becoming too authoritarian, as in the example I gave earlier. After all, just because someone uses a word like ``bomb'' doesn't necessarily mean they're talking about bombs - it all depends on the context.

In addition, this plan will benefit the economy and spur research into better cryptographic algorithms. When researchers all over the world can freely exchange ideas and review each other's work for flaws, law-abiding citizens can and will live more safely, and we'll have far less of a chance of having our privacy invaded.

Since ITAR doesn't actually prevent criminals from getting cryptographic software, removing the appropriate part of the Munitions List won't make things any worse. This is sort of like the argument against gun control, except that I'm in favor of gun control. Guns, after all, only have two purposes: to damage things, and to kill things, even if only in self-defense. You can't, however, kill someone with cryptography. Encryption software is beneficial in so many ways that its benefits outweigh the possibility that a few criminals might have e-mailed each other about their activities and encrypted the messages well enough that law-enforcement agencies couldn't read them. My personal opinion is that gun control is an entirely different situation, but if you disagree with that opinion, you should definitely agree that regulating cryptography is not the smart thing to do.

Now that you understand the problems of restricting cryptographic software exports and my simple solution for dealing with them, I want to leave you with this thought. Imagine that you're one of the many people each year whose identities are stolen. Somebody is racking up charges on your credit cards, using a copy of your driver's license when they get a ticket, and generally getting you into big trouble for a bunch of things you didn't even do! If only cryptographers had been able to share their ideas and research, they would have been able to keep ahead of criminals like the one bothering you, but because of a few sentences in an obscure law, you're living the nightmare that is identity theft. If you're ever in this situation, I wish you luck - you'll have a hard time getting out of it. Just remember, it didn't have to happen to you.