
Cryptography: Past, Present, and Future

Group Three
Ahmed Abdalla, Troy Brant, Gabe Campbell, Chansereyratana Lim, Saudamini Zarapkar
CS4235 Information Security
Georgia Institute of Technology

Table Of Contents
1. Introduction
2. A Brief History of Cryptography
   2.1 Early Cryptography
   2.2 Entering the Modern Era
3. Symmetric Encryption
   3.1 The Serpent Algorithm
   3.2 The Twofish Algorithm
4. Asymmetric Cryptography
   4.1 The RSA Algorithm
   4.2 The Elgamal Algorithm
   4.3 The Future of Asymmetric Cryptography
   4.4 Problems with Asymmetric Cryptography
5. Politics in Cryptography
   5.1 The National Security Agency
   5.2 Challenges to Government Control Over Cryptography
   5.3 Government Control Over Exporting Cryptography
   5.4 Current Political Status
6. Quantum Cryptography
   6.1 The Effect of Quantum Computing on Current Encryption Algorithms
   6.2 The BB84 Algorithm
   6.3 Using Entanglement for Quantum Key Distribution
   6.4 Challenges and the Future
7. Conclusion
Appendix A
Appendix B

Abstract
In this paper we describe the history and evolution of cryptography starting from the beginning
of the 20th century and continuing into the current day. Specifically, the following five topics are
addressed: the cryptography used from 1900 until the end of World War II, the history of the
politics involved with government control over cryptography, the history and current status of
asymmetric cryptography, the history and current status of symmetric cryptography, and the
future of the field of cryptography using quantum physics. By analyzing these areas, we hope to
provide a more complete picture of where the field has been and where it is headed.

1. Introduction
Cryptography is a subject that has been studied and applied since ancient Roman times,
and research into better encryption methods continues to this day. Cryptography is the art of
encoding and decoding messages so that they can be securely transmitted from a sender to a receiver without fear of an outside party intercepting and reading or altering the message's
contents. The purpose of this paper is to describe the history and evolution of cryptography
starting from the beginning of the 20th century and continuing into the present day. Specifically,
the following five topics will be addressed: the cryptography used from 1900 until the end of
World War II, the history of the politics involved with government control over cryptography, the
history and current status of asymmetric cryptography, the history and current status of
symmetric cryptography, and the future of the field of cryptography using quantum physics.

2. A Brief History of Cryptography


The history of cryptography goes back thousands of years. Until recently, this history has consisted of the classical components of cryptography: ciphers written with pen and paper or with other simple mechanical devices. The turn of the 20th century, however, brought about several advances in the subject. The invention of electromechanical machines such as the German Enigma provided a much more efficient method of encrypting messages, and subsequently led to the introduction of cryptography into the emerging field of computing. The fundamental objective of cryptography is to enable two people, usually referred to as Alice and Bob, to communicate over an insecure channel in such a way that an opponent, Oscar, cannot understand what is being said (Zotos). A cipher is an algorithm for performing encryption (and, in reverse, decryption); it is in effect a series of well-defined steps that can be followed as a procedure. The original information is usually known as plaintext and the encrypted message is known as ciphertext.

2.1 Early Cryptography


One of the oldest cipher styles is the substitution cipher, whose earliest examples date back thousands of years. Substitution is a method of encrypting by which units of plaintext are replaced with ciphertext according to a regular system. An example of this is the Atbash cipher, which dates from around 500 BC (Kartalopoulos). This cipher is based on the Hebrew alphabet and works by substituting the first letter with the last letter, the second letter with the second-to-last letter, and so on. Because it is a monoalphabetic cipher and has only one possible key, this cipher is relatively weak; however, this was not a serious concern in its time, as literacy was not common. These types of ciphers were used frequently in religious texts and writings, and it was probably this use that led to the development of frequency analysis as a way to analyze the messages (Kartalopoulos). In frequency analysis, one examines how often each substituted letter appears and matches those frequencies against the letters that appear most often in the plaintext language.
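To show how little machinery a monoalphabetic substitution needs, the following minimal sketch applies an Atbash-style reversal to the Latin alphabet (a hypothetical adaptation for illustration, since the historical cipher operated on the Hebrew alphabet):

```python
import string

# Atbash-style substitution on the Latin alphabet: the first letter maps to the
# last, the second to the second-to-last, and so on. The cipher is its own inverse.
ALPHABET = string.ascii_uppercase
ATBASH = {c: ALPHABET[25 - i] for i, c in enumerate(ALPHABET)}

def atbash(text: str) -> str:
    return "".join(ATBASH.get(c, c) for c in text.upper())

print(atbash("ATTACK AT DAWN"))   # -> ZGGZXP ZG WZDM
print(atbash("ZGGZXP ZG WZDM"))   # applying it again recovers the plaintext
```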
Frequency analysis was a huge advance in cryptanalysis; however, around 1467 came a major development in cryptography: the polyalphabetic cipher. Based on substitution, it used multiple substitution alphabets. The Vigenère cipher (see the table in Appendix A) is probably the best-known example of this type, though it is a simplified special case. The Enigma machine, used in WWII, is more complex but still fundamentally a polyalphabetic substitution cipher. The polyalphabetic cipher was invented by Leon Battista Alberti in 1467. Alberti would use a common Caesar cipher to encrypt his messages, but whenever he wanted he would switch alphabet keys, indicating the switch by capitalizing the first letter of the new alphabet. He also created a decoding apparatus, quite similar to the Vigenère table, called an encryption disk, which he would use to decrypt messages. The polyalphabetic method was not readily adopted, however, until the 18th and 19th centuries. From before 1400 until about 1750, only one kind of encryption method was widely used: substitution (Reeds). When Thomas Jefferson adopted the Vigenère cipher as part of his encryption methods, he was thought to be an innovator, despite the cipher's 300-year existence. It was in the nineteenth century that the Vigenère table came into regular use. The multiple alphabet possibilities for each letter, and the relative security provided if both parties knew the key, made the cipher popular and, for a time, almost impenetrable.
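As an illustration of the polyalphabetic idea, the sketch below implements a basic Vigenère encryption with a repeating keyword (letters only, toy example):

```python
# Basic Vigenère cipher: each plaintext letter is shifted by the value of the
# corresponding (repeating) key letter, so the same plaintext letter can map to
# different ciphertext letters depending on its position.
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    out = []
    for i, c in enumerate(text.upper()):
        shift = ord(key[i % len(key)].upper()) - ord("A")
        if decrypt:
            shift = -shift
        out.append(chr((ord(c) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

ciphertext = vigenere("CRYPTO", "ABCD")
print(ciphertext)                          # -> CSASTP
print(vigenere(ciphertext, "ABCD", True))  # -> CRYPTO
```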
A huge breakthrough for cryptanalysis came in the mid-1800s with Charles Babbage and his study of the field. The strength of the Vigenère and other polyalphabetic ciphers was their ability to foil frequency analysis. The critical weakness in the cipher that Babbage (and the man who published the studies, Friedrich Kasiski) found was the short and repetitive nature of the key (Reeds). If a cryptanalyst could discover the key's length, they could apply the fundamentals of frequency analysis and decrypt the ciphertext using that method. The test took advantage of the fact that certain common words like "the" could be encrypted using the same key letters, leading to repeated groups in the ciphertext. For example, assuming the key is ABCD, certain sequences, such as CRYPTO (enciphered into CSASTP), are repeated whenever the key placement is exactly the same (Reeds). Thus one can tabulate where these sequences repeat; the distances between repetitions reveal likely key lengths, after which frequency analysis can be applied to each alphabet to decrypt the cipher. This test of Babbage's went on to assist several British military campaigns.
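The repetition-finding step of the Babbage/Kasiski test is easy to sketch in code. The fragment below (an illustrative toy, not a full cryptanalysis tool) tabulates the distances between repeated trigrams; the key length is expected to divide most of these distances:

```python
from collections import defaultdict

def kasiski_distances(ciphertext: str, n: int = 3) -> list:
    """Distances between repeated n-grams in the ciphertext."""
    positions = defaultdict(list)
    for i in range(len(ciphertext) - n + 1):
        positions[ciphertext[i:i + n]].append(i)
    distances = []
    for pos in positions.values():
        distances += [b - a for a, b in zip(pos, pos[1:])]
    return distances

# Toy ciphertext containing a repeated fragment 16 positions apart: a 4-letter
# key (as in the ABCD example above) divides that distance evenly.
print(kasiski_distances("CSASTPKVSIQUTGQUCSASTPIUAQJB"))   # -> [16, 16, 16, 16]
```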

2.2 Entering the Modern Era


An important era for the promotion and proliferation of cryptography was WWII, when intelligence needed to be transmitted rapidly and secretly on all sides of the conflict. The Germans had their famous Enigma machine, a cipher machine consisting of rotors, a plug board, and a keyboard. The machine worked with any combination of rotors, where a specific rotor revolved a certain amount with each keystroke, thus changing the current flow through the machine and therefore the encryption of the letter (see Appendix B-1) (Hinsley). British and American forces had their own electromechanical encryption machines, called the TypeX and SIGABA respectively. The reason the Enigma machine gained so much notoriety was that the Allies were able to decrypt many of its messages. This was accomplished by obtaining an Enigma machine in a diplomatic bag and using it to develop their own decryption capability; the resulting intelligence was codenamed ULTRA. This intelligence was a significant aid to the Allied war effort, and some historians say that the intelligence gained through ULTRA shortened the war by as much as two years (Hinsley).
The world wars also brought about the common use of the one-time pad (OTP) algorithm. OTP is an encryption algorithm in which the plaintext is combined with a random key that is as long as the plaintext, so that each key character is used only once. If the key is truly random, never reused, and kept secret, the OTP can be proven to be unbreakable. It was invented around 1917, with one of the contributors being Gilbert Vernam, an AT&T Bell Labs engineer (Hinsley).
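A minimal sketch of the idea, combining message bytes with an equally long random key by XOR (one common way of realizing the combination), looks like this:

```python
# One-time pad sketch: XOR the message with a truly random key of the same length.
# Applying the same operation with the same key recovers the message.
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "the key must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"MEET AT MIDNIGHT"
key = secrets.token_bytes(len(message))   # random key, used once and then discarded
ciphertext = otp_xor(message, key)
print(otp_xor(ciphertext, key))           # -> b'MEET AT MIDNIGHT'
```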

Claude Shannon, a fellow engineer at Bell Labs, later proved that the OTP was unbreakable.
Shannon would also prove to be the father of modern, mathematically based cryptology. His work on the theory of communication and his other studies in information theory provided a sound basis for theoretical cryptology and cryptanalysis. With the advent of his work, the end of WWII, and the beginning of the Cold War, cryptography slipped away from the public sector and became strictly a governmental instrument. Major advances in cryptography wouldn't be seen until the mid-1970s with the advent of the Data Encryption Standard.

3. Symmetric Encryption
With the invention of calculators and computers in the middle of the twentieth century, older codes were becoming easier and easier to break. In 1975 IBM submitted the DES encryption algorithm to the government in order to create a standard method of encryption between the government and the private sector. After being reviewed by the NSA, it was accepted as the first standard encryption algorithm for the US government. DES uses 64-bit keys, but 8 bits of the key do not affect the encryption; they serve only as parity check bits, giving an effective key length of 56 bits. Soon after its release, it was shown to be insecure because of the shortness of its key and the increasing power of computers. Later, double DES and triple DES were proposed. Double DES used two keys and was intended to give an effective key length double that of DES; however, this did not meaningfully improve the security of the algorithm. Triple DES, on the other hand, uses two keys in three encryption steps. This algorithm has better security than both DES and double DES, but its effective key length is still only double that of DES. Worrying that this algorithm could be broken easily with more powerful computers, the National Institute of Standards and Technology (NIST) called for new algorithms to be submitted and chosen as a new standard.

In response to NIST's call, many cryptographers submitted candidate encryption algorithms. Only one of them, Rijndael, was selected and standardized as the new Advanced Encryption Standard (AES). However, many popular algorithms that did not win the contest are still widely used. Two of these algorithms, Serpent and Twofish, are investigated in further detail below.
The Rijndael algorithm was chosen to become the Advanced Encryption Standard. This algorithm differs from DES in both key length and algorithmic structure. AES uses a repeating cycle of 9, 11, or 13 rounds for 128-, 192-, and 256-bit keys respectively. In each cycle (or round), it executes four steps: Byte Substitution, Shift Row, Mix Column, and Add Subkey. Rijndael was chosen over the other algorithms because of its speed in both software and hardware and its effectiveness in encrypting data.
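As a usage-level illustration only (the sketch below is not from the original paper and assumes the third-party Python `cryptography` package; any modern AES library would serve equally well), a single 128-bit block can be encrypted and decrypted as follows:

```python
# Encrypt and decrypt one 16-byte block with AES-128 in ECB mode. This is a
# single-block demo only; real systems should use an authenticated mode such as GCM.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)                 # 128-bit key
block = b"sixteen byte msg"          # exactly one 128-bit block

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(block) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.ECB()).decryptor()
print(decryptor.update(ciphertext) + decryptor.finalize())   # -> b'sixteen byte msg'
```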

3.1 The Serpent Algorithm


In the AES selection process, Serpent finished second to the Rijndael algorithm. It was designed by Ross Anderson, Eli Biham, and Lars Knudsen. The algorithm met every requirement that NIST imposed; however, Serpent ran slower than Rijndael. Like Rijndael, Serpent uses a block size of 128 bits, encrypting a 128-bit plaintext block P into a 128-bit ciphertext C in 32 rounds under the control of 33 128-bit subkeys K0, ..., K32. Its key length can vary from 128 to 256 bits. If the key is shorter than 256 bits, a '1' bit is appended at the most significant end, followed by as many '0' bits as required to make up 256 bits. The algorithm consists of three phases. The first is an initial phase in which eight S-boxes are generated (Appendix B-2). In the second phase the plaintext is transformed into an intermediate ciphertext, which is then transformed into the final ciphertext in the third phase.

Inspired by the Rivest Cipher 4 (RC4) algorithm, the S-boxes are generated using a 32 x 16 matrix. The matrix is initialized with the 32 rows of the DES S-boxes and transformed by swapping the entries in the rth array depending on the values of the entries in the (r+1)st array and on an initial string representing a key. If the resulting array has the desired differential and linear properties, it is saved as a Serpent S-box. This procedure is repeated until eight S-boxes have been generated. The algorithm then runs three operations over thirty-two rounds: a bit-wise XOR with the 128-bit round key Kr, substitution via thirty-two parallel copies of one of the eight S-boxes, and data mixing via a linear transformation (Anderson). These operations are performed in each of the thirty-two rounds with the exception of the last round, in which the linear transformation is replaced with a bit-wise XOR with a final 128-bit key. The algorithm can be described in equation form:
- B0 = IP(P), where B0 is the input to the first round and IP is the initial permutation
- Bi+1 = Ri(Bi), where Bi+1 is the output of round i
- Ri(X) = L(Si(X XOR Ki)) for i = 0, ..., 30
- R31(X) = S31(X XOR K31) XOR K32 for the final round
- Si denotes the parallel application of thirty-two copies of S-box S(i mod 8), and L is the linear transformation
- C = FP(B32), where C is the ciphertext and FP is the final permutation, the inverse of the initial permutation
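The round structure above can be summarized in the following sketch. The S-boxes, linear transformation, and permutations are left as stub parameters (placeholders, not the real Serpent tables), so it only illustrates the control flow:

```python
# Structural sketch of Serpent's 32-round flow; S, L, IP and FP are stand-ins.
def serpent_encrypt(P, K, S, L, IP, FP):
    """P: 128-bit block as an int; K: subkeys K0..K32; S: eight S-box layers;
    L: linear transformation; IP/FP: initial and final permutations."""
    B = IP(P)
    for i in range(31):                 # rounds 0..30: XOR key, S-box layer, then L
        B = L(S[i % 8](B ^ K[i]))
    B = S[31 % 8](B ^ K[31]) ^ K[32]    # round 31: final key XOR replaces L
    return FP(B)
```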
When Serpent was proposed to the National Institute of Standards and Technology, the best known attack had a success probability no higher than 2^-120. Nonetheless, in 2002, Courtois and Pieprzyk observed that Serpent could be expressed as a system of quadratic equations. In their paper, "Cryptanalysis of Block Ciphers with Overdefined Systems of Equations", they concluded that Serpent with key lengths of 192 and 256 bits could in principle be broken using the eXtended Sparse Linearization (XSL) algorithm with only one or two known plaintexts. However, the attack still requires roughly 2^200 operations (Courtois).
In terms of hardware implementation, Elbirt and Paar evaluated Serpent and concluded that the algorithm can be implemented to run fast on the register-rich architecture of the Xilinx XCV1000 FPGA they chose. Their implementation could perform encryption at a rate of more than 4 Gbit/s (Elbirt).

3.2 The Twofish Algorithm


The third-place finalist in the NIST contest was the Twofish algorithm, designed by Bruce Schneier, John Kelsey, Doug Whiting, David Wagner, Chris Hall, and Niels Ferguson. It is similar to the AES standard and Serpent: it uses a block size of 128 bits and supports key sizes of up to 256 bits.
Twofish works as follows. The plaintext is split into four 32-bit words. In the initial step, called the "input whitening" step, these words are XORed with four words of the key. This is followed by sixteen rounds of encryption. In each round, the two words on the left are used as inputs to the g function (one of them is rotated by 8 bits first). The g function consists of four byte-wide key-dependent S-boxes, followed by a linear mixing step based on a Maximum Distance Separable (MDS) matrix. The results of the two g functions are combined using a Pseudo-Hadamard Transform (PHT), defined as a' = a + b mod 2^32 and b' = a + 2b mod 2^32, where a and b are the two inputs. Two words of the round key are then added. These two results are then XORed into the words on the right (one of which is rotated left by 1 bit first, the other rotated right afterwards). The left and right halves are then swapped for the next round. After all the rounds are executed, the swap of the last round is reversed, and the four words are XORed with four more key words to produce the ciphertext (see Appendix B-2 for a graphical depiction of the algorithm).
In mathematical terms, the algorithm works as follows: the 16 plaintext bytes p0, ..., p15 are split into four words P0, ..., P3 of 32 bits each using a little-endian conversion.
(Eq. 1)
These words are then XORed with four words of the expanded key.
(Eq. 2)
The result is used as input to the next round. The third word is XORed with the first output of the left half and then rotated right by one bit. Conversely, the fourth word is rotated left by one bit and then XORed with the second output of the left half. This process is repeated 16 times. In the final round the outputs are swapped back, undoing the swap of the initial step, and are then XORed with four more words of the expanded key.
(Eq. 3)
The ciphertext is then written as 16 bytes c0, ..., c15 using the same little-endian conversion as used for the plaintext.
(Eq. 4)
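As a small worked illustration of the PHT step described above (toy inputs, independent of the rest of the cipher):

```python
# Pseudo-Hadamard Transform used in Twofish: mixes two 32-bit words modulo 2^32.
MASK = 0xFFFFFFFF

def pht(a: int, b: int) -> tuple:
    return (a + b) & MASK, (a + 2 * b) & MASK

print(pht(1, 2))         # -> (3, 5)
print(pht(MASK, 1))      # -> (0, 1): the additions wrap around modulo 2^32
```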
When this algorithm was submitted to the contest, there was no attack against it more efficient than brute force: the most efficient attack against Twofish with a 128-bit key had a complexity of 2^128, the most efficient attack with a 192-bit key had a complexity of 2^192, and the most efficient attack with a 256-bit key had a complexity of 2^256. Since then there have been many attempts to find a better way to break the algorithm; however, no one has found a way to break it faster than brute force. In 2001, Stefan Lucks attacked Twofish using several methods, including key-finding and distinguisher attacks. He found that the key-finding attack was only two to four times faster than an exhaustive brute-force search, and that the distinguisher attack had a success probability of only 25%, requiring 2^32 to 2^127 chosen plaintexts. Moreover, these attacks only break one half of Twofish ciphertexts because of its one-bit rotation. (Schneier)
No matter how good or widely used Serpent and Twofish are, they still suffer from the key distribution/exchange problems and key management disadvantages inherent in symmetric ciphers (Lucks). Finally, it may only be a matter of time before these algorithms can be broken using more powerful computers and techniques (Courtois).

4. Asymmetric Cryptography
The idea of asymmetric (or public key) cryptography was first published in 1976 by Whitfield Diffie and Martin Hellman in their paper "New Directions in Cryptography" (Menezes, 2). In this document Diffie and Hellman addressed the issue of cryptographic algorithms and their reliance on secure channels of communication. While the paper proposed the theory behind asymmetric cryptographic algorithms, it did not provide a method of implementation. It was not until Rivest, Shamir, and Adleman created the RSA algorithm in 1978 that an algorithm existed which could make use of the technique proposed by Diffie and Hellman in 1976. The RSA algorithm is based upon the difficulty of factoring large numbers. In 1984 the Elgamal algorithm was proposed, which included the functionality to perform the Diffie-Hellman (DH) key exchange (as described in their paper) and was based on the discrete logarithm problem, considered a more sound mathematical problem than prime factorization (Menezes, 6). This section focuses primarily on the two main public key algorithms and their implementations.

4.1 The RSA Algorithm


In their paper, Diffie and Hellman focused on the issue of key exchange and secure channels in contemporary (symmetric) cryptosystems. They defined a public key cryptosystem as a system with two separate keys such that computing one from the other is "computationally infeasible", allowing one of the keys to be published and the other to be kept secret. This separation of keys allows the public key to be transported across insecure channels without worry. Another advantage of public key encryption they discussed is the ability to sign documents, giving the ability to authenticate senders. Prior to the publication of this paper, the authenticity of the sender could not be determined by the receiver, making it difficult at times to ensure the validity of messages (DH, 1-2).
As mentioned, the Diffie-Hellman paper proposed the idea of asymmetric (public key) cryptosystems using computationally secure algorithms, but did not propose a concrete construction. In their 1978 paper "A Method for Obtaining Digital Signatures and Public-Key Cryptosystems", Rivest, Shamir, and Adleman discussed in detail the "trap door one-way permutation" function originally proposed by Diffie and Hellman and their idea of digital signatures. A "trap door one-way permutation" is a function which can easily encrypt a message with one key, creating an encrypted message that is computationally difficult to decrypt without the second key (RSA, 1-2).
The method used to create the one-way permutation in RSA, in general terms, is based on the computational difficulty of factoring large numbers. Mathematically, an encryption may be demonstrated with the equation:

C = M^e (mod N)    (Eq. 5)

where C is the ciphertext, M^e indicates the plaintext M raised to the public exponent e, and mod N indicates the modulus operation with N = PQ, where P and Q are very large secret prime numbers. Because only N is published, and factoring N into P and Q is assumed to be computationally difficult, the security of this algorithm depends on the difficulty of factoring large numbers (N).
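To make the key relationships concrete, here is a minimal textbook-RSA sketch with deliberately tiny primes (far too small to be secure, and omitting the padding that any real implementation would require):

```python
# Toy RSA: key generation, encryption C = M^e mod N, decryption M = C^d mod N.
from math import gcd

p, q = 61, 53                    # tiny "secret" primes, for illustration only
N = p * q                        # public modulus (3233)
phi = (p - 1) * (q - 1)          # Euler's totient of N
e = 17                           # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)              # private exponent: modular inverse of e (Python 3.8+)

M = 65                           # message encoded as an integer smaller than N
C = pow(M, e, N)                 # encrypt with the public key (e, N)
print(C, pow(C, d, N))           # decrypting with the private key recovers 65
```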

4.2 The Elgamal Algorithm


In 1985, Taher Elgamal proposed a new method of public key encryption using discrete logarithms instead of prime factoring. His paper, "A public key cryptosystem and a signature scheme based on discrete logarithms", discusses a new method for public key cryptosystems and their use for digital signatures. This is considered a more mathematically sound method of encryption (Ellis, 3).
The Elgamal algorithm differs in its implementation from RSA in that it uses discrete logarithms as the basis of its trapdoor. Mathematically, an encryption can be described with the equation:

C = M · y^k (mod p)    (Eq. 6)

where C is the ciphertext, M is the plaintext, p is a large public prime, y is the recipient's public key, and k is a secret session value chosen randomly by the sender. The sender also transmits g^k mod p (g being a public base), from which the recipient can recover the session value using their private key and thereby decrypt the message. Recovering k or the private key from the transmitted values requires computing a discrete logarithm, which is computationally infeasible for large values of p. Because each message carries this extra value alongside the ciphertext, the ciphertext is twice as long as the original plaintext (Elgamal, 1).
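A correspondingly tiny Elgamal sketch (again with toy parameters far too small for real use) shows the two-part ciphertext and why it is twice the length of the message:

```python
# Toy Elgamal over a small prime field: the ciphertext is the pair (c1, c2).
import secrets

p, g = 467, 2                        # small public prime and base, illustration only
a = secrets.randbelow(p - 2) + 1     # recipient's private key
y = pow(g, a, p)                     # recipient's public key

M = 123                              # message encoded as an integer smaller than p
k = secrets.randbelow(p - 2) + 1     # sender's one-time secret session value
c1, c2 = pow(g, k, p), (M * pow(y, k, p)) % p

shared = pow(c1, a, p)               # recipient recomputes y^k as (g^k)^a
print((c2 * pow(shared, -1, p)) % p) # -> 123
```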


While RSA was originally used for almost all public key cryptography, Elgamal has replaced it in a few applications. In PGP, Elgamal replaced RSA as the default implementation after version 5. This was because Elgamal fit better with the Diffie-Hellman key exchange, was considered more mathematically sound, and had a better implementation of digital signatures (Odlyzko 95, 4). RSA remains the algorithm used in SSL and many PKI implementations. Each algorithm has its own benefits and drawbacks:

Drawbacks of using Diffie-Hellman/Elgamal over RSA include:
- Message expansion: DH/Elgamal creates ciphertext that is longer than the message. This is not generally a problem, as it is typically used to transfer session keys.
- Signature strength: DH/Elgamal uses smaller keys, which may not be large enough to guarantee security for a sufficient period of time.
- Computational intensity: DH/Elgamal requires more computation than RSA.
- Need for 'good' randomness: DH/Elgamal requires a random value which must be unique and unpredictable for each message. If an attacker recovers k, or if two messages are encrypted with the same k, the private key may be obtained.

Benefits of DH/Elgamal over RSA include:
- No patent/copyright issues: DH/Elgamal are free and open algorithms, whereas RSA Labs must license RSA for use in the US and Canada.
- RSA is forgeable: a malicious user could generate fake data that is easily factorable, and this is difficult to check without access to the private key.
- RSA key generation requires a lot of computation; it is not well suited to systems that require ephemeral keys.
- RSA offers less security per key bit than DH/Elgamal.
- DH/Elgamal appears to be based on better mathematical theory.
- DH/Elgamal uses evanescent keys: an eavesdropper cannot recover the contents of a DH/Elgamal message once the session key is destroyed, whereas in RSA the private key decrypts all messages.
In 1994, the Elgamal algorithm was selected for use in the Digital Signature Standard and was designated the Digital Signature Algorithm for use in the US government. It was chosen because the algorithm generates a signature whose size depends on the message rather than on the size of the key, a problem with RSA keys (FIPS186). Elgamal also replaced RSA keys in version 5 of Pretty Good Privacy (PGP) and in all versions of GNU Privacy Guard (GPG), effectively replacing RSA for use in encrypted email. The RSA algorithm remains the algorithm of choice for SSL connections and public key infrastructures, as these require fewer key generations and RSA is computationally less intensive (Simpson).

4.3 Future of Asymmetric Cryptography


Public key infrastructures (PKI) and systems are being deployed more and more rapidly in corporate environments and government agencies. Given the level of security that public key encryption currently provides and the rate at which new technologies are implemented, public key encryption should be able to weather the coming years with increases in key length; however, as quantum computing becomes more and more of a reality, these technologies will most likely become obsolete. Until that time, companies should be able to deploy public key infrastructures to protect their networks, data, and email without worry (RK 2).


A future direction for public key cryptography is algorithms utilizing elliptic curves, another problem in number theory. The use of elliptic curves was first proposed in 1985 independently by V.S. Miller and N. Koblitz. They offer a few advantages over Diffie-Hellman/Elgamal and RSA in that they require smaller keys for the same level of security and are faster to compute than both RSA and DH-based systems (Simpson). Space has been reserved in the OpenPGP standard for the use of elliptic curves in public key algorithms (RFC2440).

4.4 Problems With Asymmetric Cryptography


So far, neither of the public key algorithms has been proven secure. The level of security depends on how much computing power the cryptanalyst has, how much time he is willing to spend breaking the ciphertext, and the length of the key used. A key length of 512 bits, once considered secure for a number of years, can now be cracked in a matter of months (or less). With increases in computational power and the advent and implementation of quantum computing, public key cryptosystems will become outdated (Ellis, 2). According to Ronald Rivest and Burt Kaliski in their paper "RSA Problem", an attacker could theoretically create algorithms and/or methods of key generation that would severely decrease the ability of RSA to work properly, or allow decryption of a message using only a ciphertext and a public key (RK, 3). This, along with papers like those of Odlyzko, calls into question the security of discrete logarithms and prime factorization in the near future and their ability to keep current data secure, a valid concern in cryptography (Odlyzko).


5. Politics in Cryptography
The history of the politics of cryptography started shortly after World War II and
continues to this day. After World War II, the government realized the importance of
cryptographic methods and put in place laws that gave it strict control over all
issues dealing with cryptography. However, with the coming of the Information Age and
widespread use of computers, there has been considerable tension between the government and
public developers of cryptographic algorithms. In particular, the export of cryptographic
algorithms and programs has been regulated fiercely by the government. The history of politics
and cryptography in the United States and the current legislation regarding cryptography will be
addressed in detail.

5.1 The National Security Agency


The most important government agency involved in the cryptographic political struggle is
the highly secretive National Security Agency. The creation of the National Security Agency was
authorized by President Harry Truman in June of 1952 and was officially established by the
National Security Council Intelligence Directive 9 on December 29, 1952 (Records of the
NSA). The NSA is under the Department of Defense, and it was created in order to perform signal
processing and code-breaking for ensuring the security of the United States (National Security
Agency). The agency is inherently involved with cryptography because it has to decode any
encoded messages it intercepts while monitoring communications. From World War II to the
1970s, most of the research and development in the field of cryptography in the United States
was done exclusively by the NSA.
As the Information Age approached and computer use became more mainstream, tensions


between the government and the private and academic sectors began to arise. Private companies
needed cryptography to securely transfer sensitive information, and academics were trying to
pursue research in cryptography. But the NSA didn't want any cryptographic techniques to be
developed that they could not break since it would interfere with their ability to monitor
communications.

5.2 Challenges to Government Control over Cryptography


The first publicized controversy came in the 1970s with the advent of the Data
Encryption Standard (DES) encryption algorithm. The National Bureau of Standards identified a
need for a government-wide standard for encrypting unclassified, sensitive material. The
National Bureau of Standards opened a competition to the public for anyone or any group who
could design an acceptably secure cryptographic algorithm. The winning algorithm was one
developed by IBM, and it was sent to the NSA for validation. Controversy ensued after the
algorithm was returned with two major alterations. First, the substitution boxes, or so-called s-boxes, in the algorithm had been changed, and second, the key size used in the algorithm had
been reduced from 128 bits to 56 bits. There was speculation that the NSA had added a trap-door
into the DES algorithm that would allow them to decode any message that was encoded using
DES. An investigation was opened into the case, and the United States Select Committee on Intelligence found that the NSA was free of any wrongdoing and that the changes had been made independently by IBM, with feedback from the NSA used only as recommendations
("NSRP: Cryptography").
In 1989, the Khufu and Khafre block ciphers were created by current Georgia Tech
professor Ralph Merkle while he was working at Xerox. Xerox sent a request to the NSA for permission to publish the paper Merkle had written about his research into the two encryption algorithms, and the NSA denied the request. However, while the paper was being reviewed, a copy was passed to John Gilmore, who made the paper publicly available on the sci.crypt newsgroup. Because Gilmore obtained and published the paper through legal means, the government took no legal action against him (Gilmore). This situation is evidence of how difficult it is for the NSA to enforce restrictions on the spread of information in a networked world.
In 1991, Senate Bill 266 was proposed, which would have required trap-doors to be added to networking equipment used in the private sector so that the government could monitor business
communications (S.266). Phil Zimmerman, the eventual creator of Pretty Good Privacy (PGP),
believed that the government was close to outlawing secure communications between private
citizens. He wrote the PGP software and released it publicly on the Internet before Congress
could create legislation that mandated trap-doors for the government be inserted in private
encryption systems. The government opened a criminal investigation into Zimmerman for
releasing the PGP software to the public without the government's permission. The government argued that the technology would weaken the government's ability to protect national security because rogue countries and terrorists could use the strong encryption to make their
communications unreadable. Zimmerman argued that developing and releasing the software was
freedom of speech and protected under the First Amendment. Eventually, in 1996, the
government dropped its case against Zimmerman under strong pressure from free speech
advocates and civil rights organizations. (Bender) This was a victory for a computing industry
that was trying to generate more secure forms of communication in networks and for web
applications such as banking websites that needed highly secure channels of communication.


5.3 Government Control Over Exporting Cryptography


In addition to limiting the development and disclosure of cryptographic algorithms
through the NSA, the government has also labeled cryptographic materials and ideas as
munitions that have restrictive export laws (United States Munitions List). Cryptography
exports are handled jointly by the Department of Commerce and the Department of State.
Because cryptography is considered a type of munitions, the State Department determines if
cryptographic materials are fit for export on a case-by-case basis. If the State Department defers
jurisdiction over a cryptographic export, then the Bureau of Industry and Security in the
Commerce Department determines whether or not cryptographic materials can be exported. In
the past, the government would not allow encryption with a key size larger than 40 bits to be
exported. (US Crypto. Export/Import Laws) Today, encryption that uses any key size can be
exported after a technical review from the Department of Commerce. ("Admin. Implements
Updated Encrypt. Export Policy")
In 1994, Phil Karn filed a lawsuit against the State Department to challenge defining
encryption as a type of munitions. The State Department had ruled that the book Applied
Cryptography by Bruce Schneier was exportable and protected by the First Amendment under
freedom of speech, even though it contained complete source code for several encryption
algorithms. But a floppy disk that came with the book and contained the exact same source code
that was in the book was ruled a type of munitions and could not be exported. In 2000, the Department of Commerce's Bureau of Export Administration relaxed the export laws to allow
publicly available encryption source code to be freely exportable. Karn then dropped the case
because under the new laws, the floppy disk with public encryption source code was not
considered a type of munitions. ("Detailed History of Applied Crypto. Case")


5.4 Current Political Status


Today, there are far fewer limitations on developing and releasing cryptographic materials than there were in the past. In 2000, the Department of Commerce's Bureau of
Export Administration relaxed the restrictions on exporting cryptography. Publicly available
encryption source code is now freely exportable. Any cryptographic algorithms developed by an
individual or company are also exportable, but they still require export licenses from the
government. Export of cryptography to rogue countries or terrorist organizations is still strictly
prohibited. ("Admin. Implements Updated Encrypt. Export Policy")
The struggle between the government and everyone else developing cryptography still
continues to this day. For a large part of the last 50 years, the NSA had a monopoly on
cryptography development and strictly limited public access and research into cryptographic
algorithms. The government labeled cryptography as a type of munitions, and thus enforced strict
laws governing the export of cryptography. Starting in the 1990s, the control the government had
over cryptography began to break down as the Internet allowed for the near-instantaneous spread of information to millions of people around the world. Today, the government has relaxed the
restraints over the distribution of cryptography, and publishing a paper on cryptography or
starting an open source project to create a new encryption algorithm are much easier to do than
they were in the past.

6. Quantum Cryptography
With the development of quantum computing, fundamental changes must be made in the way cryptography is looked at. To understand these changes and why they must take


place, one must first look at the physics principles behind both quantum computing and quantum
cryptography. Three quantum mechanical phenomena that are used uniquely in both of these are
the uncertainty principle, superposition, and entanglement. As this paper is more focused on the
cryptographic side of the issue, it will not go into full detail on these, merely providing a basic
overview.
The uncertainty principle dictates that there are certain pairs of related quantum properties for which the measurement of one will disturb the other, the classic example being momentum and position. Any measurement made in an attempt to determine the position of a particle will introduce some amount of uncertainty into its momentum. The more precise one measurement is, the more uncertainty is introduced into the other. This principle becomes a key factor in the detection of eavesdroppers in quantum cryptography.
Superposition refers to the fact that a particle exists as a superposition of multiple
quantum states. Only when this quantity is observed does the wave function describing the
particle collapse into only one of the states. Quantum computing can utilize this property to
carry out certain complex computations very quickly. While a traditional bit is in either the zero
or one state, a quantum bit, or qubit, can utilize superposition to exist in both states
simultaneously, allowing for previously complex calculations to be performed very quickly as
many possible solutions to problems can be analyzed at once.
Entanglement is the quantum mechanical phenomenon in which the quantum states of two or more particles are linked, even when they become spatially separated. For example, after two entangled particles are created, each particle has only a probability of being measured with a certain spin. When the measurement is made on the first particle, thereby forcing it into a single state, its entangled partner will then always be measured to have the opposite spin.


6.1 The Effect of Quantum Computing on Current Encryption Algorithms


As described previously in this paper, the RSA algorithm depends on the fact that determining the prime factors of a large number is a computationally complex and intense process. However, on a quantum computer exploiting superposition, Shor's algorithm can theoretically factor a number N in O((log N)^3) time, effectively breaking the encryption algorithm (Hey). Even the previously mentioned Elgamal algorithm, which was claimed to rest on a sounder mathematical problem, falls to the unique abilities of quantum computers, as Shor's algorithm is also able to quickly solve the discrete logarithm problem upon which that encryption algorithm is founded. Also, while not as significant an advantage, a quantum computing method known as Grover's searching algorithm has been shown to dramatically decrease the amount of time needed to brute-force a DES key, although this can be mitigated to some degree by doubling the key length (Hey). The main result of the possible failure of the RSA and DES algorithms is the need for new methods of key exchange and authentication. This is where quantum cryptography comes into play.

6.2 The BB84 Algorithm


One of the oldest and most thoroughly investigated quantum cryptography methods was proposed by C.H. Bennett and G. Brassard in 1984 and has come to be known as the BB84 algorithm. The algorithm is largely based on the uncertainty principle and the fact that any eavesdropper intercepting and measuring the quantum states of the particles being exchanged will also alter those states. The photons used here can be polarized either rectilinearly (horizontally or vertically) or diagonally (at 45° and 135°), with the two polarizations in each basis corresponding to a 0 or a 1 respectively. The algorithm works as follows:
1. The sender, Alice, chooses a random bit string and a random sequence of polarizations
2. She then sends the other user (Bob) a train of photons each representing one bit of the
string.
3. Bob randomly chooses to measure each arriving photon in either the rectilinear or the diagonal basis.
4. Bob tells Alice, over a public channel, which basis he used for each measurement.
5. Alice tells Bob which measurements were made in the correct basis.
6. Bob and Alice choose a certain number of bits to compare to check for tampering
(Bennett)
At this point Alice and Bob have successfully exchanged a key without fear of eavesdropping by a third party, Eve. Because of the uncertainty principle, attempting to measure in the wrong basis effectively randomizes the result. Because the polarizations being sent out are random, and any incorrect reading effectively destroys the information, any attempt at eavesdropping will not only be unsuccessful, with at best half the key being correctly read, but Bob and Alice would no longer share the same key due to the lost information, making the eavesdropper's presence known to both parties (Bennett).
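The sifting logic of the protocol can be simulated classically; the sketch below (an illustration, not real quantum hardware) models the fact that a measurement in the wrong basis yields a random bit:

```python
# Classical simulation of BB84 sifting: bits are kept only where Alice's and Bob's
# randomly chosen bases ('+' rectilinear, 'x' diagonal) happen to agree.
import random

n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases   = [random.choice("+x") for _ in range(n)]

# Same basis: Bob reads Alice's bit; wrong basis: he gets a random bit instead.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (never bits) and keep matching positions.
key_alice = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print(key_alice == key_bob)   # True when no eavesdropper has disturbed the photons
```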
The disadvantage of the BB84 method is that, while it is secure when only one photon is sent for each bit, current lasers often emit multiple photons per pulse, allowing Eve to intercept one without the other parties knowing. Therefore, "true on-demand single photon sources" are desired "in order to make QKD efficient and unconditionally secure" (Curcic). Similarly, there is a lack of "high quality single-photon detectors" (Kincade), meaning single-photon operation may not always be possible on either end of the line. Of course, BB84 is by no means the only QKD method to exist; one other method that highlights an alternative technique is described below.


6.3 Using Entanglement for QKD


In 1991, A. K. Ekert proposed a method of QKD that instead relies on the principle of entanglement, with some similarities to the BB84 algorithm. First, an entangled pair of polarized photons is generated by a trusted source; this can be a third party or even Alice herself. One photon of the pair is sent to Alice, the other to Bob. Each measures the polarization using a randomly chosen basis (rectilinear or diagonal, just as in BB84), and they announce to each other the bases used to measure each photon. If the same basis was used for a pair of entangled photons, then the resulting measurements will be correlated due to "spooky action" (Poppe). Just as before, should Eve try to read the photons in transit, she would only be able to guess the polarization correctly half the time, destroying the information the other half. Entanglement also provides additional security in that tests can be performed to check for entanglement, thereby preventing Eve from creating her own photons to send to Bob (Poppe).
Because the linked photons are generated singly, the problem described earlier of multiple photons per pulse allowing Eve to intercept one is eliminated. While there is an added security benefit in using entangled pairs, the overall cost of equipment for generating non-entangled single photons is significantly lower, making the small security trade-off worthwhile from a commercial perspective (Kincade).

6.4 Challenges and the Future


Quantum cryptography is still an emerging technology and has many hurdles to
overcome before it becomes widely usable. To start with, the entry cost in establishing a QKD


system is quite high. Dedicated fiber must be established between sites wishing to exchange
keys, and the equipment needed for photon generation and measurement can be quite expensive.
However, the technology has been steadily improving: while the first actual quantum key exchange was over a distance of only 30 cm, current experiments have been carried out over more than 150 km (Curcic).
The future of the technology appears to be very bright. The logic behind the algorithms has by now been thoroughly proven secure (Chau) and has even been put into test implementations. New algorithms using other quantum techniques are regularly being published, such as an implementation in wireless LANs (Nguyen), and as more progress is made in the field of quantum physics, more techniques will become available for secure cryptography.

7. Conclusion
This paper discussed the technical history and details of cryptography and cryptographic
systems. Section 2 detailed the history of cryptography before the advent and proliferation of
computers and general computing with a focus on the beginning of the twentieth century.
Section 3 approached the use of symmetric cryptography in computing, systems that require a
single key to encrypt and decrypt data. Section 4 explained the two main asymmetric
cryptographic algorithms, their uses, and touched on the future of public key technology.
Section 5 gave historic accounts of the differences between the government and public entities over the publishing and exportation of cryptographic software and information. Section 6 introduced the subject of quantum cryptography and the future effect of quantum computing on cryptography and cryptosystems. It would seem that cryptographic algorithms and applications are secure for the time being against modern cryptanalytic attacks; however, as they are all only computationally secure, their life span is limited. Cryptography as a field has a bright future, with new research and development prompting new algorithms and methods. Quantum computing, perhaps the next great step in computing, also provides the newest hopes for cryptography, creating the potential for new cryptographic methods and algorithms while obsolescing modern applications and algorithms at the same time. By looking at modern and past methods, cryptographers can look to the future with experience, creating better, more efficient algorithms without repeating the mistakes of the past.


References
Hinsley, Harry. "The Enigma of Ultra." History Today 43 (1993). EBSCOHost. Georgia Tech
Library, Metz. 16 July 2006. Keyword: Cryptography.
Kartalopoulos, Stamatios V. "A Primer on Cryptography in Communications." IEEE
Communications Magazine (2006): 146-151. EBSCOHost. Georgia Tech Library, Metz.
16 July 2006. Keyword: Cryptography.
Reeds, Jim. "Review of "the Code Book: the Evolution of Secrecy From Mary Queen of Scots to
Quantum Cryptography" by Simon Singh. Anchor Books." Rev. of The Code Book, by
Simon Singh. ACM SIGACT News June 2001: 6-11.
Zotos, Kostas, and Andreas Litke. Cryptography and Encryption. Dept. of Applied Informatics,
University of Macedonia. 16 July 2006
<http://arxiv.org/ftp/math/papers/0510/0510057.pdf>.
R. Anderson, E. Biham, and L. Knudsen, Serpent: A Proposal for the Advanced Encryption
Standard, First Advanced Encryption Standard (AES) Conference, Ventura, CA, 1998.
Nicolas Courtois, Josef Pieprzyk, "Cryptanalysis of Block Ciphers with Overdefined Systems of
Equations". The Association for Computer Machinery. Lecture Notes In Computer
Science, Vol. 2501, pp. 267-287, 2002.
AJ Elbirt, C. Paar. An FPGA Implementation and Performance Evaluation of the Serpent Block
Cipher. The Association for Computer Machinery. International Symposium on Field
Programmable Gate Arrays. Pg 33-40. 2000.
http://portal.acm.org/citation.cfm?id=329176&coll=portal&dl=ACM
B. Schneier, J. Kelsey, D. Whiting, D. Wagner, C. Hall, and N. Ferguson. Twofish: A 128-Bit
Block Cipher.
Stefan Lucks. The Saturation Attack - A Bait for Twofish. The Association for Computer
Machinery. Lecture Notes in Computer Science, Vol. 2355, pp. 1-15, 2001.
P. K. Mohapatra. Public Key Cryptography. The Association for Computer Machinery.
http://www.acm.org/crossroads/xrds7-1/crypto.html
"New Directions in Cryptography." Diffie, Whitfield and Hellman, Martin. IEEE Transactions
on Information Theory Vol. IT-22. 6 Nov. 1976.
"A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms." Elgamal,
Taher. IEEE Transactions on Information Theory Vol. IT-31. 4 July 1985.
"The Handbook of Applied Cryptography." A. Menezes, P. van Oorschot, S. Vanstone. CRC


Press, 1996.
"The History of Non-Secret Encryption." J. H. Ellis. Cryptologia. July 1999.
"DIGITAL SIGNATURE STANDARD (DSS)", NIST Federal Information ProcessingStandards
Publication 186-1, Dec 1998.
J.Callas, L.Donnerhacke, H.Finney, R.Thayer, "OpenPGP Message Format", RFC 2440, Nov
1998.
"PGP DH vs. RSA FAQ." Simpson, Sam. 1999 <http://www.scramdisk.clara.net/pgpfaq.html>
A.M.Odlyzko, "The Future of Integer Factorization", RSA CryptoBytes, Volume 1, Number 2,
Summer 1995.
A.M.Odlyzko, "Discrete logarithms: The past and future", 5th July 1999.
"Records of the National Security Agency/Central Security Service." The National Archives. 17
July 2006 <http://www.archives.gov/research/guide-fed-records/groups/457.html#457.1>.
"National Security Agency." University of San Diego. Dept. of History, U. of San Diego. 18 July
2006 <http://history.sandiego.edu/gen/20th/nsa.html>.
"NSRP: Cryptography." U. of Texas. School of Information, U. of Texas. 18 July 2006
<http://www.gslis.utexas.edu/~netsec/crypto.html>.
Gilmore, John. "Merkle's "a Software Encryption Function" Now Published and Available."
sci.crypt. 13 July 1989. 18 July 2006
<http://groups.google.com/group/sci.crypt/msg/e86ff3c3089f97c8>.
"S.266." Library of Congress. 1991. 18 July 2006 <http://thomas.loc.gov/cgi-bin/query/z?
c102:S.266.IS:>.
Bender, Adam. Carnegie Mellon. Carnegie Mellon. 18 July 2006
<http://www.andrew.cmu.edu/user/abender/pgp/history.html>.
United States. Department of State. The United States Munitions List. 18 July 2006
<https://www.demil.osd.mil/documents/app1_97.pdf>.
"6.4 United States Cryptography Export/Import Laws." RSA Security. 18 July 2006
<http://www.rsasecurity.com/rsalabs/node.asp?id=2327>.
"Detailed History of Applied Cryptography Case." Qualcomm. 24 Feb. 1999. 18 July 2006
<http://people.qualcomm.com/karn/export/history.html>.
"Administration Implements Updated Encryption Export Policy." Center for Democracy and


Technology. 12 Jan. 2000. Department of Commerce. 18 July 2006


<http://www.cdt.org/crypto/admin/000112commercefactsheet.shtml>.
K. Kincade, "Bob and Alice Beef Up Security", Laser Focus World, v 42, n 5, May 2006, pp. 109-13.
A. Poppe, A. Fedrizzi, H. Hübel, R. Ursin, A. Zeilinger, "Entangled State Quantum Key Distribution and Teleportation", 31st European Conference on Optical Communication, 2005, pt. 5, p. 61, vol. 5.
C. Bennett and G. Brassard, "Quantum cryptography: Public key distribution and coin tossing", IEEE International Conference on Computers, Systems, and Signal Processing, IEEE Press, Los Alamitos, 1984.
T. Curcic et al., "Quantum Networks: From Quantum Cryptography to Quantum Architecture", Computer Communication Review, v 34, n 5, October 2004, pp. 3-8.
T. Nguyen, M. Sfaxi, S. Ghernouti, Integration of Quantum Cryptography in 802.11 Networks
Proceedings. The First International Conference on Availability, Reliability and Security,
2006, 8 pp.
H. Chau, Unconditionally Secure Key Distribution in Higher Dimensions by Depolarization
IEEE Transactions on Information Theory, v 51, n 4, April, 2005, p 1451-1468
T. Hey, Quantum Computing: An Introduction Computing & Control Engineering Journal, v.
10 , issue 3, June 1999, pp: 105 - 112


Appendix A: A Vigenère Table


Appendix B-1: Serpent Algorithm Diagram


Appendix B-2: Twofish

