Group Three
Ahmed Abdalla, Troy Brant, Gabe Campbell, Chansereyratana Lim, Saudamini Zarapkar
CS4235 Information Security
Georgia Institute of Technology
Table Of Contents
1. Introduction.......................................................................................................2
2. A Brief History of Cryptography.................................................................2
2.1 Early Cryptography..........................................................................................3
2.2 Entering the Modern Era..................................................................................5
3. Symmetric Encryption...................................................................................6
3.1 The Serpent Algorithm.....................................................................................7
3.2 The TwoFish Algorithm...................................................................................9
4. Asymmetric Cryptography............................................................................11
5. Politics in Cryptography................................................................................17
6. Quantum Cryptography................................................................................. 21
7. Conclusion........................................................................................................ 26
Appendix A.............................................................................................31
Appendix B.............................................................................................32
Abstract
In this paper we describe the history and evolution of cryptography starting from the beginning
of the 20th century and continuing into the current day. Specifically, the following five topics are
addressed: the cryptography used from 1900 until the end of World War II, the history of the
politics involved with government control over cryptography, the history and current status of
asymmetric cryptography, the history and current status of symmetric cryptography, and the
future of the field of cryptography using quantum physics. By analyzing these areas, we hope to
provide a more complete picture of where the field has been and where it is headed.
1. Introduction
Cryptography is a subject that has been studied and applied since ancient Roman times,
and research into better encryption methods continues to this day. Cryptography is the art of
encoding and decoding messages so that messages can be securely transmitted from a sender to a
receiver without fear of an outside party intercepting and reading or altering the message's
contents. The purpose of this paper is to describe the history and evolution of cryptography
starting from the beginning of the 20th century and continuing into the present day. Specifically,
the following five topics will be addressed: the cryptography used from 1900 until the end of
World War II, the history of the politics involved with government control over cryptography, the
history and current status of asymmetric cryptography, the history and current status of
symmetric cryptography, and the future of the field of cryptography using quantum physics.
The cipher machines developed in the early twentieth century provided a much more efficient method of encrypting messages and subsequently led to the introduction of cryptography into the emerging field of computing. The fundamental objective
of cryptography is to enable two people, usually referred to as Alice and Bob, to communicate
over an insecure channel in such a way that an opponent, Oscar, cannot understand what is being
said (Zotos). A cipher is an algorithm for performing encryption (and, in reverse, decryption); it is in effect a series of well-defined steps that can be followed as a procedure. The original information is known as the plaintext, and the encrypted message is known as the ciphertext.
In the Vigenère cipher, repeated plaintext sequences such as CRYPTO (ciphered into CSASTP) produce repeated ciphertext sequences whenever the key alignment is exactly the same (Reeds). Thus one can tabulate the distances between these repeated sequences to deduce the key length and then substitute corresponding letters to decrypt the cipher. This test of Babbage's went on to assist several British military campaigns.
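The repeated-sequence observation above can be sketched in code. In this toy sketch (names and the 4-letter key ABCD are illustrative assumptions, not from the original analysis), the plaintext CRYPTO appears twice at positions that are a multiple of the key length apart, so both occurrences encrypt to the same ciphertext sequence CSASTP, and the distance between the repeats betrays the key length:

```python
def vigenere_encrypt(plaintext: str, key: str) -> str:
    """Vigenere-encrypt an uppercase A-Z string with an uppercase A-Z key."""
    out = []
    for i, ch in enumerate(plaintext):
        shift = ord(key[i % len(key)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

def repeated_sequence_distances(ciphertext: str, length: int = 6) -> list:
    """Distances between repeated ciphertext sequences of the given length."""
    first_seen = {}
    distances = []
    for i in range(len(ciphertext) - length + 1):
        seq = ciphertext[i:i + length]
        if seq in first_seen:
            distances.append(i - first_seen[seq])
        else:
            first_seen[seq] = i
    return distances

# "CRYPTO" appears at positions 0 and 8; with the 4-letter key "ABCD" both
# occurrences line up with the key and encrypt to the same string "CSASTP".
ct = vigenere_encrypt("CRYPTOXXCRYPTO", "ABCD")
```

The distance between the repeats (8) is a multiple of the key length (4), which is exactly the clue Babbage's test exploits.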
Claude Shannon, a fellow engineer at Bell Labs, later proved that the one-time pad (OTP) was unbreakable.
Shannon would also prove to be the father of modern mathematically based cryptology.
His work on the theory of communication and his other studies in information theory provided a sound basis for theoretical cryptology and cryptanalysis. With the advent of his work, the end of WWII, and the beginning of the Cold War, cryptography slipped away from the public sector and became strictly a governmental instrument. Major advances in cryptography would not be seen until the mid-1970s, with the advent of the Data Encryption Standard.
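The one-time pad Shannon analyzed can be illustrated with a minimal sketch (the byte-wise XOR formulation here is a modern convenience, not the historical pencil-and-paper pad): the same XOR operation both encrypts and decrypts, and the scheme is only unbreakable if the pad is truly random, as long as the message, and never reused.

```python
import secrets

def otp_apply(data: bytes, pad: bytes) -> bytes:
    # XOR each message byte with the corresponding pad byte. The pad must be
    # truly random, at least message-length, and used exactly once; under
    # those conditions the ciphertext reveals nothing about the plaintext.
    assert len(pad) >= len(data)
    return bytes(m ^ p for m, p in zip(data, pad))

message = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(message))   # fresh random pad for this message
ciphertext = otp_apply(message, pad)
recovered = otp_apply(ciphertext, pad)    # decryption is the same operation
```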
3. Symmetric Encryption
With the invention of calculators and computers in the middle of the twentieth century, older codes were becoming easier and easier to break. In 1975, IBM submitted the DES encryption algorithm to the government in order to create a standard method of encryption between the government and the private sector. After being reviewed by the NSA, it was accepted as the first standard encryption algorithm for the US government. DES uses 64-bit keys, but 8 of those bits do not affect the encryption; they are used only as parity check digits, giving an effective key length of 56 bits. Soon after its release, DES was shown to be insecure because of the shortness of its key and the increasing power of computers. Later, double DES and triple DES were proposed. Double DES used two keys and was intended to give an effective key length double that of DES; however, because a meet-in-the-middle attack lets an attacker work forward from the plaintext and backward from the ciphertext simultaneously, it did not meaningfully improve the security of the algorithm. Triple DES, on the other hand, uses two keys in three steps. This algorithm has better security than both DES and double DES, but its effective key length is only doubled. Worried that this algorithm could be broken easily by more powerful computers, the National Institute of Standards and Technology (NIST) called for new algorithms to be submitted and chosen as a new standard.
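Why doubling the key fails to double the work can be demonstrated on a deliberately tiny stand-in cipher (this is an invented 8-bit toy, not DES): with one known plaintext/ciphertext pair, an attacker builds a table of all single encryptions and meets the decryptions in the middle, testing roughly 2 x 2^k key candidates instead of 2^(2k) pairs.

```python
KEY_BITS = 8  # toy key size; real DES uses 56 effective bits

def toy_encrypt(block: int, key: int) -> int:
    # A stand-in single-key cipher: XOR with the key, then rotate left 3 bits.
    x = (block ^ key) & 0xFF
    return ((x << 3) | (x >> 5)) & 0xFF

def toy_decrypt(block: int, key: int) -> int:
    x = ((block >> 3) | (block << 5)) & 0xFF   # rotate right 3 bits
    return (x ^ key) & 0xFF

def meet_in_the_middle(plain: int, cipher: int):
    # Forward table: 2^8 single encryptions instead of 2^16 double-key trials.
    forward = {}
    for k1 in range(2 ** KEY_BITS):
        forward.setdefault(toy_encrypt(plain, k1), []).append(k1)
    # Backward pass: decrypt once per k2 and look for a matching middle value.
    for k2 in range(2 ** KEY_BITS):
        middle = toy_decrypt(cipher, k2)
        if middle in forward:
            return forward[middle][0], k2   # a consistent candidate pair
    return None

plain = 0x42
cipher = toy_encrypt(toy_encrypt(plain, 0x3C), 0xA5)   # "double" encryption
found = meet_in_the_middle(plain, cipher)
```

Note that the recovered pair need only be functionally equivalent to the original keys; in practice a second known pair is used to weed out false candidates.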
In response to NIST's call, many cryptographers submitted encryption algorithms to the contest. Only one of them, Rijndael, was selected to become the new encryption standard, the Advanced Encryption Standard (AES). However, many popular algorithms that did not win the contest are still widely used. Two of these algorithms, Serpent and Twofish, are investigated in further detail below.
The Rijndael algorithm was chosen to become the Advanced Encryption Standard. This algorithm differs from DES in both key length and algorithm complexity. AES uses a repeating cycle of 9, 11, or 13 full rounds (plus a modified final round) for 128-, 192-, and 256-bit keys respectively. In each round it executes four steps: byte substitution, row shifting, column mixing, and round-key addition. Rijndael was chosen over the other algorithms because of its speed in both software and hardware and its effectiveness in encrypting data.
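One of the four round steps is simple enough to sketch in full. The snippet below implements only the row-shifting step on AES's 4 x 4 byte state (row r is rotated left by r positions) together with its inverse; it is a minimal illustration of a single round operation, not a full AES implementation.

```python
# AES state is a 4x4 grid of bytes, state[row][col].

def shift_rows(state):
    # Row r is rotated left by r positions (row 0 is unchanged).
    return [row[r:] + row[:r] for r, row in enumerate(state)]

def inv_shift_rows(state):
    # Inverse step: rotate each row right by r positions.
    return [row[-r:] + row[:-r] if r else row for r, row in enumerate(state)]

state = [[r * 4 + c for c in range(4)] for r in range(4)]
shifted = shift_rows(state)
```

In the real cipher this step is interleaved with byte substitution, column mixing, and round-key addition, and the final round omits the column-mixing step.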
3.1 The Serpent Algorithm
Inspired by the Rivest Cipher 4 (RC4) algorithm, each Serpent S-box is generated using a 32 x 16 matrix. The matrix is initialized with the 32 rows of the DES S-boxes and transformed by wrapping the entries in the r-th row depending on the value of the entries in the (r+1)-st row and on an initial string representing a key. If the resulting array has the desired differential and linear properties, it is saved as a Serpent S-box. This procedure is repeated until eight S-boxes have been generated. The algorithm then runs three operations in each of thirty-two rounds: a bit-wise XOR with the 128-bit round key Kr, substitution via thirty-two parallel copies of one of the eight S-boxes, and data mixing via a linear transformation (Anderson). These operations are performed in each of the thirty-two rounds with the exception of the last round, in which the linear transformation is replaced with a bit-wise XOR with a final 128-bit key. The algorithm can be described in equation form:
- B0 = IP(P), where B0 is the input to the first round and IP is the initial permutation
- Bi+1 = Ri(Bi), where Bi+1 is the output of round i
  Ri(Y) = L(Si(Y XOR Ki)) for i = 0, ..., 30
  R31(Y) = S31(Y XOR K31) XOR K32 for the final round
  Si is the application of S-box S(i mod 8) thirty-two times in parallel
  L is the linear transformation
- C = FP(B32), where C is the ciphertext and FP is the final permutation, which is the inverse of the initial permutation.
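The round schedule above can be sketched structurally. In this sketch the real S-boxes, linear transformation, and permutations are replaced by caller-supplied placeholders (an assumption for illustration, so this is not actual Serpent); it shows only the S(i mod 8) indexing and the special final round. A useful check: with identity placeholders, the whole cipher collapses to XORing the block with all 33 round keys.

```python
def serpent_round_schedule(block, round_keys, sboxes, linear):
    """Structural sketch of Serpent's 32 rounds.

    round_keys: K0..K32 (33 keys); sboxes: S0..S7, round i uses S(i mod 8).
    """
    b = block
    for i in range(31):                       # rounds 0..30: L(S(B ^ Ki))
        b = linear(sboxes[i % 8](b ^ round_keys[i]))
    # Round 31: the linear transformation is replaced by XOR with K32.
    b = sboxes[31 % 8](b ^ round_keys[31]) ^ round_keys[32]
    return b

identity = lambda x: x
keys = list(range(1, 34))                     # 33 stand-in round keys
out = serpent_round_schedule(0b1010, keys, [identity] * 8, identity)
```

With the identity placeholders, `out` equals the block XORed with every round key, confirming that the skeleton wires the keys through all 33 positions.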
When Serpent was proposed to the National Institute of Standards and Technology, the probability of the best successful attack was estimated to be no higher than 2^-120. Nonetheless, in 2002, Courtois and Pieprzyk observed that Serpent could be expressed as a system of quadratic equations. In their paper, "Cryptanalysis of Block Ciphers with Overdefined Systems of Equations," they concluded that Serpent with key lengths of 192 and 256 bits can be broken using the eXtended Sparse Linearization (XSL) algorithm with one or two known plaintexts. However, the process still takes about 2^200 operations (Courtois).
In terms of hardware implementation, Elbirt and Paar evaluated Serpent and concluded that the algorithm can be implemented to run fast on the register-rich architecture of the chosen Xilinx XCV1000 FPGA. Their implementation could perform encryption at a rate of more than 4 Gbit/s (Elbirt).
3.2 The TwoFish Algorithm
Twofish operates on four 32-bit words. In each round, the two left words are fed through the round function, and its two outputs are XORed into the two right words (one of which is rotated left by one bit before the XOR and the other rotated right by one bit afterwards). The left and right halves are then swapped for the next round. After all the rounds are executed, the swap of the last round is reversed, and the four words are XORed with four more key words to produce the ciphertext (see Figure 2 for a graphical display of the algorithm).
In mathematical terms, the algorithm works as follows: the 16 plaintext bytes p0, ..., p15 are split into four words P0, ..., P3 of 32 bits each using little-endian conversion:
Pi = p(4i) + p(4i+1)*2^8 + p(4i+2)*2^16 + p(4i+3)*2^24, for i = 0, ..., 3 (Eq. 1)
Then, these words are XORed with four words of the expanded key:
R(0,i) = Pi XOR Ki, for i = 0, ..., 3 (Eq. 2)
The whitened words are used as input to the first round. In each round, the third word is XORed with the first output word of the round function and then rotated right by one bit. Conversely, the fourth word is rotated left by one bit and then XORed with the second output word of the round function. This process is repeated 16 times. The final outputs are swapped to undo the swap of the last round and then XORed with four more words of the expanded key:
Ci = R(16, (i+2) mod 4) XOR K(i+4), for i = 0, ..., 3 (Eq. 3)
The ciphertext is then written as 16 bytes c0, ..., c15 using the same little-endian conversion as used for the plaintext:
ci = (C(i div 4) div 2^(8(i mod 4))) mod 2^8, for i = 0, ..., 15 (Eq. 4)
When this algorithm was submitted to the contest, there was no attack against it more efficient than brute force. The most efficient attack against Twofish with a 128-bit key had a complexity of 2^128, the most efficient attack against a 192-bit key had a complexity of 2^192, and the most efficient attack against a 256-bit key had a complexity of 2^256. Since then, there have been many attempts to find a better way to break this algorithm; however, no one has found a way to break it faster than brute force. In 2001, Stefan Lucks attacked Twofish using several methods, such as key-finding and distinguisher attacks. He discovered that the key-finding attack was only two to four times faster than an exhaustive brute-force search, and the distinguisher attack had a probability of success of only 25%, with 2^32 to 2^127 chosen plaintexts. Moreover, these attacks only break one half of the Twofish ciphertexts because of the cipher's one-bit rotation (Schneier).
No matter how good or widely used Serpent and Twofish are, they still suffer from key distribution/exchange problems and key management disadvantages (Lucks). Finally, it is only a matter of time before these algorithms can be broken using more powerful computers and techniques (Courtois).
4. Asymmetric Cryptography
The idea of asymmetric (or public key) cryptography was first published in 1976 by Whitfield Diffie and Martin Hellman in their paper "New Directions in Cryptography" (Menezes, 2). In this paper, Diffie and Hellman addressed the dependence of existing cryptographic algorithms on secure channels of communication. While their work proposed the theory behind asymmetric cryptographic algorithms, it did not provide a method of implementation. It was not until Rivest, Shamir, and Adleman created the RSA algorithm in 1978 that an algorithm existed which could make use of the technique proposed by Diffie and Hellman. The RSA algorithm is based upon the difficulty of factoring large numbers. In 1984, the Elgamal algorithm was proposed, which included the functionality to perform the Diffie-Hellman (DH) key exchange (as described in their paper) and was based on the discrete logarithm problem, considered a sounder mathematical problem than prime factorization (Menezes, 6). This section will focus primarily on these two main public key algorithms and their implementations.
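The Diffie-Hellman exchange mentioned above can be sketched with deliberately tiny numbers (the values p = 23, g = 5 and the private exponents are illustrative toys; real deployments use primes thousands of bits long). Both parties publish only g^a and g^b, yet derive the same shared secret g^(ab) mod p, while an eavesdropper faces the discrete logarithm problem.

```python
# Public parameters agreed over the open channel (toy-sized for illustration).
p, g = 23, 5

# Private exponents, never transmitted.
a, b = 6, 15

A = pow(g, a, p)              # Alice sends A = g^a mod p
B = pow(g, b, p)              # Bob sends B = g^b mod p

shared_alice = pow(B, a, p)   # Alice computes (g^b)^a mod p
shared_bob = pow(A, b, p)     # Bob computes (g^a)^b mod p
```

Both sides arrive at the same value because exponentiation commutes: (g^b)^a = (g^a)^b = g^(ab) mod p.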
The RSA encryption operation can be written as:
C = M^e mod N (Eq. 5)
where C is the ciphertext, M^e is the plaintext M raised to the public exponent e, and mod N indicates the modulus operation with N = PQ, where P and Q are very large secret primes. Given only N, it is assumed to be computationally difficult to recover P and Q, making this algorithm dependent on the difficulty of factoring large numbers.
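Textbook RSA can be sketched with toy primes (the values below are the classic small worked example; real keys use primes hundreds of digits long plus message padding, so this illustrates only the bare C = M^e mod N relationship):

```python
P, Q = 61, 53                  # secret primes (toy-sized)
N = P * Q                      # public modulus: 3233
e = 17                         # public exponent
phi = (P - 1) * (Q - 1)        # 3120; kept secret along with P and Q
d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

M = 65                         # plaintext encoded as a number smaller than N
C = pow(M, e, N)               # encryption: C = M^e mod N
recovered = pow(C, d, N)       # decryption: M = C^d mod N
```

An attacker who could factor N into P and Q could recompute phi and d, which is exactly why RSA's security rests on the hardness of factoring.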
The Elgamal encryption operation can be written as:
C = M * y^k mod p (Eq. 6)
where C is the ciphertext, M is the plaintext, y is the recipient's public key, k is a secret random session value unique to each message, and p is a large public prime modulus. Recovering k or the private key from the ciphertext requires solving a discrete logarithm, which is computationally difficult for large p. Because of the method with which this algorithm implements the keys, the ciphertext is twice as long as the plaintext.
Message expansion: DH/Elgamal creates ciphertext that is longer than the message. This is not generally a problem, as the algorithm is typically used to transfer session keys.
Signature strength: DH/Elgamal uses smaller keys, which may not be large enough to guarantee security for a sufficient period of time.
Need for good randomness: DH/Elgamal requires a random value which must be unique and unpredictable for each message. If an attacker recovers k, or if two messages are encrypted with the same k, the private key may be obtained.
No patent/copyright issues: DH/Elgamal are free and open algorithms, whereas RSA Labs must license RSA for use in the US and Canada.
RSA is forgeable: A malicious user could generate fake data that is easily factorable. This is difficult to check without access to the private key.
RSA key generation requires a lot of computation: it is not well suited to systems that must generate keys frequently.
DH/Elgamal uses evanescent keys: An eavesdropper cannot recover the contents of a DH/Elgamal message once the session key is destroyed, whereas in RSA the private key decrypts all past messages.
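The Elgamal scheme and its message expansion can be sketched with toy numbers (the prime p = 467, generator g = 2, and private key below are illustrative assumptions; real deployments use large primes). Note that the ciphertext is a pair (c1, c2), twice the size of the plaintext, and that each message draws a fresh random k, whose reuse would expose the private key as noted above.

```python
import secrets

# Recipient's key pair: publishes (p, g, y), keeps x secret (toy-sized).
p, g = 467, 2
x = 127                            # private key
y = pow(g, x, p)                   # public key: y = g^x mod p

M = 100                            # plaintext encoded as a number below p
k = secrets.randbelow(p - 2) + 1   # fresh session value, 1 <= k <= p-2

c1 = pow(g, k, p)                  # first component: g^k mod p
c2 = (M * pow(y, k, p)) % p        # second component: M * y^k mod p

# Decryption: the recipient computes c1^x = y^k and divides it out
# (multiplying by c1^(p-1-x), the modular inverse, via Fermat's little theorem).
recovered = (c2 * pow(c1, p - 1 - x, p)) % p
```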
In 1994, the Elgamal algorithm was agreed upon for use in the Digital Signature Standard and was designated the Digital Signature Algorithm (DSA) for use in the US government. This was decided because the algorithm generates a signature whose size depends on the message rather than on the size of the key, a problem with RSA keys (FIPS186). Elgamal also replaced RSA keys in version 5 of Pretty Good Privacy (PGP) and all versions of GNU Privacy Guard (GPG), effectively replacing RSA for use in encrypted email. The RSA algorithm remains the algorithm of choice for SSL connections and public key infrastructures, as these require fewer key generations and RSA is computationally less intensive (Simpson).
5. Politics in Cryptography
The history of the politics of cryptography started shortly after World War II and continues to this day. After World War II, the government realized the importance of cryptographic methods and put in place laws that gave it strict control over all issues dealing with cryptography. However, with the coming of the Information Age and the widespread use of computers, there has been considerable tension between the government and public developers of cryptographic algorithms. In particular, the export of cryptographic algorithms and programs has been fiercely regulated by the government. The history of politics and cryptography in the United States and the current legislation regarding cryptography are addressed in detail below.
As cryptography spread beyond the military, tension between the government and the private and academic sectors began to arise. Private companies needed cryptography to securely transfer sensitive information, and academics were trying to pursue research in cryptography. But the NSA did not want any cryptographic techniques to be developed that it could not break, since that would interfere with its ability to monitor communications.
In 1989, Xerox sought permission to publish a paper Merkle had written about his research into two new encryption algorithms, and the NSA denied the request. However, while the paper was being reviewed, a copy was passed to John Gilmore, who made the paper publicly available on the sci.crypt newsgroup. Because Gilmore published it using legal methods, the government took no legal action against him (Gilmore). This situation is evidence of how difficult it is for the NSA to enforce restrictions on the spread of information in a networked world.
In 1991, Senate Bill 266 was proposed, which would have required trap-doors to be added to networking equipment used in the private sector so that the government could monitor business communications (S.266). Phil Zimmermann, the eventual creator of Pretty Good Privacy (PGP), believed that the government was close to outlawing secure communications between private citizens. He wrote the PGP software and released it publicly on the Internet before Congress could create legislation mandating that government trap-doors be inserted in private encryption systems. The government opened a criminal investigation into Zimmermann for releasing the PGP software to the public without the government's permission. The government argued that the technology would weaken its ability to protect national security, because rogue countries and terrorists could use the strong encryption to make their communications unreadable. Zimmermann argued that developing and releasing the software was an exercise of free speech protected under the First Amendment. Eventually, in 1996, the government dropped its case against Zimmermann under strong pressure from free speech advocates and civil rights organizations (Bender). This was a victory for a computing industry that was trying to generate more secure forms of communication in networks and for web applications, such as banking websites, that needed highly secure channels of communication.
6. Quantum Cryptography
With the development of quantum computing, fundamental changes must be made in the way cryptography is approached. To understand these changes and why they must take place, one must first look at the physical principles behind both quantum computing and quantum cryptography. Three quantum mechanical phenomena that are used uniquely in both are the uncertainty principle, superposition, and entanglement. As this paper is focused on the cryptographic side of the issue, it will not go into full detail on these, merely providing a basic overview.
The uncertainty principle dictates that there are certain pairs of related quantum properties for which the measurement of one will disturb the other, the classic example being momentum and position. Any measurement made attempting to determine the position of a
particle will introduce some amount of uncertainty into the momentum. The more precise one
measurement is, the more uncertainty is introduced to the other. This principle will become a
key factor in the detection of eavesdroppers in quantum cryptography.
Superposition refers to the fact that a particle can exist in a combination of multiple quantum states at once. Only when the particle is observed does the wave function describing it collapse into a single state. Quantum computing can utilize this property to
carry out certain complex computations very quickly. While a traditional bit is in either the zero
or one state, a quantum bit, or qubit, can utilize superposition to exist in both states
simultaneously, allowing for previously complex calculations to be performed very quickly as
many possible solutions to problems can be analyzed at once.
Entanglement is the quantum mechanical phenomenon in which the quantum states of two or more particles are linked, even when the particles become spatially separated. For example, after two entangled particles are created, each particle has only a probability of having a certain spin. When a measurement is made on the first particle, thereby forcing it into a single state, its entangled partner will then always be measured to have the opposite spin.
The first quantum key distribution (QKD) protocol, BB84, proposed by Bennett and Brassard in 1984, works as follows:
1. The sender, Alice, chooses a random bit string and a random sequence of polarizations.
2. She then sends the other user (Bob) a train of photons each representing one bit of the
string.
3. Bob randomly chooses to measure each arriving photon rectilinearly or diagonally.
4. Bob tells Alice the polarizations he used for measurement via a public channel.
5. Alice tells Bob which measurements were correct.
6. Bob and Alice choose a certain number of bits to compare in order to check for tampering (Bennett)
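The sifting steps above can be sketched as a toy classical simulation (an illustrative assumption: the photon physics is replaced by random sampling, with a wrong-basis measurement yielding a random bit, mirroring the uncertainty-principle argument; this is not a security proof):

```python
import random

def bb84_sift(n_photons: int, seed: int = 42):
    rng = random.Random(seed)
    # Step 1: Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
    alice_bits = [rng.randrange(2) for _ in range(n_photons)]
    alice_bases = [rng.randrange(2) for _ in range(n_photons)]
    # Steps 2-3: Bob measures each "photon" in a randomly chosen basis;
    # measuring in the wrong basis yields a random result.
    bob_bases = [rng.randrange(2) for _ in range(n_photons)]
    bob_bits = [
        bit if a == b else rng.randrange(2)
        for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
    ]
    # Steps 4-5: bases are compared publicly; only matching positions are kept.
    alice_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
    bob_key = [bit for bit, a, b in zip(bob_bits, alice_bases, bob_bases) if a == b]
    return alice_key, bob_key

alice_key, bob_key = bb84_sift(64)
```

On average half the positions survive sifting; step 6 (sacrificing some bits to detect tampering) would then compare a random subset of the two keys.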
At this point, Alice and Bob have successfully exchanged a key without fear of eavesdropping by a third party, Eve. Because of the uncertainty principle, attempting to measure in one polarization basis effectively randomizes the other. Because the polarizations being sent out are random, and any incorrect reading effectively destroys the information, any attempt at eavesdropping will not only be unsuccessful, with at best half the key being correctly found, but will also leave Bob and Alice without matching keys due to the lost information, making the eavesdropper's presence known to both parties (Bennett).
The disadvantage of the BB84 method is that while it is secure when only one photon is sent for each bit, current lasers often send multiple photons, allowing Eve to intercept one without the other parties knowing. Therefore, true on-demand single-photon sources are desired in order to make QKD efficient and unconditionally secure (Curcic). Similarly, there is a lack of high-quality single-photon detectors (Kincade), meaning reliably handling a single photon may not always be possible on either end of the line. Of course, BB84 is by no means the only QKD method in existence.
The cost of deploying a quantum key distribution system is quite high. Dedicated fiber must be established between sites wishing to exchange keys, and the equipment needed for photon generation and measurement can be quite expensive. However, the technology has been steadily improving: while the first actual quantum key exchange was over a distance of only 30 cm, current experiments have been carried out over more than 150 km (Curcic).
The future of the technology appears to be very bright. The logic behind the algorithms has by now been thoroughly proven secure (Chau) and has even been put into test implementations. New algorithms using other quantum techniques are consistently being published, such as implementations in wireless LANs (Nguyen), and as more progress is made in the field of quantum physics, more techniques will become available for secure cryptography.
7. Conclusion
This paper discussed the technical history and details of cryptography and cryptographic systems. Section 2 detailed the history of cryptography before the advent and proliferation of computers and general computing, with a focus on the beginning of the twentieth century. Section 3 examined the use of symmetric cryptography in computing: systems that require a single key to encrypt and decrypt data. Section 4 explained the two main asymmetric cryptographic algorithms and their uses, and touched on the future of public key technology. Section 5 gave historic accounts of the differences between the government and public entities over the publishing and exportation of cryptographic software and information. Section 6 introduced the subject of quantum cryptography and the impact of quantum computing on cryptography and cryptosystems. It would seem that cryptographic algorithms and applications are secure for the time being against modern cryptanalytic attacks; however, as they are all only computationally secure, their life spans are limited. Cryptography as a field has a bright future, with new research and development prompting new algorithms and methods. Quantum computing, perhaps the next great step in computing, also provides the newest hopes for cryptography, creating the potential for new cryptographic methods and algorithms while rendering modern applications and algorithms obsolete. By looking at modern and past methods, cryptographers can look to the future with experience, creating better, more efficient algorithms without recreating the mistakes of the past.
References
Hinsley, Harry. "The Enigma of Ultra." History Today 43 (1993). EBSCOHost. Georgia Tech
Library, Metz. 16 July 2006. Keyword: Cryptography.
Kartalopoulos, Stamatios V. "A Primer on Cryptography in Communications." IEEE
Communications Magazine (2006): 146-151. EBSCOHost. Georgia Tech Library, Metz.
16 July 2006. Keyword: Cryptography.
Reeds, Jim. "Review of "the Code Book: the Evolution of Secrecy From Mary Queen of Scots to
Quantum Cryptography" by Simon Singh. Anchor Books." Rev. of The Code Book, by
Simon Singh. ACM SIGACT News June 2001: 6-11.
Zotos, Kostas, and Andreas Litke. Cryptography and Encryption. Dept. of Applied Informatics,
University of Macedonia. 16 July 2006
<http://arxiv.org/ftp/math/papers/0510/0510057.pdf>.
R. Anderson, E. Biham, and L. Knudsen, Serpent: A Proposal for the Advanced Encryption
Standard, First Advanced Encryption Standard (AES) Conference, Ventura, CA, 1998.
Nicolas Courtois and Josef Pieprzyk. "Cryptanalysis of Block Ciphers with Overdefined Systems of Equations." Lecture Notes in Computer Science, Vol. 2501, pp. 267-287. Association for Computing Machinery, 2002.
A. J. Elbirt and C. Paar. "An FPGA Implementation and Performance Evaluation of the Serpent Block Cipher." International Symposium on Field Programmable Gate Arrays, pp. 33-40. Association for Computing Machinery, 2000. <http://portal.acm.org/citation.cfm?id=329176&coll=portal&dl=ACM>
B. Schneier, J. Kelsey, D. Whiting, D. Wagner, C. Hall, and N. Ferguson. "Twofish: A 128-Bit Block Cipher."
Stefan Lucks. "The Saturation Attack - A Bait for Twofish." Lecture Notes in Computer Science, Vol. 2355, pp. 1-15. Association for Computing Machinery, 2001.
P. K. Mohapatra. "Public Key Cryptography." ACM Crossroads. <http://www.acm.org/crossroads/xrds7-1/crypto.html>
"New Directions in Cryptography." Diffie, Whitfield and Hellman, Martin. IEEE Transactions
on Information Theory Vol. IT-22. 6 Nov. 1976.
"A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms." Elgamal,
Taher. IEEE Transactions on Information Theory Vol. IT-31. 4 July 1985.
"The Handbook of Applied Cryptography." A. Menezes, P. van Oorschot, S. Vanstone. CRC
28
Press, 1996.
"The History of Non-Secret Encryption." J. H. Ellis. Cryptologia. July 1999.
"DIGITAL SIGNATURE STANDARD (DSS)", NIST Federal Information ProcessingStandards
Publication 186-1, Dec 1998.
J.Callas, L.Donnerhacke, H.Finney, R.Thayer, "OpenPGP Message Format", RFC 2440, Nov
1998.
"PGP DH vs. RSA FAQ." Simpson, Sam. 1999 <http://www.scramdisk.clara.net/pgpfaq.html>
A.M.Odlyzko, "The Future of Integer Factorization", RSA CryptoBytes, Volume 1, Number 2,
Summer 1995.
A.M.Odlyzko, "Discrete logarithms: The past and future", 5th July 1999.
"Records of the National Security Agency/Central Security Service." The National Archives. 17
July 2006 <http://www.archives.gov/research/guide-fed-records/groups/457.html#457.1>.
"National Security Agency." University of San Diego. Dept. of History, U. of San Diego. 18 July
2006 <http://history.sandiego.edu/gen/20th/nsa.html>.
"NSRP: Cryptography." U. of Texas. School of Information, U. of Texas. 18 July 2006
<http://www.gslis.utexas.edu/~netsec/crypto.html>.
Gilmore, John. "Merkle's "a Software Encryption Function" Now Published and Available."
sci.crypt. 13 July 1989. 18 July 2006
<http://groups.google.com/group/sci.crypt/msg/e86ff3c3089f97c8>.
"S.266." Library of Congress. 1991. 18 July 2006 <http://thomas.loc.gov/cgi-bin/query/z?
c102:S.266.IS:>.
Bender, Adam. Carnegie Mellon. Carnegie Mellon. 18 July 2006
<http://www.andrew.cmu.edu/user/abender/pgp/history.html>.
United States. Department of State. The United States Munitions List. 18 July 2006
<https://www.demil.osd.mil/documents/app1_97.pdf>.
"6.4 United States Cryptography Export/Import Laws." RSA Security. 18 July 2006
<http://www.rsasecurity.com/rsalabs/node.asp?id=2327>.
"Detailed History of Applied Cryptography Case." Qualcomm. 24 Feb. 1999. 18 July 2006
<http://people.qualcomm.com/karn/export/history.html>.
"Administration Implements Updated Encryption Export Policy." Center for Democracy and