
HISTORY OF COMPUTERS

A computer is a machine for performing calculations automatically: a programmable device that receives input, stores and manipulates data, and provides output in a useful format. It performs mathematical calculations and logical operations, and can process, store, and retrieve large amounts of data very quickly.

Blaise Pascal, a French mathematician, developed a calculating machine in 1642. It could only add and subtract, but it gained attention because 50 units were placed in prominent locations throughout Europe. Accountants expressed grave concern that they might be replaced by the technology.

Gottfried Wilhelm Leibniz (July 1, 1646 – November 14, 1716) was a German philosopher and mathematician. Leibniz occupies a prominent place in the history of mathematics and the history of philosophy. He developed the infinitesimal calculus independently of Isaac Newton, and Leibniz's mathematical notation has been widely used ever since it was published. He became one of the most prolific inventors in the field of mechanical calculators. While working on adding automatic multiplication and division to Pascal's calculator, he was the first to describe a pinwheel calculator, in 1685, and invented the Leibniz wheel, used in the arithmometer, the first mass-produced mechanical calculator. He also refined the binary number system, which is at the foundation of virtually all digital computers.
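To see why binary suits digital machines, here is a brief illustrative sketch (the number chosen is an arbitrary example, not from the historical record):

```python
# Binary notation, as refined by Leibniz, writes every number using only
# two symbols (0 and 1) -- a natural fit for two-state electronic switches.
n = 13
bits = bin(n)[2:]        # positional binary: 1*8 + 1*4 + 0*2 + 1*1
print(bits)              # -> 1101
print(int(bits, 2))      # -> 13 (converting back confirms the encoding)
```

Every modern computer stores numbers, text, and instructions in exactly this two-symbol form.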

Charles Babbage (26 December 1791 – 18 October 1871) was an English mathematician, philosopher, inventor, and mechanical engineer who originated the concept of a programmable computer. Considered a "father of the computer", Babbage is credited with inventing the first mechanical computer, which eventually led to more complex designs.

Augusta Ada King, Countess of Lovelace (10 December 1815 – 27 November 1852), born Augusta Ada Byron, was an English writer chiefly known for her work on Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be processed by a machine; as such, she is sometimes portrayed in popular culture as the "world's first computer programmer", although some authorities disagree with this perspective.

A punched card (also punch card, IBM card, or Hollerith card) is a piece of stiff paper that contains digital
information represented by the presence or absence of holes in predefined positions. Now an obsolete
recording medium, punched cards were widely used throughout the 19th century for controlling textile
looms and in the late 19th and early 20th century for operating fairground organs and related instruments.
They were used through the 20th century in unit record machines for input, processing, and data storage.
Early digital computers used punched cards, often prepared using keypunch machines, as the primary
medium for input of both computer programs and data. Some voting machines use punched cards.
Punched cards were first used around 1725 by Basile Bouchon and Jean-Baptiste Falcon as a more
robust form of the perforated paper rolls then in use for controlling textile looms in France. This technique
was greatly improved by Joseph Marie Jacquard in his Jacquard loom in 1801.

John von Neumann (December 28, 1903 – February 8, 1957) was a Hungarian-American mathematician who made major contributions to a vast range of fields, including set theory, functional analysis, quantum mechanics, ergodic theory, continuous geometry, economics and game theory, computer science, numerical analysis, hydrodynamics (of explosions), and statistics, as well as many other mathematical fields. He is generally regarded as one of the greatest mathematicians in modern history.

Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English mathematician, logician, cryptanalyst, and computer scientist. He was highly influential in the development of computer science, providing a formalization of the concepts of "algorithm" and "computation" with the Turing machine, which played a significant role in the creation of the modern computer.
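The Turing machine itself can be sketched in a few lines of code. The simulator and the sample bit-flipping machine below are illustrative inventions for this article, not Turing's own formulation:

```python
# A minimal Turing machine simulator (illustrative sketch). A machine is a
# table mapping (state, symbol) -> (new_state, symbol_to_write, head_move).

def run_turing_machine(transitions, tape, start_state, halt_state, blank="_"):
    """Run a one-tape Turing machine until it reaches halt_state."""
    cells = {i: s for i, s in enumerate(tape)}  # sparse, unbounded tape
    state, head = start_state, 0
    while state != halt_state:
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move  # +1 moves right, -1 moves left
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: invert every bit on the tape, halt at the first blank.
FLIP = {
    ("flip", "0"): ("flip", "1", +1),
    ("flip", "1"): ("flip", "0", +1),
    ("flip", "_"): ("halt", "_", +1),
}

print(run_turing_machine(FLIP, "1011", "flip", "halt"))  # -> 0100
```

Despite its simplicity, this read-write-move loop is the model Turing used to formalize "computation": anything a modern computer can calculate can, in principle, be expressed as such a transition table.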

First Generation (1945-55): All computers developed in the initial phase of computer growth (ENIAC, EDVAC, UNIVAC and Mark-1) belong to the first generation of computers. These computers were mostly made between 1945 and 1955 and used vacuum tubes. They were the fastest calculating devices of their time, but their memory size was limited, and they were huge, expensive and unreliable.

Second Generation (1955-64): Transistors replaced vacuum tubes and gave birth to the second generation of computers (such as the UNIVAC II, IBM 1401 and CDC 1604). These computers were smaller, faster and more reliable than first-generation computers.

Third Generation (1964-70): In the early 1960s, solid-state electronic technology was introduced; the development of integrated circuits is referred to as solid-state technology. Integrated circuits (ICs) gather many electronic devices, such as transistors, on a single small chip of silicon. Computers using this technology belong to the third generation. They were even more reliable, faster and smaller than previous-generation computers. The IBM 360, PDP-11 and HP 3000 were developed in this generation.

Fourth Generation (1970 Onwards): In the 1970s, integrated circuit technology advanced enough to put all the main functions of a computer on a single chip, called a microprocessor. The introduction of microprocessors brought the computer age into its fourth generation. A direct result was the development of microcomputers such as the IBM PC and Apple Macintosh, which brought computers down from the organizational level to the personal level.

Fifth Generation (Present & Ahead): Computing devices based on artificial intelligence are still in development, though some applications, such as voice recognition, are in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology may radically transform the face of computers in the years to come. The objective of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.

INVENTIONS

 Abacus, 2400 BC: the first known calculator was invented in Babylonia. It was a major step towards the era of computing.

 Panini, 500 BC: an ancient Indian Sanskrit grammarian came up with the predecessor of modern formal language theory.

 Pingala, 300 BC: described the binary number system that serves as the foundation of computing systems the world over.

 John Napier, 1614: designed a system of movable rods (Napier's bones) used to perform basic mathematical operations.

 William Oughtred, 1622: invented the slide rule.

 Charles Babbage, 1822: devised the first mechanical computer.

 John V. Atanasoff, 1937: devised the first digital electronic computer.


 Atanasoff and Clifford Berry, 1939: came up with the Atanasoff–Berry Computer (ABC) prototype.

 Alan Turing, 1936: introduced the concept of a theoretical computing device now known as a Turing machine. The concept of this machine, which could theoretically perform any mathematical calculation, was important in the development of the digital computer.

 Konrad Zuse, 1941: his electromechanical Z machines proved an important step in the evolution of computers.

 Colossus, 1943: a machine able to decode German messages, designed at Bletchley Park in Britain.

 Harvard Mark I, 1944: a computer with limited programmability was designed.

 John von Neumann, 1945: described a stored-program architecture for the first time. This architecture was the heart of the computer systems developed thereafter and, known as the von Neumann architecture, remains part of virtually every computer today.

 The Ballistics Research Laboratory of the United States came up with the Electronic Numerical Integrator and Computer (ENIAC) in 1946. It was the first general-purpose electronic computer, but it had an inflexible architecture.

 The US National Bureau of Standards came up with the Standards Eastern Automatic Computer (SEAC) in 1950. It was the first computer to use diodes for handling logic.

 The Lyons Electronic Office (LEO), the first business computer, was developed by John Simmons and T. Raymond Thompson in 1951. UNIVAC I, the first commercial computer, was designed in the United States by John Presper Eckert and John W. Mauchly. EDVAC, the electronic discrete variable automatic computer, was also introduced.

 Bell Labs introduced its first transistor computer in 1955. Transistors made computers more energy-efficient.

 The Advanced Research Projects Agency (ARPA) was formed in 1958. The same year witnessed the making of the first integrated circuits ("chips") by Jack Kilby and Robert Noyce.

 DEC launched the first minicomputer, known as the PDP-8, in 1965.

 The US Department of Defense founded the Advanced Research Projects Agency Network (ARPANET) in 1969. It was established with the intent to develop a computer network and is the predecessor of the Internet.

 Microprocessors made microcomputers possible; Ted Hoff at Intel introduced the 4-bit 4004 in 1971.

 Intel created the 8008 microprocessor in 1972.

 The Xerox Alto, developed at the Xerox Palo Alto Research Center in 1973, was an important milestone in the development of personal computers. It was the first workstation with a built-in mouse, had a fair amount of storage capacity, offered menus and icons, and could connect to a network.

 The Altair 8800, released in 1975, was one of the first personal computers; the foundation of the present-day relationship between individuals and computing was laid way back in 1975! Tandem Computers built the first computers with online transaction processing capabilities during this period.

 By 1979, more than half a million computers were in use in the United States. This number crossed 10 million by 1983.

 The American National Standards Institute (ANSI) was founded in 1981. It was during the same
year that the first 32-bit chip was introduced by Hewlett-Packard.

 Intel announced the 80286 processor in 1982.

 In 1983, Time magazine named the personal computer its "Machine of the Year".

 Intel introduced the 80386 in 1985, a 32-bit processor initially clocked at 16 MHz.

 The World Wide Web was born in 1990. Tim Berners-Lee, a researcher at CERN, developed HTML and came up with specifications such as the URL and HTTP. He based the World Wide Web on ENQUIRE, his earlier hypertext system, and it enabled people to collaborate over a network. His first web server and browser became available to the public.

 The development of newer computer systems continues to this day.
