COMPUTER AND MATHEMATICAL SCIENCES
PHI 454
HISTORY AND PHILOSOPHY OF SCIENCE
TITLE:
HISTORY OF THE DEVELOPMENT OF COMPUTERS
AND ITS IMPACT ON SOCIETY
PREPARED BY
FARAH IZZAH BINTI MOHD YUSAK
2011158779
CS2273C
VERIFIED BY
PROF. MADYA HAJI SHAFIE MEHAD
TABLE OF CONTENTS

No.   Content
1.0   Introduction
1.1   Objectives
2.0   Introduction of Computers
2.1
3.0
3.1   First Generation
3.2   Second Generation
3.3   Third Generation
3.4   Fourth Generation
3.5   Fifth Generation
4.0
5.0
5.1   Meaning of HCI
5.2
5.3
6.0   Networking
7.0   Artificial Intelligence
8.0   Conclusion
8.1
9.0   References
1.0 INTRODUCTION
This term paper gives a brief history of the development of the computer and its
impact on society. We look at the advantages and disadvantages of using computers
in today's society, especially among children and adults. As the world is
1.1 OBJECTIVE
small, simple devices that are often used to control other devices. For example, they
may be found in machines ranging from fighter aircraft to industrial robots, digital
cameras, and even children's toys.
Computers are tools used to process data according to instructions that have been
formulated. The word "computer" was originally used to describe people whose work
was to perform arithmetic calculations, with or without mechanical aids, but the
meaning of the word was later transferred to the machine itself. Originally, the
processing of information was almost exclusively related to arithmetical problems,
but modern computers are used for many tasks unrelated to mathematics.
Broadly, a computer can be defined as an electronic device consisting of several
components that cooperate with one another to produce information based on existing
programs and data. These components include the monitor, CPU, keyboard, mouse and
printer (as a complementary device). Without a printer a computer can still do its
job as a data processor, but its output is limited to the monitor screen rather than
printed form (paper).
There are several additional, commonly used terms related to the study of
computers. The most popular of these is information technology (IT), which can be
defined as the branch of technology devoted to the study and application of data and
the processing thereof. Information Technology (IT) is concerned with the use of
technology to manage information. IT can also be thought of as applied computer systems, including
both hardware and software, usually in the context of a business or other enterprise,
and often including networking and telecommunications. The term computer science
is usually reserved for the more theoretical, academic aspects of computing. Another
commonly used term, information systems (IS), refers to the application of computers
to support the operations of businesses and other organizations. It includes the
installation, operation and maintenance of computer hardware, software and data.
In relative terms, it wasn't long ago that the Information Technology
department might have consisted of a single Computer Operator, who might be
storing data on magnetic tape, and then putting it in a box down in the basement
somewhere. IT is a broad term and encompasses many areas. Professionals in
information technology may perform a wide variety of tasks that range from installing
computer applications to designing highly complex computer networks and
information databases. While technology today encompasses a wide range of
individual focuses, it is becoming increasingly clear that the IT field of the future
will include many more topics and more demand than ever before.
2.1
use chips or monitors. They were not as small, nor as capable, as the ones today. The
history of computer development is often described in terms of the different
generations of computing devices. Each generation of computer is characterized by a
major technological development that fundamentally changed the way computers operate,
resulting in increasingly smaller, cheaper, more powerful, more efficient and more
reliable devices. Each generation, and the developments that led to the devices we
use today, is described below.
The development of computer architecture started with the creation of the abacus
around 500 BC, and it has continued and improved from time to time. In 1642 Blaise
Pascal invented the first calculating machine that could do addition and subtraction,
and Gottfried Wilhelm von Leibniz later built a machine that could also multiply and
divide. In 1801 Joseph Marie Jacquard invented a loom controlled by punched cards. In
the 1800s Charles Babbage designed an analytical engine that could not only perform
calculations but also print the output. George Boole then developed a binary theory
of logic which explains the relationship between binary arithmetic and Boolean logic.
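To illustrate the relationship Boole described, here is a minimal sketch in Python
(an illustration added here, not part of the original paper): the logical operations
XOR and AND on single bits behave exactly like the sum and carry of binary addition,
which is the basis of arithmetic in digital computers.

    # A half adder built purely from Boolean operations: logic and binary
    # arithmetic are two views of the same bits.
    def half_adder(a, b):
        total = a ^ b   # XOR gives the sum bit
        carry = a & b   # AND gives the carry bit
        return carry, total

    for a in (0, 1):
        for b in (0, 1):
            carry, total = half_adder(a, b)
            print(a, "+", b, "=", carry, total)   # 1 + 1 = 1 0 (binary for two)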
3.1 FIRST GENERATION
The first generation of computers was powered by thousands of vacuum tubes. The
vacuum tubes themselves were large (the size of today's light bulbs). They
required great amounts of energy, and they generated much heat. Unfortunately, a
tube failure occurred on average once every 7 minutes. Since it took more than 15
minutes to find and replace the faulty tube, it was difficult to get any useful
computing work done. Moreover, the ENIAC was enormous, occupying 1500 square feet
and weighing 30 tons. The computer's memory was stored on magnetic storage devices,
primarily magnetic tapes and magnetic drums. Most of the data were entered into the
computers on punched cards similar to those used in Jacquard's process. Output
consisted of punched cards or paper. In 1937 John V. Atanasoff and Clifford Berry
created the ABC, the first binary-based machine. In 1946 John Mauchly and J. Presper
Eckert came out with the first electronic digital computer, called ENIAC.
The computer referred to as the IAS was built by John von Neumann beginning in 1947
at the Princeton Institute for Advanced Studies. It was completed in 1951 and fully
operational in 1952. The machine was a binary computer with a 40-bit word, storing
two 20-bit instructions in each word. The IAS instructions can be grouped as follows:
data transfer instructions, which move data between memory and an ALU register or
between two ALU registers; unconditional branch instructions (normally the control
unit executes instructions in sequence from memory, but this sequence can be changed
by a branch instruction, which facilitates repetitive operations); conditional branch
instructions, in which the branch can be made dependent on a condition, thus allowing
decision points; arithmetic instructions, which are performed by the ALU; and
address-modification instructions, which permit an address to be computed in the ALU
and then inserted into instructions stored in memory. Most importantly, the IAS
machine was the first design to mix programs and data in a single memory. It used
about 2300 tubes in its circuitry. The addition time was 62 microseconds and the
multiplication time was 713 microseconds. It was an asynchronous machine, meaning
that there was no central clock regulating the timing of the instructions. One
instruction started executing when the previous one finished.
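As a rough illustration of the stored-program idea described above, here is a
hypothetical toy machine sketched in Python (not the actual IAS instruction set):
the program and its data live in the same memory, and the instruction groups (data
transfer, arithmetic, unconditional branch and conditional branch) drive a simple
fetch-execute loop; address modification is omitted for brevity.

    # Hypothetical toy stored-program machine: instructions and data share one memory.
    memory = {
        0: ("LOAD", 100),    # data transfer: memory -> accumulator
        1: ("ADD", 101),     # arithmetic: performed by the "ALU"
        2: ("STORE", 102),   # data transfer: accumulator -> memory
        3: ("BRANCH+", 5),   # conditional branch: taken only if accumulator >= 0
        4: ("BRANCH", 0),    # unconditional branch: change the normal sequence
        5: ("HALT", 0),
        100: 7, 101: 35, 102: 0,   # data stored alongside the program
    }
    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]
        pc += 1                          # normally execute instructions in sequence
        if op == "LOAD":
            acc = memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "ADD":
            acc = acc + memory[addr]
        elif op == "BRANCH":
            pc = addr
        elif op == "BRANCH+" and acc >= 0:
            pc = addr
        elif op == "HALT":
            break
    print(memory[102])   # 42: the result the program left behind in memory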
3.2 SECOND GENERATION
The device that characterized second-generation computers was the transistor. The
transistor was invented at Bell Labs in 1947 by John Bardeen, Walter Brattain and
William Shockley, and its invention greatly influenced the development of computers.
Transistors replaced vacuum tubes and ushered in the second generation of computers.
A transistor is a small device that transfers an electronic signal across a resistor;
it is essentially a tiny electronically operated switch, or gate, that can alternate
between on and off many millions of times per second. Transistors were made of
semiconductor material and controlled the flow of electricity through circuits. The
transistor was far superior to the vacuum tube, allowing computers to become smaller,
faster, cheaper, more energy-efficient and more reliable than their first-generation
predecessors. Fewer transistors than tubes were also required to operate a computer,
and transistors were not fragile and lasted longer than vacuum tubes.
During the second generation, the first transistorized computer to be built was the
TX-0 at M.I.T. This machine was merely intended as a device to test the much fancier
TX-2. After the TX-0 was built, the PDP-1 was manufactured by DEC in 1961. It had 4K
of 18-bit words and a cycle time of 5 microseconds, and it cost $120,000. One of the
PDP-1's many innovations was a visual display (CRT) and the ability to plot points
anywhere on its 512 x 512 screen. A few years later DEC introduced the PDP-8, which
used a single bus, the omnibus. Another innovation was the IBM 7090, whose
performance was double that of the PDP-1; it was the fastest computer in the world at
that time and cost millions of dollars. Later IBM introduced the 7094, and both the
7090 and 7094 marked the end of ENIAC-type machines. The CDC 6600, introduced by CDC,
was a highly parallel machine: it had several functional units, all of which could
run in parallel. The last innovation of the second generation was the Burroughs
B5000, which was programmed in Algol 60, a forerunner of Pascal; with it, the idea
that software also counted was born.
3.3 THIRD GENERATION
The third generation of computers, built during 1964-1971, was characterized by the
integrated circuit (IC), whose development was the hallmark of this generation.
Unlike transistors and circuit boards that were assembled manually, integrated
circuits were single, complete electronic semiconductor circuits contained on pieces
of silicon, sometimes called chips. Transistors were miniaturized and placed on
silicon chips, called semiconductors, which drastically increased the speed and
efficiency of computers. ICs could be manufactured by machinery, which ultimately
resulted in a lower cost. Instead of punched cards and printouts, users interacted
with third-generation computers through keyboards and monitors, and interfaced with
an operating system, which allowed the device to run many different applications at
one time with a central program that monitored the memory. Computers for the first
time became accessible to a mass audience because they were smaller and cheaper than
their predecessors.
By 1969, as many as 1,000 transistors could be built on a chip of silicon. Magnetic
disks were improved and were used more for storage. Monitors and keyboards were
introduced for data input and output. The IBM 360 allowed the
3.4 FOURTH GENERATION
The microprocessor brought about the fourth generation of computers, as thousands of
integrated circuits were built onto a single silicon chip. What in the first
generation filled an entire room could now fit in the palm of the hand. The Intel
4004 chip, developed in 1971, located all the components of the computer, from the
central processing unit and memory to input/output controls, on a single chip.
3.5 FIFTH GENERATION
The fifth generation (1991-2005 and beyond) is our current generation, referred to as
the connected generation because of the industry's massive effort to increase the
connectivity of computers. The rapidly expanding Internet, World Wide Web, and
intranets have created an information superhighway that has enabled both computer
professionals and home computer users to communicate with others across the globe.
The fifth generation begins with the creation and use of computers with artificial
intelligence. Fifth-generation computing devices, based on artificial intelligence,
are still in development, though there are some applications, such as voice
recognition, that are being used today. The use of parallel processing and
superconductors is helping to make artificial intelligence a reality. Quantum
computation and molecular and nanotechnology will radically change the face of
computers in years to come. The goal of fifth-generation computing is to develop
devices that respond to natural language input and are capable of learning and
self-organization.
They will be able to take commands in an audio-visual way and carry out instructions.
Many of the operations which require low-level human intelligence will be performed
by these computers. Parallel processing is emerging and showing the possibility that
the power of many CPUs can be used side by side, making computers more powerful than
those with a single central processor. Advances in superconductor technology will
greatly improve the speed of information traffic. The future looks bright for
computers.
today's society. There are advantages and disadvantages of computers' effects on
society, and the same goes for our industry and ourselves.
Computers changed the world a lot. They helped man step forward into the future.
Thanks to computers, space exploration came true, new designs of vehicles and other
transportation were made, entertainment became more entertaining, and medical science
found more cures for diseases. Computers have impacted our lives in many ways. You
may not notice it, but they did make life a lot easier. Without computers, the world
would be a harder place to live in. Thanks to computers, everyday life is easier for
us. Some people may disagree, but most wouldn't. Some people say that computers are
taking away manpower. That may be true, but computers did make the impossible
possible. Computers have impacted many areas of today's society. One area the
computer has impacted is business. Businesses use computers for keeping track of
accounts, money, or items that they need. You may notice business people using
computers a lot, especially laptop computers, portable computers that can be taken
to your work area. You may see people use things like pie charts and graphs when
they present information to other business people in meetings. Most of those charts
were made with computers. The business field uses computers a lot for its companies
and organizations.
keep a record of what medication to give to a patient and the amount they need. Most
computers in the hospital are used to keep data on patients and their status.
Computers also keep track of equipment placement and status as well. Scientists need
the help of computers to find cures for diseases such as cancer and STDs. Without the
computer's help, cures for a lot of diseases wouldn't have been found. Computers have
helped the medical field a lot, and we are grateful for that since they keep track of
our health. Other areas the computer has impacted are space exploration and the
design of transportation.
even see who they are in real life. Through computers, children especially spend too
much time playing online games or chatting over the computer instead of reading
books, taking walks, and doing homework. After a while, people become addicted to the
games and cannot stop playing them. It is good to find something you enjoy and to do
it to relax, but doing too much of it will also result in negative consequences.
Being on a computer too much can also lead to antisocial behavior and depression.
5.1 MEANING OF HCI
Human-computer interaction (HCI) is a discipline concerned with the design,
evaluation and implementation of interactive computing systems for human use and
with the study of major phenomena surrounding them. HCI also involves the study,
planning, and design of the interaction between people (users) and computers.
Human-computer interaction is an interaction between users and computers that occurs
at the user interface, which includes both software and hardware: for example,
characters or objects displayed by software on a personal computer's monitor, input
received from the user via hardware peripherals such as the keyboard and mouse, and
other user interactions with large-scale computerized systems such as aircraft and
power plants.
5.2
Can machines think the way humans think? The initial successes of computers in
replicating seemingly intelligent behavior quickly led to argument and speculation
about what it would mean for a computer to be 'intelligent'. We should go back to
basics on the meaning of the word computer. Originally, the word 'computer' comes
from 'compute', which means 'calculate'. A computer is a programmable machine
designed to automatically carry out a sequence of arithmetic or logical operations.
The particular sequence of operations can be changed readily, allowing the computer
to solve more than one kind of problem.
Computers cannot think; there are artificial intelligence algorithms that give the
illusion of the computer thinking, but it is really just following pattern
recognition. An important class of computer operations on some computing platforms
is the accepting of input from human operators and the output of results formatted
for human consumption.
Alan Turing asked that question in 1950 and proposed a test to determine whether a
computer could think. Turing described a simple party game which involves three
players. Player A is a man, player B is a woman and player C is an interrogator. The
set-up is such that player C is unable to see either A or B and can only communicate
with them using written media. By asking questions of players A and B, player C tries
to determine which of the two is the man and which is the woman. Player A's role is
to trick the interrogator into making the wrong decision, while player B attempts to
assist the interrogator. Turing proposed that player A be replaced with a computer.
The success of the computer is determined by comparing the outcome of the game when
player A is a computer against the outcome when player A is a man. To put it in
Turing's words, if the interrogator decides wrongly as often when the game is played
with the computer as he does when the game is played between a man and a woman, then
it can be argued that the computer is intelligent.
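A small sketch of the game's structure in Python (purely hypothetical code added
here, with canned answers standing in for real players) may make the protocol easier
to follow: the interrogator exchanges written questions with two hidden players and
then has to decide which label hides the machine.

    import random

    def machine_player(question):
        # Stand-in for the computer taking role A; it tries to appear human.
        return "I would say whatever a person would say."

    def human_player(question):
        # Stand-in for the human in role B, who tries to help the interrogator.
        return "I really am the human; ask me anything."

    def imitation_game(rounds=3):
        # Hide the players behind labels X and Y, since player C cannot see them.
        labels = random.sample(["X", "Y"], 2)
        players = {labels[0]: machine_player, labels[1]: human_player}
        for i in range(rounds):
            question = "Question %d: describe a childhood memory." % (i + 1)
            for label in sorted(players):
                print(label, "answers:", players[label](question))
        # A real interrogator would judge from the answers; here we simply guess.
        guess = random.choice(sorted(players))
        return players[guess] is human_player

    print("Interrogator identified the human correctly:", imitation_game())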
As with the Original Imitation Game Test, the role of player A is performed by
a computer. The difference is that now the role of player B is to be performed by a
man, rather than by a woman. In this version both player A (the computer) and player
B are trying to trick the interrogator into making an incorrect decision. A man can fail
the OIG Test, but it is argued that this is a virtue of a test of intelligence if failure
indicates a lack of resourcefulness. It is argued that the OIG Test requires the
5.3
more capable when compared to our brains simply because they can perform calculations
thousands of times faster, work out logical computations without error and store
memory at incredible speeds with flawless accuracy. As for the human brain, we can
only estimate its processing power, as there is no way to measure it quantitatively
as yet. If the theory that nerve volume is proportional to processing power is true,
then we may have a correct estimate of the human brain's processing power. The human
retina seems to process about ten one-million-point images per second.
The most powerful experimental supercomputers in 1998, composed of thousands or tens
of thousands of the fastest microprocessors and costing tens of millions of dollars,
could do a few million MIPS. These systems were used mainly to simulate physical
events for high-value scientific calculations. The brain has about 100 million MIPS
worth of processing power, while recent supercomputers have only a few million MIPS
worth of processor speed. That said, the brain is still the winner of the race.
Because of the cost, enthusiasm and effort still required, computer technology still
has some way to go before it matches the human brain's processing power.
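Taking the two figures quoted above at face value (both are rough estimates, and the
"few million MIPS" is assumed here to be about three million), the gap works out to
roughly a factor of thirty:

    brain_mips = 100_000_000         # ~100 million MIPS attributed to the brain
    supercomputer_mips = 3_000_000   # "a few million MIPS" for a 1998 supercomputer
    print(brain_mips / supercomputer_mips)   # about 33 times more processing power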
We must note that brains are analogue, whereas computers are digital. Another
difference is that the brain uses content-addressable memory, such that information
can be accessed through spreading activation, whereas a computer refers to
addressable memory by polling a precise memory address.
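The difference can be sketched in Python as a loose analogy (an illustration added
here, not a model of the brain): location-addressed memory is read by supplying an
exact address, while content-addressable memory is read by supplying part of the
content itself.

    # Location-addressed memory: you must know the exact address (the index).
    ram = ["apple", "rainy day", "birthday party"]
    print(ram[2])                                   # -> birthday party

    # Content-addressable memory: a partial cue retrieves the whole memory.
    memories = {
        ("cake", "candles", "friends"): "birthday party",
        ("umbrella", "puddles", "grey sky"): "rainy day",
    }
    cue = "candles"
    print([event for keys, event in memories.items() if cue in keys])   # -> ['birthday party']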
6.0 NETWORKING
A computer network, often simply referred to as a network, is a collection of
hardware components and computers interconnected by communication channels that
allow sharing of resources and information. A network consists of two or more
computers that are linked in order to share resources (such as printers and CDs),
exchange files, or allow electronic communications. When at least one process in one
device is able to send or receive data to or from at least one process residing in a
remote device, the two devices are said to be in a network. Networks may be
classified according to a wide variety of characteristics, such as the medium used
to transport the data, the communications protocol used, scale, topology, and
organizational scope. The computers on a network may be linked through cables,
telephone lines, radio waves, satellites, or infrared light beams. Two very common
types of network are the Local Area Network (LAN) and the Wide Area Network (WAN).
Besides these, you may also see references to a Metropolitan Area Network (MAN), a
Wireless LAN (WLAN), or a Wireless WAN (WWAN).
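As a minimal sketch of the condition described above (written in Python with its
standard socket library; the loopback address and port number are arbitrary choices
for illustration), one process sends data and another receives it, so the two
endpoints form the smallest possible network:

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 50007      # loopback address and a hypothetical port
    ready = threading.Event()

    def server():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            ready.set()                   # tell the client it is safe to connect
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(1024)    # receive data from the remote process
                conn.sendall(b"received: " + data)

    threading.Thread(target=server, daemon=True).start()
    ready.wait()

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((HOST, PORT))      # the two processes are now "in a network"
        client.sendall(b"hello over the wire")
        print(client.recv(1024).decode()) # -> received: hello over the wire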
around particular institutions, the work of individual researchers, and the solution of
specific problems and the application of widely differing tools. The central problems
of AI include such traits as reasoning, knowledge, planning, learning, communication,
perception and the ability to move and manipulate objects.
Artificial Intelligence (AI) is the area of computer science focusing on creating
machines that can engage in behaviors that humans consider intelligent. The ability
to create intelligent machines has intrigued humans since ancient times and today
with the advent of the computer and 50 years of research into AI programming
techniques, the dream of smart machines is becoming a reality. Researchers are
creating systems which can mimic human thought, understand speech, beat the best
human chess player, and countless other feats never before possible. Find out how the
military is applying AI logic to its hi-tech systems, and how in the near future
Artificial Intelligence may impact our lives.
Haugeland (1985) described AI as "the exciting new effort to make computers think...
machines with minds, in the full and literal sense." In the context of behavior, Rich
and Knight (1991) described it as "the study of how to make computers do things at
which, at the moment, people are better." Intelligent agents in AI perceive,
understand, and act; examples are speech recognition, understanding and synthesis,
image understanding, and the ability to take actions and have an effect.
8.0 CONCLUSION
The computer is the most important thing in our lives. We are now in the age of
technology, and everything is connected with computers. Most colleges and
universities have classes that deal with computers. The computer is a very essential
thing in our life. During the past 10 years, the use of computers in education has
increased dramatically, and a wide range of educational computer programs is now
widely available for individual and classroom use. The computer has made its mark
everywhere in society and built up a huge industry. The future is promising for the
computer industry and its technology.
Computers provide a lot of functions that make human life easier and faster. There
are advantages and disadvantages of using computers in our daily life, but we have to
control ourselves in using computers and use them wisely. However, there are lots of
things to do in our life that would not be possible to do by ourselves. That is the
reason we need computers.
8.1
From this term paper I gained a lot of knowledge about computers and the history of
their development. I learned that computers have their own advantages; we cannot deny
that computers help people in many areas of study. However, they can sometimes have a
bad influence on us. Besides that, I learned that a machine can never be compared to
human thinking: even though a computer can process faster and more accurately, it
cannot think logically the way a human does. This paper also helped me understand how
computers are connected to the areas of networking and artificial intelligence, which
make them work in more advanced ways.
This is a very exciting time to be alive, since we all get to see how quickly
computer technology is evolving and how much it is changing all of our lives for the
better. It is a vast and exciting world that is always changing. We are lucky to be
alive to witness computing's past and present.
9.0 REFERENCES
Hewett, T., et al. (1992). ACM SIGCHI Curricula for Human-Computer Interaction.
Egnor, M. World Wide Web: www.evolutionnews.org
World Wide Web: www.webopedia.com
Turing, A. M. (1950). Computing Machinery and Intelligence.
Stanford Encyclopedia of Philosophy.