The Internet
The Internet is a network of networks that links computer systems around the world: a global
system of interconnected computer networks. A computer that connects to the Internet can
access information from a vast number of servers and other computers. An Internet connection also
allows the computer to send information onto the network; that information may be saved and ultimately
accessed by a variety of servers and other computers. Much of the widely accessible information on the
Internet consists of the interlinked hypertext documents and other resources of the World Wide Web
(WWW). Web users typically send and receive information using a web browser; other software for
interacting with computer networks includes specialized programs for electronic mail, online chat, file
transfer and file sharing.
Information is moved around the Internet by packet switching using the standardized Internet
Protocol Suite (TCP/IP). It is a "network of networks" that consists of millions of private and public,
academic, business, and government networks of local to global scope that are linked by copper wires,
fiber-optic cables, wireless connections, and other technologies.
Terminology - Strictly speaking, the Internet is a massive, worldwide network of computers (or
network of networks), which can communicate with each other by means of a few simple
protocols. Anyone with a server, some relatively inexpensive software, and a domain name can set up a
web server and connect it to the Internet. Anyone with a computer, a modem and a connection to an
Internet Service Provider (ISP) can connect to the Internet as a user.
History
The Internet began officially in 1973 as a research project of the U.S. Defense Advanced
Research Projects Agency (DARPA). The objective of the project was to develop protocols for
interlinking networks. The communications protocol TCP/IP resulted from this research. Variations of
this protocol remain the primary communication protocol of the Internet today.
In 1986, the U.S. National Science Foundation (NSF) initiated the development of the NSFNET
contributing a major communication backbone. The National Aeronautics and Space Administration
(NASA) contributed additional backbone support.
By the end of 1991, the Internet had grown to include some 5,000 networks in over three dozen
countries, serving over 700,000 host computers used by over 4,000,000 people.
The bulk of the system today is made up of private networking facilities in educational and
research institutions, businesses and in government organizations across the globe.

Internet Characteristics
The Internet has the following characteristics:
A Complex Network - With the simplified definition as a network of networks, it comprises
over 150 million computers.
Disorganized - The Internet can be cumbersome and confusing, even for experienced
users.
A Decentralized System - Millions of individual networks and over 140 million individual
computers connected throughout the world.
Composed of Billions of Files - Files pertaining to thousands of subjects, disciplines
and professions are available in different file formats.
Widely Used - More than 147 million people use the Internet, over 50 million of whom
use it daily.
International in Scope - This global network is accessed by people in approximately
140 countries; people in over 155 countries use the Internet for electronic mail.
Dynamic - Changing every minute of every day. On average, a new network is
connected to the Internet every 10 minutes.
Expanding Exponentially - The Internet is growing at a rate of 20% per month.

Important Features: Some of the features available on the Internet are:


World Wide Web - The Internet application that is currently drawing maximum attention
is the World Wide Web. It has dramatically influenced the online world and continues to
grow in popularity.
Direct Communication - Through e-mail, messages can be sent to or received from any
part of the world within a few seconds. Using Internet Relay Chat (IRC), we can
communicate online with people over the Internet. We can log into a chat room and
converse with others by typing messages that are instantly delivered. With the
improvement of network technologies and the increase in broadband, we can use not
only text messages but also graphics, audio and video to communicate with people.
Round the Clock Availability - Information on the Internet is available to users 24
hours a day.
Central Repository of Data - The Internet is like a huge central warehouse of data that
people from all over the world can access.
Search Engines - The Internet provides a rich information base that people from across
the globe can access. Search engines are like directories which help us get any kind of
information from around the world within a few seconds.
Advertisement - Companies can advertise the features, sales and service of their
products through the Internet.
E-Commerce - The Internet has removed all barriers of distance and nationality. We can
shop for products and services across the world by logging on to a Web portal. We can
also transfer money between different accounts with the click of a mouse. Through the
Internet, we can shop and pay by using credit cards of different banks or request our
bank to transfer our money to a different account, without even leaving our desk.
Distance Learning - Several online distance learning courses are now being offered by
Indian and foreign universities on the Internet. We can register and pay online, and
complete a course on different interest areas. We can also pursue specialized higher
studies in the comfort of our own office or home.
Wide Area Networks - Using the Internet, organizations can collect and compile
information from offices spread over a large geographical area.
Shareware Software - The Internet is also a great medium for downloading free
software. We can get lots of free games, utilities and trial versions of software through
the Net.

BBS and News Services - The Internet is perhaps the cheapest medium for online help.
BBS (Bulletin Board System) services are available on the Net through which we can
ask questions and get immediate troubleshooting assistance.
Banking - Banks are using information technology to provide online banking facilities to
their customers. Using the Internet, we can now view our account details. The use of
ATMs has shifted mundane back-office work to the customer himself. Instead of
hiring additional staff, banks can now use ATMs to considerably reduce time and operational
costs.
Reservation and Tour & Travel - Using the Internet, we can reserve a railway or airline
ticket from any place to any other place. Travel agencies can publish their
services on the Web along with the latest discounts, packages and availability details.
Bill Payments - The government sector as well as the private sector realize the benefits
of IT. Now we can make online payments for public utilities such as water, electricity,
insurance, phones, etc.

Hardware and Software Requirements:

HARDWARE REQUIREMENTS
The minimum requirement for accessing a terminal account is a Windows 9.x or higher machine
running on a P4 or higher processor.
With a P4 or higher processor machine, an adequate hard disk and a minimum 56 Kbps modem, we
are able to connect to the Internet. At present, with the development of new technology, we can
connect to the Internet using broadband connections at speeds from 1 Gbps to 3.2 Gbps.
The minimum requirement for accessing a TCP/IP account is a machine with graphics capability
and 256 MB RAM or more.

SOFTWARE REQUIREMENTS
With Windows XP system software, TCP/IP stack software will be required. The TCP/IP stack is a
communications program that Windows programs use for communicating via TCP/IP. Most of these
have an auto-install program. During installation, it will ask for two IP addresses for the DNS (Domain
Name Server) services, one primary and one secondary. If we are installing e-mail software such as
Eudora, then one more IP address, for an SMTP (Simple Mail Transfer Protocol) server, is needed.
For a TCP/IP account, in addition to the browser and e-mail applications, we may need software
for other applications, such as Telnet and FTP (File Transfer Protocol). Application software packages for
networking employ a design called client/server architecture, where the software sitting on the PC,
that is, our software package, is optimized for ease of use and may be referred to as the client.

Governance
For the most part, the Internet has functioned as collaboration among cooperating parties.
However, more formal control of critical functions is required. There are several organizations
responsible for these activities.
The Internet Activities Board (IAB) was created in 1983 to guide the evolution of the TCP/IP
Protocol Suite (the communications protocol of the Internet) and to provide research advice to the
Internet community. Today, renamed the Internet Architecture Board, the IAB is both a committee of the
Internet Engineering Task Force (IETF) and an advisory body of the Internet Society (ISOC).
Since 1992, the Internet Society has served as the international organization for global
coordination and cooperation on the Internet, promoting and maintaining a broad spectrum of activities
focused on the Internet's development, availability, and associated technologies.
Other groups have been formed to deal with technical and research issues. One of these is the
Internet Engineering Task Force (IETF), which is a large open international community of network
designers, operators, vendors, and researchers concerned with the evolution of the Internet
architecture and the smooth operation of the Internet.
Another important technical governing body is the World Wide Web Consortium (W3C). The
mission of the W3C is "To lead the World Wide Web to its full potential by developing protocols and
guidelines that ensure long-term growth for the Web."
The growth and distribution of users around the world have given rise to new questions and
challenges about the structure and control of the Internet.
One issue is the exclusive use of Latin characters in domain names and URLs. Given the fact
that in years to come, the majority of the users on the Internet will be Asian, the use of Latin characters
could become problematic (BBC News Article, Oct 11, 2006).
In addition, there is some concern about the control the United States has over domain names
and other important constituents of the network. This is especially true in today's political climate in
the US in which concerns about national security sometimes override those of personal privacy and
freedom.
As a result, in November 2005, at the World Summit on the Information Society in Tunis, the
Secretary General of the United Nations was asked to convene a new forum for multi-stakeholder
policy dialogue called the Internet Governance Forum (IGF). The first general meeting of the IGF took
place in Athens, Greece, from October 30 to November 2, 2006. The major issues discussed in that meeting
were Openness, Security, Diversity and Access. The efforts of this new governing body will be worth
watching.
Who Owns The Internet?
Ownership of the internet is a complicated issue. In theory, the internet is owned by everyone
who uses it. Yet, in reality, certain entities exert more influence over the "mechanics" and regulation of
the internet than others. To understand the notion of ownership, one must understand the backbone of
the internet: the Domain Name System. As the internet continues to become a larger component of
education, teachers need to be aware of the political, commercial, and public influences affecting the
internet. The internet opens the door to new horizons of curriculum development, communications,
research, and resources to support education. For educators, the Domain Name System has the
potential to provide direction and simplification of internet resources. The following issues will be
examined in this discussion of ownership:

Domain Name Systems
Control of Domain Name Systems
Conflicts and Inequities in the Domain Name System
Relevance to Education
Domain Name Systems:
The Domain Name System (DNS) is the address system of the internet. It facilitates the users'
ability to navigate with the aid of the domain name and a corresponding Internet Protocol (IP) number.
Each domain name is linked to a unique IP address.
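As a minimal sketch of this link between names and numbers (assuming Python's standard socket library; the domain www.example.com is only an illustrative example), a name can be resolved to its IP address as follows:

import socket

# Resolve a domain name to the unique IP address it is linked to.
# www.example.com is an illustrative domain, not a recommendation.
ip_address = socket.gethostbyname("www.example.com")
print(ip_address)   # prints a dotted-decimal IP address, e.g. 93.184.216.34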
The DNS is divided into categories called top level domains. The top level domains are
subdivided into generic top level domains (gTLD) and country-code top level domains (ccTLD). (WIPO
Internet Domain Name Process) Within the gTLD there are presently seven domains: .com,
.net, .org, .int, .gov, .edu, and .mil. The first three (.com, .net, .org) are considered open domains since
there are no restrictions on who may register names within these domains. The other four (.int, .gov,
.edu, and .mil) are restricted, and only qualifying individuals or groups may register names within these
domains. The domain .int is restricted to use by international organizations; .edu is restricted to use by
four-year colleges and universities; .gov is restricted to use by agencies of the federal government of
the United States of America; and .mil is restricted to use by the military of the United States of
America.
Country code top level domains are two letter designations assigned to individual countries. For
example, Canada is given the domain .ca and Italy is given the domain .it. From a functional standpoint,
ccTLDs and gTLDs are the same. They both provide the same connectivity. The ccTLDs are governed
by the entity that owns the domain and can be restricted or open depending on the individual entity's
rules.
Control of Domain Name Systems:
In the early stages of the development of the WWW, the National Science Foundation (NSF)
was given the task of managing the major internet backbone, the NSFNET. NSF's original purpose was
to promote research and education at the university level. (Webopedia) The NSF was quickly
overwhelmed with this task and passed the job on to the private and commercial sector. In 1993, NSF
granted exclusive rights to Network Solutions Incorporated (NSI) for the registration and management
of the top level domains .com, .org, .net, and .edu. (NSF-InterNIC)
Essentially, NSI was granted a monopoly of the public open domains. This monopoly gave NSI
and the United States government significant control of the internet. This agreement between NSI and
the United States remained until April of 1999, at which time the Internet Corporation for Assigned
Names and Numbers announced that five additional companies would be selected to participate in an
initial two-month test period of a new shared registration service. (ICANN) Additional plans are in the
works to allow 29 companies to compete for domain name registration rights. This plan is the first step
in achieving President Clinton's goal of privatizing the management of domains while
providing competition for domain name services. (US Department of Commerce News)
One of the major criticisms of the Domain Name System has been that the United States
government and NSI have had too much control. Other nations have had to rely on country code
domains or purchase generic top level domains through NSI. This attempt to open up the registry to
competing organizations helps to decentralize some of the control that NSI and the United States have
held for several years. Even though the U.S. has maintained this level of control, it by no means can be
said that it owns the internet. The internet does not have a central point of authority or control and
therefore cannot be said to be owned by any individual group. This autonomy offers great freedoms yet
can also be a source of concern.
Conflicts and Inequities in the Domain Name System:
A major concern of many individuals is the limited number of open top level domains. Currently
there are only three (.com, .org, .net). (Domain Name Registration) The criticism is that anyone should
be allowed to create a top level domain. The success of these domains would be determined by
what the internet market dictates. (Green Paper) The addition of these top level domains could lead to a
more descriptive system of classification. For example, the domain .arts would be a clear indication that
the site located at this domain would include information pertaining to the arts. As the system stands
today, it is difficult to tell what an individual might find at a given domain based on the domain name
alone.
The counter argument is that allowing anyone to create a domain would lead to chaos and to
problems at the root server level. Furthermore, it would be more difficult for companies to
protect their trademarks if there were a large number of top level domains. A trademark dilemma has
already begun with only three domains. For example, the domain peta.org was originally registered to a
parody organization known as People Eating Tasty Animals. In the real world, PETA is an acronym for the
organization People for the Ethical Treatment of Animals. As a result of this controversy, the Domain
Name Rights Coalition was founded to provide protection to organizations wishing to protect
trademarks. (Whose Internet Is It Anyway?) A great debate continues on whether a company that holds
a trademark in the real world has exclusive rights to this trademark in cyberspace. The US Department
of Commerce states: "For cyberspace to function as an effective commercial market, businesses must
have confidence that their trademarks can be protected. On the other hand, management of the
Internet must respond to the needs of the Internet community as a whole, and not trademark owners
exclusively. The balance we strike is to provide trademark owners with the same rights they have in the
physical world, to ensure transparency, to guarantee a dispute resolution mechanism with resort to a
court system, and to add new top-level domains carefully during the transition to private sector
coordination of the domain name system." (Green Paper) Even with these reservations, seven new
domains are proposed to be added in the near future. (BBC News)
Relevance to Education:
Educators must be aware of domains and what may lie in each type of domain. As the Domain
Name System stands today, this is not always a clear issue. The .com, .net, and .org domains can be
misleading in that any type of information can be found housed at these sites. Sites in these domains
need to be examined carefully before they are used with students in a classroom setting. That is not to
say that commercial sites are unworthy of educational use; they just need to be previewed carefully.
Restricted domains such as .edu and .gov are less misleading in that certain restrictions apply to who
may post information on these sites. However, even with these restrictions teachers must take time to
carefully examine the site. In addition, teachers need to inform students about domains as well. For
example, it is important for students to realize that .com is a commercial site and .edu is a higher
education site if they are to evaluate the type and validity of information presented.
From an educator's standpoint, a more descriptive domain system would be a welcome addition
to the internet community. New domains that accurately describe the nature of the site would greatly
benefit teachers and students alike. For such a system to work, a rigorous organizational structure must
be put into place to ensure that sites are registered into their proper domains. Until such a system
exists, however, teachers and students need to continue to search through numerous sites seeking
information.
ANATOMY OF INTERNET
Anatomy refers to the study of structure. While studying the structure of the Internet, it can be
broadly said that the Internet is a network of computer networks. The Internet brings these
computers together through communication media and protocols. It also enables the computers to
communicate with one another.
A study of the Internet outlines the following major components of the Structure:
Internet Services
Elements of the Internet
Uniform Resource Locators
Internet Protocol

INTERNET SERVICES
The Internet is a combination of many networks and a large number of databases and other
services. Most of the services can be accessed by using a Web browser such as Netscape Navigator,
Microsoft Internet Explorer or Mozilla Firefox. The major services offered on the Internet are given
below:

Internet Service | Started | Description

E-Mail (Electronic Mail) | 1970 | E-mail is the most common service of the Internet.
FTP | 1973 | It is designed for uploading and downloading larger files on the Internet.
Newsgroups (Usenet) | 1979 | Usenet is a public messaging and bulletin board system. It comprises more than 34,000 individual forums, and each one pertains to a specific topic.
Mailing Lists | 1981 | Mailing lists are a group-based messaging service. Once subscribed, we receive mailing list messages via a standard e-mail account. There are currently over 90,000 Internet mailing lists.
World Wide Web | 1992 | This service features user-friendly publishing of multimedia documents and files. Web pages are created using HTML, JavaScript, Java and many more.
ELEMENTS OF THE INTERNET
A simplified hierarchical model of the Internet includes client PCs, server computers and
networks (composed of both clients and servers).

Client PCs - These are the computers that request information from servers. Client computers
typically maintain intermittent (part-time) connections. If our personal computer has access to
the Internet, it is categorized as a client computer.
A web client is the requesting program associated with the user. The web browser on our
computer is a client that requests HTML files from the web server.
Server Computers - A server is a computer that holds the files for one or more web sites. These are
relatively powerful computers with a persistent (full-time) Internet connection and can provide
data to multiple client computers simultaneously.
Networks - These are composed of one or more server computers and multiple client PCs.
Nodes - Node is a generic term used to describe a client, server, or network (composed of clients
and servers).
UNIFORM RESOURCE LOCATORS
Uniform resource locators (URLs) are the unique addresses of Internet resources. A
URL is divided into four parts.
Transfer Protocol
Server Name
Directory Path
File Name
INTERNET PROTOCOL (IP)
IP is a method by which data is sent from one computer to another over the network.
Each computer that is connected to the Internet has at least one IP address, which uniquely
identifies it among all other computers. Other protocols in the suite are used on top of IP to
send the data.
INTERNET SERVICE PROVIDERS
Internet service providers are companies that provide access to the Internet. An
Internet Service Provider (ISP) is our gateway to the Internet. In most cases, we connect to an
ISP's modem over a standard telephone line or wirelessly. Our modem connects to a single
modem among a bank of modems at our ISP. This is called a dial-up connection. Users within
corporations and large organizations typically connect to an ISP via a high-speed link (typically
over fiber optic cabling) called a direct connection.
TRANSMISSION CONTROL PROTOCOL / INTERNET PROTOCOL (TCP/IP)
TCP/IP is a protocol suite that is used to transfer data over the Internet. The two
main protocols in this protocol suite are:
TCP : It forms the higher layer of TCP/IP and divides a file or message into smaller packets, which are
then transmitted over the Internet. Following this, a TCP layer receives these packets on the other side
and reassembles them into the original message.
When two computers seek a reliable communication between each other, they establish a
connection. This is analogous to making a telephone call. If we want to speak to somebody, anywhere, a
connection is established when we dial their phone number and they answer. The TCP guarantees that
data sent from one end of the connection actually reaches the other end in the same order in which it
was sent. Otherwise, an error is reported.
IP : It forms the lower layer of the protocol suite. The address part of all the packets is handled by it
in such a manner that they reach the desired destination. Usually, this address is checked by each
gateway computer on the network to identify where the message is to be forwarded. This implies that
all the packets of a message are delivered to the destination regardless of the route used for delivering
the packets.
It is the basic protocol of the Internet. The task of the IP is to send a packet from one computer to
another. The IP does not verify that the packet really reaches its destination, nor is it the task of the IP
to make sure that it arrives error-free and in the correct order.
The working of TCP/IP can be compared to shifting our residence to a new location. It involves
packing our belongings in smaller boxes for easy transportation, with the new address and a number
written on each of them. We then load them on multiple vehicles. These vehicles may take different
routes to reach the same destination. The delivery time of vehicles depends on the amount of traffic
and the length of the route. Once the boxes are delivered to the destination, we check them to make
sure that all of them have been delivered in good shape. After that, we unpack the boxes and
reassemble our house.
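As a rough sketch of this reliable, ordered delivery at the programming level (assuming Python's standard socket library; the host example.com and port 80 are illustrative, not prescribed by TCP/IP), a TCP connection can be opened and read like this:

import socket

# Establish a TCP connection (analogous to dialling a telephone number),
# send a small request and read back the reliably ordered reply stream.
with socket.create_connection(("example.com", 80), timeout=10) as conn:
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:          # the server has closed the connection
            break
        reply += chunk
print(reply.decode(errors="replace"))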
HYPERTEXT TRANSFER PROTOCOL
HTTP is a protocol that transfers files (text, image, video, sound and other multimedia files) over
the Internet. It runs on top of the TCP/IP protocol suite and is an application protocol that forms the
foundation of the World Wide Web. It assists in defining how messages are transmitted and formatted,
and specifies the actions that Web browsers and Web servers must engage in while responding to the
issued commands. HTTP is based on a client/server architecture where the Web browser acts as an
HTTP client making requests to the Web server machines. In addition to Web pages, a server machine
contains an HTTP daemon that handles the Web page requests. Typically, when a user clicks on a
hypertext link or types a URL, an HTTP request is built by the browser and sent to the IP address
specified in the URL. This request is then received by the HTTP daemon on the destination server,
which, in response, sends back the Web page that is requested.
HTTP is a stateless protocol, which means each request is processed independently without
any knowledge of the previous request. This is why server side programming languages such as JSP,
PHP and ASP.NET have gained popularity.
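A minimal sketch of one such stateless request/response exchange, assuming Python's standard http.client module (the host name and path below are illustrative), looks like this:

from http.client import HTTPConnection

# Build a GET request, send it to the server's HTTP daemon and read the response.
connection = HTTPConnection("www.example.com", 80)
connection.request("GET", "/index.html")
response = connection.getresponse()
print(response.status, response.reason)   # e.g. 200 OK or 404 Not Found
page_body = response.read()               # the Web page sent back by the server
connection.close()

Because HTTP is stateless, the server retains nothing about this exchange once the response has been sent; any notion of a "session" has to be rebuilt on top of it, which is one reason the server side languages mentioned above are popular.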
FILE TRANSFER PROTOCOL (FTP)
FTP is an application protocol that allows files to be exchanged between computers through the
Internet. It is the simplest protocol for downloading / uploading a file from / to a server, and is therefore
also the most commonly used one.
Numerous FTP servers all over the world allow users anywhere on the Internet to log in and
download files placed on them. FTP also works on a client / server architecture, where an FTP client
program is used to make a request to an FTP server. FTP can be used with a simple command line
interface such as the MS-DOS prompt or with a commercial program that comes with a graphical user
interface. Typically, a login to an FTP server is needed for this purpose. However, anonymous FTP can
be used to access files that are publicly available.
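A short sketch of an anonymous FTP session, assuming Python's standard ftplib module (the server ftp.gnu.org and the directory /gnu are only illustrative examples of a publicly available server):

from ftplib import FTP

# Log in anonymously to a public FTP server and list a directory.
ftp = FTP("ftp.gnu.org")     # illustrative public server
ftp.login()                  # no user name or password: anonymous FTP
ftp.cwd("/gnu")              # change to a publicly readable directory
print(ftp.nlst()[:5])        # show the first few file and directory names
ftp.quit()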
IP ADDRESS
Since the Internet consists of a large number of computers connected to each other, it requires
a proper addressing system to uniquely identify each computer on the network. Each computer
connected to the Internet is associated with a unique number and / or a name called the computer
address. Before we can access any Web page on a computer, we require the computer address.
An IP address is a unique number associated with each computer, making it uniquely
identifiable among all the computers connected to the Internet. This is a 32-bit number and is divided
into four octets. For human readability, it is represented in decimal notation, separating each octet
with a period. The first octet can range from 0 to 223, whereas the other three octets can range from
0 to 255. Each address consists of a network number, an optional sub-network number, and a host number.
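The relationship between the single 32-bit number and its four dotted-decimal octets can be seen with Python's standard ipaddress module (the address below is just an illustrative private-range example):

import ipaddress

# An IPv4 address is one 32-bit number written as four 8-bit octets.
address = ipaddress.IPv4Address("192.168.10.25")
print(int(address))            # the same address as a single 32-bit integer
print(list(address.packed))    # the four octets: [192, 168, 10, 25]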
UNIFORM RESOURCE LOCATOR
A URL refers to an address on the Internet. Billions of documents and multimedia files can be
accessed on the Internet through their URLs. The URL contains the following information:
The type of service the resource is served by (HTTP, FTP, etc.).
The Internet name of the site containing the resource (document data).
The Internet port number of the service. (If this is omitted, the browser assumes a
commonly accepted default value.)
The location of the resource in the directory structure of the server.
STRUCTURE OF A URL
The following is an outline of the most common form of a URL:
http://www.address.edu:1234/path/subdir/file.htm
service: http | host: www.address.edu | port: 1234 | file and resource details: /path/subdir/file.htm
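As a small sketch (assuming Python's standard urllib.parse module; nothing in the URL format requires it), the example URL above can be split into these same parts:

from urllib.parse import urlparse

# Split the example URL into service (scheme), host, port and resource details.
parts = urlparse("http://www.address.edu:1234/path/subdir/file.htm")
print(parts.scheme)     # 'http' - the transfer protocol / service
print(parts.hostname)   # 'www.address.edu' - the server name
print(parts.port)       # 1234 - the port number
print(parts.path)       # '/path/subdir/file.htm' - directory path and file name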
SERVICE
The first part of the URL is the service specifier (here, the http service), which identifies the
protocol to be used. This is the part before the colon.
ADDRESS AND PORT NUMBER
The second part is usually the Internet address of the server, indicated by the double forward
slash (//). This address can also contain the (optional) port number that the service listens at. If we want
to use the default port number, we can leave out both the colon and the number, that is, //www.address.edu/.
Commonly, a web server's name will begin with www, for World Wide Web. The .edu suffix (called a
domain indicator) indicates that the address belongs to a school or university.
RESOURCE LOCATION
The forward slash after the host and port specifications indicates the end of the address and the
beginning of the specification for the file/resource to be accessed. The resource is specified by a path
relative to the root directory of the server.

ABSOLUTE URL
A fully qualified URL that specifies the location of a resource that resides on the Internet is
called an absolute URL. It is the complete path, including the domain and file name. Example:
http://www.ibdhost.com/images/logo.gif specifies an image file (logo.gif) located in the images directory
of the www.ibdhost.com domain. This type of URL is what you must use when you want to link to (or
load) a file that is on another server.
RELATIVE URL
A partially qualified URL is the one that specifies a resource on the Internet whose location is
relative to a starting point specified by an absolute URL. In fact the concatenated absolute and relative
URLs constitute a complete URL.
The relative URL points to a file or directory in relation to the present file or directory (folder).
Relative URLs help in web site maintenance. It is easy to move a file from one directory (folder) to
another, or a web site from one domain name to another. We don't have to worry about updating the
link(s) or the src (img) path(s).
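A brief sketch of this concatenation, assuming Python's standard urljoin function (the base page pages/index.htm on www.ibdhost.com is a hypothetical example, not one used above):

from urllib.parse import urljoin

# Combine an absolute (base) URL with a relative URL to form the complete URL.
base = "http://www.ibdhost.com/pages/index.htm"   # hypothetical base page
print(urljoin(base, "images/logo.gif"))    # http://www.ibdhost.com/pages/images/logo.gif
print(urljoin(base, "/images/logo.gif"))   # http://www.ibdhost.com/images/logo.gif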
WORLD WIDE WEB
The World Wide Web is more popularly known as www or, loosely, the Internet. The World Wide Web
(www) is a huge collection of hypertext pages on the Internet. The concept of the www was started by
Tim Berners-Lee, who developed it in Switzerland at the European Particle Research Center (known as
CERN) in the year 1989. The first text-based prototype was operational in 1991. In the month of
December 1991, a public demonstration was given at the Hypertext '91 conference in San Antonio,
Texas (USA). In the year 1993, the first graphical interface software package, called Mosaic, was
released. In the first year after Mosaic was released, the number of www servers grew from 100 to 7,000.
All the Web servers on the Internet are collectively referred to as the World Wide Web. The W3
Consortium is the closest anyone gets to setting the standards for and enforcing rules about the World
Wide Web. We can visit the Consortium's home page at http://www.w3.org/. The second group of
organizations that influence the Web is the browser developers themselves, most notably Netscape
Communications Corporation and Microsoft Corporation of the USA.
BASIC FEATURES OF WEB
The Web is one of the most flexible and exciting tools for surfing the Internet. Using the Mosaic
viewer, the www made it possible for a site to set up a number of pages of information containing text,
pictures, sound and even video, with embedded links to other pages. By clicking on a link, the user is
moved to the page pointed to by that link.
Hypertext Information System
The idea behind hypertext is that instead of reading text in a rigid, linear structure, we can skip
easily from one point to another. We can get more information, go back, jump to other topics, and
navigate through the text based on what interests us at the time on the World Wide Web.
Graphical and Easy to Navigate
One of the best features of the Web is its ability to display both text and graphics in full colour
on the same page. Before the Web, using the Internet involved simple text-only connections or
complicated interfaces or encodings to view graphics.
The Web now provides capabilities for graphics, sound and video to be incorporated with the
text. Newer Web browsers include capabilities for multimedia and embedded applications. More

11
Internet

importantly, the interface to all this is easily navigable: just jump from link to link, from page to page,
across sites and servers.
Cross-platform
Cross-platform means that we can access Web information equally well from any computer
hardware running any operating system using any type of display. The World Wide Web is not limited to
any one kind of machine, or developed by any one company. The Web is entirely cross-platform.
The Web is Distributed
The Web is successful in providing so much information because that information is distributed
globally across thousands of Web sites, each of which contributes the space for the information it
publishes. We, as consumers of that information, go to that site to view the information. When our work
is done, we can go to some other site. We do not have to install anything, or change disks, or do
anything other than point our browser at that site.
The Web is Dynamic
Because information on the Web is contained on the site that published it, the people who
published it in the first place can update it at any time. If we are browsing that information, we do not
have to install a new version of the help system, buy another book or call technical support to get
updated information.
Accessing Many Forms of Internet Information
There are dozens of different ways of getting information on the Net, namely FTP, Gopher,
Usenet news, Telnet and e-mail. Before the Web became as popular as it is now, to get to these
different kinds of information we had to use different tools for each one, all of which had to be installed
and all of which used different commands. Web browsers, namely Internet Explorer and Netscape
Navigator, have changed all this.
The Web is Interactive
Interactivity is the ability to talk back to the Web server. The Web is interactive: the act of
selecting a link and jumping to another Web page takes us somewhere else on the Web. In addition to
this simple interactivity, the Web also enables us to communicate with the publisher of the pages we
are surfing.
In addition to forms, advanced features of Web development provide more facilities. For
example, Java enables us to include entire programs and games inside Web pages. Developments in
3D worlds enable us and our readers to browse the Web as if they were wandering through real three-
dimensional rooms and meeting other people.
WWW BROWSERS
A Web browser is a program we use to view pages on the Net and navigate the World Wide Web. A
wide range of Web browsers is available for every type of system we can imagine, including GUI and
CUI browsers for dial-up UNIX connections. Most of the browsers are freeware. The most common
browsers for the World Wide Web are Netscape's Navigator, developed by Netscape Communications
Corporation, and Internet Explorer, developed by Microsoft Corporation.
Retrieving documents from the Web and formatting them for our system are the two tasks that
make up the core of a browser's functionality. However, depending on the browser we use and the
features it includes, we may also be able to play multimedia files, view and interact with Java Applets,
read our mail or use other advanced features that a particular browser offers.

A Web browser provides the following two types of services:
Given a URL address, it should be able to access that information. For hypertext Web documents, this
means that the browser must be able to communicate with the Web server using the HTTP protocol.
The Web can also manage information contained on FTP and Gopher servers, in Usenet news
postings, in e-mail, and so on, so browsers can often communicate with those servers or protocols as well.
SOME POPULAR WEB BROWSERS
Netscape Navigator
Netscape Navigator is available for Windows, Macintosh and many different versions of UNIX
running the X Window System. It is well supported and provides up-to-the-minute features, including an
integrated news and mail reader, support for Java Applets, and the ability to handle plug-ins for more
new and interesting features.
Microsoft Internet Explorer
Microsoft's browser, Internet Explorer, is usually just called Explorer. It runs on all versions of the
Windows OS and on Macintosh. It is free for downloading from Microsoft's Web site.
Mozilla Firefox 2.0
It is a fast, full-featured Web browser that makes browsing very efficient. Firefox includes pop-up
blocking, tabbed browsing, integrated Google searching, simplified privacy controls that let us cover our
tracks more effectively, a streamlined browser window that shows us more of the page than any other
browser, and a number of additional features that help us get the most out of our time online. It is also
available free.
Web Page
The name says it all - a web page is a page on the web. It's a digital document, a computer
file, if you like, that is typically viewed using a web browser program. In the interconnected network of
documents and web sites which we call the World Wide Web (WWW), the linking is actually between
web pages and, hence, I like to consider them as the building blocks of the web.
How are web pages made?
Web pages are created using HTML which stands for Hypertext Markup Language. All web
pages, whether big or small, have to be developed in HTML to be displayed in web browsers. HTML,
contrary to its name, is not a language per se. Rather, it consists of tags that specify the purpose of
what they enclose. For instance, surrounding a block of text on a web page with the <p> tag (the
paragraph tag) tells the browser that all that text is to be placed as a paragraph, while using the <em>
tag around a phrase will give emphasis to it. Most HTML tags come in pairs - there is an opening tag and a
closing tag. Here is an example in which I've used the two tags mentioned above - paragraph and
emphasis:
<p>This is a simple paragraph on a web page using HTML paragraph tags that come in a pair -
the opening tag specifies the beginning while the ending tag marks the end. Note the ending tag has a
forward slash.
You can also <em>emphasize some text</em> inside a paragraph.</p>
The above is displayed on a web page as (without the yellowish background color and the
dotted border, of course):

This is a simple paragraph on a web page using the HTML paragraph tags that come in a pair -
the opening tag specifies the beginning while the ending tag marks the end. Note the ending tag has a
forward slash. You can also emphasize some text inside a paragraph.
What do web pages contain?
A web page's digital file consists only of text and HTML code. It is read by a web browser,
parsed and displayed to the user based on the instructions in HTML tags. Though you'll see multimedia
content such as images, audio and video or Flash animations on a web page when it's shown in a
browser window, the web page file itself doesn't contain this information. The multimedia files are
separate and are included in the web page through HTML tags.
A bunch of web pages with their multimedia files makes a web site. Having said that, a web site
can also have just one web page - the home page of the web site - and nothing else.
How do you create a web page?
As mentioned above, web pages are developed with HTML which is quite simple to learn and
you'll find a basic as well as an advanced HTML tutorial on this web site. To learn HTML and start
creating web pages you don't need any specialized software - you already have what you will require,
which is a plain text editor such as Notepad (in Windows) and a web browser to review your work. But,
and here is the good news, if you don't have the time or the inclination to study HTML you can still
create web pages - and fairly quickly, if I may add - using WYSIWYG HTML editors. WYSIWYG stands
for What You See Is What You Get and in these editors you simply place elements - text, images etc. -
and the program will automatically "write" the HTML code for you. There are tons of HTML editors
available - some are simple to pick up while others have a steeper learning curve; some are free while
others cost a pretty penny. All in all, it's not a bad idea to try your hand at a free WYSIWYG HTML
editor and see where you go from there. I also recommend reading the advantages and disadvantages
of using HTML editors for creating web pages.
