Global Communication Technique and Media


Global communication is the process of transmitting and receiving information on a worldwide scale. People have been communicating on a global scale for centuries (Lubbers & Koorevaar, 2000). Isolation and distance have always been factors in the study of Australian history (Wiseman, 1998), and until recent times it was difficult to communicate with other countries: distance, time, and language barriers were major restrictions. With the evolution of technology, however, global communication has become easier, faster, clearer and more effective (Lubbers & Koorevaar, 2000). The evolution of global communication is closely linked to the evolution of technology. As new creations such as the internet are continually invented, improved and converged with other products, they enable new modes of interaction.

Modern communication technology depends upon the computer, a device made up of electronic and electromechanical components. A number of communication channels report both the wide range of possible uses of data communication and the various new technologies in the communication industries. Oral or voice media in electronic form include voice mail, audio tape and video tape, teleconferencing and video conferencing, closed-circuit television, and instant messaging. Written media in electronic form include e-mail, fax, computer conferencing and websites.

The use of the internet and other technologies such as e-mail, voice mail and fax enables effective organizational communication through computers.


Electronic mail, commonly called email or e-mail, is a method of exchanging digital messages across the Internet or other computer networks. Email systems are based on a store-and-forward model in which email server computer systems accept, forward, deliver and store messages on behalf of users, who only need to connect to the email infrastructure, typically an e-mail server, with a network-enabled device for the duration of message submission or retrieval. Originally, email was transmitted directly from one user’s device to another user’s computer, which required both computers to be online at the same time.
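The store-and-forward model described above can be sketched as a toy simulation: a server accepts and stores messages on behalf of users, who connect only long enough to submit or retrieve mail. The class and names here are invented for illustration, not taken from any real mail system.

```python
# Toy sketch of the store-and-forward email model: the server holds
# messages until the recipient connects to collect them.
from collections import defaultdict

class MailServer:
    def __init__(self):
        self.mailboxes = defaultdict(list)  # user -> stored messages

    def accept(self, sender, recipient, body):
        # The server stores the message; the recipient need not be online.
        self.mailboxes[recipient].append((sender, body))

    def retrieve(self, user):
        # The recipient connects later and collects whatever accumulated.
        messages, self.mailboxes[user] = self.mailboxes[user], []
        return messages

server = MailServer()
server.accept("alice", "bob", "Meeting at noon")
server.accept("carol", "bob", "Report attached")
print(server.retrieve("bob"))  # both messages, delivered on demand
print(server.retrieve("bob"))  # mailbox now empty
```

The point of the sketch is the decoupling: unlike the original direct-transfer scheme, sender and recipient never need to be online at the same time.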

The @ symbol: Ray Tomlinson chose the @ symbol to tell which user was “at” what computer. The @ goes between the user’s login name and the name of his/her host computer.

Dave Crocker

It soon became obvious that the ARPANET was becoming a human- communication medium with very important advantages over normal U.S. mail and over telephone calls. One of the advantages of the message systems over letter mail was that, in an ARPANET message, one could write tersely and type imperfectly, even to an older person in a superior position and even to a person one did not know very well, and the recipient took no offense. The formality and perfection that most people expect in a typed letter did not become associated with network messages, probably because the network was so much faster, so much more like the telephone.

J.C.R. Licklider, Albert Vezza, Applications of Information Networks, Proc of the IEEE, 66(11), Nov 1978.

Who invented email?

Electronic mail is a natural and perhaps inevitable use of networked communication technology that developed along with the evolution of the Internet. Indeed, message exchange in one form or another has existed from the early days of timesharing computers. Network capable email was developed for the ARPANET shortly after its creation, and has now evolved into the powerful email technology that is the most widely used application on the Internet today. Key events and milestones in the invention of email are described below:

Timesharing computers. With the development in the early 1960’s of timesharing computers that could run more than one program at once, many research organizations wrote programs to exchange text messages and even real-time chat among users at different terminals. As is often the case, more than one person at the same time noticed that it was a natural use of a new technology to extend human communications. However, these early systems were limited to use by the group of people using one computer.

SNDMSG & READMAIL. In the early 1970’s, Ray Tomlinson was working on a small team developing the TENEX operating system, with local email programs called SNDMSG and READMAIL. In late 1971, Tomlinson created the first ARPANET email application when he updated SNDMSG by adding a program called CPYNET capable of copying files over the network, and informed his colleagues by sending them an email using the new program with instructions on how to use it.

To extend the addressing to the network, Tomlinson chose the “commercial at” symbol to combine the user and host names, providing the naturally meaningful notation “user@host” that is the standard for email addressing today. These early programs had simple functionality and were command line driven, but established the basic transactional model that still defines the technology — email gets sent to someone’s mailbox.
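The user@host convention can be illustrated with a minimal parser: split the address at the last “@”, so everything before it names the mailbox and everything after names the host. The function and the sample address below are made up for illustration.

```python
# Minimal sketch of parsing the "user@host" notation introduced by
# Tomlinson: the last "@" separates the mailbox name from the host name.
def split_address(address):
    user, sep, host = address.rpartition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a user@host address: {address!r}")
    return user, host

print(split_address("tomlinson@bbn-tenexa"))  # ('tomlinson', 'bbn-tenexa')
```

Splitting on the *last* “@” matters: it is the host part, added on the right, that extends a local mailbox name to the network.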

MAIL & MLFL. In 1972, the commands MAIL and MLFL were added to the FTP program (RFC 385) to provide standard network transport capabilities for email transmission. FTP sent a separate copy of each email to each recipient, and provided the standard ARPANET email functionality until the early 1980’s when the more efficient SMTP protocol was created. Among other improvements, SMTP enabled sending a single message to a domain with more than one addressee, after which the local server would locally copy the message to each recipient.
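SMTP’s efficiency gain over per-recipient FTP copies — one transmission per destination domain, with local copying at the far end — amounts to grouping recipients by host. A simplified sketch (the addresses are invented for illustration):

```python
# Sketch of SMTP's delivery optimization: group recipients by destination
# host so one copy of the message is relayed per domain, rather than one
# copy per recipient as under the earlier FTP-based scheme.
from collections import defaultdict

def recipients_by_domain(addresses):
    groups = defaultdict(list)
    for addr in addresses:
        user, _, host = addr.rpartition("@")
        groups[host].append(user)
    return dict(groups)

rcpts = ["ann@example.edu", "bob@example.edu", "cho@other.org"]
print(recipients_by_domain(rcpts))
# one transmission to example.edu (two local copies), one to other.org
```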

RD. The Director of ARPA, Steve Lukasik, asked Lawrence Roberts, then the director of the IPTO, to improve on READMAIL, which required messages to be read in order, with no ability to save or reply. Roberts wrote the new program in one three-day weekend as a collection of macros in the TENEX text editor TECO (Text Editor and Corrector), and called it RD.

The new program included capabilities to sort email headers by subject and date, giving users the ability to order the messages in their Inbox, and to read, save, and delete messages in the order they wished. In a sign of the pragmatism associated with much of the email development over the years, RD was developed not as a research effort, but as a practical effort to solve a real-world problem of email management.

NRD. The DARPA researcher Barry Wessler improved on RD, adding several new usability features, and called the new program NRD.

WRD / BANANARD. Marty Yonke recoded SNDMSG and NRD into an independent program called WRD. This was the first program to integrate reading, sending, and a user-friendly help system in the same application, and was later renamed BANANARD.

MSG. John Vittal improved on BANANARD and called the new program MSG, with powerful features like message forwarding, a configurable interface, and an Answer command that automatically created properly addressed replies. MSG can fairly be called the first modern email program.

Dave Crocker (see MS below) feels the Answer command was revolutionary: “My subjective sense was that propagation of MSG resulted in an exponential explosion of email use, over roughly a 6-month period. The simplistic explanation is that people could now close the Shannon-Weaver communication loop with a single, simple command, rather than having to formulate each new message. In other words, email moved from the sending of independent messages into having a conversation.”

MS / MH. In 1975, the DARPA program manager Steve Walker initiated a project at RAND to develop an MSG-like email capability for the Unix operating system. The project was undertaken by Dave Farber, professor at the University of California at Irvine. Dave Crocker, starting graduate school at the University of Southern California’s Annenberg School, designed the functional specifications, and Steve Tepper and Bill Crosby did the programming.

The resulting system supported multiple user interfaces, from the basic Unix email command to the MSG interface, and was called MS. Crocker comments: “The program was very powerful, and very, very slow.” A follow-on project at RAND rebuilt the program to take more advantage of the Unix system environment, breaking the commands out into individual programs that ran in individual Unix shells. Bruce Borden did most of the programming, and named the resulting application MH as an abbreviation of Mail Handler. Since 1982, Marshall Rose and others have upgraded and maintained MH, and it has become the standard email application for the UNIX environment.

RFC 733. In 1977, Crocker, John Vittal, Kenneth Pogran, and D. Austin Henderson collaborated on a DARPA initiative to collect various email data formats into a single, coherent specification, resulting in RFC 733.

The specification combined existing documentation with a bit of innovation, and was the first RFC explicitly declared an Internet standard in order to try and bring some order to the various email formats in use across the ARPANET — an effort not initially greeted with universal approval among the independent, distributed research community. In 1982, Crocker revised RFC 733 to produce RFC 822, which was the first standard to describe the syntax of domain names.

MMDF. In 1978, Crocker followed Dave Farber to the University of Delaware, where they took on a project for the U.S. Army Materiel Command (AMC) to develop a capability to relay email over dial-up telephone lines for sites that couldn’t connect directly to the ARPANET. Crocker created the first version of what would become the Multi-purpose Memo Distribution Facility (MMDF) in six months of work, and then set up and operated an experimental relay site at the University of Delaware for various AMC sites.

The MMDF link-level protocol was developed by Ed Szurkowski. Several others worked on the software after Crocker left, including Doug Kingston, Craig Partridge, and Steve Kille, developing enhancements such as a robust TCP/IP layer. Kille adapted the software to support the ISO/CCITT OSI X.400 email standard, one of the first systems to do so, naming the software “PP” after “Postman Pat”, British vernacular for the local postal delivery person. MMDF was also deployed to provide the initial email relay capability for the CSNET.

Sendmail. In the early 1980’s, email relaying was also being performed using the simple UUCP technology at the University of California at Berkeley, where the BSD Unix operating system was developed. Eric Allman created a program called delivermail to cobble together multiple email transport services, creating, in effect, a switch rather than an integrated email store-and-forward capability. Allman then built on this experience to create the sendmail program, which was distributed with BSD Unix and has gone on to become the most commonly used SMTP server on the Internet.

Commercial Email. In 1988, Vinton Cerf arranged for the connection of MCI Mail to the NSFNET through the Corporation for National Research Initiatives (CNRI) for “experimental use”, providing the first sanctioned commercial use of the Internet. Shortly thereafter, in 1989, the CompuServe mail system also connected to the NSFNET, through the Ohio State University network.

Online Services. In 1993, the large network service providers America Online and Delphi started to connect their proprietary email systems to the Internet, beginning the large scale adoption of Internet email as a global standard.

Advantages of E-mail

1. Email enables speedy communication

2. Email provides the receiver an option to respond immediately

3. Email saves tons of trees daily (Alternative to papers)

4. Email saves tons of fuels daily (Freight vehicle fuels)

5. User enjoys sending colorful and attractive messages using HTML

6. Businesses enjoy no-cost or low-cost communication

7. A new business opportunity for email service providers through Ads

8. A free communication media for users

9. Faster delivery and richer media presentation for the receiver

10. Unlike a telephone call or other real-time media, which require the receiver to be engaged at that moment, email lets you reach a person who can then respond when free

11. e-mail is a low-cost way to transmit messages

12. e-mail messages can be archived and searched, making them easy to locate

13. it can be a secure means of sending messages when properly safeguarded

14. it eliminates the need for conventional surface mail

15. you can mark a message as high or low priority to signal its urgency to the receiver

Disadvantages of E-mail

1. Lack of computer knowledge among people means you cannot assume all receiving parties use an email system

2. Unwanted SPAM emails

3. Illegal content, including viruses, can damage end users’ systems, data and reputation

4. Email might not be sent due to loss of connection to the internet

5. Email can become time-consuming for answering complicated questions, and misunderstandings can arise because of cultural differences in the interpretation of certain words. The telephone is much better for providing detailed answers or when you feel that the question is not absolutely clear.

6. Email can compromise the security of an organization because sensitive information can easily be distributed accidentally or deliberately. Email should be entrusted to well-trained and trusted staff members.

7. Email can become impersonal or misunderstood.


A Samsung fax machine

A fax (short for facsimile and sometimes called telecopying) is the telephonic transmission of scanned-in printed material (text or images), usually to a telephone number associated with a printer or other output device. The original document is scanned with a fax machine, which treats the contents (text or images) as a single fixed graphic image, converting it into a bitmap. In this digital form, the information is transmitted as electrical signals through the telephone system. The receiving fax machine reconverts the coded image and prints a paper copy of the document.
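The scanner’s “single fixed graphic image” step can be sketched in a few lines: each scanned sample is reduced to black or white by thresholding, producing the bitmap that is then encoded for transmission. This is a simplified illustration in pure Python (no imaging library); the function name and sample values are invented.

```python
# Sketch of the scan-to-bitmap step: threshold grayscale samples so each
# pixel becomes black (1) or white (0), as a fax machine does before
# encoding the page for transmission.
def to_bitmap(gray_rows, threshold=128):
    # gray_rows: rows of 0-255 intensity samples (0 = dark, 255 = bright)
    return [[1 if v < threshold else 0 for v in row] for row in gray_rows]

scan = [[250, 12, 240],
        [30, 200, 15]]
print(to_bitmap(scan))  # [[0, 1, 0], [1, 0, 1]]
```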

A fax (short for facsimile) is a document sent over a telephone line. Fax machines have existed, in various forms, since the 19th century (see “History” below), though modern fax machines became feasible only in the mid-1970s as the sophistication of the three underlying technologies increased and their cost dropped. Digital fax machines first became popular in Japan, where they had a clear advantage over competing technologies like the teleprinter, since at the time (before the development of easy-to-use input method editors) it was faster to handwrite kanji than to type the characters. Over time, faxing gradually became affordable, and by the mid-1980s fax machines were very popular around the world.

Although businesses usually maintain some kind of fax capability, the technology has faced increasing competition from Internet-based alternatives. However, fax machines still retain some advantages, particularly in the transmission of sensitive material which, if sent over the Internet unencrypted, may be vulnerable to interception. In some countries, because electronic signatures on contracts are not recognized by law while faxed contracts with copies of signatures are, fax machines enjoy continuing support in business.

In many corporate environments, standalone fax machines have been replaced by “fax servers” and other computerized systems capable of receiving and storing incoming faxes electronically, and then routing them to users on paper or via an email (which may be secured). Such systems have the advantage of reducing costs by eliminating unnecessary printouts and reducing the number of inbound analog phone lines needed by an office.


Alexander Bain

Facsimile transmission has been loosely defined as the means of creating an exact copy of a document at a distance. Transmission of graphic material, in either printed or pictorial form, may be via the public service telephone network (PSTN), private circuit or radio link.

The first facsimile equipment for use in communications was the chemical telegraph invented by Alexander Bain (1810-1877) in 1842 and patented during the following year. This consisted of a metallic contact resting on a moving paper slip saturated with an electrolytic solution. The wire and the tape formed part of an electric circuit and when current flowed, discoloration of the tape occurred.

It is thought that the first working model of Bain’s chemical telegraph was constructed and operated at about the time of the World Fair held in London in 1851. At this fair a second facsimile machine was demonstrated by Bakewell, who had been granted the relevant patent in 1848.

Scanning principle first used in Bakewell’s “Copying Telegraph”: the transmitting stylus, moving slowly along the lead screw, traces a spiral line on the message surface.

Bakewell’s Facsimile Recorder

In principle the two machines operated in similar fashion, using damp electrolytic paper as a recording medium, and relied for transmission on a scanning stylus being in physical contact with the text of the message, the text being in relief form with raised lettering. Both systems also depended on associated pendulums and electromagnets for synchronization. The Bain machine was essentially a flat-bed machine, while in Bakewell’s model the relief text and receiving electrolytic papers were wound on drums.

For many years the development of facsimile equipment was directed towards improving the mechanics of the scanning and reproduction functions. In 1865 the first working trials for a commercially viable facsimile machine were set up in France by an Italian, Caselli. Shortly after this, Meyer facsimile machines were also tried out in the French telegraph system.

Caselli’s Pantelegraph

Although the Caselli and Meyer machines had been brought into service, there were still two major areas of difficulty to be investigated: synchronization and contact transmission. A practical method of synchronizing the early facsimile machines was finally worked out, culminating in the La Cour tuning-fork-controlled motor synchronization.

Facsimile was first used commercially in France as an electromechanical telegraph. In 1870 there were some 17 Meyer facsimile instruments in service in the French telegraph system in conjunction with 4000 electromechanical telegraph machines. It appears that the facsimile facility was used to a large extent by the French government and to carry information relating to stock broking. The main advantages seen at this time were the virtual elimination of errors in transmission and the availability of a facsimile signature.

The contact-type transmitters used up to the early years of the 20th century were not satisfactory and limited the speed of transmission via facsimile. This was overcome through the development of a suitably sensitive photoelectric cell by Dr Arthur Korn of Germany in 1902, and his application of this cell to photo telegraphy work. The technique of this system was to transmit light through a photographic negative of the original, wound on a glass cylinder, to a photocell which converted the light pulses to electrical signals. The receiving medium was sensitized paper and the picture was reproduced in positive form. By 1910 Korn had established photo telegraphy links from Berlin to Paris and London, and in 1922 successfully transmitted a picture by radio from Rome to New York. In 1926 a commercial radio link for facsimile working was opened between the London office of the Marconi Wireless and Telegraph Company and the New York office of RCA.

The need to have material photographed to provide a negative for transmission, and the consequential high cost of the equipment developed on this principle, led to further research, and a system of transmission based on reflected light was evolved. In 1935 the Associated Press of the USA installed a country-wide network based on this system.

By the 1920s pictures for publication in newspapers were being transmitted around the world. Later developments of the service in the 1930’s included the introduction of weather maps and wire photo services. Technology had improved sufficiently beyond the late 19th century equipment to ensure that facsimile was a technically viable proposition even though the basic techniques and concept were unchanged.

Among the later adaptations of facsimile service by a telegraph company was that of the Western Union in the 1930s when they made facsimile machines available in public places for the transmission of messages to the nearest Western Union office. The message was then forwarded from the office in the normal telegraph manner. Unfortunately this system proved prone to vandalism and was phased out. Western Union was involved in another similar venture: “Desk Fax” introduced in 1948. Using this system private companies who rented transmitters were able to send short messages via a Western Union telegraph office.

The main area in which facsimile proved successful in augmenting telegraph facilities was in the transmission of photographs i.e. photo telegrams – mainly newspaper pictures, but also pictures of documents, machine drawings and fingerprints. This service grew from the start of the New York – London link in 1926 and continued to thrive. By 1950 access to 24 countries was available and in 1963 the Post Office photo telegraphic system was operating services to and from 56 European terminals and 38 extra-European terminals. In January 1976 these figures were 47 and 51 respectively to a total of 65 countries.

The success of photo telegraphy was not reflected in other uses devised for facsimile. Attempts to introduce home news broadcasts in manuscript form and thus bring facsimile into the residential market failed. Such systems were tried as early as 1929 in America and throughout the 1930s. Once television was introduced there was no possibility of facsimile competing.

As a telecommunications medium facsimile remained from the 1930’s to the early 1960’s essentially a system for specialized applications with sophisticated expensive machines – the two main sections of use being in distributing weather charts and in the newspaper industry.

Although suitable telephone coupling devices were available from the 1930s, it was not until the 1960s that relatively cheap facsimile machines were available for connection to the PSTN. Growth in the market was prompted by declining postal services in the USA, and in Japan by the pictorial nature of the written language. These new machines became known as document facsimile machines and were used for transmitting handwritten, typed or printed text and drawings. A contributory factor to the late development of a simple dial-up facsimile unit was the relatively late stage at which solid-state techniques were introduced to the facsimile system.

Europe lagged behind the USA and Japan, but early growth followed agreed standards on machine design by the International Telegraph and Telephone Consultative Committee (CCITT). The introduction of Group 1 standard in 1968 was a significant step in the development of facsimile, despite slow and unreliable terminals and lack of full compatibility. It took 6 minutes to transmit an A4 page, but the machine stimulated interest in the concept of sending text and graphic material by telephone around the world instead of heavy reliance on the postal service.

A Group 2 standard was agreed in 1976, which halved the time of transmission to 3 minutes and improved quality with a scanning density of 100 lines per inch. But the density remained unsatisfactory for sending documents containing small print and the time for transmission still meant that a 10 page document took half an hour to receive.

A further CCITT standard was agreed in 1980 for Group 3 machines, which used digital transmission techniques and took less than one minute per page with an improved scanning resolution of 200 lines per inch. All were compatible and could communicate with most Group 2 machines regardless of supplier.
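Much of Group 3’s speedup came from compressing each scan line digitally before transmission. Real Group 3 terminals use Modified Huffman codes over run lengths; the sketch below shows only the underlying run-length step, with invented function names and sample data.

```python
# Simplified sketch of the run-length idea behind Group 3 compression:
# a scan line is mostly long runs of white (0) or black (1) pixels, so
# transmitting (value, run length) pairs is far shorter than raw pixels.
from itertools import groupby

def runs(line):
    # [0,0,0,1,1,0] -> [(0,3),(1,2),(0,1)]
    return [(value, len(list(group))) for value, group in groupby(line)]

def unruns(pairs):
    # invert runs(): expand each (value, length) pair back into pixels
    return [value for value, length in pairs for _ in range(length)]

line = [0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0]
encoded = runs(line)
print(encoded)                  # [(0, 4), (1, 2), (0, 6)]
assert unruns(encoded) == line  # lossless round trip
```

Real machines then replace each run length with a short variable-length code, but the gain already visible here — twelve pixels reduced to three pairs — is the essence of why a Group 3 page transmits in under a minute.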

[This article was contributed by the BT Archives and Historical Information Centre]

Advantages of Fax

· A fax machine sends inexpensive messages and provides a written record of the message transmitted. The delivered page shows the phone number and time of the transmission, so there is a written record which can be used in a court of law. In this respect it is superior to an email message because that detail is printed right on the document.

· Each page is assigned a number.

· A fax transmission is secure in that it goes only to the dialed phone number. It is also instantaneous, and receipt is acknowledged. The current trend in faxing is the internet fax service. It is very cost-effective since it does not require paper, a dedicated phone line or ink. It is a web-based technology, making online fax documents accessible anytime and anywhere, as long as you have a device connected to the internet.

Disadvantages of Fax

· Need a separate space for the machine.

· Needs a telephone line.

· Needs power/electricity

· Requires paper for receiving and sending faxes

· The machine needs to be on at all times to receive faxes. If the machine is off, faxes may be lost.

· A significant chance of losing faxes, which could potentially be important.

· Storage and archival of printed paper faxes is very cumbersome, and they need a good amount of space in the office.

· Aging of papers could lead to difficulties in managing records.

Internet


The Internet is a global system of interconnected computer networks that use the standard Internet Protocol Suite (TCP/IP) to serve billions of users worldwide. It is a network of networks that consists of millions of private, public, academic, business, and government networks of local to global scope that are linked by a broad array of electronic and optical networking technologies. The Internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail.
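The services described above all ride on TCP/IP connections, which applications reach through the standard socket interface. A minimal sketch of a TCP exchange on the local machine: an echo server, and a client that sends it one line (port 0 asks the operating system for any free port).

```python
# Minimal sketch of a TCP/IP exchange over the loopback interface: a
# one-shot echo server and a client, using the standard socket API.
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve_once():
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo whatever arrives

threading.Thread(target=serve_once).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internet")
    reply = client.recv(1024)

server.close()
print(reply.decode())  # hello, internet
```

The same few calls — connect, send, receive — underlie email transfer, the Web, and the other Internet services discussed in this section; TCP/IP hides the "network of networks" beneath a uniform byte-stream interface.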

Most traditional communications media, such as telephone and television services, are reshaped or redefined using the technologies of the Internet, giving rise to services such as Voice over Internet Protocol (VoIP) and IPTV. Newspaper publishing has been reshaped into Web sites, blogging, and web feeds. The Internet has enabled or accelerated the creation of new forms of human interactions through instant messaging, Internet forums, and social networking sites.

History of the Internet

The origins of the Internet reach back to the 1960s, when the United States funded research projects of its military agencies to build robust, fault-tolerant and distributed computer networks. This research, and a period of civilian funding of a new U.S. backbone by the National Science Foundation, spawned worldwide participation in the development of new networking technologies, led to the commercialization of an international network in the mid-1990s, and resulted in the popularization of countless applications in virtually every aspect of modern human life. As of 2009, an estimated quarter of Earth’s population used the services of the Internet.

The USSR’s launch of Sputnik spurred the United States to create the Advanced Research Projects Agency (ARPA or DARPA) in February 1958 to regain a technological lead. ARPA created the Information Processing Technology Office (IPTO) to further the research of the Semi Automatic Ground Environment (SAGE) program, which had networked country-wide radar systems together for the first time. The IPTO’s purpose was to find ways to address the US Military’s concern about survivability of their communications networks, and as a first step interconnect their computers at the Pentagon, Cheyenne Mountain, and SAC HQ. J. C. R. Licklider, a promoter of universal networking, was selected to head the IPTO. Licklider moved from the Psycho-Acoustic Laboratory at Harvard University to MIT in 1950, after becoming interested in information technology. At MIT, he served on a committee that established Lincoln Laboratory and worked on the SAGE project. In 1957 he became a Vice President at BBN, where he bought the first production PDP-1 computer and conducted the first public demonstration of time-sharing.

Professor Leonard Kleinrock with one of the first ARPANET Interface Message Processors at UCLA

At the IPTO, Licklider’s successor Ivan Sutherland in 1965 got Lawrence Roberts to start a project to make a network, and Roberts based the technology on the work of Paul Baran, who had written an exhaustive study for the United States Air Force that recommended packet switching (as opposed to circuit switching) to achieve better network robustness and disaster survivability. Roberts had worked at the MIT Lincoln Laboratory, originally established to work on the design of the SAGE system. UCLA professor Leonard Kleinrock had provided the theoretical foundations for packet networks in 1962, and later, in the 1970s, for hierarchical routing, concepts which have been the underpinning of the development towards today’s Internet.

Sutherland’s successor Robert Taylor convinced Roberts to build on his early packet switching successes and come and be the IPTO Chief Scientist. Once there, Roberts prepared a report called Resource Sharing Computer Networks which was approved by Taylor in June 1968 and laid the foundation for the launch of the working ARPANET the following year.

After much work, the first two nodes of what would become the ARPANET were interconnected between Kleinrock’s Network Measurement Center at the UCLA’s School of Engineering and Applied Science and Douglas Engelbart’s NLS system at SRI International (SRI) in Menlo Park, California, on October 29, 1969. The third site on the ARPANET was the Culler-Fried Interactive Mathematics centre at the University of California at Santa Barbara, and the fourth was the University of Utah Graphics Department. In an early sign of future growth, there were already fifteen sites connected to the young ARPANET by the end of 1971.

The ARPANET was one of the “eve” networks of today’s Internet. In an independent development, Donald Davies at the UK National Physical Laboratory also developed the concept of packet switching in the early 1960s, first giving a talk on the subject in 1965, after which the teams in the new field from the two sides of the Atlantic first became acquainted. It was Davies’ coinage of the words “packet” and “packet switching” that was adopted as the standard terminology. Davies also built a packet-switched network in the UK, called the Mark I, in 1970.

Following the demonstration that packet switching worked on the ARPANET, the British Post Office, Telenet, DATAPAC and TRANSPAC collaborated to create the first international packet-switched network service. In the UK, this was referred to as the International Packet Switched Service (IPSS), in 1978. The collection of X.25-based networks grew from Europe and the US to cover Canada, Hong Kong and Australia by 1981. The X.25 packet switching standard was developed in the CCITT (now called ITU-T) around 1976.

A plaque commemorating the birth of the Internet at Stanford University

X.25 was independent of the TCP/IP protocols that arose from the experimental work of DARPA on the ARPANET, Packet Radio Net and Packet Satellite Net during the same time period.

The early ARPANET ran on the Network Control Program (NCP), a standard designed and first implemented in December 1970 by a team called the Network Working Group (NWG) led by Steve Crocker. To respond to the network’s rapid growth as more and more locations connected, Vinton Cerf and Robert Kahn developed the first description of the now widely used TCP protocols during 1973 and published a paper on the subject in May 1974. Use of the term “Internet” to describe a single global TCP/IP network originated in December 1974 with the publication of RFC 675, the first full specification of TCP that was written by Vinton Cerf, Yogen Dalal and Carl Sunshine, then at Stanford University. During the next nine years, work proceeded to refine the protocols and to implement them on a wide range of operating systems. The first TCP/IP-based wide-area network was operational by January 1, 1983 when all hosts on the ARPANET were switched over from the older NCP protocols. In 1985, the United States’ National Science Foundation (NSF) commissioned the construction of the NSFNET, a university 56 kilobit/second network backbone using computers called “fuzzballs” by their inventor, David L. Mills. The following year, NSF sponsored the conversion to a higher-speed 1.5 megabit/second network. A key decision to use the DARPA TCP/IP protocols was made by Dennis Jennings, then in charge of the Supercomputer program at NSF.

The opening of the network to commercial interests began in 1988. The US Federal Networking Council approved the interconnection of the NSFNET to the commercial MCI Mail system in that year, and the link was made in the summer of 1989. Other commercial e-mail services were soon connected, including OnTyme, Telemail and CompuServe. In that same year, three commercial Internet service providers (ISPs) were created: UUNET, PSINet and CERFNET. Important separate networks that offered gateways into, and later merged with, the Internet include Usenet and BITNET. Various other commercial and educational networks, such as Telenet, Tymnet, CompuServe and JANET, were interconnected with the growing Internet. Telenet (later called Sprintnet) was a large privately funded national computer network with free dial-up access in cities throughout the U.S. that had been in operation since the 1970s; it was eventually interconnected with the others in the 1980s as the TCP/IP protocol became increasingly popular. The ability of TCP/IP to work over virtually any pre-existing communication network made growth easy, although the rapid growth of the Internet was due primarily to the availability of standardized commercial routers from many companies, the availability of commercial Ethernet equipment for local-area networking, and the widespread implementation and rigorous standardization of TCP/IP on UNIX and virtually every other common operating system.

This NeXT Computer was used by Sir Tim Berners-Lee at CERN and became the world’s first Web server.

Although the basic applications and guidelines that make the Internet possible had existed for almost two decades, the network did not gain a public face until the 1990s. On 6 August 1991, CERN, a pan-European organization for particle research, publicized the new World Wide Web project. The Web had been invented by British scientist Tim Berners-Lee in 1989. An early popular web browser was ViolaWWW, patterned after HyperCard and built using the X Window System. It was eventually replaced in popularity by the Mosaic web browser. In 1993, the National Center for Supercomputing Applications at the University of Illinois released version 1.0 of Mosaic, and by late 1994 there was growing public interest in the previously academic, technical Internet. By 1996, usage of the word Internet had become commonplace, and consequently, so had its use as a synecdoche in reference to the World Wide Web.

Meanwhile, over the course of the decade, the Internet successfully accommodated the majority of previously existing public computer networks (although some networks, such as FidoNet, have remained separate). During the 1990s, it was estimated that the Internet grew by 100 percent per year, with a brief period of explosive growth in 1996 and 1997.[6] This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary open nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network.[7] The estimated population of Internet users was 1.67 billion as of June 30, 2009.[8]
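Growth of 100 percent per year compounds quickly, because the network doubles in size annually. A minimal sketch (using a relative starting size of 1.0, not actual user counts) makes the point:

```python
# "100 percent per year" means the network doubles in size annually.
# Starting from a relative size of 1.0 (an illustrative unit, not
# real data), compound doubling over a decade gives:
size = 1.0
for year in range(10):
    size *= 2  # 100% growth per year = doubling
print(size)  # → 1024.0 (about a thousandfold in ten years)
```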

Advantages of Internet

The Internet provides many facilities to people. Its main advantages are discussed below.

1. Sharing Information

You can share information with other people around the world. Scientists and researchers can interact with each other to share knowledge and seek guidance. Sharing information through the Internet is an easy, cheap and fast method.

2. Collection of Information

A lot of information of different types is stored on web servers on the Internet, meaning that billions of websites contain information in the form of text and pictures. You can easily collect information on almost any topic. For this purpose, special websites called search engines are available on the Internet to find information on any topic; the most popular include Google, Yahoo and Bing. Scientists, writers, engineers and many other people use these search engines to collect the latest information for different purposes. Usually, the information on the Internet is free of cost, and it is available 24 hours a day.
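Because web pages are delivered as HTML text, a program can collect information from them mechanically. As a minimal sketch using Python's built-in html.parser (the page below is an invented stand-in for content fetched from a web server):

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text inside the <title> tag of an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A tiny invented page standing in for content retrieved from a web server.
page = "<html><head><title>World News</title></head><body>...</body></html>"

extractor = TitleExtractor()
extractor.feed(page)
print(extractor.title)  # → World News
```

In practice the page text would come from a network request rather than a string literal; the parsing step is the same either way.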

3. News

You can get the latest news of the world on the Internet. Most of the world's newspapers are also available on the Internet; they have websites from which you can get the latest news about events happening in the world. These websites are updated periodically, or immediately when an event happens anywhere in the world.

4. Searching Jobs

You can search for different types of jobs all over the world. Most organizations and departments around the world advertise their vacancies on the Internet, and search engines can also be used to find jobs. You can then apply for the desired job through the Internet.

5. Advertisement

Today, most commercial organizations advertise their products through the Internet. It is a cheap and efficient way to advertise, and products can be presented in an attractive way to people around the world.

6. Communication

You can communicate with others around the world through the Internet. You can talk while seeing one another, just as if you were talking with your friends in your drawing room. For this purpose, different services are provided on the Internet, such as:

· Chatting

· Video conferencing

· E-mail

· Internet telephony etc.
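E-mail, in particular, lends itself to automation. As a minimal sketch, a message can be composed with Python's standard email library; the addresses and the SMTP host below are invented placeholders, not a real mail setup:

```python
import smtplib
from email.message import EmailMessage

# Compose a message with Python's standard email library.
# The addresses below are hypothetical examples.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Greetings"
msg.set_content("Hello from the other side of the world!")

# Actually sending requires a reachable SMTP server; "mail.example.com"
# is a placeholder, so the call below is left commented out.
# with smtplib.SMTP("mail.example.com") as server:
#     server.send_message(msg)

print(msg["Subject"])  # → Greetings
```

The message object holds the headers and body in the store-and-forward format that mail servers exchange; only the final (commented) step depends on a real server.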

7. Entertainment

The Internet also provides different types of entertainment. You can play games with other people in any part of the world. Similarly, you can watch movies, listen to music and so on. You can also make new friends on the Internet for enjoyment.

8. Online Education

The Internet provides the facility to get an online education. Many university websites provide lectures and tutorials on different subjects and topics. You can also download these lectures and tutorials to your own computer, listen to them repeatedly and gain a lot of knowledge. It is a cheap and easy way to get an education.

9. Online Results

Today, most universities and education boards publish results on the Internet. Students can view their results from any part of the country or the world.

10. Online Airlines and Railway Schedules

Many airline companies, and railways such as Pakistan Railway, provide their schedules of flights and trains on the Internet.

11. Online Medical Advice

Many websites are also available on the Internet that provide information about different diseases. You can consult a panel of online doctors for advice about any medical problem. In addition, a lot of material is available on the Internet for research in the medical field.

Disadvantages of Internet

Although the Internet has many advantages, it also has some disadvantages. The main ones are:

1. Viruses

Today, the Internet is the most common source of spreading viruses. Most viruses transfer from one computer to another through e-mail or when files are downloaded from the Internet. These viruses create different problems in your computer; for example, they can affect its performance and damage valuable data and software stored on it.

2. Security Problems

Valuable websites can be damaged by hackers, and valuable data may be deleted. Similarly, confidential data may be accessed by unauthorized persons.

3. Immorality

Some websites contain immoral material in the form of text, pictures or movies. Such websites can damage the character of the younger generation.

4. Filtration of Information

When a keyword is given to a search engine to find information on a specific topic, a large number of related links are displayed. In this case, it becomes difficult to filter out the required information.

5. Accuracy of Information

A lot of information about a particular topic is stored on websites. Some of it may be incorrect or not authentic, so it becomes difficult to select the correct information, and you may sometimes be confused.

6. Wastage of Time

A lot of time can be wasted collecting information on the Internet, and some people waste a great deal of time chatting or playing games. At home and in offices, many people use the Internet without any productive purpose.

7. English Language Problems

Most of the information on the Internet is available in English, so some people cannot take advantage of the Internet's facilities.


8. Spamming

Spamming refers to sending unwanted e-mails in bulk, which serve no purpose and needlessly clog the entire system. Such activities can be very frustrating for you, so instead of just ignoring them, you should make an effort to stop them so that using the Internet becomes that much safer.

9. Pornography

Pornography is perhaps the biggest threat to children's healthy mental life and a very serious issue concerning the Internet. There are thousands of pornographic sites on the Internet that can be found easily and can be a detrimental factor in letting children use the Internet.

Although the Internet can also create havoc and destruction, and its misuse can be very harmful, its advantages outweigh its disadvantages.

So, in the end, we can say that the Internet, fax and e-mail play an important role in global communication. The Internet is a voluntary, co-operative mode of communication through computers. It is accessible to individuals, companies, educational institutions, government agencies and other institutions all over the world, linking thousands of smaller computer networks and millions of individual computer users in homes, businesses, government offices and educational institutions worldwide.

A fax machine is an improved electronic version of the telex machine and a widely used device for sending written information and messages from one place to another. A facsimile of the written information is transmitted to the addressee and simultaneously reproduced on paper, automatically, by the receiving machine.

E-mail is a method of communication through computer networks and a person-to-person communication process used all over the world. Electronic mail is used to exchange messages, sounds, pictures and data files, and has become a key part of communication networks worldwide.