
The Internet

The Internet grew out of funding by the U.S. Advanced Research Projects Agency (ARPA), later renamed the Defense Advanced Research Projects Agency (DARPA), to develop a communication system among government and academic computer-research laboratories. The first network component, ARPANET, became operational in October 1969. Because ARPANET included only 15 nongovernment (university) sites, the U.S. National Science Foundation decided to fund the construction and initial maintenance costs of a supplementary network, the Computer Science Network (CSNET). Built in 1980, CSNET was made available, on a subscription basis, to a wide array of academic, government, and industry research labs. As the 1980s wore on, further networks were added. In North America there were (among others): BITNET (Because It’s Time Network) from IBM, UUCP (UNIX-to-UNIX Copy Protocol) from Bell Telephone, USENET (initially a connection between Duke University, Durham, North Carolina, and the University of North Carolina and still the home system for the Internet’s many newsgroups), NSFNET (a high-speed National Science Foundation network connecting supercomputers), and CDNet (in Canada). In Europe several small academic networks were linked to the growing North American network.

All these various networks were able to communicate with one another because of two shared protocols: the Transmission Control Protocol (TCP), which broke large transmissions into numerous small pieces, or packets, assigned sequencing and address information to each packet, and reassembled the packets into the original message after they arrived at their final destination; and the Internet Protocol (IP), a hierarchical addressing system that controlled the routing of packets (which might take widely divergent paths before being reassembled).
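The division and reassembly that TCP performs can be illustrated with a short sketch. The following Python fragment is a simplified illustration rather than the actual protocol: it splits a byte string into fixed-size, numbered "packets," shuffles them to mimic the divergent paths packets may take, and then reassembles them in sequence order. The packet size and message are invented for the example.

```python
import random

PACKET_SIZE = 8  # illustrative size; real TCP segments are far larger

def packetize(data: bytes) -> list[tuple[int, bytes]]:
    """Split data into numbered chunks, mimicking TCP segmentation."""
    return [(seq, data[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(data), PACKET_SIZE))]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Put chunks back in sequence order, regardless of arrival order."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"All these various networks were able to communicate."
packets = packetize(message)
random.shuffle(packets)           # packets may arrive out of order
assert reassemble(packets) == message
```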

What it took to turn a network of computers into something more was the idea of the hyperlink: computer code inside a document that would cause related documents to be fetched and displayed. The concept of hyperlinking was anticipated from the early to the middle decades of the 20th century—in Belgium by Paul Otlet and in the United States by Ted Nelson, Vannevar Bush, and, to some extent, Douglas Engelbart. Their vision of a system for linking knowledge together was not realized, though, until 1990, when Tim Berners-Lee of England and others at CERN (European Organization for Nuclear Research) developed a protocol based on hypertext to make information distribution easier. In 1991 this work culminated in the creation of the World Wide Web and its system of links among user-created pages. A team of programmers at the U.S. National Center for Supercomputing Applications, Urbana, Illinois, developed a browser program called Mosaic that made it easier to use the World Wide Web, and a spin-off company named Netscape Communications Corp. was founded to commercialize that technology.

Netscape was an enormous success. The Web grew exponentially, doubling the number of users and the number of sites every few months. Uniform resource locators (URLs) became part of daily life, and the use of electronic mail (email) became commonplace. Increasingly business took advantage of the Internet and adopted new forms of buying and selling in “cyberspace.” (Science fiction author William Gibson popularized this term in the early 1980s.) With Netscape so successful, Microsoft and other firms developed alternative Web browsers.

Originally created as a closed network for researchers, the Internet was suddenly a new public medium for information. It became the home of virtual shopping malls, bookstores, stockbrokers, newspapers, and entertainment. Schools were “getting connected” to the Internet, and children were learning to do research in novel ways. The combination of the Internet, email, and small and affordable computing and communication devices began to change many aspects of society.

It soon became apparent that new software was necessary to take advantage of the opportunities created by the Internet. Sun Microsystems, maker of powerful desktop computers known as workstations, invented a new object-oriented programming language called Java. Designed to meet the needs of embedded and networked devices, the language made it possible to build applications that could be stored on one system but run on another after being sent over a network. Alternatively, various parts of an application could be stored in different locations and brought together to run on a single device. Java also proved one of the more effective ways to develop software for “smart cards,” plastic debit cards with embedded computer chips that could store and transfer electronic funds in place of cash.

E-commerce

Early enthusiasm over the potential profits from e-commerce led to massive cash investments and a “dot-com” boom-and-bust cycle in the 1990s. By the end of the decade, half of these businesses had failed, though certain successful categories of online business had been demonstrated, and most conventional businesses had established an online presence. Search and online advertising proved to be the most successful new business areas.

Some online businesses created niches that did not exist before. eBay, founded in 1995 as an online auction and shopping website, gave members the ability to set up their own stores online. Although sometimes criticized for not creating any new wealth or products, eBay made it possible for members to run small businesses from their homes without a large initial investment. In 2003 Linden Research, Inc., launched Second Life, an Internet-based virtual reality world in which participants (called “residents”) have cartoonlike avatars that move through a graphical environment. Residents socialize, participate in group activities, and create and trade virtual products and virtual or real services. Second Life has its own currency, the Linden Dollar, which can be converted to U.S. dollars at several Internet currency exchange markets.

Maintaining an Internet presence became common for conventional businesses during the 1990s and 2000s as they sought to reach out to a public that was increasingly active in online social communities. In addition to seeking some way of responding to the growing numbers of their customers who were sharing their experiences with company products and services online, companies discovered that many potential customers searched online for the best deals and the locations of nearby businesses. With an Internet-enabled smartphone, a customer might, for example, check for nearby restaurants using its built-in access to the Global Positioning System (GPS), check a map on the Web for directions to the restaurant, and then call for a reservation, all while en route.

The growth of online business was accompanied, though, by a rise in cybercrime, particularly identity theft, in which a criminal might gain access to someone’s credit card or other identification and use it to make purchases.

Social networking

Social networking services emerged as a significant online phenomenon in the 2000s. These services used software to facilitate online communities, where members with shared interests swapped files, photographs, videos, and music, sent messages and chatted, set up blogs (Web diaries) and discussion groups, and shared opinions. Early social networking services included Classmates.com, which connected former schoolmates, and Yahoo! 360°, Myspace, and SixDegrees, which built networks of connections via friends of friends. By 2018 the leading social networking services included Facebook, Twitter, Instagram, LinkedIn, and Snapchat. LinkedIn became an effective tool for business staff recruiting. Businesses began exploring how to exploit these networks, drawing on social networking research and theory which suggested that finding key “influential” members of existing networks of individuals could give access to and credibility with the whole network.

Blogs became a category unto themselves, and some blogs had thousands of participants. Trust became a commodity, as sharing opinions or ratings proved to be a key to effective blog discussions, as well as an important component of many e-commerce websites. Daily Kos, one of the largest of the political blogs, made good use of ratings, with high-rated members gaining more power to rate other members’ comments; under such systems, the idea is that the best entries will survive and the worst will quickly disappear. The vendor rating system in eBay similarly allowed for a kind of self-policing that was intended to weed out unethical or otherwise undesirable vendors.
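As an illustration of how such reputation-based self-policing can work, the following Python sketch shows a hypothetical scheme of this general kind; it is not the actual algorithm used by Daily Kos or eBay. Members whose own contributions are rated highly gain more weight when rating others, and low-scoring entries drop out of view. The class names and weighting constants are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    reputation: float = 1.0   # grows as the member's own comments are rated up

@dataclass
class Comment:
    author: Member
    text: str
    score: float = 0.0

def rate(comment: Comment, rater: Member, value: int) -> None:
    """Apply a +1 or -1 rating, weighted by the rater's own reputation."""
    comment.score += value * rater.reputation
    # a fraction of the rating feeds back into the author's reputation
    comment.author.reputation = max(0.1, comment.author.reputation + 0.1 * value)

def surviving(comments: list[Comment], threshold: float = 0.0) -> list[Comment]:
    """Entries at or above the threshold survive; the worst disappear from view."""
    return [c for c in comments if c.score >= threshold]
```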

Ubiquitous computing

The combination of the connectedness of the Internet with the ability of new microprocessors to handle multiple tasks in parallel has inspired new ways of programming. Programmers are developing software to divide computational tasks into subtasks that a program can assign to separate processors in order to achieve greater efficiency and speed. This trend is one of various ways that computers are being connected to share information and to solve complex problems. In such distributed computing applications as airline reservation systems and automated teller machines, data pass through networks connected all over the world. Distributed computing promises to make better use of computers connected to ever larger and more complex networks. In effect, a distributed network of personal computers becomes a supercomputer. Many applications, such as research into protein folding, have been run on distributed networks, and some of these applications have involved calculations that would be too demanding for any single computer in existence.
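The idea of dividing a computation into subtasks and assigning them to separate processors can be sketched briefly. The Python fragment below is a minimal illustration under simple assumptions (the worker function and the numbers it processes are invented for the example): it splits the data into chunks, farms each chunk out to a pool of worker processes, and combines the results.

```python
from concurrent.futures import ProcessPoolExecutor

def subtask(chunk: list[int]) -> int:
    """An independent unit of work: here, summing the squares of a chunk of numbers."""
    return sum(n * n for n in chunk)

def distribute(data: list[int], workers: int = 4) -> int:
    """Split the data into chunks and assign each chunk to a separate process."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(subtask, chunks))

if __name__ == "__main__":   # guard required when worker processes are spawned
    print(distribute(list(range(1_000_000))))
```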

Considerable work in research laboratories is extending the development of embedded microprocessors toward a more sweeping vision in which these chips will be found everywhere and will meet human needs wherever people go. For instance, the Global Positioning System (GPS)—a satellite-based positioning system developed for the U.S. military—is now accessible by anyone, anywhere in the world, via a smartphone. In conjunction with various computer-mapping software, GPS can be used to locate one’s position and plan a travel route, whether by car, by public transit, or on foot.

Some researchers call this trend ubiquitous computing or pervasive computing. Ubiquitous computing would extend the increasingly networked world and the powerful capabilities of distributed computing—i.e., the sharing of computations among microprocessors connected over a network. (The use of multiple microprocessors within one machine is discussed in the article supercomputer.) With more powerful computers, all connected all the time, thinking machines would be involved in every facet of human life, albeit invisibly.

Xerox PARC’s vision and research in the 1970s eventually achieved commercial success in the form of the mouse-driven graphical user interface, networked computers, laser printers, and notebook-style machines. Today the vision of ubiquitous computing foresees a day when microprocessors will be found wherever humans go. The technology will be invisible and natural and will respond to normal patterns of behavior. Computers will disappear, or rather become a transparent part of the physical environment, thus truly bringing about an era of “One person, many computers.”

Paul A. Freiberger Michael R. Swaine