
Social networking services emerged as a significant online phenomenon in the 2000s. These services used software to facilitate online communities, where members with shared interests swapped files, photographs, videos, and music, sent messages and chatted, set up blogs (Web diaries) and discussion groups, and shared opinions. Early social networking services included Classmates.com, which connected former schoolmates, and Yahoo! 360°, Myspace, and SixDegrees, which built networks of connections via friends of friends. By 2018 the leading social networking services included Facebook, Twitter, Instagram, LinkedIn, and Snapchat. LinkedIn became an effective tool for business staff recruiting. Businesses began exploring how to exploit these networks, drawing on social networking research and theory which suggested that finding key “influential” members of existing networks of individuals could give access to and credibility with the whole network.

Blogs became a category unto themselves, and some blogs had thousands of participants. Trust became a commodity, as sharing opinions or ratings proved to be a key to effective blog discussions, as well as an important component of many e-commerce websites. Daily Kos, one of the largest of the political blogs, made good use of ratings, with high-rated members gaining more power to rate other members’ comments; under such systems, the idea is that the best entries will survive and the worst will quickly disappear. The vendor rating system in eBay similarly allowed for a kind of self-policing that was intended to weed out unethical or otherwise undesirable vendors.
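A reputation-weighted rating scheme of the kind described above can be sketched in a few lines. This is an illustrative toy, not the actual algorithm of Daily Kos or eBay; the weights and hiding threshold are assumptions.

```python
# Toy sketch of trust-weighted comment rating: votes from high-reputation
# members count for more, and low-scoring comments drop from view.
# The threshold and weights are illustrative assumptions.

HIDE_THRESHOLD = 0.0  # comments scoring below this are hidden (assumed value)

def comment_score(ratings):
    """Weighted average of (vote, rater_reputation) pairs."""
    total_weight = sum(rep for _, rep in ratings)
    if total_weight == 0:
        return 0.0
    return sum(vote * rep for vote, rep in ratings) / total_weight

def visible(ratings):
    """A comment stays visible only while its score meets the threshold."""
    return comment_score(ratings) >= HIDE_THRESHOLD

# One trusted member's downvote (weight 5) outweighs two newcomers' upvotes.
ratings = [(+1, 1), (+1, 1), (-1, 5)]
print(round(comment_score(ratings), 2))  # -> -0.43
print(visible(ratings))                  # -> False
```

The self-policing effect comes from the feedback loop: members whose contributions score well gain weight, so their future ratings matter more.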

Ubiquitous computing

The combination of the connectedness of the Internet with the ability of new microprocessors to handle multiple tasks in parallel has inspired new ways of programming. Programmers are developing software that divides computational tasks into subtasks, which a program can then assign to separate processors in order to achieve greater efficiency and speed. This trend is one of various ways that computers are being connected to share information and to solve complex problems. In such distributed computing applications as airline reservation systems and automated teller machines, data pass through networks connected all over the world. Distributed computing promises to make better use of computers connected to ever larger and more complex networks. In effect, a distributed network of personal computers becomes a supercomputer. Many applications, such as research into protein folding, have been run on distributed networks, and some of these applications have involved calculations that would be too demanding for any single computer in existence.
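The divide-into-subtasks pattern described above can be sketched with Python's standard multiprocessing module: a large computation is split into independent pieces that separate processes run in parallel, and the partial results are then combined. The task (a sum of squares) and the worker count are illustrative.

```python
# Sketch of dividing a computation into subtasks assigned to separate
# processors. The workload (summing squares) is an illustrative stand-in
# for any computation that splits into independent pieces.

from multiprocessing import Pool

def subtask(chunk):
    """One unit of work, runnable on any available processor."""
    return sum(n * n for n in chunk)

def distributed_sum_of_squares(n, workers=4):
    """Split the range 0..n-1 into interleaved chunks, compute partial
    sums in parallel worker processes, then combine the results."""
    chunks = [range(i, n, workers) for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(subtask, chunks)  # subtasks run in parallel
    return sum(partials)

if __name__ == "__main__":
    print(distributed_sum_of_squares(1000))  # -> 332833500
```

The same pattern scales from multiple cores in one machine to networks of volunteer computers, as in the protein-folding projects mentioned above, with the network rather than a process pool carrying the subtasks.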

Considerable work in research laboratories is extending the actual development of embedded microprocessors to a more sweeping vision in which these chips will be found everywhere and will meet human needs wherever people go. For instance, the Global Positioning System (GPS)—a satellite communication and positioning system developed for the U.S. military—is now accessible by anyone, anywhere in the world, via a smartphone. In conjunction with computer-mapping software, GPS can be used to locate one’s position and plan a travel route, whether by car, by public transit, or on foot.
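As an illustration of the kind of computation mapping software performs on GPS coordinates, the sketch below uses the standard haversine formula to estimate the great-circle distance between two latitude/longitude points. The coordinates are illustrative examples, not part of any particular mapping product.

```python
# Haversine formula: great-circle distance between two points given as
# GPS-style latitude/longitude coordinates in degrees.

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometres along the Earth's surface between two points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# New York City to London (approximate city-center coordinates);
# prints roughly 5,570 km.
print(round(haversine_km(40.7128, -74.0060, 51.5074, -0.1278)))
```

Route planning builds on this primitive: a road network is a graph whose edge weights are such distances (or travel times), and a shortest-path search over that graph yields the route.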

Some researchers call this trend ubiquitous computing or pervasive computing. Ubiquitous computing would extend the increasingly networked world and the powerful capabilities of distributed computing—i.e., the sharing of computations among microprocessors connected over a network. (The use of multiple microprocessors within one machine is discussed in the article supercomputer.) With more powerful computers, all connected all the time, thinking machines would be involved in every facet of human life, albeit invisibly.

Xerox PARC’s vision and research in the 1970s eventually achieved commercial success in the form of the mouse-driven graphical user interface, networked computers, laser printers, and notebook-style machines. Today the vision of ubiquitous computing foresees a day when microprocessors will be found wherever humans go. The technology will be invisible and natural and will respond to normal patterns of behavior. Computers will disappear, or rather become a transparent part of the physical environment, thus truly bringing about an era of “one person, many computers.”

Paul A. Freiberger Michael R. Swaine
artificial intelligence (AI), the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. The term is frequently applied to the project of developing systems endowed with the intellectual processes characteristic of humans, such as the ability to reason, discover meaning, generalize, or learn from past experience. Since their development in the 1940s, digital computers have been programmed to carry out very complex tasks—such as discovering proofs for mathematical theorems or playing chess—with great proficiency. Despite continuing advances in computer processing speed and memory capacity, there are as yet no programs that can match full human flexibility over wider domains or in tasks requiring much everyday knowledge. On the other hand, some programs have attained the performance levels of human experts and professionals in executing certain specific tasks, so that artificial intelligence in this limited sense is found in applications as diverse as medical diagnosis, computer search engines, voice or handwriting recognition, and chatbots.

What is intelligence?


All but the simplest human behavior is ascribed to intelligence, while even the most complicated insect behavior is usually not taken as an indication of intelligence. What is the difference? Consider the behavior of the digger wasp, Sphex ichneumoneus. When the female wasp returns to her burrow with food, she first deposits it on the threshold, checks for intruders inside her burrow, and only then, if the coast is clear, carries her food inside. The real nature of the wasp’s instinctual behavior is revealed if the food is moved a few inches away from the entrance to her burrow while she is inside: on emerging, she will repeat the whole procedure as often as the food is displaced. Intelligence—conspicuously absent in the case of the wasp—must include the ability to adapt to new circumstances.

Psychologists generally characterize human intelligence not by just one trait but by the combination of many diverse abilities. Research in AI has focused chiefly on the following components of intelligence: learning, reasoning, problem solving, perception, and using language.

Learning

There are a number of different forms of learning as applied to artificial intelligence. The simplest is learning by trial and error. For example, a simple computer program for solving mate-in-one chess problems might try moves at random until mate is found. The program might then store the solution with the position so that, the next time the computer encountered the same position, it would recall the solution. This simple memorizing of individual items and procedures—known as rote learning—is relatively easy to implement on a computer. More challenging is the problem of implementing what is called generalization. Generalization involves applying past experience to analogous new situations. For example, a program that learns the past tense of regular English verbs by rote will not be able to produce the past tense of a word such as jump unless the program was previously presented with jumped, whereas a program that is able to generalize can learn the “add -ed” rule for regular verbs ending in a consonant and so form the past tense of jump on the basis of experience with similar verbs.
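The contrast drawn above between rote learning and generalization can be sketched directly. The tiny training set and the final-consonant test are illustrative simplifications of English morphology, not a real language model.

```python
# Rote learning vs. generalization, per the past-tense example above.
# The training data and the "add -ed" rule's scope are illustrative.

TRAINING = {"walk": "walked", "talk": "talked", "look": "looked"}

def rote_past(verb):
    """Rote learner: can only recall verbs it has already memorized."""
    return TRAINING.get(verb)  # None for any unseen verb

def generalizing_past(verb):
    """Generalizing learner: falls back on the 'add -ed' rule for
    regular verbs ending in a consonant (a deliberate simplification)."""
    if verb in TRAINING:
        return TRAINING[verb]
    if verb[-1] not in "aeiou":
        return verb + "ed"
    return None  # outside the rule's scope

print(rote_past("jump"))          # -> None (never presented with "jumped")
print(generalizing_past("jump"))  # -> jumped
```

The rote learner fails on any verb outside its memorized list, while the generalizing learner extends its experience with similar verbs to the new case, exactly the distinction the paragraph above describes.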

