Computer science, like many other STEM (science, technology, engineering, and mathematics) disciplines, remains largely male-dominated: as of 2023, only about a fifth of computer science degrees were earned by women. Yet women and nonbinary people have made invaluable contributions to computing, often with far less recognition than their male counterparts. Here is a list of some of them.

Ada Lovelace (1815–52)

The English mathematician Ada Lovelace has been called the “first computer programmer” for her work alongside the inventor Charles Babbage. Babbage designed the Analytical Engine, a forerunner of the modern digital computer, and Lovelace developed a program for it. Her contributions are remembered on Ada Lovelace Day, observed internationally on the second Tuesday in October, and an early programming language, Ada, was named in her honor.

Grace Hopper (1906–92)

Grace Hopper, the programmer who famously popularized the term computer bug (after an actual moth was found in a computer’s hardware), was a U.S. Navy officer who helped to devise UNIVAC I, the first commercial electronic computer, as well as naval applications for the programming language COBOL (common business-oriented language). A digital pioneer like Lovelace, Hopper wrote the first computer manual, A Manual of Operation for the Automatic Sequence Controlled Calculator (1946), and worked on an early large-scale calculator while serving in the Naval Reserve. The Grace Hopper Celebration, the largest technology conference for women and nonbinary people in computing, is held annually in her honor.

Katherine Johnson (1918–2020)

The American mathematician Katherine Johnson, whose story was immortalized in the 2016 film Hidden Figures, played an integral role in sending the first American astronauts to the Moon. Before her time at NASA, Johnson helped to desegregate the all-white West Virginia University as one of its first Black graduate students. When she began her professional career at the National Advisory Committee for Aeronautics (NACA), she faced discrimination under Jim Crow laws, with separate bathrooms and cafeterias designated for Black “computers” (those who performed calculations by hand). After NACA was absorbed into the newly created NASA, Johnson was officially designated an engineer and became the first woman in her division to receive credit as coauthor of a research report.

John Glenn’s Stipulation

Before he became the first American to orbit Earth in the Friendship 7 mission, astronaut John Glenn insisted that Katherine Johnson check and confirm the computer calculations for the flight trajectory.

Margaret Hamilton (1936–)

Credited with coining the term software engineering, American computer programmer Margaret Hamilton, like Katherine Johnson, played a key role in advancing space travel. Her code was used in the command and lunar modules of the Apollo missions to the Moon. She also wrote software used to identify enemy aircraft in the first U.S. air defense system. In the 1960s software engineering was not formally taught; Hamilton adopted the term to demonstrate that her work for NASA was as important as any other engineer’s contributions. She received the Presidential Medal of Freedom in 2016.

Marsha Rhea Williams (1948–)

In 1982 Marsha Rhea Williams became the first Black woman to receive a Ph.D. in computer science. Her academic background is broad: she also holds master’s degrees in physics and in systems and information science. Her research centered on databases and other data-management systems. While a professor at Tennessee State University, Williams directed Project MISET (Minorities in Science, Engineering, and Technology), using her expertise in databases to increase minority participation in STEM.

Adele Goldberg (1945–)

Adele Goldberg is an American computer scientist who made important contributions to object-oriented programming languages. In 1973 she joined Xerox PARC (Palo Alto Research Center), where she helped to create Smalltalk, a language that differed from the procedural languages popular at the time, and helped to develop new theories about education. She also helped to lay the groundwork for graphical user interface (GUI) concepts.

Annie Easley (1933–2011)

Like Katherine Johnson, Annie Easley began her career at NACA after learning that the facility was looking for people with strong mathematical skills. After working as a human computer, Easley began programming in languages such as FORTRAN (Formula Translating System) and SOAP (Symbolic Optimal Assembly Program). She earned a degree in mathematics in the 1970s and, while doing so, encouraged students to explore their interests in STEM. She also worked to address gender and racial disparities in tech-related fields, drawing on her experiences as a Black woman in the industry.

“My head is not in the sand. But my thing is, if I can’t work with you, I will work around you. I was not about to be so discouraged that I’d walk away. That may be a solution for some people, but it’s not mine.” — Annie Easley speaking about facing prejudice in a 2001 interview

Audrey Tang (1981–)

Audrey Tang was the youngest person, and the first transgender and nonbinary person, to serve in Taiwan’s cabinet. A prodigy, Tang dropped out of school at age 14 and started a business a year later. They went on to hold multiple high-level technology positions, including as an advisor for Apple, where they helped to develop the company’s virtual assistant, Siri. Named Taiwan’s digital minister in 2016, Tang has embraced a policy of government transparency. Tang’s work was crucial to the country’s response to the COVID-19 pandemic, as they helped to create a map of locations where citizens could find face masks.

Lynn Conway (1938–2024)

An alum of Xerox PARC, Lynn Conway helped to revolutionize microchip design in the 1970s. She was also one of the first Americans to undergo a medical gender transition, paving the way for the queer and transgender computer scientists who followed. When she transitioned, she was fired from her position at IBM, and she spent years afterward fighting to make her contributions visible. With Carver Mead, she led what came to be known as the Mead-Conway revolution in VLSI (Very Large Scale Integration), which makes it possible to place many more transistors on a single integrated circuit.

Fei-Fei Li (1976–)

A Chinese American Stanford University professor and artificial intelligence (AI) innovator, Fei-Fei Li helped to improve knowledge of computer vision systems, which can give computers the ability to “recognize” common objects and faces. She has also spoken about making AI more human-centered and advocates for greater inclusion in the field. She has been vocal about the risks of AI, such as its potential to undermine aspects of democracy or to spread misinformation. In 2019 Li cofounded the Stanford Institute for Human-Centered Artificial Intelligence, which seeks to use AI research to improve society.

“I often tell my students not to be misled by the name ‘artificial intelligence’—there is nothing artificial about it. AI is made by humans, intended to behave by humans and, ultimately, to impact humans’ lives and human society.” — Fei-Fei Li in The New York Times in 2018

Tara Ramanathan

computer, device for processing, storing, and displaying information.

Computer once meant a person who did computations, but now the term almost universally refers to automated electronic machinery. The first section of this article focuses on modern digital electronic computers and their design, constituent parts, and applications. The second section covers the history of computing. For details on computer architecture, software, and theory, see computer science.

Computing basics

The first computers were used primarily for numerical calculations. However, as any information can be numerically encoded, people soon realized that computers are capable of general-purpose information processing. Their capacity to handle large amounts of data has extended the range and accuracy of weather forecasting. Their speed has allowed them to make decisions about routing telephone connections through a network and to control mechanical systems such as automobiles, nuclear reactors, and robotic surgical tools. They are also cheap enough to be embedded in everyday appliances and to make clothes dryers and rice cookers “smart.” Computers have allowed us to pose and answer questions that were difficult to pursue in the past. These questions might be about DNA sequences in genes, patterns of activity in a consumer market, or all the uses of a word in texts that have been stored in a database. Increasingly, computers can also learn and adapt as they operate by using processes such as machine learning.

Computers also have limitations, some of which are theoretical. For example, there are undecidable propositions whose truth cannot be determined within a given set of rules, such as the logical structure of a computer. Because no universal algorithmic method can exist to identify such propositions, a computer asked to obtain the truth of such a proposition will (unless forcibly interrupted) continue indefinitely—a condition known as the “halting problem.” (See Turing machine.) Other limitations reflect current technology. For example, although computers have progressed greatly in terms of processing data and using artificial intelligence algorithms, they are limited by their incapacity to think in a more holistic fashion. Computers may imitate humans—quite effectively, even—but imitation may not replace the human element in social interaction. Ethical concerns also limit computers, because computers rely on data, rather than a moral compass or human conscience, to make decisions.
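
To make the argument concrete, the classic reasoning behind the halting problem can be sketched in a few lines of Python. The sketch is purely illustrative: the function names are hypothetical, and the universal halts() checker it assumes is exactly what the proof shows cannot exist.

    # Sketch of the halting-problem argument (illustrative only).
    def halts(program, argument):
        """Assume, for contradiction, a universal checker that returns True
        if program(argument) eventually stops and False if it runs forever."""
        ...  # no general-purpose implementation is possible

    def paradox(program):
        # Do the opposite of whatever halts() predicts about a program
        # that is given its own code as input.
        if halts(program, program):
            while True:      # predicted to stop, so loop forever
                pass
        else:
            return           # predicted to run forever, so stop at once

    # Asking whether paradox(paradox) halts contradicts the checker's answer
    # either way, so no universal halts() can be written.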

Analog computers

Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components (see differential analyzer and integrator), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.

One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem. Another advantage is that analog computers can frequently represent and solve a problem in “real time”; that is, the computation proceeds at the same rate as the system being modeled by it. Their main disadvantages are that analog representations are limited in precision—typically a few decimal places but fewer in complex mechanisms—and general-purpose devices are expensive and not easily programmed.
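
As a rough illustration of what an analog machine wired for a single problem computes, the sketch below digitally emulates an integrator solving one fixed equation, dy/dt = -k*y. It is only an approximation of the idea: the constants are arbitrary examples, and a real analog computer integrates continuously in hardware rather than in discrete steps.

    # Digital emulation (sketch only) of an analog integrator set up to solve
    # the single fixed problem dy/dt = -k * y; constants are arbitrary examples.
    k = 0.5       # "potentiometer" setting: the decay constant
    y = 1.0       # initial condition, set as a starting voltage would be
    dt = 0.01     # emulation time step; real analog hardware is continuous

    for _ in range(200):      # 200 steps of 0.01 = 2 units of problem time
        y += -k * y * dt      # the integrator accumulates the rate of change

    print(round(y, 3))        # about 0.367, close to the exact value exp(-1)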

Digital computers

In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s (binary digits, or bits). The modern era of digital computers began in the late 1930s and early 1940s in the United States, Britain, and Germany. The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage. For historical developments, see the section Invention of the modern computer.
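
As a minimal sketch (the sample values are arbitrary), the Python snippet below shows how both a number and a short piece of text reduce to such sequences of bits:

    # Minimal sketch: arbitrary example values viewed as sequences of bits.
    number = 77
    text = "Hi"

    print(format(number, "08b"))    # 01001101 (the integer as 8 bits)
    print([format(b, "08b") for b in text.encode("utf-8")])
    # ['01001000', '01101001']  (each character stored as a byte of bits)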

Mainframe computer

During the 1950s and ’60s, Unisys (maker of the UNIVAC computer), International Business Machines Corporation (IBM), and other companies made large, expensive computers of increasing power. They were used by major corporations and government research laboratories, typically as the sole computer in the organization. In 1959 the IBM 1401 computer rented for $8,000 per month (early IBM machines were almost always leased rather than sold), and in 1964 the largest IBM S/360 computer cost several million dollars.

These computers came to be called mainframes, though the term did not become common until smaller computers were built. Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities. They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer. Other users submitted “batch jobs” to be run one at a time on the mainframe.

Such systems remain important today, though they are no longer the sole, or even primary, central computing resource of an organization, which will typically have hundreds or thousands of personal computers (PCs). Mainframes now provide high-capacity data storage for Internet servers, or, through time-sharing techniques, they allow hundreds or thousands of users to run programs simultaneously. Because of their current roles, these computers are now called servers rather than mainframes.