fingerprint, impression made by the papillary ridges on the ends of the fingers and thumbs. Fingerprints afford an infallible means of personal identification, because the ridge arrangement on every finger of every human being is unique and does not alter with growth or age. Fingerprints serve to reveal an individual’s true identity despite personal denial, assumed names, or changes in personal appearance resulting from age, disease, plastic surgery, or accident. The practice of utilizing fingerprints as a means of identification, referred to as dactyloscopy, is an indispensable aid to modern law enforcement.

Each ridge of the epidermis (outer skin) is dotted with sweat pores for its entire length and is anchored to the dermis (inner skin) by a double row of peglike protuberances, or papillae. Injuries such as superficial burns, abrasions, or cuts do not affect the ridge structure or alter the dermal papillae, and the original pattern is duplicated in any new skin that grows. An injury that destroys the dermal papillae, however, will permanently obliterate the ridges.

Any ridged area of the hand or foot may be used as a means of identification. However, finger impressions are preferred to those from other parts of the body because they can be taken with a minimum of time and effort, and the ridges in such impressions form patterns (distinctive outlines or shapes) that can be readily sorted into groups for ease in filing.

Early anatomists described the ridges of the fingers, but interest in modern fingerprint identification dates from 1880, when the British scientific journal Nature published letters by the Englishmen Henry Faulds and William James Herschel describing the uniqueness and permanence of fingerprints. Their observations were experimentally verified by the English scientist Sir Francis Galton, who suggested the first elementary system for classifying fingerprints based on grouping the patterns into arches, loops, and whorls. Galton’s system served as the basis for the fingerprint classification systems developed by Sir Edward R. Henry, who later became chief commissioner of the London metropolitan police, and by Juan Vucetich of Argentina. The Galton-Henry system of fingerprint classification, published in June 1900, was officially introduced at Scotland Yard in 1901 and quickly became the basis for its criminal-identification records. The system was adopted immediately by law-enforcement agencies in the English-speaking countries of the world and is now the most widely used method of fingerprint classification. Vucetich, who in 1888 was an employee of the police of the province of Buenos Aires, devised an original system of fingerprint classification, published in book form under the title Dactiloscopía comparada (1904; “Comparative Fingerprinting”). His system is still used in most Spanish-speaking countries.

Fingerprints are classified in a three-way process: by the shapes and contours of individual patterns, by noting the finger positions of the pattern types, and by relative size, determined by counting the ridges in loops and by tracing the ridges in whorls. The information obtained in this way is incorporated in a concise formula, which is known as the individual’s fingerprint classification.
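
To make the idea of such a formula concrete, the sketch below implements, in Python, the best-known step of a Henry-style formula: the primary classification, in which whorls at fixed finger positions contribute fixed numerical values. The function name and input format are invented for this illustration, and a full classification adds further divisions based on the ridge counts and tracings described above.

    # Hypothetical sketch of the Henry primary classification. Fingers are
    # numbered 1-10 (right thumb = 1, ..., left little finger = 10); any
    # whorl-family pattern counts as a whorl for this step.
    WHORL_FAMILY = {"plain whorl", "central pocket loop", "double loop", "accidental"}
    WHORL_VALUES = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

    def henry_primary(patterns):
        """patterns maps finger position (1-10) to a pattern name,
        e.g. {1: "plain whorl", 2: "ulnar loop", ..., 10: "plain arch"}."""
        even = sum(WHORL_VALUES[f] for f in range(2, 11, 2)
                   if patterns.get(f) in WHORL_FAMILY)
        odd = sum(WHORL_VALUES[f] for f in range(1, 11, 2)
                  if patterns.get(f) in WHORL_FAMILY)
        # Adding 1 to each sum gives 1/1 for no whorls and 32/32 for whorls
        # on all ten fingers, yielding 1,024 possible primary groups.
        return f"{even + 1}/{odd + 1}"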

There are several variants of the Henry system, but that used by the Federal Bureau of Investigation (FBI) in the United States recognizes eight different types of patterns: radial loop, ulnar loop, double loop, central pocket loop, plain arch, tented arch, plain whorl, and accidental. Whorls are usually circular or spiral in shape. Arches have a moundlike contour, while tented arches have a spikelike or steeplelike appearance in the centre. Loops have concentric hairpin or staple-shaped ridges and are described as “radial” or “ulnar” to denote their slopes; ulnar loops slope toward the little finger side of the hand, radial loops toward the thumb. Loops constitute about 65 percent of the total fingerprint patterns; whorls make up about 30 percent, and arches and tented arches together account for the other 5 percent. The most common pattern is the ulnar loop.

Dactyloscopy, the technique of fingerprinting, involves cleaning the fingers in benzene or ether, drying them, and then rolling the ball of each finger over a glass surface coated with printer’s ink. Each finger is then carefully rolled on prepared cards according to an exact technique designed to obtain a light gray impression with clear spaces showing between the ridges so that the ridges may be counted and traced. Simultaneous impressions are also taken of all fingers and thumbs.

Latent fingerprinting involves locating, preserving, and identifying impressions left by a culprit in the course of committing a crime. In latent fingerprints, the ridge structure is reproduced not in ink on a record card but on an object in sweat, oily secretions, or other substances naturally present on the culprit’s fingers. Most latent prints are colourless and must therefore be “developed,” or made visible, before they can be preserved and compared. This is done by brushing them with various gray or black powders containing chalk or lampblack combined with other agents. The latent impressions are preserved as evidence either by photography or by lifting powdered prints on the adhesive surfaces of tape.

Though the technique and its systematic use originated in Great Britain, fingerprinting was developed to great usefulness in the United States, where in 1924 two large fingerprint collections were consolidated to form the nucleus of the present file maintained by the Identification Division of the FBI. The division’s file contained the fingerprints of more than 250 million persons by the early 21st century. Fingerprint files and search techniques have been computerized to enable much quicker comparison and identification of particular prints.
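
Modern automated matching typically compares minutiae, the points where a ridge ends or splits in two. The routine below is a deliberately simplified, hypothetical scoring function rather than the FBI’s actual algorithm, and it assumes both prints have already been aligned to a common coordinate frame.

    import math

    def match_score(probe, candidate, dist_tol=10.0, angle_tol=math.radians(15)):
        """Count probe minutiae, given as (x, y, angle) tuples, that pair
        with a distinct candidate minutia within both tolerances."""
        used = set()
        score = 0
        for x1, y1, a1 in probe:
            for i, (x2, y2, a2) in enumerate(candidate):
                if i in used:
                    continue
                # Wrap the angle difference into [-pi, pi] before comparing.
                da = math.atan2(math.sin(a1 - a2), math.cos(a1 - a2))
                if math.hypot(x1 - x2, y1 - y2) <= dist_tol and abs(da) <= angle_tol:
                    used.add(i)
                    score += 1
                    break
        return score

A production system must also handle rotation, skin distortion, and partial prints, and it searches millions of records with indexing rather than pairwise loops.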

Other “fingerprinting” techniques have also been developed. These include the use of a sound spectrograph—a device that depicts graphically such vocal variables as frequency, duration, and intensity—to produce voicegraphs, or voiceprints, and the use of a technique known as DNA fingerprinting, an analysis of those regions of DNA that vary among individuals, to identify physical evidence (blood, semen, hair, etc.) as belonging to a suspect. The latter test has been used in paternity testing as well as in forensics.
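
In current forensic practice the DNA comparison is usually made over short-tandem-repeat (STR) loci, with each profile recording the pair of repeat counts (alleles) observed at each locus. A toy comparison, using real locus names but invented allele numbers, might look like this:

    # Toy STR-profile comparison; allele numbers are invented for illustration.
    evidence = {"D8S1179": {12, 14}, "TH01": {6, 9}, "FGA": {21, 24}}
    suspect = {"D8S1179": {12, 14}, "TH01": {6, 9}, "FGA": {21, 24}}

    def profiles_match(a, b):
        """Two single-source profiles match only if every locus typed in
        both samples carries exactly the same pair of alleles."""
        shared = a.keys() & b.keys()
        return bool(shared) and all(a[locus] == b[locus] for locus in shared)

    print(profiles_match(evidence, suspect))  # True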

J. Edgar Hoover, The Editors of Encyclopaedia Britannica

biometrics, measures of individuals’ unique physical characteristics or behavioral traits that are typically used in automated recognition technology to verify personal identity. Physical characteristics used include fingerprints, faces, retinas, and voice patterns. Biometric authentication may be used to manage an individual’s access to resources such as buildings, rooms, computers, and phones.

History

Timeline: Biometrics Technology
  • 1879: Alphonse Bertillon introduces signaletics in Paris
  • 1880s: the Parisian police force adopts signaletics
  • 1892: Francis Galton develops a fingerprint classification system
  • 1903: New York state prisons begin using fingerprinting to identify incarcerated individuals
  • 1924: the FBI establishes an identification division to store fingerprint records
  • 1936: Frank Burch suggests using iris patterns as biometric data
  • 1952: Bell Labs designs Audrey, the first rudimentary speech recognition system
  • 1960s: Woodrow (“Woody”) Bledsoe semiautomates facial recognition
  • 1970s: speech recognition systems are further developed with funding from the U.S. Department of Defense
  • 1985: David Sidlauskas develops a hand-geometry recognition system, and Joseph Rice develops vascular pattern recognition
  • 1990: the company Dragon releases the first voice recognition system for consumers
  • 1994: John Daugman develops an iris recognition system
  • 2013: Apple introduces Touch ID for the new iPhone 5S
  • 2017: Apple introduces Face ID for the new iPhone X

Automated biometric systems did not become popular until the 1990s, but the ideas on which they are based originated thousands of years ago. Ancient Egyptians identified trustworthy traders by their physical descriptions. In Babylon, merchants used fingerprints as identification when recording business transactions on clay tablets dating back to 500 BCE. Chinese merchants in the 14th century likewise used fingerprints as identifiers for business transactions.

Official biometric classification systems for security purposes began to appear in the late 1870s and early 1880s. In 1879 Paris police officer Alphonse Bertillon developed and introduced an identification system known as Bertillonage, or “signaletics,” which identified people by head and body measurements as well as by the shape of their ears, eyes, eyebrows, and mouth. The system also accounted for tattoos, scars, and personality traits. The information was recorded on cards that included photographs of individuals’ front and side profiles. The Parisian police force adopted the system in the early 1880s.

In 1892 British scientist Francis Galton published a book about a fingerprint classification system he had developed based on three main pattern types: loops, whorls, and arches. Although his initial purpose was to identify distinctions in fingerprints between different races (reasoning that did not hold up to experimentation), his system showed that no two people’s fingerprints are the same and that fingerprints remain the same throughout a person’s life. Four years later, after consulting Galton, Edward Henry, inspector general of the Bengal Police in India, implemented a fingerprinting system; his assistant Khan Bahadur Qazi Azizul Haque developed the mathematical formula used to classify the prints. The Henry Classification System, as it came to be known, laid the groundwork for the systems used by the Federal Bureau of Investigation (FBI) and other law enforcement organizations for many years, and it effectively replaced the Bertillon system in the early 1900s.

In 1903 New York state prisons began using fingerprints to identify people released from their custody, after which several other states and law enforcement agencies established their own fingerprinting systems. In 1924 the FBI established an identification division to serve as a national repository and clearinghouse for fingerprint records.

Over the next several decades more biometric technologies were developed. Some notable advancements include the semiautomation of facial recognition, developed by Woodrow (“Woody”) Bledsoe in the 1960s; speech recognition systems funded by the U.S. Department of Defense in the 1970s; a hand-geometry recognition system devised by David Sidlauskas in 1985; vascular pattern recognition developed by Joseph Rice in 1985; and an iris recognition system devised by John Daugman in 1994.

Types of biometric systems

Physiological biometrics are based on relatively fixed individual features, such as facial contours, fingerprints, iris patterns, and vein geometry. Systems that use physiological biometrics include fingerprint scanners, facial recognition, ear authentication, and retina scanners.

Behavioral biometrics, on the other hand, rely on an individual’s characteristic voluntary and involuntary movements and gestures. Examples include typing rhythm, walking, and voice, which correspond to the biometric methods known as keystroke recognition, gait analysis, and voice recognition.

Biometric authentication systems rely on three components: a scanning device or reader that captures an individual’s data, a database with stored information, and software that processes the individual’s data and searches the database for a match.
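
A skeletal sketch of those three components, with invented names and a toy feature extractor standing in for real signal processing, might look like the following (the mode examples in the next section build on it):

    from dataclasses import dataclass, field

    def extract_features(sample: bytes) -> tuple:
        """Stand-in for the reader/processing step; a real system would
        derive minutiae, an embedding, or the like from the raw scan."""
        return tuple(sample[i % len(sample)] for i in range(8))

    @dataclass
    class BiometricSystem:
        # The stored database: user ID -> enrolled template.
        templates: dict = field(default_factory=dict)

        def enroll(self, user_id: str, sample: bytes) -> None:
            self.templates[user_id] = extract_features(sample)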

Modes of operation

Biometric systems have two different modes. Verification mode confirms an individual’s identity by a one-to-one comparison of the individual’s captured data with existing data about that individual found in a system database. Identification mode identifies an unnamed individual by performing a one-to-many comparison of the individual’s captured data to find a match in a system database.
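
Continuing the hypothetical sketch above, the two modes differ only in whether the matching software compares the captured data against one stored template or against all of them (the similarity measure and threshold are invented placeholders):

    def similarity(a: tuple, b: tuple) -> float:
        """Toy score in [0, 1]; real matchers are far more sophisticated."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def verify(system: BiometricSystem, claimed_id: str, sample: bytes,
               threshold: float = 0.9) -> bool:
        """Verification mode: one-to-one comparison against the claimed identity."""
        stored = system.templates.get(claimed_id)
        return (stored is not None and
                similarity(extract_features(sample), stored) >= threshold)

    def identify(system: BiometricSystem, sample: bytes, threshold: float = 0.9):
        """Identification mode: one-to-many search for the best match, if any."""
        features = extract_features(sample)
        scores = {uid: similarity(features, t) for uid, t in system.templates.items()}
        best = max(scores, key=scores.get, default=None)
        return best if best is not None and scores[best] >= threshold else None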

Systems are also differentiated based on how many markers they use. Unimodal authentication systems capture and analyze biometric data from a single biometric marker (e.g., a retina scan), while multimodal authentication systems capture and analyze biometric data from two or more identifiers (e.g., a retina scan combined with voice recognition), making it more difficult for bad actors to access private information.
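
A common way to combine identifiers is score-level fusion, in which each modality’s match score is folded into a single weighted decision. A minimal sketch, with invented weights and threshold:

    def multimodal_verify(scores: dict, weights: dict, threshold: float = 0.85) -> bool:
        """Fuse per-modality match scores (each in [0, 1]) into a weighted
        average and compare the result against a decision threshold."""
        total = sum(weights[m] for m in scores)
        fused = sum(scores[m] * weights[m] for m in scores) / total
        return fused >= threshold

    # A strong retina match can compensate for a middling voice match:
    multimodal_verify({"retina": 0.97, "voice": 0.78},
                      {"retina": 0.7, "voice": 0.3})  # True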

Privacy

The use of biometrics has raised concerns about privacy infringement. Biometrics can be used to access personal data for purposes other than those for which it was originally collected (called function creep) or to collect personal data without an individual’s knowledge or consent. Well-designed biometric systems aim to provide convenience and can protect individuals from unauthorized parties who might seek to steal their identities.

When using biometric data to protect a user’s privacy, there is always the chance that the data may be compromised. For example, in 2018 the largest ID database in the world, Aadhaar, was hacked by malicious actors who collected users’ fingerprints and iris scans, among other personal information. In a breach that spanned multiple sectors, data from BioStar 2, a biometric security system created by the firm Suprema and used by both police and banking customers, was exposed in 2019. Such breaches highlight the risks of using biometric technology as a security measure. For example, breaches could expose whether someone had accessed a certain type of health care or attended a confidential meeting. Biometric data may even be obtained from users’ social media profiles. Makeup tutorials and the like reveal influencers’ eye shapes, ear shapes, and voices, among other data. Such information can be scraped for nefarious purposes, such as creating deepfakes or accessing accounts through voice recognition.

Common uses

In banking and credit card processing, biometric systems are used to manage customer and employee identities to help combat fraud and increase transaction security. In an industry where passwords and PINs are often insufficient to prevent hacking and security leaks, biometrics add a layer of security. This is especially true of behavioral biometrics, which can alert banks to unusual customer activity based on the speed at which customers respond to an alert or the manner in which they enter their password.
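
A toy version of such behavioral monitoring might flag a session whose typing rhythm strays too far from a customer’s historical baseline; the statistics and cutoff below are invented for illustration.

    import statistics

    def typing_is_anomalous(baseline_ms, session_ms, z_cutoff=3.0):
        """Flag a session whose mean inter-keystroke interval (milliseconds)
        deviates from the customer's baseline by more than z_cutoff
        standard deviations."""
        mean = statistics.mean(baseline_ms)
        spread = statistics.stdev(baseline_ms) or 1.0  # guard against zero spread
        return abs(statistics.mean(session_ms) - mean) / spread > z_cutoff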

Hospitals use biometric systems to create digital profiles of patients, complete with their medical histories, in order to accurately track each patient, identify patients in an emergency, and make sure the right patient gets the right care. The technology can also be used for newer applications, such as prescribing medication remotely.

In travel, airports use biometric systems to rapidly identify passengers through real-time screenings for a smoother airport experience. Electronic passports (e-passports) that store biometric data are also available.

Apple introduced the Face ID feature on iPhones starting with the iPhone X in 2017. The system replaced the previous fingerprint-based Touch ID feature, which had been introduced with the iPhone 5S in 2013. Face ID projects thousands of tiny infrared dots onto a user’s face to create a map of its contours and also captures an infrared image of the face. The system then compares the captured data with the facial map stored on the phone and unlocks the device when they match. (Facial mapping information is stored only on the device, not remotely on the cloud.) Face ID, which Apple states is more secure than Touch ID, appears on iPhones and the iPad Pro.
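
Apple’s matching pipeline is proprietary, but facial recognition systems of this general kind are often described as comparing numeric “embeddings” of the enrolled and captured faces. A generic, hypothetical analogue of that final comparison step, with an invented threshold:

    import math

    def cosine_similarity(a, b):
        """Cosine of the angle between two equal-length embedding vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))

    def face_unlock(captured, enrolled, threshold=0.8):
        """Unlock only if the captured face's embedding is close enough to
        the embedding enrolled on the device (stored locally, as noted above)."""
        return cosine_similarity(captured, enrolled) >= threshold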

Laura Payne