computational linguistics, language analysis that uses computers. Computational analysis is often applied to the handling of basic language data—e.g., making concordances and counting frequencies of sounds, words, and word elements—although numerous other types of linguistic analysis can be performed by computers.
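The two basic tasks mentioned above, counting word frequencies and building a concordance (an index of each occurrence of a word together with its surrounding context), can be illustrated with a short sketch. The following Python code is not drawn from any particular system described here; the function names, the simple tokenizer, and the sample sentence are illustrative assumptions.

```python
# Minimal sketch (illustrative only): word-frequency counting and a
# simple concordance over a short text.
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into word tokens (a simplifying assumption)."""
    return re.findall(r"[a-z']+", text.lower())

def word_frequencies(text):
    """Return a Counter mapping each word to its number of occurrences."""
    return Counter(tokenize(text))

def concordance(text, keyword, window=3):
    """List every occurrence of `keyword` with `window` words of context on each side."""
    tokens = tokenize(text)
    lines = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            lines.append(f"{left} [{tok}] {right}")
    return lines

sample = ("Computational linguistics applies computers to language analysis; "
          "computers count words and computers build concordances of words.")
print(word_frequencies(sample).most_common(3))
for line in concordance(sample, "computers"):
    print(line)
```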
Interest in computational linguistics began with the advent of electronic digital computers after the end of World War II, and from about 1955 to 1965 researchers in the United States and Great Britain undertook projects in machine (or mechanical) translation, particularly of Russian, involving grammatical and semantic analysis of sentences. Support for research in machine translation diminished after it became apparent that producing automatic translations of high quality was far more difficult than had been anticipated.
Beginning in the late 1960s, research in computational linguistics drew on work in artificial intelligence, particularly efforts to create programs that could understand language. As computers became more powerful and the amount of text available online grew with the development of the World Wide Web, researchers developed statistical approaches to studying language that improved computers' ability to process and understand it.
Techniques developed in computational linguistics have been applied in other fields: the study of literary style, for example, often relies on frequency counts of language elements, and information retrieval commonly employs automated grammatical analysis.