research and development


Also known as: R and D, R&D

research and development, in industry, two intimately related processes by which new products and new forms of old products are brought into being through technological innovation.

Introduction and definitions

Research and development, a phrase unheard of in the early part of the 20th century, has since become a universal watchword in industrialized nations. The concept of research is as old as science; the concept of the intimate relationship between research and subsequent development, however, was not generally recognized until the 1950s. Research and development is the beginning of most systems of industrial production. The innovations that result in new products and new processes usually have their roots in research and typically follow a path from laboratory idea, through pilot or prototype production and manufacturing start-up, to full-scale production and market introduction. The foundation of any innovation is an invention. Indeed, an innovation might be defined as the application of an invention to a significant market need. Inventions come from research—careful, focused, sustained inquiry, frequently proceeding by trial and error. Research can be either basic or applied, a distinction that was established in the first half of the 20th century.

Basic research is defined as the work of scientists and others who pursue their investigations without conscious goals, other than the desire to unravel the secrets of nature. In modern programs of industrial research and development, basic research (sometimes called pure research) is usually not entirely “pure”; it is commonly directed toward a generalized goal, such as the investigation of a frontier of technology that promises to address the problems of a given industry. An example of this is the research being done on gene splicing or cloning in pharmaceutical company laboratories.

Applied research carries the findings of basic research to a point where they can be exploited to meet a specific need, while the development stage of research and development includes the steps necessary to bring a new or modified product or process into production. In Europe, the United States, and Japan the unified concept of research and development has been an integral part of economic planning, both by government and by private industry.

History and importance

The first organized attempt to harness scientific skill to communal needs took place in the 1790s, when the young revolutionary government in France was defending itself against most of the rest of Europe. The results were remarkable. Explosive shells, the semaphore telegraph, the captive observation balloon, and the first method of making gunpowder with consistent properties all were developed during this period.

The lesson was not learned permanently, however, and another half century was to pass before industry started to call on the services of scientists to any serious extent. At first the scientists consisted of only a few gifted individuals. Robert W. Bunsen, in Germany, advised on the design of blast furnaces. William H. Perkin, in England, showed how dyes could be synthesized in the laboratory and then in the factory. William Thomson (Lord Kelvin), in Scotland, supervised the manufacture of telecommunication cables. In the United States, Leo H. Baekeland, a Belgian, produced Bakelite, the first of the plastics. There were inventors, too, such as John B. Dunlop, Samuel Morse, and Alexander Graham Bell, who owed their success more to intuition, skill, and commercial acumen than to scientific understanding.

While industry in the United States and most of western Europe was still feeding on the ideas of isolated individuals, in Germany a carefully planned effort was being mounted to exploit the opportunities that scientific advances made possible. Siemens, Krupp, Zeiss, and others were establishing laboratories and, as early as 1900, employed several hundred people on scientific research. In 1870 the Physikalisch-Technische Reichsanstalt (Imperial Institute of Physics and Technology) was set up to establish common standards of measurement throughout German industry. It was followed by the Kaiser-Wilhelm-Gesellschaft (later renamed the Max Planck Society for the Advancement of Science), which provided facilities for scientific cooperation between companies.


In the United States, the Cambria Iron Company set up a small laboratory in 1867, as did the Pennsylvania Railroad in 1875. The first laboratory to consume a significant share of its parent company’s revenues was that of the Edison Electric Light Company, which employed a staff of 20 in 1878. The U.S. National Bureau of Standards was established in 1901, 31 years after its German counterpart, and it was not until the years immediately preceding World War I that the major American companies started to take research seriously. It was in this period that General Electric, Du Pont, American Telephone & Telegraph, Westinghouse, Eastman Kodak, and Standard Oil set up laboratories for the first time.

Except for Germany, progress in Europe was even slower. When the National Physical Laboratory was founded in England in 1900, there was considerable public comment on the danger to Britain’s economic position of German dominance in industrial research, but there was little action. Even in France, which had an outstanding record in pure science, the penetration of science into industry was negligible.

World War I produced a dramatic change. Attempts at rapid expansion of the arms industry in the belligerent as well as in most of the neutral countries exposed weaknesses in technology as well as in organization and brought an immediate appreciation of the need for more scientific support. The Department of Scientific and Industrial Research was founded in the United Kingdom in 1915, and the National Research Council in the United States in 1916. These bodies were given the task of stimulating and coordinating scientific support for the war effort, and one of their most important long-term achievements was to convince industrialists, in their own countries and in others, that adequate and properly conducted research and development were essential to success.

At the end of the war the larger companies in all the industrialized countries embarked on ambitious plans to establish laboratories of their own; and, in spite of the inevitable confusion in the control of activities that were novel to most of the participants, there followed a decade of remarkable technical progress. The automobile, the airplane, the radio receiver, the long-distance telephone, and many other inventions developed from temperamental toys into reliable and efficient mechanisms in this period. The widespread improvement in industrial efficiency produced by this first major injection of scientific effort went far to offset the deteriorating financial and economic situation.

The economic pressures on industry created by the Great Depression reached crisis levels by the early 1930s, and the major companies started to seek savings in their research and development expenditure. It was not until World War II that the level of effort in the United States and Britain returned to that of 1930. Over much of the European continent the depression had the same effect, and in many countries the course of the war prevented recovery after 1939. In Germany Nazi ideology tended to be hostile to basic scientific research, and effort was concentrated on short-term work.

The picture at the end of World War II provided sharp contrasts. In large parts of Europe industry had been devastated, but the United States was immensely stronger than ever before. At the same time the brilliant achievements of the men who had produced radar, the atomic bomb, and the V-2 rocket had created a public awareness of the potential value of research that ensured it a major place in postwar plans. The only limit was set by the shortage of trained persons and the demands of academic and other forms of work.

Since 1945 the number of trained engineers and scientists in most industrial countries has increased each year. The U.S. effort has stressed aircraft, defense, space, electronics, and computers. Indirectly, U.S. industry in general has benefited from this work, a situation that compensates in part for the fact that in specifically nonmilitary areas the number of persons employed in the United States is lower in relation to population than in a number of other countries.

Outside the air, space, and defense fields the amount of effort in different industries follows much the same pattern in different countries, a fact made necessary by the demands of international competition. (An exception was the former Soviet Union, which devoted less R and D resources to nonmilitary programs than most other industrialized nations.) An important point is that countries like Japan, which have no significant aircraft or military space industries, have substantially more manpower available for use in the other sectors. The preeminence of Japan in consumer electronics, cameras, and motorcycles and its strong position in the world automobile market attest to the success of its efforts in product innovation and development.