History

The steel industry has grown from ancient times, when a few men may have operated, periodically, a small furnace producing 10 kilograms of iron, to the modern integrated iron- and steelworks, with annual steel production of about 1 million tons. The largest commercial steelmaking enterprise, Nippon Steel in Japan, produced 26 million tons in 1987, and 11 other companies distributed throughout the world each had outputs of more than 10 million tons. Excluding the Eastern-bloc countries, for which employment data are not available, some 1.7 million people were employed in 1987 in producing 430 million tons of steel. That is equivalent to about 250 tons of steel per person employed per year—a remarkably efficient use of human endeavour.
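
As a quick check of the productivity figure cited above, the short Python sketch below simply divides the 1987 output by the reported workforce; every number in it is taken from the text, and nothing else is assumed.

# Rough check of the per-worker output quoted above, using the 1987
# figures given in the text (Eastern-bloc countries excluded).
steel_output_tons = 430_000_000   # about 430 million tons of steel
workforce = 1_700_000             # about 1.7 million people employed

tons_per_person = steel_output_tons / workforce
print(f"{tons_per_person:.0f} tons of steel per person per year")
# Prints roughly 253, i.e. the "about 250 tons" cited in the article.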

Primary steelmaking

Early iron and steel

Iron production began in Anatolia about 2000 bc, and the Iron Age was well established by 1000 bc. The technology of iron making then spread widely; by 500 bc it had reached the western limits of Europe, and by 400 bc it had reached China. Iron ores are widely distributed, and the other raw material, charcoal, was readily available. The iron was produced in small shaft furnaces as solid lumps, called blooms, and these were then hot forged into bars of wrought iron, a malleable material containing bits of slag and charcoal.

The carbon contents of the early irons ranged from very low (0.07 percent) to high (0.8 percent), the latter constituting a genuine steel. When the carbon content of steel is above 0.3 percent, the material will become very hard and brittle if it is quenched in water from a temperature of about 850° to 900° C (1,550° to 1,650° F). The brittleness can be decreased by reheating the steel within the range of 350° to 500° C (660° to 930° F), in a process known as tempering. This type of heat treatment was known to the Egyptians by 900 bc, as can be judged by the microstructure of remaining artifacts, and formed the basis of a steel industry for producing a material that was ideally suited to the fabrication of swords and knives.

The Chinese made a rapid transition from the production of low-carbon iron to high-carbon cast iron, and there is evidence that they could produce heat-treated steel during the early Han dynasty (206 bc–ad 25). The Japanese acquired the art of metalworking from the Chinese, but there is little evidence of a specifically Japanese steel industry until a much later date.

The Romans, who are regarded less as innovators than as organizers, helped to spread the knowledge of iron making, so that the output of wrought iron in the Roman world greatly increased. With the decline of Roman influence, iron making continued much as before in Europe, and there is little evidence of any change for many centuries in the rest of the world. By the beginning of the 15th century, however, waterpower was used to blow air into bloomery furnaces; as a consequence, the temperature in the furnace rose above 1,200° C (2,200° F), so that, instead of forming a solid bloom of iron, a liquid rich in carbon was produced—i.e., cast iron. To convert this into wrought iron by reducing the carbon content, solidified cast iron was passed through a finery, where it was melted in an oxidizing atmosphere with charcoal as the fuel. This removed the carbon and gave a semisolid bloom, which, after cooling, was hammered into shape.

Blister steel

In order to convert wrought iron into steel—that is, increase the carbon content—a carburization process was used. Iron billets were heated with charcoal in sealed clay pots that were placed in large bottle-shaped kilns holding about 10 to 14 tons of metal and about 2 tons of charcoal. When the kiln was heated, carbon from the charcoal diffused into the iron. In an attempt to achieve homogeneity, the initial product was removed from the kiln, forged, and again reheated with charcoal in the kiln. During the reheating process, carbon monoxide gas was formed internally at the nonmetallic inclusions; as a result, blisters formed on the steel surface—hence the term blister steel to describe the product. This process spread widely throughout Europe, where the best blister steel was made with Swedish wrought iron. Weapons, particularly swords, were a common product. To make a good sword, the cycle of carburizing and hammering had to be repeated about 20 times before the steel was finally quenched and tempered, making it ready for service. Thus, the material was not cheap.

About the beginning of the 18th century, coke produced from coal began to replace charcoal as the fuel for the blast furnace; as a result, cast iron became cheaper and even more widely used as an engineering material. The Industrial Revolution then led to an increased demand for wrought iron, which was the only material available in sufficient quantity that could be used for carrying loads in tension. One major problem was that wrought iron could be produced only in small batches. This was solved about the end of the 18th century by the puddling process, which converted the readily available blast-furnace iron into wrought iron. In Britain by 1860 there were 3,400 puddling furnaces producing a total of 1.6 million tons per year—about half the world’s production of wrought iron. Only about 60,000 tons were converted into blister steel in Britain; annual world production of blister steel at this time was about 95,000 tons. Blister steel continued to be made on a small scale into the 20th century, the last heat taking place at Newcastle, Eng., in 1951.

Crucible steel

A major development occurred in 1751, when Benjamin Huntsman established a steelworks at Sheffield, Eng., where the steel was made by melting blister steel in clay crucibles at a temperature of 1,500° to 1,600° C (2,700° to 2,900° F), using coke as a fuel. Originally, the charge in the crucible weighed about 6 kilograms, but by 1870 it had increased to 30 kilograms, which, with a crucible weight of 10 kilograms, was the maximum a man could be expected to lift from a hot furnace. The liquid metal was cast to give an ingot about 75 millimetres in square section and 500 millimetres long, but multiple casts were also made. Sheffield became the centre of crucible steel production; in 1873, the peak year, output was 110,000 tons—about half the world’s production. The crucible process spread to Sweden and France following the end of the Napoleonic Wars and then to Germany, where it was associated with Alfred Krupp’s works in Essen. A small crucible steelworks was started in Tokyo in 1895, and crucible steel was produced in Pittsburgh, Pa., U.S., from 1860, using a charge of wrought iron and pig iron.
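
As a rough cross-check on these figures, the Python sketch below estimates the mass of an ingot of the stated dimensions. The density of about 7.8 grams per cubic centimetre is a typical value for steel and is an assumption introduced here, not a figure given in the text.

# Back-of-the-envelope estimate of the ingot mass from the dimensions
# quoted above. Steel density (~7.8 g/cm^3) is an assumed typical
# value, not a figure from the article.
side_cm = 7.5        # 75-millimetre square section
length_cm = 50.0     # 500 millimetres long
density_g_per_cm3 = 7.8

volume_cm3 = side_cm * side_cm * length_cm
mass_kg = volume_cm3 * density_g_per_cm3 / 1000.0
print(f"Estimated ingot mass: {mass_kg:.0f} kg")
# Gives roughly 22 kg, a figure consistent with the 30-kilogram
# crucible charge mentioned above.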

The crucible process allowed alloy steels to be produced for the first time, since alloying elements could be added to the molten metal in the crucible, but it went into decline from the early 20th century as electric-arc furnaces became more widely used. The last crucible furnace in Sheffield is believed to have operated until 1968.