Iron And Steel

Since the Iron Age, iron has been the most commonly used metal. Iron ores are widely available: the most common, occurring as rocks, are siderite, haematite and limonite. Iron may also be obtained from the ‘bog iron ore’ precipitated underwater or in boggy waterlogged ground. Although iron is much more abundant than copper — it makes up fully five per cent of the earth’s crust — its exploitation comes later in history and its technology follows a vastly different course, for reasons inherent in the metal itself. Far more than any other metal in common use, iron is affected in its properties by the techniques used to produce and work it, from the smelting process (which may add alloying carbon) to the final shaping, hammering or heat treatment. The exploitation of any metal naturally depends on the cost of producing it and the qualities it can be seen to offer; in the case of iron, however, the most useful properties of the metal are not readily apparent and were only slowly evoked by the ingenuity of generations of craftsmen. The history of iron thus cannot be understood without giving close consideration to methods of production and treatment. Pure iron melts at 1535°C, a temperature inaccessible before the 19th century AD. Hence in antiquity iron could be reduced from its ore in molten form only if during the smelting process it absorbed carbon from a charcoal fire, thereby forming an alloy with a lower melting point. When the carbon content reaches 3 to 4.5 per cent the melting point falls to about 1150°C, not much higher than the melting point of copper (1083°C) or the temperature of a good pottery kiln. The resulting alloy, called cast iron or pig iron, is hard and brittle (too brittle to make a dependable sword). In China iron was regularly produced in the form of cast iron from the time of the metal’s first exploitation around the 6th century BC (see iron and steel, China). 
Elsewhere in the world, however, the production of iron depended on processes in which the metal was never melted but was instead obtained in solid form. Iron can be reduced from its ore at a temperature well below the melting point to form a spongy mass called a bloom; the bloom can then be welded into a compact mass and purified of slag by repeated forging (hammering while hot). The iron so produced, which does not contain carbon, is called wrought iron; it is both softer and tougher (stronger, less brittle) than cast iron, and it could not be melted in any furnace before modern times. Iron was probably first produced, in the form of wrought iron, as an accidental by-product of lead and copper smelting, for in the ancient Near East the iron ore haematite was often added to siliceous ores of lead and copper as a flux. (A flux is a material which combines with the earthy parts of the ore to form a slag that is liquid at the furnace temperature and easily separated from the metal.) The deliberate production of iron originated perhaps in Anatolia around 2000 BC. It was probably the process of forging the white-hot bloom that led to the discovery of steel — or rather, to the discovery that careful treatment of wrought iron could improve its properties enormously. Steel is an alloy of iron and carbon containing about 0.3 to 1.0 per cent carbon. It is therefore intermediate in carbon content between wrought iron (no carbon) and cast iron (3 to 4.5 per cent carbon), and in principle can be made either by adding carbon to wrought iron or by removing it from cast iron. In the West, where cast iron does not seem to have been made deliberately before the 14th century AD, the manufacture of steel prior to that time began with wrought iron and added carbon to it by a simple process called cementation: a piece of wrought iron deeply embedded in a charcoal fire is converted to steel by prolonged heating, which allows it to absorb small amounts of carbon by solid-state diffusion. 
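The carbon-content ranges just described amount to a simple classification scheme. A minimal sketch in Python, using only the approximate figures quoted in this entry (not modern metallurgical standards; the function name and the ‘intermediate’ label for carbon contents the entry does not name are illustrative assumptions):

```python
def classify_iron_alloy(carbon_pct):
    """Classify an iron-carbon alloy by the approximate carbon ranges
    quoted in this entry: wrought iron (no carbon), steel (~0.3-1.0%),
    cast iron (~3-4.5%)."""
    if carbon_pct < 0.3:
        return "wrought iron"   # soft, tough; unmeltable in ancient furnaces
    elif carbon_pct <= 1.0:
        return "steel"          # hardenable by quenching and tempering
    elif carbon_pct >= 3.0:
        return "cast iron"      # hard, brittle; melts at about 1150 C
    else:
        return "intermediate"   # range not named in the entry

# Approximate melting points quoted in the entry, in degrees Celsius:
MELTING_POINTS_C = {"pure iron": 1535, "cast iron": 1150, "copper": 1083}
```

For example, `classify_iron_alloy(0.6)` returns `"steel"`, the only one of the three alloys that responds to quench-hardening.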
Although smiths stumbled on the cementation process before the end of the 2nd millennium BC, the essential alloying role of carbon was not realized until the 18th century AD: until then ‘steel’ was simply the name given to a mysteriously fine iron that a good smith knew how to produce. The smith certainly never guessed that his charcoal fire was adding carbon to the iron to form an alloy; indeed, he was more likely to believe that he was purifying the iron in ‘the refiner’s fire’. The property that sets steel apart is its response to heat treatment. Like any other metal, steel can be hardened to some extent by hammering (work-hardening); unlike other common metals, it can be hardened dramatically by quenching (rapid cooling from temperatures above red heat, 725°C). The exact compromise between hardness (which entails brittleness) and toughness best suited to any particular application can then be achieved by tempering (reducing the hardness by reheating to temperatures between about 200 and 400°C). It should be noted that the presence of carbon is essential to these processes: quenching has little effect on pure iron. The difficulty of producing a fine sword by carburizing, forging, quenching and tempering will be appreciated if it is kept in mind that the smith had no way of judging the carbon content of the metal and had only its colour as a measure of its temperature. Before the mastery of carburizing and quenching, iron could not be made to equal the performance of a good work-hardened bronze, and its use is likely to reflect only the abundance of its ores and the consequent cheapness of the metal. Carburized and quench-hardened, however, iron becomes immensely superior to bronze. In the Near East a few examples of quench-hardened steel can be dated as early as the end of the 2nd millennium BC, but the difficulties of the process are such that it was not widely used until much later. 
The process of quenching steel was certainly known to Homer, who used it as a simile for the blinding of Polyphemos (Odyssey, Book 9). It might be added that the techniques required to make good steel could not be borrowed ready-made from the bronze workshop: bronze and copper are hammered cold rather than hot (red-hot bronze may even shatter when struck), they do not form alloys with carbon, and quenching has no effect on them (heating copper or bronze anneals it, i.e. softens it, regardless of whether the subsequent cooling is fast or slow). As long as it was forged from wrought iron by smiths, steel could be made only in fairly small quantities. The proud names given to swords like Excalibur and Durandal hint that steel-making was not an industry but an art — and perhaps also that even the most expert smith could not make two swords alike. Large-scale steel manufacture depends on the production of cast iron, which in Europe dates only from the 14th century AD. The West did not enter the ‘Age of Steel’ until the 19th century with the invention of the Bessemer and Siemens processes, which are industrial processes for obtaining liquid metal of any desired carbon content by the decarburization of cast iron. In principle these modern techniques were anticipated by many centuries in China, where steel was made from cast iron as early as the last few centuries BC; see iron and steel, China.

The Macmillan Dictionary of Archaeology, Ruth D. Whitehouse, 1983
