
History of Thermometry

The History of Thermometers

The history of temperature measurement stretches back thousands of years. Temperature has always been an essential part of daily life and society, ever since bakers and blacksmiths first relied on it to control chemical reactions.

Temperature is vitally important to these reactions; for instance, the many different chemical reactions needed to keep a human being alive will only work at a temperature very close to 37 degrees Celsius, and the body has developed elaborate temperature-control mechanisms to keep temperature constant.

Nowadays, temperature is better understood than ever, and a wide range of temperature-measuring equipment – thermoscopes, thermocouples and many types of thermometer – is used to measure and help control it.

The first known writers on temperature and its measurement were Philo of Byzantium and Heron of Alexandria. Both of these men wrote in Ancient Greek, and the word ‘thermometer’ comes from the Greek ‘thermos’, meaning ‘hot’, and ‘metron’, meaning ‘measure’; the word ‘thermometer’ therefore literally means ‘measurer of heat’.

Philo of Byzantium (c. 280 – 220 B.C.) was a Greek engineer who conducted an early experiment on the expansion of air with heat, creating a device which has been called the first thermometer. A tube connected to a hollow sphere was extended over a jug of water. Philo noticed that when the sphere was in the sun, bubbles were released in the jug as air expanded out of the sphere, whereas when the device was placed in the shade, the air contracted as it cooled and the water rose up the tube again.

Heron of Alexandria (10 – 70 A.D.), a Greek mathematician and engineer, also wrote on the observation of temperature and drew up plans for a basic thermometer for use in medicine.

However, neither of these writers developed their designs further. The invention of the first working thermometer has been credited variously to Abu Ali Ibn Sina (known in the West as Avicenna), Cornelius Drebbel, Robert Fludd, Galileo Galilei, and Santorio Santorio.

Abu Ali Ibn Sina (980 – 1037 A.D.) was a Persian polymath, physician and Islamic philosopher who created a simple thermometer to test the temperature of air. Cornelius Drebbel (1572 – 1633 A.D.) was a Dutch engineer and inventor of the submarine. Interestingly, Drebbel discovered carmine dye when one of his thermometers, which used coloured liquid, broke on a windowsill and he noticed that the dye grew more intense in colour when exposed to the sun. Galileo Galilei (1564 – 1642), the famous Tuscan physicist, mathematician and astronomer, devised a device for registering temperature change at the height of the Scientific Revolution. He also observed the principle behind the device known today as ‘Galileo’s thermometer’ – that glass spheres filled with aqueous alcohol of slightly different densities will rise and fall as the temperature changes.

However, none of these early designs were true thermometers. They were in fact thermoscopes rather than thermometers, as the absence of a scale meant that they only registered changes in temperature rather than measuring it. A true thermometer must include a temperature sensor - where physical change occurs with changes in temperature - and a means of converting that physical change into a readable value. For a long time, this meant a bulb (containing some form of liquid) and a scale, generally displayed on the tube through which the liquid expanded or contracted with changes in temperature.

The invention of the first true thermometer is generally credited to Robert Fludd (1574 – 1637 A.D.), an English Paracelsian physician, astrologer, and mystic. Although the first detailed diagram of a thermoscope was created by Giuseppe Biancani (1566 – 1624 A.D.), an Italian Jesuit astronomer and mathematician, it was Fludd who produced the first diagram of a true thermometer, with both a temperature sensor and a scale.

The first person to develop the idea of the thermometer and actively use it was Santorio Santorio (1561 – 1636 A.D.), an Italian physiologist, physician and professor. He developed a clinical thermometer for use in his experiments at the University of Padua, and claimed to have produced it by adapting Heron of Alexandria’s thermoscope design. Santorio used his thermometer to estimate the heat of a patient’s heart by measuring the temperature of the air the patient breathed out.

All these early thermoscopes and thermometers shared the same design flaw. They were sensitive to air pressure as well as temperature, and therefore functioned partly as barometers rather than as pure thermometers. The first thermometer which gave a clear reading of temperature, unaffected by any other factor, was invented by Ferdinando II de’ Medici in 1654. Ferdinando (1610 – 1670 A.D.), Grand Duke of Tuscany, created the first modern thermometer, and the blueprint for many designs to come: a sealed tube partially filled with alcohol, with a bulb and a stem. Because the tube was sealed, air pressure no longer affected the movement of the alcohol up or down the stem, leaving temperature as the only thing being measured.

However, one big problem remained in the thermometer industry: every thermometer manufacturer had his own scale and his own system for measuring temperature, and these scales were not standardised or calibrated to one another.

An early attempt at encouraging the use of a universal scale came in October 1663, when the Royal Society in London proposed that one of Robert Hooke’s many thermometer scales be used as the industry standard (Hooke was an English natural philosopher, architect and inventor).

Still, the Royal Society had no real power to enforce its recommendation, and a variety of thermometers and scales remained in use. Slowly a standard evolved: Christiaan Huygens in 1665 suggested the melting and boiling points of water as standard lower and upper limits, and in 1701 Isaac Newton proposed a scale of twelve degrees, with melting ice and body temperature as its extremes.

Eventually, it was market forces which decided which thermometer scale would become standard. Ole Christensen Rømer (1644 – 1710 A.D.), the royal mathematician of Denmark and a noted astronomer, created a scale whose upper limit was body temperature (measured in a healthy adult male’s armpit) and whose lower limit was the temperature of a mixture of salt and ice. This is known as a ‘frigorific’ mixture: the temperatures of the two ingredients can vary, but the mixture always settles at the same temperature.

However, it was when Daniel Gabriel Fahrenheit visited Rømer in 1708, and began using an adapted version of his scale in 1724, that a standard really caught on. Fahrenheit (1686 – 1736 A.D.), a German physicist and engineer, was the first thermometer manufacturer to fill his thermometers with mercury instead of alcohol. Mercury expands more uniformly with temperature change than alcohol, so a mercury thermometer can give a more accurate reading. Fahrenheit’s thermometers therefore became the most popular designs, and eventually the standard ones. Because those buying the thermometers had to use the scale with which they came equipped, his scale eventually became the standard one as well, and it still bears his name today.

Fahrenheit wanted a scale divisible by twelve, and so he set his upper point (body temperature) at 96 degrees. As body temperature varies, the upper fixed point of the Fahrenheit scale was later changed to the boiling point of water, defined as 212 degrees. Nowadays, the Fahrenheit scale is widely used only in the United States of America and a few other countries (for example, Belize). The scale most widely used in thermometers of all kinds is the Celsius scale.

The Celsius scale was developed by Anders Celsius (1701 – 1744 A.D.), a Swedish astronomer who devised a scale of 100 degrees, with zero as the boiling point of water and 100 as its freezing point. He set this scale out in his paper ‘Observations of two persistent degrees on a thermometer’ in 1742. As he died just two years later, his assistant Carolus Linnaeus was instrumental in developing and publicizing the scale, and in encouraging its use among thermometer manufacturers.

Linnaeus reversed the scale, making zero the freezing point of water and 100 its boiling point, and used it in his ‘linnaeus-thermometers’, which were designed for use in greenhouses.

The scale caught on, with the endorsement of such figures as Daniel Ekstrom, Sweden's leading instrument-maker at the time, and Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences. Since about 1950, the ‘centigrade’ scale (officially named the Celsius scale in 1948) has been the most widely used thermometer scale worldwide, and is used in thermometers of all kinds and in all industries, with the exception of some scientific fields (e.g. astrophysics or low-temperature research) where the specialised Kelvin scale is used instead.
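The three scales mentioned in this article are related by fixed linear formulas, anchored to the reference points described above: water freezes at 32 °F / 0 °C and boils at 212 °F / 100 °C, and the Kelvin scale starts at absolute zero (−273.15 °C). As a minimal illustration (the function names here are our own), the conversions can be written in Python:

```python
# Conversions between the temperature scales discussed above.
# The formulas are standard; the function names are illustrative.

def fahrenheit_to_celsius(f):
    """Celsius from Fahrenheit: water freezes at 32 F / 0 C, boils at 212 F / 100 C."""
    return (f - 32.0) * 5.0 / 9.0

def celsius_to_fahrenheit(c):
    """Fahrenheit from Celsius (the inverse of the conversion above)."""
    return c * 9.0 / 5.0 + 32.0

def celsius_to_kelvin(c):
    """Kelvin from Celsius: the Kelvin scale starts at absolute zero, -273.15 C."""
    return c + 273.15

print(fahrenheit_to_celsius(212.0))  # boiling point of water -> 100.0
print(celsius_to_kelvin(0.0))        # freezing point of water -> 273.15
```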

The development of thermometers has moved quickly since the eighteenth century. In 1866, Sir Thomas Clifford Allbutt devised a clinical thermometer which produced a body temperature reading in five minutes rather than twenty. Since then, thermometers have become essential and highly accurate devices used to analyse and control chemical reactions in fields as diverse as astrophysics, restaurant catering, and electronics manufacturing.

Since the introduction of the International Temperature Scale of 1990 (ITS-90), many different thermometer designs have been required to cover the whole range of temperatures. These range from ‘absolute zero’, where all thermal energy has been removed from a substance, to very high temperatures – thermometers have been developed that can even measure the temperature of the surface of the sun (about 5,500 degrees Celsius)!

Nowadays, many different types of thermometer exist, including the alcohol thermometer, the mercury thermometer, the medical thermometer, the reversing thermometer, the maximum minimum thermometer, the thermistor, the thermocouple, the coulomb blockade thermometer, the Beckmann differential thermometer, the bi-metal mechanical thermometer, the silicon bandgap temperature sensor, and the liquid crystal thermometer. However, the most common for general manufacturing purposes remains the electronic thermometer, which uses a tiny microchip to process the sensor’s signal into a temperature reading.

The thermocouple is now the most widely used thermometer, or 'temperature sensor.' It uses electrical technology to show temperature. Two wires of different metals are joined at one end to form a probe, which acts as the sensor placed in the substance or atmosphere to be tested; the other ends connect to the instrument itself. The word 'thermocouple' comes from this 'coupling' of two different metals.

The temperature difference between the probe and the instrument end of the wires (the 'reference junction') produces a small voltage. As the temperature of the reference junction is already known, the measured voltage lets us easily deduce the temperature at the probe. This deduction is usually carried out electronically by a tiny microchip inside the instrument, so that the scale or display on a thermocouple thermometer simply shows the temperature which the probe has sensed.
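That deduction can be sketched in a few lines. Real instruments use standard polynomial reference tables (such as the NIST ITS-90 thermocouple tables) rather than a single coefficient; the figure below is a rough round number for a type K thermocouple near room temperature, used purely for illustration:

```python
# Sketch of the deduction a thermocouple instrument performs, using a
# linear approximation of the thermoelectric effect. The coefficient is
# an assumed round figure (~41 microvolts per degree C, roughly type K
# near room temperature), not a calibration value.

SEEBECK_UV_PER_C = 41.0  # assumed sensitivity, microvolts per degree Celsius

def probe_temperature_c(measured_microvolts, reference_c):
    """Estimate the probe temperature from the thermocouple voltage.

    The voltage is roughly proportional to the temperature difference
    between the probe and the reference junction, so:
        T_probe ~ T_reference + V / S
    """
    return reference_c + measured_microvolts / SEEBECK_UV_PER_C

# Example: 2050 microvolts measured with the reference junction at 25 C
# suggests a probe temperature of about 75 C.
print(probe_temperature_c(2050.0, 25.0))  # -> 75.0
```

In practice the microchip also measures the reference-junction temperature itself (so-called cold-junction compensation) rather than assuming it.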

Thermocouples are used extensively in electrical engineering and industry. For instance, they are essential in fields such as heating appliance safety, radiation testing and many areas of manufacturing. The principle behind thermocouples was discovered by the Baltic German physicist Thomas Johann Seebeck in 1821, and is known as the 'thermoelectric effect' or 'Seebeck effect'.