Temperature Sensors, Thermometers and Probes

Monday, 24 November 2014

History of Thermometry, Part 3: Explosive Development

The history of temperature stretches back for thousands of years. Temperature has always been an important and essential part of daily life and society, ever since bakers and blacksmiths relied on temperature to control chemical reactions.

Nowadays, temperature is better understood than ever, and a wide range of temperature-measuring equipment – thermoscopes, thermocouples and many types of thermometer – is necessary to measure and to help control it.

This blog series hopes to open your eyes to the history of temperature measurement, from the ancient world through to the modern day. Enjoy!
The Allbutt clinical thermometer (photo sourced from http://www.ssplprints.com/)
The development of thermometers has exploded since the eighteenth century. In 1866, Sir Thomas Clifford Allbutt devised a clinical thermometer which produced a body temperature reading in five minutes rather than twenty. 

Since then, development has followed development, with the invention of the ear thermometer by medical researcher Dr. Theodor H. Benzinger in 1964, the electronic digital thermometer by a group of inventors from Huntsville, Alabama in 1970, and the liquid crystal thermometer by inventor Bob Parker in the 1970s.

Increasingly throughout the 20th century, thermometers became essential and highly accurate devices used to analyse and control chemical reactions in fields as diverse as astrophysics, restaurant catering, and industrial manufacture.

Since the introduction of the International Temperature Scale of 1990 (ITS-90), many different thermometer designs have been required to cover the whole range of temperatures. These range from ‘absolute zero’ (−273.15 °C), where virtually all thermal energy has been removed from a substance, to very high temperatures – thermometers have even been developed that can measure the temperature of the surface of the sun (around 5,500 degrees Celsius)!

Nowadays, many different types of thermometer exist, including the alcohol thermometer, the mercury thermometer, the medical thermometer, the reversing thermometer, the maximum minimum thermometer, the thermistor, the thermocouple, the coulomb blockade thermometer, the Beckmann differential thermometer, the bi-metal mechanical thermometer, the silicon bandgap temperature sensor, and the liquid crystal thermometer. 

However, for general manufacturing purposes the most common remains the electronic thermometer, which uses a tiny microchip to pick up and measure temperature. This is safer than a mercury thermometer, as mercury is harmful to humans, and electronic thermometers are also more precise, reliable and much quicker than traditional liquid-in-glass designs.

In manufacturing, where delicate processes need to be precisely controlled, it is important to be able to rely on an accurate thermometer. Equally, in food manufacture and catering, harmful bacteria can thrive and multiply on food that is stored at too high a temperature, or on food that is cooked at too low a temperature.

Next in this series of posts, learn about the most widely used modern thermometer: the thermocouple.

Monday, 17 November 2014

History of Thermometry, Part 2: Establishing a Scale



The Fludd Thermometer (Photo sourced from: http://www.kumc.edu/)

The invention of the first true thermometer is generally credited to Robert Fludd (1574 – 1637 A.D.), an English Paracelsian physician, astrologer, and mystic. Although the first detailed diagram of a thermoscope was created by Giuseppe Biancani (1566 – 1624 A.D.), an Italian Jesuit astronomer and mathematician, it was Fludd who produced the first diagram of a true thermometer, with both a temperature sensor and a scale.

The first person to develop the idea of the thermometer and put it to active use was Santorio Santorio (1561 – 1636 A.D.), an Italian physiologist, physician and professor. He developed a clinical thermometer for use in his experiments at the University of Padua, and claimed to have produced it by adapting the design of Heron of Alexandria’s thermoscope. Santorio used his thermometer to estimate the heat of a patient’s heart by measuring the temperature of the air the patient breathed out.

All these early thermoscopes and thermometers shared the same design flaw: they were sensitive to air pressure as well as temperature, and therefore functioned partly as barometers rather than as pure thermometers. The first thermometer which gave a clear reading of temperature, unaffected by any other factor, was invented by Ferdinando II de’ Medici in 1654.


Medici (1610 – 1670 A.D.), Grand Duke of Tuscany, created the first modern thermometer, and the blueprint for many successive thermometer manufacturers. This was a sealed tube partially filled with alcohol, with a bulb and a stem. Because the tube was sealed, air pressure no longer affected the movement of the alcohol up or down the stem, leaving temperature as the only thing which was measured.

However, there was still one big problem in the thermometer industry. Every thermometer manufacturer had his own scale and his own system for measuring temperature. The scales and measurements were not standardised or calibrated to one another.

An early attempt at encouraging the use of a universal scale came in October 1663, when the Royal Society in London proposed that one of Robert Hooke’s many thermometer scales be adopted as the industry standard (Hooke was an English natural philosopher, architect and inventor).

Still, the Royal Society had no real power to enforce its recommendation, and a variety of thermometers and measures remained in use. Slowly a scale evolved: Christiaan Huygens in 1665 suggested the melting and boiling points of water as standard lower and upper limits, and in 1701 Isaac Newton proposed a scale of twelve degrees, with the extremes being melting ice and body temperature.

Eventually, it was market forces which decided which thermometer scale would become the standard. Ole Christensen Rømer (1644 – 1710 A.D.), the royal mathematician of Denmark and a noted astronomer, created a scale whose upper limit was body temperature (the temperature of a healthy adult male’s armpit) and whose lower limit was the temperature of a mixture of salt and ice. This is known as a ‘frigorific’ mixture: two materials whose individual temperatures may vary, but which always settle to the same temperature when mixed together.

However, it was when Daniel Gabriel Fahrenheit visited Rømer in 1708, and adopted a refined version of his scale in 1724, that it really caught on. Fahrenheit (1686 – 1736 A.D.), a German physicist and engineer, was the first thermometer manufacturer to fill his thermometers with mercury instead of alcohol.

Mercury is a better working liquid because its expansion corresponds more closely to temperature change, so a mercury thermometer can produce a more accurate reading than one using alcohol. Fahrenheit’s thermometers therefore became the most popular designs, and eventually the standard ones. Because buyers had to use the scale with which their thermometers came equipped, his scale eventually became the standard one as well, and it still bears his name today.

Fahrenheit wanted a scale which was divisible by twelve, and so he called his upper point (body temperature) 96 degrees. As body temperature varies, the upper limit of the Fahrenheit scale was later changed to the temperature of boiling water, which was said to be 212 degrees. Nowadays, the Fahrenheit scale is only used widely in the United States of America and a few other countries (for example, Belize). The scale most widely used in thermometers of all kinds is the Celsius scale.

The Celsius scale was developed by Anders Celsius (1701 – 1744 A.D.), a Swedish astronomer who devised a scale of 100 degrees, with zero as the boiling point of water and 100 as its freezing point. He set this scale out in his paper ‘Observations of two persistent degrees on a thermometer’ in 1742. As he died just two years later, it was the botanist Carl Linnaeus who was instrumental in developing and publicising the scale, and in encouraging its use among thermometer manufacturers.

Linnaeus reversed the scale, making zero the freezing point of water and 100 its boiling point, and used it in the thermometers he commissioned for his greenhouses.

The scale caught on, with the endorsement of such figures as Daniel Ekström, Sweden's leading instrument-maker at the time, and Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences. Since about 1950, the ‘centigrade’ scale (officially renamed the Celsius scale in 1948) has been the most widely used thermometer scale worldwide, and is used in thermometers of all kinds and in all industries, with the exception of some scientific fields (e.g. astrophysics or low-temperature research) where the specialised Kelvin scale is used instead.
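For reference, the relationships between the three scales discussed above can be sketched in a few lines of Python. The reference points are the standard ones; the function names are our own:

```python
# Conversions between the Fahrenheit, Celsius and Kelvin scales.
# Fixed points: water freezes at 0 °C / 32 °F / 273.15 K and boils
# (at standard pressure) at 100 °C / 212 °F / 373.15 K.

def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

def celsius_to_kelvin(c):
    return c + 273.15

print(celsius_to_fahrenheit(100))             # 212.0 (boiling water)
print(round(fahrenheit_to_celsius(98.6), 1))  # 37.0 ("normal" body temperature)
print(celsius_to_kelvin(-273.15))             # 0.0 (absolute zero)
```

Note how Fahrenheit's historical fixed points survive in the arithmetic: the 32-degree offset and the 180-degree span between freezing and boiling.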

Next in this series of posts, learn about the explosion of thermometer development that occurred in the 20th century.

Friday, 14 November 2014

New Developments: MM7005 USB Barcode Scanning Thermometer

Here at TME, we're always squirrelling away to develop new time-saving, money-saving and problem-solving temperature technology. In this series of posts, we'll keep an up-to-date log of all of our latest developments and newest products.
The MM7005 USB Barcode Thermometer
Would you like all the benefits of a paperless temperature monitoring system, but are unable to download data wirelessly? TME is proud to announce its newest development, an alternative to its best-selling MM7000 Bluetooth Barcode Thermometer: the MM7005 USB Barcode Thermometer.

Why USB?

Whilst Bluetooth downloading is a perfect solution for many users – Bluetooth is quick, easy to use, and most modern devices are Bluetooth-enabled – some users prefer a wired connection. This is where USB comes in. With USB, there’s no need to worry about connectivity or discoverability. Simply plug a USB cable into the MM7005 and download your data to a computer, PDA, tablet or smartphone.

Who uses USB?
  • Public environments, such as exhibition halls, where heavy background wireless traffic can slow Bluetooth connections
  • Organisations that hold sensitive information, and may have security worries about sending information wirelessly
  • Medical environments, such as hospitals and nursing homes, where wireless signals can interfere with medical equipment
  • Any organisation that prefers to download data using a wired connection

How does it compare?

The new MM7005 is very similar to the MM7000, in that it harnesses barcode technology to deliver sophisticated paperless temperature data recording across a range of industries and applications – from food manufacturing and food service to industrial processing, water temperature monitoring and logistics. Both models boast an integrated barcode scanner which logs not only the temperature, time and date of each scan, but also, for complete due diligence, the unique identity and location of the test point.

The MM7000’s recently-introduced alarm function is also present in the MM7005. This function allows users to set their own critical alarm values for every temperature test point; when a limit is breached, the instrument instantly displays a Low or High visual alarm so that remedial action can be taken quickly.
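As a rough illustration of how such threshold alarms work in general – this is a hypothetical sketch, not TME firmware – the per-test-point logic can be expressed as:

```python
# Illustrative per-test-point alarm check; function name and limits
# are hypothetical, not taken from the MM7005 itself.

def check_alarm(reading_c, low_limit_c, high_limit_c):
    """Return 'LOW', 'HIGH' or 'OK' for a temperature reading in °C
    against user-set critical limits."""
    if reading_c < low_limit_c:
        return "LOW"
    if reading_c > high_limit_c:
        return "HIGH"
    return "OK"

# Example: a chilled-storage test point with critical limits of 1–5 °C.
print(check_alarm(7.2, 1.0, 5.0))  # HIGH
print(check_alarm(3.4, 1.0, 5.0))  # OK
```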


For more details on this and other innovative temperature measurement solutions, visit www.tmethermometers.com for online purchasing or contact the Sales Team on 01903 700651 or sales@tmethermometers.com

Monday, 10 November 2014

History of Thermometry, Part 1: The Ancients




The ancient Philo Thermometer (Photo sourced from http://collectionsonline.nmsi.ac.uk)

The first known writers on temperature and its measurement were Philo of Byzantium and Heron of Alexandria. Both of these men wrote in Ancient Greek, and the word ‘thermometer’ comes from the Ancient Greek words ‘therme’, meaning ‘heat’, and ‘metron’, meaning ‘measure’; the word ‘thermometer’ therefore literally means ‘heat measurer’.

Philo (ca. 200 BC) was a Greek engineer who conducted an early experiment on the expansion of air with heat. He created a device which has been called the first thermometer, now known as the Philo thermometer: a tube connected to a hollow sphere, with the open end of the tube extending into a jug of water. Philo noticed that when the sphere was in the sun, bubbles were released in the jug as air expanded out of the sphere, whereas when the device was placed in the shade, the air contracted with the cooler temperature and water rose back up the tube.

Philo was a big influence on Heron of Alexandria (10 – 70 A.D.), an Ancient Greek mathematician and engineer, who wrote about temperature and drew up plans for a basic thermometer for use in medicine.

However, neither of these writers developed their designs into working instruments. The invention and creation of the first working thermometer has been credited variously to Abu Ali Ibn Sina (known as Avicenna in the Western world), Cornelius Drebbel, Robert Fludd, Galileo Galilei, and Santorio Santorio.

Abu Ali Ibn Sina (980 – 1037 A.D.) was a Persian polymath, physician and Islamic philosopher, who created a simple thermometer to test the temperature of air.

Cornelius Drebbel (1572 – 1633 A.D.) was a Dutch engineer and inventor of the submarine. Interestingly, Drebbel discovered carmine dye when one of his thermometers, which used coloured liquid, broke on a windowsill and he noticed that the dye grew more intense in colour when exposed to the sun.

Galileo Galilei (1564 – 1642 A.D.), the famous Tuscan physicist, mathematician and astronomer, came up with a device for registering temperature change at the height of the Scientific Revolution. He also noticed the principle behind the device known today as ‘Galileo’s thermometer’ – that glass spheres of slightly different densities, suspended in aqueous alcohol, rise and fall as the temperature of the liquid changes.

However, none of these early designs were true thermometers. They were in fact thermoscopes rather than thermometers, as the absence of a scale meant that they only registered changes in temperature rather than measuring it. A true thermometer must include a temperature sensor - where physical change occurs with changes in temperature - and a means of converting that physical change into a readable value.
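That definition – a sensor plus a means of converting its physical change into a readable value – can be illustrated with a minimal two-point calibration sketch in Python. The numbers here are hypothetical, not taken from any historical instrument:

```python
# A thermoscope only registers change; a thermometer adds a scale.
# Two-point linear calibration: observe the raw sensor reading (e.g.
# the height of a liquid column in mm) at two known temperatures,
# then map any raw reading onto the scale between them.

def make_scale(raw_at_freezing, raw_at_boiling):
    """Return a function mapping a raw reading to °C, given the raw
    readings observed at water's freezing and boiling points."""
    span = raw_at_boiling - raw_at_freezing
    return lambda raw: (raw - raw_at_freezing) / span * 100.0

# Hypothetical instrument: column reads 20 mm in ice water, 220 mm in boiling water.
to_celsius = make_scale(raw_at_freezing=20.0, raw_at_boiling=220.0)
print(to_celsius(120.0))  # halfway up the column: 50.0
```

The sensor alone (the liquid column) is a thermoscope; it is the calibrated mapping that turns it into a thermometer.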

The next blog post in this series will expand on the invention of the first true thermometer, and how universal temperature scales came into being.