All measurements you make have some uncertainty. If you measure a distance of 14.5 inches with a ruler, for example, you don't know for certain that the distance is exactly 14.5 inches, because your eyes and the ruler can't tell the difference between 14.5 and 14.499995. A more sensitive instrument gives you a smaller uncertainty, but some uncertainty always remains. The same holds true for temperature.

Touch your thermometer to the object whose temperature you want to measure.

Watch the reading if your thermometer is digital. If the reading fluctuates, the uncertainty is equal to the range of the fluctuation. For example, imagine that the temperature reading on a digital thermometer wanders between 20.12 and 20.18 degrees. Your uncertainty would be 0.06 degrees.
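The arithmetic here is just the spread between the highest and lowest values you observe. A minimal sketch, using hypothetical readings and the article's convention that the uncertainty equals the full fluctuation range:

```python
# Hypothetical samples from a fluctuating digital thermometer.
readings = [20.12, 20.15, 20.18, 20.14, 20.16]

# Uncertainty = range of the fluctuation (highest minus lowest reading).
uncertainty = round(max(readings) - min(readings), 2)
print(uncertainty)  # 0.06
```

The `round` call only tidies up floating-point residue; the substance is the `max` minus `min` spread.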

Go to the last digit of the reading if the thermometer holds steady. In this situation, the last digit is considered uncertain. If your thermometer reads 36.12 degrees, for example, the uncertainty is 0.01 degrees, because the last digit (the 2 in 36.12) sets the limit of your precision.
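Put another way, the uncertainty is one unit in the last displayed decimal place. A short sketch of that rule, using the reading from the example above:

```python
# Keep the reading as text so trailing zeros (e.g. "36.10") are preserved.
reading = "36.12"

# Count digits after the decimal point; the uncertainty is one unit
# in that last place: 2 decimals -> 10**-2 = 0.01 degrees.
decimals = len(reading.split(".")[1]) if "." in reading else 0
uncertainty = 10 ** -decimals
print(uncertainty)  # 0.01
```

Note the reading is handled as a string: converting "36.10" to a number would drop the trailing zero and overstate the precision.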

Watch the mercury or alcohol in the column if you are using a traditional thermometer. Read the temperature to the nearest 0.1 degree if possible; if not, read it to the nearest 0.5 degrees. Either way, your uncertainty is equal to the limit of your precision. If you can only estimate the temperature to the nearest 0.1 degrees, for example, your uncertainty is 0.1 degrees; if you can only estimate it to the nearest 0.5 degrees, your uncertainty is 0.5 degrees, and so forth.
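The step above can be sketched as rounding your estimate to the nearest multiple of whatever precision the column allows, then reporting that same precision as the uncertainty. The `report` helper and the example estimate of 21.37 degrees are hypothetical:

```python
def report(estimate: float, precision: float) -> str:
    """Round an analog reading to the nearest multiple of `precision`
    and report that precision as the uncertainty."""
    value = round(estimate / precision) * precision
    return f"{value:.1f} +/- {precision} degrees"

# The same estimated column position, read at two different precisions.
print(report(21.37, 0.1))  # 21.4 +/- 0.1 degrees
print(report(21.37, 0.5))  # 21.5 +/- 0.5 degrees
```

The coarser you must read the scale, the larger the reported uncertainty, which is exactly the trade-off the paragraph describes.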