|Using and Calibrating a Hydrometer|
A hydrometer is used to track fermentation progress by measuring how far the yeast has converted sugar into ethanol, a process known as attenuation.
As far as the brewer is concerned, a hydrometer determines alcohol content indirectly: it measures specific gravity, the difference in density between pure water and water with sugar dissolved in it.
Density is the mass of a substance divided by the volume it occupies. Specific gravity is usually expressed to three decimal places, or simply as "gravity points" (the last three decimals). For example, an ale with an original specific gravity of 1.059 can be described as having 59 "gravity points". Beers typically finish with a final gravity between 1.015 and 1.005. Before you use your hydrometer, you should calibrate it in pure (I use distilled) water.
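The "points" convention above, together with the drop from original to final gravity, is what lets you estimate alcohol content. As a sketch, here is the points conversion plus a commonly used rule-of-thumb ABV formula, ABV ≈ (OG − FG) × 131.25 — the 131.25 factor is a standard approximation from the wider brewing literature, not from this article:

```python
# Sketch: "gravity points" and a common ABV estimate.
# The 131.25 factor is a widely used approximation, not exact chemistry.

def gravity_points(sg):
    """Convert a specific gravity such as 1.059 to 59 'points'."""
    return round((sg - 1.000) * 1000)

def abv_estimate(og, fg):
    """Approximate alcohol by volume from original and final gravity."""
    return (og - fg) * 131.25

print(gravity_points(1.059))                  # 59 points
print(round(abv_estimate(1.059, 1.012), 1))   # about 6.2% ABV
```

So an ale starting at 1.059 and finishing at 1.012 would land near 6.2% alcohol by volume under this approximation.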
Your hydrometer should read 1.000 in pure water at its specified calibration temperature. If it reads higher (1.001 or more) or lower (0.999 or less), simply add or subtract the difference from your readings in wort or beer. For example, if your hydrometer reads 0.998 in pure water at 60°F (its calibration temperature), it is reading two "points" low, and two "points" need to be added to any reading taken in wort or beer. In other words, if your wort reads 1.050, your corrected reading would be 1.052. Conversely, if it reads 1.002 in water, it is reading two "points" high and you would need to subtract two points from your reading, e.g. a wort or beer reading of 1.052 should be adjusted to 1.050.