Our turbidity readings are always below 1 NTU. Why does the procedure use a 20 NTU StablCal standard as the lowest standard for calibration?
Turbidity calibration starts at 20 NTU, but measurements are much lower than 20 NTU
Although it may seem more accurate to calibrate with low-level standards, there are several practical reasons why this does not work as intended and instead leads to an inaccurate calibration.

Contamination is a significant concern at low turbidity levels. A low-level calibration standard can easily pick up turbidity from particles in the air, from sample cells, or from the dilution water itself. If the dilution water used to prepare a 0.1 NTU standard had a turbidity of 0.04 NTU, for instance, the actual value of the standard would be roughly 0.14 NTU, about 40 percent higher than intended (the first sketch below quantifies this effect). Stray light and sample cell variation are also significant at low NTU values but negligible at 20 NTU. If a calibration is performed with low-level standards, contamination, stray light, and sample cell variation can easily skew the calibration curve and make all subsequent readings inaccurate.

A 20 NTU standard, on the other hand, can be prepared accurately and reproducibly, and any turbidity contribution from contamination, stray light, or sample cell variation is negligible relative to the 20 NTU value. The zero point of the calibration curve is taken as the instrument's dark current before the lamp is turned on, and the calibration curve is linear from 0 to 40 NTU, so calibration accuracy is maintained at lower concentrations (see the second sketch below).

After calibration with a 20 NTU standard, Hach recommends verifying the accuracy of low-level measurements using StablCal low-level standards, available down to 0.100 NTU. When meticulous care is taken to minimize contamination, your instrument should read within the tolerance specified on the standard.
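To make the dilution arithmetic concrete, here is a minimal Python sketch. It is illustrative only, not part of any Hach procedure: the function name is invented, and it assumes the prepared standard's turbidity is simply the nominal value plus the dilution water's contribution (a reasonable approximation when the stock volume is small next to the dilution water volume).

```python
# Illustrative sketch: error introduced when the dilution water used to
# prepare a low-level standard is itself slightly turbid.

def prepared_turbidity(nominal_ntu: float, dilution_water_ntu: float) -> float:
    """Approximate turbidity of a prepared standard, assuming the dilution
    water contributes its full turbidity on top of the nominal value."""
    return nominal_ntu + dilution_water_ntu

# Dilution water at 0.04 NTU: a large relative error at 0.1 NTU,
# a negligible one at 20 NTU.
for nominal in (0.1, 1.0, 20.0):
    actual = prepared_turbidity(nominal, 0.04)
    error_pct = 100 * (actual - nominal) / nominal
    print(f"nominal {nominal:5.2f} NTU -> actual {actual:5.2f} NTU "
          f"({error_pct:.1f}% high)")
```

Running this prints a 40.0% error for the 0.1 NTU standard but only a 0.2% error at 20 NTU, which is why the higher standard is so much more robust to contamination.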
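The point about the linear 0 to 40 NTU response can also be sketched in code. The following is a hypothetical two-point calibration, assuming the zero point is the detector's dark-current signal (lamp off) and the span point is the signal produced by the 20 NTU standard; the count values and function names are invented for illustration and do not represent a real instrument interface.

```python
# Hypothetical two-point linear calibration: dark current defines zero,
# the 20 NTU standard defines the span, and low readings fall on the
# same line.

def calibrate(dark_counts: float, counts_at_20_ntu: float):
    """Return a function mapping raw detector counts to NTU, assuming a
    linear response from 0 to 40 NTU."""
    slope = 20.0 / (counts_at_20_ntu - dark_counts)
    return lambda counts: slope * (counts - dark_counts)

to_ntu = calibrate(dark_counts=50.0, counts_at_20_ntu=10050.0)

print(to_ntu(300.0))  # 0.5 -- a low-level sample sits on the same line
print(to_ntu(50.0))   # 0.0 -- the dark current maps to exactly zero
```

Because the low-level reading is interpolated on the same line anchored by the well-characterized 20 NTU point, a clean calibration at 20 NTU carries its accuracy down to sub-1 NTU measurements.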