What is the meaning of the sensor icon in the instrument’s display?
The icon indicates the condition of the electrode and is updated after each successful calibration. No icon is displayed before the first calibration has been carried out. The icon also disappears when only a 1-point calibration is performed; 2 or more calibration points are required.
How is the temperature compensated when measuring pH?
pH measurements depend on the sample’s temperature. Two points are important:
A. Influence of the temperature on the slope of the electrode
The pH electrode measures the potential between the measuring and the reference half-cell. The instrument calculates the pH value from this potential using the temperature-dependent factor -2.3 * R * T / F, where R is the universal gas constant, T the temperature in Kelvin and F the Faraday constant. At 298 K (25 °C), the factor is -59.16 mV/pH. This is the slope known from the calibration of the electrode. At other temperatures the following values result: -56.18 mV/pH at 10 °C, -58.17 mV/pH at 20 °C, -60.15 mV/pH at 30 °C and so on. This influence on the pH measurement is corrected by automatic (ATC) or manual temperature compensation (MTC). Hence it is important to know the temperature of the sample or to use a temperature probe. A wrongly set temperature causes an error of about 0.12 pH units per 5 °C of deviation.
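The slope values quoted above follow directly from the Nernst factor. A minimal sketch of the calculation (the function name is illustrative; the constants are standard CODATA approximations, not taken from any instrument's firmware):

```python
import math

R = 8.314      # universal gas constant, J/(mol*K)
F = 96485.0    # Faraday constant, C/mol

def slope_mv_per_ph(temp_c):
    """Theoretical electrode slope in mV/pH at a given temperature (deg C)."""
    t_kelvin = temp_c + 273.15
    # -ln(10) * R * T / F gives volts per pH unit; convert to millivolts
    return -math.log(10) * R * t_kelvin / F * 1000.0

for t in (10, 20, 25, 30):
    print(f"{t:>2} deg C: {slope_mv_per_ph(t):.2f} mV/pH")
```

Running this reproduces the values in the text to within rounding, e.g. about -59.16 mV/pH at 25 °C.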
B. Influence of the temperature on the pH value of the sample
The pH value of the sample changes with the temperature. This is a chemical effect and is therefore specific to each kind of sample. This influence CANNOT be compensated; only the real pH value at the actual temperature is displayed. Hence it is important to compare only pH values measured at the same temperature.
Exception: The temperature dependence of pH in many commercial buffer solutions is stored in the instrument. The electrode can be calibrated at different temperatures because the measured potentials are automatically referred to 25 °C or 20 °C. To benefit from this feature it is important to select the correct buffer group and to measure the temperature during calibration.
How is the temperature compensated when measuring conductivity?
The conductivity measurement is strongly temperature-dependent (about 2% variation per °C). Results can only be compared if the temperature of all samples is identical or if all values are referred to a common reference temperature.
Most often, linear temperature compensation is used. The operator selects 20 °C or 25 °C as the reference temperature. The difference between the measured and the reference temperature is multiplied by a compensation factor called α (alpha, unit %/°C), and the conductivity reading is corrected by the resulting percentage.
For correct temperature compensation when measuring conductivity, the linear compensation coefficient alpha must be determined for each kind of sample. As an approximation, the temperature dependence is taken as linear; in reality, this "linear" coefficient itself depends on the ion concentration and the temperature of the sample. The factory setting of alpha is 2.00 %/°C, which is an acceptable approximation for many standard samples.
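The linear compensation described above can be sketched as follows (the function name and example numbers are illustrative, not taken from any particular instrument):

```python
def compensate_conductivity(kappa_measured, temp_c, t_ref=25.0, alpha=2.00):
    """Refer a conductivity reading to the reference temperature.

    kappa_measured : conductivity at the sample temperature (e.g. in uS/cm)
    temp_c         : measured sample temperature in deg C
    t_ref          : reference temperature (20 or 25 deg C)
    alpha          : linear compensation coefficient in %/deg C
    """
    # The reading rises by alpha percent for each deg C above t_ref,
    # so dividing by (1 + alpha/100 * dT) refers it back to t_ref.
    return kappa_measured / (1.0 + alpha / 100.0 * (temp_c - t_ref))

# Example: 1278 uS/cm measured at 35 deg C, referred to 25 deg C with
# the factory setting alpha = 2.00 %/deg C:
print(compensate_conductivity(1278.0, 35.0))  # -> 1065.0 uS/cm
```

With alpha = 2 %/°C and a 10 °C excess over the reference temperature, the raw reading is divided by 1.20, which illustrates why an unsuitable alpha for a given sample leads directly to a proportional error in the compensated value.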