AN ASSESSMENT OF FACTORS AFFECTING THE RELIABILITY OF THE TOTAL ALKALINITY MEASUREMENT  

Benjaporn Somridhivej* and Claude E. Boyd
 School of Fisheries, Aquaculture and Aquatic Sciences
 Auburn University, Auburn, Alabama 36849 USA
 somribe@tigermail.auburn.edu

The optimum endpoint pH for total alkalinity titrations decreased from 5.0 at 10 mg/L alkalinity to 4.2 at 300 mg/L alkalinity or more. The appropriate color changes for bromocresol green-methyl red (BG-MR) and methyl orange (MO) indicators also varied with the initial total alkalinity of samples. Despite differences in endpoint pHs for samples of different alkalinities, when the best endpoint pH (Table 1), the best color of BG-MR and MO, or the endpoint of methyl purple was used in titrations of standard solutions, there were few differences between measured and standard alkalinities; accuracy was better than ±3 mg/L. Results of spike-and-recovery tests on aquaculture pond water samples also revealed that an accuracy of ±3 mg/L alkalinity could be achieved on either unfiltered or filtered samples by all four acceptable methods of endpoint detection. Although precision could not be consistently maintained below ±1 mg/L, coefficients of variation for repeated measurements usually were less than 5% for all methods of endpoint detection. Nevertheless, this degree of precision was adequate to achieve good accuracy, which is the major concern in water analysis.
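
As an illustration of the arithmetic behind these figures, the minimal sketch below combines the standard titration calculation (total alkalinity as mg/L CaCO3 = acid volume × acid normality × 50,000 / sample volume, per Standard Methods 2320 B) with the coefficient-of-variation and spike-recovery statistics referred to above. The acid normality, sample volume, and titrant readings are hypothetical values chosen for demonstration, not data from this study.

```python
# Illustrative sketch only: standard titration arithmetic plus the precision and
# recovery statistics discussed above. All numeric inputs below are hypothetical.
from statistics import mean, stdev

def total_alkalinity_mg_per_L(acid_mL, acid_normality, sample_mL):
    """Total alkalinity (mg/L as CaCO3) = (A * N * 50,000) / sample volume (Standard Methods 2320 B)."""
    return acid_mL * acid_normality * 50_000 / sample_mL

def coefficient_of_variation(results):
    """CV (%) of repeated measurements = 100 * standard deviation / mean."""
    return 100 * stdev(results) / mean(results)

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Recovery (%) of a known alkalinity spike added to a water sample."""
    return 100 * (spiked_result - unspiked_result) / spike_added

# Hypothetical example: 100-mL samples titrated with 0.020 N acid to the endpoint.
replicate_acid_mL = [4.05, 4.10, 4.02]                     # repeated titrant volumes
alk = [total_alkalinity_mg_per_L(v, 0.020, 100) for v in replicate_acid_mL]
print([round(a, 1) for a in alk])                          # about 40.2-41.0 mg/L as CaCO3
print(round(coefficient_of_variation(alk), 1), "% CV")     # roughly 1%, well under 5%
print(round(percent_recovery(60.5, 40.5, 20.0), 1), "% recovery")  # ~100% for a 20 mg/L spike
```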

Variations in alkalinity measurement that could result from improper selection of endpoint pH (or color) were rather small, usually not more than ±5 mg/L. In an interlaboratory comparison of alkalinity determinations on standard solutions, most laboratories reported inaccurate alkalinities, and these inaccuracies were greater than could be explained by endpoint variation alone. It was clear that most of the participating laboratories did not have a satisfactory method of quality control.