The following paper was prepared by James Vickers, P.E., Vice-President of SPI. Since 1988, he has been involved in the advancement of membrane technology for drinking water and recycled water treatment. Vickers participated in the development of publications for the US Environmental Protection Agency, AWWA, and the Water Environment & Reuse Foundation. He has a bachelor’s degree in chemical engineering from Youngstown State University (Youngstown, Ohio) and a master’s degree in engineering administration from George Washington University (Washington, D.C.). Vickers is a registered professional engineer in California as well as four other states.
It is intuitive to suggest that reverse osmosis (RO), which is used for the removal of dissolved solutes (salts), can also be used to achieve the removal of viruses, Giardia, and Cryptosporidium. The challenge is that the use of conductivity for water quality monitoring of the RO process lacks sufficient sensitivity, and a direct integrity test that translates water quality performance into regulatory compliance for pathogen removal has not been established or accepted as a practice. This article proposes that, for a production-scale (e.g., 1 mgd or larger) RO system, sensitivity can be increased by using the results of a conductivity profile, an existing diagnostic tool used to identify integrity defects within an RO unit. The proposed direct integrity test methodology uses the results of a conductivity profile to (1) determine that RO unit integrity exists within statistical limits, (2) isolate and differentiate between the conductivity associated with diffusion and that associated with a defect, and (3) calculate the log removal value (LRV) that would be associated with an RO membrane defect. The resulting calculation approach significantly increases the sensitivity of the LRV calculation and is supported by full-scale testing data using MS2 coliphage as the challenge organism.
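To illustrate the idea behind step (3), here is a minimal sketch of the LRV arithmetic. It assumes the standard definition LRV = log10(feed concentration / filtrate concentration), with conductivity used as the surrogate for concentration; the specific values and the way the conductivity profile partitions permeate conductivity into a diffusion component and a defect component are hypothetical, not taken from the full paper.

```python
import math

def lrv(feed: float, permeate: float) -> float:
    """Log removal value: log10 of the feed-to-permeate concentration
    (here, conductivity surrogate) ratio."""
    return math.log10(feed / permeate)

# Hypothetical values: feed conductivity 1000 uS/cm,
# total permeate conductivity 10 uS/cm.
total_lrv = lrv(1000.0, 10.0)  # 2.0-log removal overall

# If the conductivity profile attributes, say, 8 uS/cm of the permeate
# conductivity to diffusion (salt passage through intact membrane),
# only the remaining 2 uS/cm is credited to a defect. Basing the LRV
# on the defect component alone raises the calculated value:
defect_lrv = lrv(1000.0, 10.0 - 8.0)  # ~2.7-log removal
```

This shows why isolating the diffusion component increases sensitivity: the diffusion conductivity is present even in a fully intact membrane, so removing it from the calculation credits the membrane with the removal it actually achieves against a defect.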
The complete paper can be read here.