In an era of heightened scrutiny over water use, especially for industrial and commercial uses such as data centers, water reuse is more relevant than ever. To help get a perspective on this, we asked ASSE, ARCSA and IAPMO technical staff their thoughts on ways that facilities can reduce their environmental impact, keep our public water clean and healthy, and use less water (or even become “water positive”).

SCOTT HAMILTON,
SENIOR DIRECTOR OF COMPETENCY DEVELOPMENT SERVICES, ASSE INTERNATIONAL
Facilities today face three pressures: 1) Protect public water resources, 2) Reduce potable water dependence, and 3) Demonstrate measurable environmental improvement. Industry is responding with water reuse and risk mitigation strategies to reduce the consumption of water and to move toward “water-positive” performance.
The following strategies are practical plans to achieve these goals.
Developing Risk Mitigation Strategies
Water reuse must always begin with risk control. Without structured and documented planning, reuse systems can create regulatory, biological, or operational exposure. We must always strive to remember that water is our most important natural resource, and every measure must be taken to protect it.
We begin by conducting a water risk assessment. The assessment should be conducted by the facility representative along with other industry professionals, including those with ASSE 12060 series, Water Quality Certifications for Plumbers, Pipefitters/HVAC Technicians, and Sprinkler Fitters, or ASSE 12080, Legionella Water Safety and Management Specialist, certifications. Potable intake and distribution systems, process demand, cooling load, blowdown discharge, stormwater runoff, and sanitary discharge should all be considered.
The next part of the risk mitigation strategy is the implementation of multi-barrier treatment. Redundancy and real-time monitoring are critical for facilities. All reuse systems should include layered safeguards, including pre-filtration for sediment removal, membrane filtration (reverse osmosis), disinfection (UV or chlorination), continuous monitoring for turbidity, conductivity, and microbial counts, and automated alarms and shutoffs.
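The alarm-and-shutoff layer described above can be sketched in a few lines of logic. This is a minimal illustration only; the parameter names and threshold values below are hypothetical placeholders, not limits drawn from any standard.

```python
# Illustrative alarm logic for a reuse-water monitoring loop.
# THRESHOLDS values are assumed placeholders, not regulatory limits.
THRESHOLDS = {"turbidity_ntu": 1.0, "conductivity_us_cm": 1500.0}

def evaluate_reading(reading: dict) -> list:
    """Return the list of parameters exceeding their alarm threshold."""
    return [name for name, limit in THRESHOLDS.items()
            if reading.get(name, 0.0) > limit]

def should_shut_off(reading: dict) -> bool:
    """Trip the automated shutoff whenever any monitored parameter alarms."""
    return bool(evaluate_reading(reading))
```

In a real system each alarm would also be logged and reviewed under the water management plan, so that shutoff events feed back into maintenance schedules.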
The final part is to establish a formal water management plan. A team of experts should be appointed to assist with the plan. Once again, those certified to the ASSE 12060 series and ASSE 12080 could play integral roles on the team. The water management plan reduces public health risk and strengthens defensibility in audits or inspections. The plan should include cooling tower management for Legionella control, cross-connection prevention, maintenance and inspection schedules, emergency bypass procedures, and water quality testing protocols.
Eliminating Potable Water for Non-Essential Use
One of the most important strategies is very simple: stop using potable water for uses that don’t require it. Cooling towers are often the largest water consumer in commercial buildings and data centers. Facilities can increase cycles of concentration to reduce blowdown, install side-stream filtration to extend water reuse, convert to reclaimed municipal water for cooling makeup, and/or capture and reuse HVAC condensate.
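To see why increasing cycles of concentration matters, consider the standard cooling tower mass balance (drift losses neglected). The flow numbers below are illustrative, not from any particular facility.

```python
def cooling_tower_makeup(evaporation_gpm: float, cycles: float) -> dict:
    """Estimate blowdown and total makeup water for a cooling tower.

    Standard mass-balance relations (drift neglected):
        blowdown = evaporation / (cycles - 1)
        makeup   = evaporation + blowdown
    """
    if cycles <= 1:
        raise ValueError("cycles of concentration must exceed 1")
    blowdown = evaporation_gpm / (cycles - 1)
    return {"blowdown_gpm": blowdown, "makeup_gpm": evaporation_gpm + blowdown}

# Example: raising cycles from 3 to 6 on a tower evaporating 100 gpm
low = cooling_tower_makeup(100, 3)   # makeup = 150 gpm
high = cooling_tower_makeup(100, 6)  # makeup = 120 gpm
savings = low["makeup_gpm"] - high["makeup_gpm"]  # 30 gpm of makeup saved
```

The diminishing returns are also visible here: most of the savings come from the first few additional cycles, which is why water chemistry limits (scaling, corrosion) set the practical ceiling.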
Another large water use is landscape irrigation, where targeted measures can reduce potable water demand by 50%-100%. Again, planning and using trained professionals, such as those who hold ASSE 21000, Rainwater Catchment Systems Personnel, certification, is critical for effectiveness and safety. Industrial facilities can replace potable water with treated process effluent, captured stormwater, or recycled rinse water.
Finally, certain plumbing fixtures can be supplied by alternative, treated water. The same safety factors we discussed above apply.
In conclusion, water reuse must strengthen, not compromise, public water health. It is critical to implement strict cross-connection control; all non-potable water systems and outlets must be clearly labeled; a water management plan, along with routine monitoring, must be established. When properly designed and installed, reuse systems reduce demand on municipal supplies, lower discharge volumes, and improve overall resilience while still maintaining and protecting our most important natural resource — water.

JOHN HIGDON, P.E.,
DIRECTOR OF STANDARDS DEVELOPMENT, ASSE INTERNATIONAL
One highly effective approach is the use of closed-loop cooling and testing systems whenever possible. These systems significantly reduce water demand by recirculating water instead of using it once and discharging it. In addition to conserving water, closed-loop designs help limit the release of heated or treated water back into public systems, supporting overall water quality.
Rainwater harvesting is another practical solution. Captured rainwater can be used for irrigation and other non-potable purposes, reducing reliance on municipal water supplies while also helping manage stormwater runoff. Many current installations clearly demonstrate that the return on investment for rainwater harvesting systems is highly favorable.
Installing water-efficient plumbing fixtures such as low-flow toilets, faucets, and urinals can deliver immediate water savings. These upgrades should be supported by strong maintenance practices. Frequent inspections and quick repairs help stop water waste and cut costs (the U.S. Environmental Protection Agency (EPA) notes that an average leaking toilet will waste over 200 gallons of water per day).
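The EPA's figure translates into real money quickly. A back-of-envelope calculation (the utility rate below is an assumed example, not a quoted tariff):

```python
# Back-of-envelope annual cost of one leaking toilet at ~200 gal/day.
GALLONS_PER_DAY = 200                  # EPA figure cited in the text
annual_gallons = GALLONS_PER_DAY * 365          # 73,000 gallons per year
cost_per_1000_gal = 5.00                        # assumed utility rate, $/kgal
annual_cost = annual_gallons / 1000 * cost_per_1000_gal  # $365.00 per fixture
```

Multiplied across dozens of fixtures in a large facility, prompt leak repair pays for itself many times over.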
Additional safeguards can further protect public water systems. Installing backflow detector assemblies on fire lines helps identify unauthorized or unintended water use. Coupled with advanced metering, real-time monitoring and data analytics can further enable facilities to track usage patterns, identify inefficiencies, and continuously improve performance.
Together, these strategies demonstrate how thoughtful design, technology, and operations can reduce water consumption, protect public health, and support long-term sustainability goals.

HEATHER KINKADE,
EXECUTIVE DIRECTOR, ARCSA INTERNATIONAL
Water reuse has shifted from a sustainability “nice-to-have” to an operational and reputational necessity. Facilities are increasingly expected not only to reduce their overall water footprint, but also to protect public water supplies and demonstrate leadership by moving toward water neutrality — or even becoming water positive. Achieving these goals requires rethinking where water comes from, how it is used, and how often it can be reused before leaving the site.
One of the most impactful strategies is the incorporation of alternative water sources into facility operations, especially for cooling systems, which are often the largest drivers of industrial water demand. Rainwater harvesting and stormwater capture are particularly well-suited for this purpose. Rooftop rainwater collection systems can supply cooling towers, adiabatic cooling units, or evaporative coolers, reducing reliance on potable water while also mitigating localized flooding and runoff pollution. Similarly, stormwater captured from parking lots or paved surfaces — once appropriately treated — can be stored and reused for cooling processes. In regions with seasonal rainfall, these systems can be paired with onsite storage or underground cisterns to smooth supply across dry periods.
Beyond rain and stormwater, facilities can also integrate other non-traditional sources such as air-handling condensate or process water generated onsite. Data centers, for example, produce significant volumes of condensate from HVAC systems, which are often discharged to drains despite being relatively clean. Capturing and reusing this water for cooling makeup or equipment washing can meaningfully reduce potable water demand while improving overall system efficiency.
Equally important is eliminating the use of potable water for non-essential tasks altogether. Cooling operations, landscape irrigation, toilet flushing, and equipment cleaning rarely require drinking-quality water, yet many facilities continue to rely on municipal supplies by default. Transitioning to non-potable distribution systems supplied by reclaimed water, harvested rainwater, or treated gray water can serve these functions safely and reliably. For landscape irrigation, drought-tolerant plantings combined with drip irrigation and soil moisture monitoring can dramatically reduce water use while maintaining healthy outdoor spaces.
Some facilities are taking these strategies a step further by adopting cascading water use models, where water is reused multiple times at progressively lower quality thresholds. For example, captured rainwater may first be used for cooling, then reused for irrigation, and finally directed to onsite treatment systems before infiltration or reuse. When paired with real-time monitoring and smart controls, these systems not only reduce total water withdrawals but also protect public waterways by minimizing contaminated discharges.
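A cascading model like this can be sketched as a simple stage-by-stage water balance. The stages mirror the example above, but the consumption fractions are illustrative assumptions, not measured values:

```python
# Hypothetical cascade: each stage consumes part of the supply (evaporation,
# plant uptake, etc.) and passes the remainder downstream at lower quality.
STAGES = [
    ("cooling makeup", 0.40),            # fraction consumed (assumed)
    ("landscape irrigation", 0.70),      # fraction consumed (assumed)
    ("onsite treatment / infiltration", 1.00),  # final stage takes the rest
]

def cascade(volume_gal: float) -> list:
    """Trace a volume of captured rainwater through successive reuse stages.

    Returns one (stage, volume entering, volume consumed) tuple per stage.
    """
    trace = []
    for stage, consumed_fraction in STAGES:
        used = volume_gal * consumed_fraction
        trace.append((stage, volume_gal, used))
        volume_gal -= used
    return trace
```

For 1,000 gallons captured, this sketch sends 400 gallons to cooling, reuses 600 for irrigation, and directs the remainder to treatment, so every gallon is used at least once before leaving the site.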
This systems-based approach is part of what I have formalized as Cascade Water Cycling — a framework for intentionally extending the value of water by matching water quality to function, integrating alternative water sources, and reducing both potable water demand and environmental discharge. By viewing water as a flowing asset rather than a single-use input, facilities can move beyond efficiency alone toward truly resilient and water-positive operations.
Ultimately, becoming water positive is less about a single technology and more about a systems-level mindset. By diversifying water sources, matching water quality to actual use, and designing facilities for reuse rather than disposal, industrial and commercial operators can significantly reduce environmental impact, safeguard public water supplies, and build long-term resilience in a water-constrained world.

CHRISTOPH LOHR, P.E., CPD, LEED AP BD+C, ASSE 12080,
VICE PRESIDENT OF TECHNICAL SERVICES AND RESEARCH, IAPMO
I’m a bit cautious about making broad claims regarding water use in data centers because what I’m hearing lately is that many new projects are actively moving toward air-cooled and closed-loop approaches to substantially reduce (and in some cases nearly eliminate) onsite evaporative water use. That means the “right answer” can vary by climate, heat density, and local constraints, and the key is to treat water and energy as a linked tradeoff, not a single-metric debate. That said, keep in mind that switching to air-cooled technologies means more energy is needed … I just found out that APS in Arizona is going to keep a coal power plant that was set to go offline in 2031 open an additional seven years to help meet the additional energy needs of Phoenix data centers.
Where I feel most confident about the opportunity for water reuse is health care, where water safety and water sustainability collide. Many hospitals implement (or need to implement) routine flushing protocols of their potable domestic water to reduce water age and maintain disinfectant residuals, which can help manage risks from waterborne pathogens. The problem is that this safety measure can consume large volumes of potable water that are literally sent down the drain.
That creates a practical, “fit-for-purpose” reuse opportunity: capture and treat the flushed potable water (as gray water) and reuse it for non-potable demands such as toilet/urinal flushing or, where appropriate, mechanical makeup water. Doing so can reduce total potable demand while preserving the safety benefits of flushing. In other words, reuse can become a “middle road” between two competing pressures: conserving water and maintaining safe building water quality.
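A hypothetical daily balance illustrates the idea. All of the volumes and the recovery efficiency below are assumed for illustration, not drawn from any actual hospital:

```python
# Hypothetical daily balance for a hospital: reusing flushed potable water
# (handled as gray water) against non-potable demands. All figures assumed.
flushing_protocol_gal = 1200.0   # potable water discharged by daily flushing
toilet_demand_gal = 900.0        # daily toilet/urinal flushing demand
recovery_efficiency = 0.85       # fraction recoverable after capture/treatment

recovered = flushing_protocol_gal * recovery_efficiency   # ~1,020 gal
offset = min(recovered, toilet_demand_gal)                # 900 gal offset
net_new_potable = toilet_demand_gal - offset              # 0 gal in this case
```

In this sketch the captured flush water fully covers the toilet demand, so the flushing protocol's safety benefit comes at essentially no net potable cost for that end use.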
From a risk-mitigation standpoint, the essentials are clear separation of potable/non-potable systems, strong cross-connection control, and monitoring/maintenance practices that keep the reuse system operating within its intended performance envelope. As codes and policies continue to evolve, focusing on proven, fit-for-purpose applications in buildings like health care can deliver real water savings while keeping public health protection front and center.

KEY TAKEAWAYS AND RECURRING THEMES
- Reduce or eliminate potable water for non-essential use
- Cooling systems are the highest-impact intervention point for water reduction
- Diversify water supply sources beyond potable municipal water (rainwater, stormwater, condensate, etc.)
- Protect public health through structured risk mitigation; water reuse must not compromise safety
- Measure, monitor, and continuously maintain systems through real-time performance tracking, metering, and detection

TOM PALKON,
ASSE INTERNATIONAL EXECUTIVE DIRECTOR, IAPMO EXECUTIVE V.P. & CHIEF TECHNICAL SERVICES OFFICER
Facilities are increasingly challenged to reduce their environmental impact, protect public water systems, and lower overall consumption while maintaining operational reliability. Fortunately, a combination of technology, alternative water sourcing, and proactive planning can significantly reduce potable water dependence and, in some cases, enable facilities to move toward “water positive” outcomes.
One of the most impactful opportunities lies in advanced cooling technologies. Traditional evaporative cooling systems can consume significant volumes of water. In contrast, closed-loop cooling systems recirculate water with minimal losses, dramatically reducing makeup water demand. Emerging approaches such as direct-to-chip liquid cooling and immersion cooling further reduce or eliminate evaporative losses by transferring heat more efficiently at the source. Several hyperscale and high-performance computing environments have already demonstrated that liquid-based cooling can support higher rack densities while using substantially less water than conventional air-cooled designs.
Facilities are also exploring alternative water sources to offset potable water use. Non-potable reclaimed water, treated gray water, and harvested rainwater can be safely integrated into cooling systems, cooling towers, and other non-consumptive processes when properly designed and monitored. For example, some campuses store stormwater onsite and reuse it for cooling during peak demand periods, reducing both freshwater withdrawals and stormwater discharge impacts. Others partner with municipalities to use recycled wastewater that would otherwise be discharged, turning a waste stream into a reliable operational resource. Consider using alternate water source systems tested and certified to the IAPMO/ANSI Z1324, Alternate Water Source Systems for Multi-Family, Residential, and Commercial Use, standard.

Another critical step is eliminating potable water for non-essential uses. Cooling, landscape irrigation, and equipment wash-down are often suitable for non-potable alternatives. Xeriscaping and native plantings can significantly reduce or eliminate irrigation needs altogether. Where irrigation is required, smart controllers and weather-based sensors help ensure water is applied only when necessary, further conserving resources.
Beyond infrastructure, facilities must also adopt risk mitigation and governance strategies. This includes conducting water risk assessments that evaluate local watershed stress, drought exposure, and utility reliability. Redundancy in water supply sources, onsite storage, and clear operational response plans can help ensure continuity during water shortages or regulatory changes. Equally important is continuous monitoring — using meters, sensors, and analytics — to detect leaks, optimize performance, and validate conservation efforts. Consider using products tested and certified for leak detection to the IAPMO IGC 115, Automatic Water Leak Detection Devices, standard.
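One simple analytic that metering data enables is an overnight-flow leak check: sustained flow during hours when a building should be idle is a classic leak signature. The idle hours and threshold below are illustrative assumptions, not a standard method:

```python
# Sketch of a leak heuristic over 24 hourly meter readings: sustained
# nonzero flow during typically idle overnight hours suggests a leak.
def overnight_leak_suspected(hourly_gal: list,
                             idle_hours=range(1, 5),
                             threshold_gal: float = 2.0) -> bool:
    """Flag a suspected leak if every idle-hour reading exceeds the threshold."""
    return all(hourly_gal[h] > threshold_gal for h in idle_hours)
```

A production analytics platform would add seasonality, occupancy schedules, and alert escalation, but even this crude check catches the stuck-valve and running-toilet cases that waste the most water.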
Finally, regulatory awareness and foresight are essential. Water-related legislation is evolving rapidly, with increasing focus on disclosure, efficiency, and reuse. Facilities that actively track regulatory trends and engage early with regulators and utilities are better positioned to adapt, secure permits, and avoid costly retrofits. In many cases, early adoption of water-efficient technologies also provides a competitive advantage as sustainability expectations rise. Even in facilities that aren’t high water demand buildings, operators and owners should source products certified to the U.S. EPA WaterSense standards to reduce water use.
Taken together, these strategies demonstrate that reducing water use and protecting public water systems is not only achievable but compatible with high-performance operations. With thoughtful design and proactive planning, facilities can move beyond compliance toward long-term water stewardship and resilience.

CHRISTOPHER L. WHITE, PH.D.,
SENIOR MANAGER OF PRODUCT CERTIFICATION, ASSE INTERNATIONAL
With the explosion of the artificial intelligence (AI) revolution comes many new challenges that will need to be addressed. One of those will be the need to power and cool the data centers that are essential to facilitate artificial intelligence.
The rapid acceleration of AI workloads is redefining the performance and thermal requirements of modern compute infrastructure. As model sizes scale and inference demands grow, power density is increasing beyond what conventional air-cooled architectures can support. This shift places unprecedented pressure on data center power delivery, heat dissipation, and overall system efficiency.
In the 1990s, the rise of the internet pushed network and computer infrastructure through successive phases of technological evolution — from dial-up over legacy copper, to broadband over coaxial, and eventually wireless architectures. Each step demanded new physical-layer capabilities and hardware advancements. The AI transition is following a similar trajectory, but at a far more aggressive pace. Traditional server rooms once cooled by standard HVAC systems have given way to hyperscale data centers packed with high-power central processing units (CPUs), graphics processing units (GPUs), and accelerators that require far more sophisticated thermal management.
Data centers — purpose-built facilities engineered to house thousands of tightly integrated server nodes — now allocate more than 40% of their total electrical consumption solely to cooling. As computational intensity rises, traditional air-cooling methods face significant limitations, including airflow inefficiency, heat sink constraints, and dependency on large-scale evaporative cooling. These constraints are increasingly incompatible with high-performance computing (HPC) and next-generation AI clusters operating at extreme power densities.
Immersion cooling has emerged as a highly viable alternative, engineered for modern thermal loads. By submerging IT hardware directly in a dielectric fluid capable of transferring heat up to 3,000 times more efficiently than air, immersion systems eliminate reliance on fans, heat sinks, and complex airflow patterns. Heat is extracted at the component level and transferred through a secondary closed-loop heat exchange system — typically refrigerated or attached to a facility water loop. Unlike air-cooled systems, which depend heavily on evaporation and often waste millions of gallons of water annually, immersion cooling can cut water usage to nearly zero while providing drastically improved thermal performance and energy efficiency.
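For a sense of scale on that secondary water loop, a basic sensible-heat calculation is enough. The load and temperature rise below are illustrative assumptions:

```python
# Rough sensible-heat check for the secondary water loop of an immersion
# tank: Q = m_dot * cp * dT. Load and loop temperature rise are assumed.
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def loop_flow_for_load(load_kw: float, delta_t_k: float) -> float:
    """Required water mass flow (kg/s) to carry a given IT heat load."""
    return load_kw * 1000.0 / (CP_WATER * delta_t_k)

# A 100 kW tank with a 10 K loop rise needs roughly 2.4 kg/s of water,
# circulating in a closed loop rather than evaporating to atmosphere.
```

Because that water recirculates rather than evaporates, the consumption figure stays near zero even as the thermal duty grows.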
When paired with cogeneration (CHP) systems that recapture thermal output for reuse, immersion-cooled data centers can reduce overall energy waste and decrease water consumption by more than 90%. While cogeneration has long been used in industrial applications, the extreme heat output of AI-driven compute clusters may finally make large-scale adoption both practical and economically advantageous.
