As attention increasingly centers on renewable energy and carbon emissions, one of the biggest problems with artificial intelligence slips under the radar: the water required to cool the servers in data centers. During cooling, up to 9 liters of water per kilowatt-hour of energy consumed are evaporated, lost to the cooling loop and inaccessible to those who need it for drinking. This cycle of high consumption and low recycling creates a deficit that is usually addressed by simply drawing in more water from other sources, draining the surrounding area of a much-needed resource. Although major companies such as Google, Microsoft, and Meta say they aim to replenish more water than they consume by 2030, there are doubts about whether that goal is achievable given how quickly local supplies are being drawn down in the meantime.[^3]

Compounding the incessant need for more water, data centers are often built where real estate and energy are cheap, such as parts of South America and Sub-Saharan Africa, with little regard for the pre-existing water stress in those areas. This worsens the problem, as water is diverted away from communities in desperate need and redirected into cooling massive data centers. Despite advances in data center technology and efficiency, the global water consumption attributable to AI is still projected to reach between 4.2 and 6.6 billion cubic meters by 2027.[^4]
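To put the 9 L/kWh upper bound in perspective, here is a minimal back-of-envelope sketch in Python. The facility size (100 MW) and utilization (80%) are illustrative assumptions chosen for the example, not figures from the sources cited above; only the liters-per-kWh rate comes from the text.

```python
# Back-of-envelope estimate of evaporative cooling losses for one data center.
# WATER_PER_KWH_L is the upper bound cited in the text; the facility power
# and load factor below are hypothetical assumptions for illustration only.

WATER_PER_KWH_L = 9.0      # upper-bound evaporative loss, liters per kWh (from text)
FACILITY_POWER_MW = 100    # assumed size of a large facility (hypothetical)
UTILIZATION = 0.8          # assumed average load factor (hypothetical)
HOURS_PER_YEAR = 24 * 365

energy_kwh = FACILITY_POWER_MW * 1_000 * HOURS_PER_YEAR * UTILIZATION
water_liters = energy_kwh * WATER_PER_KWH_L

print(f"Annual energy use: {energy_kwh:,.0f} kWh")
print(f"Water evaporated:  {water_liters / 1e9:,.1f} billion liters "
      f"(~{water_liters / 1e3 / 1e6:,.1f} million cubic meters)")
```

Under these rough assumptions, a single large facility evaporates on the order of six million cubic meters per year, which suggests how thousands of such facilities could plausibly add up to the 4.2–6.6 billion cubic meter global projection.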