Commit 59fa71f

Author: willitcode
Commit message: my copy
1 parent: ff15d36

File tree: 1 file changed (+29, -1 lines)

content/solutions/_index.md

@@ -3,4 +3,32 @@ title: 'Solutions'
description: 'Our work towards a brighter future with cleaner AI'
---

-Long copy here

# Spiking Neural Networks

Spiking Neural Networks, or SNNs, are an emerging kind of neural network. In Artificial Neural Networks, or ANNs, the current standard, the artificial "neurons" are essentially active all the time. SNNs behave more like a human brain: a neuron activates only when it has a signal to communicate, and stays silent otherwise. This event-driven design can make SNNs up to 280 times more efficient than ANNs.[^1] Because the savings come from doing less computation overall, they cut both greenhouse gas emissions *and* water consumption.
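
For readers who want to see the event-driven idea in code, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, one common building block of SNNs. Everything in it (the `simulate_lif` helper, the parameter values, and the input pattern) is an illustrative assumption rather than anything drawn from the cited research:

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron; returns spike times (ms)."""
    v = 0.0                         # membrane potential starts at rest (0)
    spike_times = []
    for step, current in enumerate(input_current):
        # The potential leaks back toward rest and integrates incoming current.
        v += dt * (-v / tau + current)
        if v >= v_thresh:                    # only now does the neuron "speak"
            spike_times.append(step * dt)    # record the spike...
            v = v_reset                      # ...and reset the potential
    return spike_times

# Sparse, event-like input: when nothing arrives, the neuron stays silent,
# which on neuromorphic hardware is where the energy savings come from.
rng = np.random.default_rng(0)
drive = np.where(rng.random(100) < 0.3, 0.15, 0.0)
print("spike times (ms):", simulate_lif(drive))
```

The `if v >= v_thresh` gate is the key contrast with a conventional ANN layer: computation and communication happen only at spike times, and the silence in between is what saves energy.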

At CCAI, we're working to further research into SNNs and bring them to market as a viable alternative to ANNs. We provide grant funding to researchers working on SNNs, and we advocate within the industry to raise awareness of the technology. After all, energy savings don't just benefit the environment; they also save money for the companies that operate AI models.

# Lifelong Learning

Lifelong Learning, or L2, is another technology that can help reduce the energy and water consumption of artificial intelligence. Most neural networks today share a problem known as catastrophic forgetting: as new training is added, they tend to "forget" their older training. This means that when AI companies want to extend the functionality of their models, *they have to re-train those models from the ground up*. OpenAI, for example, had to completely redo the training process when creating GPT-4 (and then again for GPT-4o).[^2][^3] They couldn't build new training on top of GPT-3.5 or GPT-4 because of this tendency to forget old training.

Lifelong Learning solves this problem with a set of algorithms that prevent AI models from forgetting their past training.[^1] This *substantially* reduces the energy (and therefore also the water) these models consume during training, since they can be improved incrementally rather than re-trained from the ground up.
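
To make that concrete, one well-known family of lifelong-learning techniques (elastic weight consolidation and its relatives) adds a penalty that discourages changing the parameters that mattered for earlier tasks. The toy sketch below is our own illustration of that general idea on synthetic data, not the specific method from the cited article; the `train` helper and all the numbers in it are assumptions:

```python
import numpy as np

# Toy sketch of regularization-based lifelong learning, in the spirit of
# elastic weight consolidation (EWC). All data and numbers are synthetic.
rng = np.random.default_rng(1)

def train(X, y, w, steps=500, lr=0.05, anchor=None, importance=None, lam=10.0):
    """Gradient descent on squared error, optionally anchored to old weights."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)       # new-task loss gradient
        if anchor is not None:
            # Penalty lam * importance * (w - anchor)^2 keeps the parameters
            # that mattered for the old task close to their old values.
            grad += 2 * lam * importance * (w - anchor)
        w = w - lr * grad
    return w

# Task A's target depends only on feature 0; task B's only on feature 1.
X = rng.normal(size=(200, 2))
y_a = 3.0 * X[:, 0]
y_b = -2.0 * X[:, 1]

w_a = train(X, y_a, np.zeros(2))        # learn task A: w ends near [3, 0]
importance = np.array([1.0, 0.0])       # feature 0 mattered for task A
w_naive = train(X, y_b, w_a)            # plain fine-tuning on task B
w_anchored = train(X, y_b, w_a, anchor=w_a, importance=importance)

print("naive fine-tuning:   ", w_naive)     # w[0] falls to ~0: A forgotten
print("anchored (EWC-style):", w_anchored)  # w[0] stays near 3: A retained
```

Naive fine-tuning drives the first weight toward zero, so the old task is forgotten; the anchored run keeps that weight close to its original value while still learning the new task.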

Unfortunately, much like SNNs, L2 algorithms are not yet ready for use by the broad majority of AI model developers. As such, we at CCAI are also funding research into L2 algorithms alongside SNNs through grants, and we're raising awareness of L2 within the industry.

# Offsetting

Another way to effectively reduce AI's environmental impact is through offsetting. Analytical AI can be used for a variety of applications that are highly beneficial for the climate, water conservation, and many other aspects of the environment.[^4][^5] For example, AI has been used to improve efficiency in wastewater treatment and to track iceberg shrinkage more accurately.[^4][^5] These are significant benefits that should not be overlooked, and because of them, some AI models are a net benefit for the environment.

CCAI is funding research into these applications of AI and working directly with smaller, local organizations to implement these systems in the communities that need them most.

# The Tough Pill to Swallow: Curtailing Generative AI

The simple truth is that, while generative AI is a powerful new technology, it consumes much more energy and water than less general AI models (like analytical models), and it contributes very little to environmental conservation efforts.[^6] The societal excitement around tools like ChatGPT and Midjourney has driven heavy investment in generative AI, but smaller, more analytical models may be the way to go, at least for now: they're more energy-efficient *and* they have a much broader range of practical applications. Despite all the hype, we would be better off significantly curbing our development of generative AI.

At CCAI, we're working to educate people and companies around the globe about the utility of analytical AI, and we're directly supporting local groups that are pushing back against the unfair encroachment of generative AI infrastructure on the Global South. For example, we're supporting Uruguay's Environment Ministry in its legal battle to learn how much of the country's water Google's planned data center will require.[^7]

[^1]: Dora, Shirin. “AI Has a Large and Growing Carbon Footprint, but There Are Potential Solutions on the Horizon.” *The Conversation*, 16 Feb. 2024, http://theconversation.com/ai-has-a-large-and-growing-carbon-footprint-but-there-are-potential-solutions-on-the-horizon-223488.
[^2]: “GPT-4 Research.” *OpenAI*, 14 Mar. 2023, https://openai.com/index/gpt-4-research/.
[^3]: “Hello GPT-4o.” *OpenAI*, 13 May 2024, https://openai.com/index/hello-gpt-4o/.
[^4]: Bosch, Hilmer, et al. “AI’s Excessive Water Consumption Threatens to Drown out Its Environmental Contributions.” *The Conversation*, 21 Mar. 2024, http://theconversation.com/ais-excessive-water-consumption-threatens-to-drown-out-its-environmental-contributions-225854.
[^5]: Masterson, Victoria. “9 Ways AI Is Helping Tackle Climate Change.” *World Economic Forum*, 12 Feb. 2024, https://www.weforum.org/stories/2024/02/ai-combat-climate-change/.
[^6]: Shim, Christina. “The Future of AI and Energy Efficiency.” *IBM Think*, 17 Oct. 2024, https://www.ibm.com/think/insights/future-ai-energy-efficiency.
[^7]: “El agua que Google necesitará en Uruguay para su data center desata una pelea judicial.” [“The Water Google Will Need in Uruguay for Its Data Center Sparks a Legal Battle.”] *El Observador*, 17 Feb. 2023, https://www.elobservador.com.uy/nota/el-agua-que-google-necesitara-en-uruguay-para-su-data-center-desata-una-pelea-judicial-2023217143847.
