{ name:"File-level code review", avgI:2500, avgO:1200, note:"Review single file" },
{ name:"Bug fix assistance", avgI:1500, avgO:800, note:"Debug with context" },
display(chart)
```
<details open>
<summary>carbon</summary>
Enter your assumed rate of energy consumption (kWh per million tokens) and grid emission factor (grams of CO₂ equivalent per kWh) to produce a back-of-the-envelope estimate of carbon emissions from model inference.
${d3.format(",.0f")((u.sessions)*(u.avgI+u.avgO)/1e6*365)} million tokens and ${d3.format(",.0f")((u.sessions)*(u.avgI+u.avgO)/1e6*energy*365)} kWh per year
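The arithmetic behind this estimate can be sketched in plain JavaScript. The session count, energy rate, and grid factor below are placeholder values chosen for illustration, not measurements:

```javascript
// Back-of-the-envelope carbon estimate (illustrative inputs only).
const sessions = 20;              // assumed sessions per day
const avgI = 2500, avgO = 1200;   // avg input/output tokens per session
const energyPerMtok = 1;          // assumed energy rate, kWh per million tokens
const gridFactor = 400;           // assumed grid intensity, gCO2e per kWh

const mtokPerYear  = (sessions * (avgI + avgO)) / 1e6 * 365;
const kwhPerYear   = mtokPerYear * energyPerMtok;
const kgCo2PerYear = (kwhPerYear * gridFactor) / 1000;
// ≈ 27.01 Mtok, 27.01 kWh, 10.80 kg CO2e per year for these inputs
```

Multiplying tokens by an assumed kWh/Mtok rate and a grid emission factor is the same chain of unit conversions the interactive inputs above perform.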
<details>
<summary>kWh/Mtok references</summary>
| Study / model | kWh/Mtok | Context | Original metric | Source |
| --- | --- | --- | --- | --- |
| GPT-3 175B (GPT-3 paper) | **~13** | GPT-3 on 2020 infra | 0.4 kWh per 100 pages (~30k tokens) | [GPT-3 paper](https://arxiv.org/abs/2005.14165) |
| Husom et al. (older ChatGPT) | **~9** | GPT-3-era ChatGPT | ~9 mWh/token | [Husom et al. summary](https://github.com/kmaasrud/ai-ecosystem-carbon-footprint) |
| LLaMA-65B (Samsi et al.) | **~0.8–1.1** | LLaMA-65B on V100/A100 | ~3–4 J/token | [From Words to Watts](https://arxiv.org/abs/2309.04360) |
| Local Llama-3 8B (Baquero) | **~0.17** | 8B model on Apple M3 | <200 J for ~333 tokens | [Baquero CACM blog](https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans-and-large-language-models/) |
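The kWh/Mtok column is a unit conversion of each study's original metric. A quick sketch of those conversions, using the figures quoted in the table:

```javascript
const J_PER_KWH = 3.6e6; // 1 kWh = 3.6 million joules

// J/token -> kWh per million tokens
const jPerTokenToKwhPerMtok = j => (j * 1e6) / J_PER_KWH;

const samsi   = jPerTokenToKwhPerMtok(3);     // ~3 J/token  -> ≈ 0.83 kWh/Mtok
const baquero = jPerTokenToKwhPerMtok(200 / 333); // <200 J / ~333 tokens -> ≈ 0.17
const gpt3    = (0.4 / 30000) * 1e6;          // 0.4 kWh per ~30k tokens -> ≈ 13.3
const husom   = (9e-3 * 1e6) / 1000;          // ~9 mWh/token -> 9 kWh/Mtok
```

These are order-of-magnitude conversions only; the underlying studies measured different models on different hardware, so the spread across rows is expected.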