Two data visualizations exploring GPU hardware efficiency and U.S. data center electricity consumption — drawn from real datasets including LBNL 2024, the IEA Energy & AI report, and the MDPI Kappa-Energy Index paper.
Six deep learning architectures benchmarked on two NVIDIA GPUs. Each row shows how training energy and top-1 accuracy interact — the ideal model sits at low energy, high accuracy. Data from the MDPI Kappa-Energy Index study (2025), Tables 3–7.
| Architecture | Params (M) | TITAN Xp Energy (Wh) | GTX 1080 Ti Energy (Wh) | Top-1 Accuracy (%) | Kappa-Energy Index | Efficiency Tier |
|---|---|---|---|---|---|---|
| AlexNet | 61.1 | 14.2 | 16.8 | 56.5 | 3.98 | Low |
| VGG-16 | 138.4 | 210.5 | 248.3 | 71.6 | 0.34 | Very Low |
| ResNet-18 | 11.7 | 28.7 | 33.4 | 69.8 | 2.43 | High |
| EfficientNet-B3 | 12.2 | 41.3 | 49.6 | 82.1 | 1.99 | High |
| ConvNeXt-T | 28.6 | 67.9 | 79.2 | 82.1 | 1.21 | Medium |
| Swin Transformer | 28.3 | 89.4 | 104.8 | 81.3 | 0.91 | Medium-Low |
ResNet-18 and EfficientNet-B3 dominate the efficiency frontier. EfficientNet-B3 ties ConvNeXt-T for the highest accuracy (82.1%) at just 41 Wh, using roughly 5× less energy than VGG-16 while scoring about 10 percentage points higher on top-1 accuracy. VGG-16 sits alone in the "energy trap" quadrant, with the worst energy-per-accuracy ratio of all tested architectures.
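The frontier claim can be checked directly from the TITAN Xp column of the table above. A minimal sketch that ranks the six architectures by energy spent per percentage point of top-1 accuracy (this simple ratio is an illustration; the Kappa-Energy Index itself is defined in the MDPI paper and is not recomputed here):

```python
# Energy (Wh, TITAN Xp) and top-1 accuracy (%) from the table above.
models = {
    "AlexNet":          (14.2, 56.5),
    "VGG-16":           (210.5, 71.6),
    "ResNet-18":        (28.7, 69.8),
    "EfficientNet-B3":  (41.3, 82.1),
    "ConvNeXt-T":       (67.9, 82.1),
    "Swin Transformer": (89.4, 81.3),
}

# Wh spent per percentage point of accuracy: lower is better.
ratio = {name: wh / acc for name, (wh, acc) in models.items()}

for name in sorted(ratio, key=ratio.get):
    print(f"{name:16s} {ratio[name]:.2f} Wh per accuracy point")
```

Sorted this way, VGG-16 lands last by a wide margin (about 2.9 Wh per accuracy point, versus about 0.4 for ResNet-18), which is the "energy trap" the chart highlights.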
Twenty-three years of tracked electricity demand, disaggregated by facility type: traditional enterprise, colocation, and hyperscale/cloud. The rise of hyperscale is the dominant structural shift — and AI-optimized clusters are accelerating it further. From the LBNL 2024 report, Figure 2.1 & Table 2.1.
| Year | Traditional Enterprise (TWh) | Colocation (TWh) | Hyperscale / Cloud (TWh) | Total (TWh) | YoY Growth | Hyperscale Share |
|---|---|---|---|---|---|---|
U.S. data center electricity demand grew from ~61 TWh in 2000 to ~176 TWh in 2023 — a ~190% increase. But the composition shifted dramatically: hyperscale now accounts for ~55% of total consumption, up from near-zero in 2010. Critically, even as total demand surged, efficiency improvements (better PUE, server consolidation) prevented consumption from growing proportionally with compute capacity.
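The headline figures in the paragraph above follow from two numbers in the LBNL series. A quick sketch, using only the endpoints quoted in the text (the ~55% hyperscale share is applied to the 2023 total for illustration):

```python
# Endpoints of the LBNL 2024 series quoted above (TWh).
start, end = 61.0, 176.0            # 2000 and 2023 totals

growth_pct = (end / start - 1) * 100         # total growth over the period
hyperscale_share = 0.55 * end                # ~55% of the 2023 total

print(f"total growth 2000-2023: {growth_pct:.0f}%")
print(f"implied hyperscale load: {hyperscale_share:.0f} TWh")
```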
Interactive analytics across GPU hardware generations — compute density, memory bandwidth proxies, and clock relationships. Filter by manufacturer, release year, and sort metric.
SOURCE — Kaggle GPU Benchmarks · wthoman API
Bubble size = memory bus width. Ideal: high shaders, large memory.
Horizontal bars sorted by GPU clock (MHz).
| Manufacturer | Product | Year | Mem (GB) | Bus Width | GPU Clock | Mem Clock | Shaders | Chip |
|---|---|---|---|---|---|---|---|---|
| Loading… | | | | | | | | |
IEA Electricity 2024 projections — how much electricity data centers consumed in 2022 and where demand is heading by 2026 across regions, plus a breakdown of how that electricity is split inside a typical facility.
SOURCE — IEA, Electricity 2024: Analysis and Forecast to 2026 · jhu30699
IEA 2022 actual vs 2026 low / base / high scenario (TWh)
Typical breakdown by component type (IEA)
Cooling matches computing in electricity draw — a core finding that supports efficiency gains from better PUE practices and hardware co-design.
| Dataset | 2022 (TWh) | 2026 (TWh) | Notes |
|---|---|---|---|
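The cooling/computing parity noted above maps directly onto PUE (Power Usage Effectiveness), defined as total facility energy divided by IT (computing) energy. A minimal sketch, assuming an illustrative 40/40/20 split between computing, cooling, and other loads (the actual shares are in the IEA breakdown chart, not hard-coded here):

```python
# Illustrative component split; "computing" is the IT load in the PUE sense.
split = {"computing": 0.40, "cooling": 0.40, "other": 0.20}

# PUE = total facility energy / IT energy. Under this split every watt of
# compute drags 1.5 watts of overhead along with it.
pue = sum(split.values()) / split["computing"]

print(f"PUE = {pue:.2f}")
```

Shrinking the cooling share is the single biggest lever in this ratio, which is why the caption ties efficiency gains to better PUE practices.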
Regional electricity consumption 2017–2030 and GPU hardware efficiency landscape — tracing how demand is growing by geography and how compute efficiency has evolved across NVIDIA architectures.
SOURCE — IEA Electricity 2024 · NVIDIA datasheets · opederso
TWh · actual and IEA base-case projections
| Region | 2022 TWh | 2024 TWh | 2030 TWh | 2024–2030 Growth | Share (2024) | Energy Mix | Carbon Intensity |
|---|---|---|---|---|---|---|---|
| United States | 130 | 183 | 426 | +133% | ~45% | Gas 41%, Nuclear 20% | High |
| China | 88 | 104 | 279 | +168% | ~25% | Coal-heavy | Very High |
| Europe (EU+UK) | 56 | 70 | 115 | +64% | ~15% | Renewables-leading | Medium |
| Japan | 14 | 19 | 34 | +79% | ~5% | Gas & Nuclear | High |
| Rest of World | 40 | 39 | 91 | +133% | ~10% | Mixed | Varies |
| Global Total | 460 | 415 | 945 | +128% | 100% | Gas 40%+ proj. | High |
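The growth column in the regional table can be re-derived from the 2024 and 2030 values it lists. A small sketch that recomputes each region's 2024-2030 growth and confirms the regional rows sum to the 415 and 945 TWh totals:

```python
# (2024 TWh, 2030 TWh) per region, from the table above.
regions = {
    "United States":  (183, 426),
    "China":          (104, 279),
    "Europe (EU+UK)": (70, 115),
    "Japan":          (19, 34),
    "Rest of World":  (39, 91),
}

for name, (y2024, y2030) in regions.items():
    print(f"{name:14s} {(y2030 / y2024 - 1) * 100:+.0f}%")

total_2024 = sum(v[0] for v in regions.values())   # should be 415 TWh
total_2030 = sum(v[1] for v in regions.values())   # should be 945 TWh
print(f"global: {total_2024} -> {total_2030} TWh")
```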
NVIDIA architectures from Volta to Hopper — TDP, compute throughput, and efficiency per watt
| GPU | Architecture | Year | TDP (W) | FP16 TFLOPS | VRAM (GB) | Mem BW (TB/s) | TFLOPS/W | Tier |
|---|---|---|---|---|---|---|---|---|
| V100 SXM2 | Volta | 2017 | 300 | 112 | 32 | 0.90 | 0.37 | Data Center |
| RTX 3090 | Ampere (consumer) | 2020 | 350 | 142 | 24 | 0.94 | 0.41 | Prosumer |
| A100 SXM4 | Ampere | 2020 | 400 | 312 | 80 | 2.00 | 0.78 | Data Center |
| RTX 4090 | Ada Lovelace | 2022 | 450 | 330 | 24 | 1.01 | 0.73 | Prosumer / Edge |
| H100 SXM5 | Hopper | 2022 | 700 | 990 | 80 | 3.35 | 1.41 | Data Center (AI) |
| H200 SXM | Hopper | 2024 | 700 | 990 | 141 | 4.80 | 1.41 | Data Center (AI) |
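The TFLOPS/W column is simply FP16 throughput divided by TDP. A sketch that derives it from the table's TDP and FP16 columns and ranks the six parts:

```python
# (TDP W, FP16 TFLOPS) from the table above.
gpus = {
    "V100 SXM2": (300, 112),
    "RTX 3090":  (350, 142),
    "A100 SXM4": (400, 312),
    "RTX 4090":  (450, 330),
    "H100 SXM5": (700, 990),
    "H200 SXM":  (700, 990),
}

eff = {name: tflops / tdp for name, (tdp, tflops) in gpus.items()}

for name, e in sorted(eff.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {e:.2f} FP16 TFLOPS/W")
```

Note that H100 and H200 tie at ~1.41 TFLOPS/W: H200's gain over H100 is memory capacity and bandwidth, not compute per watt.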
Stacked area · TWh · IEA base-case projection (dashed after 2024)
X: TDP (W) · Y: FP16 TFLOPS/W · Bubble size: memory bandwidth
Three interactive views — a regional constellation, a growth ladder with forecast band, and a proportional ribbon — that together map where AI data center electricity use sits today and where it is heading by 2030.
Source figures from the IEA Energy and AI materials and the 2024 LBNL United States Data Center Energy Usage Report.
| Metric Group | Label | Region | Year | Value | Units | Status | Source |
|---|---|---|---|---|---|---|---|
Circle area scaled to TWh. Hover or click each region for details. The center circle represents the full 415 TWh global total.
Global data center electricity is concentrated — the U.S. holds ~45% of the 2024 total, followed by China (25%) and Europe (15%).
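"Circle area scaled to TWh" means the radius must grow with the square root of the value, since readers judge magnitude by area, not radius. A minimal sketch of that sizing rule (`max_radius` is an arbitrary layout constant, not a value from the charts):

```python
import math

def radius(twh, total=415.0, max_radius=100.0):
    """Radius (px) for a circle whose AREA is proportional to TWh.

    The 415 TWh global total gets the full max_radius; every region
    scales down by sqrt(share) so that area, not radius, tracks TWh.
    """
    return max_radius * math.sqrt(twh / total)

print(f"US (183 TWh):    r = {radius(183):.1f} px")
print(f"Japan (19 TWh):  r = {radius(19):.1f} px")
```

Scaling the radius linearly with TWh instead would visually exaggerate large regions, since a circle twice the radius has four times the area.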
Bars show actual and IEA base-case values. The gradient ribbon represents the LBNL U.S. 2028 uncertainty range — click it for details.
The IEA base case has global demand more than doubling by 2030. The U.S. 2028 band (325–580 TWh) underscores forecast uncertainty.
The same regional share data in proportional form. Hover each segment for the exact share and implied TWh.
A few regions carry the bulk of global data center electricity use, so grid choices in those regions shape the whole global story.
Moving from 415 TWh in 2024 to 945 TWh in the IEA base case for 2030 shows how quickly this load can expand in just six years.
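The 415 to 945 TWh jump over six years implies a steep compound annual growth rate. A one-liner to make the pace explicit:

```python
# Implied compound annual growth rate behind 415 -> 945 TWh over 2024-2030.
cagr = (945 / 415) ** (1 / 6) - 1

print(f"implied CAGR: {cagr * 100:.1f}% per year")
```

Roughly 15% per year, sustained for six years, is the pace the IEA base case assumes.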
The U.S. 2028 band of 325–580 TWh is a reminder that forecasting infrastructure demand is a range problem, not a single-number problem.