
The hidden environmental cost of cloud gaming

A console burns 150–200 watts in your house. The server streaming Cyberpunk to you burns 400 watts in a data centre. Cloud gaming might be more efficient than local — or it might be quietly worse. The honest answer involves more variables than either side admits.

By Marin Björk

The pitch you sometimes hear

Cloud gaming advocates occasionally argue that streaming is more environmentally efficient than local hardware. The case: a single high-end GPU in a data centre serves multiple users (via session-multiplexing), runs at higher utilisation, and benefits from data-centre-grade cooling efficiency relative to a household gaming rig.

This is one of those claims that has a real version and a marketing version. The real version is more complicated, and depends on factors that vary considerably across users, services, and regions.

The energy math, slowly

A high-end gaming PC drawing 350 watts at the wall, used for two hours a day, consumes about 255 kWh/year. A PS5 at ~200 watts under load, same usage, consumes about 145 kWh/year. A laptop at 50 watts during gaming consumes about 36 kWh/year.
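That arithmetic, as a minimal sketch (the wattages and hours are the figures above):

```python
HOURS_PER_DAY = 2
DAYS_PER_YEAR = 365

def annual_kwh(watts: float) -> float:
    """Annual energy for a device drawing `watts` during play."""
    return watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000

for device, watts in [("gaming PC", 350), ("PS5", 200), ("laptop", 50)]:
    print(f"{device}: {annual_kwh(watts):.0f} kWh/year")
# gaming PC: 256 kWh/year, PS5: 146 kWh/year, laptop: 36 kWh/year
```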

A cloud-streaming server in a data centre uses ~400 watts under load. If that server serves a single user for the same two hours/day and sits idle the rest, that user carries the whole box: far worse than even the gaming PC. Rotation helps, but less than you'd hope: each session still draws the full 400 watts, so even a fully booked server works out to about 290 kWh/year per user, on par with the gaming PC rather than the laptop.

Add the client device's own draw while streaming (5–15 watts), the network infrastructure between you and the data centre, and the PUE (power usage effectiveness) overhead of the data centre itself (1.1–1.4× IT load), and the per-user energy cost depends critically on the multiplexing assumption.
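A sketch of that per-user figure. The idle draw, PUE, client draw, and per-stream network share below are illustrative assumptions, not measured values; only the 400 W load figure comes from the text above:

```python
def cloud_kwh_per_user(users_per_server: int,
                       server_watts: float = 400,   # draw under load (from above)
                       idle_watts: float = 100,     # assumed idle draw
                       pue: float = 1.2,            # data-centre overhead multiplier
                       client_watts: float = 10,    # assumed thin-client draw
                       network_watts: float = 15,   # assumed per-stream network share
                       hours_per_day: float = 2) -> float:
    """Per-user annual kWh when `users_per_server` players share one box
    at non-overlapping times; idle hours are split among them too."""
    active_h = hours_per_day * users_per_server
    idle_h = max(0.0, 24 - active_h)
    server_wh = (server_watts * active_h + idle_watts * idle_h) * pue
    per_user_wh = server_wh / users_per_server + (client_watts + network_watts) * hours_per_day
    return per_user_wh * 365 / 1000

for n in (1, 4, 12):
    print(f"{n} users/server: {cloud_kwh_per_user(n):.0f} kWh/year each")
# 1 -> ~1332, 4 -> ~544, 12 (fully booked) -> ~369 kWh/year each
```

The spread between the single-user and fully booked cases is nearly 4×, which is the whole point of the next section.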

The multiplexing assumption is the whole game

If you only play 2 hours a day, but the GPU serving you sits idle the other 22 hours, cloud gaming is much worse than local for you specifically. That's the worst case.

If your cloud provider can match you with three or four other users who play at non-overlapping times, the idle hours are shared and the per-user energy drops toward parity with a local gaming PC. That's the best case.

Modern cloud gaming services do try to multiplex. NVIDIA's GeForce Now reportedly runs at ~70% utilisation on Ultimate hardware across the day. PlayStation Plus Premium and Xbox Cloud Gaming have lower utilisation because their hardware is console-specific and harder to multiplex.

The honest answer: cloud gaming at 70% server utilisation is approximately a wash with local PC gaming on the energy axis, slightly worse on the carbon axis (because data centre power is on average dirtier than residential grids in many regions), and meaningfully better than local gaming in regions where the residential grid is mostly coal.
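To see how the carbon axis can diverge from the energy axis, weight each side by a grid intensity. The intensities here are illustrative assumptions, not sourced figures:

```python
# Illustrative carbon intensities in gCO2/kWh (assumptions, not sourced figures)
GRID_G_PER_KWH = {"coal-heavy": 800, "average mix": 400, "mostly nuclear": 60}

def annual_kg_co2(kwh: float, g_per_kwh: float) -> float:
    return kwh * g_per_kwh / 1000

LOCAL_PC_KWH, SATURATED_CLOUD_KWH = 255, 369  # from the sketches above

# Cloud on an average-mix grid vs a local PC on a coal-heavy grid: cloud wins.
print(annual_kg_co2(SATURATED_CLOUD_KWH, GRID_G_PER_KWH["average mix"]))  # ~148 kg
print(annual_kg_co2(LOCAL_PC_KWH, GRID_G_PER_KWH["coal-heavy"]))          # ~204 kg
# Same cloud vs a local PC on a mostly-nuclear grid: local wins, by a lot.
print(annual_kg_co2(LOCAL_PC_KWH, GRID_G_PER_KWH["mostly nuclear"]))      # ~15 kg
```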

The water cost no one talks about

Data centres use water for cooling. A high-density gaming-grade GPU data centre can consume 200–500 ml of water per session-hour, depending on cooling architecture and climate.

Your gaming PC consumes no water directly. Neither does the PS5. The cloud server consumes meaningfully nonzero water.
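At the two-hours-a-day usage from the energy section, that per-session-hour range compounds into an annual figure worth seeing:

```python
SESSION_HOURS_PER_YEAR = 2 * 365  # two hours a day, as above

for ml_per_hour in (200, 500):
    litres_per_year = SESSION_HOURS_PER_YEAR * ml_per_hour / 1000
    print(f"{ml_per_hour} ml/session-hour -> ~{litres_per_year:.0f} L/year per player")
# 200 -> ~146 L/year, 500 -> ~365 L/year
```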

This matters mostly in water-stressed regions — and several of the largest cloud gaming data centres (Arizona, Spain, parts of India) are in exactly such regions. NVIDIA and Microsoft have begun publishing data centre water-usage numbers in their sustainability reports. The numbers are not flattering for gaming-specific workloads, which run hotter than typical web-server workloads.

The honest comparison framework

Cloud gaming is environmentally better than local hardware if: you play less than ~10 hours/week (multiplexing helps), your local grid is dirty (the data centre's grid may well be cleaner), you'd otherwise buy a high-end gaming PC (manufacturing footprint matters more than running cost), or you live in a hot climate (your hardware's waste heat adds residential cooling load).

Cloud gaming is environmentally worse than local hardware if: you play 30+ hours/week (you saturate any multiplexing benefit), your local grid is clean (Nordic hydro, French nuclear, some US states), you'd otherwise game on a laptop you already own, or your nearest cloud data centre is in a water-stressed region.

Most cloud gaming marketing assumes the first list. Most cloud gaming critics assume the second list. Both are sometimes correct.
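A toy encoding of the two lists, with thresholds and weights that are mine rather than anything a provider has published:

```python
def cloud_probably_greener(hours_per_week: float,
                           local_grid_clean: bool,
                           would_buy_gaming_pc: bool,
                           dc_water_stressed: bool,
                           hot_climate: bool) -> bool:
    """Toy scorer for the two lists above; every weight is illustrative."""
    score = 0
    score += 1 if hours_per_week < 10 else -1   # light play multiplexes well
    score += -1 if local_grid_clean else 1      # clean local grid favours local
    score += 1 if would_buy_gaming_pc else -1   # avoided manufacturing footprint
    score += -1 if dc_water_stressed else 0     # water cost counts against cloud
    score += 1 if hot_climate else 0            # local waste heat adds AC load
    return score > 0

# The marketing persona vs the critic's persona from the paragraph above:
print(cloud_probably_greener(8, False, True, False, True))   # True
print(cloud_probably_greener(35, True, False, True, False))  # False
```

Plug in your own situation and you will likely land near zero, which is rather the point.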

What we'd like to see

Every major cloud gaming service should publish, at minimum: average server utilisation across the year, average grid carbon intensity at their primary data centres, and average water consumption per session-hour. Microsoft and Google publish enough of this at the Azure and Google Cloud layer that you can roughly back into the gaming numbers; NVIDIA does not.

Until that disclosure exists, cloud gaming's environmental claims are essentially marketing. Some of them are probably true. None of them are externally verifiable. The honest position is that cloud gaming is comparable to local gaming on environmental impact, with the variance dominated by factors most consumers can't see and most providers don't disclose.

