
The cloud-vs-local environmental comparison is more interesting than either side admits

Cloud gaming is greener than local gaming. Local gaming is greener than cloud gaming. Both arguments have been made loudly. The honest answer depends on which user you're modelling.

By Marin Björk

The two opposing claims

Cloud-bullish framing: cloud gaming consolidates rendering on shared infrastructure that operates at high utilisation. One cloud GPU serves many users serially, replacing many individual rigs that would otherwise sit idle 80% of the time. Per gaming-hour, cloud should produce less carbon than the local equivalent.

Local-bullish framing: cloud gaming adds the energy cost of the network path, the data centre cooling overhead, and the client device's display and decoder. The cumulative chain consumes more energy than a local GPU running the same workload. The 'shared infrastructure' efficiency argument counts the consolidation savings while ignoring these added overheads.

Both framings are partially right. The honest answer depends on the user.

The energy components

Cloud chain: the cloud GPU under load (200-400 W depending on tier), the data centre cooling overhead (roughly 10-30% of compute draw at modern PUE), the network backbone (small per-bit, but it accumulates), the last-mile ISP equipment (which draws the same regardless of what you're doing), and the client display + decode (10-30 W on a TV, 5-15 W on a tablet).

Local chain: the local GPU under load (150-450 W), the rest of the local rig (60-120 W for CPU, RAM, motherboard, storage, fans), the local display (10-30 W on a TV, 100-300 W on a gaming monitor).

In a typical gaming hour, cloud gaming draws roughly 300-450 W in total, while local gaming draws roughly 250-600 W. The ranges overlap meaningfully, and the choice of comparison hardware dominates the answer.
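For readers who want to sanity-check the arithmetic, here is a minimal sketch that totals each chain using representative point values inside the ranges above. The specific figures are illustrative assumptions, not measurements.

```python
# Back-of-envelope totals per gaming hour. Every figure is an
# illustrative point inside the ranges quoted above, not a measurement.

cloud_chain = {
    "gpu_load": 300,    # 200-400 W cloud GPU under load
    "cooling": 60,      # ~20% overhead at modern PUE
    "backbone": 10,     # small per-bit, accumulates (assumed)
    "client": 20,       # TV display + decode
}
local_chain = {
    "gpu_load": 300,    # 150-450 W local GPU under load
    "rest_of_rig": 90,  # CPU, RAM, motherboard, storage, fans
    "display": 30,      # 10-30 W TV; a gaming monitor runs far higher
}

for name, chain in (("cloud", cloud_chain), ("local", local_chain)):
    print(f"{name}: {sum(chain.values())} W")
# cloud: 390 W, local: 420 W -- both inside the overlapping ranges above
```

Swap the 30 W TV for a 250 W gaming monitor and the local total jumps to 640 W; that sensitivity to hardware choice is the point.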

The utilisation argument

Local gaming PCs sit idle most of the time. The 250-600 W gaming load runs for 2-4 hours a day in a typical gaming household; the rig then draws 5-15 W at idle for the other 20+ hours. Annualised, idle draw can account for 25-40% of a rig's total energy use.
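As a rough check on that annualised figure, here is the idle-share arithmetic with two illustrative usage profiles drawn from the ranges above (my numbers, not measurements):

```python
# Idle share of daily energy for a rig left powered on. Both profiles
# use illustrative values from the ranges quoted above.

def idle_share(idle_w, idle_h, load_w, load_h):
    """Fraction of daily energy consumed at idle."""
    idle_wh = idle_w * idle_h
    load_wh = load_w * load_h
    return idle_wh / (idle_wh + load_wh)

print(f"{idle_share(15, 22, 250, 2):.0%}")  # ~40%: high idle draw, short sessions
print(f"{idle_share(12, 22, 300, 2):.0%}")  # ~31%: closer to a median case
```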

Cloud GPUs serve a stream of users serially. Each GPU is under gaming load for 12-18 hours a day and queued or idle for the rest. The idle draw per delivered gaming-hour is much smaller because the GPU is in use for more of its life.

This is the cleanest cloud-bullish argument. If you assume cloud GPUs run at 60% utilisation against local rigs at 12%, cloud's embodied-energy and idle-draw advantages are substantial.
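One way to see why: amortise each platform's idle draw over the gaming hours it actually delivers. The formula below is my framing of that argument, fed with the illustrative figures above.

```python
# Idle energy attributed to each delivered gaming hour, given a fixed
# idle draw and a utilisation fraction. The framing is an assumption.

def idle_wh_per_gaming_hour(idle_w: float, utilisation: float) -> float:
    """Idle watt-hours amortised over each hour of active use."""
    return idle_w * (1 - utilisation) / utilisation

# Local rig: 5-15 W idle at ~12% utilisation
print(idle_wh_per_gaming_hour(5, 0.12))    # ~37 Wh per gaming hour
print(idle_wh_per_gaming_hour(15, 0.12))   # ~110 Wh per gaming hour

# A cloud GPU at ~60% utilisation amortises even a generous idle draw
# to a fraction of that (quantified in the next section).
```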

The local-bullish counter

The utilisation argument assumes every local rig stays powered up at idle. In reality, players turn off their PCs and consoles. PS5s ship with rest mode at <2 W; consoles spend most of their non-playing time effectively off.

Cloud-side, the 'idle' GPU isn't actually at 0 W. Cloud data centres keep GPUs in a 'warm' state for fast session spawn, typically drawing 30-60 W at idle. The 'no idle waste' assumption for cloud is partially false.
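Applying the same amortisation to that warm idle, under the assumed 60% utilisation, puts a rough number on how partial:

```python
# Warm-idle overhead attributed to each delivered gaming hour,
# assuming 60% cloud utilisation (an assumption, as above).
UTILISATION = 0.60
for warm_idle_w in (30, 60):
    overhead_wh = warm_idle_w * (1 - UTILISATION) / UTILISATION
    print(f"{warm_idle_w} W warm idle -> ~{overhead_wh:.0f} Wh per gaming hour")
# ~20-40 Wh per gaming hour: real, but well below the 37-110 Wh
# the low-utilisation local rig carries
```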

Network backbone energy is also non-zero but typically overstated. Modern internet backbones are extraordinarily energy-efficient per bit at the volumes that matter, and last-mile equipment (your home router) draws power regardless of what you're using it for.

Where the answer is clear

A user with a high-end gaming PC who games 4+ hours daily: local rendering is marginally more energy-efficient per gaming hour than cloud. The local rig's idle waste is amortised across enough gaming hours to compete.

A user with a low-end PC or laptop who games on cloud for visual quality: cloud is meaningfully more efficient. You're not running a gaming-class GPU locally, and the cloud GPU's high utilisation more than compensates for the streaming overhead.

A user who plays casually 30 minutes a day: cloud is more efficient by a substantial margin. The local rig's idle waste dominates.

A user who plays 8+ hours daily on a high-end local rig: local is more efficient. Cloud's utilisation advantage doesn't fully compensate for the network and client-side overhead at that volume.
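These four scenarios fall out of a simple daily break-even model. Everything in the sketch below is an illustrative assumption rather than a measurement, and the crossover is extremely sensitive to the inputs, which is rather the point.

```python
# Daily energy break-even between cloud and a local rig.
# All inputs are illustrative assumptions.
CLOUD_WH_PER_GAMING_HOUR = 430   # GPU + cooling + network + client + amortised warm idle
LOCAL_ACTIVE_W = 380             # high-end rig paired with a TV display
LOCAL_IDLE_W = 12                # rig left idling when not in use

def daily_wh_local(hours: float) -> float:
    """Local rig: active draw while gaming plus idle draw the rest of the day."""
    return LOCAL_ACTIVE_W * hours + LOCAL_IDLE_W * (24 - hours)

def daily_wh_cloud(hours: float) -> float:
    """Cloud: the whole chain is only attributed while you're playing."""
    return CLOUD_WH_PER_GAMING_HOUR * hours

for h in (0.5, 2, 4, 8):
    print(f"{h:>4} h/day  local {daily_wh_local(h):6.0f} Wh  cloud {daily_wh_cloud(h):6.0f} Wh")
# Crossover sits near 5 h/day with these inputs; swap the TV for a
# 250 W gaming monitor, or power the rig off when idle, and it moves
# dramatically.
```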

The embodied carbon

Local rigs require manufactured hardware that has its own embodied carbon footprint. A new gaming PC's manufacture is roughly equivalent to 18-24 months of its operating energy use. A new console is roughly 12-18 months.

Cloud gaming amortises hardware manufacture across many users. A cloud GPU server serving thousands of users over its 4-5 year lifecycle has a much smaller embodied-carbon-per-user-hour than a local rig.

When embodied carbon is included, the cloud advantage grows. With the hardware lifecycle factored in, cloud is greener than local for nearly all users; local only wins for the highest-hours daily players on hardware they keep for many years.
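A crude amortisation sketch shows why the lifecycle math tilts cloud-ward. The 'months of operating energy' framing comes from the figures above; the lifetimes and daily hours are my assumptions.

```python
# Crude embodied-carbon amortisation. Treats manufacture as "months of
# operating energy" per the text; lifetimes and daily hours are assumed.

# Local rig: manufacture ~= 21 months of operating energy, kept 5 years.
embodied_months, lifetime_months = 21, 60
embodied_share = embodied_months / (embodied_months + lifetime_months)
print(f"local embodied share: {embodied_share:.0%}")   # ~26% of lifetime footprint

# Delivered gaming hours per manufactured unit:
local_hours = 5 * 365 * 2       # 5 years at 2 h/day -> ~3,650 hours
cloud_hours = 4.5 * 365 * 15    # 4.5-year server at 15 gaming h/day -> ~24,600
print(f"{cloud_hours / local_hours:.1f}x more hours per unit of hardware")  # ~6.8x
```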

What to tell readers

The 'cloud is greener' meme is roughly right for the median gamer. The 'local is greener' meme is roughly right for the very high-hours minority. Both sides have been overstating their case.

If environmental impact is a meaningful part of your gaming decisions, cloud gaming on a service operating in a region with low-carbon electricity (Iceland, Nordic data centres, Quebec) is the cleanest choice. Local gaming on power-efficient hardware (Steam Deck, low-TDP APUs) is also strong. The worst choice is a high-end local rig on a coal-heavy electricity grid, used for 30 minutes a day.

