What happens to a cloud GPU at 5 years old?
In 2022 cloud gaming services marketed RTX 3080-class hardware as the premium experience. By 2027 those same GPUs will be end-of-life. Here's how the cloud datacentre depreciation cycle actually works.
The 2022 marquee hardware
GeForce Now's RTX 3080 tier launched in 2021–2022 with the explicit marketing pitch of 'better than most consumer rigs'. At the time the RTX 3080 was a $700 retail GPU and most cloud users had nothing close to that locally.
By 2024 the RTX 4080 tier replaced it at the top. The 3080-class hardware moved to the Priority tier (the middle-tier subscription). The depreciation was visible in the product structure.
By 2026 the RTX 5080 tier is the marquee, the 4080-class is Priority, and the 3080-class is at end-of-life on the Free tier.
What end-of-life means for a cloud GPU
Cloud datacentre GPUs go through three stages. New for 18–24 months as the marquee tier. Mid-tier for 12–24 months as the cheaper alternative. End-of-life for 6–12 months on the free tier or as failover capacity, then decommissioned.
Total useful life is typically 4–5 years from purchase to decommission. NVIDIA and AMD design their datacentre SKUs around this lifecycle and the cloud gaming services budget accordingly.
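The three stages above can be sketched as a simple age-to-tier mapping. The cutoffs used here (24, 42, and 54 months) are midpoint assumptions drawn from the ranges quoted in the text, not operator data:

```python
def lifecycle_stage(age_months: int) -> str:
    """Map a cloud GPU's age to its service tier.

    Boundaries are illustrative midpoints of the ranges in the text:
    ~18-24 months marquee, a further ~12-24 months mid-tier,
    a final ~6-12 months end-of-life.
    """
    if age_months < 24:
        return "marquee"
    if age_months < 42:
        return "mid-tier"
    if age_months < 54:
        return "end-of-life"
    return "decommissioned"

# A GPU cohort purchased in 2022, sampled yearly:
for year in range(6):
    print(2022 + year, lifecycle_stage(year * 12))
```

Run against a 2022 cohort, this reproduces the timeline in the article: marquee through 2023, mid-tier by 2024, end-of-life by 2026, decommissioned around 2027.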
Why the end-of-life tier matters for users
The free tier of every major cloud gaming service runs on the oldest hardware. GeForce Now Free runs on what was the Performance (formerly Priority) tier two years earlier, which was itself the Ultimate-class marquee hardware before that. The user experience on the free tier is therefore measurably worse than the same service's paid tier — that's the entire point of the tiering structure.
Consumers comparing 'cloud gaming free trials' often unknowingly compare end-of-life hardware on one service against current hardware on another. The comparison is unfair to the free tier and misleads consumers about the paid tier's quality.
The depreciation accounting
Cloud gaming services depreciate GPU capex over 4–5 years. Compare that to AI training datacentres, which depreciate over 2–3 years because the GPUs become obsolete faster: each new datacentre generation (A100 to H100 to B200) dramatically outperforms the last for training workloads.
Cloud gaming is more depreciation-friendly than AI training. A gaming GPU runs gaming workloads roughly the same in year 4 as in year 1, modulo new feature support (DLSS 4 requiring newer cards, AV1 hardware encode being a generational feature). The depreciation cycle is set by 'what features matter to gamers' rather than by raw performance falling off a cliff.
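The difference in schedules is easy to see with straight-line depreciation. The purchase prices below are hypothetical round numbers for a gaming-class card versus a training accelerator; only the 5-year and 2.5-year useful lives come from the text:

```python
def annual_depreciation(capex: float, useful_life_years: float) -> float:
    """Straight-line depreciation: equal write-down each year."""
    return capex / useful_life_years

# Assumed purchase prices, purely for illustration:
gaming_gpu = annual_depreciation(1_500, 5)    # gaming-class card, 5-year life
ai_gpu = annual_depreciation(30_000, 2.5)     # training accelerator, 2.5-year life

print(f"gaming: ${gaming_gpu:,.0f}/yr")
print(f"AI:     ${ai_gpu:,.0f}/yr")
```

Under these assumptions the gaming card costs $300 a year to own while the training accelerator costs $12,000 a year — a 40x gap driven as much by the shorter schedule as by the sticker price.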
What this implies for cloud pricing over time
Cloud gaming pricing at premium tiers (RTX 4080-class and up) is anchored by the cost of new GPU hardware. Pricing stays roughly flat as new hardware replaces old, because the operator's GPU cost is roughly flat.
Cloud gaming pricing at lower tiers (RTX 3080-class and below) drops over time because the operator's effective per-GPU cost is decreasing as the hardware fully amortises. The Free tier is structurally cheap because the GPUs running it have been paid for.
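A rough per-GPU-hour cost model shows why. Once the capex line goes to zero, only power and overhead remain. All the inputs here (card price, utilisation, power draw, electricity price) are assumptions for illustration:

```python
def gpu_hour_cost(capex: float, life_years: float, utilisation: float,
                  power_kw: float, power_price_per_kwh: float,
                  fully_amortised: bool = False) -> float:
    """Per-GPU-hour operating cost: amortised capex plus electricity.

    Overhead (cooling, networking, staff) is omitted for simplicity.
    """
    billable_hours = life_years * 365 * 24 * utilisation
    capex_per_hour = 0.0 if fully_amortised else capex / billable_hours
    power_per_hour = power_kw * power_price_per_kwh
    return capex_per_hour + power_per_hour

# Assumed: $1,500 card, 5-year life, 50% utilisation, 350 W, $0.10/kWh
new_gpu = gpu_hour_cost(1_500, 5, 0.5, 0.35, 0.10)
paid_off = gpu_hour_cost(1_500, 5, 0.5, 0.35, 0.10, fully_amortised=True)

print(f"mid-life GPU:       ${new_gpu:.3f}/hr")
print(f"fully amortised:    ${paid_off:.3f}/hr")
```

Under these assumptions the fully amortised GPU costs roughly a third of the mid-life one per hour, which is the structural reason a free tier on old hardware can exist at all.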
Expect the gap between Free and Premium cloud tiers to widen over time, not narrow. Premium gets new hardware; Free gets depreciated hardware.
Where this hits the user
Specifically: AV1 hardware encode. The RTX 4000 series introduced it. The RTX 3000 series doesn't have it. Cloud services on RTX 3000-class hardware are using H.265 or H.264 for the video stream, which is less efficient — meaning either lower visual quality at the same bitrate, or higher bitrate at the same quality.
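The bandwidth impact can be sketched with the commonly cited efficiency rules of thumb: AV1 needs roughly half the bitrate of H.264 and about 30% less than H.265 for equivalent quality. These ratios are rough published figures, not measurements from any specific cloud service:

```python
# Assumed bitrate ratios relative to H.264 at equivalent visual quality.
# Rough rules of thumb only; real savings vary by content and encoder.
EFFICIENCY_VS_H264 = {"h264": 1.00, "h265": 0.70, "av1": 0.50}

def bitrate_for_equal_quality(h264_bitrate_mbps: float, codec: str) -> float:
    """Estimate the bitrate a codec needs to match a given H.264 stream."""
    return h264_bitrate_mbps * EFFICIENCY_VS_H264[codec]

# A 1080p60 game stream that needs ~25 Mbps in H.264:
for codec in ("h264", "h265", "av1"):
    print(f"{codec}: {bitrate_for_equal_quality(25, codec):.1f} Mbps")
```

Read the other way round: at a fixed bandwidth cap, the AV1-capable premium tier can spend its bits on visual quality that the H.264/H.265 free tier simply cannot match.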
By 2027 we'd expect the free tiers of cloud gaming services to still be running RTX 3000-class hardware without AV1, while the premium tiers are on RTX 5000-class with AV1 and likely a generation beyond. The free-vs-premium visual quality gap will be visibly larger than today.
The cloud gaming services aren't going to advertise this. But it's the underlying reason why a free trial of GeForce Now will look worse in 2027 than a paid trial does in 2026.