Most 4K cloud streams are not actually 4K
Cloud services advertise 4K. Almost none of them ship a bitrate sufficient to make 4K materially better than 1440p. Here's the math, and why we'd take 1440p at high bitrate over 4K at low bitrate every time.
The bitrate the marketing doesn't show you
Streaming 4K Netflix to your TV takes roughly 25 Mbps with HEVC. That's pre-encoded video, with hours of offline encoder time per asset, optimised for each specific scene.
Cloud gaming has to encode the 4K stream in real time, frame by frame, with one encoder pass and roughly 8–16 ms of encoder budget per frame. The same visual quality at the same resolution takes meaningfully more bitrate in cloud gaming than in Netflix. Roughly: a true 4K cloud stream needs 60–80 Mbps to look as good as Netflix's 25 Mbps 4K.
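To make that budget concrete, here's the arithmetic as a quick sketch (the frame rates and per-stream bitrates below are illustrative figures, not measurements of any particular service):

```python
# Back-of-the-envelope: how little time and data a real-time encoder gets per frame.

def per_frame_budget(bitrate_mbps: float, fps: int) -> tuple[float, float]:
    """Return (milliseconds per frame, kilobytes per encoded frame)."""
    ms_per_frame = 1000 / fps                            # total frame time; the encoder gets a slice of this
    kb_per_frame = bitrate_mbps * 1e6 / fps / 8 / 1000   # bits per frame converted to kilobytes
    return ms_per_frame, kb_per_frame

for label, bitrate, fps in [
    ("Netflix-style 4K (25 Mbps, assuming 24 fps film)", 25, 24),
    ("Cloud 4K (75 Mbps, 60 fps)",                        75, 60),
    ("Cloud 1440p (45 Mbps, 120 fps)",                    45, 120),
]:
    ms, kb = per_frame_budget(bitrate, fps)
    print(f"{label}: {ms:.1f} ms per frame, ~{kb:.0f} KB per encoded frame")
```

The point of the exercise: the offline Netflix encoder gets over 40 ms per frame and can revisit its choices; the real-time encoder gets the frame time, once, and nothing more.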
Here's where the major services actually sit at their peak tiers:
- GeForce Now (RTX 4080 tier): 4K at up to 75 Mbps with AV1. That's actually credible 4K.
- Game Pass Cloud: 20 Mbps maximum, topping out at 1080p. There's no 4K stream at all, despite the catalogue including 4K-capable games.
- Boosteroid: 1080p at 15 Mbps, and it tops out there.
- PS Plus Premium: 4K at 35–40 Mbps, which is the credible-but-not-impressive tier.
Why bitrate matters more than resolution
A 4K stream at 30 Mbps will show more compression artifacts in motion than a 1440p stream at 30 Mbps. The reason is simple: 4K has 2.25× the pixel count of 1440p (3840×2160 versus 2560×1440), so at the same bitrate each pixel gets proportionally fewer bits to describe it.
What this looks like in practice: edges shimmer, fine textures (foliage, grass, fabric weave) turn into smeary low-frequency blobs in fast motion, and explosion effects produce visible block artifacts. The number on the resolution selector says 4K but the image you see has less detail than a high-bitrate 1440p stream of the same scene.
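To put numbers on the per-pixel argument, here's a quick sketch assuming 60 fps for both streams:

```python
# Bits available per pixel per frame: same bitrate, two resolutions.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: int = 60) -> float:
    bits_per_frame = bitrate_mbps * 1e6 / fps
    return bits_per_frame / (width * height)

print(f"4K    @ 30 Mbps: {bits_per_pixel(30, 3840, 2160):.3f} bits/pixel/frame")  # ~0.060
print(f"1440p @ 30 Mbps: {bits_per_pixel(30, 2560, 1440):.3f} bits/pixel/frame")  # ~0.136
# The 2.25x gap in bits per pixel is exactly the pixel-count ratio between the two resolutions.
```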
AV1 helps, but not as much as you'd think
AV1 (the newer codec) is roughly 30% more efficient than H.264 and 15–20% more efficient than HEVC for the same visual quality. That's real. It means a 50 Mbps AV1 stream looks roughly like a 60 Mbps HEVC stream.
It does not make a 20 Mbps AV1 4K stream look like a real 4K stream. The codec savings move the ceiling up but don't close the gap to the bitrate budget true 4K needs. Don't let the AV1 marketing convince you that the bitrate floor has dropped to where 4K cloud is universally credible.
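Here's what those efficiency figures work out to, as a rough sketch (we read "X% more efficient" as "same quality at X% less bitrate", and the percentages are the ones quoted above, not benchmark results):

```python
# Rough equivalent-bitrate conversion using the efficiency figures quoted above.

AV1_VS_HEVC = 0.175  # midpoint of the 15-20% bitrate saving vs HEVC

def hevc_equivalent(av1_mbps: float) -> float:
    """Approximate HEVC bitrate needed to match a given AV1 bitrate."""
    return av1_mbps / (1 - AV1_VS_HEVC)

print(f"50 Mbps AV1 ~ {hevc_equivalent(50):.0f} Mbps HEVC")  # ~61 Mbps: a real ceiling lift
print(f"20 Mbps AV1 ~ {hevc_equivalent(20):.0f} Mbps HEVC")  # ~24 Mbps: nowhere near the 60-80 Mbps true 4K needs
```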
What the consumer should actually choose
For competitive games (Apex, CS, Valorant, Rocket League): 1440p120 at 35–50 Mbps beats 4K60 at the same bitrate every time. The higher frame rate matters more than the resolution, and because 1440p120 pushes fewer pixels per second than 4K60, each pixel also gets more bits.
For single-player AAA games (Cyberpunk, BG3, Elden Ring): 4K60 is meaningful if the cloud service can actually push 60+ Mbps. If the cloud service tops out at 35 Mbps, set the client to 1440p and you'll get a sharper image with fewer artifacts.
For older games (anything 5+ years old): 4K holds up because the source content is simpler. Less motion detail, fewer fine textures, lower geometric complexity. 4K at 25 Mbps looks fine on a 2018 game.
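If you want those rules of thumb in a form you can poke at, here's a minimal sketch; the categories and thresholds mirror the recommendations above, and the function is ours for illustration, not any service's API:

```python
# Minimal sketch of the resolution recommendations above.
# Thresholds echo the article's rules of thumb; tune them to your own service's numbers.

def recommend_mode(game_type: str, max_bitrate_mbps: float) -> str:
    if game_type == "competitive":
        return "1440p @ 120 fps"             # frame rate beats resolution at these bitrates
    if game_type == "single_player_aaa":
        # 4K60 only makes sense if the service can actually feed it
        return "4K @ 60 fps" if max_bitrate_mbps >= 60 else "1440p @ 60 fps"
    if game_type == "older_title":
        return "4K @ 60 fps"                 # simpler source content survives lower bitrates
    return "1440p @ 60 fps"                  # safe default

print(recommend_mode("single_player_aaa", 35))   # -> 1440p @ 60 fps
print(recommend_mode("single_player_aaa", 75))   # -> 4K @ 60 fps
```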
Why services advertise 4K anyway
Marketing. The 4K number is the easiest number to compare across services. 'Up to 4K' is a clearer line on a comparison chart than 'up to 75 Mbps AV1 with 4:2:0 chroma'. Even when the bitrate doesn't support the resolution, the resolution number wins the marketing argument.
We've stopped putting 4K front and centre on our service comparisons because of this. The right comparison metric is bitrate per resolution, and the right thing to ask a cloud service is 'what bitrate do you allocate to my session at peak quality'. The resolution selector is a marketing detail downstream of that.
What we'd push for
Cloud gaming services should expose the effective bitrate per session in the client. The video player industry figured this out a decade ago: every YouTube viewer can check Stats for nerds and see the actual bitrate. Cloud gaming should ship the same thing. GeForce Now's Statistics overlay does this; nobody else does it properly. That feature gap is the single largest impediment to a more honest conversation about cloud quality.
Until then, the consumer-facing rule of thumb: prefer 1440p at the highest bitrate offering over 4K at the same price. The trade-off in marketing-friendly resolution numbers is paid back in actual image quality, especially for fast-motion games.