For the last 25 years or so, the world of digital video has been obsessed with increasing resolution. 'HD' (1280 x 720 pixels) and 'Full HD' (1920 x 1080) were the resolutions of choice from the early to mid 2000s, and a decade later 4K (3840 x 2160, or 4x the number of pixels in Full HD) became the resolution we had to have, whether in our TVs and computer monitors or in the cameras we shoot video with.
Even in the mid-2010s, when 4K was ramping up in popularity, I questioned its real-world benefit. The jump from analog VHS video to standard-definition digital DVD content (480 or 576 lines) was night and day in terms of image quality, and the leap from DVD resolution to Full HD was almost as pronounced. The difference between Full HD and 4K was far less noticeable, although it was at least a subtle improvement for most people viewing big-screen TVs at typical viewing distances.
However, as with many technological advancements, we're now at a stage where subsequent improvements yield diminishing returns. While the jump from Full HD to 4K gave a modest tangible improvement to image quality, the difference between 4K and 8K is almost imperceptible in most cases. And yet from a purely mathematical perspective, this doesn't seem to make sense. 4K has 4x the total number of pixels of Full HD, so it should provide 4x the clarity (strictly speaking, quadrupling the pixel count only doubles the detail along each axis, but it should still be an obvious step up). 8K is 4x the pixels of 4K, so it too should boost clarity four-fold. But there's one constant here that hasn't been considered, and which can't change: the resolving power of the human eye. Just because your shiny new 8K TV packs 4x the number of pixels of your old 4K panel of the same size doesn't mean your eyes can actually see that extra detail when viewed from the same distance as before.
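To put some rough numbers on that, here's a quick back-of-envelope sketch (plain Python, purely illustrative arithmetic) comparing the total pixel counts and the linear resolution steps between the three formats:

```python
# Back-of-envelope pixel counts for the common consumer video resolutions.
resolutions = {
    "Full HD": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

for name, (width, height) in resolutions.items():
    megapixels = width * height / 1_000_000
    # Each step quadruples the total pixel count but only doubles
    # the detail along each axis.
    scale_vs_full_hd = width / 1920
    print(f"{name}: {width} x {height} = {megapixels:.1f} MP "
          f"({scale_vs_full_hd:.0f}x the width of Full HD)")
```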
"Sure, but when I upgrade to an 8K TV, I'll also buy a bigger TV - then I'll be able to see that 8K difference, right?". Actually, it's not that simple. Boffins far smarter than me have calculated the exact resolving power of a lens like the human eye using the Reyleigh formula, and we can use this to calculate how close you'd need to view your television in order to actually see the extra resolution of 8K.
In the case of a 75" television, you'd need to be viewing from as close as 2.5 feet (79cm) to really perceive the 8K difference. You'll see a small benefit over 4K when sitting between 2.5 feet and 4.9 feet away, but step back more than 4.9 feet from your 75" TV and you'll see no difference between 4K and 8K. Given that even Panasonic recommends a viewing distance of 6-9 feet for its 70" 4K TVs, and Samsung suggests a 7.5-foot viewing distance from a 75" television, getting closer than 4.9 feet is unrealistic for most people.
(For more information, check out this excellent article on the subject from Forbes.)
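If you'd like to sanity-check those viewing distances yourself, the sketch below does the sums. Rather than working through the full Rayleigh formula, it assumes the commonly cited ~1 arcminute acuity limit for the human eye (roughly what the Rayleigh criterion gives for a typical daylight pupil), and it lands very close to the figures above:

```python
import math

def max_resolvable_distance_m(diagonal_in, horizontal_pixels,
                              aspect=(16, 9), acuity_arcmin=1.0):
    """Farthest distance (in metres) at which adjacent pixels can still be
    told apart, assuming a visual acuity limit of ~1 arcminute."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)        # screen width, inches
    pixel_pitch_m = (width_in * 0.0254) / horizontal_pixels
    theta = math.radians(acuity_arcmin / 60.0)           # acuity, in radians
    return pixel_pitch_m / theta

for label, pixels in (("4K", 3840), ("8K", 7680)):
    d = max_resolvable_distance_m(75, pixels)
    print(f"{label} on a 75-inch TV: individual pixels blur together beyond "
          f"{d:.2f} m ({d / 0.3048:.1f} ft)")
```

Run it and you get roughly 4.9 feet for 4K and 2.4 feet for 8K on a 75-inch panel - in other words, sit back further than about five feet and the 4K pixels are already beyond your eye's resolving limit, so the extra 8K pixels are wasted.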
So while you are unlikely to get a tangible benefit from 8K when viewing a television, what about viewing 8K content on a computer monitor? After all, we sit a lot closer to those, so we'd surely be more likely to see the extra detail on offer. That theory checks out to an extent, but with computer monitors generally being far smaller than typical TVs, you'd need to sit really close to see an 8K difference.
And that's assuming you can even get your hands on an 8K monitor. Dell made headlines back in 2017 with its first 8K monitor - the 32-inch UltraSharp UP3218K. I thought at the time this was the start of a torrent of 8K monitors, but since then we've seen the grand total of... zero other 8K consumer monitors. Earlier this year Asus teased its 8K ProArt Display PA32KCX, but it's yet to go on sale. Even that lone Dell 8K monitor is hard to find beyond Dell's own web store - it's long been discontinued by B&H, indicating demand may not be very strong.
All this isn't to say there are no benefits to 8K. For those shooting 8K video, it gives more versatility to crop in on areas of the frame when needed and then export at 4K while losing little, if any, detail. Furthermore, if you're shooting professional footage destined for broadcast and/or cinematic release, then sure, it makes sense to shoot at the best video quality available. TV manufacturers will also often reserve their premium features for 8K models - the latest OLED/QLED panel tech, the best possible HDR and screen brightness, the best image processing and upscaling, etc. Likewise, even if you're not buying a camera like the Canon EOS R5 or R5 Mark II specifically for its 8K video ability, you'll still benefit from other high-end stills and video features which you may not get with cheaper 4K-only cameras.
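To illustrate that cropping headroom with a little arithmetic (a hypothetical sketch, not tied to any particular camera or editing software): a native 4K crop taken from an 8K frame uses only half of the frame's width and height, so it's effectively a 2x punch-in with no upscaling at all:

```python
# How much of an 8K frame does a native 4K crop actually use?
src_w, src_h = 7680, 4320    # 8K UHD source frame
crop_w, crop_h = 3840, 2160  # 4K UHD output, taken 1:1 from the source

width_fraction = crop_w / src_w                       # 0.5 of the width
area_fraction = (crop_w * crop_h) / (src_w * src_h)   # 0.25 of the area

print(f"A 4K crop covers {width_fraction:.0%} of the 8K frame's width "
      f"and {area_fraction:.0%} of its area - a 2x zoom in which every "
      f"output pixel still comes straight from the sensor.")
```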
But the fact remains that when it comes to consuming content at home, unless you're rocking a home cinema with a screen size well north of 100 inches, there's no need for 8K: a 4K display with the same brightness, contrast, refresh rate and other display features will look just as good to the human eye.
We've reached a point in video resolution where going beyond 4K is ultimately pointless for the vast, vast majority of consumers, so don't be fooled by the 8K marketing hype.