Scientists claim you can't see the difference between 1440p and 8K at 10 feet
(Image credit: Getty Images/bluecinema)
Researchers at the University of Cambridge and Meta Reality Labs have conducted a new study into how the human eye perceives pixels on displays of different sizes and resolutions, and claim that beyond a certain pixel density and viewing distance, there's no discernible difference (via TechXplore). According to the calculator they developed, at a 10-foot distance, a 50-inch screen looks almost identical at 1440p and 8K resolution.
The researchers highlighted that as displays grow larger and ever more detailed, it's important to know what humans can actually see so that we aren't developing technologies that are largely redundant. But where previous studies looked at perceived retinal resolution, these researchers looked at the resolution at which the viewer perceives an image with utmost clarity and no blur, indistinguishable from a perfect reference, and measured it separately for different colors and tones.
Using a sliding display that gave them continuous control over resolution, the researchers found that the human eye can resolve black-and-white pixels at up to 94 pixels per degree (PPD), red and green patterns at up to 89 PPD, but yellow and violet patterns at just 53 PPD. Pixels per degree measures how many pixels fall within one degree of your visual angle, so it captures pixel density and viewing distance in a single figure.
The researchers also examined the effects of viewing a pixel directly or at an angle, the overall size of the screen, its pixel density, the brightness (or darkness) of the room, and the distance between the viewer and the screen.
Correlating all of this data, the researchers developed a calculator where you can input factors like resolution, screen size, and viewing distance to see how the study's results extrapolate. Using it, we can see that with a 50-inch screen at 10 feet, the study's subjects wouldn't have been able to tell the difference between a 1440p screen and an 8K one. The calculator indicates that for a 50-inch 1440p display viewed at 10 feet, only 1 percent of the population would notice the difference between that image and a 'perfect' reference. At 4K, that number drops to 0 percent, and naturally 8K would be the same. According to the scientists, all three resolutions would look broadly identical at that distance.
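The geometry behind a calculator like this is simple enough to sketch. The back-of-the-envelope Python below is my own approximation, not the researchers' published model: it assumes a flat 16:9 panel viewed head-on, standard horizontal pixel counts (2560 for 1440p, 3840 for 4K, 7680 for 8K), and the small-angle relationship that one degree of visual angle spans about 2 × distance × tan(0.5°) at the screen surface. Even this rough version shows why 1440p already clears the study's 94 PPD ceiling at this size and distance:

```python
import math

def pixels_per_degree(horizontal_px: int, diagonal_in: float,
                      distance_in: float, aspect: float = 16 / 9) -> float:
    """Approximate pixels per degree at the center of a flat panel viewed head-on."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # horizontal screen size
    ppi = horizontal_px / width_in                           # pixels per inch
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# 50-inch screen viewed from 10 feet (120 inches)
for name, px in [("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(px, 50, 120)
    print(f"{name}: {ppd:.0f} PPD ({'above' if ppd > 94 else 'below'} the 94 PPD limit)")
```

By this rough math, all three resolutions land at roughly 123, 185, and 369 PPD respectively, comfortably above the study's black-and-white threshold, which is consistent with the calculator's verdict that they're effectively indistinguishable at that distance.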
"If you have more pixels in your display, it's less efficient, it costs more and it requires more processing power to drive it," said co-author Professor Rafał Mantiuk, from Cambridge's Department of Computer Science and Technology.
"So we wanted to know the point at which it makes no sense to further improve the resolution of the display."
Before this study, the limit of human vision was commonly put at 60 pixels per degree, so these results raise the bar for what the human eye can perceive at a given distance and screen size. That could prompt display manufacturers to adjust their designs, and indeed the researchers hope the results will help guide display development, as well as image rendering and video coding technologies, in the future.
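To put that shift in concrete terms, and again using my own small-angle arithmetic rather than figures from the paper, raising the threshold from 60 to 94 PPD pushes out the distance at which a hypothetical 50-inch 4K panel stops revealing extra detail:

```python
import math

def min_distance_in(ppd_target: float, horizontal_px: int,
                    diagonal_in: float, aspect: float = 16 / 9) -> float:
    """Closest viewing distance (inches) at which a flat panel reaches a PPD target."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    ppi = horizontal_px / width_in
    return ppd_target / (ppi * 2 * math.tan(math.radians(0.5)))

for ppd in (60, 94):  # old rule of thumb vs. this study's black-and-white result
    d = min_distance_in(ppd, 3840, 50)
    print(f"{ppd} PPD: ~{d / 12:.1f} feet")  # prints ~3.3 ft and ~5.1 ft
```

In other words, under the old 60 PPD assumption a 50-inch 4K screen stops yielding visible detail beyond roughly 3.3 feet, while the new figure stretches that to around 5.1 feet.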
Although there's an argument to be made that 8K displays are still somewhat redundant while there's little native content for them and few GPUs powerful enough to drive them, I still feel like I can tell the difference between 8K and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that would be doable.
Then again, maybe what I'm noticing is the extra detail a higher-resolution rendered image can show, rather than the pixels themselves. Or maybe I should just listen to the scientists, who've actually done the kind of testing I'd need to do to prove my point.
Jon Martindale is a contributing writer for Tom's Hardware. For the past 20 years, he's been writing about PC components, emerging technologies, and the latest software advances. His deep and broad journalistic experience gives him unique insights into the most exciting technology trends of today and tomorrow.