Can you really tell the difference between 4K and 8K?

Some time ago, I experienced the pace of recent technological progress in a very direct way. My roommate, a gamer, had learned, presumably from other gamers, that video games from the '90s look best when played on a TV from the '90s. This led him to find one on Craigslist, which in turn led to me helping him drag it up the stairs. Let me just tell you: that thing was absurdly heavy. I still can't believe how heavy that TV was.

In the 25-plus years between that TV's debut and its eventual arrival in my living room, televisions have changed a lot. They are now flat and not particularly heavy, and, as you might expect, the screen technology has improved considerably as well. Some higher-priced models now offer 8K Ultra HD resolution, double the linear resolution of the current 4K standard. Does the picture on these 8K TVs really look twice as good as 4K? Is there a point of diminishing returns in image quality? For this week's Giz Asks, we reached out to a number of experts to find out.

Professor of Psychology at the University of California, Berkeley, with research interests in visual perception, attention and memory

The obvious answer is that it depends on (1) how big your TV is and (2) how far away you sit. The advantages of 8K are easiest to see on large TVs viewed at close range, and nearly invisible on small TVs viewed from a distance. The same is true, relatively speaking, of current 4K TVs versus 2K TVs (1920 × 1080 pixels, also known as "1080p"): the benefit is greater with larger screens and shorter viewing distances.

Professor of Vision and Computational Neuroscience, Massachusetts Institute of Technology

Let's get into some technical detail. Normal vision (what we usually call 20/20 vision) corresponds to being able to distinguish two points separated by one minute of arc. What does that mean? Your thumb held at arm's length is about two degrees wide, and one degree is 60 minutes of arc. So if you drew 120 evenly spaced dots across a line the width of your thumb, at arm's length you would only just be able to make out the individual dots. At any greater viewing distance, or with any more dots, you could no longer tell the dotted line from a continuous one. Translating that calculation to a TV: for a 60-inch-wide screen viewed from 5 feet away, our resolution limit is 4K. At that distance we can tell the difference between HD and 4K, but any increase beyond 4K (to 8K, for example) will go unnoticed. We would have to get much closer to the TV (a very unnatural thing to do) to distinguish a 4K screen from an 8K one. So unless you plan to have a very large screen, or plan to sit very close to it, 4K is sufficient. In most living-room setups, an upgrade from 4K to 8K is unlikely to be noticeable.
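The arithmetic above is easy to check yourself. Here is a minimal sketch in Python (the 60-inch width, 5-foot distance, and one-arcminute acuity limit come from the paragraph above; the function name is my own):

```python
import math

def pixel_arcmin(screen_width_in, horizontal_pixels, distance_in):
    """Angular size of one pixel, in minutes of arc, as seen by the viewer."""
    pixel_width = screen_width_in / horizontal_pixels
    return math.degrees(math.atan(pixel_width / distance_in)) * 60

# A 60-inch-wide screen viewed from 5 feet (60 inches) away:
for name, px in [("HD", 1920), ("4K", 3840), ("8K", 7680)]:
    size = pixel_arcmin(60, px, 60)
    verdict = "resolvable" if size >= 1.0 else "below the acuity limit"
    print(f"{name}: {size:.2f} arcmin per pixel ({verdict})")
```

With these numbers, HD pixels subtend about 1.8 arcminutes (resolvable), 4K pixels about 0.9 (right at the limit), and 8K pixels about 0.45 (invisible), which matches the conclusion that 4K is the resolution limit for this setup.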

Assistant Professor of Optometry and Vision Science, University of California, Berkeley

We have all experienced the limits of what our eyes can see. Maybe you've struggled to read the fine print on a food label, or had trouble picking out a friend's face in a crowd. The human visual system is remarkable, but it has a set of limitations that render certain aspects of the world effectively invisible. When it comes to display design, understanding those limitations is critical to understanding whether one display will look better than another.

The difference between 8K TVs and previous generations of displays comes down to an increase in the number of pixels. In a modern TV display, tiny individual pixels are arranged side by side in a grid, and each pixel emits a dot of color; together they make up the image you see. When you watch your favorite show, you want to see that image in rich detail without being distracted by the individual pixels. In other words, you want the image to be vivid but the pixels to be invisible.

Will an 8K TV deliver an improvement under those conditions? It depends on many factors: the display's contrast, the size of each pixel, the distance you tend to sit at, even the kind of image you're viewing and how fast it is changing. For example, if you watch from far enough away that every pixel is smaller than what your visual system can resolve, the pixels will be invisible regardless of whether the display is 4K, 8K, or 100K pixels wide. If you break out a tape measure and dust off your trigonometry, you can easily calculate the number of pixels per degree of visual angle for your own viewing setup. If you're already above 60 pixels per degree, you are unlikely to see an improvement from a same-sized 8K TV (for reference, one degree of visual angle is about the width of your thumb held at arm's length). On the other hand, if the panel is larger, or you want to sit close, a display with more pixels can in principle let you see more detail across a wider field of view. All of this, of course, assumes the source material was recorded at 8K or better in the first place.
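That tape-measure exercise can be sketched in a few lines of Python (the 60-pixels-per-degree threshold is the one quoted above; the example room measurements are hypothetical):

```python
import math

def pixels_per_degree(screen_width_in, horizontal_pixels, distance_in):
    """Average pixels per degree of visual angle for a flat screen viewed head-on."""
    # Total horizontal angle the screen subtends at the eye, in degrees.
    screen_angle = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return horizontal_pixels / screen_angle

# Hypothetical living room: a 4K set, 60 inches wide, viewed from 5 feet.
ppd = pixels_per_degree(60, 3840, 60)
print(f"{ppd:.0f} pixels per degree")  # ~72, already past the 60 ppd threshold
```

Note that on a flat panel the angular pixel density actually varies slightly across the screen; this uses the average over the full width, which is fine for a rough check.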

More pixels can certainly bring benefits, but the details of what you watch, where you watch it, and how far away you sit will ultimately determine whether the benefit is visible to you.

Martin Banks

Professor of Optometry, Vision Science, Neuroscience, and Psychology, University of California, Berkeley

There are existing recommendations for the resolution of TV displays, phone screens, and the like. These recommendations usually boil down to one rule: a pixel should subtend a visual angle of one minute of arc or less. "Minute of arc" is a technical term; despite the "minute," it has nothing to do with time, only space. Think of a pixel on the TV screen as a thin cone of light pointed at your eye; the minute of arc is the angle of that cone at your eye. High-definition TVs have about 2,000 pixels from left to right, 4K ultra-high-definition TVs about 4,000, and here we are talking about 8,000. Many people in my field think the one-minute-of-arc recommendation is flawed: the threshold should be smaller.

Viewing distance matters here too. Skipping over some math: if you have a 2K (HD) TV that is 3 feet tall, you need to sit 9.3 feet away or closer to get the benefit of its resolution; from 20 feet away you can't tell it apart from a TV with somewhat fewer pixels. With a 3-foot-tall 4K TV you have to be about 4.5 feet away or closer to tell the difference, and nobody sits that close. Go all the way up to 8K and you now have to enjoy your 3-foot TV from about two feet away. You would have to be a very rare viewer to take advantage of that.
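Those distances follow from the one-minute-of-arc rule. A minimal sketch in Python (the 3-foot screen height and the pixel counts come from the paragraph above; the exact results differ slightly from the rounded figures quoted there):

```python
import math

ARCMIN = math.radians(1 / 60)  # one minute of arc, in radians

def max_useful_distance_ft(screen_height_in, vertical_pixels):
    """Farthest distance (feet) at which one pixel still subtends >= 1 arcmin."""
    pixel_height = screen_height_in / vertical_pixels
    return pixel_height / math.tan(ARCMIN) / 12  # inches -> feet

# A 3-foot-tall (36-inch) screen at each resolution:
for name, rows in [("2K", 1080), ("4K", 2160), ("8K", 4320)]:
    print(f"{name}: sit within {max_useful_distance_ft(36, rows):.1f} ft")
```

This gives roughly 9.5 feet for 2K, 4.8 feet for 4K, and 2.4 feet for 8K, in line with the figures above. Doubling the pixel count halves the distance at which the extra resolution is still visible.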

Do you have a burning question for Giz Asks? Email us at tipbox@gizmodo.com.
