LCD pixels: what are chess-board pixel fill patterns called?

I recently bought a 4K HDR 55-inch TV (Vizio P55-F1). Here is a photo of brightly lit pixels: [photo of fully lit, uniform square pixels, from rtings.com]. The picture shows that the pixels are clearly square and uniform. Often when I look at a uniform color on the screen (think PlayStation menus, but in real-life video as well), I see that on various mid-brightness spans of the same color the pixels are lit in a chessboard pattern instead of uniformly. [photo of a PS4 menu] [close-up of the same PS4 menu] In the last image the white letters look more or less uniform, but the grey background definitely is not. I drew two green lines; the top one goes through green subpixels that are brightly lit, and the bottom one goes through green subpixels that are dim. The same goes for the blue lines, and for the red subpixels. When I output 4K content to a 4K screen, I expected it to be displayed pixel by pixel, but that is obviously not the case (or does the PS4 generate these patterns by itself?). I couldn't find anything about it on the internet because I could not find the proper name for this effect. I must add that I recently saw a TV that does not have this effect, but I don't remember the model. What is the name of this approach, so I can read about it and understand the reason behind it? Bonus questions: what is the rationale? Is a uniformly lit TV better or worse? Is the pattern generated at the source of the signal (the PS4) or on the TV?

asked Oct 17, 2018 at 8:11

\$\begingroup\$ I doubt that the menu is 4K content, so your TV is likely upscaling and dithering it. Check your TV's settings for a game mode or similar; almost all TVs these days do fun things to get a better image, especially when HDR is needed and they can't get it with just the colour resolution of a single pixel. \$\endgroup\$

Commented Oct 17, 2018 at 8:14

\$\begingroup\$ Bayer filtering? \$\endgroup\$

Commented Oct 17, 2018 at 8:17

\$\begingroup\$ While the output sent to the screen is UHD, it's entirely possible that the content being displayed is an image file at less than UHD resolution (maybe 1080p), so it gets scaled up in the frame buffer of the PS4 before being sent to the screen. \$\endgroup\$

Commented Oct 17, 2018 at 9:27

\$\begingroup\$ Also, this doesn't seem to be about electronic design - it's not a bad question at all, but it should probably be migrated to a different stack, though I'm not sure which is the best fit. \$\endgroup\$

Commented Oct 17, 2018 at 9:28

\$\begingroup\$ @PlasmaHH, I have explicitly switched the PS4 from 4K to 2K and back; the effect is the same. I haven't tried wiring it through the non-HDR HDMI input (the special low-lag gaming SDR input), and I haven't tried disabling HDR directly; I will try tomorrow. I have also tested this with an Ultra HD Grand Tour episode from the Prime Video app built into the TV. After it gathers its wits, it shows true 4K, the TV explicitly said so, and I had already found good scenes to distinguish 2K from 4K. I also did the same with a YouTube 4K video. I doubt it is upscaling; I thought these were the "fun things" and wanted to find the name for them. \$\endgroup\$

Commented Oct 17, 2018 at 9:52

2 Answers

\$\begingroup\$

This is due to the panel control driver, and you probably won't find much info about it because it is part of each manufacturer's secret sauce.

This is definitely the TV doing this, not the PlayStation.

The reason behind it is probably that the driver for each pixel can only produce a limited number of brightness levels. If it is, for example, 4 bits per pixel, you get 16 brightness levels per pixel for each color.

Now, 16 levels per color is not a lot; it would give only 4'096 (16³) different possible colors in total.

But if you set different levels on alternating pixel rows, you can drastically increase the number of possible colors, because your eye cannot resolve single pixels anyway and only sees their average.

By controlling this effect over just two rows, you go from 4 bits to 2x 4 bits per color, which gives you 16'777'216 different possible colors while using the exact same amount of data per pixel as before.

So you go from 4'096 colors to 16+ million, while driving the panel with the same amount of data.
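To make the idea concrete, here is a minimal Python sketch of this kind of spatial dithering: an 8-bit target level is approximated on a hypothetical 4-bit panel by alternating two adjacent codes in a checkerboard, which is exactly the chess-board pattern visible in the close-up photo. The 4-bit depth and the function name are illustrative assumptions, not the actual Vizio driver behavior.

    def dither_checkerboard(target_8bit, width, height):
        """Return a width x height grid of 4-bit codes whose spatial average
        approximates target_8bit / 255."""
        exact = target_8bit * 15 / 255               # scale 0..255 onto the 0..15 range
        low, high = int(exact), min(int(exact) + 1, 15)
        # A plain checkerboard only reproduces the halfway point between two
        # codes; real drivers use finer ordered-dither matrices and often add
        # temporal dithering on top.
        use_high_on_odd = (exact - low) >= 0.5
        grid = []
        for y in range(height):
            row = []
            for x in range(width):
                odd = (x + y) % 2 == 1
                row.append(high if (odd and use_high_on_odd) else low)
            grid.append(row)
        return grid

    # A mid-grey of 128/255 maps to ~7.5 in 4-bit terms, so the panel shows a
    # chessboard of 7s and 8s -- the pattern visible in the close-up photo.
    for row in dither_checkerboard(128, 8, 4):
        print(row)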

It is also probably cheaper to drive individual pixels with 4 bits than with 8 bits.

Since a 4K panel has a large number of pixels, this technique is likely used to reduce the bandwidth needed to drive the panel as well as the driver cost, since you need half as much data for the same result.

I took 4 bits per pixel only as an example; I do not know how many bits per pixel are actually being used, but 4 is a likely possibility.

Since the TV is advertised at 1 billion+ colors, it is possible that individual pixels are driven with 5 bits; with two rows that gives 10 bits per color, or 1024³ ≈ 1.07 billion colors.

Or each pixel is driven with 4 bits and the effect is applied over 3 rows, which is more likely, as the per-pixel bit depth is usually a power of 2.

Following up on the comments, it seems this is called "color dithering" (or spatial dithering).

answered Oct 17, 2018 at 10:19

\$\begingroup\$ Hey, Damien. Your answer is useful and I want to accept it, but it does not answer the main question - the name of the approach I observed. After searching more and more, I came to the understanding that this approach is indeed "spatial dithering", though that does not give a good tag to search for. \$\endgroup\$

Commented Dec 7, 2018 at 5:24

\$\begingroup\$ Try searching for "color dithering 6 bits"; it turns up quite a few articles about it. lifewire.com/lcd-displays-and-bit-color-depth-833083 \$\endgroup\$

Commented Dec 8, 2018 at 7:23

\$\begingroup\$ I still want to mark your answer as accepted, as soon as you add the word "dithering" to the answer itself, meaning that it is the name of the behavior I observe. \$\endgroup\$

Commented Dec 11, 2018 at 2:11

\$\begingroup\$ This would help whoever stumbles on this question in the future while searching for "chess-board patterns", at least. \$\endgroup\$

Commented Dec 11, 2018 at 2:11

\$\begingroup\$ Done @YanTS :). \$\endgroup\$

Commented Dec 11, 2018 at 3:37

\$\begingroup\$

Short Answer

"Subpixel antialiasing" may be the term you are looking for.

Longer Answer

Recognizing this is an old question, I'll be brief.

  1. Dithering is usually used as a way to increase the apparent bit depth. This does not appear to be a dithering issue.
  2. Color subsampling (4:2:2) means the color information is at half the resolution of the luminance information.
  3. Subpixel antialiasing is an "intentionally misaligned", independent use of the individual R, G, B subpixels to improve apparent resolution (see the sketch after this list).
  4. Upscaling can create artifacts.
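For item 3, here is a minimal Python sketch of the core idea behind subpixel antialiasing on an RGB-striped panel. The function and the sample data are illustrative assumptions; real implementations (ClearType, for example) add filtering to suppress the color fringes this naive version produces.

    def subpixel_render_row(samples):
        """samples: grayscale intensities sampled at 3x the pixel resolution
        (0..255). Returns one (R, G, B) pixel per group of three samples."""
        assert len(samples) % 3 == 0
        pixels = []
        for i in range(0, len(samples), 3):
            # Each subpixel takes the sample that falls on its own physical
            # position, so horizontal edges can land on subpixel boundaries.
            pixels.append((samples[i], samples[i + 1], samples[i + 2]))
        return pixels

    # A sharp vertical edge whose transition lands inside the middle pixel:
    row = [0, 0, 0, 0, 0, 255, 255, 255, 255]
    print(subpixel_render_row(row))
    # [(0, 0, 0), (0, 0, 255), (255, 255, 255)] -- only the blue subpixel of
    # the middle pixel is lit: an "intentionally misaligned" use of subpixels.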

4K Dirty Secret (a tangent)

I work in Hollywood. Many of the "4K" features are actually completed in 2K and then upscaled to 4K. By and large these upscalings are done using state-of-the-art technologies, and with shot-by-shot human review and adjustment.

But in studies conducted by ILM and others some years ago, it was found that greater bit depth and a wider color gamut do much more for image fidelity than simply higher spatial resolution, especially for screens smaller than about five feet wide (viewed from about eight feet away).

Once the screen pixel size is smaller than about one arc minute of visual angle, it exceeds the theoretical resolving capability of the human visual system. This means that, assuming you are 8 feet away from the screen, a 2K, 60-inch-diagonal screen is at this limit, so smaller displays gain no real benefit from being 4K at an equivalent viewing distance.
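As a sanity check on that claim, here is a small Python calculation; the function name is mine, the numbers (60-inch 16:9 screen, 8-foot viewing distance) come from the paragraph above.

    import math

    def pixel_angle_arcmin(diagonal_in, horiz_pixels, distance_in, aspect=(16, 9)):
        """Angular size of one pixel, in arc minutes, for a flat 16:9 screen."""
        w, h = aspect
        width_in = diagonal_in * w / math.hypot(w, h)   # horizontal screen width
        pitch_in = width_in / horiz_pixels              # width of a single pixel
        return math.degrees(math.atan2(pitch_in, distance_in)) * 60

    print(pixel_angle_arcmin(60, 1920, 8 * 12))   # ~0.98 arcmin: right at the limit
    print(pixel_angle_arcmin(60, 3840, 8 * 12))   # ~0.49 arcmin: finer than 20/20 vision resolves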

Most people (20/20 vision) would be unable to tell the difference between a 60" 2K and 4K display when standing 8 feet away, all other things being equal. (Retailers turn on image-decoration nonsense like motion interpolation and boosted contrast to make people think there's a difference.)

Where this can break down is with antialiasing and scaling artifacts, and such artifacts may be more noticeable to some people on a 2K screen. Someone with very good vision (20/12) might be able to tell the difference between a 2K and 4K display, as described, but only with content that was sensitive to that minuscule spatial difference, e.g. due to antialiasing or scaling artifacts.

The Point

When evaluating content on a 4K display, don't be surprised that there are things like scaling artifacts, or that content seems "softer" than you might expect.

Rule of Thumb (related to the above tangent)

2K is usually sufficient when the viewing distance is greater than about 1.6 times the diagonal screen size.
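A quick check of that rule of thumb (my own arithmetic, not from the original post): at 1.6 diagonals away, a 1920-wide 16:9 panel has roughly one-arc-minute pixels no matter how large the screen is, because the diagonal cancels out.

    import math

    diag = 1.0                              # any unit; it cancels out
    width = diag * 16 / math.hypot(16, 9)   # horizontal width of a 16:9 screen
    pitch = width / 1920                    # width of one 2K pixel
    angle = math.degrees(math.atan2(pitch, 1.6 * diag)) * 60
    print(round(angle, 2))                  # ~0.98 arc minutes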

More Specific and Granular: The values below are based on standard 20/20 vision, in a typical indoor viewing environment, with "appropriate" or optimized content.