In this edition of You Asked: Are there any perfect LCD panels, and what’s an acceptable level of imperfection? Also, we’ll pit 8-bit plus FRC against true 10-bit, tell you whether you need HDMI 2.1 for Dolby Vision, and help you understand Samsung soundbar model codes.

LCD panel uniformity

TCL Q7 TV
Chris Hagan / Digital Trends

Gary Scott writes: I was wondering if you could take a look at the pictures of my screen and give me your thoughts on uniformity, etc. There seems to be an odd tint at the edges.


I looked at the submitted photos and saw tinting around the edges, which is pretty typical. Usually, slight darkening will be more pronounced at the corners. LCD panel uniformity has never been perfect, and even the very best will show that kind of fringe tinting. Based on what I could see in the photos, I’d say that, overall, it was a very clean panel. I’d probably give it an A+ for an LCD TV.

If it were OLED, I’d expect better, but for an LCD-based TV, yours looks very clean overall, with minimal dirty screen effect. Thankfully, this shouldn’t be noticeable during normal viewing, save for very specific instances. For example, you may notice it slightly when watching golf or ice hockey. Those are the times when you have huge patches of mostly uniform color.

Of course, I’m just looking at photos, which may underplay the effect; it’s hard to tell whether it looks more severe in person.

Frame rate control

LG G3
LG’s G3 OLED features a 10-bit panel. Zeke Jones / Digital Trends

Josh Collins writes: I was wondering what in the world frame rate control (FRC) is. I can’t find any good explanations beyond “You won’t notice the difference.”


When looking at TVs and computer monitors, you’ll usually see FRC in the context of the panel description. You may see that a TV has an 8-bit panel, an 8-bit panel with FRC, or a 10-bit panel. FRC stands for frame rate control. So, what’s the difference between a 10-bit panel and an 8-bit plus FRC panel?

Taking a step back, the 8-bit and 10-bit here refer to how many colors the TV or monitor can display. An 8-bit panel can display a little over 16 million colors, and a 10-bit panel can display a little over a billion colors. So, not a small difference.
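If you’re curious where those figures come from, it’s just powers of two: each pixel has red, green, and blue subpixels, so an 8-bit panel can address 2^24 combinations and a 10-bit panel 2^30. Here’s a quick check in Python; the addressable_colors helper is purely for illustration.

```python
# Addressable colors = (levels per channel) ** (number of channels),
# where levels per channel = 2 ** bits per channel.
def addressable_colors(bits_per_channel: int, channels: int = 3) -> int:
    return (2 ** bits_per_channel) ** channels

print(f"{addressable_colors(8):,}")    # 16,777,216    -> "a little over 16 million"
print(f"{addressable_colors(10):,}")   # 1,073,741,824 -> "a little over a billion"
```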

HDR requires that a display cover more than the 16 million colors that an 8-bit panel is limited to. So, for HDR, you want either a 10-bit panel, or 8-bit with FRC. Here’s the difference from a technical standpoint:

A 10-bit panel can display any of those 1 billion-plus colors from any of its pixels at any moment and at any refresh rate. So, it doesn’t matter if you’re watching 24 frames-per-second (fps) film content, 30Hz or 60Hz TV content, or 120Hz game content — the 10-bit panel can show any of those 1 billion-plus colors at the drop of a hat.

An 8-bit plus FRC panel can also cover all 1 billion-plus colors, but it does so by showing two different colors in rapid succession. This process is also called temporal dithering.

So, if we have a pixel on an 8-bit panel with FRC, and that pixel needs to display a shade of red that is beyond its native capabilities because it lives in that 10-bit color space, the panel will use frame rate control to switch between two shades of red so fast that your eye perceives them as the intended 10-bit shade of red. It is technical trickery, and it works great.

There is only one caveat: an 8-bit plus FRC panel has to use the refresh rate of the TV to pull this off. So, you could run into situations where you want ultra-high frame rate content and 10-bit color reproduction at the same time, and that just wouldn’t be possible. But I’m talking about trying to get 240Hz refresh rates with 10-bit color simultaneously. How often do you need that? Well, if you do, get a 10-bit native panel. Otherwise, don’t worry about it.
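If it helps to see the trick laid out, here’s a minimal sketch in Python. It’s purely illustrative; the frc_frames helper and the four-frame cycle are my own assumptions, not how any real panel driver is written. But it shows the idea: a 10-bit target is approximated by flipping between the two nearest 8-bit steps so the time-average lands on the target.

```python
# Illustrative temporal dithering (FRC): approximate a 10-bit level
# using only 8-bit levels shown on successive refreshes.
def frc_frames(target_10bit: int, num_frames: int = 4) -> list[int]:
    low = target_10bit // 4            # nearest 8-bit step at or below the target
    high = min(low + 1, 255)           # next 8-bit step up, clamped to the 8-bit ceiling
    remainder = target_10bit % 4       # how far the target sits above the lower step
    # Show the higher step on 'remainder' out of every 4 frames and the lower
    # step on the rest, so the time-average lands on the 10-bit target.
    return [high if i < remainder else low for i in range(num_frames)]

frames = frc_frames(513)               # a 10-bit red level between 8-bit steps 128 and 129
print(frames)                          # [129, 128, 128, 128]
print(sum(frames) / len(frames) * 4)   # ~513.0: the average recovers the 10-bit value
```

This also makes the caveat above concrete: the averaging only works because the panel spends successive refreshes on it, which is why the refresh rate gets tied up in the process.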

Dolby Vision HDMI ports

Sony Bravia X95K HDMI ports.
Sony X95K Dan Baker / Digital Trends

Alex Padilla writes: What approach would you recommend to utilize the Dolby Vision functionality of my Apple TV 4K and Panasonic UB820, given that my X95K has two HDMI ports that support Dolby Vision, one being eARC/ARC? I’m leery of using an HDMI female-to-male “Y cable.”


The approach I recommend is to banish from your mind the idea that only two of your HDMI ports support Dolby Vision. All four of your Sony X95K inputs support Dolby Vision. You may just need to go into the TV’s menu system and tell the TV to enable Dolby Vision for the HDMI port you want to use. I’m going to tell you how to do exactly that.

But first, in an effort to help as many folks as I can with this, I want to make it clear to everyone that you do not need HDMI 2.1 to enjoy Dolby Vision. (According to Google, this is a frequently searched question.) You do need an HDR-capable TV that supports Dolby Vision, but I can’t recall a TV I’ve reviewed in the last four or five years that supported Dolby Vision on only some of its HDMI ports.

You may, however, have to manually enable 4K HDR for your HDMI ports. This is frequently done automatically these days: you send the TV an HDR signal and it goes, “Ah, OK. We need to enable 4K HDR,” then just does it and displays a message saying so.

On some Sony TVs, however, you may have to enable what Sony calls “Enhanced format” manually, and this is where you might have gotten confused, if the confusion didn’t just stem from thinking you needed HDMI 2.1 for Dolby Vision.

Here’s how you do it, and I’ll explain why this can be confusing for everyone. Select the Settings button on your remote, or select the Settings cog in the upper-left of your screen. From there, go to Channels and inputs, then select External inputs. Scroll down and select HDMI signal format. On the X95K, you will see four options: Standard format, Enhanced format, Enhanced format (Dolby Vision), and Enhanced format (VRR).

This is where it starts getting confusing: you will only see Enhanced format (Dolby Vision) and Enhanced format (VRR) as options for two of your HDMI ports. That’s because those are the HDMI 2.1 ports, and when using those ports, you can have Dolby Vision or you can have VRR, but not both, so Sony makes it clear that you are choosing one or the other. However, doing things this way might make you think the other two ports, which offer only Enhanced format, won’t do Dolby Vision. That’s just not the case. Selecting Enhanced format will get you Dolby Vision on those ports.

No wonder Sony changed this in 2023 TVs.

Connect your Blu-ray player and Apple TV to HDMI 1, 2, or 4 (leaving your eARC port open) and turn on Enhanced format for the HDMI ports you want to use for HDR. You will then get the Dolby Vision you want, need, and deserve.

Samsung soundbar model numbers

Samsung HW-Q600C
Samsung HW-Q600C Samsung

Neman from Germany asks: I want to buy a Samsung HW-Q600C, but the problem is I am confused by the different names they have.

Samsung HW-Q600C
Samsung HW-Q600B
Samsung HW-Q610B
Samsung HW-Q610B/ZG

How can I understand the differences between these kinds of naming? What is the difference between these models? In your opinion, which one delivers a better sound quality?


The first one listed, the Samsung Q600C, is a 2023 model. The other three are 2022 models, and all three are essentially the same, but targeted at different regions or countries on our lovely planet.

In Samsung’s naming convention, the numbers after the Q refer to the series. In this case, you’re looking at the 600 series soundbars. The letter that follows references the model year. A is for 2021, B is for 2022, and C is for 2023.

As you can see, options two, three, and four on your list are all 600 series soundbars from the 2022 model year. I think 600 is used in the U.S., and 610 in the U.K. and likely other parts of Europe. The ZG is a tricky one; it’s essentially a country or region subcode. It doesn’t really matter, though, because the Q600B and Q610B will be the same soundbar.
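If it helps keep the codes straight, the pattern can be read off mechanically. Here’s a rough sketch in Python based only on the convention described above (series number, year letter, optional region suffix); the decode_model helper and its year table are my own illustration, not an official Samsung scheme.

```python
import re

# Rough decoder for Samsung soundbar codes like "HW-Q610B/ZG", based on the
# convention described above. The year map covers only the recent letters.
YEAR_LETTERS = {"A": 2021, "B": 2022, "C": 2023}

def decode_model(code: str) -> dict:
    match = re.match(r"HW-Q(\d+)([A-Z])(?:/(\w+))?$", code)
    if not match:
        raise ValueError(f"Unrecognized model code: {code}")
    series, year_letter, region = match.groups()
    return {
        "series": int(series),                  # e.g. 600 or 610
        "year": YEAR_LETTERS.get(year_letter),  # A=2021, B=2022, C=2023
        "region_suffix": region,                # e.g. "ZG"; None if not present
    }

for code in ("HW-Q600C", "HW-Q600B", "HW-Q610B", "HW-Q610B/ZG"):
    print(code, "->", decode_model(code))
```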

As for the difference between the Q600B and the latest Q600C? Honestly, I can’t find any meaningful ones. The Q600C may support Linear PCM if you wanted to send it an already decoded signal for some reason. But I think the only real difference is going to be the price, since the 2022 models are discontinued and a bit older, so they will be less expensive. If you can find a brand-new 2022 model from an authorized retailer so the warranty is intact, you might as well buy that one (that’s the “B” model). If not, the “C” model should be easy to find, but I wouldn’t expect better performance from it.
