On today’s You Asked: How low will the LG G4 OLED go this year? Ethernet or Wi-Fi for the best streaming TV quality? What’s the difference between mini-LED and Full Array Local Dimming? And is your Blu-ray player screwing up your HDR experience?

Difference between mini-LED and full-array local dimming?

Full-array vs. local dimming backlight
Hisense

Dawson writes: What’s the difference between mini-LED and full-array local dimming? He’s looking into buying a new TV, is doing some research, and sees a lot of folks getting excited about mini-LED over local dimming.

Mini-LED backlighting and full-array local dimming are different things, and they actually work together. In fact, it is very rare to find a mini-LED TV that does not also have local dimming. I have some videos explaining mini-LED in more detail, which I encourage you to go watch.

Full-array local dimming — sometimes abbreviated as FALD — means that a TV’s backlights are broken down into several zones — sometimes hundreds of them, sometimes thousands. Each of those individual zones can be dimmed or brightened as needed. Before local dimming, all the backlights in an LCD TV were on all the time; the only thing that created dark areas on the screen was the LCD cells closing and blocking light.

With local dimming, you can actually turn off the backlight in certain areas to get better contrast.

Mini-LED just means the LED backlights are much, much smaller, and that usually means there are a lot more of them. This can mean that you get more precise picture control and better contrast — but that is not always the case.

At any rate, mini-LED can be a more advanced backlight system, and it almost always also has full-array local dimming. So just because you get mini-LED doesn’t mean you give up full-array local dimming. It just means the LED backlights are smaller, and there are more of them.


Where are the headphones with local storage?

Sony WH-1000XM5
Digital Trends

Lovro writes: Why don’t modern wireless headphones come with built-in usable flash memory, or at least a microSD card slot? That way, we would be able to listen to high-quality lossless music directly from our headphones, without any need for a (lossy) Bluetooth connection. Seems like a no-brainer, but what might be the reason for this? Extra hardware required for playback? Inadequate interest? Something else? Are there any options out there? Any hope for this to become more common in the future?

The reason we don’t have such things anymore is … well, you’ve pretty much got it figured out. It’s a combination of those things.

Headphones with a built-in flash memory player would take more hardware and would be more expensive to make. But I also don’t think there’s enough interest for one of the big brands to pick up the idea. There are headphones with built-in MP3 players, but I haven’t seen any that offer built-in playback of uncompressed (or at least mathematically lossless) hi-res audio files.

Anyway, I love the idea, but it would seem that the bean counters at most consumer electronics companies believe that the vast majority of folks find Bluetooth audio “good enough” for their needs.


Any discounts coming to LG G4?

LG G4 OLED
Zeke Jones / Digital Trends

John writes: I am curious what price drops you expect for the LG G4 TV this Black Friday. The 77-inch is tempting, but it’s so expensive when it’s not discounted.

My best answer is that I suspect history will repeat itself. The 77-inch LG G3 currently sells for between $3,300 and $3,800 after being introduced at $4,500.

We may not see as steep a discount on the G4, as prices have generally gone up. But … I’d say you’re probably still looking at about $800 to $1,000 off if you can wait until this year’s holiday season. That’s not a guarantee. But if history is anything to go by, that’s a reasonable expectation.


Ethernet vs. Wi-Fi for streaming content

A man watches a movie in a studio with a Sonos Era 300 in a surround sound setup.
Zeke Jones / Digital Trends

Raul S. writes: My question revolves around using the Ethernet port on my TV versus using Wi-Fi. I have a Samsung S95B. From my research, I’ve come to realize the Ethernet port is limited to 100 Mbps. Is this a problem? I feel as though Wi-Fi could be faster than this. I realize streaming might not even use the full 100 Mbps, but is there any benefit to using a wired connection over wireless? I mostly do streaming and don’t have a NAS.

Based on my understanding of bandwidth usage across the major streaming services, the highest data consumption for 4K content is right around 7 to 8 gigabytes per hour, which translates to about 17 megabits per second. (Note the units — gigabytes per hour is a volume, while megabits per second is a speed.) Most streaming services offering 4K content recommend 25 Mbps as a minimum, and I think that’s to allow some headroom.
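If you want to sanity-check that conversion yourself, here’s a quick back-of-the-envelope sketch — assuming 7.5GB per hour as the midpoint of that range and decimal units, not any specific service’s actual bitrate:

```python
# Rough conversion from streaming data volume (GB per hour) to average bitrate (Mbps).
# Assumes 7.5GB/hour as a midpoint and decimal units (1GB = 8,000 megabits).

def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    megabits = gb_per_hour * 8 * 1000   # gigabytes -> megabits
    return megabits / 3600              # spread over one hour of seconds

print(round(gb_per_hour_to_mbps(7.5), 1))  # ~16.7 Mbps, i.e. roughly 17
```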

There are rumors that some streaming services could peak as high as 30 or even 40 Mbps, but I haven’t validated any of those claims. It doesn’t matter much, though: even at those theoretical speeds — which I imagine would be occasional peaks — you still aren’t going to surpass 100 Mbps with any commercial video streaming service.

So, in your case, I don’t think you stand to benefit from using the Ethernet connection for a bandwidth bump. However, Ethernet connections can be more stable. There are a lot of factors that play into not just how fast your Wi-Fi connection is, but how reliably fast it is. So it could be that, in your home, Ethernet provides more stability and reliability.

I think the folks who stand to benefit most from Ethernet connections are those streaming uncompressed 4K Blu-ray rips from a NAS, which might need speeds in excess of 100 Mbps. In those cases, if the TV’s Ethernet port maxes out at 100 Mbps, then fast Wi-Fi is probably the better choice for enjoying buffer-free, ultra-high-res, uncompressed content like that.


Blu-ray settings for OLED

Magnetar UDP900 Disc Player
Zeke Jones / Digital Trends

Bobby Rivers writes: I just recently got a Sony A95L, coming from an LG G1. I have the Sony paired with a Panasonic UB820 Blu-ray player. In the settings for the Blu-ray player, it asks for the panel type, and I have, of course, selected OLED. It then tone maps the HDR to 1,000 nits for the brightness setting. My question is: with all these newer OLEDs on the market now capable of putting out more than 1,000 nits of brightness, should I still select OLED as the display type? And by doing so, am I selling myself short of my display’s capabilities?

Panasonic makes great Blu-ray players and has killer chops when it comes to video processing. However, when you have a high-end TV with some of the best video processing in the world, there is no need to have a source device do HDR tone mapping or upscaling. The exception to that may be the Xbox and PlayStation 5. But for Blu-ray players and devices like the Nvidia Shield, there’s no need when you have something like the A95L.

If you can turn that feature off entirely, do it. If not, tell it you have an LED TV. Whatever it takes to get it to stop doing any HDR tone mapping. Let your TV do that work.
