There are two kinds of people when it comes to technology. Those who want their thing — be it a phone or car or TV — to do as much as possible in a single package. And those who’d prefer a more dedicated device, something that does fewer things, but does them very well.

Those in the latter group lost out years ago. And nowhere is that more apparent than in the television space, as evidenced at CES 2024 by the sheer onslaught of AI. You couldn’t read a press release without running into a section that included AI. You couldn’t make it through a press conference without an executive explaining how great their AI is.

Of course, AI is just an acronym. In this context, “artificial intelligence” really just means new ways of interpreting and processing data. And as we learned at the various Las Vegas events as manufacturers unveiled their 2024 TVs, AI is being implemented in a number of ways.

Scott Ramirez, TCL vice president of product marketing and development, at CES 2024. Phil Nickinson / Digital Trends

“Better processing makes a better picture,” Scott Ramirez, TCL’s vice president of product marketing and development, said at his company’s press conference. “It just does.”


He’s not wrong. And that sort of AI-powered upscaling has been in use for years. It took a little while to make it into the actual televisions (graphics chipmaker Nvidia had it baked into its Shield TV streaming device back in 2019), but now it’s here to stay. And in 2024, TCL is implementing that sort of AI-powered upscaling across the board with a trio of processors: the AIPQ chip in the S5 and Q6 televisions, the AIPQ Pro in the QM7 and QM8, and the AIPQ Ultra in the massive 115-inch QM89.
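TCL hasn’t spelled out exactly what its AIPQ chips do under the hood, but the general recipe for AI upscaling is well established: do a cheap resize to the panel’s native resolution, then let a small trained network add back the detail the resize smeared away. Here is a rough, purely hypothetical sketch in PyTorch; the model, layer sizes, and weights are illustrative, not anything TCL has published.

```python
# Hypothetical sketch of learned upscaling (not TCL's actual AIPQ pipeline).
# A tiny SRCNN-style network: upscale with a cheap resize first, then let a
# small convolutional model sharpen the result. Weights here are random;
# a real TV would run a trained model on dedicated silicon.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=9, padding=4),  # extract coarse features
            nn.ReLU(),
            nn.Conv2d(32, 16, kernel_size=1),            # mix features cheaply
            nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=5, padding=2),  # reconstruct RGB detail
        )

    def forward(self, frame, scale=2):
        # A bicubic resize gets the frame to the panel's resolution...
        upscaled = F.interpolate(frame, scale_factor=scale,
                                 mode="bicubic", align_corners=False)
        # ...and the network predicts a residual correction on top of it.
        return (upscaled + self.net(upscaled)).clamp(0, 1)

# A fake 1080p frame upscaled to 4K dimensions.
frame = torch.rand(1, 3, 1080, 1920)
print(TinyUpscaler()(frame).shape)  # torch.Size([1, 3, 2160, 3840])
```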

Many of the CES unveilings went beyond the mere upscaling of content, which at this point is table stakes.

“Our Hi-View Engine chipsets are ushering in a new era of user experience,” David Gold, vice president of Hisense International and president of Hisense Americas, said at his company’s CES press conference. “Our commitment extends beyond superior picture quality and performance. It is about tailoring experiences for the entire family.”

Hisense USA President David Gold at CES 2024. Phil Nickinson / Digital Trends

Hisense USA marketing head Doug Kern told much the same tale.

“These features, technologies, and images on the screen need to work together,” Kern said. “It’s a state-of-the-art AI chipset that utilizes deep learning and an array of technologies to refine the viewing experience.”

But the AI processing has come a long way from analyzing the picture as a whole and simply clearing up blurry patches. “Through local tone mapping optimization, it assesses hundreds of thousands of image areas,” Kern said. “Its face detection feature recognizes faces in the image and adjusts for a more natural appearance.”

Yep. Your next Hisense TV might have facial recognition built in. For Hisense, it’s sort of the opposite of a camera: instead of recognizing your face, it finds the faces on the screen and adjusts the picture around them for a more natural look. (Or, if you really want to extrapolate, you could imagine the TV recognizing, say, Tom Cruise, and then recommending his other films.)
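Hisense hasn’t detailed how the Hi-View Engine pulls this off, but the basic pattern behind face-aware picture tweaks is straightforward: find the faces in a frame, then process those regions differently from the rest of the image. Here is a toy sketch with OpenCV, where the stock face detector and the gamma tweak are stand-ins, not Hisense’s actual processing.

```python
# Toy sketch: detect faces in a frame and lift their midtones slightly,
# leaving the rest of the picture alone. Purely illustrative -- not the
# Hi-View Engine's actual local tone mapping.
import cv2
import numpy as np

def adjust_faces(frame_bgr, gamma=0.9):
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Precompute a gamma lookup table (gamma < 1 brightens midtones a touch).
    table = np.array([(i / 255.0) ** gamma * 255 for i in range(256)]).astype("uint8")

    out = frame_bgr.copy()
    for (x, y, w, h) in faces:
        out[y:y + h, x:x + w] = cv2.LUT(out[y:y + h, x:x + w], table)
    return out

frame = cv2.imread("frame.png")  # any still pulled from video
if frame is not None:
    cv2.imwrite("adjusted.png", adjust_faces(frame))
```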

LG took things even further, weaving a tale of AI that’s built into the company’s entire way of being. Not just a feature in a single product.

“The AI brain of LG that we envision is a powerful engine with orchestrated processes,” CEO William (Joowan) Cho said at the press conference that kicks off every CES. “It starts from focusing customers’ needs through interactive conversation, or contextual understanding, like behavior patterns and emotions. And ultimately the AI brain generates optimal solutions to prompt tangible actions by orchestrating physical devices. So we call this, ‘Orchestrated intelligence.’”

That’s lofty stuff. In addition to the usual sort of AI processing we’ve already talked about, LG is going so far as to identify different users by their voices and then apply the proper user profile. Or, as vice president of home entertainment content and services Matthew Durgin said, the new Alpha 11 processor “allows LG TVs to recognize you.”
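LG hasn’t published how the Alpha 11 matches a voice to a profile, but speaker identification generally boils down to turning a short voice clip into a numeric “voiceprint” (an embedding from a trained speaker model) and comparing it against the voiceprints enrolled on the TV. Here is a bare-bones, hypothetical sketch of that matching step, with placeholder vectors standing in for real embeddings.

```python
# Hypothetical sketch of voice-based profile matching. In a real system the
# embeddings would come from a trained speaker-recognition model; here they
# are placeholder vectors, just to show the matching step.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_profile(voice_embedding, enrolled, threshold=0.75):
    """Return the best-matching profile name, or None if nobody is close enough."""
    best_name, best_score = None, threshold
    for name, profile_embedding in enrolled.items():
        score = cosine_similarity(voice_embedding, profile_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Pretend voiceprints for two household members, plus an incoming utterance.
rng = np.random.default_rng(0)
enrolled = {"parent": rng.normal(size=128), "kid": rng.normal(size=128)}
utterance = enrolled["kid"] + rng.normal(scale=0.1, size=128)  # noisy sample of "kid"
print(match_profile(utterance, enrolled))  # prints "kid"
```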

A representation of Samsung’s NQ8 AI Gen3 processor at CES 2024. Phil Nickinson / Digital Trends

Samsung is also taking a holistic approach, applying AI across its entire portfolio. We sat down at CES with Jay Kim, executive vice president of Samsung’s Visual Display Business, to discuss AI as a whole and to get a crash course in how Samsung devices use traditional processors in conjunction with graphics processors and neural processing units (NPUs).

The end result will be seamless, he said.

“I think that ordinary consumers will not know if a [neural processing unit] is at work or not,” Kim said through an interpreter. “So, I think it will be through their experience that they understand the benefits offered by an NPU. So, an exceptional new experience that they have in their interaction with a device, I think that will help them realize, indirectly, the power of NPUs.”

As with most (if not all) Samsung devices, look for the AI in one product to help improve another.

“If you go to a Samsung Health application,” Kim said, “maybe you’re working out, you’re doing squats or pushups. The AI will be able to make assessments and see if you’re doing it properly. Maybe if you’re not doing it properly, it’ll help you to sort of fix your posture and so forth.”
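Samsung hasn’t said how that workout coaching would be built, but the usual approach is to run a pose-estimation model on the camera feed and then apply simple geometry to the resulting joint positions. Here is a made-up sketch of the geometry half, checking squat depth from the knee angle; the keypoints and thresholds are invented for illustration.

```python
# Illustrative sketch of the kind of check an AI workout coach might run.
# Assumes a pose-estimation model has already produced 2D joint positions;
# the joints and thresholds here are made up for demonstration.
import math

def joint_angle(a, b, c):
    """Angle at point b (in degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    mag = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def assess_squat(hip, knee, ankle, depth_threshold=100.0):
    """A squat rep 'counts' if the knee bends past the threshold angle."""
    angle = joint_angle(hip, knee, ankle)
    if angle <= depth_threshold:
        return f"Good depth (knee angle {angle:.0f} degrees)"
    return f"Go lower (knee angle {angle:.0f} degrees)"

# Two example poses in pixel coordinates: nearly standing vs. a deep squat.
print(assess_squat(hip=(300, 400), knee=(310, 520), ankle=(305, 640)))  # Go lower
print(assess_squat(hip=(300, 500), knee=(380, 520), ankle=(305, 640)))  # Good depth
```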

Better living through artificial intelligence. It’s coming.
