MediaTek chip illustration on the Samsung Galaxy S23 Ultra.
Nadeem Sarwar / Digital Trends

You may have already tried a bunch of generative AI apps on your phone. OpenAI’s ChatGPT and HeyPi are examples of chatty AI apps, while the likes of Runway ML let you create AI-generated videos right on your phone.

But so far, almost every generative AI app has relied on cloud-based computing, which means all the magical AI processing happens in the cloud, just the way Xbox hardware in Microsoft’s data centers does the heavy lifting when you stream a console game on your phone or tablet.

MediaTek wants to change that by enabling on-device generative AI computing right on your smartphone. That ambitious moonshot will materialize with MediaTek’s next flagship chipset, which is set to appear inside Android phones by the end of this year. The company is aided in its efforts by Facebook parent Meta, which is offering its Llama 2 model as the foundation for generative AI apps.

The chipmaker is touting a whole gamut of benefits from its on-device generative AI processing approach, which includes “seamless performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operation cost.”

Speeding up generative AI on phones

White Xiaomi POCO F5 and bluish-green OnePlus Nord 3 held in hand over a heap of Android smartphones including OnePlus 11, Pixel 6a, Samsung Galaxy Z Flip 4, Galaxy S23.
Tushar Mehta / Digital Trends

The upcoming top-end MediaTek chip will employ three key tactics to enhance the generative AI experience for smartphones. First, it will feature a dedicated software stack optimized to run Llama 2, an open-source large language model developed by Meta that seeks to challenge OpenAI’s GPT and Google’s PaLM 2 models.

Llama 2, which was released in July, is “free for research and commercial use.” So far, not many apps that rely on the Llama 2 language model have arrived, as GPT-based AI apps are currently the talk of the town. But Meta has been actively courting players in the smartphone industry, including MediaTek’s archrival, Qualcomm.

Just like MediaTek, Qualcomm also inked a deal with Meta to showcase on-device processing for Llama-based applications on smartphones powered by its flagship chips starting in 2024. Both companies are also targeting other application areas, such as vehicles, XR hardware, smart home devices, and more, with their device-local generative AI efforts facilitated by top-tier silicon.

Meta's Llama 2 language model on a phone.
Nadeem Sarwar / Digital Trends

On its upcoming AI-friendly flagship chip, MediaTek will also use a souped-up APU (AI Processing Unit) that relies on “transformer backbone acceleration.” Transformers are the neural network architecture underpinning large language models such as GPT (Generative Pre-trained Transformer), which gave birth to products such as ChatGPT and Microsoft’s Bing Chat.
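
To get a rough feel for what running Llama 2 involves under the hood, here is a minimal sketch that loads the model with the Hugging Face Transformers library on a desktop machine. It assumes you have been granted access to the gated meta-llama/Llama-2-7b-chat-hf weights; MediaTek and its partners will ship their own optimized, on-device runtimes rather than anything like this exact code.

# Illustrative sketch only: runs Llama 2 7B Chat via Hugging Face Transformers.
# Assumes access to the gated "meta-llama/Llama-2-7b-chat-hf" repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Llama 2 Chat models expect the [INST] ... [/INST] prompt format.
prompt = "[INST] Explain in two sentences why on-device AI matters. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(output[0], skip_special_tokens=True))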

Finally, MediaTek’s new chip will also dip into the phone’s DRAM to enhance the user experience of Llama-based generative AI apps. DRAM, short for Dynamic Random Access Memory, is the high-speed, low-latency memory that stores an app’s working data. The more DRAM in your phone, the more apps you can run in the background without any issue. There’s a reason smartphone makers like OnePlus have fitted as much as 24GB of DRAM on their phones.
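
All that memory matters because a language model’s weights have to sit in RAM while it runs. As a rough, back-of-the-envelope illustration (assuming the 7-billion-parameter Llama 2 variant and ignoring activations, caches, and everything else the phone is doing), the footprint shrinks quickly when the weights are stored at lower precision:

# Rough, illustrative math only: memory needed just to hold the weights
# of a 7-billion-parameter model at a few common precisions.
PARAMS = 7_000_000_000  # Llama 2 7B parameter count

for label, bytes_per_weight in [("FP16", 2), ("INT8", 1), ("INT4", 0.5)]:
    gib = PARAMS * bytes_per_weight / (1024 ** 3)
    print(f"{label}: ~{gib:.1f} GiB for the weights alone")

At 16-bit precision that works out to roughly 13GB just for the weights, which is why aggressive compression of the model and generous amounts of DRAM both matter for on-device LLMs.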

What these AI upgrades mean for you

Poster with the MediaTek logo in orange.
Joe Maring / Digital Trends

With the aforementioned tweaks in place, MediaTek says its upcoming AI chip will enhance the LLM and AIGC (Artificial Intelligence-Generated Content) experience by shifting the bulk of the processing requirements to the local hardware.

As for the kind of tasks you can expect to speed up, that will depend on the capabilities of Meta’s Llama 2 model and the kind of apps that are built atop it. Llama 2 is a text-based natural language model, so you can expect it to pull the same kind of tricks as ChatGPT or Bard: it can process queries and provide answers based on its training data, summarize or expand text, and more.
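
For a concrete sense of what those text tasks look like to an app developer, here is what a summarization request could look like in the chat template Llama 2’s chat variants were trained on. The system message and the paragraph being summarized are made-up placeholders for illustration, not anything from MediaTek or Meta.

# Illustrative only: a summarization prompt in the Llama 2 chat template.
# The system message and user text below are hypothetical placeholders.
prompt = (
    "[INST] <<SYS>>\n"
    "You are a concise assistant that answers in plain English.\n"
    "<</SYS>>\n\n"
    "Summarize the following in one sentence: On-device AI keeps your "
    "data on the phone instead of sending it to a server, which helps "
    "with privacy and latency. [/INST]"
)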

As for MediaTek, this won’t be the first AI partnership of its kind. In July this year, the company inked a deal with the China arm of Unity — which develops the eponymous game development engine — to explore generative AI applications in the gaming segment.

MediaTek expects its new chip with Llama 2 apps to arrive in smartphones by the end of the year, so we don’t have that much longer to wait to see what this all looks like in an actual product.
