The next big thing is always around the corner in the tech world, and artificial intelligence is the latest and most exciting innovation. Sure, AI can power ChatGPT and other chatbots, but it can also make devices more efficient, learn our habits to tailor our tech to our needs, and improve everything from email to photography. MediaTek is looking to take that to the next level through its partnership with Meta, Facebook’s parent company, using the Meta Llama 2 LLM (Large Language Model) to bring on-device generative AI to our pockets and homes.
That may sound like a string of buzzwords, but here’s what it means in practice.
Right now, artificial intelligence (AI) mostly runs in the cloud: the heavy processing happens on remote servers, so performance is limited by your internet connection.
Every MediaTek-powered 5G smartphone System on Chip (SoC) currently shipping includes an APU (AI Processing Unit) designed to handle AI features such as AI Noise Reduction, AI Super Resolution, AI Motion Estimation Motion Compensation (MEMC), and more.
Now that MediaTek is working with the Meta Llama 2 LLM, it can bring even more AI power to the devices themselves, allowing your phone or other connected device to run the AI locally rather than in the cloud.
The payoff is that anything running this setup gets “better performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operation cost.”
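To make the cloud-versus-local tradeoff concrete, here is a minimal, purely hypothetical sketch of how an app might prefer an on-device model and fall back to the cloud. None of these class or function names come from MediaTek’s or Meta’s actual SDKs; they are illustrative stand-ins.

```python
from dataclasses import dataclass

# Hypothetical sketch: LocalLlamaModel and CloudEndpoint are illustrative
# stand-ins, not real MediaTek or Meta APIs.

@dataclass
class LocalLlamaModel:
    """Stand-in for an on-device LLM running on the phone's APU."""
    def generate(self, prompt: str) -> str:
        # Runs entirely on the device: no network round trip, and the
        # prompt never leaves the phone (latency and privacy wins).
        return f"[local] response to: {prompt}"

@dataclass
class CloudEndpoint:
    """Stand-in for a remote inference server."""
    reachable: bool
    def generate(self, prompt: str) -> str:
        if not self.reachable:
            raise ConnectionError("no network available")
        return f"[cloud] response to: {prompt}"

def answer(prompt: str, local: LocalLlamaModel, cloud: CloudEndpoint) -> str:
    """Prefer on-device inference; use the cloud only as a fallback."""
    try:
        return local.generate(prompt)  # works even with no connectivity
    except RuntimeError:
        return cloud.generate(prompt)  # e.g. model too large for this device

# Even with no network, the local path still answers.
print(answer("hello", LocalLlamaModel(), CloudEndpoint(reachable=False)))
```

The design choice the quote describes falls out naturally: because the local path is tried first, connectivity, latency, and privacy all stop depending on a server.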
On-device AI may seem like a chicken-and-egg problem: you need powerful AI software, but you also need hardware that can handle it. MediaTek is setting things up so that when the AI is ready for prime time, your phone and other connected devices will be, too!
MediaTek’s latest top-tier chip, debuting later this year, will have improved software designed for running the Meta Llama 2 LLM. It will also include an even better APU with Transformer backbone acceleration, making it faster while using less memory, which, in turn, will boost the performance of the Meta Llama 2 LLM and Artificial Intelligence-Generated Content (AIGC).
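To see why “using less memory” matters so much for on-device LLMs, a bit of back-of-the-envelope arithmetic helps. The 7-billion-parameter figure is the smallest published Llama 2 size; the bytes-per-weight values are standard numeric precisions, not anything specific to MediaTek’s hardware.

```python
# Back-of-the-envelope memory footprint for LLM weights on a phone.
# Llama 2's smallest published variant has ~7 billion parameters; the
# bytes-per-weight figures below are common precisions (fp16, int8,
# 4-bit quantization), not MediaTek-specific numbers.

PARAMS = 7_000_000_000  # Llama 2 7B

def weight_memory_gb(bytes_per_param: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return PARAMS * bytes_per_param / 1e9

print(f"fp16: {weight_memory_gb(2):.1f} GB")    # 14.0 GB - too big for most phones
print(f"int8: {weight_memory_gb(1):.1f} GB")    #  7.0 GB
print(f"int4: {weight_memory_gb(0.5):.1f} GB")  #  3.5 GB - plausible on a flagship
```

Weights at full 16-bit precision would swamp a typical phone’s RAM, which is why hardware acceleration paired with memory-reduction techniques is what makes on-device generative AI feasible at all.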
These improvements will make it easier to create applications for on-device Generative AI.
Using the Meta Llama 2 LLM along with its latest APUs and its NeuroPilot AI platform, MediaTek plans to build a “complete edge computing ecosystem designed to accelerate AI application development on smartphones, IoT, vehicles, smart home, and other edge devices.”
MediaTek says it expects Meta Llama 2 LLM-based AI applications to become available for smartphones powered by its next-generation flagship SoC, so we should start seeing this capability by the end of the year. This will be an exciting development to watch!