MediaTek Brings On-Device Generative AI | LLM | Turtles AI

MediaTek Brings On-Device Generative AI
DukeRem
  #MediaTek partners with #Meta to run #generativeAI #apps on #devices using #Llama 2 and MediaTek's efficient #AI #chipsets, bringing benefits over #cloud AI.

MediaTek announced today that it is collaborating with Meta to leverage the company's Llama 2 large language model to enhance on-device generative AI capabilities for smartphones, IoT, vehicles, smart homes and other edge devices.

Most generative AI today runs in the cloud, but MediaTek aims to enable more processing directly on devices. This offers better performance, privacy and security, lower latency, and reduced cost. To make on-device AI work well, device makers need to adopt MediaTek's low-power AI chipsets, such as its APUs and its next-generation flagship 5G chip coming later this year. MediaTek has optimized its software stack and APU hardware to run Llama 2 efficiently, which should accelerate the development of new use cases. The company expects Llama 2-based apps to be available on phones with its next flagship chip by the end of 2023.

The partnership with Meta will give MediaTek the tools to innovate fully in AI and deliver more edge computing capabilities than before.

Highlights:
- MediaTek is collaborating with Meta on using Llama 2 for on-device generative AI
- On-device processing provides benefits such as better performance, privacy and cost over cloud AI
- MediaTek is optimizing its chips and software stack to run Llama 2 efficiently
- Llama 2 apps are expected on its next flagship phones by the end of 2023

MediaTek's partnership with Meta represents an exciting step towards enabling more on-device generative AI capabilities. As cloud computing becomes more expensive and faces limitations around privacy and connectivity, processing AI natively on devices emerges as an appealing alternative.

What kind of use cases for generative AI on smartphones and other edge devices are you most excited about? I'd love to hear your thoughts on the pros and cons of on-device vs cloud AI.
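To make the on-device vs cloud trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (token throughputs, network round-trip time, cloud pricing) is hypothetical and chosen only for illustration; none of them are MediaTek or Meta figures, and the function names are invented for this example.

```python
# Back-of-the-envelope model of the on-device vs cloud trade-off the
# article describes. All figures below are hypothetical placeholders,
# NOT MediaTek/Meta specifications.

def generation_latency_s(prompt_tokens: int,
                         output_tokens: int,
                         prefill_tok_per_s: float,
                         decode_tok_per_s: float,
                         network_rtt_s: float = 0.0) -> float:
    """Rough end-to-end latency for one LLM request:
    network round trip (cloud only, 0 for on-device)
    + prompt prefill time + autoregressive decode time."""
    prefill = prompt_tokens / prefill_tok_per_s
    decode = output_tokens / decode_tok_per_s
    return network_rtt_s + prefill + decode


def cloud_cost_usd(requests: int,
                   tokens_per_request: int,
                   usd_per_1k_tokens: float) -> float:
    """Recurring per-token cloud bill; the on-device marginal
    cost per request is effectively zero once the chip ships."""
    return requests * tokens_per_request / 1000.0 * usd_per_1k_tokens


# Hypothetical scenario: 256-token prompt, 128 generated tokens.
on_device = generation_latency_s(256, 128,
                                 prefill_tok_per_s=500.0,  # assumed NPU rate
                                 decode_tok_per_s=20.0)    # assumed NPU rate
cloud = generation_latency_s(256, 128,
                             prefill_tok_per_s=4000.0,     # assumed GPU rate
                             decode_tok_per_s=80.0,
                             network_rtt_s=0.15)           # assumed mobile RTT

print(f"on-device: {on_device:.2f} s, cloud: {cloud:.2f} s")
print(f"cloud bill for 1M requests of 500 tokens at $0.002/1k: "
      f"${cloud_cost_usd(1_000_000, 500, 0.002):,.0f}")
```

Note that with these placeholder numbers a datacenter GPU can still be faster per request; the on-device advantages the article emphasizes are the zero marginal cost term, the removal of the connectivity requirement, and keeping user data local.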