Qualcomm Technologies and Meta are working to optimize the execution of Meta’s Llama 2 large language models directly on-device, without relying solely on cloud services.
The ability to run generative AI models like Llama 2 on devices such as smartphones, PCs, VR/AR headsets, and vehicles allows developers to save on cloud costs and to provide users with private, more reliable, and personalized experiences.
As a result, Qualcomm plans to make on-device Llama 2-based AI implementations available to enable the creation of new AI applications. This will allow customers, partners, and developers to build use cases such as intelligent virtual assistants, productivity applications, content creation tools, entertainment, and more.
These new on-device AI experiences, powered by Snapdragon, can work in areas with no connectivity or even in airplane mode.
“We applaud Meta’s approach to open and responsible AI and are committed to driving innovation and reducing barriers-to-entry for developers of any size by bringing generative AI on-device,” said Durga Malladi, senior VP and GM of technology, planning and edge solutions businesses, Qualcomm Technologies, Inc. “To effectively scale generative AI into the mainstream, AI will need to run on both the cloud and devices at the edge, such as smartphones, laptops, vehicles, and IoT devices.”
Meta and Qualcomm Technologies have a longstanding history of working together to drive technology innovation and deliver the next generation of premium device experiences. The companies’ current collaboration to support the Llama ecosystem spans research and product engineering efforts.