Qualcomm and Meta to bring on-device AI to flagship phones in 2024

Qualcomm recently demonstrated Stable Diffusion, the AI image generator, running on a smartphone powered by its Snapdragon 8 Gen 2 processor. The striking thing about the demo is that it doesn’t rely on an internet connection. The company now claims that high-end smartphones in 2024 will ship with on-device AI support.

Qualcomm says it will bring on-device AI support to smartphones and PCs in 2024 in collaboration with Meta, built on Meta’s Llama 2 model. Users will have access to productivity applications, content creation tools, smart virtual assistants, entertainment, and more without requiring an internet connection.

In its announcement, Qualcomm Technologies said that starting in 2024, Llama 2-based AI implementations will be available on devices powered by Snapdragon. There is no word yet on whether Meta itself will release Llama 2-based apps, though third-party Android app developers will likely build on the model.

More about Llama 2

Meta, for its part, has released the LLM as open source. With Llama 2, startups, entrepreneurs, businesses, and researchers gain access to more tools, which Meta says will provide “opportunities for them to experiment, innovate in new ways, and eventually benefit both economically and socially.”

Meta argues that open access to its AI is safer, since the model will be stress-tested by developers and researchers. The company goes on to say that Llama 2 has been “red-teamed” — tested and improved for safety by teams inside and outside the company tasked with “generating adversarial prompts.” Meta adds that it will “continue investing in security via fine-tuning and benchmarking” the model.

Meta has also partnered with Microsoft for Llama 2. Under the new partnership, the model is accessible on Windows and Azure: users can find Llama 2 in the Azure AI model catalog, and it can also run locally on Windows. Beyond Microsoft, it is available through other providers such as Amazon Web Services (AWS) and Hugging Face.
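As a rough illustration of what local access looks like in practice, here is a sketch of loading the chat-tuned Llama 2 weights from Hugging Face with the `transformers` library. This is an assumption-laden example, not anything from Qualcomm or Meta’s announcements: it assumes the gated `meta-llama/Llama-2-7b-chat-hf` repository, that you have accepted Meta’s license and authenticated with a Hugging Face token, and hardware with enough memory for the 7B weights.

```python
# Sketch: running Llama 2 locally via Hugging Face transformers.
# Assumes `pip install transformers torch`, an accepted Llama 2 license
# on Hugging Face (the model repo is gated), and a logged-in HF token.

def format_chat_prompt(system: str, user: str) -> str:
    """Wrap a system and user message in Llama 2's chat template."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

def generate(prompt: str, model_id: str = "meta-llama/Llama-2-7b-chat-hf") -> str:
    # Imported lazily: the multi-GB weight download only happens
    # when this function is actually called.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    prompt = format_chat_prompt("You are a helpful assistant.",
                                "What is on-device AI?")
    print(generate(prompt))
```

The point of the on-device pitch is exactly that the `generate` step above — today usually run on a server GPU — would execute on the phone’s Snapdragon silicon instead, with no network round-trip.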
