
Model Quantization in Action: How SMEs Can Benefit From On-Device AI

The demand for AI on the go is a major driver behind mobile AI applications such as chatbots, real-time translation, and image and video generation. In some cases these applications rely on models that must reside on cloud servers because of their hardware demands and high accuracy requirements. SMEs are at a disadvantage with cloud-based AI because of their limited budgets. Meanwhile, smartphones are handling AI tasks better thanks to hardware improvements in chips such as the Apple A17 Pro, Google Tensor G3, and the Qualcomm Snapdragon 8 Gen 3. These System-on-Chips (SoCs) will drive edge AI on smartphones, allowing large enterprises to leverage the technology.

AI teams can use model quantization to help SMEs keep pace with the big players. Model quantization brings edge AI to mobile applications, improving inference speed while reducing cost and hardware requirements. Model quantization also makes the mobile application available on a wider …
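To make the idea concrete, here is a minimal sketch of post-training dynamic quantization; the choice of PyTorch, the toy model, and the layer sizes are illustrative assumptions rather than anything prescribed above.

```python
# A minimal sketch of post-training dynamic quantization with PyTorch
# (an assumed toolchain; no specific framework is named in the article).
import torch
import torch.nn as nn

# A small example network standing in for a mobile-bound model.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Convert the Linear layers' weights from 32-bit floats to 8-bit integers.
# Weights shrink roughly 4x, and integer kernels speed up CPU inference.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference runs exactly as before, now against the quantized model.
sample = torch.randn(1, 512)
with torch.no_grad():
    output = quantized_model(sample)
print(output.shape)  # torch.Size([1, 10])
```

The quantized model could then be exported through an on-device runtime (for example, TorchScript or a mobile inference engine) for deployment inside a smartphone application, which is where the inference-speed and hardware-cost benefits described above would be realized.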
