Qualcomm bets big on hybrid AI in ChatGPT, Bard era

AI models with more than 1 billion parameters are already running on phones with performance and accuracy levels similar to those of the cloud

IANS New Delhi


As generative artificial intelligence (AI) chatbots like Microsoft-backed OpenAI's ChatGPT and Google's Bard take the world by storm, hybrid AI will allow generative AI developers and providers to take advantage of the compute capabilities available in edge devices to reduce costs, chipmaker Qualcomm has said.

In a white paper, the company said that hybrid processing is more important than ever.

"A hybrid AI architecture distributes and coordinates AI workloads among cloud and edge devices, rather than processing in the cloud alone," said the chip maker.

The cloud and edge devices -- smartphones, cars, personal computers, and Internet of Things (IoT) devices -- work together to deliver more powerful, efficient and highly optimised AI.

"The main motivation is cost savings. For instance, generative AI-based search cost per query is estimated to increase by 10 times compared to traditional search methods -- and this is just one of many generative AI applications," Qualcomm stressed.



Hybrid AI even allows devices and the cloud to run models concurrently -- with devices running light versions of the model while the cloud processes multiple tokens of the full model in parallel and corrects the device's answers if needed.
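The concurrent scheme described here resembles what the research literature calls speculative decoding. The toy sketch below illustrates the idea only; all function names and the stand-in "models" are illustrative assumptions, not Qualcomm's implementation or any real LLM API.

```python
# Illustrative sketch of concurrent device-cloud generation: a light
# on-device model drafts several tokens cheaply, and the full cloud
# model checks the whole draft in parallel, accepting a prefix and
# supplying its own correction. Toy rules stand in for real models.

def device_draft(context, n):
    """Light on-device model: quickly proposes the next n tokens."""
    return [f"tok{i}" for i in range(n)]

def cloud_verify(context, draft):
    """Full cloud model: scores all draft tokens at once, keeps the
    longest prefix it agrees with, and appends one corrected token."""
    agreed = draft[:2]          # toy rule: cloud accepts the first 2 tokens
    correction = "cloud_tok"    # cloud's own next token after the prefix
    return agreed + [correction]

def hybrid_generate(prompt, rounds=3, draft_len=4):
    """Each round trip yields several tokens instead of one, so the
    expensive cloud model is invoked far less often per token."""
    out = []
    for _ in range(rounds):
        draft = device_draft(prompt + " ".join(out), draft_len)
        out.extend(cloud_verify(prompt, draft))
    return out

tokens = hybrid_generate("hello")
print(len(tokens))  # 9: three rounds of (2 accepted + 1 correction)
```

Because the cloud verifies a whole draft in one pass rather than generating token by token, the device does most of the cheap work and the cloud's role shrinks to verification and correction -- the cost-saving mechanism the white paper points to.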

AI models with more than 1 billion parameters are already running on phones with performance and accuracy levels similar to those of the cloud, and models with 10 billion parameters or more are slated to run on devices in the near future.

"The hybrid AI approach is applicable to virtually all generative AI applications and device segments -- including phones, laptops, extended reality headsets, cars and IoT," according to Qualcomm.



(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)

First Published: May 16 2023 | 3:19 PM IST
