
Qualcomm Unveils Two Data Center AI Chips, Shares Surge Nearly 22%

US-based chipmaker Qualcomm (NASDAQ: QCOM) on Monday, October 27, unveiled two new artificial intelligence chips for data centers — the AI200 and AI250 — as part of its effort to diversify beyond the slowing smartphone market. The announcement sent the company's shares soaring nearly 22% during the session, before closing up 11.09% on the day.

The launch marks Qualcomm's most ambitious step yet into the data center segment, positioning it to challenge Nvidia and AMD in the fast-growing AI infrastructure market. The new chips are designed to deliver rack-level performance, superior memory capacity, and exceptional cost efficiency for running large language models (LLMs) and multimodal AI workloads.

Next-Generation Inference Solutions

According to Qualcomm, the AI200 is a dedicated, rack-scale inference solution built for low total cost of ownership (TCO) and optimized performance across large-scale AI workloads. Each AI200 accelerator card supports up to 768 GB of LPDDR memory, enabling greater scalability and flexibility for generative AI deployment.
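To put the 768 GB figure in context, a model's weight footprint is roughly its parameter count times the bytes stored per parameter. The sketch below is a back-of-envelope illustration using a hypothetical 405-billion-parameter model and common quantization levels; the numbers are assumptions for illustration, not Qualcomm specifications.

```python
# Rough model memory footprint: parameters x bytes per parameter.
# All figures below are illustrative assumptions, not Qualcomm data.

def model_footprint_gb(num_params_billion: float, bytes_per_param: float) -> float:
    """Weight memory in GB (1 GB = 1e9 bytes), ignoring KV cache and activations."""
    return num_params_billion * bytes_per_param

# A hypothetical 405B-parameter model at different precisions vs. a 768 GB card:
for bytes_pp, label in [(2.0, "FP16/BF16"), (1.0, "INT8"), (0.5, "INT4")]:
    gb = model_footprint_gb(405, bytes_pp)
    print(f"{label}: {gb:.1f} GB -> fits in 768 GB: {gb <= 768}")
```

The takeaway is that per-card capacity in the hundreds of gigabytes lets very large models run without being sharded across many accelerators, which is one of the cost levers Qualcomm is emphasizing.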

The AI250, set to debut a year later, introduces an innovative near-memory computing architecture that delivers over 10 times higher effective memory bandwidth while reducing power consumption. This architecture allows for highly efficient inference, disaggregated AI computing, and improved resource utilization to meet performance and cost requirements simultaneously.
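Why effective memory bandwidth matters so much for inference: during autoregressive decoding, generating each token requires streaming the model's weights from memory, so token throughput is bounded by roughly bandwidth divided by model size. The sketch below uses hypothetical bandwidth and model-size figures (not Qualcomm numbers) to show how a 10x bandwidth gain translates directly into the decode-speed ceiling.

```python
# Back-of-envelope bound on LLM decode throughput for a memory-bound workload.
# Bandwidth and model-size figures are hypothetical, chosen only to illustrate.

def max_decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Each generated token reads all weights once, so the ceiling on
    tokens/second is effective bandwidth divided by model size."""
    return bandwidth_gb_s / model_size_gb

baseline = max_decode_tokens_per_sec(500.0, 140.0)    # hypothetical accelerator
boosted = max_decode_tokens_per_sec(5000.0, 140.0)    # 10x effective bandwidth
print(f"baseline ceiling: {baseline:.1f} tok/s, with 10x bandwidth: {boosted:.1f} tok/s")
```

This is why an architecture that raises effective bandwidth, rather than just adding compute, can improve both inference speed and power efficiency on large-model workloads.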

Both systems feature direct liquid cooling for enhanced thermal efficiency, PCIe for scale-up (vertical scaling), Ethernet for scale-out (horizontal scaling), and confidential computing for secure AI workloads. Each rack configuration draws up to 160 kW of power.

Durga Malladi, Senior Vice President and General Manager of Technology Planning, Edge Solutions, and Data Center at Qualcomm Technologies, said:

“With the AI200 and AI250, we are redefining what's possible in rack-level AI inference. These innovative infrastructure solutions enable enterprises to deploy generative AI at unprecedented TCO while maintaining the flexibility and security modern data centers require.”

Software Ecosystem and Global Partnerships

Qualcomm emphasized that its end-to-end hyperscale AI software stack is optimized for inference and supports leading machine learning frameworks, inference engines, and generative AI tools. Developers can import models seamlessly and deploy them with one click using Qualcomm's Efficient Transformers Library and AI Inference Suite, including support for Hugging Face models.

The company said the new infrastructure solutions will be commercially available in 2026 (AI200) and 2027 (AI250), with plans to update its data center roadmap annually to maintain industry-leading AI inference performance, efficiency, and cost-effectiveness.


As part of Qualcomm's global partnerships, Humain, an AI startup backed by Saudi Arabia's sovereign wealth fund, will deploy 200 megawatts of Qualcomm AI racks starting in 2026. Qualcomm and Humain first signed an MoU in May to co-develop next-generation AI data centers and cloud-to-edge infrastructure to meet surging global AI demand.

“Qualcomm's entry and major deal in Saudi Arabia prove the ecosystem is fragmenting because no single company can meet the global, decentralized need for high-efficiency AI compute,” said Joe Tigay, portfolio manager at the Rational Equity Armor Fund.

Expanding Beyond Smartphones

Qualcomm, the world's largest supplier of smartphone modem chips, has been steadily diversifying to reduce its dependence on the mobile market after losing Huawei as a major customer and facing Apple's in-house chip development efforts.

Over the past two years, Qualcomm has expanded into PC processors, competing with Intel and AMD in the Windows laptop segment. Its latest move into data center AI chips represents the company's most significant diversification effort yet.

Global investment in AI chips has surged as cloud providers, chipmakers, and enterprises race to build infrastructure capable of running large language models, chatbots, and other generative AI applications. While Qualcomm faces fierce competition from Nvidia, analysts say the new AI200 and AI250 series — offering strong energy efficiency and cost advantages — could help it carve out a niche in the booming AI inference market.
