
    Qualcomm announces AI chips to compete with AMD and Nvidia — stock soars 15%

By Oki Bin Oki | October 28, 2025

Qualcomm announced Monday that it will release new artificial intelligence accelerator chips, marking new competition for Nvidia, which has so far dominated the market for AI semiconductors.

    The stock soared 15% following the news.

The AI chips mark a shift for Qualcomm, which has thus far focused on semiconductors for wireless connectivity and mobile devices, not massive data centers.

    Qualcomm said that both the AI200, which will go on sale in 2026, and the AI250, planned for 2027, can come in a system that fills up a full, liquid-cooled server rack.

    Qualcomm is matching Nvidia and AMD, which offer their graphics processing units, or GPUs, in full-rack systems that allow as many as 72 chips to act as one computer. AI labs need that computing power to run the most advanced models.

Qualcomm’s data center chips are based on the AI components in its smartphone chips, called Hexagon neural processing units, or NPUs.

    “We first wanted to prove ourselves in other domains, and once we built our strength over there, it was pretty easy for us to go up a notch into the data center level,” Durga Malladi, Qualcomm’s general manager for data center and edge, said on a call with reporters last week.

    The entry of Qualcomm into the data center world marks new competition in the fastest-growing market in technology: equipment for new AI-focused server farms.

    Nearly $6.7 trillion in capital expenditures will be spent on data centers through 2030, with the majority going to systems based around AI chips, according to a McKinsey estimate.

The industry has been dominated by Nvidia, whose GPUs hold over 90% of the market so far and whose sales have driven the company to a market cap of over $4.5 trillion. Nvidia’s chips were used to train OpenAI’s GPTs, the large language models used in ChatGPT.

But companies such as OpenAI have been looking for alternatives, and earlier this month the startup announced plans to buy chips from the second-place GPU maker, AMD, and potentially take a stake in the company. Other companies, such as Google, Amazon and Microsoft, are also developing their own AI accelerators for their cloud services.

    Qualcomm said its chips focus on inference, or running AI models, rather than training, which is how labs such as OpenAI create new AI capabilities by processing terabytes of data.
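
    For readers unfamiliar with the distinction, the short Python sketch below (not from the article; it uses PyTorch purely as an illustrative, widely used framework) contrasts the two workloads: training repeatedly updates a model’s weights from data, while inference only runs an already-trained model to produce outputs, which is the workload Qualcomm says the AI200 and AI250 target.

        import torch
        from torch import nn

        model = nn.Linear(16, 4)  # toy stand-in for a large model

        # Training: compute a loss and update the weights (how labs build new models)
        optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
        inputs, targets = torch.randn(32, 16), torch.randn(32, 4)
        loss = nn.functional.mse_loss(model(inputs), targets)
        loss.backward()    # gradient computation makes training compute- and memory-heavy
        optimizer.step()

        # Inference: run the trained model with gradients disabled (serving requests)
        model.eval()
        with torch.no_grad():  # no gradient bookkeeping, so less memory and compute per request
            prediction = model(torch.randn(1, 16))
        print(prediction.shape)  # torch.Size([1, 4])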

The chipmaker said that its rack-scale systems would ultimately cost less for customers such as cloud service providers to operate, and that a rack uses 160 kilowatts, comparable to the high power draw of some Nvidia GPU racks.

    Malladi said Qualcomm would also sell its AI chips and other parts separately, especially for clients such as hyperscalers that prefer to design their own racks. He said other AI chip companies, such as Nvidia or AMD, could even become clients for some of Qualcomm’s data center parts, such as its central processing unit, or CPU.

    “What we have tried to do is make sure that our customers are in a position to either take all of it or say, ‘I’m going to mix and match,’” Malladi said.

The company declined to comment on the price of the chips, cards or racks, or on how many NPUs could be installed in a single rack. In May, Qualcomm announced a partnership with Saudi Arabia’s Humain to supply data centers in the region with AI inferencing chips; Humain will be a Qualcomm customer, committing to deploy systems drawing up to 200 megawatts of power.

    Qualcomm said its AI chips have advantages over other accelerators in terms of power consumption, cost of ownership, and a new approach to the way memory is handled. It said its AI cards support 768 gigabytes of memory, which is higher than offerings from Nvidia and AMD.

    By CNBC News
