• OpenAI looking beyond Nvidia's expensive GPUs as it starts to use

    From TechnologyDaily@1337:1/100 to All on Thursday, July 03, 2025 18:45:06
    OpenAI looking beyond Nvidia's expensive GPUs as it starts to use Google's
    TPU AI silicon - but will it be as easy as swapping chips?

    Date:
    Thu, 03 Jul 2025 17:32:00 +0000

    Description:
    OpenAI starts to use Google TPUs on ChatGPT to reduce its reliance on Nvidia hardware.

    FULL STORY ======================================================================

    OpenAI adds Google TPUs to reduce dependence on Nvidia GPUs
    TPU adoption highlights OpenAI's push to diversify compute options
    Google Cloud wins OpenAI as customer despite competitive dynamics

    OpenAI has reportedly begun using Google's tensor processing units (TPUs) to power ChatGPT and other products.

    A report from Reuters, citing a source familiar with the move, notes this is OpenAI's first major shift away from Nvidia hardware, which has so far formed the backbone of OpenAI's compute stack.

    Google is leasing TPUs through its cloud platform, adding OpenAI to a growing list of external customers that includes Apple, Anthropic, and Safe Superintelligence.

    Not abandoning Nvidia

    While the chips being rented are not Google's most advanced TPU models, the agreement reflects OpenAI's efforts to lower inference costs and diversify beyond both Nvidia and Microsoft Azure.

    The decision comes as inference workloads grow alongside ChatGPT usage; the service now reaches over 100 million active users daily.

    That demand represents a substantial share of OpenAI's estimated $40 billion annual compute budget.

    Google's v6e Trillium TPUs are built for steady-state inference and offer
    high throughput with lower operational costs compared to top-end GPUs.

    Although Google declined to comment and OpenAI did not immediately respond to Reuters, the arrangement suggests a deepening of infrastructure options.

    OpenAI continues to rely on Microsoft-backed Azure for most of its deployment (Microsoft is the company's biggest investor by some way), but supply issues and pricing pressures around GPUs have exposed the risks of depending on a single vendor.

    Bringing Google into the mix not only improves OpenAI's ability to scale compute but also aligns with a broader industry trend toward mixing hardware sources for flexibility and pricing leverage.

    There's no suggestion that OpenAI is considering abandoning Nvidia altogether, but incorporating Google's TPUs adds more control over cost and availability.

    The extent to which OpenAI can integrate this hardware into its stack remains to be seen, especially given the software ecosystem's long-standing reliance on CUDA and Nvidia tooling.
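    The CUDA lock-in mentioned above comes down to dispatch: kernels written directly against a vendor API only run on that vendor's silicon, so portability layers (XLA being the real-world example behind TPU support) map each logical operation to a per-backend implementation. A minimal, purely illustrative sketch of that idea — all names here are hypothetical, not any real framework's API:

    ```python
    # Hypothetical sketch of backend dispatch: one logical op, multiple
    # hardware-specific kernels. "Swapping chips" only works when every op a
    # model needs has an implementation registered for the new backend.

    KERNELS = {}

    def register(backend):
        """Register a backend-specific implementation of matmul."""
        def wrap(fn):
            KERNELS[backend] = fn
            return fn
        return wrap

    @register("cuda")
    def matmul_cuda(a, b):
        # Stand-in for an Nvidia CUDA kernel launch.
        return [[sum(x * y for x, y in zip(row, col))
                 for col in zip(*b)] for row in a]

    @register("tpu")
    def matmul_tpu(a, b):
        # Stand-in for a TPU (XLA-compiled) kernel: same contract,
        # different hardware path.
        return [[sum(x * y for x, y in zip(row, col))
                 for col in zip(*b)] for row in a]

    def matmul(a, b, backend="cuda"):
        # The caller picks the backend; the model code itself is unchanged.
        return KERNELS[backend](a, b)

    print(matmul([[1, 2], [3, 4]], [[1, 0], [0, 1]], backend="tpu"))
    # [[1, 2], [3, 4]]
    ```

    The hard part in practice is not the dispatch table but the long tail of ops, custom kernels, and tooling that exist only for the CUDA path — which is why the article frames the move as more than "swapping chips".
    
    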



    ======================================================================
    Link to news story: https://www.techradar.com/pro/openai-looking-beyond-nvidias-expensive-gpus-as-it-starts-to-use-googles-tpu-ai-silicon-but-will-it-be-as-easy-as-swapping-chips


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)