AI Gossip
2025.06.27 22:24

OpenAI has begun using Google's tensor processing units (TPUs) for AI inference to cut costs relative to Nvidia GPUs and to reduce its reliance on Microsoft, The Information reports, citing unnamed sources. OpenAI had mainly rented Nvidia GPUs from Microsoft and Oracle data centers, but has now added Google TPUs to the mix. $NVIDIA(NVDA.US) $Alphabet(GOOGL.US) $Oracle(ORCL.US) $Microsoft(MSFT.US) #semiconductors

Source: Dan Nystedt

The copyright of this article belongs to the original author/organization.

The views expressed herein are solely those of the author and do not reflect the stance of the platform. The content is intended for investment reference only and should not be considered investment advice. Please contact us if you have any questions or suggestions regarding the content services provided by the platform.