
$Alphabet(GOOGL.US) has released a new technology called TurboQuant. Its core claim is a roughly six-fold reduction in the memory footprint of AI inference, with a performance gain on top. The market's first reaction: does this mean we won't need as many storage chips going forward?
But viewed calmly, this looks more like short-term trading logic than a fundamental reversal. First, the technology mainly optimizes the cache used during the inference phase; it does not directly reduce training or core storage demand. Second, AI is still in an explosive growth phase, and demand for compute and storage is expanding in tandem. A compression algorithm alone is unlikely to make that demand collapse.
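The "inference-phase cache" in question is, in practice, the KV cache that grows with context length and batch size, and it is a separate line item from model weights and training memory. A back-of-the-envelope sketch makes the point; all model numbers below are hypothetical, since nothing about TurboQuant's internals has been disclosed beyond the six-fold figure:

```python
# Rough sizing sketch (hypothetical 70B-class model; TurboQuant's actual
# method is not public). Shows why a 6x inference-cache compression does
# not mechanically shrink weight or training memory.

def kv_cache_bytes(layers, heads, head_dim, seq_len, batch, bytes_per_elem):
    # Each layer stores a K and a V tensor: 2 * heads * head_dim values per token.
    return 2 * layers * heads * head_dim * seq_len * batch * bytes_per_elem

# Illustrative serving setup: long context, modest batch
layers, heads, head_dim = 80, 64, 128
seq_len, batch = 32_768, 8

fp16_cache = kv_cache_bytes(layers, heads, head_dim, seq_len, batch, 2)
compressed_cache = fp16_cache / 6          # the claimed 6x reduction
weights_fp16 = 70e9 * 2                    # 70B params at 2 bytes each, untouched

print(f"KV cache, fp16:       {fp16_cache / 1e9:.1f} GB")
print(f"KV cache, compressed: {compressed_cache / 1e9:.1f} GB")
print(f"Weights (unchanged):  {weights_fp16 / 1e9:.1f} GB")
```

Even with the cache cut six-fold, the weight footprint stays put, and training-time memory (activations, gradients, optimizer state) is a different budget entirely, which is why the compression claim bears on serving economics more than on aggregate storage demand.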
My own read is that this kind of "technology shock" adds volatility more than it changes the trend. For the storage sector, watch sentiment in the short term; over the medium to long term, what matters is still the AI demand curve. Right now this looks more like a shake-out on negative headlines than a refutation of the underlying thesis.
The copyright of this article belongs to the original author/organization.
The views expressed herein are solely those of the author and do not reflect the stance of the platform. The content is intended for investment reference purposes only and shall not be considered as investment advice. Please contact us if you have any questions or suggestions regarding the content services provided by the platform.
