As a practitioner, I can say the competition among major companies for AI talent is intense. In interviews, the focus has shifted from traditional NLP and CV to LLMs and VLMs. We all believe in the large-model narrative: these models can push algorithms further, evolving the tools to serve users better (to put it bluntly). Computing power reserves are therefore crucial in this period, and a surge in cloud demand is to be expected; I understand this as a key part of the B2B story. But on the consumer side, beyond better recommendation and content distribution, what exactly have users gained from this algorithmic wave? I still haven't figured it out.

LongPort - 輸棟樓的韭菜

I still hold the same view I held two years ago: if AI fails to deliver benefits either to consumers (to-C) or through large-scale cost reduction and efficiency gains in enterprises, this bubble will definitely burst. The consumer side genuinely needs to see real benefits from AI. I believe that day will come, but I cannot guarantee it will arrive before the darkest hour.
