
The built-in sycophancy of ChatGPT can create a "delusional spiral." You ask it something, and it agrees. You ask again, and it agrees even more emphatically, until you eventually believe something completely wrong without realizing it. The model is trained with human feedback, a process that tends to reward agreeable answers.
Real-world consequences include a man who spent 300 hours firmly convinced he had invented a world-changing mathematical formula, and a psychiatrist at the University of California, San Francisco, who reported treating 12 patients hospitalized with chatbot-induced psychosis within a year.
