
NVIDIA launches AI sign language learning platform with 3D virtual avatars and real-time AI feedback

Analysts suggest the launch also signals that NVIDIA is working to expand its AI business beyond hardware
NVIDIA launched a new AI platform on Thursday aimed at teaching people American Sign Language (ASL) and helping bridge communication gaps. The platform, called Signs, is also building a validated dataset for sign language learners and for developers of ASL-based AI applications.
The interactive web platform was created jointly by NVIDIA, the American Society for Deaf Children, and the creative agency Hello Monday. Learners can browse an ASL sign library, study vocabulary demonstrated by a 3D virtual avatar, and use an AI tool that analyzes their webcam footage to give real-time feedback on their signing. Sign language users of any level can also contribute videos of themselves signing to help build an open-source ASL video dataset.
NVIDIA said it plans to grow the dataset to 400,000 video clips covering 1,000 signs, with each sign verified by fluent ASL users and interpreters to ensure accuracy, creating a high-quality visual dictionary and teaching tool.
The NVIDIA team plans to use this dataset to develop further AI applications that break down communication barriers between the deaf community and hearing people. The data will be made publicly available to support the development of accessible technologies, including AI agents, digital human applications, and video conferencing tools. It can also improve the Signs platform itself and provide real-time AI support and feedback across the ASL ecosystem.
NVIDIA noted that even during the data collection phase, Signs already works as a capable ASL learning platform, letting users learn and practice a starter vocabulary of 100 signs and communicate more effectively with friends and family members who use ASL.
Currently, Signs focuses primarily on hand shape and finger position, but ASL also relies on facial expressions and head movements to convey richer meaning. The Signs team is exploring how to track and integrate these non-manual signals in future versions to improve the learning experience.
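NVIDIA has not published the internals of Signs' analysis pipeline, but webcam-based signing feedback of this kind generally starts with per-frame hand-landmark estimation. The sketch below is a hypothetical illustration using the open-source MediaPipe Hands library (not a component of Signs); non-manual signals such as facial expressions could be tracked in a similar way with a face-landmark model.

```python
# Minimal sketch: webcam hand-landmark tracking with MediaPipe Hands.
# This is NOT NVIDIA's Signs pipeline -- only an illustration of the kind
# of per-frame hand/finger tracking such a feedback tool relies on.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # default webcam

with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR frames
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 normalized (x, y, z) landmarks per detected hand;
                # a real system would feed these into a sign classifier and
                # compare the prediction against the target sign to give feedback.
                wrist = hand.landmark[mp_hands.HandLandmark.WRIST]
                print(f"wrist: ({wrist.x:.2f}, {wrist.y:.2f})")
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
            break

cap.release()
```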
They are also researching how to incorporate regional variations and slang into Signs to enrich the ASL dataset, and are working with researchers at the Rochester Institute of Technology's Center for Accessibility and Inclusion Research to evaluate and refine the platform so it better serves deaf and hard-of-hearing users. Hello Monday/DEPT founding partner Anders Jessen said:
"Improving the accessibility of ASL is an ongoing process. Signs can meet the demand for advanced AI tools and help bridge the communication gap between the deaf community and hearing individuals."
Analysts suggest the platform's launch also signals that NVIDIA is working to expand its AI business beyond hardware. NVIDIA is currently a major chip supplier to the AI industry, with most AI companies using its chips to train and run their models, and it is also developing its own AI models and software platforms. As AI companies compete to procure NVIDIA's chips, the company's stock price has risen more than 100% over the past year and its market capitalization has surpassed $3.4 trillion.