Add cameras! Apple aims to turn the watch into an AI gateway

Wallstreetcn
2025.03.24 03:31

Apple plans to launch an Apple Watch with a camera in 2027, positioning it as an entry point for AI wearable devices. The move will significantly enhance the AI capabilities of its wearables and, along with the camera-equipped AirPods being developed in parallel, will become an important part of Apple's AI ecosystem.

Apple is developing an Apple Watch with a built-in camera, positioning it as an entry point for AI features and supporting artificial intelligence services including Apple Intelligence, with a planned launch in 2027.

It is understood that the standard version of the watch will have the camera built into the display, while the Ultra version will have it located on the side.

This move will significantly enhance the AI capabilities of wearable devices, becoming an important part of Apple's AI ecosystem alongside the concurrently developed camera-equipped AirPods.

By 2027, Apple hopes to power features like Visual Intelligence with its own in-house models, a timeline that coincides with the company's plans to release the new Apple Watch and AirPods.

Apple's AI Wearable Ambitions: The Smartwatch as a Key Vehicle

According to Bloomberg reporter Mark Gurman, Apple is developing a new version of the Apple Watch equipped with a camera. The devices may still be several product generations away from market, but they are already on Apple's product roadmap.

Apple plans to add cameras to both the standard Series watch and the Ultra model. The current idea is to place the camera inside the display of the Series version, similar to the front camera of the iPhone.

The Ultra will take a different approach, with the camera lens located on the side of the watch near the crown and buttons.

By adding camera functionality, the Apple Watch will be able to offer features such as video calls, photo capture, and visual AI analysis.

The addition of a camera will enable the Apple Watch to support AI features like Visual Intelligence, which can help identify surrounding objects and locations and provide relevant information.

Last October, Apple Intelligence was released alongside the iOS 18.1 update, providing a visual search tool that helps identify surrounding objects and locations and displays any relevant information.

With a camera, the Apple Watch will be able to "see the outside world and use AI to provide relevant information."

Apple's AI Wearable Device Ecosystem is Taking Shape

In recent years, the market has seen a surge of AI wearable devices, from the failed Humane Ai Pin to Meta's popular smart glasses.

The core idea of these devices is to utilize built-in cameras and microphones to support artificial intelligence, providing users with contextual information about their surroundings.

Apple began venturing into this field last year with the launch of the iPhone 16. Its new Visual Intelligence feature is tied to the Camera Control interface and uses AI to help users understand the world around them.

This feature is currently only available on the iPhone 16 but will be expanded in the upcoming iOS 18.4, allowing iPhone 15 Pro users to access it as well.

In addition, the smartwatch is not the only wearable device to which Apple plans to add a camera. Gurman reported last December that Apple is developing an infrared camera for future AirPods.

The infrared camera can be used in various ways, including detecting air gestures and enhancing spatial audio in conjunction with devices like Apple Vision Pro.

According to Gurman, Apple hopes that by 2027 its own in-house models will power features like Visual Intelligence, a timeline that coincides with the company's plan to release the new Apple Watch and AirPods.

This indicates that the company is actively building out an AI hardware ecosystem, turning wearable devices into an entry point for artificial intelligence services and providing users with a smarter, more convenient interactive experience.