In a strategic shift described as the most significant since the launch of the Apple Watch, recent reports indicate Apple is accelerating development of three new wearable categories aimed at moving “visual intelligence” from a feature within the iPhone to smart devices that see, hear, and analyze surroundings directly.
Smart Glasses
According to reports, Apple plans to introduce smart glasses that will directly compete with Meta’s Ray-Ban smart glasses, which have seen considerable success. These glasses, Bloomberg reports, will not include a display in the initial version. Instead, they will rely on dual cameras – one for high-resolution photo and video capture, and another dedicated to “computer vision” to assist Siri in understanding the environment.
The glasses will be capable of recognizing landmarks, providing real-time translation, and offering navigational guidance through voice commands, all without requiring the user to access their iPhone. MacRumors notes that production could begin as early as December 2026, with a target launch in 2027.
The second camera is particularly noteworthy. It will be able to interpret the user’s surroundings and measure distance, similar to LiDAR technology found in iPhones. Apple intends to differentiate its glasses from Meta’s offering with a higher-end camera system. The glasses could also support a version of Visual Intelligence, capable of reading physical text – such as event dates – and adding that information to the calendar. Context-aware reminders and live translation are also possibilities.
AI Pendant/Pin
Reports suggest the AI pendant, or “pin,” will be a small device worn around the neck or clipped to clothing, designed as an alternative for those who prefer not to wear glasses. TechRepublic details that this pendant will feature an “always-on” camera and sensitive microphones, aiming to function as a “contextual assistant” – remembering where you placed your keys or identifying people in front of you.
Bloomberg sources indicate that Apple’s pendant will operate as an “accessory” to the iPhone, leveraging its processing power for complex data handling to maximize battery life and maintain a lightweight design.
Camera-Equipped AirPods
The concept of a camera within earbuds may seem unusual, but Apple sees it as a technical solution to complex interactive problems. 9to5Mac reports that Apple is testing the integration of low-resolution infrared (IR) cameras into the AirPods, similar to those used in Face ID.
These cameras won’t be for traditional photography but will support “aerial gestures” – controlling volume or calls with hand movements in front of the earbud. They will also enhance “spatial audio” by accurately understanding the dimensions of a room. Leaks suggest this project is the most mature and could be available in late 2026.

Observers and tech analysts view Apple’s push into these categories not as mere “technological luxury,” but as a strategic necessity to remain at the forefront of the artificial intelligence race. With iPhone sales growth having plateaued, Apple sees “wearable” devices as the next major platform.
However, cybersecurity experts point out that the biggest challenge won’t be technical, but convincing the public to wear cameras that are “always on.”
Despite this, analysts are betting on Apple’s reputation for “on-device processing” to overcome privacy concerns that have hindered similar projects by other companies. The company’s recent acquisition of Q.ai for $2 billion suggests a commitment to enhancing its AI capabilities within these devices.
Many observers agree that these devices will represent a pivotal turning point, as technology transitions from “tools we use” to “companions that accompany us,” fundamentally changing how we interact with the world around us.
