Apple’s Next Wearable Push Could Include Smart Glasses and a Camera Pendant — and It Sounds Like Apple Intelligence Hardware
Apple may be preparing its most unusual wearable expansion yet. A new report suggests the company is developing two separate wearable devices: a pair of camera-equipped smart glasses and a compact AI pendant designed to clip onto clothing or hang like a necklace.

The Apple smart glasses appear to be the more serious project, reportedly closer to becoming a real product. The pendant, on the other hand, is said to be in early development and could still be cancelled. Apple has not officially acknowledged either device, but the report indicates both are being designed as extensions of the iPhone ecosystem rather than standalone gadgets.
If the report is accurate, Apple’s next wearable era won’t be about fitness bands or health sensors. It will be about AI-powered perception — hardware built to “see” the world and feed that context into Siri and Apple Intelligence.
Quick Highlights
- Apple is reportedly developing two new wearables: camera-equipped smart glasses and an AirTag-sized AI pendant.
- The smart glasses are said to be the more advanced project; the pendant is early in development and could be cancelled.
- The first glasses reportedly won't include an in-lens AR display, focusing instead on cameras, audio, and Siri.
- Both devices would rely on a paired iPhone and are designed around Apple Intelligence.
Apple Smart Glasses: A Meta Ray-Ban Rival, Built the Apple Way
The report suggests Apple’s smart glasses are being designed in the same general category as Meta’s Ray-Ban smart glasses. That means they would include built-in cameras for capturing photos and videos, speakers and microphones for calls and music, and Siri-based notifications.
But Apple’s approach appears to be different in one key way: integration.
Instead of simply being “glasses with a camera,” Apple’s smart glasses are expected to act as a front-end for Apple Intelligence. The cameras could feed real-time visual data into Siri, enabling features like contextual guidance and navigation based on what the user is seeing. Turn-by-turn walking directions, for example, could become more natural if Siri is interpreting the real world through the glasses.
This kind of AI-powered hardware direction matches the broader Apple shift we’re already seeing, especially as Apple explores deeper third-party AI integrations, as discussed in Apple Intelligence Could Soon Let Users Choose ChatGPT, Gemini, or Claude — and It May Redefine the iPhone AI Era.
No AR Display Yet — Apple May Be Playing the Long Game
One detail that stands out is what these glasses reportedly will not include: an in-lens augmented reality display.
If true, Apple’s first smart glasses would not be true AR eyewear. They would be camera and audio glasses, similar to what Meta currently sells. That might disappoint users expecting Apple to instantly leapfrog the industry with a futuristic AR product.
But strategically, it makes sense. AR displays are expensive, power-hungry, and difficult to ship at scale without compromises. Apple may be choosing to enter the market with a simpler first-generation wearable that builds the ecosystem foundation first.
If Apple can nail comfort, camera quality, battery life, and Siri responsiveness, it could establish a strong user base before launching a more advanced AR version later.
Apple Reportedly Testing Multiple Frame Designs and Colors
Apple is also said to be designing its own frames instead of partnering with a major eyewear company. Meta works with Ray-Ban under EssilorLuxottica, while other brands are reportedly looking at partnerships with established eyewear labels.
Apple, however, appears to be doing what it usually does — keeping the design fully in-house.
The report claims Apple is testing at least four frame styles, including larger rectangular frames, slimmer rectangular options, and oval or circular designs in different sizes, with some prototypes featuring vertically oriented oval camera lens layouts. Colors being explored reportedly include black, ocean blue, and light brown.
If Apple does ship multiple designs at launch, it would be a major advantage. Smart glasses only become mainstream when they look like normal eyewear.
The Pendant: Apple’s Most Experimental Wearable Idea Yet
The second wearable in development is described as an AirTag-sized pendant with a camera and microphone. It could be worn clipped to clothing or on a cord as a necklace, functioning as an always-on sensor device that works with Siri.
This product is the more unusual one, because it doesn’t fit into a category Apple already dominates. It’s also reportedly not a standalone product. Instead, it would rely on a paired iPhone for most processing tasks, which makes it more of an iPhone accessory than an independent AI device.
That distinction matters because it separates it from products like the Humane AI Pin, which struggled partly due to being standalone, expensive, and dependent on subscriptions. An iPhone-tethered version built into Apple’s ecosystem could avoid many of those problems.
However, the bigger question is whether people actually want to wear a camera pendant in public, especially in a world where privacy concerns around wearable cameras are growing.
Why Apple’s Wearable AI Strategy Makes Sense in 2026
Apple’s strength has always been ecosystem integration. It controls hardware, software, silicon, and services, which makes it uniquely positioned to build AI-powered wearables without relying on cloud-first processing.
The iPhone is also powerful enough to act as the “brain” for lightweight wearable devices. That means Apple can build glasses and pendants that feel fast and responsive without needing massive hardware inside the wearable itself.
This is the same direction the AI industry is moving in general — smaller devices powered by bigger systems behind the scenes. The real AI revolution in 2026 isn’t happening inside apps. It’s happening through agents and connected experiences, as explained in 9 Critical AI Agents in 2026: Why OpenAI Operator and Google Jarvis are Replacing Your Apps.
If Apple can combine wearable cameras with Apple Intelligence, it could create the most natural form of contextual AI so far.
Competition Will Be Fierce: Meta, Google, and Samsung Are Already Here
Apple isn’t entering an empty market. Meta already has two generations of smart glasses and a growing user base. Google and Samsung are also reportedly working on their own smart eyewear platforms.
Apple’s advantage is obvious: the iPhone ecosystem. If Apple’s glasses can work seamlessly with iMessage, FaceTime, Maps, Apple Music, and Siri, it could offer an experience that feels far more integrated than what competitors currently deliver.
But Apple will need to execute perfectly. In wearables, small issues like poor battery life, unreliable voice assistants, or slow camera response can ruin the product experience instantly.
TechularZtrix Verdict: Apple Is Building “Apple Intelligence Hardware,” Not Just Wearables
This report strongly suggests Apple’s next wearable push is not just about launching new accessories. It’s about building hardware designed around Apple Intelligence.
Smart glasses with cameras and audio could become the most natural way to access Siri and real-world navigation. A camera pendant, if it ever ships, could be Apple’s attempt to create a new AI interface entirely — a wearable “vision assistant” that always stays with you.
The smart glasses seem far more likely to launch, while the pendant feels experimental and uncertain. But even the fact that Apple is exploring such form factors shows one thing clearly: Apple Intelligence is evolving beyond software, and Apple wants its AI to live in hardware.
The original report on Apple’s wearable roadmap was credited to Bloomberg’s Mark Gurman.
Frequently Asked Questions
When could Apple’s smart glasses launch?
The report suggests they could be unveiled in late 2026 or early 2027, but likely won’t release until 2027.
Will Apple’s smart glasses have an AR display?
The report claims Apple’s first smart glasses may not include in-lens AR displays and will instead focus on cameras, audio, and Siri integration.
What is the Apple AI pendant?
It is reportedly an AirTag-sized pendant with a camera and microphone designed to work with Siri, likely relying on the iPhone for processing.
Is the pendant confirmed?
No. The report says the pendant is still early in development and could be cancelled.
What would Apple’s main advantage be?
Apple’s key advantage would be deep iPhone integration, Siri, and Apple Intelligence features.