
Apple AI Smart Glasses (N50): 4 Designs, Siri-Powered AI, and Why Apple Is Betting Against the Display

Apple is prototyping four frame designs for its first AI smart glasses (codename N50) — launching 2027, skipping AR displays entirely, powered by Siri and iPhone. Here is what Mark Gurman revealed, why the display-less bet matters, and what it means for developers.

DevPik Team · April 13, 2026 · 13 min read

Apple's First AI Wearable Finally Comes Into Focus

On April 12, 2026, Bloomberg's Mark Gurman dropped the most detailed look yet at Apple's upcoming AI glasses — a product the company has been prototyping for years, now solidifying into something real. The Power On newsletter confirms Apple is actively testing at least four different frame designs for the device, internally codenamed N50, with production targeting December 2026 and a public consumer launch in spring or summer 2027.

And the headline takeaway is not what most Apple watchers expected: there is no display.

N50 is not an augmented-reality headset. It is not a shrunken Vision Pro. It is Apple's answer to the Meta Ray-Ban AI glasses — a pair of camera-and-microphone glasses that relies on voice, AI, and iPhone integration rather than pixels in your field of view. For developers, that pivot reshapes the ambient-AI landscape entering 2027.

The Four Designs Apple Is Testing

According to Gurman's reporting (corroborated by 9to5Mac, Tom's Guide, and Bloomberg), Apple is prototyping four distinct frame styles:

  1. Large rectangular frame — the Ray-Ban Wayfarer-style look that Meta popularized
  2. Slimmer rectangular frame — similar to the glasses Tim Cook actually wears
  3. Large oval / circular frame — the retro round lens shape
  4. Small oval / circular frame — a more delicate version of the round design

The body material is acetate, the same plant-based polymer used in Ray-Ban, Oliver Peoples, and high-end eyewear — chosen for durability and a "luxurious" feel, not the flimsy plastic of cheaper smart glasses. Colors tested so far include black, ocean blue, and light brown.

Apple's internal ambition, per the report, is nothing less than an "icon" — a design as instantly recognizable as the AirPods silhouette or the Apple Watch squircle.

No Display: Why Apple Is Skipping AR for Now

This is the most interesting strategic choice in the entire project.

Apple has the most advanced consumer AR hardware on the planet in the Vision Pro. It has spent years researching micro-displays, optical combiners, and waveguides. And yet the first mainstream Apple glasses will have zero pixels — no projected UI, no floating notifications, no AR overlay on your field of view.

Instead, N50 leans on:

  • Two cameras: one for high-resolution photos and video, one for computer-vision AI tasks (similar to what Vision Pro uses for spatial tracking)
  • Vertically oriented oval camera lenses with LED indicator lights — visually distinct from Meta's circular camera layout
  • Built-in microphones and speakers for calls, audio, and voice assistance
  • Tight iPhone pairing for compute, data, and most features

Why skip the display? Three reasons converge:

  1. Battery life. Even the smallest, lightest micro-displays still consume 30–50% of the available power budget. A display-less pair of smart glasses can last a full day; a display-equipped pair is measured in hours.
  2. Weight and heat. Acetate frames with cameras and speakers can stay close to regular-glasses weight. Adding a display and optical combiner pushes the device into Vision Pro territory — great for $3,500 niche hardware, wrong for mass-market eyewear.
  3. Meta's data. Ray-Ban Meta glasses — also display-less — sold 7 million units in 2025 and are on track for 20 million in 2026. The market has demonstrated that it wants voice + camera + AI, not AR overlays. Apple is following user demand, not the most advanced technology it could ship.
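The battery argument can be made concrete with a back-of-envelope calculation. Every number below is an illustrative assumption (the battery capacity and base draw are invented for the sketch); only the 30–50% display share comes from the reasoning above.

```python
# Back-of-envelope math behind the display-less choice.
# BATTERY_MWH and BASE_DRAW_MW are illustrative assumptions, not Apple specs.

BATTERY_MWH = 250.0   # assumed glasses battery capacity, milliwatt-hours
BASE_DRAW_MW = 18.0   # assumed average draw without a display: cameras, mics, radio

def runtime_hours(display_share: float) -> float:
    """Estimated runtime when a display consumes `display_share` of the total power budget."""
    total_draw = BASE_DRAW_MW / (1.0 - display_share)
    return BATTERY_MWH / total_draw

print(f"no display:     {runtime_hours(0.0):.1f} h")  # ~13.9 h: an all-day pair
print(f"display at 40%: {runtime_hours(0.4):.1f} h")  # ~8.3 h: several hours lost
```

With these (made-up) figures, handing a display 40% of the budget costs roughly 40% of the runtime — which is the trade-off the article says Apple declined to make.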

The display-equipped Apple glasses (the long-rumored AR product) remain on Apple's roadmap — but they now come explicitly after N50, not before.

Siri + Apple Intelligence: The Real AI Bet

With no screen to tap and no display to read, the entire interface is voice-first. That means the glasses live or die by Siri — and Siri is Apple's biggest AI weakness.

The plan, per Bloomberg and follow-up reporting, involves:

  • Upgraded Siri as the primary input — ask questions, capture images, place calls, play music, get notifications read aloud
  • Apple Intelligence visual features — camera-based contextual queries ("what is this building?", "translate this menu", "read me this text")
  • Ambient AI — the model sees what you see and can proactively offer information
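The interaction model those three bullets describe can be sketched as a single loop: capture a frame, take a transcribed utterance, hand both to a multimodal model, and speak the answer. This is a hypothetical sketch — none of these functions correspond to a real Apple API, and the model call is a stub.

```python
# Runnable sketch of the voice-first loop described above.
# All names are hypothetical stand-ins; no real Apple framework is referenced.
from dataclasses import dataclass

@dataclass
class GlassesQuery:
    frame: bytes       # latest frame from the computer-vision camera
    utterance: str     # transcribed spoken request

def multimodal_answer(query: GlassesQuery) -> str:
    """Stand-in for the multimodal model call (which, per the article, would run on the paired iPhone)."""
    if "translate" in query.utterance:
        return "Here is the translated text."
    return "Here is what I can tell you about what you're looking at."

def speak(text: str) -> None:
    """Stand-in for text-to-speech through the frame speakers."""
    print(text)

# One turn of the loop: see, ask, answer aloud.
speak(multimodal_answer(GlassesQuery(frame=b"", utterance="translate this menu")))
```

The point of the sketch is the shape, not the stubs: with no screen, the camera frame and the utterance are the entire input surface, and speech is the entire output surface.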

This is the same playbook Meta runs with its AI glasses (and that GPT-5.4's computer-use model pushes on the desktop): camera + multimodal model + voice interface. The difference is that Apple's Siri, as of April 2026, remains famously unreliable — the "smarter Siri" promised at WWDC 2024 has been delayed multiple times.

Whether N50 ships with a genuinely competitive AI assistant by summer 2027 is the single biggest question hanging over the entire project.

The AI Leadership Shakeup: Giannandrea Out

The glasses news landed the same week Apple quietly closed the book on its previous AI era.

John Giannandrea — the former Google Search and AI chief whom Apple poached in 2018 to lead Machine Learning and AI Strategy — is leaving Apple. After announcing his retirement in December 2025 following the failed Siri overhaul, Giannandrea transitioned to an advisor role in early 2026; his final week at the company falls in mid-April 2026.

The reshuffle:

  • Craig Federighi (head of Software Engineering) has taken over AI efforts more directly
  • Amar Subramanya — a veteran of Google Gemini — joined Apple as VP of AI in late 2025, reporting to Federighi, leading Apple Foundation Models and AI safety
  • Senior executive oversight has been redistributed across the leadership team

The timing is loaded. Apple is betting the ranch on AI hardware (glasses, next-gen AirPods with cameras, a rumored camera pendant) just as it restructures its AI organization. The gap between Apple's AI ambitions and its current AI execution is the clearest risk in the N50 story.

Apple vs Meta vs Everyone: The 2026–2027 AI Glasses Race

N50 does not enter an empty market. The AI-glasses category is suddenly crowded:

| Product | Status | Positioning |
| --- | --- | --- |
| Ray-Ban Meta (Gen 2/3) | Shipping, ~7M sold in 2025 | Affordable, proven, developer-accessible |
| Meta Muse Spark | Just launched | Proprietary AI, closed ecosystem |
| Apple N50 | Prototyping, public reveal late 2026 | Premium design, tight iPhone integration, acetate/luxury |
| Google AI glasses | 2026 announcement expected | Gemini-powered, Warby Parker / Gentle Monster partners |
| Snap Spectacles (next-gen) | In development | Creator/camera-focused |
| Xiaomi / Huawei AI glasses | Shipping in China | Budget tier, regional |

Meta has the volume. Google has the partnerships and AI model. Apple has the design cachet and tight hardware-software integration. For the first time in nearly a decade, Apple is entering a product category second or third to market — the same position it occupied with the iPod, iPhone, and Apple Watch, where arriving late with a better-executed product has historically worked out fine for Cupertino.

What This Means for Developers

For the DevPik audience, the N50 story matters on three fronts:

1. A New App Paradigm (Eventually)

Camera + AI + voice = an entirely new interaction surface. If Apple opens developer APIs for glasses features — visual search, real-time translation, contextual reminders, live-caption overlays for AirPods — a new app category emerges. Developers who were early to Apple Watch complications, Live Activities, and Vision Pro spatial apps will be the ones building first for glasses.

Apple has already hinted at an "App Store-like platform approach" for AI features. Whether that extends to N50 on day one or later is still unknown.
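If Apple does open the platform to third parties, the developer surface would plausibly look like registering handlers for voice-plus-camera "intents." Here is a hypothetical sketch of that paradigm — every name is invented for illustration, and nothing here is a real Apple framework.

```python
# Hypothetical shape of a third-party "glasses intent" registry, to make the
# new app paradigm concrete. All names are invented; no real SDK is referenced.
from typing import Callable, Dict

_handlers: Dict[str, Callable[[bytes, str], str]] = {}

def glasses_intent(phrase: str):
    """Decorator that registers a handler keyed to a trigger phrase."""
    def register(fn: Callable[[bytes, str], str]):
        _handlers[phrase] = fn
        return fn
    return register

@glasses_intent("what plant is this")
def identify_plant(frame: bytes, utterance: str) -> str:
    # A real app would run a vision model on the frame; this stub just answers.
    return "Looks like a fern."

def dispatch(frame: bytes, utterance: str) -> str:
    """Route a camera frame + utterance to the first matching registered app."""
    for phrase, fn in _handlers.items():
        if phrase in utterance:
            return fn(frame, utterance)
    return "No app handled that."

print(dispatch(b"", "hey, what plant is this?"))  # -> Looks like a fern.
```

Whatever the real API turns out to be, the competition will be for trigger phrases and camera context rather than screen real estate — the same shift Watch complications and Live Activities represented in their eras.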

2. The Privacy and Consent Problem

Always-on cameras on your face create instant privacy concerns — for the wearer, the people around them, and the data that gets captured and processed. Meta's Ray-Ban glasses have already triggered debates about recording in public spaces, consent, and persistent surveillance.

Apple will lean on its privacy-first brand positioning, LED capture indicators, and likely on-device processing. But the fundamental social norms around "AI glasses in a coffee shop" are unresolved, and developers building on the platform will be building inside that unresolved debate.

3. The Vision Pro Lesson

Vision Pro shipped roughly 390,000 units in 2024, and sales collapsed afterwards — production was halted, marketing was cut by 95%, and the dedicated Vision Products Group was folded into the standard hardware division. Expensive hardware + limited developer adoption = slow start.

N50 is Apple's correction. By starting with a display-less product closer to $500–$800 (early estimates) instead of $3,500, Apple can chase the hundreds of millions of people who wear corrective lenses — a market orders of magnitude larger than the premium-AR headset niche. Developers who went all-in on Vision Pro in 2024 learned a hard lesson about Apple hardware launches; N50's consumer-scale positioning is designed to avoid a repeat.

Timeline: What Happens Next

  • April–November 2026 — Active prototype testing with the four frame designs
  • December 2026 — Production begins for launch-year units
  • Late 2026 — Public reveal (likely at a special Apple event, not WWDC)
  • Spring/Summer 2027 — Consumer launch
  • Developer SDK — Timing unclear; if Apple follows the Vision Pro playbook, expect visionOS-style glasses frameworks announced at WWDC 2027

In the meantime, Meta will ship a Ray-Ban Gen 3 and likely hit the 20-million-units-per-year pace. Google will announce its glasses. And Apple will keep tightening Siri — because if Siri cannot hold its own by summer 2027, no amount of acetate and oval cameras will save the product.

The Bigger Picture

N50 is the clearest signal yet that ambient AI — models that see, listen, and assist continuously — is the next major consumer-hardware front. The iPhone redefined mobile, the Apple Watch redefined wearables, and the AI glasses category (Apple's, Meta's, Google's) may redefine what personal computing looks like when it is not held in your hand.

For now, the iPhone still does most of the work. N50 is a voice-and-camera accessory to the phone, not a replacement for it. But the groundwork — camera APIs, Apple Intelligence, improved Siri, on-device AI acceleration — is all heading toward a world where the glasses are the computer.

Whether Apple gets there before Meta or Google is the question of the next 18 months.

While Apple prepares AI for your face, DevPik keeps AI-powered convenience in your browser — no wearable required. Try our 46+ free developer tools including the full Developer Tools suite and CSS Tools collection. For more on the AI hardware race, see our coverage of Meta Muse Spark, GPT-5.4 Computer Use, Claude Mythos, OpenClaw, and the Claude Advisor Strategy.


Frequently Asked Questions

Is Apple making AI smart glasses?
Yes. According to Bloomberg's Mark Gurman (April 12, 2026), Apple is actively prototyping its first AI smart glasses under the internal codename N50. The company is testing four different frame designs using premium acetate material, with production targeting December 2026 and a consumer launch expected in spring or summer 2027.
When will Apple smart glasses release?
Apple is targeting December 2026 for production start, with a public reveal in late 2026 and consumer launch in spring or summer 2027. This is based on Mark Gurman's Bloomberg reporting from April 12, 2026 — timing is subject to change as the product is still in active prototype testing.
Do Apple's AI glasses have a display?
No. Unlike AR headsets like the Apple Vision Pro, the first generation of Apple AI smart glasses (N50) will not have any display. Apple is focusing on cameras, microphones, speakers, and Siri voice interaction instead — similar to Meta's Ray-Ban glasses. A separate Apple AR glasses product with a display is still rumored for a later release.
What will Apple AI glasses do?
Apple's N50 smart glasses will handle phone calls, notifications from your iPhone, music playback, photo and video capture, and AI queries via Siri. They will include two cameras (one high-resolution, one for computer vision), LED indicator lights, microphones, and speakers — all integrated into lightweight acetate frames.
How much will Apple AI glasses cost?
Apple has not confirmed pricing. Based on the premium acetate construction and positioning versus Meta's $299 Ray-Ban glasses and Apple's $3,499 Vision Pro, most analysts expect N50 to be priced in the $500–$800 range — premium relative to Meta but accessible compared to Vision Pro.
Are Apple glasses better than Meta Ray-Ban?
They will compete directly. Meta Ray-Ban has the market lead (7M units sold in 2025) and lower prices. Apple's N50 will differentiate on premium design (acetate frames, multiple styles), tight iPhone integration, Apple's privacy positioning, and an oval camera layout that is visually distinct from Meta's circular camera design. Whether Siri and Apple Intelligence can match Meta's AI features by 2027 is the open question.
What is the Apple N50?
N50 is the internal Apple codename for its upcoming AI smart glasses, revealed in Bloomberg's reporting on April 12, 2026. The project has been in development for several years and is now in active four-design prototype testing. It is Apple's answer to Meta's Ray-Ban glasses — a display-less, camera-and-voice-driven AI wearable that pairs with iPhone.
Why is John Giannandrea leaving Apple?
John Giannandrea, Apple's former Senior VP of Machine Learning and AI Strategy, announced his retirement in December 2025 following the delayed Siri overhaul. He transitioned to an advisor role in early 2026 and his final week at Apple is mid-April 2026. Craig Federighi has taken over direct AI oversight, with ex-Gemini engineer Amar Subramanya joining as VP of AI.
Will Apple AI glasses replace the iPhone?
No. The N50 glasses are designed as a companion product to iPhone, relying on it for most compute, data, and connectivity. Apple is pairing the glasses tightly with iPhone rather than positioning them as a standalone device. A future generation with on-device processing and display capabilities may change that equation.
