The future of Apple's glasses projects is coming into focus.
Thanks to reporting from multiple sources (especially Bloomberg and The Information), analyst speculation and published patents, we can be reasonably sure that Apple is working on two major face-worn headset or glasses platforms.
The first is a mixed reality headset, and the second is an augmented reality (AR) glasses device.
Let's dismiss the naysayers right from the start. Yes, other companies have already launched glasses, goggles, headsets and other facetop devices of every description, and none has taken the world by storm. It was also true that other companies sold PCs before the Apple I, music players before the iPod, touch-screen phones before the iPhone and tablets before the iPad. Apple's entry into a new platform matters because the company takes ideas that have been in the air, and productized only on the fringes, and turns them into mainstream, culture-changing products.
And I believe Apple's "iGlass" products (we don't yet know the actual branding) will bring smart glasses and 3D holograms to the mainstream, making them a part of everybody's everyday life and work.
Here's everything we know so far about Apple's mixed-reality headset, and also about Apple's AR eyeglasses.
What we know about Apple's mixed reality headset
Apple's first smart glasses product will function like virtual reality (VR) goggles, meaning that you look at screens instead of through glass. The "real world" of your immediate physical environment will be displayed on the screens just like virtual objects will be.
Prototypes have two 8K screens, though rendering both at full resolution at all times is unlikely. It appears that eye-tracking will control which parts of the screens are rendered at full resolution, a technique known as foveated rendering; peripheral images will be low-res.
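The idea behind gaze-contingent rendering is easy to sketch. Here's a minimal illustration (my own, with invented names and thresholds, not anything from Apple's prototypes) of assigning a resolution tier to each screen tile based on its distance from the tracked gaze point:

```swift
import simd

// Hypothetical sketch of foveated rendering: tiles near the gaze point
// render at full resolution; the periphery renders at reduced resolution.
enum RenderQuality {
    case full      // native resolution near the fovea
    case half      // mid-periphery
    case quarter   // far periphery
}

// Normalized gaze coordinates (0...1 across the display),
// as a hypothetical eye-tracking camera might report them.
struct GazeSample {
    var point: SIMD2<Float>
}

// Pick a resolution tier for a screen tile from its distance to the
// gaze point. The 0.1 and 0.3 cutoffs are invented for illustration.
func quality(forTileAt center: SIMD2<Float>, gaze: GazeSample) -> RenderQuality {
    let distance = simd_distance(center, gaze.point)
    switch distance {
    case ..<0.1: return .full
    case ..<0.3: return .half
    default:     return .quarter
    }
}

// Example: with the user looking at the center of the screen,
// a corner tile renders at quarter resolution.
let gaze = GazeSample(point: SIMD2(0.5, 0.5))
print(quality(forTileAt: SIMD2(0.95, 0.05), gaze: gaze))  // quarter
```

The payoff is that the GPU pays full rendering cost only for the small region of the image the fovea can actually resolve.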
Apple is working with a company called Taiwan Semiconductor Manufacturing Co. (TSMC) to develop what has been described as "ultra-advanced" micro OLED displays "less than 1 inch in size" at a secret facility in Taiwan, according to a report in a Japanese business publication. Micro OLEDs should be thinner, lighter and more power-efficient than glass-based displays. The technology is so far from mass production, however, that if Apple's headset ships next year, it won't contain these displays.
The headset also has a screen on the outside, visible to bystanders who are looking at the user.
Prototypes have as many as 14 cameras in total. Some are pointed outward to capture the real world in real time for AR. Some are pointed at the user, possibly for face-mapping, Face ID and eye-gaze detection.
Prototypes have Lidar for real-time mapping of the physical environment.
The headset has haptics.
It's apparently based substantially on the headset developed by Vrvana, which Apple acquired in 2017. Vrvana was known for a dial on the side of its headset for transitioning in and out of VR.
The headset will set a new standard in sleekness. The Information published a low-res mockup of a high-res internal render, which designer Antonio De Rosa reverse-engineered to create high-res versions based on real data.
We don't know the weight of the headset, but observers have noted that the prototype does not feature the top-of-the-head strap associated with heavier headsets currently on the market, leading to speculation that it will be exceptionally lightweight.
Apple hasn't yet figured out how the headset will be controlled. The company is still prototyping and testing a wide range of input devices, including rings, gloves and in-the-air hand gestures.
The price is expected to be $3,000. Apple wants to ship the headset as early as next year.
What we know about Apple's AR eyeglasses
Apple's second major platform is AR glasses designed to be worn all day, every day and everywhere. They'll look as much like ordinary prescription glasses as possible, and the user will look through glass lenses, with virtual objects and information appearing to float holographically in the real world. They will be fittable with prescription lenses.
Apple may turn its AR glasses into health-related, quantified-self devices that replace or augment the role of the Apple Watch. A new patent continuation describes the use of sensors in ordinary-looking prescription glasses that can perform extreme feats of biomedical detection. The patent describes the monitoring of head movements, jaw muscles and movement, mouth opening, respiratory rate, blood pressure, heart rate, heart rate variability, oxygen saturation, skin moisture, body temperature, body posture, blood glucose levels, stress and more. The glasses will also be able to identify the user biometrically, and detect the user's emotions, thoughts and brain function.
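To make just one of those metrics concrete: heart rate variability is typically derived from beat-to-beat intervals with textbook formulas such as RMSSD. Here's a minimal sketch of that calculation (a standard formula, not anything specific to Apple's patent):

```swift
import Foundation

// RMSSD (root mean square of successive differences): a standard
// heart-rate-variability metric computed from beat-to-beat (RR) intervals.
// Input: RR intervals in milliseconds; returns nil if there are too few beats.
func rmssd(rrIntervalsMs: [Double]) -> Double? {
    guard rrIntervalsMs.count >= 2 else { return nil }
    // Differences between each interval and the one before it.
    let diffs = zip(rrIntervalsMs.dropFirst(), rrIntervalsMs)
        .map { pair in pair.0 - pair.1 }
    let meanSquare = diffs.map { $0 * $0 }.reduce(0, +) / Double(diffs.count)
    return meanSquare.squareRoot()
}

// Example: intervals from a resting heart rate of roughly 70 beats per minute.
let intervals = [857.0, 860.0, 845.0, 872.0, 858.0]
print(rmssd(rrIntervalsMs: intervals) ?? "insufficient data")
```

Whatever sensors end up in the glasses, the heavy lifting is signal processing like this, which Apple already does at scale on the Apple Watch.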
Apple's AR eyeglasses are unlikely to ship before 2025.
How we know Apple is committed to augmented reality glasses
Apple has been inventing, designing and patenting technologies and concepts for years, and holds hundreds of patents in this space. The company has also been making acquisitions and acqui-hires, hoarding intellectual property and expertise.
The company has also been introducing features and components into iPhones that can be considered precursors to mixed- and augmented-reality glasses, such as Lidar and AR support for developers.
Apple CEO Tim Cook has made it clear that Apple is obsessed with AR, and doesn't care that much about VR.
Apple claims to already have the largest AR platform in the world, and brags about its technology in the AR space.
How business will use Apple's mixed- and augmented-reality eyewear
My prediction is that the "killer app" for all of Apple's glasses and headsets will be holographic virtual meetings, conversations and chats. Here's how it would work. You've got a meeting, so you put on your Apple headset or, five years from now, a notification appears floating in space while you're wearing your prescription Apple iGlass product. The glasses identify you biometrically, and the other meeting participants pop up one by one as holographic avatars in your physical space. The avatars will be Memoji-type cartoons that reflect the actual body language, gaze, facial expressions and mouth movements of the real people you're talking with in real time. You'll be able to make eye contact with these avatars. When someone is talking, all the avatars, and you, will look toward the speaker, and you'll be able to look around at the holograms to see where they are looking as well. The sound will appear to come from the direction of the avatar that's speaking.
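That directional-sound detail is simpler than it sounds. Here's a minimal sketch (my own illustration with invented names, not Apple's implementation) of deriving a stereo pan value from a speaking avatar's position relative to the listener; a real system would use full 3D spatial audio rather than simple panning:

```swift
import simd

// Hypothetical sketch: compute a stereo pan (-1 = hard left, +1 = hard right)
// for a speaking avatar, given the listener's position and orientation.
struct Listener {
    var position: SIMD3<Float>
    var right: SIMD3<Float>  // unit vector pointing to the listener's right
}

func pan(for avatarPosition: SIMD3<Float>, listener: Listener) -> Float {
    // Direction from listener to avatar, projected onto the listener's
    // right-hand axis: positive means the voice comes from the right.
    let toAvatar = simd_normalize(avatarPosition - listener.position)
    let side = simd_dot(toAvatar, simd_normalize(listener.right))
    return max(-1, min(1, side))
}

// Example: an avatar one meter to the listener's right pans fully right.
let me = Listener(position: .zero, right: SIMD3(1, 0, 0))
print(pan(for: SIMD3(1, 0, 0), listener: me))  // 1.0
```

On Apple platforms, the equivalent work would presumably be handled by the system's spatial-audio stack rather than hand-rolled math like this.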
Besides meetings, Apple's "iGlasses" will be usable in all the ways that industrial AR and VR are currently used: training, factory-floor reference, in-the-field reference and all the rest.
Why Apple's "iGlasses" matter
A radical, transformative new user interface only comes along once in a generation. The Apple version of this new world will arrive roughly 20 years after Apple mainstreamed the "multi-touch" user interface on smartphones, which, needless to say, changed everything.
I believe that within 10 years or so, AR worn all day via ordinary-looking glasses will replace the smartphone as the main "computer" and user interface to apps and the internet. Instead of ignoring our immediate surroundings to focus narrowly on a tiny smartphone screen, we'll see events from the wider world enter our immediate surroundings. Objects will inform us about everything we need to know. Instructions will pop up over appliances. Arrows and turn-by-turn directions will hover over streets and sidewalks. Biographical information will pop up when we're talking to someone.
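The geometry behind those floating labels is straightforward. Here's a minimal sketch (my own illustration with invented names and numbers; in a real app, frameworks like ARKit handle this) of anchoring a label to a 3D position and projecting it onto the 2D screen with a pinhole-camera model:

```swift
import simd

// Hypothetical sketch: a text label anchored at a 3D point in the camera's
// coordinate frame, projected each frame onto 2D screen coordinates.
struct Annotation {
    let text: String
    let worldPosition: SIMD3<Float>  // meters; camera looks down -z
}

// Simple pinhole projection. focalLength is in pixels;
// returns nil for points behind the camera.
func projectToScreen(_ p: SIMD3<Float>,
                     focalLength: Float,
                     screenCenter: SIMD2<Float>) -> SIMD2<Float>? {
    guard p.z < 0 else { return nil }
    let depth = -p.z
    return SIMD2(screenCenter.x + focalLength * p.x / depth,
                 screenCenter.y - focalLength * p.y / depth)
}

// Example: a label floating half a meter above an appliance two meters ahead.
let oven = Annotation(text: "Preheat done in 12 min", worldPosition: SIMD3(0, 0.5, -2))
if let screenPoint = projectToScreen(oven.worldPosition,
                                     focalLength: 1_000,
                                     screenCenter: SIMD2(960, 540)) {
    print("Draw \"\(oven.text)\" at \(screenPoint)")
}
```

Re-running that projection every frame as your head moves is what makes a label appear pinned to the appliance rather than to the screen.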
That's why every nugget of news around Apple's productization of this platform is monumental. It's great to see Apple's "iGlasses" coming into focus.