AR/VR Headset: Everything We Know So Far

May 18, 2023

Apple has been experimenting with virtual reality and augmented reality technologies for almost 20 years based on patent filings, but with virtual and augmented reality exploding in popularity following the launch of ARKit, Apple's dabbling has grown more serious and is expected to lead to the first wearable AR/VR Apple device in 2023.

There is a research unit at Apple with hundreds of employees working on AR and VR and exploring ways the emerging technologies could be used in future Apple products. AR/VR hiring has ramped up over the last several years, and Apple has acquired multiple AR/VR companies as it furthers its work in the AR/VR space.

Apple is developing at least two AR/VR devices that include a mixed reality headset set to be released in 2023 followed by a more affordable version coming at a later date. Apple's headset has faced numerous delays, and while the company originally wanted to launch it in 2022, that wasn't possible. A 2023 launch is expected, with Apple planning to introduce the wearable at WWDC. The cheaper version could come in either 2024 or 2025, and there could be multiple second-generation models available.

The AR/VR headset will be a standalone device with Apple silicon chips that will put performance on par with Apple's Macs. It will have two chips inside that will allow it to perform complex tasks and handle the virtual reality capabilities, plus it will be able to operate independently of an iPhone or a Mac. Pairing will be possible to use the iPhone as a keyboard and to transfer data, but it won't be required.

Design-wise, the headset will look similar to other headsets on the market like the Facebook Oculus Quest, but it will feature a sleeker look and a lightweight build to ensure comfort. Apple will cut down on the weight of the headset by using an external battery pack that is worn at the waist rather than an integrated battery built into the device.

Two high-resolution 4K micro OLED displays with up to 3,000 pixels per inch will provide an immersive viewing experience. Apple is building more than a dozen cameras into the headset to track hand movements and gestures, which will be one method of control along with eye tracking. Users will be able to look at an item on the display to select it, using hand gestures to then interact with the item.

Multiple 3D-sensing modules will be included for detecting hand gestures and objects that are around the wearer, and it will support voice control, skin detection, spatial detection, and expression detection. Built-in speakers will be included, but better spatial audio will require paired AirPods.

Apple is designing an App Store for the headset, and content will focus on gaming, streaming video, and video conferencing. It will run "xrOS," a new operating system designed specifically for the headset. A FaceTime experience will use the cameras to project a person's full face and body into a virtual environment for one-on-one calls, and there will also be other apps like Calendar, Mail, Safari, and more.

The headset will have an iOS-like interface that will be immediately familiar to iPhone and iPad users, with a dedicated Home Screen that houses an array of apps and widgets. It will also be able to be used as a display for a connected Mac, with the Mac's mouse and keyboard used as input devices.

As the AR/VR headset is an entirely new product category adopting cutting-edge technology, it's not going to be cheap. Rumors suggest it could be priced somewhere around $3,000.

The headset will focus on VR, but it will also have augmented reality capabilities. Users will be able to swap from VR to AR through a Digital Crown-like control mechanism, with the AR features using external cameras to display the user's surroundings inside the headset.

Apple's first headset will support both AR and VR capabilities, technology that's usually referred to as "mixed reality." Augmented reality (AR) and virtual reality (VR) are similar technologies, but their potential applications vary significantly.

Virtual reality refers to a full immersive experience in a virtual world, while augmented reality refers to a modified view of the real world. With a VR experience, the real world is largely shut out to focus on an entirely virtual experience, but augmented reality overlays virtual elements on your real-world view.

Google Glass, a Google product that's now largely defunct, is an example of a head-worn augmented reality device, while the Oculus and PlayStation VR are examples of head-worn virtual reality devices. Apple is working on both of these technologies, but the initial product will be more similar to the Oculus than Google Glass.

Virtual reality is singularly focused on immersive content consumption because it makes the wearer feel as if they're actually experiencing what's going on in the simulated world through visual, tactile, and audio feedback. Virtual reality is linked to gaming right now, but it also has the potential to recreate real world experiences for educational or training purposes.

Augmented reality doesn't hinge on immersive content, and while it's less exciting because it augments reality instead of replacing it, it has a wider range of potential applications and is the technology that Apple ultimately seems to be most interested in. Apple's headset will support both AR and VR, which is called mixed reality, and it's something we've seen in products like Microsoft's HoloLens. Users will be able to swap between virtual and augmented reality using a Digital Crown on the side of the device.

Mixed reality merges real-world content and virtual content to produce new environments where physical and digital objects can be viewed and interacted with together. In practice, we don't quite yet know what exact kind of experience Apple's headset will provide, but we can count on immersive games, more interactive FaceTime and chat experiences, and new learning tools.

Trademark filings have suggested that Apple could call the mixed reality headset that's in the works the "Reality Pro" or the "Reality One." Apple has trademarked those names in addition to "Reality Processor," so it sounds like we could get reality-themed naming.

Apple has trademarked these names in the United States, United Kingdom, Canada, Australia, New Zealand, Saudi Arabia, Costa Rica, and Uruguay using shell companies.

Apple appears to have filed trademarks for xrOS, xrProOS, realityproOS, and realOS in recent weeks. Apple is expected to use xrOS for the AR/VR headset operating system name, but the others could be just-in-case placeholders.

Apple's AR/VR headset will look similar to some other VR headsets on the market, featuring a design that's not too far off from the Facebook Oculus Quest virtual reality headset. Several of the design details have been revealed in rumors, and The Information even saw a prototype so we have a good idea of what to expect.

The headset will use aluminum, glass, and carbon fiber to keep the weight and the profile low. It has been described as having a "sleek, curved visor attached to the face by a mesh material." The renders below from designer Ian Zelbo are based on these headset descriptions.

A band in the back that's made of a material similar to an Apple Watch band will hold the headset on the wearer's head, and a soft mesh will make the fit comfortable against the front of the face. Early rumors suggested that headbands would be swappable, but new information indicates that is no longer the case. The headband has been described as being made of a soft material with two spots at the temples that feature the left and right speaker. There is a soft, removable cover that attaches to the back of the headset to make it more comfortable to wear.

Apple was allegedly working on one headband with spatial audio technology like the AirPods Pro for a surround sound-like experience and another to provide additional battery life while on the go, but these ideas may have been scrapped. Rumors now suggest that Apple is creating two different headbands: one for consumers made from materials like the Apple Watch band with built-in speakers, and a second targeted at developers.

The design of the headset is meant to block out peripheral vision to prevent light from leaking into the wearer's field of view, and there will be an outward-facing display for showing graphics to others.

March 2022 prototypes were said to weigh around 200 to 300 grams, but Apple is aiming to reduce the final weight to 100 to 200 grams if technical problems can be solved, which would make the headset lighter than existing VR devices.

To swap between virtual reality and the physical world, Apple is including a small, Digital Crown-like dial on the right side of the headset. It will not include haptic feedback, but it will be able to be turned to transition between modes.

The first AR/VR headset component leaked in March, giving us a look at the device's internals. The ribbon cable is said to be designed for the headset, and it does not appear to be similar to any cable used in a prior Apple product.

The cable appears to contour around a user's eyes, and could be used to connect display components to a board.

Rumors suggest the AR/VR headset will have two to three displays. There will be two high-resolution 4K micro OLED displays with up to 3,000 pixels per inch, for an 8K total resolution. Sony is expected to supply the display modules that Apple will use, though Apple may also use some OLED displays from Samsung.

Micro OLED displays are built directly onto chip wafers rather than a glass substrate, which results in displays that are thinner, smaller, and more power efficient. They allow for pixel sizes in the range of four to 20 micrometers, compared to 40 to 300 micrometers with standard OLED panels, plus they have faster response times measured in microseconds, making them ideal for augmented reality (AR) and virtual reality (VR) applications.

According to display analyst Ross Young, the AR/VR headset displays will measure in at 1.41 inches diagonally, featuring 5,000 nits of brightness and 4,000 pixels per inch. With 5,000 nits of brightness, the AR/VR headset will offer support for HDR content, and it will far exceed the capabilities of other headsets on the market.

The AR/VR headset will also include an outer, external display that will be a standard OLED display supplied by LG Display. The standard OLED display will be a simple exterior indicator display that does not require the higher-quality micro-OLED technology used for the inner displays.

The outward-facing display will allegedly be able to show the facial expressions of the headset's wearer to the people around them, to cut down on the sense of isolation felt when using the device. The display will have an ultra-low refresh rate and reduced power consumption to keep it from draining battery.

Apple is using "Pancake" lenses that will allow for a thin and lightweight design. Pancake lenses are more expensive than the Fresnel lens technology used for other VR headsets, but will result in a much thinner device.

Because of the close fit of the headset to the face, users will not be able to wear glasses, so there will be an option for prescription lenses to be inserted over the screens. The prescription lenses will be able to attach to the headset using magnets.

The AR/VR headset will feature a 120-degree field of view, much like the Valve Index, and it will be wider than the 106-degree field of view of the Meta Quest Pro.

Further, Apple will use small motors to adjust the internal lenses to match the wearer's interpupillary distance, providing the largest field of view possible for each individual.

Apple is rumored to be planning to use a standalone battery pack for the AR/VR headset, which would allow it to cut down on weight. The Information suggests that the battery would be worn on the waist and connected to the headset through a MagSafe-like power cable to the headband.

The battery is approximately the size of two iPhone 14 Pro Max models stacked on top of one another, and it will power the headset for around two hours. An external battery will allow users to swap one battery and charge another to use the device for a longer period of time.

To connect the battery pack to the headset, Apple plans to use a proprietary charging cable with a round tip that rotates to lock in to the headset so the cable does not come out during use. The cable itself will connect to the battery pack, and it will not be removable.

The cable that connects the headset to the battery pack will look similar to an Apple Watch charging puck, and it will attach to the headset's left temple. The headset will also include a USB-C cable for data transfer purposes.

Apple will reportedly adopt a 96W power adapter to charge the battery pack.

The headset will feature more than a dozen optical cameras for tracking hand movements, mapping the environment, capturing facial features and body movements, and projecting visual experiences. One of the headset's marquee features is said to be lifelike avatars that have accurate facial features captured by the included cameras.

Each eye will be tracked by at least one camera, letting the headset accurately show the user's gaze on an avatar. With precise eye-tracking, the headset will be able to perform foveated rendering to conserve power by only rendering imagery in full resolution directly where the user is looking.
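
To illustrate the idea only (this is not Apple's implementation), foveated rendering amounts to spending full resolution on the region around the gaze point and progressively less everywhere else. The Swift sketch below shows one way to express that; the GazeSample type, the tile model, and the distance thresholds are all hypothetical.

```swift
import simd

// Hypothetical gaze sample in normalized display coordinates (0...1 on each axis).
struct GazeSample {
    let point: SIMD2<Float>
}

// Illustrative quality tiers for screen tiles.
enum RenderQuality {
    case full      // foveal region, rendered at native resolution
    case half      // near periphery, rendered at half resolution
    case quarter   // far periphery, rendered at quarter resolution
}

// Choose a quality tier for a tile based on its distance from the gaze point.
// The radii are made-up thresholds, not values Apple has disclosed.
func quality(forTileAt tileCenter: SIMD2<Float>, gaze: GazeSample) -> RenderQuality {
    let distance = simd_distance(tileCenter, gaze.point)
    switch distance {
    case ..<0.1: return .full
    case ..<0.3: return .half
    default:     return .quarter
    }
}
```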

Eye and hand tracking will be the main method of control for the AR/VR headset, with Apple using cameras to monitor the eyes and the hands. The wearer will be able to control the headset by looking at an on-screen item to select it, for example, then using a hand gesture like a pinch to activate the item on the screen.
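
As a rough sketch of how a look-to-select, pinch-to-activate interaction model could be wired together, the Swift code below highlights whichever element the gaze point falls on and activates it when a pinch is detected. The Element type and GazePinchController are invented here for illustration and are not an Apple API.

```swift
import CoreGraphics

// Hypothetical on-screen element that the headset can highlight and activate.
struct Element {
    let id: Int
    let frame: CGRect
    var isHighlighted = false
}

// Minimal controller: gaze highlights an element, a pinch activates it.
final class GazePinchController {
    private(set) var elements: [Element]
    private var gazedElementID: Int?

    init(elements: [Element]) {
        self.elements = elements
    }

    // Called whenever the eye-tracking pipeline reports a new gaze position.
    func updateGaze(at point: CGPoint) {
        gazedElementID = elements.first { $0.frame.contains(point) }?.id
        for index in elements.indices {
            elements[index].isHighlighted = (elements[index].id == gazedElementID)
        }
    }

    // Called when the hand-tracking pipeline detects a pinch gesture.
    func handlePinch(activate: (Int) -> Void) {
        if let id = gazedElementID {
            activate(id)
        }
    }
}
```

In practice, updateGaze(at:) would be fed by the eye-tracking cameras and handlePinch by the hand-tracking cameras.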

Eight camera modules will reportedly be used for see-through augmented reality experiences for the user, while another six modules will be used for "innovative biometrics." One camera will also be available for environmental detection. Rumors suggest the headset will be able to map surfaces, edges, and dimensions of rooms with accuracy, employing short and long-range LiDAR scanners to do so.

For privacy and security, the AR/VR headset will integrate an iris scanner that can read the pattern of the user's eye, allowing for an iris scan to be used in lieu of a password and for payment authentication.

Iris scanning on the AR/VR headset will be akin to Face ID and Touch ID on the iPhone, iPad, and Mac. It could allow two people to use the same headset, and it is a feature that is not available on competing headsets like Meta's new Quest Pro.

There will be no wearable control device for the AR/VR headset, with Apple instead relying on hand gestures that are detected by the myriad cameras on the device.

Typing, for example, will be done using an "in-air" method through eye movements and hand gestures. Users will also be able to pair an iPhone to use the iPhone's keyboard with the device, similar to how an iPhone can be used with an Apple TV.

Two Mac-level M2 processors will be included in the AR/VR headset for unprecedented computing power in a wearable device, with Apple using the same chips that debuted in the MacBook Air. The chips will include a main SoC with CPU, GPU, memory, and dedicated Image Signal Processor.

The chips are fabricated on the 5-nanometer process and will not be made with the 3nm process that Apple is planning to use for devices that are coming later in 2023. The headset will not be reliant on an iPhone or Mac for processing power, and it will feature independent power and storage.

To avoid unacceptable latency, Apple created its own custom streaming codec to allow the chips to communicate, and the dedicated ISP is able to translate distorted images captured by the external cameras into a faithful video representation of the user's surroundings with low latency. The headset will also include an H2 chip to allow the headset to form an ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. Apple is even considering making AirPods mandatory for the headset for the best audio experience, as third-party earbuds do not work well because of latency problems. There is no 3.5mm headphone jack on the device.

Apple has completed work on the SoCs for the AR/VR headset. They are optimized for wireless data transmission, compressing and decompressing video, and power efficiency for maximum battery life, though they do not have a neural engine like some of Apple's other chips.

Apple will call the software that runs on its AR/VR headset "xrOS," which is meant to stand for "extended reality." Extended reality represents both the augmented and virtual reality functions the headset will support.

The name has been confirmed by internal Apple sources, and Apple has been trademarking "xrOS" through a hidden shell company. An Apple trademark filing for "xrOS" has been found in New Zealand, hinting that Apple will indeed use the xrOS name for the operating system that runs on the headset.

Apple internally called the headset software "Reality OS" or "rOS" when it was in development, but as launch approaches, it has decided to update the public name to the less generic "xrOS" name.

Apple is working on a range of apps for the AR/VR headset, and will be focusing on sports, streaming video content, gaming, fitness, wellness, and collaboration. Apple is creating an App Store for the headset.

xrOS will include iOS apps like Safari, Photos, Messages, Maps, Apple TV+, Apple Music, Podcasts, and Calendar, as well as a FaceTime app customized for the headset.

Apple recently announced Final Cut Pro and Logic Pro apps for the iPad, and these apps could also eventually come to the headset to allow audio and video content creators to edit directly on the headset.

Apple is planning to adapt iPad apps for the headset, and users will be able to access existing App Store content through the 3D interface of the device. Apple-designed apps such as Safari, Calendar, Contacts, Home, Files, Messages, Notes, Photos, Music, Reminders, and other built-in apps will be optimized for the device. Multiple apps will be able to be run at one time, and there will be a geolocation aspect for swapping between apps when the user is in different rooms.

A Fitness+ app will allow users to exercise while watching the Fitness+ instructor in a virtual reality setting, and there will be a Health app for guiding users through meditations with graphics, sounds, and voice-overs.

With Apple's deeper push into sports, there will be a focus on immersive viewing experiences for MLB and MLS content, plus there will be a dedicated TV app for watching videos in virtual reality environments.

Apple is teaming up with companies like Disney and Dolby for video content, plus it is updating Apple TV+ shows and movies to work with the device. Apple is said to want to provide users with the experience of looking at a giant screen in a unique environment like the desert or space.

A dedicated FaceTime experience will allow for one-on-one chats using realistic avatars that feature a user's actual face and body. This functionality is processor intensive, which is why it is limited to person-to-person chats, with multi-person chats instead using less detailed icons like Memoji.

A Camera app will be able to take images from the headset's cameras, and a version of Freeform will be adapted to a 3D interface for working on collaborative projects with others.

Apple has been working with a select number of gaming developers to help them update their existing content for mixed reality, and Apple will also have a robust set of tools available for creating AR/VR experiences.

Siri will be able to be used for text input, as will an iPhone, iPad, or Mac keyboard. Apple is working on an in-air typing feature that is expected to be a bit rudimentary at launch, with Apple to make notable improvements over time.

Apple is developing software tools that will allow developers and customers to create augmented reality apps for the AR/VR headset.

Customers may be able to create and release AR apps and experiences using Siri voice commands, even with no coding experience. A report on the feature compares it to Minecraft or Roblox, games that have tools available to allow players to create their own experiences.

Apple's AR/VR headset will not need an iPhone to function and it will be able to be used on its own. No iPhone is required for setup and data can be downloaded through iCloud, but there will be an option to pair an iPhone at setup for data transfer purposes.

An iPhone will also be able to be connected to serve as a keyboard for the headset when necessary, though a gesture-based typing system is included.

Apple's upcoming mixed reality headset will offer WiFi 6E support, which is the latest WiFi specification. Apple is said to be planning to implement WiFi 6E to provide a high-end, immersive experience with solid wireless connectivity. WiFi 6E has all of the benefits of WiFi 6 but adds 6GHz spectrum in addition to the 2.4GHz and 5GHz bands for increased bandwidth and less interference between devices.

Apple's AR/VR headset could be facing yet another delay, as Apple is facing development issues that need to be overcome. According to Bloomberg, Apple has pushed back the release plans for the headset several times now. Apple originally planned to launch the headset in 2021 and ship it in 2022, and then delayed that until the 2022 WWDC, and after that, further delayed the device until 2023.

At least one of the chips designed for the headset is on par with the M1 Pro from the latest MacBook Pro models, and the thermal demands of the chip are causing problems with heat dissipation. With the delay, Apple is planning to focus on the headset during the 2023 Worldwide Developers Conference. Apple will encourage developers to start building augmented and virtual reality apps for the "rOS" App Store.

Apple has been working on augmented and virtual reality technologies for a long time now, and the company has a huge team of employees developing headworn AR/VR products. It's believed that in the future, AR/VR devices will replace the iPhone, something that could happen as soon as 10 years from now.

The iPhone is Apple's most profitable and important product, so the expectation that AR/VR headset technology will eventually replace the iPhone gives us some idea of how key it is to Apple's future.

Rumors suggest that the original headset design included a fan and powerful processors, but the device was too heavy. Early designs also would have required the headset to use the processing power of a connected iPhone or Mac, but Apple has changed the design and added high-end chips to the headset itself.

Early prototypes were described as having an 8K display for each eye, and there was one rumor that said the headset would connect to a "dedicated box" using a high-speed short-range wireless technology called 60GHz WiGig. The box would be powered by a custom 5-nanometer Apple processor that's "more powerful than anything currently available." The box apparently resembles a PC tower, but it "won't be an actual Mac computer."

Internal disagreements shaped and changed Apple's goals for its AR headset over time, and rumors suggest that Apple did not go with the "box" design because former Apple designer Jony Ive did not want to sell a device that would require a separate, stationary addition for full functionality.

Ive wanted a headset with less powerful technology that could be embedded directly in the device, but the leader of the AR/VR team, Mike Rockwell, wanted the more powerful device. It was a standoff that lasted for months, and Tim Cook ultimately sided with Ive, changing the direction of Apple's headset design.

Some Apple employees are said to be concerned about the usefulness of the AR/VR headset and its high price point. There are worries that it is a "solution in search of a problem," with some employees defecting from the project due to doubts about its potential.

According to one leaker, a tester within Apple who has used the headset said that Apple made a huge development leap in recent months, and that they were "blown away" by the headset's capabilities. "The leap they've made since [late last year] is giant," said the source. "I was so skeptical; now I'm blown away in a 'take my money' kind of way."

On the topic of AR/VR, Tim Cook recently commented that it could "greatly enhance people's communication" and could "empower people to achieve things they couldn't achieve before." He also suggested that it would be good for collaborating and coming up with ideas, and that it could accelerate creativity.

Cook said that there have been "loads of skeptics" with everything Apple has done, and he implied that Apple would "control the primary technology" in the future with the AR/VR headset.

Apple is expected to debut the headset at the Worldwide Developers Conference, offering it for sale later in the year. This year's 2023 WWDC keynote event is set to take place on Monday, June 5.

We are not expecting to see the headset available for sale until late 2023. Rumors suggest that Apple will begin mass production on the device in October, with units to be available in time for the December holidays.

Pricing on the headset could start somewhere around $3,000, which will make it much more expensive than an iPhone. To start with, it won't be aimed at general consumers, but will instead be positioned as a device for developers, content creators, and professionals.

Apple's AR/VR headset reportedly costs around $1,500 to manufacture alone, and the component costs will translate to a higher device price. The micro-OLED displays are the most expensive component inside the AR/VR headset at around $280 to $320. The camera modules cost around $160, and the main processor and image signal processor are priced at around $120 to $140.

Apple expects to sell just one headset per day per retail store, and it has told suppliers that it expects sales of seven to 10 million units during the first year of availability.

Apple is already working on a version of the AR/VR headset that will be priced more affordably. The first AR/VR headset will cost somewhere around $3,000, but Apple wants to develop a model that is closer in price to the iPhone.

Apple engineers plan to use more affordable components to bring the price down, but the lower-priced headset will have the same general AR/VR "mixed reality" functionality of the headset launching in 2023. The chips in the device will be on par with the iPhone rather than the Mac-level chips used in the first AR/VR headset, and Apple may also use lower resolution internal displays and cheaper materials.

As of right now, development on the more affordable headset is in the early stages and Apple has not created a working prototype. Such a headset could launch in 2025, and Foxconn is already working on it.

According to Apple analyst Ming-Chi Kuo, the second-generation AR/VR headset will be available in both high-end and low-end models, similar to the iPhone. Apple is expected to offer one more affordable model and one that is a follow-up to the higher-end and more expensive first-generation model.

Apple's work on virtual and augmented reality dates back multiple years, but rumors picked up starting in March of 2015 when news hit that Apple had a small team of people working on augmented reality. In 2015 and into early 2016, Apple's team grew as the company hired employees with expertise in AR/VR technology and made multiple related acquisitions.

Apple's AR/VR team includes several hundred engineers from across Apple, all of whom have expertise in virtual and augmented reality. The team works across office parks in both Cupertino and Sunnyvale, and Apple is exploring several hardware and software projects under the code name "T288."

Apple's augmented reality team combines "the strengths of its hardware and software veterans," and is led by Mike Rockwell, who came from Dolby. Former employees of companies like Oculus, Amazon (from the VR team), Lytro, Microsoft, 3D animation company Weta Digital, and Lucasfilm are working on AR at Apple.

Former Apple hardware engineering chief Dan Riccio in January 2021 transitioned to a new role where he is overseeing Apple's work on an AR/VR headset. The project has faced development challenges, and Apple execs believe that Riccio's focus may help.

Apple software executive Kim Vorrath is also on the augmented reality team, and she has been described as a "powerful force" making sure employees meet deadlines while also sussing out bugs.

Apple has hired a number of notable AR/VR and AI experts, including computer science professor Doug Bowman, former Magic Leap engineer Zeyu Li, former Oculus research scientist Yury Petrov, AR expert and former NASA employee Jeff Norris, VR app expert Sterling Crispin, and VR camera maker Arthur van Hoff.

Many members of Apple's AR/VR team may have joined the company through acquisitions. Since 2015, Apple has purchased several companies that created AR/VR-related products, and some of its AR/VR acquisitions date back even further.

Akonia Holographics

Apple in August 2018 bought Akonia Holographics, a startup that makes lenses for augmented reality glasses. Akonia Holographics advertises the "world's first commercially available volume holographic reflective and waveguide optics for transparent display elements in smart glasses."

The displays that it makes are said to use the company's HoloMirror technology for "ultra-clear, full-color performance" to enable the "thinnest, lightest head worn displays in the world."

Vrvana

In November of 2017, Apple purchased Vrvana, a company that developed a mixed reality headset called Totem. Totem, which was never released to the public, was designed to combine both augmented and virtual reality technologies in a single headset, merging full VR capabilities with pass-through cameras to enable screen-based augmented reality features.

Totem essentially used a set of cameras to project real-world images into its built-in 1440p OLED display, a somewhat unique approach that set it apart from competing products like Microsoft's HoloLens, which uses a transparent display to combine virtual and augmented reality. Apple could be planning to use some of Totem's technology in a future product.

PrimeSense

Apple purchased Israeli-based 3D body sensing firm PrimeSense in 2013, sparking speculation that motion-based capabilities would be implemented into the Apple TV. PrimeSense's 3D depth technology and motion sensing capabilities were used in Microsoft's initial Kinect platform.

PrimeSense used near-IR light to project an invisible light into a room or a scene, which is then read by a CMOS image sensor to create a virtual image of an object or person. This enables motion-based controls for software interfaces, but it's also able to do things like measure virtual objects and provide relative distances or sizes, useful for augmented reality applications like interactive gaming, indoor mapping, and more. PrimeSense technology can also create highly accurate 360 degree scans of people and objects, potentially useful for virtual reality applications.
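
To give a rough sense of how a depth sensor's output becomes usable 3D data, the sketch below back-projects a depth image into camera-space points using a standard pinhole-camera model. The Intrinsics struct and the flat depth-array layout are assumptions made for illustration, not details of PrimeSense's actual hardware or software.

```swift
import simd

// Assumed pinhole-camera intrinsics: focal lengths and principal point, in pixels.
struct Intrinsics {
    let fx: Float, fy: Float, cx: Float, cy: Float
}

// Back-project a row-major depth map (meters) into 3D points in the camera frame.
// Pixels with zero depth (no infrared return) are skipped.
func pointCloud(fromDepth depth: [Float], width: Int, height: Int,
                intrinsics k: Intrinsics) -> [SIMD3<Float>] {
    var points: [SIMD3<Float>] = []
    points.reserveCapacity(depth.count)
    for v in 0..<height {
        for u in 0..<width {
            let z = depth[v * width + u]
            guard z > 0 else { continue }
            let x = (Float(u) - k.cx) * z / k.fx
            let y = (Float(v) - k.cy) * z / k.fy
            points.append(SIMD3<Float>(x, y, z))
        }
    }
    return points
}
```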

Metaio

Apple acquired augmented reality startup Metaio in May of 2015. Metaio built a product called the Metaio Creator, which could be used to create augmented reality scenarios in just a few minutes. Prior to being purchased by Apple, Metaio's software was used by companies like Ferrari, who created an augmented reality showroom.

Metaio technology was also used in Berlin to allow people visiting the site of the Berlin Wall to use a smartphone or tablet to see what the area looked like when the Berlin Wall was still standing. Metaio's technology is one that could potentially be used to implement augmented reality capabilities into Apple apps like Maps.

Faceshift

Apple acquired Faceshift in August of 2015, marking its second augmented reality purchase in 2015. Before being acquired by Apple, Faceshift worked with game and animation studios on technology designed to quickly and accurately capture facial expressions using 3D sensors, transforming them into animated faces in real time. Faceshift was also working on a consumer-oriented product that would allow people to morph their faces into cartoon or monster faces in real time in Skype.

Faceshift's technology has a wide range of possible use cases, and Apple appears to be using the feature to power Animoji in iPhones equipped with the front-facing TrueDepth camera system.

Emotient

Emotient, a company that built tools for facial expression analysis, was acquired by Apple in January of 2016. Emotient's technology uses artificial intelligence and machine learning to read human emotion, features that have been used in the real world by advertisers to determine emotional reactions to advertisements.

There are dozens of things Apple could do with Emotient, ranging from better facial detection in the Photos app to analyzing customer feelings in Apple retail stores to unlocking iOS devices, but it also has potential AR/VR uses. Like Faceshift, Emotient's technology could be used to analyze and transform facial expressions for the creation of virtual avatars, useful for social media purposes and games. Emotient technology was likely used for Animoji.

Flyby Media

Purchased in early 2016, Flyby Media is another company that worked on augmented reality. Flyby created an app that worked with Google's 3D sensor-equipped "Project Tango" smartphone, allowing messages to be attached to real world objects and viewed by others with one of Google's devices.

For example, a person could "scan" a landmark like San Francisco's Golden Gate Bridge and write a message attached to it. A person visiting the bridge later would then be able to scan the bridge with the Flyby app to see the message. The Flyby app likely drew the attention of Apple because it was able to recognize and understand different objects that were scanned, technology that could be used by Apple in a number of ways in apps like Photos and Maps.

RealFace

In February of 2017, Apple purchased RealFace, a cybersecurity and machine learning company that specializes in facial recognition technology, which could potentially be used for future augmented reality features.

RealFace developed facial recognition technology integrating artificial intelligence for frictionless face recognition. RealFace technology was likely employed in the iPhone X, Apple's first smartphone with facial recognition capabilities in the form of Face ID.

NextVR

Apple in May 2020 acquired NextVR, a California-based company that combined virtual reality with sports, music, and entertainment, offering VR experiences for watching live events on VR headsets from PlayStation, HTC, Oculus, Google, Microsoft, and other manufacturers.

Spaces

Apple in August 2020 purchased VR startup Spaces, a company that designed virtual reality experiences that people could experience in malls and other locations, such as "Terminator Salvation: Fight for the Future." Spaces also created virtual reality experiences for video communication apps like Zoom, which is something that Apple could potentially incorporate into a future AR/VR product.

Valve

According to Taiwanese site DigiTimes, Apple is partnering with game developer Valve for its rumored AR headset. Valve released its first VR headset, Valve Index, in April 2019.

Valve previously worked with Apple to bring native VR headset support to macOS High Sierra, leveraging the eGPU support with a Mac version of the SteamVR software.

Apple has filed multiple patents that relate directly to a virtual reality headset, all dating back several years. While technology has likely advanced somewhat beyond these, they provide an interesting look at the ideas Apple has explored in the past.

A 2008 patent application covered a fairly basic "personal display system" designed to mimic the experience of being in a movie theater when watching video.

A second patent described a "Head Mounted Display System" with a "laser engine" that projected images onto a clear glass display worn over the eyes, similar to glasses. In this configuration, the headset connected to a handheld video player such as an iPod to provide processing power.

A third patent originally filed for in 2008 was similar in design, covering a goggle-like video headset designed to let users watch movies and other content. It outlined two adjustable optical modules lined up with the user's eyes, which could provide vision correction and allow for the viewing of 3D content. Apple described this as offering a personal media viewing experience.

A fourth patent from 2008 covered a video headset frame similar to the Google Glass, which would allow a user to slide their iPhone or iPod into the headset to provide video. The headset was described as an augmented reality product that would let users do things like watch a video or check email while keeping an eye on their surroundings.

Beyond headset-related patents, Apple has also filed for patents describing other ways virtual and augmented reality features could be implemented into its devices. A 2009 patent application, for example, covered camera-equipped 3D displays that would shift in perspective based on a user's relative position.

Such a display would detect head movement, allowing a user to move their head around to look at a 3D image from different angles while also incorporating elements of a user's environment.

2010 and 2012 patents described the use of motion sensors to create a 3D interface for iOS devices using augmented reality techniques. Apple described the interface as a "virtual room" navigated by manipulating the orientation of the device through built-in sensors or through gestures.

In 2011, Apple filed a patent for an augmented reality feature in the Maps app related to mapping the distance to notable landmarks. With the camera, a user could look at the area around them and get real-time estimations of the distance between two points along with overlays of relevant information.
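
For the distance-estimation part of such a feature, one straightforward way to compute the ground distance between the user's location and a landmark from their coordinates is the haversine formula, shown below as a purely illustrative sketch rather than the method described in the patent.

```swift
import Foundation

// Great-circle distance in meters between two latitude/longitude points (haversine).
// Illustrative only; the patent does not specify how distances are computed.
func distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double) -> Double {
    let earthRadius = 6_371_000.0
    let dLat = (lat2 - lat1) * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180)
          * sin(dLon / 2) * sin(dLon / 2)
    return earthRadius * 2 * atan2(sqrt(a), sqrt(1 - a))
}
```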

A patent filed in 2014 and granted in 2017 covers a mobile augmented reality system able to detect objects in the environment and overlay them with virtual information through the use of cameras, a screen, and a user interface. Apple describes the system as ideal for a head-mounted display, but it also shows it being used in smartphones.

Apple has been working on virtual reality technology that could be used within autonomous vehicles. Several Apple patents describe a system that includes an in-car virtual reality system with a VR headset worn to provide entertainment and to mitigate carsickness from tasks like reading and working while a vehicle is in motion.

A July 2020 patent application covers possible input methods with Apple Glasses, describing a system where the glasses use infrared heat sensing to detect when someone touches a real-world object, allowing the glasses to then project controls onto a real-world surface.

With this method, the Apple Glasses could project an AR control interface onto any actual object in the real world for a mixed reality overlay kind of effect.

Apple in February 2021 filed a number of patents related to its work on a rumored mixed reality headset, with the patents covering design elements, lens adjustment, eye-tracking technology, and software.

Apple has developed several methods for making a headset more comfortable to wear while also keeping it secure and blocking out light, plus there's a detailed lens-adjustment system that uses fluid to seamlessly shift the lenses to make the fit customized for each user.

Apple also details an eye-tracking system that uses infrared light to detect position, and there's also a patent on how documents might be able to be edited in a virtual 3D space using the headset and gesture detection.

Apple has patented systems for recording video from a headset, where built-in gaze-tracking sensors could provide an indication of where a person is currently looking, which could direct a built-in camera to record the scene where the user's eyes are positioned, instead of simply recording what is in front of the user.

Another patent application filed in February 2021 showed that Apple is researching a finger-mounted device with an array of sensors and haptic feedback to be used as a control device for a mixed-reality headset.

The control device has a shape that allows users to feel objects in their surroundings naturally and can precisely ascertain the way in which the user is moving their finger and interacting with surfaces. The system is said to be so accurate that it can detect how hard a user is pressing on a surface and the exact direction of this force, delivering haptic feedback in response.

Coupled with an AR or VR headset, Apple says that this finger-mounted device could "provide a user with the sensation of interacting on a physical keyboard when the user is making finger taps on a table surface" or "allow a user to supply joystick-type input" for gaming using only the movement of the user's fingertips.

Apple analyst Ming-Chi Kuo believes that Apple could be planning augmented reality "contact lenses" that could launch sometime in the 2030s. According to Kuo, the lenses will bring electronics from the era of "visible computing" to "invisible computing." There is "no visibility" for the contact lenses at the current time, and it's not a guaranteed product that Apple will develop.

Apple was also developing augmented reality "Apple Glasses" that were supposed to launch a year or so after the headset, but the project has been placed on hold so the company can instead focus on a cheaper version of the AR/VR headset. Apple Glasses are expected to launch in 2027 at the earliest.

A future version of the AR/VR headset could have accessibility settings that are designed to help people who suffer from eye diseases and visual issues.
