A Magic Leap AR headset, an example of a lightweight headset that may share some design ideas with Apple's rumored version
Reports over the years have speculated that Apple is producing some form of headset or smart glasses using augmented reality or virtual reality. Earlier reports suggested the hardware could arrive in 2021, while an investor note from Ming-Chi Kuo points towards a 2020 launch, but either way there isn't long to wait for Apple to launch a product in the space, at least in theory.
The same rumors point to the use of technologies like WiGig to connect to a nearby iPhone or other host device to create a lightweight headset, and to the use of an 8K display per eye for an optimal user experience.
While the rumors offer speculation based on the writers' own thoughts, one of the few bits of evidence confirming Apple is working in the field is the existence of multiple patents. Assorted applications and granted patents have surfaced over the years, showing not only how Apple intends to produce software and apps for AR and VR experiences, but also hinting at hardware designs and applications further afield than just headsets.
While there is some debate about what form the headset or glasses will take, patents and applications reveal that Apple is working to solve many of the small but crucial problems, major and minor, that the different product categories run into.
March 2018 patent filings published by the U.S. Patent and Trademark Office for “displays with multiple scanning modes” suggest how the display element of a headset could be optimized to allow the screen to refresh at as high a speed as possible, an issue that is compounded at higher resolutions due to the number of pixels at play.
By updating only sections of a display that require changes, the headset has less work to perform, minimizing the possibility of display artifacts that could ruin the experience for users.
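As a rough illustration of the idea, a partial-refresh scheme can be modeled as a "dirty tile" comparison between consecutive frames, where only changed tiles are re-sent to the panel. The tile size and function below are invented for illustration and are not taken from the patent itself.

```python
# Illustrative sketch of partial display refresh: only regions whose
# pixels changed since the last frame are updated, reducing the work
# per frame at high resolutions. TILE is an invented example value.

TILE = 64  # panel updated in 64x64 tiles

def dirty_tiles(prev, curr, width, height):
    """Return (x, y) coordinates of tiles whose contents changed."""
    dirty = []
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            changed = any(
                prev[y][x] != curr[y][x]
                for y in range(ty, min(ty + TILE, height))
                for x in range(tx, min(tx + TILE, width))
            )
            if changed:
                dirty.append((tx, ty))
    return dirty
```

On a frame where only one small region changed, only the tile containing it would be refreshed, leaving the rest of the panel untouched.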
The visual experience is important enough to warrant increased monitoring of the user, in order to get things as perfect as possible. An application surfacing in April 2018 for an "eye tracking system" describes, simply put, a way to keep track of a user's eyes, including their movements and positions in relation to the display, something not tracked by current-generation headsets.
Eye tracking would offer a few extra benefits, including being able to provide a more realistic depth of field effect by knowing where a user's gaze lies. Gaze-based interaction could also be offered, such as animating parts of a scene when a user looks at an element, without the user necessarily facing it directly.
An example of reflective eye tracking in one patent illustration
Apple's solution is notable in that, while typical eye tracking hardware sits directly opposite the eyes for optimal monitoring, Apple uses mirrors and other optical elements to move the tracking hardware closer to the user's head. Generally speaking, the further weight sits from the head, the more pressure is felt on the face, making usage uncomfortable over time.
A more recent patent application for a “head-mounted display with adjustment mechanism” attempts to solve the issue of user comfort, by using motorized tightening of a headband and inflatable bladders on a rigid headset to fix the display in place on the skull with minimal movement.
The systems can be automated, such as by tightening the band if it is detected as slipping or if the user is undergoing a fast movement. Eye tracking is also brought up, with the detection of the user’s eyes being used as a measurement for optimal positioning of the headset, again with automatic adjustments based on eye placement in relation to their ideal positions.
The box at the back is used to apply tension to the band by an electric motor
At the same time, another patent application for "thermal regulation for head-mounted display" looked to solve the problem of heat generation. Components such as displays can give off large amounts of heat, which could cause problems not only for other components inside the headset, but also for the user themselves.
Apple's solution is relatively simple, consisting of fans and vents to shift air through the chamber where components are located. At the same time, air can be sucked into the headset via the face seal, avoiding the need for separate air inlets.
The shifting of air could also be beneficial to users, as the air could be too warm or have a humidity that users may not like to have their eyes exposed to for long periods. Again, Apple suggests the use of a fan to circulate the air in that region, one that could feasibly be turned on automatically by component temperature, humidity, the user’s skin temperature, and even the level of perspiration.
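The triggering logic described above can be sketched as a simple threshold check across the suggested sensor readings. All of the threshold values below are invented for illustration; the patent names the sensor categories but not any specific numbers.

```python
# Hypothetical fan-control sketch based on the sensors the patent
# mentions: component temperature, humidity, skin temperature, and
# perspiration. Every threshold value here is an invented example.

def fan_should_run(component_temp_c, humidity_pct,
                   skin_temp_c, perspiration_level):
    """Decide whether the face-region fan should be switched on."""
    return (component_temp_c > 45.0       # components running hot
            or humidity_pct > 60.0        # air too humid near the eyes
            or skin_temp_c > 34.5         # user's face warming up
            or perspiration_level > 0.3)  # sweat detected, 0-1 scale
```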
On the VR side of things, Apple's patent application from March 2018 for a "Predictive Foveated Virtual Reality System" involves a dual-resolution system that renders both a high-resolution image and a lower-resolution version of the same scene at the same time.
In theory, the lower-resolution version would be available at all times, on the assumption that the high-resolution image will lag behind due to its processing cost. The low-resolution version would be presented in areas of the user's vision while the high-resolution one is still being processed, letting users keep a general idea of their surroundings. It would also allow the high-resolution image to be reserved for areas where the user's focus lies, with the low-resolution version used for the remainder.
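The selection step can be sketched as a per-region decision based on gaze position and whether the high-resolution render has finished. The coordinate scheme, fovea radius, and function names below are all assumptions made for illustration, not details from the patent.

```python
import math

# Illustrative foveated-selection sketch: the low-resolution layer is
# always available, while the high-resolution layer is only shown in
# regions near the gaze point, and only once it has finished
# rendering. The 0.15 fovea radius is an invented example value.

def choose_layer(region_center, gaze_point, hi_res_ready,
                 fovea_radius=0.15):
    """Return 'high' or 'low' for a region in normalized [0,1] coords."""
    dx = region_center[0] - gaze_point[0]
    dy = region_center[1] - gaze_point[1]
    in_fovea = math.hypot(dx, dy) <= fovea_radius
    return "high" if (in_fovea and hi_res_ready) else "low"
```

A region at the gaze point gets the high-resolution layer once it is ready; peripheral regions, or any region before the expensive render completes, fall back to the always-available low-resolution layer.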
A patent image showing where many sensors could be located for monitoring inside and outside the headset
Surfacing at the same time, the “display system having world and user sensors” suggests a headset with a plethora of cameras and other sensors both inside and outside the casing. The sensors inside would monitor the gaze, expressions, and head movement, while external versions would track the room, produce multiple live video views for processing, and could even detect a user’s hand gestures below the headset.
An example of ‘glasses’ that have space to slot an iPhone in as a display
A third filing that surfaced at the time is a cross between the current Google Cardboard-style headsets and smart glasses, with “head-mounted display apparatus for retaining a portable electronic device with display” effectively referring to glasses that can hold an iPhone or similar device.
Taking the form of shielded glasses, the iPhone would slot into the front, lining up with the user's eyes and connecting with the Lightning port. Holes would allow the rear camera to view the world, with others for sound output and ventilation.
The glasses are also depicted with built-in earbuds, a remote control for interactions, and side buttons for manual control.
An Apple patent application image showing potential mounting points for finger sensor units
While not directly connected to AR and VR, Apple has also considered how users could interact with virtual objects, with a key example being a wearable glove-like input device that could provide gesture controls and haptic feedback.
While the hardware plays an important part in Apple’s plans, the same could be said about software. While ARKit is a great start, enabling developers to easily add AR content to their apps, Apple is exploring ways to expand its utility, both for headsets and for devices.
In October 2017, a freshly surfaced patent application detailed how to dynamically modify images and text data and present them to users in real time. Specifically, the patent relates to headsets, where an image must be distorted to match the view, for both still and moving images.
Distortion is a problem raised in another patent application spotted in August 2018. "Processing of equirectangular object data to compensate for distortion by spherical projections" explained how to fix errors and issues when combining videos produced by 360-degree camera rigs.
A 360-degree camera rig using multiple cameras
Rather than treating the videos as one large image, the encoder splits a video into pixel blocks, which are handled differently depending on where in the scene they sit, as well as where they would fall in the "view" of a user looking in that direction. Pixel blocks at the equator would be handled more simply than those at the top or bottom of the sphere.
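The reason blocks near the poles can be treated differently is a property of the equirectangular projection itself: rows near the poles are stretched horizontally by a factor of roughly 1/cos(latitude), so they contain far fewer genuinely distinct samples than equator rows. The sketch below uses that standard geometric fact; the block width and function name are invented for illustration and do not come from the patent.

```python
import math

# Illustrative sketch: how many horizontal samples a pixel block on a
# given row of an equirectangular frame actually needs. Rows near the
# poles are stretched by the projection, so an encoder could spend
# fewer samples (or bits) on them. Block width of 64 is an example.

def useful_width(block_width, row, frame_height):
    """Horizontal samples genuinely needed for a block on this row."""
    # Latitude runs from +90 degrees (top row) to -90 (bottom row).
    lat = math.radians(90.0 - 180.0 * (row + 0.5) / frame_height)
    return max(1, round(block_width * math.cos(lat)))
```

For a 1024-row frame, a 64-pixel-wide block at the equator needs all 64 samples, while the same block on the top row collapses to a single useful sample, which is why pole blocks can be encoded far more cheaply.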
An April 2017 patent application for “method and device for illustrating a virtual object in a real environment” explained how to take advantage of images of the real world view of a device to make a virtual object look as if it could be a physical item placed in the world. Elements of this became part of how ARKit can render objects placed in the in-camera view to make it as realistic as possible.
Turning from content and application creation to applications themselves, one patent application from March 2018 titled “3D Document Editing System” handles the issue of creating documents within a VR or AR space. Using a headset, Apple envisages a three-dimensional editor for text that is suspended in the air, with text having a depth as well as other usual properties.
While this could be considered "Word, but in 3D," creating documents in VR could offer a few benefits, such as making elements more prominent by bringing them closer to the user than the rest of the document.
Apple’s 3D document editor concept
Navigation is an area Apple seems to consider crucial to the acceptance of AR, with several patents and applications putting the technology's properties to good use. One from January 2017 tells of a mobile device that can detect its surroundings and provide information to users about items in real time, such as paintings in an art gallery.
Rather than simply determining the position of the user in a building, the system would actively identify what nearby items are in a default low-power scanning mode, before ramping up to a high-power mode to download and display related AR content about an item. Optical tracking using a single rear camera would help give the impression of the AR data being fixed in place, at least within the camera view.
A patent granted in February 2019 for “Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor” offers an AR view that overlays a user’s surroundings. In this case, Apple mostly refers to the concept for general navigation, superimposing points of interest and associated data in the camera view, allowing users to know roughly where the place is in relation to their own location.
An image from the object scanning patent application
Again, a camera view would be used to determine points of interest that can be seen, and therefore what data could be offered up to users.
Notably, the patent does not just apply to general navigation across a large area, like a city, but also to more immediately accessible places. Apple suggests such a system could offer details about a car's main features in a showroom, highlighting areas of interest when the dashboard is framed in the camera view. It could even help users learn about things within their own home, such as a lost iPhone's position.
In this patent, Apple also suggests it could be used with smart glasses with a “semi-transparent display” as well as mobile devices, with users able to see the AR indicators superimposed on their own real-world view of their surroundings.
Illustrating how the point-of-interest system could work in hypothetical smart glasses
In The Car
While mobile devices and headsets are obvious venues for VR and AR, Apple has also looked farther afield for areas where the technology could be useful. One crosses over with its work in automotive engineering.
Spotted in April 2018, the patent application for “Adaptive vehicle augmented reality display using stereographic imagery” explains how an AR system within a car could provide drivers and passengers with an alternate view of the road ahead. Using data about routes, points of interest, onboard sensors, and other elements, the system would be able to offer items like the view of the route beyond mountains and buildings obscuring the user’s view, or an enhanced view of the road in foggy or other low visibility conditions.
An example of a computer generated view of the road ahead of a vehicle
A crucial difference from the other patents and applications is that Apple envisions the information wouldn't necessarily be shown on a headset or glasses. It could instead be presented as an AR overlay on the windscreen, superimposed on the driver's own view of the road.
Apple does have other vehicular interests under the banner of "Project Titan," including self-driving vehicles covered with sensors to monitor their surroundings. While centered on self-driving vehicles, Apple's work in the field has led to many other patents meant to improve driving, the AR view among them.
Though rumored almost as much as the fabled AR smart glasses, it remains to be seen how Project Titan will transform into a consumer product, if at all.