In the future, Apple will enable developers to link the U1 ultra-wideband chip with augmented reality applications. The presentations at this year's WWDC developer conference fuel rumors that preparations for an AR headset are already underway in this year's new developer tools. There is much speculation about Apple's entry into the metaverse, though a hardware announcement is currently not expected before early 2023.
Nearby Interaction enhanced
The extension of the Nearby Interaction API, a developer interface in iOS 16, would at least be a useful building block for an AR headset. The U1 chip allows precise localization of items equipped with it. Owners of AirTags, for example, can home in on the small trackers using the U1 chip and pinpoint their position even in tight spaces.
Apple itself has already connected an augmented reality application to the U1 for this purpose: the Precision Finding feature for locating AirTags. With the expanded developer tools, all developers will be able to build such apps once the release ships in the fall.
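As a rough illustration of what such apps build on, the following Swift sketch sets up a minimal Nearby Interaction ranging session between two U1-equipped devices. This is a simplified outline, not Apple's reference code: it assumes the peer's discovery token has already been exchanged over a side channel (such as MultipeerConnectivity), and `isCameraAssistanceEnabled` is the iOS 16 addition that ties UWB ranging to ARKit's camera tracking.

```swift
import NearbyInteraction

// Minimal sketch: ranging to a peer device whose discovery token
// was already received over a separate channel (e.g. MultipeerConnectivity).
final class RangingController: NSObject, NISessionDelegate {
    private let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        let config = NINearbyPeerConfiguration(peerToken: peerToken)
        // New in iOS 16: lets ARKit's camera refine the UWB direction estimate.
        config.isCameraAssistanceEnabled = true
        session.run(config)
    }

    // Called repeatedly with fresh distance/direction measurements.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        if let distance = object.distance {
            print("Peer is \(distance) m away")
        }
        if let direction = object.direction {
            print("Direction vector: \(direction)") // simd_float3 in device space
        }
    }
}
```

Note that ranging is mutual: the local device's own `session.discoveryToken` must also be sent to the peer before both sides can measure distance and direction.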
It is conceivable, for example, that the U1 could be used in games involving real objects that play a role in the headset's mixed reality. Applications for spatial orientation or control, in combination with the Apple Watch or the iPhone, are also conceivable.
Mysterious U1 chip
The U1 chip was introduced in 2019 with the iPhone 11. At first, its exact purpose remained a mystery; Apple spoke only of "amazing new capabilities". With improvements to wireless data transfer via AirDrop and the locating of AirTags, it became clear what that meant. The U1 has been built into all iPhones since the iPhone 11 generation and into the Apple Watch since Series 6. In addition to the AirTags, the HomePod mini also contains the chip, which allows audio playback to be handed off more precisely between the phone and the speaker.
Where there is still AR potential
In addition to the revised Nearby Interaction framework, other new functions and APIs also stir the imagination of those expecting an AR headset. Some see the design of Stage Manager, the window manager being introduced on the iPad and Mac, as a harbinger of what a headset display could look like. The Live Text camera capabilities are considered useful for AR glasses, as is the new RoomPlan API, which creates a precise room profile using the camera and lidar sensor. Apple has also significantly upgraded its ARKit developer framework in version 6 and, with Metal 3, further improved the graphics capabilities of Apple devices.
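For context, RoomPlan's scanning flow can be sketched roughly as follows in Swift. This is a simplified outline under the assumption of a lidar-capable device; a real app would also present the capture UI and handle errors properly.

```swift
import RoomPlan

// Rough sketch of the RoomPlan capture flow on a lidar-equipped iPhone or iPad.
final class RoomScanner: NSObject, RoomCaptureSessionDelegate {
    private let captureSession = RoomCaptureSession()
    private let builder = RoomBuilder(options: [.beautifyObjects])

    func startScan() {
        captureSession.delegate = self
        captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    func stopScan() {
        captureSession.stop()
    }

    // Delivered when the scan ends; the raw data is turned into a structured model.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData, error: Error?) {
        guard error == nil else { return }
        Task {
            // CapturedRoom lists walls, doors, windows and recognized objects.
            let room = try await builder.capturedRoom(from: data)
            print("Detected \(room.walls.count) walls and \(room.objects.count) objects")
        }
    }
}
```

The resulting `CapturedRoom` is a parametric model rather than a raw mesh, which is what makes it interesting as a building block for room-aware AR experiences.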
However, WWDC observers waited in vain for an official announcement, clearly legible hints, or even initial software such as the expected realityOS operating system. In past years, new hardware was sometimes announced well in advance so that developers could write apps for it.