Oh my gosh. Okay. Apple is indeed making a $3.5K VR ski goggle. The developer tools are now available

Apple’s Vision Pro glasses won’t be available until next year, though registered developers can now explore the iGiant’s tools for building apps for the virtual reality headset.

On Wednesday, the tech titan released Xcode 15 beta 2, which includes the visionOS software development kit (SDK), a 3D content design tool called Reality Composer Pro, and a visionOS simulator. visionOS is the operating system that powers the ski-goggle-like Vision Pro gadget.

Together, these tools provide a way for software developers to start building and testing augmented reality apps in preparation for hardware availability.

“Apple Vision Pro redefines what is possible on a computing platform,” said Susan Prescott, Apple’s vice president of worldwide developer relations, in a statement. “Developers can start building visionOS apps using the powerful frameworks they are already familiar with, and take their development even further with innovative new tools and technologies like Reality Composer Pro, to design entirely new experiences for their users.

“By taking advantage of the space around the user, spatial computing opens up new opportunities for our developers and allows them to imagine new ways to help their users connect, be productive and enjoy new types of entertainment.”

To accustom developers to the unusual challenges of interacting with the virtual environment, Apple plans to open developer labs in Cupertino, London, Munich, Shanghai, Singapore and Tokyo next month. These will provide an opportunity to see how apps actually behave on a facetop. The company also plans to let development teams apply for developer kits, presumably hardware prototypes of some sort.

A decade ago, Google took a similar approach, distributing its unloved Google Glass augmented reality headset through Basecamp stores in San Francisco, London, Los Angeles and New York. But it closed those outlets at the end of 2014, and the Google+ post in which the ad biz announced Basecamp's closure disappeared along with the social network when it was shut down in 2019.

Starting next month, developers who have built apps using the Unity development framework will have a shortcut into the world of spatial computing: Apple will provide a way to port Unity games and apps to visionOS.

Figuring out what’s possible in visionOS and how to create an engaging experience may not be easy, so it helps that developers have at least six months to sort things out. Interacting with on-screen objects means handling the standard hand gestures supported in the SwiftUI framework, or building custom ones with ARKit, Apple’s augmented reality framework.
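To make that concrete, here is a minimal, hypothetical sketch of the SwiftUI side, assuming a RealityKit cube rendered inside a RealityView; the view name and the cube are our own placeholders, and a custom gesture would read hand-tracking data through ARKit rather than the stock tap shown here.

```swift
import SwiftUI
import RealityKit

// Minimal sketch: a tappable cube inside a RealityView.
// "GestureDemoView" and the cube are illustrative, not from Apple's samples.
struct GestureDemoView: View {
    var body: some View {
        RealityView { content in
            // A simple blue cube, 20 cm on a side.
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            // Entities need collision and input-target components
            // before the system will route gestures to them.
            cube.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
            cube.components.set(InputTargetComponent())
            content.add(cube)
        }
        // A standard tap, delivered when the user looks at the cube and pinches.
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    value.entity.position.y += 0.05  // nudge the cube upward
                }
        )
    }
}
```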

Apple has provided four sample apps that can help developers understand its new computing paradigm. There’s Hello World, a demonstration of 3D windows and space; Destination Video, which showcases spatial audio and 3D video; Happy Beam, which demonstrates how to use ARKit for 3D entertainment; and Diorama, which demonstrates how to use Reality Composer Pro to create and preview RealityKit content.
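None of Apple's samples are reproduced here, but the flavour of the Hello World case, an ordinary SwiftUI window hosting genuinely 3D content, looks roughly like the sketch below; the view and the "Globe" asset name are our own placeholders rather than anything shipped with the sample.

```swift
import SwiftUI
import RealityKit

// Rough sketch in the spirit of the Hello World sample: a plain SwiftUI
// window showing a 3D model. "Globe" is a placeholder asset name.
struct GlobeWindowView: View {
    var body: some View {
        VStack {
            Text("Hello, spatial world")
                .font(.largeTitle)

            // Model3D asynchronously loads a 3D asset from the app bundle
            // and renders it with real depth inside the window.
            Model3D(named: "Globe") { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()  // shown while the asset loads
            }
        }
        .padding()
    }
}
```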

Initially, developers will need to get used to how 2D interfaces and associated gestures should behave in a 3D environment. Then there’s figuring out how to take advantage of spatial audio, which locates sounds in 3D space. And eventually, developers will have to grapple with how multiple people should interact in a 3D environment, using SharePlay, the group activity framework, and Spatial Personas.
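As a rough, hypothetical illustration of how those pieces hang together, the skeleton below pairs a conventional 2D window with an immersive space whose audio source is positioned in the room; the app, the scene ID and the entity are invented for this sketch, and the SharePlay and Spatial Persona layers are left out entirely.

```swift
import SwiftUI
import RealityKit

// Hypothetical app skeleton: a conventional 2D window plus an immersive
// space for room-scale content. Names ("SpatialDemoApp", "Ambience") are ours.
@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // A familiar 2D SwiftUI window, rendered as a floating pane in the room.
        WindowGroup {
            ControlPanelView()
        }

        // A separate scene type for content that surrounds the user.
        ImmersiveSpace(id: "Ambience") {
            RealityView { content in
                let speaker = Entity()
                // Sounds played from this entity are located in 3D space;
                // SpatialAudioComponent tunes gain and directivity.
                speaker.components.set(SpatialAudioComponent(gain: -6))
                speaker.position = [0, 1.5, -2]  // two metres in front of the user
                content.add(speaker)
                // speaker.playAudio(_:) would start an AudioFileResource here.
            }
        }
    }
}

struct ControlPanelView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        // Ordinary 2D controls; the look-and-pinch gesture model carries over.
        Button("Enter the scene") {
            Task { await openImmersiveSpace(id: "Ambience") }
        }
        .padding()
    }
}
```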

There’s a lot to think about, like how many people are going to pay $3,500 for a headset and a hand wave. At that price, it doesn’t have to be that many to make it worthwhile.
