Apple announced new software tools and technologies for developers to create spatial computing apps for Apple Vision Pro, the company's first spatial computer. Vision Pro features visionOS, the world's first spatial operating system, allowing users to interact with digital content in their physical space using natural inputs like their eyes, hands and voice.
“Apple Vision Pro redefines what’s possible on a computing platform. Developers can get started building visionOS apps using the powerful frameworks they already know, and take their development even further with new innovative tools and technologies like Reality Composer Pro, to design all-new experiences for their users,” Susan Prescott, Apple’s vice president of Worldwide Developer Relations, said in an Apple news release.
Developers now have access to cutting-edge tools and technology for Vision Pro. Beginning today, Apple's worldwide developer community can build an entirely new category of spatial computing apps that take full advantage of Vision Pro's canvas, the release reported.
With the visionOS SDK, developers can create new app experiences across a range of categories, including productivity, design, gaming and more, taking advantage of the powerful and distinctive capabilities of Vision Pro and visionOS, the release said.
To give developers hands-on experience testing their apps on Apple Vision Pro hardware, with help from Apple experts, Apple will open developer labs next month in Cupertino, London, Munich, Shanghai, Singapore and Tokyo. Development teams will also be able to apply for developer kits, which they can use to build, test and iterate quickly on Apple Vision Pro, the release said.
Developers can create new experiences that make use of Apple Vision Pro's capabilities with the same fundamental frameworks and tools they know from other Apple platforms, including Xcode, SwiftUI, RealityKit, ARKit and TestFlight, the release said. With these tools, developers can build new kinds of apps spanning a range of immersion levels: windows, which have depth and can display 3D content; volumes, which create experiences viewable from any angle; and spaces, which can fully immerse a user in an environment with unbounded 3D content.
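To make those three levels of immersion concrete, here is a minimal sketch of how a visionOS app might declare them in SwiftUI. This is illustrative only, not Apple sample code; the app name, scene identifiers and the "Globe" asset name are placeholders.

```swift
import SwiftUI
import RealityKit

// Hypothetical sketch of a visionOS app declaring all three scene types.
@main
struct ExampleSpatialApp: App {
    var body: some Scene {
        // A window: a familiar 2D plane that can also present content with depth.
        WindowGroup(id: "main") {
            Text("Hello, spatial computing")
        }

        // A volume: a bounded 3D region the user can view from any angle.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe") // "Globe" is a placeholder asset name
        }
        .windowStyle(.volumetric)

        // A space: a fully immersive scene with unbounded 3D content.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Add RealityKit entities to the immersive scene here.
            }
        }
    }
}
```

The same `App` can mix all three scene types; the system presents windows and volumes in the user's surroundings, while opening the `ImmersiveSpace` takes over the full environment.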
A brand-new tool called Reality Composer Pro, included with Xcode, lets developers preview and prepare 3D models, animations, images and sounds so they look great on Vision Pro and are optimized for their visionOS apps and games, according to the release. The new visionOS simulator lets developers interact with their apps while exploring and testing different room layouts and lighting conditions.
Additionally, Apple's developer frameworks include built-in support for the company's accessibility features, ensuring visionOS apps and spatial computing are usable by everyone, the release reported. Beginning next month, developers who use Unity's authoring tools to create 3D apps and games will be able to port their projects to Apple Vision Pro.
“By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive and enjoy new types of entertainment," Prescott added, according to the release. "We can’t wait to see what our developer community dreams up.”