

A first look at the Apple Vision Pro user interface

by Jonny Evans · Published June 5, 2023 · Updated June 6, 2023

I was curious about how we will use Apple's new Vision Pro headset, so I've looked into all the available information and pulled together the following insights into the user interface.

It also seemed useful to share what we know so far about the technical specifications of these systems, which I've gathered into a list below.

Apple has worked to build a user interface (visionOS) that reacts to gestures, eye movement, and voice. At present, we don't know too much about that UI – it will be several months before we really learn it all – but here is what we do know so far:

You navigate visionOS by looking at apps, buttons, and text fields. When you look at an item, the app icon bounces to show that it is active, and you can then apply actions to it.

The actions you can apply that we know of so far, based on Apple's demos, include tapping your fingers together to select an item, flicking to scroll, and pinching and dragging to move or resize windows.

But it goes further with voice. For example, you can look at the microphone button in a search field to activate it, then dictate text. Siri will open and close apps and play media.

Accessibility tools such as Dwell Control, Voice Control, and Pointer Control are also supported. The headset also works with the Magic Trackpad and Magic Keyboard.

When building apps for these devices, developers can choose how an app icon or control makes itself known to a user when it is selected. Apple says these can either glow slightly or be highlighted in the space.
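As a rough sketch of what that choice might look like in code, SwiftUI already offers a `hoverEffect(_:)` modifier for opting a control into system-drawn hover feedback; whether the `.highlight` style shown here is exactly what Apple means by "highlights in the space" is an assumption on my part:

```swift
import SwiftUI

// A minimal sketch, assuming SwiftUI's hoverEffect(_:) modifier is the
// mechanism a visionOS app uses to opt a control into gaze-based feedback.
struct OpenLibraryButton: View {
    var body: some View {
        Button("Open Library") {
            // Handle the tap gesture the user performs after looking here.
        }
        // .highlight asks the system to subtly brighten the control when
        // the user's eyes rest on it. The system renders the effect itself,
        // so the app never sees raw eye-tracking data.
        .hoverEffect(.highlight)
    }
}
```

The notable design point is that the system, not the app, draws the effect, which keeps where the user is looking private from developers.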

Porting existing apps to support these systems is as simple as ticking a checkbox in Xcode, Apple explained.

Here, in rough order, is what we know about the tech specifications of these devices.


Please follow me on Mastodon, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.
