This is an automated archive made by the Lemmit Bot.

The original was posted on /r/applevisionpro by /u/ChinSaurus on 2023-09-20 15:47:03.


Hello r/AppleVisionPro!

I’m an interaction designer, and over the last few years I’ve noticed a trend in Apple devices: they’re slowly getting us used to AR features. When Apple showed off the Double Tap gesture on the Apple Watch Series 9, it became clear how they intend to ease us into normalizing AR. It’s gonna be a long journey before AR becomes normal, but how they trickle in interactions to blend the pre-AR and post-AR worlds is pretty critical.

I made a video about this with my friend and would love to hear your perspectives, since you’re all so passionate about the Vision Pro. The points we cover are:

  • How Siri was the first step in getting us used to screenless interactions
  • How the Apple Watch got us used to always having a computer on our person
  • How the AirPods made modifying the world around us with sound possible (audio AR)
  • And finally, how the Apple Watch Double Tap gesture appears in the official Apple Vision Pro documentation—a direct bridge to the AR interactions Apple is working on

Have you noticed anything similar? Are there any other interactions you’d add to this list?