Beyond Just Devices: The Path to Spatial Computing
So iOS 17 just dropped, and it looks like macOS Sonoma will be following next Monday. I've been testing the betas over the past few weeks, experimenting with some of the more interesting changes, and while doing so I noticed something: iOS 17's updates really show how Apple's approach to software delivery has changed over the past few years.
From Platforms to Services
This year's updates feel more personal than many from previous years. I don't just mean the features for personalizing your devices; the changes themselves aren't structured the way they used to be. Most of this year's changes sweep across many platforms at the same time.
This is a far cry from how things used to be. A few years ago, the feature set of iMessage on iOS and macOS was very different; you had to use an iPhone if you wanted the newest iMessage features.
Many of this year's updates can be thought of as enhancements to services rather than devices. This is evident when looking at the iOS 17 and macOS Sonoma update pages: many of the features are shared.
This cross-compatibility is the result of a series of incremental changes over the years, including the introduction of compatibility layers like Mac Catalyst and Apple's multi-platform UI framework, SwiftUI.
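To make the SwiftUI point concrete, here's a minimal sketch of the shared-code idea, using a hypothetical view of my own rather than anything from Apple's actual apps: one view definition compiles for iPhone, iPad, and Mac, with a small platform-specific tweak where the form factor calls for it.

```swift
import SwiftUI

// Hypothetical example: a single view definition that compiles for both
// iOS and macOS from the same source, which is the kind of shared code
// that makes shipping a feature everywhere at once far more practical.
struct SyncedNoteCard: View {
    let title: String

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(title)
                .font(.headline)
            Text("Available on all of your devices")
                .font(.subheadline)
                .foregroundStyle(.secondary)
        }
        .padding()
        #if os(macOS)
        // Give the card a bit more room on the Mac, where windows and
        // pointer input make wider layouts feel natural.
        .frame(minWidth: 240)
        #endif
    }
}
```

These tools don't erase platform differences, but they shrink the cost of keeping features in sync enough that shipping them everywhere at once can become the default rather than the exception.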
The Benefits of Service-Based Features
Rolling these updates out concurrently makes the software across users' devices much more cohesive. You no longer have to reach for your phone to use a feature that isn't available in the app on your laptop.
When interactions and features change across every device at the same time, a feature feels far more refined. A multi-platform rollout also makes it easier for users to shift the interaction model they rely on. Staged rollouts, on the other hand, require users to keep track of which features are available where, or leave them confused when their expectations don't align with reality.
The same principles extend to many of my favourite apps. It has become much more feasible to deliver cross-platform services to users. Sure, you're using your devices, but more importantly, you're using software that solves problems for you wherever you need it to.
My productivity apps, and even a surprising chunk of my developer workflow, can be accessed not just on my Mac but also on my iPad and phone. My watch and AirPods offer interaction models tailored to their form factors, giving me quick and convenient controls for my current task.
The best software I use lets me interact with my services and surroundings by putting controls wherever they're most convenient. Convenience has been one of the driving factors behind technology becoming ever more integrated into our lives. We have desktops as workstations, TVs for shared media consumption, tablets for lightweight tasks, phones for portable computing, and watches for fitness tracking. We even have countless IoT and smart home devices to fold more parts of our lives into our technological ecosystems.
Effectively Spatial
We've moved past the point where we use a single device to handle a task. At this point, computing surrounds us. These devices alter our perception to deliver unique experiences and solutions. I've been saying for years that wireless earbuds are some of our first good augmented reality devices: with noise cancelling, transparency, and more, they shape our perception of the world around us. They take personalization to the next level, altering our view of our surroundings. The shared experience these devices provide when working in harmony seems like the first move towards "Spatial Computing" to me.
With existing devices already altering how we interact with the environment around us, it's interesting to consider how virtual and augmented reality devices will try to do the same. They show lots of potential for driving new interaction models and experiences, despite their limited adoption outside of gaming so far.
Apple's Vision Pro headset shows a lot of potential not because it looks like a better VR headset, but because of its promise as an augmented reality device. The Vision Pro may be our first Spatial Computer according to Apple, but through the software consolidation of all our other devices, we've been working towards the idea of Spatial Computing for far longer.