vrhermit.com
Learning, sharing, and teaching visionOS development at Step Into Vision https://stepinto.vision
1,681 posts 1,868 followers 524 following

Xcode beta 2 is out. Downloaded and set up for visionOS development. Building to device is still a bit slow, but seems faster than beta 1. developer.apple.com/documentatio...

visionOS 26 lets users lock windows and volumes in place. Let's explore this feature and how we can opt out of it for transient interfaces. stepinto.vision/example-code...

Something about the iOS beta has greatly improved my experience with dictation. It has been borderline unusable for a few years and now it’s nearly flawless.

I just found Z fighting in my waterfall. Explain that! I knew this was a simulation.

The light of the Eldar fades across the Home Screen

We can use the surfaceSnappingInfo environment value to access volume snapping data. We can land the plane when the volume is snapped to a surface, and we can hide the wood base when that surface is classified as a table. stepinto.vision/example-code...
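A minimal sketch of the idea in this post, based on my reading of the visionOS 26 beta. The entity names ("ToyPlane", "WoodBase") are placeholders, and the exact shape of SurfaceSnappingInfo (property names, classification cases) may differ from the beta API described here.

```swift
import SwiftUI
import RealityKit

struct PlaneVolume: View {
    // Beta API: exposes whether this volume is snapped to a surface,
    // and (with authorization) what kind of surface it is.
    @Environment(\.surfaceSnappingInfo) private var snappingInfo

    var body: some View {
        RealityView { content in
            // Placeholder asset name for illustration.
            if let plane = try? await Entity(named: "ToyPlane") {
                content.add(plane)
            }
        } update: { content in
            guard let base = content.entities.first?
                .findEntity(named: "WoodBase") else { return }
            // Hide the wood display base when the volume is snapped
            // to a surface classified as a table.
            base.isEnabled = !(snappingInfo.isSnapped &&
                               snappingInfo.classification == .table)
        }
    }
}
```

Surface classification is privacy-gated, so a real implementation would also check the snapping authorization status before relying on the classification value.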

visionOS 26 Beta 2 is out with lots of fixes. I'm eagerly awaiting the Xcode update developer.apple.com/documentatio...

On repeat until Beta 2 drops www.youtube.com/watch?v=A6eq...

Just so you don't think you're doing something wrong: when running an app on device, snapped windows are automatically locked and a padlock symbol is shown. When running an app *from Xcode*, both the lock symbol and the option to lock a window in place are missing.

Question about window locking on visionOS: Can we access a "Locked in Place" value when a window has been locked without being snapped to a surface? See the full context on the developer forum. developer.apple.com/forums/threa...

We can use the surfaceSnappingInfo environment value to access window snapping state and surface classification. stepinto.vision/example-code...

I review my WWDC 2025 Wishlist. I didn't get very many of my wishes this year, but there are so many new features and APIs to work with. I'll have no shortage of things to build with what we have available in visionOS 26. stepinto.vision/articles/wwd...

This is helpful: "All new frameworks presented at WWDC25" blog.eidinger.info/all-new-fram...

Is there a library out there with SwiftUI backports/workarounds to help with adopting new features while supporting older OSes? Would be cool to have these all in one place where everyone could help.

A nice side effect of widgets on visionOS. When I take photos or screenshots of my office, it now looks at least 37% less sad and empty. My lighting still sucks though.

Happy Friday! Here is your weekly recap from Step Into Vision. I have a handful of new labs for you, and a bunch of great community resources. I've also started planning a ton of new Example Code posts for this summer. Get subscribed so you can follow along! stepinto.vision/articles/ste...

BTW, we can use two instances of the manipulate gesture at the same time. Here I'm holding the car with my left hand and the SwiftUI view with my right.

I want to live on a none-way street

I just added 95 items to my backlog for Step Into Vision! You can look forward to tons of new Labs and Example Code covering all the new visionOS features and APIs. Please support this work if you can ko-fi.com/stepinto #visionOS #SwiftUI #RealityKit #ARKit

SwiftUI has a manipulable modifier, sort of like the Manipulation Component in RealityKit. It is obviously intended for Model3D, but I was wondering what would happen when used on a 2D view. It works, but any 2D views end up getting clipped by the window or volume. stepinto.vision/labs/lab-065...
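A sketch of the experiment described above, assuming the visionOS 26 beta modifier name and default behavior; the modifier's options may change in later betas, and the view content here is purely illustrative.

```swift
import SwiftUI

struct ManipulableCard: View {
    var body: some View {
        Text("Drag, rotate, and scale me")
            .padding()
            .glassBackgroundEffect()
            // manipulable() is aimed at Model3D content, but it compiles
            // on any View. The view can then be grabbed and moved like an
            // object, though 2D content gets clipped against the bounds
            // of its window or volume, as noted in the post above.
            .manipulable()
    }
}
```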

I installed the beta on an iPad so I could check out Liquid Glass. For the most part, I like it. But Control Center stands out as just plain unusable. Trying to add controls is a visual mess.

🚨 New blog post released: Post-WWDC25 Thoughts: The Road Ahead for visionOS and the Vision Product Line www.tabtotap.com/blog/post-ww...

Lab 054 - First look at Presentation Component. This component lets us present modal SwiftUI views positioned relative to a RealityKit Entity. This is similar to attachments, but with some extra magic for state and anchor position. stepinto.vision/labs/lab-054...

Almost done with the Priority Sessions on my list. I saved Foundation Models for last, since I know that is going to send me down a rabbit hole for a few days

One of the coolest things we got in visionOS 26 is Entity Observation. SwiftUI can now easily watch for changes to RealityKit entities, giving us the second side of two-way communication between these frameworks. stepinto.vision/labs/lab-063...

Lab 062 - First look at Gesture Component. A component that allows us to create unique SwiftUI gestures for RealityKit entities. We simply define a gesture, pass it to a component, and add the component to an entity. stepinto.vision/labs/lab-062...
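The define-gesture, wrap-in-component, add-to-entity flow described above might look like this, assuming the visionOS 26 beta initializer shape for GestureComponent; the collision shape here is an arbitrary placeholder.

```swift
import RealityKit
import SwiftUI

// Attach a SwiftUI tap gesture directly to a RealityKit entity.
func addTapGesture(to entity: Entity) {
    // 1. Define an ordinary SwiftUI gesture.
    let tap = TapGesture()
        .onEnded {
            print("Entity tapped")
        }
    // 2. Wrap it in a GestureComponent and add it to the entity.
    entity.components.set(GestureComponent(tap))
    // 3. Gestures still require input target and collision components
    //    so the system can hit-test the entity.
    entity.components.set(InputTargetComponent())
    entity.components.set(
        CollisionComponent(shapes: [.generateSphere(radius: 0.1)])
    )
}
```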

New in SF Symbols for iOS 26: Draw effects!! Five beautiful examples; stay till the end for the code sample. First up, "rainbow" with .symbolRenderingMode(.multicolor). Symmetrical symbols draw outward from the center.

Are we getting a new version of Reality Composer Pro this year? The Xcode 26 beta 1 ships with a newer build of RCP 2.0. From what I can tell, there are no new features. I really want to see this tool advance along with the rest of the visionOS development pipeline.

When I first got here, this site was just twelve cat girls and ten nerds with VR headsets (nine of whom were also cat girls)

Xcode beta 2 please 🥺

Among Vision Pro users, I almost never see people asking for new hardware features/improvements. Most people simply want a smaller form-factor. That says a lot about the state of the software and the headset's capabilities. I'm in the same camp; wrote this recently: www.roadtovr.com/half-the-siz...

I've gone through all the visionOS sessions on my short list. Now it's time to tackle generally useful platform stuff