John Haney ᯅ
@johnhaney.bsky.social
1.1K followers 3.7K following 310 posts
I make apps that run on Apple devices, Author of https://www.appsyoucanmake.com, Creator of GrammarSnob, Flashlight via my Apps From Outer Space. Not representing any company. | he/him
Pinned
I’ll summarize my 2024 with the word Spatial 🥽

This was the year of Spatial Computing and Apple Vision Pro for me:
3 apps for #visionOS by Apps From Outer Space (links in thread)
2 apps by Lextech (links in thread)
VisionDevCamp 🏆
Vision Hack
Open source (links in thread)

📕 AppsYouCanMake.com
Apps You Can Make
SwiftUI starter projects for iPhone and iPad. Your first steps as an App Developer, by John Haney
AppsYouCanMake.com
Happy  26.1 release candidate day, everyone!

iOS, visionOS, macOS, tvOS, watchOS, and Xcode are all available now
I added a new open source app, LaserPointer, to ARKitVision (link in comments). This has a simple setup for tracking the Logitech Muse stylus, and uses the primary button to change laser beam colors.

Link to #visionOS code and notes about my experiences with the Logitech Muse so far in 🧵

Some notes on my experiences so far with this device:
* The aim anchor is fairly smooth and updates well, but did float when I had the stylus in front of my MacBook Pro screen while it was off. Maybe this was due to reflections on the screen?
* When you get an AccessoryAnchor, you can get a coordinateSpace for the .aim and use correction or not. The correction gives you a position that will better match up with real-world objects in the passthrough display. Without using this correction, the aim position was off by about a centimeter with the stylus in front of me.
* I'm using AccessoryTrackingProvider here to get the results, as opposed to an AnchoringComponent. I will experiment with AnchoringComponent later.
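The setup described in this thread might look roughly like this in code. This is a hedged sketch, not the actual LaserPointer source: it assumes visionOS 26's spatial-accessory ARKit API (ARKitSession, AccessoryTrackingProvider, AccessoryAnchor), and the exact initializers and coordinate-space calls are my guesses from the posts above, so check the linked repo and Apple's docs for the real signatures.

```swift
import ARKit            // visionOS ARKit
import GameController   // spatial accessories (e.g. the Logitech Muse) surface here first

// Sketch: run accessory tracking and read aim poses as they update.
// All API names below are assumptions based on the notes above.
func trackStylus() async throws {
    let session = ARKitSession()

    // Assumption: wrap a connected GameController device as an ARKit Accessory.
    guard let device = GCController.controllers().first else { return }
    let accessory = try await Accessory(device: device)

    let provider = AccessoryTrackingProvider(accessories: [accessory])
    try await session.run([provider])

    for await update in provider.anchorUpdates {
        let anchor = update.anchor  // AccessoryAnchor
        // The posts mention a coordinateSpace for .aim, with an optional
        // correction that better matches passthrough; reading the raw
        // anchor transform is the simplest stand-in for that here.
        print(anchor.originFromAnchorTransform)
    }
}
```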
Just did a demo of the Apple Vision Pro with M5. It was already an amazing device that I’ve been fortunate to work with daily since last February, and this is a really great upgrade. Text is crisper, load times are near instant, and the moon surface in Jupiter has really sharp details. 🥽
Nice! Thanks for sharing your code!
Apple Vision Pro Developer Strap (2nd Generation) just arrived.

Curiously, it has some more metal near the connector on both sides, which is just a bit visible on the right.

#visionOS
Happy Apple Vision Pro M5 release day, everyone!

There is also an updated Developer Strap available to order.
Happy  26.1 beta 4 day, everyone!

iOS, visionOS, macOS, watchOS, tvOS

So far, the latest Xcode version is 26.1 beta 2
I’d support this, but maybe more practical than making specific examples: it would be awesome if you could “tag” anything that also works on iOS, macOS, and/or tvOS, since we have RealityView on all three of those with tvOS 26.
Happy  26.1 beta 3 day, everyone!

iOS, visionOS, watchOS, macOS, tvOS
Yep. If you had already downloaded it, you still can access it. It’s removed from the App Store, and no updates are planned.
Who(m)ever at Apple made the Xcode 26 Theme preferences where I can just press ⌘-+ or ⌘-- to increase/decrease all of the font sizes at once, you deserve cookies. THANK YOU! I love this little convenience!! 🔨🧑‍💻
Happy  26.1 beta 2 day, everyone!

iOS, visionOS, macOS, watchOS, tvOS, and Xcode
Probably would be too busy, but what if the days were all mushroom outlined just flat/outlined if none?
Reposted by John Haney ᯅ
Working with ARKit & ImageTrackingProvider on visionOS: does anyone know how many images can be tracked at the same time? From my own tests it seems only 1 image is detected at a time, and maximumNumberOfTrackedImages isn't available for visionOS. Any ideas for tracking multiple images simultaneously?
#visionOS #iOS
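For reference, the basic visionOS image-tracking setup being asked about looks roughly like this. The resource group name "TrackedImages" is a placeholder of mine, and I'm not aware of a documented visionOS counterpart to ARKit's maximumNumberOfTrackedImages, so treat the one-image-at-a-time limit as an observed behavior rather than a stated one.

```swift
import ARKit  // visionOS ARKit

// Minimal sketch: load reference images from an AR Resource Group and track them.
// "TrackedImages" is a hypothetical asset-catalog group name.
func trackImages() async throws {
    let referenceImages = ReferenceImage.loadReferenceImages(inGroupNamed: "TrackedImages")
    let provider = ImageTrackingProvider(referenceImages: referenceImages)
    let session = ARKitSession()
    try await session.run([provider])

    for await update in provider.anchorUpdates {
        // Each ImageAnchor reports which reference image matched and whether it
        // is currently tracked; the question above observed only one at a time.
        print(update.anchor.referenceImage.name ?? "unnamed",
              update.anchor.isTracked)
    }
}
```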
Thanks so much! This is merged in and released as 26.0.3
Xcode 26.1 beta 1 is also available
Happy  26.1 beta 1 day, everyone!

iOS, visionOS, macOS, watchOS, tvOS, and Xcode 26.0.1