Under the Radar

295: WWDC 2024 Interview


00:00:00   Welcome to Under the Radar, a show about independent iOS app development. I'm Marco Arment.

00:00:05   And I'm David Smith. Under the Radar is usually not longer than 30 minutes, so let's get started.

00:00:10   So we are here live in Apple's podcast studio at WWDC 2024.

00:00:15   And we have some special guests today for this special interview episode of Under the Radar.

00:00:20   We have Serenity Caldwell, design evangelist here at Apple. Hello, hello.

00:00:25   And we have Serenity as our first returning guest on the show.

00:00:30   I feel very honored.

00:00:35   Thank you. And we have Kristin Ora, product marketing for developer technology on visionOS.

00:00:40   Hello, welcome to the show. So what are we talking about today, Dave?

00:00:45   Yeah, so I think the wonderful opportunity we have here is to talk through visionOS and Vision Pro, and to dig into them in a way we couldn't before. This time last year at WWDC, it had just been announced.

00:00:50   Very few people had ever had the opportunity to wear it, to see it. Now we're four months into it actually being out into the world.

00:00:55   And so it's an interesting opportunity to just talk through making great apps for Vision Pro.

00:01:00   Because I think it's an interesting place as a developer to be, we're at the starting line of that platform,

00:01:05   where we're still working out how we as developers can make great apps for Vision Pro and kind of talking through

00:01:10   what are the hallmarks of a good Vision Pro app, what are the attributes of that.

00:01:15   So this is a great place to start. I'd encourage anyone listening who hasn't already to watch the "Design Great visionOS Apps" session,

00:01:20   which I think was a tremendous video for explaining the philosophy to bring to this platform.

00:01:25   If you have experience with iOS or macOS or other places, it's a great start.

00:01:30   And I think what was very interesting was that Sarah broke down the attributes of a great app into four categories.

00:01:35   And I think walking through each of those gives us a great sort of jumping-off point for discussion.

00:01:40   And the first one of those was intentional.

00:01:45   And essentially, it's about how you come at this platform; it's a different thing from the other platforms in lots of different ways.

00:01:50   And I think there's an interesting challenge for us as developers in finding what parts of our app would make sense there,

00:01:55   or what experiences or things would be useful.

00:02:00   And I was curious, now that you've actually had the opportunity to see other developer apps,

00:02:05   what are some of the things that we should be considering and the thought process we should go through

00:02:10   when we're deciding should we bring our app to visionOS, and if so, what parts of it?

00:02:15   That's a great, great question. I think what's really special about visionOS,

00:02:20   and you kind of hit the nail on the head, right, is that it's such a new platform.

00:02:25   For many existing developers, we have folks who have been developing just for a 2D screen in the past, right?

00:02:30   And being able to move into a 3D space requires a bit of thought, like,

00:02:35   okay, what's my key moment that's actually going to feel so, so relevant in this world?

00:02:40   Right? I think djay, which we focus on a fair bit in Sarah's session,

00:02:45   is a great example of this, where they have a phenomenal 2D app experience, right,

00:02:50   on the iPhone and the iPad. It's a fantastic way to interact with music and DJ,

00:02:55   and really, like, get into the music. And so, you know,

00:03:00   when they thought about how to move to visionOS and what they could do,

00:03:05   obviously the opportunity there is really cool, right? DJing not only in your own space

00:03:10   and being able to practice on something that looks real and feels real is super, super exciting.

00:03:15   But then pushing that sort of next boundary and thinking about, well, what's really core to the DJing experience?

00:03:20   Like, how do you flow into it?

00:03:25   And when we talked to Karim Morsy, who's the CEO, who also is quoted in that session,

00:03:30   he said something really cool about how DJs really love being in a flow state.

00:03:35   And you see YouTube videos of DJs being on mountains and really cool things like that.

00:03:40   And like really, really just like vibing with the music. And he's like,

00:03:45   "I'm going to make a DJing experience into something like Apple Vision Pro."

00:03:50   And that's where those environments come through and you find yourself being able to DJ there.

00:03:55   And what I find magical about that experience, and sort of that transition from an intentional key moment,

00:04:00   is that they're taking something that is very, very personal to the DJing experience

00:04:05   and then they're making it accessible to everybody

00:04:10   in spatial computing.

00:04:15   I'm honored that you think I would have that skill.

00:04:20   Well, so here's the thing. Here's the thing, right? I was very intimidated when we first started working with that team.

00:04:25   Very, very, very intimidated because I'm like, I've never touched a turntable in my life.

00:04:30   I have one record player. I use it to play three records. That's it.

00:04:35   Genuinely, you can throw on a couple of the djay songs, or songs from Apple Music, and they have you up and running

00:04:40   and feeling so cool in like two minutes.

00:04:45   Kristin, you've tried this. Yeah. And one thing I love about it, too, is they build on that iPad experience.

00:04:50   So if you're familiar with the iPad app, you actually see some of those iPad UI elements and then you see how they've

00:04:55   reimagined it for spatial computing. Taking advantage of 3D moments, but also

00:05:00   showing how UI can be 2D at times. It can be 3D. You can literally put your hand out and engage with

00:05:05   a space that allows you to control the music. Or you can find those

00:05:10   little buttons that you might have seen on the iPad app that you maybe first got familiar with.

00:05:15   Yeah, it's this really good combination. I think that's one of the things, you know, not everybody is going to have a slam dunk

00:05:20   like, oh, I know exactly what this is going to be in a spatial computing sense.

00:05:25   But one of the other things about that video is, you know, we've mentioned, I think, over ten

00:05:30   developers that are making incredible apps on the store and you actually hear from them one on

00:05:35   one in that session. And they're all across such a broad range of topics, right?

00:05:40   And so I think one of the things that we think is really crucial is that the

00:05:45   session is not a checklist of "you have to do every single one of these to be a

00:05:50   great visionOS app." It's more like a design mood board for spatial computing, where

00:05:55   it shows how apps approach very, very different topics.

00:06:00   We've got, you know, kitchen design and weather and, you know,

00:06:05   a sports media experience. You can find sort of that key moment

00:06:10   in your app that feels right, or that might feel exactly like

00:06:15   the way you want to dig into a spatial computing experience. And from there, that's

00:06:20   where you experiment, you iterate, you test, and you push forward.

00:06:23   And what I thought was very interesting about it is that essentially a lot of it is being intentional about

00:06:27   scoping which experiences that you currently have on the iPhone or iPad make

00:06:32   sense in the spatial context. It isn't necessarily coming at it with "I'm just going to take everything I can do

00:06:37   on an iPhone or iPad." There are certain things that will work better and create more richness

00:06:42   if you bring them into a spatial context. Yeah, absolutely. I mean, I love what you did

00:06:47   when you were exploring with your app, David, honestly, because it's not like you brought all of your features one to one

00:06:52   over to Vision OS when you were experimenting. You're like, what actually makes sense? What kind of widgets would I actually want in

00:06:57   my space? And I really loved the design explorations you went through over the

00:07:02   last, what, six months? You were blogging about it. You were sharing your designs in public, which we love

00:07:07   to see as design evangelists. It's really, really special. Yeah, and I think what I found was that the

00:07:12   easiest place to start is obviously if you just recompile your iPad app

00:07:17   for visionOS, you get sort of a starting point. But it is unlike... I feel very seen right now.

00:07:22   But even where you are, I think there are, like, in Overcast, there are some of those features that are

00:07:27   going to make sense in a spatial context and there are some things that aren't necessarily going to apply as well.

00:07:32   I'm not sure if the entire app makes sense in a spatial context, but I'm curious, like, what we saw

00:07:37   in these videos and what you're showing off and exploring,

00:07:42   we're seeing a lot of apps that are completely custom 3D volumes and 3D

00:07:47   objects and 3D environments and everything. There's also this whole

00:07:52   world of apps that are, for lack of a better word, boring.

00:07:57   So it's like UI controls and just basically the visionOS

00:08:02   UIKit equivalents, and SwiftUI of course, but seeing,

00:08:07   here are the regular controls, regular interfaces, here's a split view, and here's a list, and stuff like that.

00:08:12   How should we be thinking in terms of, is it worth bringing

00:08:17   that kind of app to visionOS?

00:08:22   Like, from a usefulness perspective, and also, are we,

00:08:27   I just have to ask, doing it wrong, in the sense that I feel like,

00:08:32   right now I just let my app run in iPad mode because I didn't finish the visionOS port yet.

00:08:37   We love that.

00:08:42   Or that it's like a substandard experience in some way.

00:08:47   Is using the standard controls with a native visionOS app another step on that ladder that's also

00:08:52   kind of substandard to a full 3D interface? Or are those considered peers in some way?

00:08:57   Oh yeah, I mean I think it's really, first of all, I love having compatible apps on the platform

00:09:02   because again, what I love about Apple Vision Pro is it is a computer in its own right.

00:09:07   I use Apple Vision Pro with Mac Virtual Display, I use it with

00:09:12   multiple apps during my day to get work done. So it's really awesome to have apps from

00:09:17   iPad and iPhone that I know and I love, and to be able to launch them rather

00:09:22   than having to look through the passthrough to use an iPhone or an iPad.

00:09:27   So that is a great first step, and we're very unified on

00:09:32   wanting folks to opt in where it makes sense for their app and their platform.

00:09:37   And then there are some apps that do need a bit of work before they're going to be

00:09:42   working well in sort of a native context. And it's okay to be in a compatible

00:09:47   sense, as a compatible app, in that time. But yeah, taking a step up, I mean, there's a reason

00:09:52   why so much of SwiftUI is at parity, right? And it

00:09:57   even just adopts the beautiful design language of visionOS.

00:10:02   There's a reason why UIKit works on visionOS, right? It's that we really do

00:10:07   see 2D windowed experiences as fantastic. Not every, not every

00:10:12   app needs to be 3D. And I think we did say that in the session. Totally. And you can have your compatible

00:10:17   app and that's great. But we did a lot of work to make sure that if you recompile, you get a lot of that look and feel that feels

00:10:22   native in that Vision OS environment. And so even just having the subtle depth with the UI

00:10:27   or being able to have that beautiful plate that allows that transparency allows your

00:10:32   app then to fit inside the visionOS look and feel, even if it's mostly 2D, it

00:10:37   fits very nicely next to a more spatial app. And like Sarah said,

00:10:42   it's the foundation that allows you then to build that spatial layer onto it when you're ready or when you find the right use case.

00:10:47   Yeah, I think two great apps that come to mind are Carrot Weather and Crouton. I mean, both are

00:10:52   incredible indie app developers, right? Those are solo-proprietor shops

00:10:57   with fantastic windowed experiences, right? Crouton is a cooking app, and Devin

00:11:02   put a lot of effort into, okay, how does my UI transition into sort of the glass material

00:11:07   and what color changes do I need to think about? Because he uses a lot of colored text in the iPad

00:11:12   version that maybe doesn't necessarily work quite as dynamically on the glass

00:11:17   material. So he thinks really carefully about that. And then he's found some really smart ways.

00:11:22   I mean, one of the other things that I think windowed apps really succeed at on visionOS is you've

00:11:27   got this infinite canvas, right? And what I love about some of Devin's experiences is he really

00:11:32   adapts the window size. It's like, oh, if you want to go through cooking steps, he actually brings out a separate window

00:11:37   that's smaller and compact and then you're able to kind of look at it while you're like prepping.

00:11:42   I would never use visionOS while over a hot stove, or at least I wouldn't.

00:11:47   But from a baking perspective, from a prepping perspective, it was actually, I did this a couple of times, like prepping

00:11:52   biscuits and prepping cookies and it's really fun. Like planning out that meal and like really getting into it.

00:11:57   Yeah, yeah, exactly. And Carrot Weather is another great example, like, it takes

00:12:02   in all of that compact information, and I mean, Brian packs it

00:12:07   in on the iPhone and the iPad. Like, there's so much to access there. And then

00:12:12   on visionOS he's really able to expand the canvas and show more information, and he's started

00:12:17   to experiment with these really great 3D elements, like the 3D globe that represents, like,

00:12:22   wind, temperature, and the like. And that's a really great example of

00:12:27   someone who uses standard components and does a fantastic job and it feels

00:12:32   native and natural on the platform. So I think there are, yeah, there are a lot of opportunities

00:12:37   to bring in and experiment, even if, when you look at 3D content, you're like, that scares me, I'm

00:12:42   not a 3D developer, I don't know how to make a fully immersive environment. I feel very seen about that. That is the thing that,

00:12:47   when I saw visionOS, the thing that made me scared to get started was the sense that I'm not

00:12:52   a 3D artist. That is something that I have no experience in, no expertise, probably no talent.

00:12:57   But what I found interesting, even just staying in the windowed experience, is

00:13:02   there are enough sort of 3D affordances that are added even in a 2D app,

00:13:07   like the tab bar being elevated out, or the ornaments being

00:13:12   given a spatiality, that there's a 3D-ness to it. But I'm not creating a 3D

00:13:17   model in Reality Composer. It's just part of the system that there's a depth. There's still a richness and a depth

00:13:22   that feels natural on the platform that I'm sort of getting even without that expertise. Absolutely.

00:13:27   And one thing that's worth mentioning is both SwiftUI and RealityKit are really great examples of how we're allowing you to

00:13:32   write one line of code and use it across platforms. When you use that line of code, even in SwiftUI on

00:13:37   visionOS, it'll have those subtle 3D elements with depth and space. And then when

00:13:42   you use it in your iPad app, it obviously fits within that space as well. Yeah, it's really cool. And of course

00:13:47   with RealityKit kind of coming across platforms this year as well, that's really exciting.
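For listeners following along in code, that cross-platform point can be sketched roughly like this. This is a minimal, assumed example (PlayerControls is a hypothetical view, not from any app discussed here): the same SwiftUI declaration compiles for iPadOS and visionOS, and on visionOS the system supplies hover effects and subtle depth for standard controls.

```swift
import SwiftUI

// The same declarative UI compiles unchanged for iPadOS and visionOS.
// On visionOS the system adds hover effects and subtle depth to
// standard controls; no platform-specific code is required.
struct PlayerControls: View {
    @State private var isPlaying = false

    var body: some View {
        HStack {
            Button {
                isPlaying.toggle()
            } label: {
                Image(systemName: isPlaying ? "pause.fill" : "play.fill")
            }
            .buttonStyle(.borderedProminent)

            Slider(value: .constant(0.5))
        }
        .padding()
        #if os(visionOS)
        // Optional platform tweak: a glass backing on visionOS only.
        .glassBackgroundEffect()
        #endif
    }
}
```

The one visionOS-specific line is additive; removing it leaves the exact view that ships on iPad.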

00:13:52   It's just we're trying to make it easier for developers to build the right app for the right platform

00:13:57   and allow themselves to be on multiple platforms and have them feel great so that it doesn't just feel

00:14:02   like, oh, I'm just porting my iPhone app over platform to platform. It really feels

00:14:07   alive and native. I mean, one of the things we saw in the developer labs, and we've been at the developer

00:14:12   labs since the very beginning of Apple Vision Pro, I mean, it was such a great opportunity

00:14:17   to connect with developers worldwide and really hear from them. It's like, what kind of apps

00:14:22   do they want to build and what are their fears and how are they feeling a little bit like, oh, God, 3D content.

00:14:27   And it was so magical to talk to folks and have them come in and feel like this is really

00:14:32   cool, but I'm nervous. And then have them say, oh, in 10 minutes I recompiled my app and, like,

00:14:37   yeah, there are some things that I need to change, but now I understand, now I see what

00:14:42   I can potentially do, and maybe this is going to require a little bit more work, and this is something I can do

00:14:47   in the immediate term. But I think really seeing your app on device

00:14:52   and doing that initial recompile step is the first step to figuring out,

00:14:57   what do I do from there, and how can I augment, and what makes sense in this space?

00:15:02   Yeah, and I think what's interesting, too, what I found coming to this platform, is also the understanding

00:15:07   of the user interaction model, making it comfortable for a user,

00:15:12   is very different, because there's a physicality to it. It's not a

00:15:17   window on something you're holding in your hand. It's almost an object in space that you interact

00:15:22   with physically. And so there are a lot of things you need to do differently. Having a visual

00:15:27   design that is responsive to users glancing around

00:15:32   is a very different way of thinking about it, because one thing I used to notice is there are a lot of places where

00:15:37   I had visual affordances that you need on iOS to, like, indicate something's a button, that something's

00:15:42   selectable. Whereas on visionOS, the interesting thing is, if someone looks at it, it will tell them that it's a

00:15:47   button, because it turns into a button. And I think those kinds of visual comfort things were very

00:15:52   different and interesting to get into. And I was curious if you had thoughts about things we should be keeping in the back of our minds about

00:15:57   making sure that we're building something that's actually going to be comfortable for our users, adapting to that physicality.

00:16:02   Oh yeah, we had so many conversations about button shapes. I think that's a really really great question.

00:16:07   I think some of the big things that we saw in the developer labs first and foremost like you said,

00:16:12   thinking about your UI from an eyes and hands perspective and you know the hover

00:16:17   effect does so much that you don't necessarily need the drop shadow. You don't need the outline.
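In SwiftUI terms, that idea looks roughly like the sketch below. LibraryRow is a hypothetical view invented for illustration, but the hover-effect modifiers are standard SwiftUI on visionOS.

```swift
import SwiftUI

// On visionOS, the system hover effect highlights an element when the
// user looks at it, so static affordances like drop shadows or outlines
// around tappable items become unnecessary.
struct LibraryRow: View {
    let title: String

    var body: some View {
        Text(title)
            .padding()
            // Shape the gaze highlight to the row, instead of drawing
            // a permanent outline the way you might on iOS.
            .contentShape(.hoverEffect, RoundedRectangle(cornerRadius: 12))
            .hoverEffect(.highlight)
    }
}
```

The highlight only appears while the element is being looked at, which is why the resting UI can stay much quieter than its iOS counterpart.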

00:16:22   You can do a lot with the different colored materials as well to indicate

00:16:27   kind of hierarchy on your page. You can do really interesting things

00:16:32   now that the tab bar has moved from where you might expect it on iPhone,

00:16:37   down at the bottom, or, from like a split view perspective, you can do all of

00:16:42   that on visionOS. So you have some more flexibility again in presentation, and I think that's really awesome too.

00:16:47   Like one of the things we see when folks first move from an iPad or an iPhone

00:16:52   over to visionOS is there's this moment of, oh, should I just,

00:16:57   you know, this is the default window size, like a 16-by-9 window size, and we see a lot of folks just start

00:17:02   developing for that. I think you and I had that conversation too. And then at a certain point it's like, oh, I can

00:17:07   change the size of the window to fit my content, and I don't necessarily need to keep it this way. I can make it larger. I can

00:17:12   make it smaller. But we do have a lot of conversations about that, and about field of view, right? Finding

00:17:17   ways you can explore expanding the content while keeping it comfortable in someone's field of view.
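The window-sizing point can be sketched like this. It is an illustrative fragment, assuming a hypothetical RecipeViewerApp and sizes, but defaultSize and windowResizability are standard SwiftUI scene modifiers.

```swift
import SwiftUI

// Rather than shipping the default window, size the scene to its content.
// windowResizability(.contentSize) keeps the window from being stretched
// far beyond what the layout was designed for.
@main
struct RecipeViewerApp: App {          // hypothetical app
    var body: some Scene {
        WindowGroup {
            RecipeListView()
        }
        .defaultSize(width: 640, height: 900)   // points, chosen for the content
        .windowResizability(.contentSize)
    }
}

struct RecipeListView: View {
    var body: some View {
        List(["Biscuits", "Cookies"], id: \.self) { Text($0) }
    }
}
```

Choosing a default that fits the content, rather than accepting the 16-by-9 starting point, is exactly the conversation described above.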

00:17:22   We've talked about multiple windows too. One of the really wild things about spatial

00:17:27   computing is obviously there's depth right in a way that you don't necessarily have on a 2D

00:17:32   screen which means that if you layer objects in front of and behind

00:17:37   your eyes are actually doing the same kind of work as they would in reality. So if you have multiple windows

00:17:42   and you're asking somebody to be switching constantly between those windows your eyes are actually

00:17:47   going to get really really tired. So it's one of the reasons we talk about like thinking about

00:17:52   when you're creating your UI, can you keep it contained in a single window? Where does it make sense if you're

00:17:57   moving to a supplementary window, or if you're moving to a modal? Like, making sure that that attention is focused,

00:18:02   and that if you're having to make those attention switches or those context switches,

00:18:07   they're not, like, rapid fire. You give yourself some time to focus.
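A supplementary window of the kind described, opened deliberately for one focused task, might look roughly like this. The window id and views are hypothetical, while openWindow is the standard SwiftUI environment action.

```swift
import SwiftUI

// Keep the main UI in one window, and open a supplementary window only
// for a focused task, so the user isn't forced into rapid context switches.
struct StepsButton: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Start Cooking") {
            openWindow(id: "cooking-steps")   // hypothetical window id
        }
    }
}

// Registered in the App body alongside the main WindowGroup:
// WindowGroup(id: "cooking-steps") {
//     CookingStepsView()                    // hypothetical compact view
// }
// .defaultSize(width: 400, height: 500)
```

Because the second window appears only on an explicit action, attention moves once, deliberately, rather than bouncing between layers.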

00:18:12   One of the other things that we mentioned in the session is thinking about branding in the context

00:18:17   of visionOS. So obviously in iOS and iPadOS we've done a lot with

00:18:22   colors and with color tinting and the like and I think a lot of people very

00:18:27   early on were like, okay, visionOS, my branding color is a dark blue,

00:18:32   so my window is going to be a dark blue. Just a giant wall of dark blue. It turns

00:18:37   out that making a dark blue wall on the glass material

00:18:42   looks a bit like the monolith from 2001: A Space Odyssey.

00:18:47   But that's another thing that goes back to that idea of testing on device to really understand what

00:18:52   people who are using your app are going to feel. You very quickly go, oh, I'm

00:18:57   going to do that, and then you get on device and you look at your window, and you're like, if I look at that for more than five seconds

00:19:02   it's going to make me claustrophobic. Yeah, exactly. So that's, I think, a really

00:19:07   exciting thing about, like, thinking about comfort when designing for visionOS.

00:19:12   Did I miss anything? Oh yeah, I think spatial computing is just a brand-new platform, and so giving people

00:19:17   that time on device to iterate as a developer, and then obviously seeing how your consumers are responding, is so

00:19:22   important to being able to refine that experience. And that's honestly what's inspiring to Serenity and me,

00:19:27   when we share, like, our favorite apps, saying, you know, this is a brand-new way that this developer

00:19:32   is thinking about that nuance and experience, and how they're addressing that comfort and the needs of the user in that moment.

00:19:37   It reminds me of something that I ran into when I was working on Widgetsmith, which was the reality of having

00:19:42   essentially unlimited window size, in a way that I'm not used to; on a phone it can only be so

00:19:47   big. And scrolling is a natural gesture on visionOS, but I would say it is

00:19:52   less desirable than just making, like, I could just make my window sort of tall

00:19:57   and long. You have an infinite canvas. And so the easier solution was to avoid unnecessary

00:20:02   scrolling whenever I can by just making the window tall. It's much

00:20:07   more comfortable if you never have to scroll, because it's a slightly more involved gesture than

00:20:12   just looking and clicking your fingers. Yeah, absolutely. I mean, I think the gesture system is something I really love about

00:20:17   visionOS, the fact that you can just keep your hand in your lap while you look at items and

00:20:22   tap to target them. And then even experiences that are more physical, right?

00:20:27   We have a session coming out tomorrow on designing interactive experiences,

00:20:32   where we talk about Encounter Dinosaurs a little bit. And one of the things that

00:20:37   is done in Encounter Dinosaurs that I think is really cool is that it's an experience

00:20:42   where physical, like, custom gestures are crucial to the experience.

00:20:47   Right. If you want to interact with the dinosaurs you're holding your hand up to let them sniff you. You might want to try and pet the big

00:20:52   dino, and he might snap at you. But that's not necessarily how everybody wants to interact

00:20:57   or can interact. So there's a fallback interaction system built into that

00:21:02   app, where you can actually choose the alternative gesture system, and that gives you more of a, like, video

00:21:07   game type modal experience, where you can actually just interact by looking at

00:21:12   an action sheet and basically being able to say, like, oh, I want to be friendly to the dinosaur, or I want to be a little bit

00:21:17   more ferocious to the dinosaur. So I think there are opportunities with visionOS: even though we give you

00:21:22   the affordance for really awesome custom gestures, we're also always thinking about accessibility,

00:21:27   and ways to not only, like, make it accessible to the folks who can,

00:21:32   like, jump up and interact and slice with both hands. Synth Riders is another great example

00:21:37   of that, where Synth Riders is of course an Apple Arcade game. It's really great. It's been on a couple

00:21:42   of other platforms as well, and you use, like, both hands to, I don't even know how you describe it, Kristin,

00:21:47   like, punch or dance your way through objects. But

00:21:52   they have a one-handed mode, the recognition that, like, sometimes you may not be able to use two hands while you're doing this.

00:21:57   So those are other, I think, really great affordances for comfort and

00:22:02   accessibility. When we think about designing these apps, even if you're going into that 3D space,

00:22:07   you're going more immersive, you're exploring custom gestures, always think about

00:22:12   how you can support the standard interactions and really make it accessible to all.

00:22:17   Yeah, I think, too, I mean, I think visionOS is an interesting place for me, where it

00:22:22   reminds me very much of the early days of watchOS. So both Marco

00:22:27   and I developed watchOS 1 apps, watchOS 2 apps. You've been on this journey for a long time, and

00:22:32   watchOS 11 now is a completely different platform than it was

00:22:37   back in the early days. So much better. It is immeasurably better, and I think what is interesting is I have exactly

00:22:42   the same thought: visionOS 1 was a very strong start, but to think

00:22:47   about going forward where it's going to be, and to be a developer that's part of that journey...

00:22:52   It feels very young at this point, but there's a lot of opportunity, and I think even just in this

00:22:57   first release of visionOS 2 I'm seeing lots of things where you're responding to the needs that we ran into,

00:23:02   like, we have custom hover effects now. That's one of those things that, like, really bugged me, that I couldn't do

00:23:07   previously, but it's like, nope, now I can do that, and it's lovely to see that journey is underway

00:23:12   and look forward to seeing where it's going. Totally. With visionOS 2 there's so much that we're addressing,

00:23:17   as you said, things that we're hearing from the developer community, and that's a result of the direct engagement of our engineers

00:23:22   and our design evangelists with those developers throughout the last year. You're also seeing a lot

00:23:27   of ways that we're just trying to make it easier to make incredible spatial apps. I think a great

00:23:32   example of that is TabletopKit, where you have pieces, dice, cards,

00:23:37   and it's all built in, so you can get that experience up and running to have a meaningful

00:23:42   tabletop game you can play with a friend. You know, spatial Personas and all of that,

00:23:47   built in, trying to reduce that time so the developer can focus on the experience and what they

00:23:52   want to build, and then the software in visionOS 2 can actually just support that all the more, to make it faster.

00:23:57   Yeah, I think that's a really great point, Kristin. I think the idea, again, of

00:24:02   designing spatially can be a really daunting thought for a lot of people, and the fact that you can take

00:24:07   all of these, you know, these frameworks that you know and love on other platforms, and they just

00:24:12   work out of the box, and now our thought is, like, how do we push that forward? How do we make

00:24:17   it even easier? Spatial photos and video, I think, is another great example of that. You know, we have a great session

00:24:22   that I think came out yesterday on how to really

00:24:27   build your own spatial photos and video and how to incorporate that stuff in your apps.

00:24:32   And that's, you know, stereoscopic video, the idea that you could pick up your

00:24:37   iPhone and shoot it, and suddenly you have 3D video in your pocket. Like, that's not something that

00:24:42   has been accessible to all that many people until recently, and I think that's another exciting

00:24:47   aspect of this platform: like, how can we reduce complexity for developers, for

00:24:52   consumers, just for everybody, to really get a chance to experience spatial computing?

00:24:57   The other thing I think that we've really learned in the developer labs is that one-to-one

00:25:02   connection with our developers, and understanding how they want to build apps and what they're excited

00:25:07   about. And I think, I mean, we've got 30 sessions this year on visionOS

00:25:12   2, and you might not think that, you know, initially, but you dig into it and you're like,

00:25:17   oh man, there's so much new stuff here. There's a bunch of new RealityKit APIs. There's a ton

00:25:22   of improvements in SwiftUI, from windows, like being able to automatically position windows

00:25:27   and automatically close windows when others open, to the additions to volumes,

00:25:32   to things we've heard from developers about immersive spaces, right, being able to

00:25:37   use passthrough for interesting games and experiences. There are all of these

00:25:42   little features that I think we're just building on top of to really enable developers

00:25:47   to make their best work on this platform. And I think, for me, I enjoy that

00:25:52   process of being on this platform early on. I could understand a developer who says they want to

00:25:57   show up five generations in, in visionOS 5, and that's where they want

00:26:02   to start. But for me, I enjoy being at this part of it, because it's nice to be at the start. And I think

00:26:07   in five years I'll be a better visionOS developer for having gone through this process, because having seen

00:26:12   and felt the pain points, and then found the way to work through them,

00:26:17   makes you better at understanding when to do that, when not to do that, and gives you a deeper understanding of the platform.

00:26:22   And so that's one of those things that I've enjoyed. It's a young platform that's definitely in

00:26:27   its early infancy, but it's been fun to see it start to develop even a few months into

00:26:32   its introduction. Yeah. And I mean, we love that. We're not going to

00:26:37   know what your pain points are unless you poke at them and you tell us, right? Like, we have some

00:26:42   great ideas for the platform, but it takes developers. We are

00:26:47   better together when we understand what it is that you want to build, how you want to build it,

00:26:52   how you want to use this platform, the wild ideas that you have. I mean, I think of our

00:26:57   two Apple Design Award winners this year for spatial computing, and some of the initial ideas as they were exploring this

00:27:02   platform. It's just like, wait, you want to do what? And then you have people bounce up

00:27:07   to it and go, oh, actually, that's really fun. That's fantastic. And I think those kinds of

00:27:12   things only happen when we get to have that direct developer and

00:27:17   Apple interaction. And that's why I love the developer labs so much. I think it's been a really, really awesome learning

00:27:22   opportunity for us. Yeah. I was curious: if someone wanted to get into visionOS

00:27:27   development, what would you say are the best starting points for them to learn? Is it the videos,

00:27:32   or are there other places in the developer resources where they should start to look?

00:27:37   Yeah, that's a great question. There are so many developer resources, so I think this time of year is a great time, as

00:27:42   Fernanda mentioned, to dive into the sessions. I think the best-in-class development we've

00:27:47   been sharing with SwiftUI over the last few years is a great foundation as well.

00:27:52   Just thinking about SwiftUI, RealityKit, all these core frameworks that sit at the foundation of

00:27:57   all the best apps on Apple platforms. And again, you can write one line of code

00:28:02   and use it in so many different places. So I think that's a great place to start, ideologically. But

00:28:07   we really expanded the number of resources that are available this year, and that was a huge priority

00:28:12   for us. Absolutely. We just launched a new Pathways program on developer.apple.com,

00:28:17   so we've got a visionOS pathway, which kind of outlines what you need to get started. We also have these amazing

00:28:22   guides that run throughout WWDC24. There's a guide on each of our main topics, including

00:28:27   spatial computing, and that lays out all of the sessions that are new. We've also released

00:28:32   a ton of new sample code, too. And the sample code projects, I think, are really, really great ways in,

00:28:37   like our sample code projects from last year, like Hello World. Destination Video has been updated this year,

00:28:42   so for people building media experiences, they can play with the new AVKit features like light spill and docking.

00:28:47   Like, I think there's a lot to dig into, and we've really thought about that

00:28:52   initial guide in. And then we're also continuing to think about creators and

00:28:57   folks who haven't necessarily, you know, been with our platforms for a long time,

00:29:02   right? They might be new Apple developers. They might be creators on other platforms who are curious about

00:29:07   spatial computing on Apple Vision Pro, and we're really thinking about ways to reach

00:29:12   them where they are as well. Great. Thank you so much, Serenity Caldwell and Kristin

00:29:17   Ora. Thank you so much for being here. Everyone out there, thank you for listening. Best of luck building your

00:29:22   visionOS apps, and we'll talk to you all next episode. Bye.