00:00:13 ◼ ► We are thrilled to once again be live and in person at Apple's Worldwide Developer Conference
00:00:19 ◼ ► 2024. Now won't you please silence your devices, then open your ears and put your hands together
00:00:37 ◼ ► Hello! Welcome to The Talk Show. I am your host John Gruber. Most of you here are probably
00:00:52 ◼ ► familiar with me. Welcome! I was told we were backstage and the house sound guys were like,
00:01:01 ◼ ► "This crowd is really loud. What do you do?" It is great. The crowd is always great here.
00:01:11 ◼ ► It does feel a little extra energetic and with good reason. I think we're going to have
00:01:23 ◼ ► people to thank. These are our sponsors, without whom we would not be here, trust me. They
00:01:30 ◼ ► are amazing sponsors, three of them. The first amazing sponsor is iMazing. iMazing 3 is the
00:01:47 ◼ ► iPhone's release way back when, the iMazing team has been delivering the missing features
00:01:59 ◼ ► your computer. I don't know, Phil Schiller obviously still has an iPod. I know he flies
00:02:05 ◼ ► with it. Version 3 of iMazing features a brand new user interface. It is so beautiful. It
00:02:24 ◼ ► to discover, to explore, to get these features. But iMazing gives individuals, families,
00:02:36 ◼ ► and enterprises a way to efficiently manage their Apple mobile devices.
00:02:43 ◼ ► It includes powerful local backups with incremental snapshot support, text message extraction
00:02:51 ◼ ► options, whether for compliance or keepsakes. You can tell from that word compliance, probably
00:03:01 ◼ ► one of those tools that scales from you to really big companies. Versatile data extraction,
00:03:09 ◼ ► file transfer, it is natively coded for the Mac by a passionate team in Switzerland for
00:03:20 ◼ ► world. Check out iMazing 3 today and save 20% by visiting iMazing.com/thetalkshow. That
00:03:31 ◼ ► is iMazing.com/thetalkshow. Second, my good friends at Flexibits, the makers of Fantastical
00:03:46 ◼ ► and Cardhop. These are, you guys probably have heard of them, these are amazing. Fantastical
00:03:54 ◼ ► is a calendar, Cardhop is a contact manager, absolutely gorgeous, previous Apple design
00:04:01 ◼ ► award winner and they are for every platform Apple makes. Of course the Mac, iPhone, iPad,
00:04:10 ◼ ► they even have right on day one a fantastic version of Fantastical for Apple Vision Pro.
00:04:20 ◼ ► Really gorgeous. Now I'm not naming names, but there are other companies that don't have
00:04:25 ◼ ► a calendar app that is native for Vision Pro. Fantastical was native on day one and it is
00:04:39 ◼ ► not know. I got to find Galenzi in the audience after the show and petition him, I don't know,
00:04:45 ◼ ► must be a back story there, but it is beautiful, beautiful apps. They also have Flexibits Premium
00:04:57 ◼ ► get Cardhop and Fantastical for all the platforms. You don't just pay per platform, you just
00:05:09 ◼ ► they have been using the fact that it is a subscription service to build out an infrastructure
00:05:15 ◼ ► on the web to do things like doing the thing where you pick a time for an event and people
00:05:30 ◼ ► they need to the cloud, they don't keep anything they don't need, but to coordinate picking
00:05:45 ◼ ► it at 2:30." That sort of stuff is all built into the Flexibits Premium service, it is
00:05:57 ◼ ► fun stuff like stickers that you can put in messages. I mean, the whole scale from totally
00:06:09 ◼ ► really great. New and existing customers with this special deal for this show, new customers
00:06:20 ◼ ► already subscribe, what's in it for you? Well, guess what? Go to this code, go to this URL
00:06:25 ◼ ► and when your renewal comes up, just by having gone to this, you'll get 20% off when you
00:06:31 ◼ ► renew. Everybody gets 20% off, new and existing, by going to flexibits.com/wwdc2024. That's
00:06:58 ◼ ► use Flighty? I use Flighty, it is fantastic. Flighty, another Apple Design Award winner
00:07:10 ◼ ► back in 2023, way back last year. Every download of Flighty, you get the pro features complimentary
00:07:29 ◼ ► can see what it's like. I guarantee you, you will sign up after you've flown with Flighty
00:07:34 ◼ ► compared to flying without it. Every year, there's a new record for problems with flights.
00:07:41 ◼ ► I mean, I don't think I'm making this up here. Flying is getting worse. Last minute delays,
00:07:50 ◼ ► more disruptions, plus the airlines are secretive and they don't really want to keep you up
00:08:01 ◼ ► has and keeps you up to date. I cannot tell you just in the last year how many times Flighty
00:08:05 ◼ ► has told me about gate changes. It's not just like, oh, you go from A-10 to A-8, but when
00:08:10 ◼ ► you go from gate A-23 to gate D-12. It's like, oh, that's 20 minutes away in the airport.
00:08:18 ◼ ► Flighty tells you as soon as it happens. You're not at the wrong end of the airport. It is
00:08:21 ◼ ► absolutely fantastic. They've got great features for sharing your flights with friends or friends
00:08:29 ◼ ► sharing your flights with you. So if somebody in your family or dear friend is traveling,
00:08:38 ◼ ► hooks up to all of the latest stuff that you would want them to. Live activities on the
00:08:43 ◼ ► home screen, everything you want. It is so fantastic, so beautiful. Honestly, the airlines
00:08:50 ◼ ► just want you to sit quietly and just be uninformed at the gate, get there three hours early and
00:08:54 ◼ ► just sit there on your iPhone. No, get Flighty. You will be there when you need to be there
00:09:00 ◼ ► without wasting time and you will be informed of everything you need to know. Go to flighty.com/tts.
00:09:10 ◼ ► The talk show, TTS, flighty.com/tts. My thanks to all of those sponsors. And now with no
00:09:47 ◼ ► shrunk. >> I'm not really Craig Federighi. I can never pull off the real hair. Anyway. All
00:09:55 ◼ ► right. Let's bring up the real Craig. >> I think they got both Craig Federighi and Greg
00:10:07 ◼ ► Joswiak at the same time. >> They did. It's a twofer. >> That was fantastic. >> That's as
00:10:21 ◼ ► direction. >> How many people in the audience saw last year's show? How many people remember
00:10:42 ◼ ► shredding. We don't have anything like that today. Now, how many people here were at Apple
00:10:50 ◼ ► Park for the keynote yesterday? And I'm sure for everybody watching in post on YouTube,
00:11:01 ◼ ► I'm sure everybody has watched the keynote. But for those who were in Apple Park the last
00:11:07 ◼ ► couple of years since this new format has been out, the keynote starts right at 10 a.m.
00:11:12 ◼ ► but around 9:55, Tim comes out, welcomes the crowd, has Craig come out to also say hello
00:11:28 ◼ ► are no gags, there are no -- there's no zany opening movie, and none of that is true. Well,
00:11:58 ◼ ► up there. We have a helmet. >> That helmet is heavy, by the way. >> But I am to understand
00:12:06 ◼ ► there was some concern about the helmet and the hair. >> I mean, not especially. >> But
00:12:12 ◼ ► I thought it looked especially good on Joz. >> Yeah, I had cool Craig hair. Hey, we do
00:12:48 ◼ ► looked realistic. You did not jump out of an airplane. >> Whoa, whoa, whoa, whoa, whoa,
00:12:54 ◼ ► what the hell, Gruber? What are you implying? >> But here's my real question. My real question,
00:13:03 ◼ ► number one, how great was it to see Captain Phil Schiller flying the airplane? [ Applause ]
00:13:12 ◼ ► >> I will tell you that stuff wasn't the only take. >> Well, that was my question. Are there
00:13:25 ◼ ► alternate takes where you didn't say I'm getting too old for this stuff? >> That was the G-rated
00:13:38 ◼ ► think you guys did a very good job ordering the keynote, and I think we can stick to roughly
00:13:51 ◼ ► order for a reason. >> It opened with visionOS 2 and a sort of state of Apple Vision Pro.
00:14:03 ◼ ► Do you find it frustrating that five months into shipping, four months? >> Four, four,
00:14:24 ◼ ► really as popular as the iPhone. This product is a dud. >> John, we've gotten that with
00:14:42 ◼ ► a $399 music player, CD players are $39. There's no future in this. And the reality is it takes
00:14:49 ◼ ► time. It took us about two years, right? And iPod, if you remember that worked out pretty
00:14:54 ◼ ► well. But in the meantime, everybody was calling for its death. Same thing, by the way, for
00:15:07 ◼ ► even after we were sometime into it, we said, okay, we'd love to get 1% market share of
00:15:14 ◼ ► the phone business in the first full year, which in 2008, we just barely made it, right?
00:15:19 ◼ ► And that turned out pretty well as well. >> Got past two, I think. >> Got past two. And,
00:15:29 ◼ ► you know, I mean, we have all the time. And watch has gotten it. I mean, oh, what a dud,
00:15:42 ◼ ► whose quote it was, but somebody who talked about the fact that people tend to overestimate
00:15:49 ◼ ► things that can happen in the near term and they underestimate things that happen in the
00:15:53 ◼ ► longer term. And I think that product expectations are much like that. It takes a little while
00:16:04 ◼ ► the canonical example of that? That what platform ever debuted as a huge hit? Maybe the web,
00:16:15 ◼ ► right? You could sort of say sort of exploded and then you look back to -- >> Yeah, even
00:16:19 ◼ ► that. Started slower than you think. >> Right. And then you look back to like -- >> That's
00:16:27 ◼ ► like, oh, no, there was like a five-year stretch there where I was still using Gopher. >> Originally
00:16:32 ◼ ► had to buy a NeXT machine in order to do the web. So the market was contained for a while
00:16:37 ◼ ► there. >> But I'll tell you what, we're psyched over how many apps we already have for Vision
00:16:40 ◼ ► Pro. We have over 2,000 native apps already. We had over 1,000 just in the first few days,
00:16:46 ◼ ► which was faster start than even we had for the original app store. And let's not forget,
00:16:51 ◼ ► we have a million and a half compatible apps. So there's a lot of things you could do with
00:16:54 ◼ ► it even today, even as we're, as you said, building a platform. So it's pretty exciting.
00:17:07 ◼ ► >> One of the themes I thought for this show and talking about Vision OS and the platform
00:17:17 ◼ ► is just the word patience. And I think that Apple as a company has institutional patience
00:17:25 ◼ ► that other companies in the field don't have. For example, I can think of a company that
00:17:35 ◼ ► was also in the AR/VR space and even renamed themselves after, I can't remember the word.
00:18:26 ◼ ► it's the new, you know, it's the hot new thing, chasing the hot new thing and then another
00:18:30 ◼ ► hot new thing and then you chase it. Whereas Apple, you know, there was the sometime in
00:18:39 ◼ ► the last decade, a thousand no's for every yes. But that when you guys get to a yes, you stick
00:19:06 ◼ ► >> WWDC is a developers conference, but I thought that some of the news yesterday about
00:19:13 ◼ ► Vision was, it's -- if you expand developers to creative people, it was creating content
00:19:22 ◼ ► for Vision Pro with partnerships, with Canon making a new lens with two elements so that
00:19:44 ◼ ► Apple is. Where Apple has always been is creating these platforms for creative people to make
00:19:58 ◼ ► >> Well, let me just maybe even kind of build on that because that was probably one of the
00:20:01 ◼ ► biggest questions we got from creatives and film makers was I want to be able to do this.
00:20:07 ◼ ► Because once you see a movie, you know, even a 2D movie, it looks amazing. You see a 3D
00:20:18 ◼ ► directors saying this is the way they always envisioned a 3D movie to be experienced is how
00:20:23 ◼ ► it was in Vision Pro. And then when you see the immersive video we did, it's like being
00:20:27 ◼ ► there. And of course, the film makers realizing this feels like the future, right? This feels
00:20:40 ◼ ► the Blackmagics to be able to say let's enable an ecosystem because also as part of building
00:20:44 ◼ ► a platform, you know, is an ecosystem, right? An ecosystem of content providers, app providers,
00:21:06 ◼ ► >> Home screen customization. Apparently it takes 18 revisions to get to the point where
00:21:25 ◼ ► >> You talked earlier, John, about patience. You got to wait, you know, pick your moments.
00:21:38 ◼ ► We thought 18 years is about the right moment to move an icon from the top down to the bottom.
00:21:57 ◼ ► >> There were actually developers that were born. We had to wait until they could mature
00:22:20 ◼ ► >> I think -- and I know we have very serious, very complicated, very serious computer science
00:22:34 ◼ ► be beloved feature in addition to being able to arrange the icons where you want, but being
00:22:40 ◼ ► able to customize the look and flipping to dark mode, I never thought of it before, but
00:22:50 ◼ ► really had occurred to me before that the icons don't change in dark mode, and now they
00:22:54 ◼ ► do, and they do look really cool. But I think even cooler than that is the customization
00:23:04 ◼ ► example in the keynote was very, very vibrant, like a vibrant electric yellow, and it themed
00:23:37 ◼ ► Photoshop files, you would drop Photoshop's icon on Folder Icon Maker, and it would spit
00:23:55 ◼ ► icon of the app. I spent a lot of time on that. But a couple of years ago with shortcuts,
00:24:07 ◼ ► all of his icons -- and it was sort of a three or four step process where you make a shortcut,
00:24:11 ◼ ► and the shortcut just opens the app, but then you can assign a custom icon to the shortcut
00:24:23 ◼ ► some time, make a whole home screen of black and white icons, monochromatic, and then if
00:24:32 ◼ ► over and do them all over again. Now you've made a feature where, you know, I think that's
00:24:41 ◼ ► >> I am glad, though, that you were able to pass that legacy on to your son. Had we made
00:24:48 ◼ ► it too easy, he would have missed that critical step in his maturation, and now he's stronger
00:24:56 ◼ ► >> But do you see that as a people pleasing, like, yes, people are going to love this feature?
00:25:08 ◼ ► mean, just even the number of people becoming developers to download the beta on that first
00:25:39 ◼ ► >> But, you know, I think that at times, I mean, you guys are -- Apple is very famously
00:25:45 ◼ ► known for its design strength and, you know, you buy Apple products because they are well
00:25:53 ◼ ► designed and they are designed by Apple designers, but it's not, oh, here's how we want your
00:25:58 ◼ ► iPhone to look. Your iPhone is going to look just like the ones in the Apple store with
00:26:03 ◼ ► our wallpaper and our arrangement and, no, I mean, this is -- your iPhone can look like
00:26:15 ◼ ► you build a new product, I think you do want to agree to establish a kind of shared reality
00:26:20 ◼ ► and identity for that product and we certainly did that pretty thoroughly with iPhone. But
00:26:52 ◼ ► we pushed that frontier a little further and I think people are really going to enjoy it.
00:26:58 ◼ ► >> I would say the biggest upgrade to control center since control center was added, where
00:27:03 ◼ ► control center now is almost like a mini environment within itself with third party APIs for controls,
00:27:18 ◼ ► including all the way back to the lock screen where you can -- previously it's like you
00:28:01 ◼ ► of course using shortcuts as a means to do that. And so the idea of making controls really
00:28:08 ◼ ► kind of a universal concept in the system, letting you of course tie them to the action
00:28:16 ◼ ► you're going to want to customize control center and a lot of ways to do that. You might
00:28:19 ◼ ► have more of them, so you might want multiple pages of control center. And then the biggest
00:28:39 ◼ ► of power. And the other thing we did many years ago, you might remember, control center
00:28:50 ◼ ► that design had the property that it was guaranteed that every time you went to control center
00:28:53 ◼ ► it was on the last page you were on that wasn't the page you wanted to use next. So we backed
00:28:59 ◼ ► off from that and created an all in one design. For this one, of course we preserved the ability
00:29:16 ◼ ► that was super important to make it browsable, but also essentially instantly navigable as
00:29:26 ◼ ► And I think developers now just exposing more and more of their capability as actions and
00:29:52 ◼ ► we were going to talk about the show in order. So I'm just figuring. You're a thorough man.
00:29:58 ◼ ► >> All right. Here's a feature that I think kind of flew by in the keynote, but it really
00:30:08 ◼ ► >> We now know where every -- by the end of this we're going to have a map of where everybody
00:30:41 ◼ ► use Bluetooth, which brings up a system prompt, and it would say this wants to use Bluetooth
00:30:47 ◼ ► and the explanation text would be like it wants to connect to certain things or something
00:30:52 ◼ ► like that. And it's like allow or don't allow. And it's like, well, I don't even -- I have
00:30:57 ◼ ► no idea why this thing I bought wants to connect to Bluetooth. What am I doing? It's very confusing.
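For context on what is behind those prompts, here is a minimal Core Bluetooth sketch (not Apple sample code; the class name is made up) showing why the old dialog is so open-ended: merely creating a CBCentralManager triggers the system prompt, whose only explanation is the app's NSBluetoothAlwaysUsageDescription string, and once allowed the app can scan and see every advertising device nearby.

```swift
import CoreBluetooth

// Hypothetical illustration: instantiating CBCentralManager is what surfaces
// the "allow or don't allow" Bluetooth prompt being described here.
final class AccessoryScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager?

    func start() {
        // Creating the manager triggers the system permission dialog; its
        // explanation text comes from NSBluetoothAlwaysUsageDescription in Info.plist.
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // nil means "discover everything advertising in range," which is the
        // open-ended visibility the new per-accessory setup flow narrows down.
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("Saw nearby device:", peripheral.name ?? "unknown", "RSSI:", RSSI)
    }
}
```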
00:31:27 ◼ ► to yesterday, I guess, that when you set up that accessory, if you felt comfortable proceeding
00:31:35 ◼ ► and you set up your drone or whatever it was, that app still had open-ended access to Bluetooth
00:31:41 ◼ ► or Wi-Fi or whatever, and so you don't know what kind of signature it's gathering of discovering
00:31:46 ◼ ► nearby devices and so forth, and you probably didn't go back afterwards and take that permission
00:31:52 ◼ ► away and maybe the app even still needed it. So we've really focused over the years, we've
00:31:57 ◼ ► seen this on feature after feature where we've made the granting of access very clear and
00:32:10 ◼ ► know, today you'd have an app, maybe you want to do some messaging with it or something,
00:32:14 ◼ ► and it says, all right, this app wants access to your contacts in order to give you maybe
00:32:33 ◼ ► the full history of your contacts, present and future, out of which we've seen a history
00:32:38 ◼ ► of people kind of hoovering that up and using it to build a social graph on their servers
00:32:47 ◼ ► to really cover in great detail in the keynote is that we've given apps a great out-of-process
00:32:54 ◼ ► API for, if you're typing in that app, trying to type a completion, there's a way that they
00:32:59 ◼ ► can surface in their own search UI, without seeing the contact they're surfacing, a
00:33:03 ◼ ► match that we get out of your contacts, so then you can grant access one at a time. So
00:33:08 ◼ ► this gives you the same kind of convenience you want without just opening the kimono on your entire contact list.
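As a rough sketch of the all-or-nothing pattern Federighi is contrasting with the new flow, the long-standing Contacts API looks something like this (illustrative only; the iOS 18 per-contact grant is surfaced through newer system UI rather than this exact code path):

```swift
import Contacts

// Pre-iOS-18 pattern: one prompt, and a grant hands over the whole database.
let store = CNContactStore()

switch CNContactStore.authorizationStatus(for: .contacts) {
case .notDetermined:
    // The system prompt's explanation is whatever NSContactsUsageDescription
    // says -- the "this app wants access to your contacts" dialog described above.
    store.requestAccess(for: .contacts) { granted, _ in
        print(granted ? "Full contact access granted" : "Denied")
    }
case .authorized:
    // Once granted, the app can enumerate every contact, present and future.
    let keys = [CNContactGivenNameKey as CNKeyDescriptor]
    let request = CNContactFetchRequest(keysToFetch: keys)
    try? store.enumerateContacts(with: request) { contact, _ in
        print(contact.givenName)
    }
case .denied, .restricted:
    print("No access")
@unknown default:
    // Newer partial or limited access levels (such as the iOS 18 behavior
    // described here) fall through to this case on older SDKs.
    print("Limited or unknown access level")
}
```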
00:33:32 ◼ ► people who make the apps for those peripherals that are on their phone that asked for the
00:33:52 ◼ ► oh, well, before you say allow or don't allow, here are all the Bluetooth things that
00:33:59 ◼ ► are in your network right now that you would be telling this app that you have, or if it's
00:34:05 ◼ ► Wi-Fi, here's all the devices, and in my house, an awful lot of devices are on the Wi-Fi.
00:34:10 ◼ ► But that's really, really just fantastic. Because I think those dialogues before
00:34:17 ◼ ► like I have no idea what it's going to I don't really feel comfortable giving this access
00:34:25 ◼ ► it shows you what it would see. Yeah, these dialogues. It started with location
00:34:31 ◼ ► where you need some text in your location, you didn't quite get a visceral sense of what
00:34:46 ◼ ► and kind of show you where you've been right as well to give this sense just at a glance
00:34:50 ◼ ► visually like this is what you've been handing over. This is what you're going to be handing
00:34:59 ◼ ► sharing because a little bit of explanatory text, you know, doesn't
00:35:13 ◼ ► years. Now, the other problem I see this solves is the accusation that okay, AirPods get a
00:35:30 ◼ ► get this really beautiful UI. What about third parties? Well, sure. And the pattern that
00:35:39 ◼ ► I see, here's the pattern that I see: there's a problem. The problem is it's really hard
00:35:43 ◼ ► to pair wireless earbuds with a device. Yeah. So Apple solves the problem with a really
00:36:05 ◼ ► step two and three is patience. That's one of our themes. It is the virtue. But is that
00:36:13 ◼ ► what do you say to someone who says that it should ship as a public API right away? Like
00:36:17 ◼ ► the day AirPods dropped, that should have been an API that any other company that makes
00:36:23 ◼ ► headphones could have used? Well, I think a lot of where our ability to innovate
00:36:30 ◼ ► comes from is our ability to rapidly try things out, some of which don't work, some of which
00:36:37 ◼ ► we need to get out there and learn and perfect before they become API. An API is a contract, virtually
00:37:01 ◼ ► here who knows, you know, you put something together for use inside your app. That's one
00:37:05 ◼ ► thing. Getting exactly the right abstractions, the right contract that you want to live with
00:37:11 ◼ ► and support for decades is a whole different exercise. And so the practice is, we have teams
00:37:19 ◼ ► internally that, you know, live on -- if you're any team shipping software with the operating
00:37:24 ◼ ► system, you're taking the pain of living on a lot of work in progress that is different
00:37:32 ◼ ► teams at Apple trying out new framework APIs and designs, and you're riding that crazy wave
00:37:51 ◼ ► I think that's a virtuous cycle for us to be able to perfect it and then
00:37:56 ◼ ► scale it out. And so do you think people out there need a little more patience? And yes,
00:38:05 ◼ ► absolutely. Speaking of patience, speak to me about the engineering challenges of bringing
00:38:49 ◼ ► cool stuff in messages, but I'm going to move on to Photos. And Photos for both iOS and iPad
00:39:03 ◼ ► OS is very familiar when you launch it, but it is a significant new version.
00:39:12 ◼ ► But when I heard that the tab bar at the bottom was eliminated, I have to admit I wasn't sure.
00:39:18 ◼ ► That seems like the wrong way to go. It seems like, oh, here's a list of photos. And now
00:39:36 ◼ ► running iOS 17 and I realized I don't really use that tab bar. I'm always in library. And
00:39:43 ◼ ► what are the what were those tabs for? They were for things like albums and stuff, which
00:39:48 ◼ ► you manually create and put things in or manually defined smart albums on the Mac or something
00:39:53 ◼ ► like that. But that you the user sit there and spend time on a Saturday afternoon organizing
00:39:59 ◼ ► and not necessarily the replacement. But the next step in iOS 18 and iPadOS, or in all the
00:40:06 ◼ ► platforms and photos is using machine learning even more to automatically categorize your
00:40:13 ◼ ► photos into things like trips or groups or what other ways is machine learning organizing
00:40:32 ◼ ► pretty great. But now we even do like groups of people. So because if you analyze the graph
00:40:38 ◼ ► of photos and who appears with whom, you'll find things like in my library also, you know,
00:41:07 ◼ ► away and certain kind of photos and it's amazing. Just turns out we can do an extremely good
00:41:13 ◼ ► job of finding trips, finding the best photos of those trips and it creates a really just
00:41:20 ◼ ► compelling way to look back. But I think that design point about the tabs, this was one
00:41:28 ◼ ► that we worked on a long time. In fact, this was one where we were building it, you know,
00:41:51 ◼ ► that. But part of what we realized is for many users, this is true across a lot of apps
00:42:21 ◼ ► probably rarely went to unless maybe you clicked on a widget or something and it took you in
00:42:29 ◼ ► and happening over time. And it was like this hidden gem that we didn't have the right way
00:42:42 ◼ ► it was before. You want to see these organized collections. Well, they're right there too,
00:42:47 ◼ ► each of them just as accessible as the other. I think it really came together to be interesting
00:42:57 ◼ ► apps, because I think it's come together really, really nicely. Off the top of my head, I think
00:43:11 ◼ ► who doesn't have a large collection of photos in their library. It's for very typical users.
00:43:40 ◼ ► unread still. But everybody has lots of email, but most people tend to live at the top of
00:43:46 ◼ ► their inbox at the new email. But photos is a thing where the stuff that's in there, you
00:43:52 ◼ ► know, if you have 50,000 photos in your library, if you go back 40,000 photos, if anything,
00:44:03 ◼ ► new picture. If I lost my last week of photos, I can take a picture of my family from the
00:44:14 ◼ ► mean, I think it's very true. It's the one thing that everybody has in their lives where
00:44:24 ◼ ► machine learning, I think, it's just a step change in exposing this and changing the way
00:44:36 ◼ ► right? And you want to go back and enjoy them. And I think the team's done an amazing job
00:44:39 ◼ ► with everything from the photo widget to this redesign to exposing those things. I mean,
00:44:47 ◼ ► And you share it, right? You go share it back with your family. It's like, that's why we
00:44:58 ◼ ► when you were growing up, but in my household, I don't know, there were maybe four photos
00:45:11 ◼ ► something like that. And growing up, those are like the only photos. My self-conception
00:45:16 ◼ ► is rooted in like a handful of photos. And now, my kids live in an entirely different
00:45:43 ◼ ► but it's different at least. And I'm sure it's good. I'm sure it's all psychologically.
00:46:04 ◼ ► And I mentioned Mail. Mail very similarly has some very nice machine learning updates this
00:46:10 ◼ ► year with things like categorization of your incoming mail. These are where things like
00:46:16 ◼ ► receipts go. These are things you subscribe to, a bunch of newsletters. They go somewhere
00:46:20 ◼ ► else and it leaves your inbox for actual email from actual people trying to get to you. I
00:46:32 ◼ ► line in the keynote that, hey, maybe the first two lines of an email are not the best way
00:46:38 ◼ ► to understand what the email is about. Use machine learning to summarize the email. And
00:46:45 ◼ ► if you're only going to have, you know, it's a row of your inbox and you only get two or
00:46:48 ◼ ► three lines of text. Let AI give a summary. It's got to be better than the first two lines
00:46:52 ◼ ► that like PR people send me, right? It's life. Oh, PR people out there. But not your emails.
00:47:04 ◼ ► I'm talking about other PR people. I get these, these messages and they used to say, you know,
00:47:15 ◼ ► just for me. Now it just says wants free iPhone. That's right through it. That's also for me.
00:47:30 ◼ ► But photos, like I said, I think, I don't know, do you guys know like what is the average
00:47:37 ◼ ► photo library size? I don't know if you guys even collect that. I don't know. But we know
00:47:43 ◼ ► more pictures are taken with an iPhone than any other camera in the world. Anecdotally,
00:47:48 ◼ ► I think, you know, very, very typically tens of thousands, lots, lots and lots, lots of
00:47:58 ◼ ► you know, it's part of the glue. Lots of people are age choosing. Yeah. I was gonna say it's
00:48:09 ◼ ► for that, some other apps that I don't, there may be some others. Yes. So you're saying
00:48:13 ◼ ► those young engineers who got the home screen to have icons for everyone and don't answer
00:48:17 ◼ ► their emails. Not on the home screen. Yes. But these are very practical apps that people
00:48:27 ◼ ► use very widely and machine learning is making things they actually do better or making it
00:48:35 ◼ ► so they do the things in way less time, like skipping the "wants free iPhone" email. I mean,
00:48:50 ◼ ► mean, I think it's a theme from Apple over many years now that in many cases you didn't
00:49:03 ◼ ► take a good photo? Why is photos building me a nice memory? How is it surfacing a photo
00:49:11 ◼ ► I actually care about? Yeah, there was a lot of powerful machine learning happening behind
00:49:33 ◼ ► photo of a person I think you would like on your watch, but it's also one that's framed
00:49:40 ◼ ► well for Apple Watch and with machine learning. Here's where we think the numerals for the
00:49:53 ◼ ► yeah, this is like, these things look designed, they look like posters. Yeah. But where does
00:49:59 ◼ ► that are you guys frustrated or you guys just don't care that these features are in the
00:50:11 ◼ ► of these whiz-bang new intelligent software features. Well, we've been doing machine learning
00:50:18 ◼ ► and AI. You know, you've heard me say it for so many years. We didn't even call it ML or
00:50:24 ◼ ► AI. We called it proactive. If you remember, we're doing proactive features. We're doing
00:50:29 ◼ ► this stuff for a long time. We've been building neural engines for seven years now, I think,
00:50:33 ◼ ► to help us do AI and ML on devices. So we've been at this a long time. Well, I mean, that
00:50:40 ◼ ► has been the funny bit now with the AI PC. It's like someone discovered the idea
00:51:02 ◼ ► Yeah. So obviously, you know, from phones for many years,
00:51:10 ◼ ► and, you know, the first M1 Mac we introduced in 2020 and every Mac we've introduced since,
00:51:16 ◼ ► I guess we missed the boat to name it an AI PC because we've been making great ones this
00:51:36 ◼ ► distinction because maybe there's some minor exceptions, but it seems like with the tentpole
00:51:42 ◼ ► features, it's feature parity on iOS and iPadOS with like, for example, when lock
00:51:56 ◼ ► things we've already been talking about moving icons around, tinting your icons, getting
00:52:01 ◼ ► dark mode icons, iPadOS 18, Control Center. Yeah. Yeah. New Photos app. Yeah. We were,
00:52:38 ◼ ► Was your strategy to hold back on a calculator for iPad so that when you did it, we would
00:52:50 ◼ ► the adulation we would have given up had we actually shipped it right out of the gate?
00:52:54 ◼ ► You know, when it first came out, would anyone have cheered? No, no, no. You hold it and
00:52:59 ◼ ► hold it and hold it like that dramatic pause. Yeah. I forget if I've previously asked you
00:53:12 ◼ ► been asked about it all the time. I mean, truly all the time. We did the big iPad introduction,
00:53:16 ◼ ► you know, in early May and I was still getting asked, do you guys do a calculator? You know,
00:53:36 ◼ ► we bring to a calculator on a bigger screen? Turned out to be true. You did bring something
00:53:43 ◼ ► with Math Notes. Yeah. I mean, it was a real answer at the time, which is we really,
00:53:57 ◼ ► calculator to the iPad. But we really felt that we wanted to do something more and different
00:54:17 ◼ ► for iPad. And so, and I couldn't be happier, I think, and the team with the reception that
00:54:31 ◼ ► happy with the work. I think one of the really cool things about doing it with, or with the
00:54:40 ◼ ► it came up in the iPad segment, it's all with the Pencil, but it's there on the phone. It's,
00:54:46 ◼ ► speaking about feature parity. It's on the Mac. And obviously on the Mac there's no Pencil.
00:54:52 ◼ ► You can type, you know, you can type on the Mac too. But I thought the really interesting
00:54:59 ◼ ► thing about emphasizing with the Pencil wasn't just finally there's an iPad calculator. It's
00:55:16 ◼ ► you type SQRT with parentheses. And that's not how math students think of it. Math students
00:55:32 ◼ ► how I learned it. We agree, John. Yep. Sort of along the same lines, but the Smart Script,
00:55:44 ◼ ► which is the name for the feature where you're writing in your handwriting,
00:55:50 ◼ ► and Smart Script will actually make your handwriting better. Yeah. I mean, step one was to learn
00:55:58 ◼ ► your handwriting so that we could both create writing that was consistent with your handwriting
00:56:07 ◼ ► and that we could even sort of build the neatest version of your handwriting that was consistent
00:56:12 ◼ ► with your handwriting style. You know, going back years now, we really had this goal to
00:56:25 ◼ ► there's also so much that's great that's a disjoint set about typing, which is typed text.
00:56:30 ◼ ► You know, if you spell check and correct like no problem, it can rewrap that. You want to
00:56:34 ◼ ► insert in the middle? No problem. It can reflow it. You want to copy and paste a section?
00:56:40 ◼ ► You can do that. But of course, when you want to do that with handwriting, those things
00:56:44 ◼ ► aren't really possible unless we can actually generate your handwriting. If you want to
00:56:49 ◼ ► copy paste and move all the words around and change it, or we want to spell check, we have
00:56:52 ◼ ► to rewrite that word for you. And so Smart Script lets us learn your handwriting. And then we're
00:57:03 ◼ ► with a pencil, I sometimes do. And always I realize like there's a word I want to insert
00:57:17 ◼ ► rewrap. So now we need to understand the flow of the text and rewrap it. So we're really
00:57:37 ◼ ► term Apple employees who worked on the Newton who were like, Yes, this is what we were going
00:57:44 ◼ ► for. Right. And including scribble to create the gap to say I don't want that word anymore.
00:57:55 ◼ ► I can remember with my Newton that I missed until now. And now I've got it. Yep, you got
00:58:13 ◼ ► sort of an iconic slide in a keynote was you addressing the sort of conjecture out there
00:58:23 ◼ ► that Apple was in the midst of merging macOS and iPadOS. And do you remember the slide
00:58:46 ◼ ► at the time, the conjecture that you were addressing was of the mind that the Mac, which
00:59:06 ◼ ► thinking, it was going to be the iPad superseding the Mac is what people thought and you're
00:59:21 ◼ ► iPads that came out a month ago, and it re-upped this sort of mentality of, oh, you're calling
00:59:30 ◼ ► these iPad Pros, but I can't do my professional work on them. You should let me boot my
00:59:38 ◼ ► iPad into macOS, or why don't you bring AppKit and macOS to iPads. And I kind of feel
00:59:57 ◼ ► the answer. I've pontificated along similar lines of that of my not necessarily frustration,
01:00:10 ◼ ► light bulb that went off to me is well, the way I deal with that is I just do what I feel
01:00:29 ◼ ► cliche at this point, it's a magical sheet of glass that becomes anything you want it to be.
01:00:33 ◼ ► And there are over a million apps designed for the iPad that allow you to do incredible things.
01:00:38 ◼ ► And I wouldn't tell somebody who's a Procreate artist that they're not doing pro things,
01:00:51 ◼ ► I don't know why people have wanted, for as long as these have existed, to merge
01:00:56 ◼ ► them together. That's not our desire. It's still a big fat no. Yeah, I love my
01:01:01 ◼ ► iPad, I probably spend at least as much maybe more time on my iPad doing a whole variety
01:01:08 ◼ ► of things than I do on my Mac. I also love my Mac. I do not want them to become the same
01:01:14 ◼ ► device. I think you've used the phrase in the past, maybe, that the, you know, Mac
01:01:21 ◼ ► is heavy so that iPad can be light or something like that. And I don't think Mac is heavy.
01:01:27 ◼ ► I think the Mac is awesome. But I think you're right. And years
01:01:34 ◼ ► ago a wise man talked about kind of cars and trucks. And I think I was honestly
01:01:41 ◼ ► bracing myself for another wave of this, not at this particular forum, but of this sentiment
01:01:48 ◼ ► when we were about to release the the iPad with an incredible M4 in it. Because there's
01:01:55 ◼ ► a mindset out there that says, okay, you took this incredibly powerful V8 engine that powers
01:02:17 ◼ ► in the truck. And it's awesome in the sports car. And when you use your iPad Pro right
01:02:24 ◼ ► now, it's the best iPad experience you can ever imagine. And that's worth a lot
01:02:31 ◼ ► to a lot of people. I mean, that is a great experience. And we want to keep making
01:02:50 ◼ ► Moving on to macOS Sequoia. I can't, every time I see Sequoia, I cannot help but note
01:02:56 ◼ ► that it has all five vowels. Wow. Wow. An achievement. I think I play too much Wordle.
01:03:10 ◼ ► I think it looks like well, that's cool. But big deal. I think it's like a game changer.
01:03:20 ◼ ► use this all the time with the new Continuity feature where you can get iPhone mirroring
01:03:23 ◼ ► on the Mac. Yeah. Is that it? It seems like that's harder than it looks. Great low-latency connectivity
01:03:54 ◼ ► is magic. But this is one that many of us, and I certainly, wanted for a long, long
01:04:00 ◼ ► time. I use Continuity Camera all the time. And then my phone is up here. And, you know,
01:04:18 ◼ ► to access your phone, it's on the other side of the house or in your bag. And it's
01:04:24 ◼ ► great. And we made the trackpad gesture handling just totally smooth. So manipulating the phone
01:04:41 ◼ ► very wide. And you guys often say Apple complies with the laws around the world as needed on
01:04:54 ◼ ► a per-country basis or, you know, whatever the jurisdiction. Yep. The world's gotten wider in the last
01:05:19 ◼ ► there. Now we know where the attendees from Belgium are sitting. Brussels. In general, as Apple
01:05:39 ◼ ► rules like in the app store to adapt to the changing world, to adapt to changing features.
01:05:45 ◼ ► In general, you guys seem to make significant efforts to keep the rules the same everywhere
01:06:02 ◼ ► Fair Trade Commission about something, something with reader apps. OK. And we'll just make
01:06:17 ◼ ► way where, OK, this is what we need to do in the EU. Here is an extensive set of frameworks,
01:06:43 ◼ ► we're going to keep this as the default everywhere, but we'll do that only when needed. Well,
01:06:48 ◼ ► John, this has been a first of all, it's been a massive effort. A massive effort by a lot
01:06:55 ◼ ► of people at Apple to work to comply to it. As you said, we have to write. We do business
01:07:08 ◼ ► a heavy lift. And especially to do that while trying to balance the safety, security, privacy
01:07:31 ◼ ► early March. So it's still early days, but we're working hard to try to do what we have
01:07:37 ◼ ► to do there. But it's a tough one. Yeah, I don't think, especially among the technically
01:07:44 ◼ ► savvy among us, many of us in this room have the idea that we've all used Macs since even the
01:07:52 ◼ ► pre-internet era. We use them now. And we feel like we know how to keep ourselves safe.
01:08:04 ◼ ► We know there is no battle that is more continuously fought and just a pitched battle than the
01:08:15 ◼ ► one we have around keeping our devices secure and creating an ecosystem where my mom, my
01:08:26 ◼ ► kids, I feel like go ahead and download whatever you want and enjoy your phone. That's not
01:08:43 ◼ ► our users and such an amazing thing, I think, for the overall community, for developers,
01:08:49 ◼ ► the opportunity it creates when everyone feels like, you know, I want to try that. Oh, I'll
01:08:53 ◼ ► just download it. It works. When we see something that has the threat of imperiling our users
01:09:06 ◼ ► that ecosystem, we're going to, where we can, protect our customers. And we've done lots
01:09:15 ◼ ► of very, very hard things in the EU to minimize the damage. But it is no panacea what's being
01:09:24 ◼ ► put on users there. And I think it's underappreciated, and I understand on a technical audience,
01:09:35 ◼ ► teams and those that work on protecting our users, this is a serious issue. And so we're
01:09:47 ◼ ► I think it's technically minded. Thank you. I think it's technically minded critics who
01:10:03 ◼ ► don't see things your way, who rolled their eyes and think it's spin and not true, that
01:10:11 ◼ ► you guys hear an overwhelming amount of feedback from users who say, oh, I like my iPhone and
01:10:35 ◼ ► overwhelming in that regard. And look, I think, look, it's totally valid. Like if
01:10:40 ◼ ► you want an Android phone, go buy it. Right? There is there is choice out there. It's fantastic.
01:10:49 ◼ ► So there are options. And by the way, in Europe, I mean, more people have Android
01:10:55 ◼ ► phones than iPhones. So some people, though, say they like the overall ecosystem that Apple
01:11:19 ◼ ► There's another big topic I want to talk about. But before we get to it, I think I would be
01:11:37 ◼ ► getting older and the way time compresses. But I still think of Swift as new. But I did
01:11:50 ◼ ► 10.0? Which cat? Oh, Jaguar. Cheetah. Cheetah. Very good, crack marketing team. When Mac OS
01:12:05 ◼ ► X 10.0 shipped, NeXT was only 13 years old. I know Objective-C technically came out in like 1984,
01:12:13 ◼ ► something like that. But it was sort of known for only 13 years. And here we are 10 years
01:12:31 ◼ ► industry and see different places where companies have tried to create new languages, very,
01:12:36 ◼ ► very few of them kind of ultimately go anywhere. And Swift was, it has been just a runaway
01:12:46 ◼ ► success for app development on our platform. Right? Just, a million apps on the App Store?
01:12:52 ◼ ► I'm not sure what the number is. It's an extraordinary number of apps that use Swift. There's a whole
01:13:20 ◼ ► When we introduced the language, though, our ambition, step one was clearly a great language
01:13:26 ◼ ► for app development on our platform. Right? That was mission number one. But the ambition
01:13:30 ◼ ► for the language was as a general purpose systems programming language. And what's happened
01:13:44 ◼ ► is Apple's adoption and some others as a systems programming language. I mean, now with embedded
01:13:57 ◼ ► servers. We're running Swift just all over the, yes, all over the system. And these are
01:14:06 ◼ ► places that historically our only alternative really was C or C++. And many of us
01:14:16 ◼ ► grew up programming those languages. They are by design effectively unsecurable. Right?
01:14:28 ◼ ► abstraction and are inherently not very safe. And in the world we live in now, having a
01:14:33 ◼ ► programming language be safe as well as ideally expressive, incredibly productive are big
01:14:46 ◼ ► just with Objective-C and C, but now also with C++ that we think there's a new era ahead
01:14:54 ◼ ► where the world needs a safe, expressive systems programming language. And we think Swift is
01:15:01 ◼ ► it. The slide I saw in the State of the Union, I wrote it down, Swift is the best choice
01:15:06 ◼ ► to succeed C++. That's the whole slide. And that is the truth. You look at the alternatives
01:15:14 ◼ ► out there, nothing, look, Swift is built by the team that built probably the most popular
01:15:21 ◼ ► C++ compiler in use today in Clang. And Swift is more native to C++ than any of the alternatives
01:15:30 ◼ ► out there. Its ability to call and interoperate with C++. Because there are
01:15:35 ◼ ► code bases out there that aren't going away overnight. And people are going to continue
01:15:39 ◼ ► to write a lot of C and C++, but people are also going to add a lot onto it where using
01:15:57 ◼ ► Swift as having an extraordinary future, of course inside of Apple, but across the industry.
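A tiny illustration of the safety point being made here (generic Swift, not tied to any Apple codebase): the language checks the things C and C++ leave to the programmer, and unsafety has to be requested explicitly and is visibly scoped.

```swift
let readings = [12, 47, 9]

// Out-of-bounds access traps deterministically instead of silently reading
// whatever happens to sit next to the array in memory, as an unchecked C index can.
// let bad = readings[5]   // runtime trap, not quiet memory corruption

// "No value" is an Optional the compiler forces you to handle,
// rather than a null pointer you might forget to check.
if let firstHigh = readings.first(where: { $0 > 40 }) {
    print("First high reading:", firstHigh)
}

// Pointer-level work is still available, but it is opt-in and clearly scoped.
readings.withUnsafeBufferPointer { buffer in
    print("Sum via raw buffer:", buffer.reduce(0, +))
}
```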
01:16:13 ◼ ► Obviously we want to talk about Apple Intelligence, but I feel if we're going to talk about Apple
01:16:59 ◼ ► Sorry for keeping you waiting. It's your colleagues' fault for having so much to talk about.
01:17:18 ◼ ► I think when I started working with the Siri team, the first instruction I gave them was
01:17:29 ◼ ► it's gotten better over the years, people just, we see in our data that people just use it
01:17:32 ◼ ► more. And the numbers, I'm not going to tell you what the numbers are, but they're huge.
01:17:46 ◼ ► because when people use a voice assistant and it works for them, then they use it more.
01:18:22 ◼ ► So this is the NLP brittleness problem, which is you don't want to have to teach your system
01:18:27 ◼ ► all of the ways that human beings might ask for something. So one of the things that we
01:18:31 ◼ ► showed at WWDC was an example of you correcting yourself and saying two different things and
01:18:37 ◼ ► the new models figuring out which thing you meant. And so the good news is that the technology
01:18:42 ◼ ► of language models is getting dramatically better and is less likely to make these fragile
01:18:51 ◼ ► It's good to hear that. Let me clarify something. And I don't think it was unclear in the keynote,
01:18:59 ◼ ► but I just think that it's so important that it deserves to be emphasized over and over
01:19:05 ◼ ► again is everything we've spoken about heretofore is not Apple Intelligence, right? Scene learning
01:19:33 ◼ ► are not, those aren't built on the foundation model on device or in the cloud. And when
01:19:39 ◼ ► we're able to bring those to the much broader class of devices because they don't require
01:20:00 ◼ ► also to make kind of clear for people what's for all the devices and what's for the class
01:20:08 ◼ ► So let's talk about the class of devices that can support Apple Intelligence. The cutoff
01:20:13 ◼ ► for iPhone is very recent. It is the iPhone 15 Pro with the A17 Pro system on a chip
01:20:34 ◼ ► So these models, when you run them at runtime it's called inference, and the inference of large
01:20:39 ◼ ► language models is incredibly computationally expensive. And so it's a combination of bandwidth
01:20:45 ◼ ► in the device. It's the size of the ANE. It's the oomph
01:20:51 ◼ ► in the device to actually do these models fast enough to be useful. You could in theory
01:20:56 ◼ ► run these models on a very old device, but it would be so slow it would not be useful.
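As a back-of-the-envelope illustration of why RAM and bandwidth matter here (the roughly 3-billion-parameter figure is the size Apple has described publicly for its on-device foundation model; everything else is an assumption for the sake of the arithmetic):

```swift
// Rough arithmetic only: weight memory scales with parameter count and precision.
let parameters = 3_000_000_000.0   // assumed on-device model size

func weightMemoryGB(bitsPerWeight: Double) -> Double {
    parameters * bitsPerWeight / 8 / 1_000_000_000
}

print(weightMemoryGB(bitsPerWeight: 16))  // ~6.0 GB: float16, impractical on a phone
print(weightMemoryGB(bitsPerWeight: 4))   // ~1.5 GB: aggressive quantization
// On top of the weights, inference needs working memory (KV cache, activations)
// plus enough memory bandwidth and Neural Engine throughput to hit usable token
// rates -- the combination of "size of the ANE" and "oomph" described above.
```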
01:21:17 ◼ ► No, you know, we've had so many, I mean, our first move in any way is to figure
01:21:28 ◼ ► that time and time again with us. What JG says is right here. We, this is what
01:21:36 ◼ ► it takes. This is the hardware it takes. I mean, it's a pretty extraordinary thing
01:21:54 ◼ ► chip with the neural engine. But is RAM a function of that too? Or is it really primarily
01:22:07 ◼ ► Yeah. And the A17 Pro is not the first A-series chip with a neural engine, but it's got a
01:22:15 ◼ ► One thing that it doesn't seem, I don't even know if they're capable of it, but like I,
01:22:29 ◼ ► Difficult. I do. Yeah. Obviously. Well, you know how many colleagues
01:22:34 ◼ ► do I have? I got one podcast on the side with one colleague. Let's see if I can keep
01:22:42 ◼ ► that going. No, but I find it, or I should say, maddening when I meet somebody who never
01:22:52 ◼ ► says I don't know. I think that the humility of somebody who hopefully you know most of
01:23:01 ◼ ► the questions I'm talking to you about, but if you don't know, I appreciate just saying
01:23:13 ◼ ► They also have the feature that they double down on what they said. I think we technically
01:23:39 ◼ ► don't have features that will write, you know, a college essay about something about the
01:23:45 ◼ ► world, right? We've tried to make this, we've tried to corral this technology to do what
01:23:57 ◼ ► with guardrails and a whole bunch of careful work, but basically you can summarize an email
01:24:10 ◼ ► Right. So that you're not going to recommend using glue to stick your cheese on a pizza.
01:24:28 ◼ ► How broadly, how are you thinking though about protecting against, let's say offensive, wrong
01:24:47 ◼ ► hallucination problem you've alluded to just saying random stuff that's not true. But the
01:25:10 ◼ ► for their purposes and their creative reasons. And so we don't want to be the arbiter. We
01:25:15 ◼ ► don't want to have a huge block list of words you're not allowed to say or ideas that somehow
01:25:20 ◼ ► don't work in the word processor. So we have to find that balance point where we're not
01:25:24 ◼ ► amplifying inappropriate uses. And we do have a huge number of what we call guardrails and
01:25:31 ◼ ► system techniques to make sure that certain topics are off limits. But it's a very fine
01:25:42 ◼ ► this is new for us, this is new for the whole industry, trying to figure out where that
01:25:47 ◼ ► And it seems you don't really have, like you said, like you're not writing, Apple Intelligence
01:25:54 ◼ ► is not generating a first draft of an essay. So you can't get it, you can't give it a prompt.
01:26:07 ◼ ► from my local Apple Store. I'm trying to pick something that is funny and wrong, but you
01:26:17 ◼ ► know, would be a headache for Joz that, oh, everybody's getting the new Apple Intelligence
01:26:27 ◼ ► Yeah, an example we use internally and in research all the time is like, tell me how to hotwire
01:26:30 ◼ ► a car. It's a factual question. Most large language models, go try it, will probably
01:26:37 ◼ ► refuse to answer that question because they've detected that it's illegal activity and not
01:26:44 ◼ ► So what you could do with Apple Intelligence is you can sit down and write your own story
01:26:51 ◼ ► that might be inappropriate or goofy or committing crimes in the story, select it and then say,
01:27:11 ◼ ► Right. It could be fictional, but that's not being generated. You're the person. You're
01:27:25 ◼ ► It is. And I mean, we literally had ethicists involved in our discussions on this. We do
01:27:33 ◼ ► have our roots as a personal computing company. This is a tool for you. So if you were writing
01:27:40 ◼ ► a story or maybe you are writing something about a very harmful phenomenon for the purposes
01:27:51 ◼ ► of illustrating why it's bad or to protest against it. And if you're not careful, a model
01:28:10 ◼ ► harmful topics. Well, now that's not empowering you as the user of the system. We aren't going
01:28:17 ◼ ► to introduce the harm. We're not going to amplify the harm. But we do think it's a tool
01:28:29 ◼ ► Yeah. And we published a blog post yesterday which goes into some depth about how we built
01:28:37 ◼ ► these models and how they work and so on and so forth, some technical behind the scenes
01:28:40 ◼ ► stuff. And in there we actually listed some of our values about how we approach our Apple
01:29:20 ◼ ► But I'll even admit knowing Apple's sensitivity towards the brand and the brand promise to
01:29:36 ◼ ► that you don't want to limit, but you're not going to give them the story, but you'll let
01:29:59 ◼ ► or is the part of it where you're generating the most. Right. And there are three styles.
01:30:07 ◼ ► I think you call them Animation, Illustration, and Sketch. But not among
01:30:34 ◼ ► And there's no reason to do that in the first version of the product. And there's lots of
01:30:47 ◼ ► to it. Right. It is. This is kind of gross and uncomfortable and worrisome, really. Right.
01:30:54 ◼ ► Yeah, potentially. Yeah. Yeah. I mean, the opportunities for abuse are really high. And
01:31:09 ◼ ► We're not trying to create an alternate reality at all. It's very clear when you see our output,
01:31:21 ◼ ► alligator really on a surfboard? Is that alligator on a surfboard? We want to create no confusion
01:31:29 ◼ ► you can remove unwanted either people or objects in the background, that I'm guessing is generative
01:31:42 ◼ ► in what it is. So obviously, the capability is there in the technology you have. But
01:31:48 ◼ ► there you're not generating reality. You're filling in the reality that was there behind
01:32:09 ◼ ► is also a point of great debate. We make sure to mark up the metadata of the generated image
01:32:13 ◼ ► to indicate that it's been altered in this way. And like you say, we're not. Yeah, it's
01:32:18 ◼ ► good to. But yeah, we're not trying to generate photorealistic images of people
01:32:27 ◼ ► or places or anything like that. Or when you point your iPhone at the moon, for example,
01:32:37 ◼ ► or a good example, another random example, just off the top of my head. If I can
01:32:47 ◼ ► pick one word to summarize the entirety of yesterday's keynote, I would pick this word,
01:32:55 ◼ ► trust. And let me give you an example. Hypothetically speaking, let's say there's another company
01:33:05 ◼ ► that makes desktop operating systems and they see AI as, you know, the bright future that
01:33:24 ◼ ► power AI PCs and would introduce a feature that could recall everything that was on your
01:33:33 ◼ ► screen every five seconds. And then it turns out that the first version they shipped is
01:33:42 ◼ ► storing all of the text that was on your screen all the time in a plain
01:33:49 ◼ ► text database on your startup drive. Hypothetically speaking, I think that's the perfect reaction.
01:34:02 ◼ ► But is that frustrating to you that it sort of plays into the consumer's worst fear
01:34:11 ◼ ► about this stuff? New stuff is scary. Fingerprint sensor on my phone. Whoa, whoa, whoa. Apple's
01:34:17 ◼ ► going to have my fingerprint. That seems scary. Oh, no. You know, and they're just on device
01:34:25 ◼ ► you how the Secure Enclave is very clear and would never go to the cloud. And if you get
01:34:28 ◼ ► another device, you've got to use the fingerprint sensor again because we never had your fingerprint.
01:34:37 ◼ ► But that's I'm exaggerating. But I think it's not really normal. Not really. That was that
01:34:50 ◼ ► But like with the Windows Recall feature, it literally plays into the worst fears people
01:35:04 ◼ ► out features that you're trying to build trust in that overall are outside? Are we frustrated
01:35:28 ◼ ► Let's right before you came out, we were talking about Swift. And to go back to that, a big
01:35:41 ◼ ► part of the news didn't really make the keynote because State of the Union is where developer-
01:35:46 ◼ ► oriented news goes. That's, you know, I always call it the developer keynote. And Xcode has
01:36:02 ◼ ► is completion, code completion. And the next one is Swift Assist. Am I getting the name
01:36:20 ◼ ► though. And I think trust. And I think there are two trust factors. The first one is can a
01:36:24 ◼ ► developer trust the code that's being generated? And then the second one is can the developer
01:36:36 ◼ ► true. Right. So I mean, it turns out these large language models are very good at generating
01:36:42 ◼ ► code if you ground them in code that's either been run and tested or code that came, you
01:36:49 ◼ ► know, was legitimate code that was being used. And there's a technique for doing this called
01:36:53 ◼ ► retrieval augmented generation. And so that second feature is anchored in, you know, ground
01:37:00 ◼ ► truth about what is good working code. And yes, certainly we would not use our developers'
01:37:07 ◼ ► code to train our models. In fact, we could say something even stronger, which is we don't
01:37:11 ◼ ► use any of our users' data to train Apple Foundation models. So what can you say about the
01:37:25 ◼ ► training data that you did use to train the code generation features? Where did it come
01:37:30 ◼ ► from? If not from users? Well, the foundation models themselves are trained on a wide variety
01:37:41 ◼ ► of data, some of it public web data and some of it licensed data. And so for the public
01:37:47 ◼ ► web data, these large language models learn to generalize by seeing literally, you know,
01:37:55 ◼ ► billions and trillions of tokens. When we do that, with Applebot, we let publishers
01:38:07 ◼ ► And then for the Swift code specifically, I actually don't know the answer to the question.
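To make the retrieval-augmented generation idea concrete, here is a minimal sketch of the pattern being described; every type and function in it is hypothetical, not an Apple API: retrieve the most relevant known-good snippets, then ground the prompt in them.

```swift
// Hypothetical sketch of retrieval-augmented generation (RAG) for code.
struct CodeSnippet {
    let text: String
    let embedding: [Float]   // precomputed vector for the snippet
}

func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let magA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (magA * magB)
}

// Pull the top-k snippets from a corpus of vetted, working code.
func retrieve(query: [Float], from corpus: [CodeSnippet], topK: Int) -> [CodeSnippet] {
    Array(
        corpus
            .sorted { cosineSimilarity(query, $0.embedding) > cosineSimilarity(query, $1.embedding) }
            .prefix(topK)
    )
}

// The retrieved snippets are prepended to the prompt, so the model completes
// against real grounded code instead of free-associating.
func buildPrompt(task: String, grounding: [CodeSnippet]) -> String {
    let context = grounding.map(\.text).joined(separator: "\n---\n")
    return "Reference code:\n\(context)\n\nTask:\n\(task)"
}
```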
01:38:19 ◼ ► I do know that it uses RAG. So it's grounded in code that's probably been licensed or otherwise
01:38:24 ◼ ► acquired. Yeah. And then we do a lot of synthetic data generation as well. When it comes to
01:38:28 ◼ ► coding data, we can take things like our own documentation prompt them up. All right. That
01:38:44 ◼ ► what are the kinds of things people would want to do with this API, write a program that
01:38:53 ◼ ► it, et cetera. And so through these kinds of advanced pipelines, you can generate
01:39:08 ◼ ► going to become more and more of the answer for making these models smarter, especially
01:39:13 ◼ ► if you can build a good evaluation feedback loop to establish like this. This generated
01:39:25 ◼ ► engineers on your teams using these features only in a pilot sense so far? Because these
01:39:43 ◼ ► level we wanted very recently. And so I can't claim that, you know, this release was
01:39:48 ◼ ► X percent written by our models, but people who use them love them. I mean, the acceleration
01:40:07 ◼ ► And so I expect we'll be using it very regularly internally. Well, they will be. So I don't
01:40:12 ◼ ► mean to say present tense in terms of I know that this is new and maybe, you know, very
01:40:31 ◼ ► like a lot of internal demand, a lot of interest in it. And absolutely. And in terms of that
01:40:38 ◼ ► fear or just back-of-your-mind suspicion that, oh, I don't want to use tools that have
01:40:50 ◼ ► on your team will be able to use it with Apple Intelligence, even when it goes to the private
01:41:04 ◼ ► well, and it's, I mean, Private Cloud Compute is a topic I'd love to dig into. The predictive
01:41:14 ◼ ► it's entirely on device. Right. So we really get. Yeah. Which I think is relatively
01:41:20 ◼ ► novel. I think a lot of the tools out there that do that now involve sending your
01:41:24 ◼ ► code off to somebody's servers, and not Private Cloud Compute servers, for that matter. But
01:41:30 ◼ ► this is all running on device. It's incredibly fast and it's going to work when you're
01:41:34 ◼ ► on the plane or offline, et cetera. So I think that's a fantastic component. But, yeah, I
01:41:50 ◼ ► times what you want to use them for are things that involve personal or confidential information.
01:42:16 ◼ ► to doing personal intelligence that involves data you actually care about is ironclad privacy.
01:42:24 ◼ ► And there. Yes. And on device, of course, a fantastic answer where that works. And we've
01:42:36 ◼ ► is specializing these models for certain high value tasks by building adapters that can
01:42:47 ◼ ► compute to throw at the problem in an instance when it comes to these big things. But we
01:42:57 ◼ ► your data in any fashion. And so we absolutely moved mountains across like every component
01:43:05 ◼ ► of the stack from having our hardware team build us custom servers to us building a custom
01:43:11 ◼ ► OS and inventing a lot of technology for trusted attestation and server management, a bunch
01:43:19 ◼ ► of code written in Swift to do that, to create, I think, for the first time in the industry,
01:43:35 ◼ ► can have any access to the data that's being used for inference. So now you can use this
01:43:40 ◼ ► thing and really feel like, oh, I'm not exposing my data to others. It's existing
01:43:46 ◼ ► in that same privacy bubble that has protected my data on my phone. And that's been extended
01:43:52 ◼ ► to the cloud. And we think that's an extraordinary engineering achievement. And I think what
01:43:56 ◼ ► the future of AI should be. Was this table stakes for Apple that, OK, some of these tasks
01:44:20 ◼ ► these features, therefore, we need to do what Craig just said and build out Apple's own
01:44:40 ◼ ► the Apple silicon in the cloud servers saying? No, we didn't specify. I said we didn't specify.
01:44:48 ◼ ► Oh, we didn't specify. That's the way we choose not to answer. John, when you say how many
01:45:10 ◼ ► All right. Let me ask about this with Private Cloud Compute. A lot of people are very concerned
01:45:28 ◼ ► large language models and generative AI in the servers in the cloud. The environmental impact.
01:45:35 ◼ ► Very, very serious concerns. What is the story with Apple's Private Cloud Compute and its
01:45:43 ◼ ► environmental impact? I mean, there are two reasons why we wanted to build Private Cloud
01:45:48 ◼ ► Compute out of Apple Silicon. One is the privacy architecture it gives us, you know, building
01:45:53 ◼ ► on the Secure Enclave, trusted boot, every other thing about the systems. But the other
01:45:59 ◼ ► is the incredible energy efficiency of Apple Silicon, which, you know, is forged in our
01:46:04 ◼ ► history of building mobile silicon. This is not the history of your typical AI inference
01:46:18 ◼ ► Private Cloud Compute. And this is going to allow us to continue to have our data centers
01:46:22 ◼ ► be 100 percent carbon neutral, with renewable energy. More than that. It's more than 100.
01:46:33 ◼ ► not just carbon neutral. It's entirely 100 percent renewable energy. That's right. Yeah.
01:46:40 ◼ ► So to clarify, this is not with carbon offsets. No, this is 100 percent renewable energy.
01:46:51 ◼ ► on your device, which is inherently very efficient. Right. And not even going to the data centers.
01:46:57 ◼ ► But as you know, this has been a big concern for generative AI, is the amount of energy
01:47:01 ◼ ► they're using off the grid. And we have an incredible Apple kind of story on that.
01:47:09 ◼ ► Is it possible that achieving that took a little extra time? Hence the patience. Patience.
01:47:24 ◼ ► know, a couple of years ago, the hot new thing was cryptocurrency. And I don't think people
01:47:30 ◼ ► were waiting for Apple to mint its own cryptocurrency. I think most people expected Apple to let
01:47:44 ◼ ► a lot of there there. And I think there was a consensus of, oh, that 2030 "whole company
01:47:51 ◼ ► is going to be carbon neutral" goal, wow, that goes out the door because now there's this monkey
01:47:56 ◼ ► wrench thrown in. But no, no, this does not disrupt. This doesn't move it back to 2031.
01:48:03 ◼ ► It's day one. Renewable energy. Yeah. I think that that goal is very much alive and well.
01:48:18 ◼ ► Can you speak to me a little bit about this? This idea of verifiable images on the cloud
01:48:43 ◼ ► your iPhone, we, of course, every time we make a change to the behavior of iPhone,
01:48:51 ◼ ► it's because we've shipped a software update with an image of the software, and researchers
01:48:56 ◼ ► are able to debug that system and observe its behavior and check its code and so forth.
01:49:04 ◼ ► And traditionally in the cloud, that's totally not the case. Right. If someone's operating
01:49:10 ◼ ► a server, you're sending your request to their server. You have no idea what's running
01:49:15 ◼ ► on that server. You have no way to inspect what's on that server. Even if at one moment
01:49:20 ◼ ► in time it got audited, you don't know if a minute later it changed its code and a minute
01:49:24 ◼ ► later changed it back. I mean, you just can't have any confidence in what's happening there.
01:49:35 ◼ ► that same property. And we do this by saying that the whole trusted boot infrastructure
01:49:45 ◼ ► that the SEP, the Secure Enclave Processor on the server, is measuring is able to identify
01:49:59 ◼ ► code can be added. No new code can be generated at runtime. Like the OS literally won't map
01:50:04 ◼ ► in any executable code pages that aren't part of the encrypted image. And when you make
01:50:09 ◼ ► a request to that server, there's a handshake where the SEP attests and says this was a
01:50:22 ◼ ► talking to if you were to send me the request and here are the keys to have that conversation
01:50:27 ◼ ► with this thing. Well, separately, your phone is going to look in an attestation log, an
01:50:34 ◼ ► append-only log, the kind of technology you see in crypto, where Apple has to post that
01:50:42 ◼ ► these are known, posted, public images of this, and your phone says, is the image I'm talking
01:50:55 ◼ ► attacker were to try to sign an image, put it on that server and try to intercept a request,
01:51:08 ◼ ► to publicly post. We are then also equipping researchers with VMs so they can run those
01:51:22 ◼ ► step up from what they're offered for iPhone. And do the same thing where they can take
01:51:27 ◼ ► the image on their local VM that they're testing and check that its signature matches. Exactly.
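A loose sketch of the client-side check being described here: the device only releases a request to a node whose attested software measurement appears in a public, append-only transparency log that researchers can also audit. All of the types below are invented for illustration and are not Apple's actual Private Cloud Compute interfaces.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: the device only talks to a node whose attested software
// measurement appears in a public, append-only transparency log.

/// What the server's boot chain attests to: a hash of the exact image it booted.
struct Attestation {
    let imageMeasurement: String   // hex-encoded SHA-256 of the booted release image
    let nodePublicKey: String      // key the device would encrypt its request to
}

/// Stand-in for the append-only log of publicly posted release measurements.
/// Entries can be added but never removed, so researchers can audit every release.
struct TransparencyLog {
    private(set) var entries: [String] = []
    mutating func publish(_ measurement: String) { entries.append(measurement) }
    func contains(_ measurement: String) -> Bool { entries.contains(measurement) }
}

func measurement(of image: Data) -> String {
    SHA256.hash(data: image).map { String(format: "%02x", $0) }.joined()
}

/// The device releases a request only if the attested image is a known, published release.
func shouldSendRequest(to attestation: Attestation, checkedAgainst log: TransparencyLog) -> Bool {
    log.contains(attestation.imageMeasurement)
}

// The operator posts the measurement of every release image to the log...
var log = TransparencyLog()
let releaseImage = Data("pcc-release-image-1.0".utf8)
log.publish(measurement(of: releaseImage))

// ...and the device checks the attested measurement before sending anything.
let honest = Attestation(imageMeasurement: measurement(of: releaseImage), nodePublicKey: "key-A")
let unknown = Attestation(imageMeasurement: measurement(of: Data("patched-image".utf8)), nodePublicKey: "key-B")

print(shouldSendRequest(to: honest, checkedAgainst: log))   // true:  request is released
print(shouldSendRequest(to: unknown, checkedAgainst: log))  // false: unrecognized image, refuse
```

The same published measurements are what a researcher running the image in a local VM would compare against, which is the verification step Gruber and Federighi are confirming in the exchange above.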
01:51:33 ◼ ► The one that you've posted to this write-only chain, so that they can verify that you didn't
01:51:38 ◼ ► ship, you, Apple, didn't ship the researchers a different image. Exactly. Exactly. And no
01:51:50 ◼ ► one's ever done anything like that. Not that I know of. Yeah. It's not a subtle difference,
01:51:57 ◼ ► but a complete inversion. The difference with Apple Intelligence in the starting point is that
01:52:02 ◼ ► because you were running on the customers' devices, you've got all of this personal context
01:52:11 ◼ ► to start with, as opposed to starting with whatever was just typed in a box. How different
01:52:33 ◼ ► models? The core models are so-called foundation models. What's new and novel is that they
01:52:40 ◼ ► can operate, the ones on device can operate over their semantic understanding of the user's
01:52:46 ◼ ► data. So we have a semantic index of your messages, things on your, Craig was mentioning
01:52:53 ◼ ► earlier about relationships in your photos and so on and so forth. And so the model can
01:52:57 ◼ ► reason over that private data. And what that gives you essentially is a personalized feature
01:53:04 ◼ ► that is personal to you and runs on your device. Whereas these other chatbot-like products,
01:53:10 ◼ ► they have tremendous world knowledge, they've ingested all of Wikipedia or whatever, but
01:53:14 ◼ ► they don't know anything about you. And so they're limited to talking in a completely
01:53:17 ◼ ► unpersonalized way. And we think that that's just interesting but less useful than getting
01:53:38 ◼ ► Right, right, right, because it's not going to say I don't know. We can try it right now
01:53:44 ◼ ► But I mean it's even more foundational than that. If you ask a chatbot what time is it?
01:53:48 ◼ ► It doesn't know. There's no concept that there's an API call that you should be making.
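The semantic index Giannandrea describes can be pictured as embedding-based retrieval over personal data: items are stored with vectors and matched by meaning rather than by keyword, then handed to the on-device model as context. A toy sketch, with made-up items and pretend embeddings; a real index would get its vectors from an embedding model and have far more dimensions.

```swift
import Foundation

// Toy sketch of a semantic index over personal data, assuming embeddings.

struct IndexedItem {
    let text: String
    let embedding: [Double]
}

func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map { $0 * $1 }.reduce(0, +)
    let magnitudeA = sqrt(a.map { $0 * $0 }.reduce(0, +))
    let magnitudeB = sqrt(b.map { $0 * $0 }.reduce(0, +))
    return dot / (magnitudeA * magnitudeB)
}

/// Return the items most relevant to the query, to hand to an on-device model as context.
func retrieve(query: [Double], from index: [IndexedItem], limit: Int = 2) -> [IndexedItem] {
    let ranked = index.sorted {
        cosineSimilarity($0.embedding, query) > cosineSimilarity($1.embedding, query)
    }
    return Array(ranked.prefix(limit))
}

// Pretend embeddings for a handful of personal items.
let personalIndex = [
    IndexedItem(text: "Mom's flight lands at 6:10 PM", embedding: [0.9, 0.1, 0.0]),
    IndexedItem(text: "Dinner reservation Friday at 7:30", embedding: [0.2, 0.8, 0.1]),
    IndexedItem(text: "Photos from the Tahoe trip", embedding: [0.1, 0.2, 0.9])
]

// "When do I need to leave to pick Mom up?" lands closest in meaning to the flight item.
let questionEmbedding = [0.85, 0.15, 0.05]
for item in retrieve(query: questionEmbedding, from: personalIndex) {
    print(item.text)
}
```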
01:53:57 ◼ ► time and what's interesting about the semantic index is, for many, many years a foundational
01:54:02 ◼ ► part, even going back to Mac OS X, has been Spotlight and apps, and with iPhone, apps contributing
01:54:10 ◼ ► information into Spotlight so that the user could search them or so that the app could
01:54:20 ◼ ► Well that same interface, the same way that apps are donating that context to Spotlight,
01:54:29 ◼ ► not just by keyword and so forth but semantically, and that becomes the source for Apple Intelligence
01:54:40 ◼ ► So it's not that there's a whole new, there's no new information being exposed to the system.
01:54:44 ◼ ► It's the same information that's been exposed to Spotlight all along and there's no new
01:54:49 ◼ ► work for developers. Another good reason for developers, though, to provide information into
01:54:53 ◼ ► Spotlight, because now it's making Apple Intelligence even more able to help the user and to direct
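The donation path Federighi is referring to is the long-standing Core Spotlight indexing API. A minimal example of an app contributing an item follows; the identifiers and strings are made-up example values, and this shows only the existing donation call, not any new Apple Intelligence hooks.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// An app donating one piece of its content to the system index.
let attributes = CSSearchableItemAttributeSet(contentType: UTType.text)
attributes.title = "Dinner with Alex"
attributes.contentDescription = "Friday 7:30 PM at the new place downtown"

let item = CSSearchableItem(
    uniqueIdentifier: "reservation-2024-06-14",
    domainIdentifier: "com.example.reservations",
    attributeSet: attributes
)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    } else {
        print("Donated to Spotlight")
    }
}
```

The point being made in the conversation is that this is the same information apps have always exposed to Spotlight; the system-side indexing is what has become semantic.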
01:55:12 ◼ ► And the difference in context, like I said, it's very useful over here with world knowledge
01:55:25 ◼ ► But in your career really, do you find yourself thinking about how humans deal with context
01:55:44 ◼ ► Does your mind as someone whose career has been like this, do you constantly think about
01:56:01 ◼ ► So I mean the technical answer to this question is the breakthrough with transformer models
01:56:07 ◼ ► and LLMs that has caused all of the excitement of the last few years is around an idea called
01:56:14 ◼ ► attention, which is that not all data is equally relevant when you're considering a time sequence.
01:56:20 ◼ ► So this is why we use examples like, when do I need to leave to go pick Mom up at the airport,
01:56:24 ◼ ► because the model really truly has understood the notion that you have to do one thing before
01:56:39 ◼ ► But that notion that there is the idea of picking somebody up from the airport is actually
01:56:48 ◼ ► And that has really been the breakthrough for LLMs and why this technology is so important.
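For reference, the "attention" Giannandrea is describing is the scaled dot-product attention at the heart of transformer models (Vaswani et al., 2017): each token's output is a weighted mix of every token's value, with the weights expressing how relevant each one is.

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V$$

Here Q, K, and V are the query, key, and value projections of the input sequence and d_k is the key dimension; the softmax produces the relevance weights, which is what lets the model pick out "picking somebody up from the airport" as the part of the context that actually matters.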
01:56:55 ◼ ► That brings me to the last topic from the keynote and that is the partnership with OpenAI
01:58:33 ◼ ► I mean, it just became a thing that you would type your search term into the URL bar and
01:58:40 ◼ ► Well similarly, if you actually do want to write an essay about blue whales and you want
01:58:46 ◼ ► And we want to make it very super clear to you that you're using ChatGPT, that the thing
01:59:31 ◼ ► And you know, if someone for some reason just never wants it to be suggested to them at
01:59:40 ◼ ► And so if this isn't somehow, you know, insidiously integrated and pre-enabled or anything like
01:59:54 ◼ ► And your IP address is obscured and your requests are not logged, you know, so it was a lot
02:00:00 ◼ ► There's a lot we did to make sure that if you were going to use it, that it was as good
02:00:41 ◼ ► I've seen this floating about on social media, the same sentiment, but something to the effect,
02:01:06 ◼ ► I want AI to wash my dishes and do my laundry so that I can make music and I can make paintings
02:01:16 ◼ ► And I think it speaks to Apple as a tool maker for people who want to do things like that.
02:01:43 ◼ ► How many people now, I mean, just to go to technology from the 80s, could pick up a synthesizer and
02:01:51 ◼ ► now make music that maybe previously involved an orchestra, but now they're composing something
02:02:02 ◼ ► I think these tools are going to enable people to, a single individual to express their creativity
02:02:44 ◼ ► I would like to thank my friends at Sandwich who shot the video, Spatial Jen who shot this