469: Bring in the Boffins


00:00:00   [Music]

00:00:09   From Relay FM, this is Upgrade, episode 469. Today's show is brought to you by Factor, Vitally, and Ladder.

00:00:18   My name is Myke Hurley and I'm joined by Jason Snell. Hello, Jason Snell.

00:00:22   Hello, Myke Hurley. How are you?

00:00:25   Oh, you know what, Jason? I'm okay right now.

00:00:28   You're gonna get angry later.

00:00:29   I'm gonna be very upset.

00:00:31   This is a summer of un-fun happening later today. Before we get into it, I'm gonna share with you a little brief glimmer of the summer of fun.

00:00:39   Okay? Glimmer of the summer of fun. We went to, we visited friends of ours in the Midwest, in the American Midwest.

00:00:49   And we went to a cottage that's been in their family for decades, on a lake. On a bunch of lakes.

00:01:00   Fantastic.

00:01:01   And there's like a boat, and there's a boathouse, and there's like some chairs on a deck, and it was warm.

00:01:07   And at one point there was like a brief thunderstorm that blew through and then went back out, and then it wasn't rainy anymore.

00:01:14   It was a quintessential summer Midwestern experience. I loved it. Now I'm gonna give away where it was by saying we had some, I had some fried cheese curds.

00:01:23   I had some local, a local beer. A lot of cheese in this place. It's America's Dairyland, you could say.

00:01:31   Is there cheese in the beer?

00:01:32   It's Wisconsin. It's Wisconsin.

00:01:34   Oh, okay.

00:01:35   Beer you can only buy in, no, but, no, there's no cheese in the beer, but there's beer you can only buy in Wisconsin. It's sort of like an attraction.

00:01:42   Anyway, it was, I've heard them talk about, we've known these guys for like 15 years, and I've heard them talk about this family place, and it always sounded amazing.

00:01:50   And I had never gotten to go there, and I got to go there this weekend. It was amazing. It was. It lived up to the hype.

00:01:55   So that was my summer of fun. And now I'm completely exhausted because we got in at like 1:30 last night, but it's okay. Upgrade comes first.

00:02:05   And I'm gonna chew you off a 20 minute run anyway, so it's gonna be okay.

00:02:08   I was actually not too far. There's a cheesehead in our Discord now. I was not too far from Green Bay. I did not go to Green Bay. We didn't go quite that far.

00:02:15   I was relatively close to Green Bay, the home of the famous Green Bay Packers, but I didn't go that far.

00:02:21   Speaking of American football, I have a Snell Talk question for you, Jason. It comes from Daniel.

00:02:26   I'm ready for some football. Okay.

00:02:27   And Daniel says, "If Tim took you up on your ticket offer, what would have been the one question you would use this opportunity to ask him?"

00:02:35   Okay, now the question says "what would have been," as if this has already failed. Keep hope alive, Myke. Keep hope alive. Tim?

00:02:41   Sorry, I thought maybe the game would pass. I didn't know when it was.

00:02:43   Just to re- oh no, the game- no, the game- Myke, the game is not until September, okay?

00:02:48   Why would I know that? Did you say that?

00:02:50   I did say it when it was last time. It's September 9th.

00:02:53   There were a lot of things said in the Snell Talk, or the follow-out, for the last episode.

00:02:58   September 9th. Just to recap for everybody, I have four season tickets to Cal. We have three that are reliably spoken for.

00:03:07   We use the fourth for guests and things like that. I offered it to Tim Cook last week. Haven't heard from Tim yet, but Tim, still open.

00:03:15   Haven't put that one up on StubHub or found a friend to go with me yet, because Tim, it's there for you.

00:03:22   It's there for you.

00:03:23   And Daniel's question: what would be the one question you would use this opportunity to ask him? Okay, Daniel, American football games are extremely long.

00:03:31   If Tim took me up on my ticket offer, we would be chatting throughout the entire game.

00:03:38   Right, but here's the thing-

00:03:40   We'd have a long, free-flowing conversation.

00:03:42   What we're not asking here is like, you're only going to get to say one thing to him.

00:03:45   Because you're going to be like chit-chatting, right? Just nice- but what is the burning question?

00:03:50   That's what's going on here. You're going to be with Tim the whole time. You're going to be like small talk, get to know you.

00:03:58   You know, "Oh, what about that pass? What about the interception?" and all that kind of stuff. Defense, defense, all these kinds of things.

00:04:04   But there's going to be- you're going to get the opportunity when the walls come down a little bit. You can fire one at him. What's it going to be?

00:04:13   So here's the thing. The realism of this, of which there's very little, precious little, but there is a little bit, is that Tim's not going to say stuff to anybody, and certainly not to somebody who writes about this stuff,

00:04:26   that's going to give anything away. So what I would probably be most interested in, rather than saying, "Hey, how about that car project?"

00:04:34   Is I would ask him about some personal stuff. I'd ask him like how he uses his Apple Watch. Like what's his routine? What does he use at his desk?

00:04:50   Is he using an iPad more or is he using a Mac more now? How has the Apple Watch changed how he views fitness? Does he-

00:05:02   I want to know what's on his home screen.

00:05:04   Has he changed his approach to fitness because of the Apple Watch? Has he changed the Apple Watch's approach because he can do that?

00:05:12   What's on his home screen? Yeah, that's a very interesting one. Like what are your go-to apps? Although even then-

00:05:19   The truth is that somebody like Tim Cook, even a version of Tim Cook who would show up at the five yard line on a football game and sit in a row with the regular people,

00:05:30   even that version of Tim Cook would be incredibly circumspect about what Apple is doing. But yeah, that would be the stuff that I think might be interesting.

00:05:38   Is the human stuff of like, I always kind of want to know like what makes him tick and what's his day really like and how does he actually use the tech?

00:05:47   And how does that inform how he charts the course? Because, you know, he's not going to- I could say like, "So you were totally going to kill the Mac and just build up the iPad until you did that thing."

00:05:58   I'm not going to do that because he's not going to talk about it. This is me taking Tim Cook to a football game, not me giving a truth serum to Tim Cook while I have him in my basement evidence dungeon or something.

00:06:12   That's not what this is.

00:06:14   I mean, it's the same as like, you know, every time anybody interviews an executive and they're like, "Why didn't you ask?" Because they're not going to answer it.

00:06:21   There's no point in asking. It's wasting everyone's time. It creates an unwelcoming atmosphere for everyone. It's just not good.

00:06:29   I would love to- if I had Tim Cook in the cone of silence, you know, of course I would say, "China is really a difficult problem, isn't it?" Right?

00:06:40   Like, what are your thoughts about how you approach this? But he can't say that stuff publicly because China will hear them. So he's not going to say that.

00:06:47   So it's very difficult. But I don't know. Yes, my opportunity, if I was going to ask him about Apple-specific things, would probably be more about, again, just like one human being talking to another.

00:06:58   And we both have spent a lot of time thinking about the technology that this company makes, about how it impacts him personally and how that's informed his job and stuff like that.

00:07:07   Look, I just spent four hours each way in a car with a friend of mine, and we had wide-ranging conversations about what my job is and what his job is and observations about the world.

00:07:19   And that's sort of the stuff that happens at the football game, too, is that you chat about light stuff and personal stuff and observational stuff.

00:07:28   And that's just part of the tapestry of it. So in this scenario, yes, I would just enjoy chatting with Tim a little bit and getting a better sense of who he is and what makes him legitimately excited.

00:07:40   And it would be personal stuff like that.

00:07:43   I look forward to tracing the path of this until the game in September.

00:07:47   Oh, yes, the path, yes. As we get closer and Tim's people call my people, which again is me, we'll see.

00:07:56   They could call me and I could act as your people.

00:07:59   They could. Actually, if you work with Tim and work with the executives at Apple, I just want to point out Myke has an anonymous informant network.

00:08:08   It could be activated for this purpose. You could contact Myke and send out feelers about the possibility of a meetup.

00:08:18   We could do some security groundwork stuff.

00:08:21   Exactly. We could work out extraction plans and those kinds of things if needed. It's all available.

00:08:28   You can send that information in the same way that you can send in a Snell Talk question for us to open the show with like Daniel did by going to upgradefeedback.com.

00:08:38   We have some follow-up. Jason, would you like to start?

00:08:41   Yes, this is Mike's way of saying, "What is this thing that you pasted into our document?"

00:08:48   No, I mean, I've read it, but I'm also kind of like...

00:08:50   So we talked about the photo picker in iOS 17 and changes to the photo picker and changes to the way apps use the Photos app.

00:08:58   I heard from several people, including our friend Casey Liss, who did an app that uses Photos permissions.

00:09:05   This is, I guess, an anonymous informant sort of? I don't know.

00:09:10   And somebody from the Photos team, who said that actually a bunch of the things we talked about, like having the photo picker that the app doesn't see...

00:09:22   That's been around a long time. The difference is... There's like a couple big things that have happened.

00:09:28   One is better privacy information about the system picker.

00:09:31   There's a banner in the picker itself and a new UI in the settings that educate users about what private access for photos is.

00:09:38   So Apple's trying to do more disclosure. The idea here is there's a push-pull.

00:09:42   It's like users should know this and developers should know that the users are going to know this

00:09:46   and that maybe developers who want more photo library access need to back off unless they really need it.

00:09:52   The app I always think about is Slack.

00:09:55   Slack has its custom UI where it shows you the last few photos on your camera roll.

00:10:01   That's totally not necessary.

00:10:03   Now I'm in a weird limbo state on my phone with Slack where I try to add a photo and it says,

00:10:09   "Do you want to add photos to the list of photos I can see or pick photos?"

00:10:14   It's like, "Just get out of my way. You should just get out of my way."

00:10:18   But anyway, the functionality of walling off your photo library from apps has been around a while.

00:10:24   What's happening is there's a lot more disclosure and a lot of stuff that's pushing users and educating users about how they handle the permissions.

00:10:31   There are also new APIs that allow apps to embed a compact version of the system picker,

00:10:37   which means that apps now never need to ask for library access theoretically

00:10:42   because one of the reasons you do it is that you had a smaller photo picker that you wanted to use

00:10:47   and so you'd build a custom one that required complete photo access for your app to build it

00:10:52   and now you can just opt in to that.

00:10:54   I'm sure there are apps that are like, "No, no, no. We really need full photo library access."

00:10:57   Basically, it's providing some tools for developers and it's talking to users about this.

00:11:03   That's all Apple cranking up the pressure to get apps to really stop asking for full library access.

00:11:09   But what we described last week about how Apple does offer this thing,

00:11:15   that thing's been there for a long time.

00:11:17   What's changed is the disclosure to users and some sort of Apple leaning on app developers.

00:11:24   There's a carrot and a stick here, basically.

00:11:27   I'll put the link in the show notes to a WWDC 2023 session that's all about what's going on here.

00:11:35   In the picker, there's an option sheet now that lets you take the location information out and stuff like that.

00:11:41   It's tidied up.

00:11:42   I feel like the prompts were really confusing before of like, "Allow access to some photos,"

00:11:49   "Allow access to all photos," "Add more photos to the access list."

00:11:52   It was like a whole thing.

00:11:53   It's interesting and I wonder if this is a future direction, but it's clear to me, I think,

00:11:59   that Apple does not want apps to ask for full library access for photos unless there's a very compelling reason for it.

00:12:07   There are still apps out there, probably for legacy reasons, probably because there was a reason five years ago and they did this.

00:12:14   There are apps out there that ask for everything and they just don't need everything and they need to just stop.

00:12:22   So anyway, thank you to the person from the Photos team who wrote in.

00:12:24   Thank you to Casey.

00:12:25   Thank you to the other people I heard about this.

00:12:28   Not a new feature, just Apple is using, I think, actually maybe even TipKit.

00:12:32   I don't know.

00:12:33   Apple's got little disclosure things that are like, "Your app has access to everything."

00:12:40   They're labeling all that stuff now and that's the new thing.

00:12:43   So it's drawing attention to it.

00:12:44   I guess basically it works as intended in that we're talking about it now and that's what Apple's really kind of wanting to do,

00:12:51   is put a spotlight on this and say, "Hey, you should know that apps are able to look at everything in your photo library and maybe they shouldn't."

00:13:01   So do you remember my complaint last time about the LOL emoji, like when you typed LOL on the keyboard and the emoji that was suggested in the quick type bar?

00:13:09   Yeah, ROFL.

00:13:11   Indeed.

00:13:12   I had a variety of people write in with different things going on.

00:13:17   Most people were sending me screenshots that they were on the iOS 17 beta and they were seeing the existing emoji, the correct emoji, the ones that are mostly just laughing emojis.

00:13:27   I ended up working out this change exists on the British English keyboard.

00:13:33   The American English keyboard has the correct previous emoji from iOS 16 and before.

00:13:38   So in iOS 17, for some reason, Apple believes that British people do not laugh like normal people.

00:13:44   Oh, right.

00:13:45   And have changed the emoji accordingly.

00:13:48   So this is one of those things where this is very weird and I have seen, I've had a couple of people write into me who were in the UK and so they had this set to British English and it was showing weirdly like mine were.

00:13:59   But I could test it myself by just changing my dictionary on the iPhone and then when I typed LOL, it showed me the correct emoji.

00:14:07   So this is one of those things where I guess I will see what happens when the next beta comes around and then I will file the feedback.

00:14:14   Do they not know, like does LOL mean something different in Britain?

00:14:20   Lots of love.

00:14:21   Lords or ladies?

00:14:23   Well, this is that thing, like, you know, I had this with my mom. Lots of parents think this, like they think LOL means lots of love.

00:14:29   But the emoji do not seem to indicate lots of love either.

00:14:33   Lords or ladies? What was the other one I thought of? Like, limeys over lemurs.

00:14:40   I don't know. Anyway, it doesn't make sense. It doesn't make sense. I don't know why this happened.

00:14:47   My best guess is that there was somebody who's in charge of the British keyboard who's like, no, no, no, no.

00:14:54   These emoji are wrong here for cultural reasons.

00:14:57   Or someone's just pranking me, right?

00:15:00   Someone at Apple is pranking everyone in Britain.

00:15:04   Well, I mean, that's another segment.

00:15:07   I have some follow-up about the burgees. So the burgees, as dictated last week, is where we look at each other's recently used emoji.

00:15:15   David wrote in to say, as the only show with its own pennant system for the keynote drafts, right?

00:15:23   So the pennants, which designate champion and challenger.

00:15:27   Please note that in some contexts, your pennant could be mistaken for a burgee.

00:15:33   A burgee is a type of pennant-shaped flag flown by recreational boating organizations.

00:15:41   Okay. So if we had a boat, an Upgrade boat, we would have a burgee on our boat and we could put an emoji on the burgee.

00:15:51   Yes, we could put an emoji on the burgee.

00:15:53   This is a yacht club thing. So we'd have to start a yacht club.

00:15:57   Makes me kind of want to have a boat now.

00:15:59   The SS Upgrade. Unity has launched their beta program for visionOS game development tools.

00:16:07   So that's out there and available now. They had some examples showing it off.

00:16:11   I think the What the Golf developer was making a game, which is interesting to me, Jason,

00:16:17   because Triband does have a VR game; it's called What the Bat.

00:16:22   So it was very peculiar to me to see them bringing a non-VR game to this rather than just using their VR game.

00:16:28   But who am I to criticize?

00:16:31   Yeah, Apple did a whole thing where they were, like, working for years with Unity, because they knew they weren't going to be working with the other 3D tool developer because of all the lawsuits.

00:16:46   And now you can sign up as a developer to get access to the Unity tools for visionOS and build stuff.

00:16:52   And they had some samples from different developers, and it's all clearly part of the development story rollout plan for visionOS.

00:16:59   It's very important, right? Like, if they do want any game experiences, especially any VR ports to visionOS, which for sure they do want, they need this.

00:17:10   And so I'm pleased that this is available again, like now, like way in advance. People can go and try it out.

00:17:17   And there was one last thing I wanted to mention. I just saw it. It was interesting.

00:17:20   We've spoken a little bit about passkeys on the show. Later this year, 1Password will allow for unlocking your vault with a passkey.

00:17:28   Nice.

00:17:29   I just thought it was an interesting thing to do, right? I feel like it could maybe allow people to have even more security over their actual 1Password.

00:17:39   So this one to me seems like, oh, yeah, you know what? I might do this, because it's not something I need to share with anyone.

00:17:45   Right. And so I thought, yeah, this one kind of makes sense to me.

00:17:49   I hit last week the thing that I've been dreading for a while now.

00:17:55   And it actually kind of enraged me. Right. Which is, I got somewhere where Apple said, oh, you can unlock this with a passkey.

00:18:06   Do you have your phone with you? Right. And I thought to myself, no, I am on a Mac with touch ID.

00:18:16   Oh, and I don't have my phone with me.

00:18:21   Why do I need my phone to log into this thing on my Mac? Is there really no way to do it? There's no way to do it.

00:18:29   You can't even set it up on a Mac to use like separately. I don't know. I don't know. I hate it.

00:18:37   Yeah, that's less than ideal. Yeah.

00:18:40   If somebody knows the reason, you know, I guess let us know. But I kind of don't care what the reason is, in some ways.

00:18:49   Right. Like I have a biometrics authentication system on my Mac that allows me to log in to things and identify who I am.

00:19:00   I don't understand why you won't let me use it on the Mac and that my Mac is incapable of using this thing unless I happen to have my phone with me to scan it.

00:19:11   It's my Mac. It's not a guest Mac. It's my Mac. Why are you having me do that?

00:19:15   Because that is a step back, because if you use iCloud keychain, then that would have worked. Right. You could have used a thumb.

00:19:22   There are places where you can use your finger to authenticate on the Mac and log into things. But apparently not Apple ID stuff for whatever reason.

00:19:32   I mean, I can see somebody saying, well, the reason is because there's a chain of events, this and then this and then... and I was like, OK, there's a reason.

00:19:42   That's why I say I kind of don't care about the reason because I this should work. Right. It should work.

00:19:46   And if it doesn't work, fix it. Right. Like find a way to make it work, because there's no way that a Mac user sitting at their Mac trying to do something should be told, go get your iPhone and scan this code.

00:20:00   If I was logging into the Apple ID website, one of the times that I might want to do that is if I lost my phone. Yeah.

00:20:07   Well, then my phone's not with me to log in with my passkey. So then what do I do? Yeah. Yeah.

00:20:15   I know that there are steps, but it's like... that's not it. Yeah, interesting. OK.
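For context on why the phone matters here: a passkey is a per-site key pair, where the site stores only the public key and logging in means signing a fresh challenge with the private key held on (or synced to) your device. The sketch below is a toy illustration of that challenge-response shape only; real passkeys use WebAuthn with ECDSA or EdDSA over standard curves, and every name and parameter here (the tiny Schnorr-style group, `make_passkey`, `sign_challenge`) is a hypothetical stand-in, not Apple's or 1Password's actual implementation.

```python
# Toy challenge-response sketch of the passkey login flow.
# NOT secure: toy-sized group, simplified Schnorr-style signature,
# purely to show why the device holding the private key must be present.
import hashlib
import secrets

P, Q, G = 2039, 1019, 4  # tiny group: G has prime order Q modulo P

def _hash(*parts):
    # Hash the nonce and challenge down to an exponent mod Q.
    h = hashlib.sha256("|".join(map(str, parts)).encode())
    return int.from_bytes(h.digest(), "big") % Q

def make_passkey():
    x = secrets.randbelow(Q - 1) + 1   # private key: stays on the device
    return x, pow(G, x, P)             # public key: registered with the site

def sign_challenge(x, challenge):
    # Only the device with x can produce this; the site can't.
    k = secrets.randbelow(Q - 1) + 1
    r = pow(G, k, P)
    e = _hash(r, challenge)
    return r, (k + e * x) % Q

def verify(pub, challenge, sig):
    # The site checks the signature using only the public key.
    r, s = sig
    e = _hash(r, challenge)
    return pow(G, s, P) == (r * pow(pub, e, P)) % P

priv, pub = make_passkey()             # enrollment: site stores only `pub`
challenge = secrets.token_hex(16)      # login: site sends a fresh challenge
assert verify(pub, challenge, sign_challenge(priv, challenge))
```

The point of the sketch: verification needs only the public key, but answering the challenge needs the private key, which is why the flow stalls when the device holding it (the phone) isn't at hand, even though the Mac has its own biometric hardware.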

00:20:22   This episode is brought to you by Factor. Now that we're in the thick of summer, you might be on the lookout for some wholesome, convenient meals to support sunny, active days.

00:20:32   Factor is America's number one ready-to-eat meal kit, and it can help you fuel up fast with flavorful and nutritious, ready-to-eat meals delivered straight to your door.

00:20:41   You'll be saving time eating well and staying on track to reaching your goals. With Factor, you can skip the extra trip to the grocery store.

00:20:48   Their fresh, never-frozen meals are ready in just two minutes, and you can treat yourself to more than 34 weekly restaurant-quality options like bruschetta shrimp,

00:20:57   risotto, green goddess chicken and grilled steakhouse filet mignon. So good. Plus, keep your energy up with lunch to go. Factor's effortless, wholesome meals like grain bowls and salad toppers. No microwave required.

00:21:11   Factor offers options to fit a variety of lifestyles, including keto, calorie smart, vegan and veggie and protein plus.

00:21:19   Then select from more than 45 add-ons, including breakfast items like apple cinnamon pancakes, bacon and cheddar egg bites and smoothies and more.

00:21:28   And you'll be able to rest assured that you're making sustainable choices, because Factor offsets 100% of their delivery emissions to your door,

00:21:35   along with sourcing renewable electricity and featuring sustainably sourced seafood. Jason, could you let me know some of the good, good food you've eaten from Factor?

00:21:45   Oh yeah, I mean, the chicken is the stuff that sticks in my mind. I know I've said this before, but like, it's not crappy. I have had, again, not going to name names,

00:21:52   but I have had other things that come in a box to your door that I was disappointed with the quality of the meat, especially like the chicken would be kind of like crappy.

00:22:02   And this was the best kind, like the kind I would get if I went down and got some at Whole Foods. It was that kind of very good quality, tasted great when I ate it.

00:22:10   All the other ingredients also looked really good. And that was the thing that I was on alert about for this product when I got it.

00:22:15   It was like, yeah, okay, it came in a box and I'm going to heat it up, but is it going to be good stuff? And the answer is yes, it is high quality stuff. I was very impressed.

00:22:24   Head to factormeals.com/upgrade50 and use the code upgrade50 to get 50% off your first box. That's upgrade50, the code upgrade50 at factormeals.com/upgrade50 for 50% off your first box.

00:22:41   Our thanks to Factor for the support of this show and Relay FM.

00:22:46   Every now and then a topic will come to the show or will come to mind or like a thing will happen and we're going to talk about it.

00:22:57   And usually these things are like, I don't want to talk about these things. You know? Like, I know you feel it too. We had this a lot with the CSAM scanning stuff, you know, or, like, we've spoken about it with, like...

00:23:11   Oh, some of the Apple lawsuit stuff was like that. And it's like, I don't want to, I don't want to talk about it. I don't want to get into it.

00:23:18   But they're important things.

00:23:19   Not fun, but they're important and you got to talk about it.

00:23:21   But you're not going to have a good time and we're in the summer of fun. This is just not fun.

00:23:25   No, not fun.

00:23:27   So, and as well, I feel like I should talk about it because of where I am in the world and you know, so here I am.

00:23:34   Okay. So this is a thing that I would expect most listeners aren't aware of or very recently became aware of.

00:23:42   We have something happening in the UK right now called the Online Safety Bill.

00:23:47   So it's going through the final stage of the lawmaking process in the UK.

00:23:52   So just as a very quick thing, we have two chambers of parliament, right?

00:23:57   We have the House of Commons and the House of Lords. The House of Commons is made up of representatives, right?

00:24:04   Members of parliament. They, you know, they represent the voting of the country.

00:24:09   They will work on laws and they will vote and move them over to the House of Lords.

00:24:13   The House of Lords is a selection of people who are there through some kind of peerage or who have been appointed.

00:24:20   Typically, a lot of them have had long-standing political careers.

00:24:24   They are then appointed to the House of Lords. They are a second check.

00:24:28   They debate and argue, make revisions, maybe will sometimes actually send something back to the House of Commons.

00:24:35   They're like, nope, do it again. But this piece of legislation is in the House of Lords right now.

00:24:41   It's being debated. It's most likely going to go through in some fashion.

00:24:47   I've had some conversations with some people who I trust to know these things who have said this is going through.

00:24:53   It's just a case of what shape it takes.

00:24:56   Work on this bill, the Online Safety Bill, started in 2019.

00:25:01   And it was ostensibly a bill to make the Internet safer for children and has a lot of that stuff in it.

00:25:10   The problem is, it's a bill that's been in the works for like four or five years.

00:25:17   It's almost like rolling a snowball down a hill: it has picked up so much stuff on its way through the four prime ministers that have been a part of this bill at this point.

00:25:31   So it was ostensibly about things like, you know, age verification stuff and trying to keep harmful content away from children.

00:25:43   That was what it was looking at.

00:25:45   Then age checks, mandatory age checks for adult content got wrapped into it.

00:25:49   Then identity verification for social media websites got wrapped into it to try and stop anonymous online trolling.

00:25:56   Then to try and get rid of scam ads, that got pulled in. This bill is huge.

00:26:02   And I think it's one of the reasons why this genuinely, like, I just didn't know this was happening right now.

00:26:09   Neither are a lot of people in the UK even aware of this bill as it passes through, because it's been going on for so long and it's now so big that people have just kind of ignored it.

00:26:20   It is truly shocking to me how little it's been reported on until it already passed through Commons and it's with the House of Lords.

00:26:30   But this has recently started, and I will include, there's a bunch of links in the show notes for some good articles.

00:26:36   There's a great one on The Verge by John Porter where John has gone through and created a timeline of this bill and it was very informative to me of like where it started and where it is now and all of the things it's picked up along the way.

00:26:50   But this has recently started to cause a bit more trouble due to provisions in this bill for, you guessed it, scanning encrypted messages for child sexual abuse material, terrorism activities and abusive content.

00:27:05   Essentially, our government wants to give our regulatory body, that's called Ofcom, let's just say that this is closest to...

00:27:15   - What's the FCC? - Yeah, the FCC, that's it. My brain was saying FTC, but that's not it. The FCC is what it's kind of closest to.

00:27:22   It's like our regulatory body for communications, and it covers, like, television and the airwaves. It's effectively to give Ofcom the power to demand that a company, a company like Apple or WhatsApp for that matter, create a backdoor into their encrypted messaging and communication apps to allow for this stuff to be scanned for and reported.

00:27:49   It has led to exactly that: Apple, WhatsApp, and Signal saying that if these rules come to pass, they may have to exit the United Kingdom completely.

00:27:59   So the issue that we're encountering here is... basically, the law is complicated in such a way that, if it is passed, it is not saying, from my understanding, that from day zero Apple needs to create a backdoor.

00:28:20   What it's saying is, we as the government may be compelled to ask you at some point to do this. So like, they could pass the law, but nothing changes until the time it does.

00:28:32   And there are some like, amendments being proposed now about like, what is the chain of events that need to occur for that to happen? None of these are still very good, but like, these are the things that are being argued.

00:28:46   So, when and if that occurs, and the UK government goes to Apple or goes to WhatsApp and says, we need the backdoor, the only solution, realistically, is that these companies will have to say no, and they will have to say no for two reasons.

00:29:06   One, we won't do it. Two, we can't do it, right? Which is what these companies are saying. So the only solution then is they would need to literally leave the United Kingdom.

00:29:17   Or build a feature that would take them back to something like, what is it, Facebook that does this? Back to where you can opt to be in a not-encrypted mode or whatever. But even then, if people opt to be encrypted, the UK law would basically say no, no, no, we want it anyway.

00:29:38   And there's no way to do that. We've talked about this a lot. This is the math part, which is, end-to-end encryption means they can't break in.

00:29:48   So what this law is basically saying is, don't do that. Don't have end-to-end encryption. We're outlawing it functionally because you can't, and they may not even think about it this way, because remember, there's this magical thinking that happens in a lot of governments, where they think, oh, these tech wizards will just figure it out.

00:30:06   But if you look at the math of cryptography, that's not how cryptography works. It's just not how it works.
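The "math part" can be made concrete with a toy key exchange. This is a sketch of classic Diffie-Hellman with deliberately toy-sized parameters, not the actual protocols iMessage, WhatsApp, or Signal use (those are curve-based and far stronger); it just shows why a relay in the middle has nothing it can hand over.

```python
# Toy Diffie-Hellman exchange: the two endpoints derive a shared secret
# from values that never cross the wire, so the provider relaying the
# messages never holds it. Toy parameters, NOT secure.
import secrets

P = 2**127 - 1   # a Mersenne prime; for illustration only
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # private value: stays on the device
    pub = pow(G, priv, P)                 # public value: this is what's sent
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The relay (and any mandated "scanner") sees only alice_pub and bob_pub.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)

assert alice_secret == bob_secret  # both ends derive the same key
# Recovering that secret from the public values alone is the
# discrete-logarithm problem. A law can demand access, but the only
# ways to grant it are weakening the protocol itself (a backdoor)
# or not offering end-to-end encryption at all.
```

That last comment is the whole argument Jason is making: there is no third option where the provider keeps real end-to-end encryption and also reads the messages on demand.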

00:30:12   And like this is, you know, just there is obviously a little bit of posturing occurring right now, right? Which is where we are right now. Everyone's posturing. Like Apple could just create a version of iMessage, which is unencrypted. I'm not saying they would do that, but they could do that. Right? And so they could.

00:30:29   And then just basically in the UK, no iMessage was encrypted, and you'd get a warning if you were texting with somebody in the UK saying this message is not encrypted.

00:30:37   I don't necessarily, well, I don't think they should have to do that. I'm not sure if they would do that, but they could. But everyone right now is just saying, we will have to leave the United Kingdom.

00:30:47   And it kind of seems like at the moment that the government is hoping somehow that tech companies can find a way to do the thing they want them to do, which is like do one of these, as you mentioned, right? All those, those, those boffins over there, they're going to work it out. Right?

00:31:02   Good word. They probably use the word boffins. Right?

00:31:05   I thought I might as well go full on.

00:31:06   The UK is the home of wizards, but boffins is even better. Right? Because boffins is, I don't understand it, but they, they, they, they talk their little blah, blah, blah, blah, blah, and then magic things happen. So they'll work it out. They always do.

00:31:17   Which is what all governments think, right? It's, this is just like, it's just like, oh, they're so smart.

00:31:22   All politicians and governments. Yeah, we'll, we'll, we don't need to worry about it. We'll, we'll just deal with this.

00:31:27   I mean, and I guess this is one of those things though, where like, I understand why people feel that way because there have been so many times where these kinds of things have happened, right? Where it's just like, we need to get around this problem. We'll just sort it out. We'll just work it out.

00:31:39   But this is one of those ones where like, there isn't a way to do it and stay true to what encryption is. Right?

00:31:49   Right. They're, they're like the, they're, they're whistling past the graveyard here. They're like, yeah, yeah, yeah, yeah. Uh, I'm sure it'll all work out.

00:31:56   Because like, don't forget, I'm sure this didn't help. It was only a couple of years ago where Apple was saying they were going to have this, right? Like, we went mental, right? But I have no doubt that like, I mean, and I'm a little bit mad about this because I genuinely believe that Apple kind of created this bet a little bit where they made it clear that there was a process.

00:32:19   There was one and they were gonna do it right to handle at least the CSAM stuff. So then you got to think if you're in government and like, well, if they could do it for that, they could probably do it for anything. And the answer is yes. Right? Like they could cryptographically create any of these things like, oh, there's a bunch of words that we look for for terrorism activities. Right? Just create that list and have it scan iMessage. Right? So like, and then send the send the results. Right? Yes. So there is no like,

00:32:48   Apple did this whole thing. They have this whole plan. Right? And I have no doubt that like, because there is one thing that I put at the end of my notes here. I'll just say it now. Like, this is happening here. But listen to wherever you are, this is coming for you too. Right? Like, every government wants to do this. Every government wants to do this. Like, and this goes back for so look at the Patriot Act, right? Like, every government wants to do this.

00:33:11   And they will use, they will use the most frightening and dangerous examples in order to force you to agree because the last thing you want to do is say, and I'm sure there are people out there listening to this who are thinking this right now, which is like, what's the big deal? They're looking for child sexual abuse material, terrorism and other abusive content. This is bad stuff. Why would the government not want to patrol it?

00:33:36   And the answer is, once it's there, and we've seen this again and again throughout history, including the stuff that happened after the Patriot Act. Once it's there, the government will use it for whatever they want to.

00:33:50   All they need to do is get it through. Because now they've cracked it open. And what they hate about end to end encryption is that it means that the government can't spy easily on whoever it wants to.

00:34:01   And I know there's legal reasons, but we saw certainly with the Patriot Act that there were all sorts of things that they were looking at because they decided to, not just the things they scare you with.

00:34:11   Including other governments.

00:34:13   Why do you want to protect terrorists? The answer is, I don't want to protect terrorists. But I also don't want you to have carte blanche to spy on everybody's communications in your country.

00:34:25   There is a privacy issue here too. I would argue a right to privacy. And then there's the fundamentals of the math, like I said, of cryptography.

00:34:37   Companies can make it so that everybody can talk to each other and the government can't spy on them. And the government says, "Whoa now. We want to be able to spy on them."

00:34:47   And it's a little like, I know I've said this before, but I'm just going to say it again. It's a little like when the Miranda decision came down in the U.S. in the early 70s.

00:34:55   And the police said, "Oh no. If we have to read people their rights, we'll never solve crimes."

00:35:00   And the answer was, guess what? They figured out how to solve crimes while reading people their rights. They were able to manage that.

00:35:08   But in the moment, you'll get your government to say, "Oh no. We're helpless. We just can't do it. There's no other technique we can use for this."

00:35:16   "All crimes will just occur and there's nothing we can do."

00:35:20   And the flip side, yeah, more difficult maybe, but the flip side is you're breaking privacy and making it so that anybody, including the government doing bad things and including bad people doing bad things, will be able to view what people send.

00:35:37   And the thing about this bill and its size is, and the reason that it's here right now, is because this was the bill to keep children safe online.

00:35:48   And who's going to argue with that? And that's why they rolled it into this and that's how it got past us.

00:35:55   Because there are things in this bill that I agree with. There's so much in the Online Safety Bill that is good stuff that I agree with.

00:36:05   But there's also this ugly part of it.

00:36:08   Well, somebody in some shadowy department inside the UK government leans into one of their friends in the House of Commons and says, "Slip this in there."

00:36:16   Yeah. Yes, that's been happening constantly. It's still happening.

00:36:19   This is what we really want this and this is a great vehicle to get us this thing that we want, which is to break encryption.

00:36:26   Also, I'm sorry if this touches a sore spot for you, but what's also interesting about this is it's different.

00:36:35   The way it's being covered, too, and the way it's being reacted to by companies is different than some of the EU restrictions that we've been talking about on this show for years and years.

00:36:44   Because of Brexit. This is a relatively small country. I know it's the sixth biggest economy in the world.

00:36:51   It's not nothing, but it's just the UK. And just the UK doesn't carry the weight of all of Europe.

00:37:00   It just doesn't. If this was a European regulation to break encryption, it would be a much bigger deal.

00:37:09   And I don't think anybody would be as seriously talking about, "We'll just turn it off in the EU."

00:37:15   But in the UK, it's easier to do that. It's easier to do that now.

00:37:19   And yet, this bill makes it seem like, "Oh, well, they're going to do what we want. They're going to do what we say because we carry all this weight."

00:37:26   And they don't carry the weight they used to.

00:37:28   We still carry weight, right? The UK is one of the largest, if not the largest, markets for technology companies especially, right?

00:37:37   Yeah, for sure. It's an important market. I'm struck by the fact that there's a power imbalance here versus the stuff that we hear about with EU regulations.

00:37:47   Yeah, it annoys me. There's been a lot of talk about this over the Microsoft thing, right?

00:37:55   With Microsoft and Activision. And a lot of people, including people in our sphere, are just like, "We'll just cut off the UK."

00:38:00   It's like, "Well, I mean, I'm going to have to say no on that one, right? I don't agree with that."

00:38:06   There's a lot of conversation of like, "Well, it's just the UK. Who are they to say that an American company can't..."

00:38:15   Well, the American company wants to do business here, so if they're going to do business here, we do have our own rules and we are allowed to look at this and judge on it, right?

00:38:25   They can decide not to do business here, but if they want to do business here, we get to say what's right for our people, whether we agree with that or not, right?

00:38:34   The implication here, I think, is that when companies like Apple are playing hardball with the UK, and that's what this is, is political hardball.

00:38:42   What they're saying is, "Okay, you want to do this and we don't want you to. Let's follow a scenario here, which is we break our services in the UK and blame you."

00:38:55   How does that play out? I'll also point out, it's a Tory government that is going to lose the next election. So maybe they don't care.

00:39:05   I think they're lining themselves up for the next time. Also, Jason, this has got cross-party support. Everyone wants this, and Labour won't get rid of it. I'm convinced of that.

00:39:16   Well, that's true, but it doesn't pass without the majority passing it. I don't know what that political calculus is.

00:39:22   But anyway, the hardball is, Apple says, "Play this out. What happens and who will they blame? Will they blame Apple or will they blame you?"

00:39:30   I think it's a case where, not that the UK isn't important, but that a big company like Apple might look at it and say, "We're willing to break some stuff in this market for a while,

00:39:41   because we think in the end it will bounce back and hit the people making the laws and not Apple."

00:39:51   That's a risky calculation, but I think it's an easier one for them to make for something like the UK, for something like encryption,

00:40:02   versus Apple saying, "Take that, EU. We're not going to put USB-C on our phones and just not sell them in the EU." That's a lot harder for them to say. So they won't say it.

00:40:11   I would say this one is a bit easier because it won't be just Apple. If this was a law that just impacted Apple, maybe. But the bigger impact is WhatsApp.

00:40:19   WhatsApp here is the dominant messaging service. None of my friends want to speak to me on iMessage. Everyone wants to use WhatsApp. It is all WhatsApp here.

00:40:31   That will actually be the big fallout. I'm focused on Apple because it's what I care about, but the one that will have the biggest impact is WhatsApp.

00:40:41   WhatsApp have also said that they will just leave. Meta have a history of this. You can't get Meta services in China. They didn't do the deals that Apple did.

00:40:53   Threads isn't in the EU.

00:40:55   Yeah, Threads isn't in the EU. They will just not be in a place. I wanted to read this one thing that was, again, just so… this makes me so angry.

00:41:07   Speaking in the House of Lords, Baroness Fox of Buckley says, "The government has exempted text messages, Zoom, and email from the provisions of the bill, and has also exempted messages sent by law enforcement, the public sector, or emergency responders."

00:41:24   So the government has carved themselves out of this.

00:41:28   Ah, yes. Law enforcement and the government can't be spied on. Just everybody else.

00:41:34   Zoom! Zoom!

00:41:37   It seems that the target of this bill is UK private citizens and residents, and that the public are seen as the people who must be spied on.

00:41:44   I wanted to say this for two reasons. One, I just think it just makes me more angry.

00:41:48   But also just to be like, I know when I say things like the House of Lords, people get a thing in their mind about who these kinds of people are, but just to prove that there are actually smart people who understand this stuff.

00:41:57   It's not just a thing that goes up there and flies through it. They're arguing it through, which just makes me happy that people are actually doing this.

00:42:07   Making an effort there.

00:42:08   This bill is going to pass in some form.

00:42:11   No, this is all about what form it is. And I will also say, exemptions, right? Like loopholes and exemptions. And here's, okay, I was going to say here's the real story here, but there are multiple real stories happening.

00:42:28   One story is the government wants to look tough and say that it did something. That's part of it.

00:42:35   Part of it is there are parts of the government that really want to, in my opinion, overreach in terms of access to people's information.

00:42:42   But in this kind of scenario, you end up with this question of like, could we put in some loopholes that will allow us to look tough and not look bad, like we drove these companies out?

00:42:59   Because the calculation is they want to do stuff that will be done because they don't want the end result to be that these companies abandon the UK.

00:43:07   That makes the UK look bad. And in fact, I mentioned Brexit earlier, but it really does make the UK feel off, feel broken.

00:43:18   It's like, oh, it's available everywhere in the world, but not the UK because you know. And they don't want that. They don't actually want that.

00:43:25   So there's this calibration of like, how far do you push it? And the danger is what you said about the boffins.

00:43:32   The danger is that they throw in the boffin variable and they're like, oh no, boffins will figure it out. It's like, no, no, no, no, no, you can't make that calculation.

00:43:39   You're doing it wrong. But that's what they're trying to find here, I think, is a way that makes them look good and maybe gets them some of what they want, but doesn't explode in their faces.

00:43:51   And if they essentially outlaw all sorts of services by all sorts of companies by doing this, and maybe they don't realize they're doing it, but I think they've been told and they can just believe it or not.

00:44:03   They know that if they just do that, it's probably going to make them look bad. So I think it's a fascinating political calibration of like, Zoom is a good example.

00:44:12   Like, okay, why is Zoom exempted? Why is email exempted? Why are text messages exempted, but not, I guess, encrypted messages via an app?

00:44:22   Like, these are the questions and it probably all has to do with that push and pull of like, well, okay, we'll put this in here because I guess somebody at Zoom has really good lobbyists who said you'll break Zoom if you do this.

00:44:33   And they're like, oh no, not Zoom. I use Zoom.

00:44:35   But the government uses WhatsApp. This is what so makes us-

00:44:40   Right, that's why they've exempted themselves, right?

00:44:43   That's why WhatsApp won't be there anymore. So like, I think what they tried to do is exempt themselves so they could use WhatsApp, not realizing that WhatsApp would say, bye.

00:44:55   I would also say not realizing that what they're really legislating in that scenario is a two-tier encryption policy for apps, where there is end-to-end encryption for some but not for others, right?

00:45:08   That's the only way that the public sector carve-out works, right?

00:45:11   But it's also not a thing they're going to get, right?

00:45:14   Well, no, they're not going to get it, but I think it speaks maybe to a level of delusion about what they can make.

00:45:19   It's a little bit like the FBI saying, we would like Apple to create an entire software department that builds versions of their OS for us to use, right?

00:45:29   Which they did. They legitimately said, we think that this should be a thing where if you're an OS vendor, you should have a bunch of engineers who basically work for the government to do back doors or break things or do special builds to be loaded onto evidence phones and things like that.

00:45:45   And it's such a spectacular overreach, but I mean, they're not, they do it.

00:45:50   It's not a new concept to try to ask for stuff like this.

00:45:55   So this is all just so, and I don't know what it means for my career ultimately, right?

00:46:00   Like if this bill passes and Apple pulls out of the United Kingdom, like I don't know what I will do.

00:46:07   Like it's complicated, like, because what would they pull out? Like, would they just stop selling phones here?

00:46:15   I don't think so. I think it would be that they would break, like they would break iMessage.

00:46:20   Sorry, no iMessage, just take it off the phone, maybe?

00:46:23   Yeah. Yeah, I think that's, I think that's likely what they would do because they want to still sell phones, but they would just, they would just turn off those services.

00:46:30   I also wonder, something we haven't talked about that is also part of the political calculation here is using the UK as an example.

00:46:39   Right? So part of your calculus of your Apple or WhatsApp or whoever is, look, if they get away with this, everybody else, like you said, everybody else is stepping up to bat, right?

00:46:51   Everybody else is going to come in with their version of how do we break encryption because we want to spy on our people too.

00:46:57   And so Apple and WhatsApp and the rest may look at the UK and say, well, they're the ones pushing this now.

00:47:06   If we don't put our foot down right now, the rest of them are all going to say, oh, well, the UK did it, we can do it too.

00:47:14   So the UK may end up feeling the pain of that, right? Of just saying like, we need to do this now or we're not going to ever be able to stop this rolling around the world.

00:47:24   What about the China example?

00:47:26   Yeah.

00:47:28   There is already, Apple's already made an example, right? But isn't it that all iCloud stuff is stored in servers in China and they have the key as well?

00:47:39   It's not just Apple that has the key, right? If I'm remembering that correctly.

00:47:43   I believe that there is some truth in that, but that it's not part of their end to end encryption story, right?

00:47:49   They said the end to end encryption is going to be available. And I think they said China too, right?

00:47:55   And everybody went, well, that's not going to work. And we don't know how that's going to work.

00:47:59   But yeah, this is the, that seems, it's a little bit different because I think that's about iCloud data being stored and there being a decryption key.

00:48:10   I don't know. I mean, it's not quite the same, but this is, this is the question is like, do you want to let this, do you want to let this go?

00:48:16   Because we know that's fuzzy. That's like a fuzzy thing.

00:48:21   End to end encryption means there is no key. There is no server key that Apple can share with China.

00:48:27   They can't because there isn't one. And that's what these laws want and bills and whatever these want to do is they want to say, no, no, no, no, no, no.

00:48:35   There's got to be a key that you hold that we can get to. There has to be, or it's not legal.

00:48:42   It's, it's, which takes us back to the bad old days of the nineties when encryption was a munition and couldn't be exported.

00:48:48   And Bill Clinton wanted to do like the Clipper chip, which had a backdoor that was immediately discovered.

00:48:55   And it's, it's bad stuff. But that's, that's what they want is essentially what that China story was about,

00:49:01   which is, um, if there's a key to be had and we can say, we get to have access to it.

00:49:06   And that's what the governments want. Ultimately, that's what governments want is they want a decryption key that is held by the service provider that they can access on demand.

00:49:16   So I don't know. I mean, I'm not a betting man, but if I was, here's my imagination of how this plays out, just as being a realist: this will pass.

00:49:32   But a lot of these bad things won't happen because they'll work out a bunch of carve outs because I think realistically, everyone's going to realize this is an untenable situation. Right?

00:49:46   Like, yes, there are conversations behind the scenes where like Apple was saying, like, we will do this, but we want you to know we don't want to do this. Right.

00:49:55   Like, which is a different message to how they will play publicly.

00:50:00   They're also going to bring in the boffins and explain, like, we can't do what you think we can do. Their only choices are, we have a key and anybody else can get that key, theoretically, or we don't have a key and I can't give you access because we don't have the key.

00:50:18   I think you're right. If I had to guess, and you know, we live in strange times. It may be that we see companies shut off their services in the UK. It's also possible that they'll pass the bill and there'll be a date for it and the companies will start announcing the shutoffs and it will become like a thing and they'll be like, oh, we have to fix that and they fix it.

00:50:37   But I think you're right. The most likely scenario is they carve out enough that the government can declare victory and it doesn't break the internet for the UK. That's the most likely scenario because in the end, they're politicians, whether they're going to be reelected or not, but like they all want to be reelected.

00:50:52   They're politicians and they want to look good and they want to look like they're protecting the country, but they also don't want to look like they broke the country. And that would be an overreach and a misstep.

00:51:03   And I don't think they really want to pick fights with big tech companies. I think they just want to look tough. And that's where the carve outs that are already in there came from. So more of those, yeah, probably.

00:51:14   And it might even be something like that companies have also agreed to do these other things to protect and Apple, you know, they'll, maybe they'll even put in there something like language about scanning for child sex abuse material.

00:51:28   That is literally what Apple's already built in. Right? Like there's, right, there's, there's, there's ways that they can mitigate this to make it look like they're tough, but also not look bad when people's phones stop working right.

00:51:45   Just before we wrap up, I will just throw in just like a closed bracket on the bracket I opened earlier. I actually don't think that Microsoft should be allowed to buy Activision Blizzard. I don't think they should be allowed to brute force their way into having a successful platform.

00:52:00   I think a lot of people are not aware of the fact that Microsoft have already bought a billion game studios. If they can't make it work at this point, that's on them. Buying Activision Blizzard isn't the way that they should be able to try and make their console competitive.

00:52:15   They're throwing around money from a different part of the organization that's not coming from the Xbox organization to brute force themselves into being competitive with PlayStation.

00:52:24   I don't think they should be allowed to do it, but that's neither here nor there. I just, I've never really said that on a show or many shows, but like that's my opinion on it. I don't think they should be allowed to do it.

00:52:33   I get, I get the, um, I get the counter argument, which is that Sony is so successful and that more competition would probably be better. But I also hear your argument, which is Microsoft have been buying studios for years.

00:52:46   They've done everything they need to do. If they can't get these games across the line, that's on them. They shouldn't now buy Activision as well. Like Microsoft have, like, I think people aren't aware they own so many game studios.

00:52:59   There's an Indiana Jones game coming out on Xbox. It's going to be excellent because they bought the company that's going to make it.

00:53:05   Starfield, one of the biggest, most anticipated games of this year, is a Bethesda game that is only going to be on Xbox because they bought it with ZeniMax. Like there are so many games that Microsoft have already bought to put on their platform that like they, they have everything they need to try and be competitive against Sony.

00:53:22   They just haven't made it work yet. They don't also need to own Activision. Like it's not a thing that is required. Like I believe it is anti-competitive no matter what deals they sign, there is no reason that they should own Activision Blizzard.

00:53:36   Like, but there you go. There's, there's a little bit on the end there that I'm just going to add in for the sake of it. Uh, I think that it's stupid.

00:53:45   Okay. Fair enough. Well, summer of fun. Hmm.

00:53:52   This episode is brought to you by Vitally.

00:53:56   Customer success teams today, they're facing a problem. How do they connect customer data back to their work?

00:54:02   Vitally changes that. It's a new kind of customer success platform as an all in one collaborative workspace that combines your customer data with all the capabilities you expect from today's project management and work platforms.

00:54:14   Because it's designed for today's customer success team, Vitally operates with unparalleled efficiency, improves net revenue retention and delivers best-in-class customer experiences.

00:54:26   It's the solution to helping your customer success team keep a better pulse on your customers, which maximizes productivity, visibility and collaboration.

00:54:35   You can boost your bottom line by driving more revenue per customer with Vitally. And if you take a qualified demo, you'll get a free pair of AirPods Pro.

00:54:44   So if you're a customer success decision maker, actively seeking customer service solutions, working at a B2B software as a service company with 50 to 1000 employees, and you're willing to explore changing customer success platforms, if you already have one in place,

00:55:00   schedule your call by visiting vitally.io/upgrade and you can get yourself that free pair of AirPods Pro. That's Vitally, V-I-T-A-L-L-Y.I-O/upgrade for a free pair of AirPods Pro when you schedule a qualified meeting.

00:55:15   Our thanks to Vitally for their support of this show and Relay FM.

00:55:19   Do you want to talk about Apple and AI?

00:55:23   Why not?

00:55:25   Maybe see if that can change the mood that I'm in right now. I mean, realistically, probably not. Talk about a few things that I don't really...

00:55:34   Anyway, Mark Gurman is reporting in Bloomberg that Apple is working internally on their own tools to combat the likes of OpenAI, Bard from Google and Llama from Facebook.

00:55:44   They have apparently built a framework to create large language models inside of the company. This is code named Ajax, from which... Is it Ajax or is it Jax?

00:55:55   I think it's Ajax because it's based on Jax, which is a Google Cloud thing.

00:56:00   I didn't know that. What's Jax, Jason?

00:56:03   It's in Mark Gurman's thing. I'd never heard about it before.

00:56:07   Oh, I missed that.

00:56:08   It's an extensible system for transforming numerical functions.

00:56:11   Okay. You know what? I read this article.

00:56:14   It's a machine learning framework for transforming numerical functions.

00:56:16   I think that piece of information just fell out of my brain because I don't understand what it means.
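
For what it's worth, "transforming numerical functions" is less mysterious with an example. JAX's core trick is taking an ordinary numerical function and transforming it into a new one: its derivative, or a compiled version. This is a generic sketch of public JAX usage, nothing to do with Apple's internal Ajax framework, which per Gurman's report is merely built on it:

```python
import jax
import jax.numpy as jnp

def loss(w):
    # An ordinary numerical function: a one-parameter quadratic.
    return jnp.sum((w - 3.0) ** 2)

# jax.grad transforms the function into a new function computing its
# derivative; jax.jit transforms that into an XLA-compiled version.
grad_loss = jax.jit(jax.grad(loss))

print(grad_loss(0.0))   # d/dw of (w-3)^2 at w=0 is 2*(0-3) = -6.0
```

Stacking transformations like `grad` and `jit` over plain functions is the "framework for transforming numerical functions" part; machine learning training loops are built out of exactly these pieces.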

00:56:19   From which they have made... Apple has made a chatbot of their own from this Ajax project.

00:56:27   This is internally being referred to, I'm assuming as a joke, as Apple GPT.

00:56:32   Sure.

00:56:33   This Ajax project was originally started as a way to unify Apple's disparate machine learning projects into one kind of thing.

00:56:42   But now they've started to do all of this. This team has grown. It's become like a cross division team, which was a thing that Mark was reporting about a couple of weeks ago that we didn't end up covering on the show.

00:56:53   But the Vision Pro team is different. The actual makeup of the team is made differently to the way that Apple's teams have been made up.

00:57:03   Where they have their own head of software engineering and their own head of hardware engineering.

00:57:07   Which is interesting. Craig wasn't overseeing the software part. It was somebody else.

00:57:14   I'm sure they link together. But anyway.

00:57:16   Of course they connect.

00:57:17   And it seems like this is maybe a similar thing. Where it's like they're bringing in their own... Anyway.

00:57:23   This is a quote from Mark's piece.

00:57:25   In recent months, the AI push has become a major effort for Apple, with several teams collaborating on the project, said the people who asked not to be identified because the matter is private.

00:57:35   The work includes trying to address potential privacy concerns related to the technology.

00:57:41   Apple employees are using Ajax to assist with product prototyping.

00:57:45   It also summarizes texts and answers questions based on data it has been trained with.

00:57:50   But Apple are not currently permitting this to be used.

00:57:54   The output of talking to Ajax cannot be used in any customer-facing features due to security concerns.

00:58:01   Yeah. The line that jumped out at me was, the chatbot app was created as an experiment at the end of last year by a tiny engineering team.

00:58:12   Just let that sink in a little. Remember we were talking about all the chatbots and it's like, "Oh, what's Apple doing here?"

00:58:21   And I'm like, "Ah, do they have nothing or are they working on something and they just can't tell us about it yet?"

00:58:27   Well, according to this report, late last year, somebody at Apple was like, "Hey, let's do a chatbot too."

00:58:34   Okay. And then its rollout within Apple was initially halted over security concerns about generative AI.

00:58:40   But it has since been extended to more employees. Still, the system requires special approval for access and any output from it cannot be used to develop features bound for customers.

00:58:49   Okay.

00:58:51   One thing that I found interesting in this article was that Apple considered signing a deal with OpenAI to use their tools.

00:58:59   They conducted a corporate trial of the technology and decided not to do it.

00:59:03   It's wild to me that they even considered it, honestly.

00:59:06   It's a real not invented here kind of thing. It's like, "Oh, we can do that too."

00:59:09   Unless they were looking to buy them, right? And unless this was before the Microsoft thing,

00:59:13   and maybe they wanted to do this as a way to see if it was worth doing what Microsoft did.

00:59:18   So the way I read this is that Apple has a lot of overarching technology that they use machine learning for.

00:59:26   We know that. It's everywhere. Whenever people say, "Well, Apple didn't pay attention to AI," it's like, "They built the neural engine.

00:59:32   They've been scanning your photos a billion times for metadata for years now." All of that is true.

00:59:40   But it does seem like this natural language thing caught them by surprise,

00:59:46   or, alternately, the popularity of the natural language thing caught them by surprise.

00:59:52   I could see a scenario where the machine learning people inside Apple look at the chatbots and they're like, "That's stupid."

00:59:58   They turn their noses up and it's like, "That's stupid." And then the public is like, "Oh, did you see that? That's really exciting."

01:00:04   And inside Apple, they're like, "Oh, people like this." "Do we have anything?" "No."

01:00:10   "Let's work on that. Let's put something in there." What baffles me more is what is going on with Siri?

01:00:16   What is going on with Siri that all of these sorts of things are happening?

01:00:20   Language models are not new to Apple. The autocorrect that they're using in all their OSes this fall,

01:00:28   that's a Transformer model. It's like GPT. But it's targeted on autocorrect.

01:00:33   The chatbot part, okay, maybe they didn't want to do that, but what is the Siri group thinking about this?

01:00:41   Are they thinking, "No, no, no, no, no, the way we do it is best."

01:00:45   Are they thinking, "Well, we are doing something that's like a chatbot, but it's a little bit different." Or what?

01:00:50   That's the part that keeps on hovering over my head when I see that they made a chatbot app late last year is,

01:00:57   "What about Siri?" Because all our thoughts go there when we think about these chatbot apps and personal assistants and all of that.

01:01:04   It's like, "Okay, Siri's powered by machine learning," I guess, or whatever, but the experience is poor.

01:01:09   And we look at the output from these chatbots and we think, "Oh, that's Siri-like except remembers what you said and tries to help you."

01:01:17   And yeah, there are issues there. They make things up and all that, but there's something interesting there.

01:01:22   And this report is like in December, somebody at Apple sat up straight in their bed in the middle of the night and went,

01:01:28   "Oh, we should do a chatbot." Like, really? That's when it was? Late last year? I don't know.

01:01:38   "While the company doesn't yet have a concrete plan, people familiar with the work believe Apple is aiming to make a significant AI-related announcement next year."

01:01:47   Bravo. Bravo. Probably the line of the piece. And it's so troubling because basically what it is is the company doesn't know what it's got.

01:01:58   I chained together in my little link post on Six Colors, they don't have a "strategy," they don't have a "consumer angle," they don't have a "concrete plan,"

01:02:09   but they want to ship something next year or at least announce something next year. It's like, what? What are you announcing?

01:02:18   What are you announcing if you don't have a concrete plan or a strategy? Why in the world? That is a red flag.

01:02:28   That is a big red flag of, "We feel we need to make an AI announcement. We don't know what it's gonna be, and we don't have a plan, but let's plant a flag."

01:02:39   It says the chatbot is not ever expected to be a consumer product, that it is super bare bones, it's really basic, it's like, "Hey, we're just trying out this thing."

01:02:50   I was like, "Yeah, but do you not have a Siri one? Do you really not have Siri working with this technology? Really? Surely they do, right?"

01:03:06   There is a version of Siri somewhere inside of Apple that is using a Transformer model. We can't be the only people making that suggestion.

01:03:16   No, no, I mean, like I said, I think that what probably is going on is either there's a group inside that's working on something and they don't want to share it,

01:03:24   or it's a group that poo-pooed it, right? So, nah, nah, nah, that's not for us. We're too good for that sort of thing. It's bad, it hallucinates things, it's not gonna be anything.

01:03:33   And then it becomes a big hit and everybody's like, "Oh, should we do that?" And then they're behind because they kind of poo-pooed it.

01:03:39   Now, I'm open to the argument that the current chatbot kind of tech is just not suitable, because you can't have Siri making things up.

01:03:48   But we're already starting to see the fact that if you have a conversational model that can talk to people, glean information, and then connect it to data sources

01:03:57   that it can summarize, that you can get a pretty good experience. And it's just, you know, not only is that baffling, but again, if you don't have a concrete plan,

01:04:08   why do you have a date? Why do you have a date to make an announcement? Because that, it just, that's such a bad sign, right? That is a company saying,

01:04:16   "We need to be seen doing something in this area." Well, what do we got? Uh, nothing. Well, let's slap something together because we got to make an announcement,

01:04:24   whatever it is, however we figure it out. And that, like, that's a little too speculative for my blood to have it be that we don't have a concrete plan or a strategy,

01:04:33   but we're going to try to announce something. We don't know what next year. Not. So. Great. There's something, it feels very dysfunctional.

01:04:41   It feels like this is an area where either, I don't think it's an area where Apple's not paying attention. That's why I keep going to the fact that it's like it was,

01:04:49   it was considered and discounted because they're doing a lot of machine learning stuff. They really are. They really are. But this one, I just, I don't know.

01:04:58   Is it, is it people on the Siri team? Did they decide to do something else? Is it people in machine learning? He says that there's some real disagreement inside Apple

01:05:05   about the value of some of this stuff. And that's where I start to get those vibes of, "No, that's no good. We're not going to do that at Apple. We wouldn't do that sort of thing out there."

01:05:13   And then, so you kind of say, you kind of say, "No." What, what, what Gurman reports is that John Giannandrea and Craig Federighi haven't presented a unified front.

01:05:25   Giannandrea has signaled that he wants to take a more conservative approach, with a desire to see how recent developments from others evolve.

01:05:32   So that's him saying, "I don't like this stuff. Let's just wait and watch because if we jump into this, we're going to come up with something that we don't like.

01:05:40   So let's just wait and see what happens." And Apple will come in late, as it often does, with something at the right time that they're proud of.

01:05:46   Whereas it sounds like from this, Craig Federighi is implied to be a little more enthusiastic about, "Let's try this stuff and see where it goes."

01:05:54   And so this is, you know, and, and if it's Giannandrea saying these things, he's not necessarily wrong, but the risk you take is you poo-poo something.

01:06:06   And rather than like going down that path and realizing it's not fruitful, you say, "Let's just not go down that path." You risk missing the important path and being behind.

01:06:17   And there's the question, right? Do they take the existing Apple path, which is what Giannandrea seems to be suggesting, or do they take the Vision Pro path, which is what Federighi is suggesting,

01:06:31   of like, "This is not the product yet, but we kind of got to get something out."

01:06:37   Yeah, right, exactly. And who does that? Like, does that go back to Giannandrea's group? And they're like, "All right, I'll give you a chatbot."

01:06:48   And are they enthusiastic about it? I don't know. It just, look, this article's full of red flags for me. That's the thing.

01:06:59   It, it, and I'm not, again, I'm not saying, "Obviously this is a thing Apple should do. They should ship it now. Why haven't they been there?"

01:07:05   That's not what I'm saying. I'm saying I'm a little troubled that they were, they were not even trying to build this, apparently, until late in the year.

01:07:12   And I am deeply troubled by the idea that they, according to Gurman, don't have a concrete plan, but have decided they're going to make an announcement next year, because that smacks of desperation, right?

01:07:21   They're like, "We look bad. We need to talk about this. Let's get something we can talk about." And you know what? It could turn out nicely.

01:07:30   It could be, and I'm just making this up. I don't have any facts about this, but I'm just gonna throw it out there. It could be that the Siri team is stuck in the mud.

01:07:38   And they've poo-pooed all this technology, but they can't ship something that's good. And they're, they're stuck. And that something like this provides a kick in the pants.

01:07:51   It basically says, "You're going to do this," from a high level, maybe from Tim Cook even. It's like, "No, you got to do this. This is important. And I want something, and I want it next year, you know, and it needs to fix this."

01:08:01   Maybe that's it. Maybe something positive comes out of this. It's entirely possible. I just, I'm troubled by the idea that Mark Gurman is reporting simultaneously that they don't have a concrete plan or a strategy, and that they have a date.

01:08:13   Doesn't seem right. Doesn't seem good.

01:08:16   While we were recording today, more information dropped about access to Vision Pro hardware for developers. So, a couple of things. The applications are now open for developer kits.

01:08:31   So you can apply, as they say, you have to submit an application, and you'll provide information about the type of app or experience that you want to build, and you get help setting up the device and check-ins with Apple experts for development guidance.

01:08:45   And also, the first labs start next week. So apparently, there's one in London next week.

01:08:55   James has already applied, by the way.

01:08:59   Oh, I'm sure. This came up in the Discord, so I did my friendly duty. I immediately texted Underscore, and Underscore texted me back and said, "I'm already halfway through my application."

01:09:12   So, you know. It's a widget that's a calculator and drops bananas, and it's all those things.

01:09:19   I mean, it's exciting, but it won't be anything that we can talk about because nobody would be allowed to talk about anything, but it's still exciting.

01:09:25   Right, but still, getting hardware to people, which is something that we said all along seems like it would be important, and here it is. Remember, we used to speculate, like, "Is this a developer kit or not?"

01:09:35   And the answer is, it's a product and a developer kit. It's a little bit of both. They've announced the product, which allows them...

01:09:44   This is something, I know we went back and forth on this, but this is the real master class kind of move, which is announce the product, and then ship a limited version of the product as a developer kit.

01:09:59   Because although the product is mysterious, you're going to get it out to people, and your hardware is going to be in the world, and Apple says, "We can have you return it on request."

01:10:11   Like, "Bring it back. Bring it back. We don't want you to have it anymore." But it allows them to do this developer kit thing, which they couldn't do if the product weren't announced. So it's good.

01:10:20   Mark Gurman's newsletter this weekend was basically a subject that we've talked about here a lot, especially in the run-up to the Vision Pro, which is, "What's the development story for the Vision Pro?"

01:10:29   And this is part of Apple's attempt to really get developers working on Vision Pro apps so that when they launch this thing, the reviews don't say there are no apps for it.

01:10:42   But in the long run, there is this question of whether developers are going to prioritize Vision OS when the volume is going to be so low for probably so long, and that there are some developers who will embrace it because it's a new Apple platform and they want to experiment with it.

01:10:57   But in terms of money-making, right, is it going to be like the Apple Watch store or the iMessage store where there's sort of nothing in there and it doesn't really make sense?

01:11:07   I don't think that's true, but it's a tough sell.

01:11:10   I think that there is a really good, really good story for indie developers because early adopters will just buy so many apps.

01:11:20   They want apps.

01:11:21   Yes, and so if you can get something out and you charge like five bucks for it, there will be like half a million people who are like dying to get some kind of app experience.

01:11:30   Like that's my belief on this because what it reminds me of, Jason, I'm sure you have the same thought, when the app store launched.

01:11:36   Yes.

01:11:37   Which we just had like the 15th anniversary of. I was buying stuff I didn't even want.

01:11:41   And when the iPad launched, bought a bunch of apps there too.

01:11:44   Yes.

01:11:45   Also, Mark Gurman said something in his newsletter that I have been thinking about and have not written about, and so I was kicking myself on one level, but on another level I was happy that he mentioned it, which is, what do Vision Pro apps cost?

01:11:56   And he basically floated the idea that maybe the $1 iPhone app is a $20 Vision Pro app, and that games are priced like console games. And this is the question: because of Vision Pro's cost and because of the limited size of the market,

01:12:12   will you see apps that are much more expensive for Vision Pro than you see on Apple's other platforms?

01:12:18   And as somebody who knows a lot of independent developers, I kind of want that to be the case.

01:12:24   I kind of want there to be a reset here because the truth is, if you're selling a $2 Vision Pro app, guess what? You're never going to make your money back.

01:12:32   You're never going to make your money, not for a decade are you going to make your money back because there won't be enough people to buy it.

01:12:37   So that's, I could see there being some indie apps that are very simple that are fairly low cost, but I could also see the argument that they're going to try to price this at a much higher level because there's such a limited market.

01:12:50   I would love that situation to be true. I fear it might be a little bit wishful thinking.

01:12:57   It feels like wish casting a little bit, yeah.

01:12:59   Because I understand the console game thing, but these games aren't going to be like console games in the way that they operate, and they won't have the cachet.

01:13:09   Yes, but I don't think that's relevant. I really don't. I think the idea is a premium custom experience for a very specific platform. The pricing on the Meta Quest store is higher than on phone. Right?

01:13:27   Maybe. There's more paid stuff. I mean that's maybe the real difference. Most iOS games are free with in-app purchases and most of the Quest games that I've played are paid games.

01:13:43   They're not console prices, but they're more than iPhone prices, and that might be the case, but I just think realistically the train has left the station on pricing, and if this platform becomes bigger and bigger, the prices will just continue to go down.

01:14:02   I mean honestly the model that I look at here is the iPad. Like when the iPad launched, apps and games were more expensive, and maybe that will be how it starts, but it will be a race to the bottom.

01:14:14   But iPad volume was high and the iPad price was low. The volume on this is low and the price is high.

01:14:21   I'm not sure that the price of the hardware makes a difference personally. I think realistically in the economics because…

01:14:30   I do. I do, because not only do you not have very many potential customers, but this person just spent $3,500 on a headset, and that means they've got money to spend.

01:14:40   And you put those two things together, which is you can't sell very many of them and the people that you're selling to have deep pockets. That's why it's a $20 app instead of a $5 app.

01:14:49   I'm not arguing that it might start that way, but I don't think that that is the destiny for this platform.

01:14:55   The problem I have with your argument is that it feels very much like an infinite timescale argument and I'll say sure, if in five years they're selling it at volume, the prices will go down.

01:15:04   Very fair. So I will agree with you. I agree with you.

01:15:07   But the products that launch in the first year of this platform are not going to be part of a strategy for them to lose money for five years, right? Like they're not going to be.

01:15:17   No, no. I'm on the same page. I see what you're saying, but I'm not trying to make one of those infinite timescale arguments.

01:15:25   But I agree with your point, which is that to start with, I hope that developers charge more money. I hope that they will.

01:15:35   Because you've got to have millions of potential buyers for the lower price to make it up in volume, right?

01:15:39   Because there's a fixed cost, which is developing the software and then you've got to make it back somehow.

01:15:43   And if there's only a few hundred thousand even potential customers for you, like wow, that's hard.

01:15:49   And this goes back to what I was saying. I think that there is a money making opportunity because you maybe can charge a little bit more money and people will be hungry for content.

01:15:57   And I'm more optimistic about this platform as a market than I think Mark Gurman is.

01:16:02   Sure.

01:16:03   But it's a tough one, right? And this is, I think, this is the number one reason why it's going to be tough.

01:16:08   And I think the advantage Apple has here is it's got a lot of developers who are familiar with Apple's tools.

01:16:13   And this is an exciting new product, and I really do believe there are going to be some big platform vendors, like Microsoft, who are like,

01:16:24   "Yes, we will support this because this is very interesting and we want to be on all Apple's platforms.

01:16:28   And it doesn't matter if we lose money on this because it's part of a holistic, you know, Microsoft products everywhere kind of thing."

01:16:33   And then you'll have a lot of indie developers who are in it because their costs are not as high and they want to make cool stuff.

01:16:41   And so they can, they can get in there. But yeah, Mark Gurman is not wrong in saying that there's sort of a middle tier of developer who is not going to invest a lot of money building an app for Vision OS if there's not going to be a return, right?

01:16:58   And it's going to be hard for there to be a return.

01:17:00   I actually predict, I don't know about the difficulty of all of this, but like, it sure feels like what you will get is a bunch of stuff that's come over from other VR devices, right?

01:17:11   Like some of that stuff, because it won't be as much cost to bring it over.

01:17:15   But I don't know. It's going to be, and it's going to be a long time, bottom line, a long time before Apple sells as many of these as are in the Meta Quest ecosystem, right?

01:17:24   It's a hard one for that. So we, and we don't know.

01:17:27   Like, I think this is one of those points that is very much open about what the app market on Vision Pro will actually look like over time.

01:17:38   And I think that Mark Gurman's opinion is different to mine and yours, just due to the types of developers we're exposed to.

01:17:47   So we are seeing in our more independent community, like the indie community, people seem to be very excited about developing for this platform.

01:17:56   But I'm expecting the more corporate, larger-scale companies just don't care, because it's like, whatever, let's just see.

01:18:05   We're not going to put effort into this, right? Except for the companies you've already seen, you know, like Microsoft are going to put the Office apps on there because Apple asked them to.

01:18:13   And, you know, and so they're going to do it. Like they were in the keynote, like they'll, they'll get around to it.

01:18:17   And Disney's going to do it because Bob Iger loves Apple.

01:18:21   Yeah. And there's an opportunity. I was surprised. I read somewhere that Netflix says they're not going to bother.

01:18:29   And maybe that's because Netflix doesn't believe in 3D content. But like if I had a catalog of 3D movies, I would want to be on the Vision Pro.

01:18:37   And I think maybe what will happen is that Apple will just have a TV app on the Vision Pro that has access to the iTunes store, essentially, the TV and movie store.

01:18:47   And they'll load that up with 3D movies. But like, if I had 3D content, I would want to be on this thing because that's going to be one of the drivers of this is going to be great 3D content.

01:18:58   And so that would be if I was Netflix, if I had a bunch of 3D content, which maybe they don't have enough to be worth it.

01:19:05   That's what would make me interested in being on that platform.

01:19:07   We can hope that Apple will buy the rights to a bunch of 3D movies and put them on TV Plus. That's what we can hope for.

01:19:13   Sure. But also just in the store, right? Like I would imagine that every 3D movie that has been released, there will be a 3D version of it that you can rent or buy in the Vision Pro version of the TV app.

01:19:24   I think that'll happen for sure. And so you'll be able to get that. But yeah, it would be nice if there was stuff on TV Plus and Disney Plus, right?

01:19:31   All those Disney features, at least. All the Marvel stuff, all the Star Wars stuff. That's all got 3D versions too.

01:19:37   So presumably that'll be there on your Disney Plus subscription or they'll make you upgrade to the 3D version of Disney Plus in order to get it.

01:19:45   Maybe? I could see them doing that, right? Pay an extra $2 to get the 4K 3D version of something. I could see that.

01:19:52   Netflix will put a version of Netflix on it. I'm convinced because they put Netflix on everything.

01:19:56   But they're probably not going to do the work to build like a full-on Vision Pro app.

01:20:01   I guess my question is if Netflix makes an iPad app and they bring it to the Vision Pro, iPad app running on Vision Pro, can it show 3D content? I don't know. Or does it have to do something?

01:20:12   No, it can't. But does Netflix have 3D content?

01:20:16   Well, this is the question. They probably don't have a lot of it and that's why it's not a priority for them.

01:20:20   But I'm sure many of the feature films that come on and off of Netflix are available in 3D.

01:20:27   Yes. Now my question, because I just don't know, is 3D a different rights thing?

01:20:35   I think it depends on the contract and it depends on what you're buying it from.

01:20:41   It's interesting because there were lots of 3D, there were like 3D Blu-rays that were sold.

01:20:47   But then there were also bundles where you get the 2D and the 3D together when there were 3D DVD players and stuff like that, TVs.

01:20:53   So that's the question when you package this out again, because I'm sure everybody out there who's got a catalog is like, "Oh, finally.

01:21:02   We got another place to sell these things for home video." And will it be just like upgrading to HD or 4K?

01:21:11   Do you get upgraded? Do you have to buy the new version? I don't know. It probably varies. It's probably contractual.

01:21:17   They have to supply them, right? I mean, they're not supplying those 3D versions digitally now, I don't believe, anywhere.

01:21:23   Unless there's meta. Maybe there's an app on the Quest Pro that sells digital 3D movies. I don't know.

01:21:30   But there's some possibilities there.

01:21:33   Mm-hmm. So go apply for your developer kit.

01:21:37   I will just say, last thing before we wrap up. James posted this. I'm assuming it's from some kinds of terms and conditions.

01:21:43   "You agree that all access to, use of, and storage of the developer kit will be in a private, secure workspace accessible by you and your authorized developers.

01:21:53   Fully enclosed with solid doors, floors, walls, and ceiling, and locks that can be engaged when the developer kit is in use.

01:22:00   You must ensure that unauthorized persons, including family, friends, roommates, or household employees, do not access, view, handle, or use the developer kit.

01:22:12   When in use, the developer kit should be in your positive control or within your direct line of sight at all times.

01:22:20   You must ensure it is passcode protected. Never leave it unattended.

01:22:24   When not in use, turn it off and store it inside of its locked Pelican case in a locked space that only you have access to.

01:22:33   The developer kit may not be moved from or taken away from its ship-to address.

01:22:38   If you will be away from your workspace for more than 10 days, consult with your Apple point of contact about how to keep it safe while you are away."

01:22:47   This is incredible. I love it.

01:22:50   Do not taunt Apple developer kit.

01:22:53   Do not look at the developer kit in a wrong way.

01:22:56   Can't even be played. Don't even look at it. It can't be played.

01:22:59   Do not say "the" before you say "Vision Pro." It's just "Apple Vision Pro." That's how you must call it. Otherwise, the developer kit will explode into a pile of smoke.

01:23:08   No, there'll just be a knock on your door immediately, and an Apple employee will be outside and they'll say, "Hand me the case. Hand it to me."

01:23:15   I like that little detail of the locked Pelican case just snuck in there of like, "Oh, so that's how they get it to you."

01:23:21   They learned from the big boxes of tools that they would send out.

01:23:27   Yeah, you get the Pelican case. It's locked.

01:23:30   Incredible.

01:23:32   Yeah.

01:23:34   This episode is brought to you by Ladder.

01:23:37   Let's be real. Look, if you're anything like me, we all have a tendency to put some things off until the very last minute.

01:23:44   Whether that's going to the DMV, arranging a dental checkup, or getting to that home improvement project, you know these kinds of things that I'm talking about.

01:23:51   Most of the time, it's fine. Things are a little bit late. You get it done.

01:23:56   But something in your life that you cannot afford to wait on is setting up term coverage life insurance.

01:24:01   You've probably seen life insurance commercials on TV, you've maybe heard one of these ads before, and you thought, "I will get into it later."

01:24:08   But you should not wait. This is something you don't want to wait on.

01:24:11   Choose life insurance through Ladder today.

01:24:14   Ladder is 100% digital. No doctors, no needles, no paperwork.

01:24:18   And you can apply for $3 million in coverage or less.

01:24:20   Just answer a few questions about your health in an application.

01:24:23   Ladder's customers rate them 4.8 out of 5 stars on Trustpilot and they made Forbes' best life insurance list in 2021.

01:24:31   All you need is a few minutes and a phone or laptop to apply.

01:24:36   Ladder's smart algorithms will work in real time so you'll find out if you're instantly approved.

01:24:40   And there are no hidden fees and you can cancel at any time and you'll get a full refund if you change your mind in the first 30 days.

01:24:46   Ladder policies are issued by insurers with long, proven histories of paying claims.

01:24:52   They're rated A and A+ by AM Best.

01:24:55   And since life insurance costs more as you age, now's the time to cross it off your list.

01:25:00   This is why you don't want to wait.

01:25:02   Every day you wait to go and sign up, it's just going to get more expensive.

01:25:06   So, go to ladderlife.com/upgrade today to see if you're instantly approved.

01:25:11   That's L-A-D-D-E-R life.com/upgrade.

01:25:15   One last time, that is ladderlife.com/upgrade.

01:25:18   Our thanks to Ladder for their support of this show and Relay FM.

01:25:22   It's time for some Ask Upgrade.

01:25:26   Questions to finish out today's show.

01:25:30   The first comes from Sasha who asks,

01:25:33   "On your MacBook Air, do you use your screen with default resolution or something else?"

01:25:40   Um, default for me.

01:25:43   Oh, okay. Why do you do that? I'm just intrigued.

01:25:45   Just because you don't change it?

01:25:48   It's enough, I guess.

01:25:50   Which means it's not the one-to-one, right? Because that's too much.

01:25:53   So, it's default. That's just what I use.

01:25:56   It feels like, it feels right. It feels enough for me.

01:26:01   I do more space because on a laptop I never feel like I have enough space for my windows.

01:26:07   I always want more space.

01:26:09   That's fair. I sometimes will switch to more space. I'll even do that on my desktop.

01:26:13   Interesting.

01:26:14   Where if I'm doing a video, like if I'm doing a live video stream and I've got multiple windows that I'm capturing and then I've got to control it, plus I've got to see the stuff.

01:26:22   That's a lot. So sometimes I'll do that.

01:26:25   But I very rarely do that on the laptop. I'm pretty comfortable.

01:26:29   I spent so many years using an 11-inch MacBook Air that it feels so spacious on the 13-inch.

01:26:34   So, for some reason, this is funny, I don't know why I'm doing this.

01:26:38   Oh, I see. Sorry. On my machine here.

01:26:43   So I have a laptop plugged into two displays. On my main display I'm using the default.

01:26:48   On the display that I have to the side I use larger text, just because I actually just need to see Audio Hijack larger.

01:26:55   Right? Because that's just all I need on that screen.

01:26:57   But yeah, on my Studio Display, on my main desk and my main monitor here, I have those at the default.

01:27:05   And it looks nicer. Right? It just looks better. The text looks nicer. Everything's crisper.

01:27:09   But on a 13-inch display I want more space.

01:27:14   I think on the display of the 14 I did use that at the default.

01:27:20   But on the 13 it's just not enough for me.

01:27:23   I think I bumped the text up a little bit though because the text was a bit small.

01:27:27   But I do want more window room.

01:27:30   That's fair.

01:27:31   John asks, "Do you think the Vision Pro will work well outside?

01:27:37   Most headsets with inside-out tracking struggle when they're not indoors. If not, is this a problem?"

01:27:44   Yes and no. Oh no. No and no. No and no is my answer.

01:27:52   I don't think it'll work well outside.

01:27:54   Me either.

01:27:55   And I don't think it's a problem.

01:27:57   Because I just don't think they've made it for that. Like I just don't think this is a headset for outside.

01:28:01   And they'll work it out for when they have other products that they want to be used outside.

01:28:06   Realistically people aren't going to use these outside.

01:28:08   Like why would you use it outside? Like I don't know why you would.

01:28:11   Well I'll tell you why I'd use it outside is that my house isn't very big and I have a patio that actually has more space.

01:28:15   I could probably do much better out there with VR than I can do it in my house.

01:28:20   But if you're sitting down, you know, and you just create a virtual world for yourself, space is infinite if you're at Mt. Hood, Jason.

01:28:26   You know?

01:28:29   Okay. Alright. Sure. Space is infinite.

01:28:32   I mean realistically I understand that some people maybe don't have the space to play certain games or whatever.

01:28:39   So they maybe want to take it outside.

01:28:41   But I just don't think that this headset, the main use cases require space.

01:28:47   They don't really seem to be optimizing for that.

01:28:51   Like draw your room boundary and stuff like that.

01:28:53   Like it doesn't really feel like this is what they're doing right now.

01:28:57   So I don't think it's a concern for right now and they will just solve this problem for their technology that they're expecting you to wear all the time.

01:29:06   Right.

01:29:07   And this isn't that.

01:29:08   Drew asks, "Now that the two of you have been posting clips to TikTok, I'm curious if it's given you a different perspective of using the app.

01:29:17   I know Mike has talked about not being a fan of the algorithmically driven timeline in the past, but I'd love to know if your opinions have changed at all."

01:29:26   Well we should say, we're not posting clips to TikTok.

01:29:30   Correct.

01:29:31   We have our people. We have our people do that.

01:29:32   Yes, we have people.

01:29:33   Tim Cook can call those people.

01:29:35   We're on our second person. So it is people.

01:29:37   It's true. We have people who do that work for us.

01:29:42   The TikTok work, the Instagram work, hopefully the YouTube work at some point here.

01:29:47   All that posting stuff is not done by us.

01:29:51   So it hasn't changed my perspective of TikTap.

01:29:56   That's a new one. What's TikTap?

01:29:58   I don't know, but maybe we could start that company.

01:30:00   It's a dance style. It's like a tap dance style. It's a very TikTap.

01:30:05   It's a very special kind of style.

01:30:08   Or is it tap dance TikTok? Is that TikTap?

01:30:11   I'm sure there's tap dancing TikTok.

01:30:13   I have the app installed and I've looked at it. The thing that I talk about is not only is it the algorithmic timeline, but I'm also uncomfortable knowing that if I linger on a video at all, it's like, "Aha, you lingered on that video."

01:30:26   I don't like that. It makes me really uncomfortable that it's trying to use my non-interaction as a sign of my interest.

01:30:36   I find that just deeply uncomfortable. I don't like it. I don't use it.

01:30:40   I don't do a lot of video stuff. I don't leave the sound on my devices very much.

01:30:46   For me, video on a mobile device is a very intentional thing where I'm sitting down to watch something and TikTok feels very much like a casual "I'm just going to flip through with sound on."

01:30:57   I don't like that. I don't like flipping through with sound on anywhere.

01:31:03   So it's not really for me. But it's for other people and that's great.

01:31:07   I don't need another time suck in my life. I really don't.

01:31:10   Yeah, I'm not into it either. There was an episode of Sharp Tech recently, which is Ben Thompson's one of Ben Thompson's shows with Andrew Sharp.

01:31:18   They were talking about Threads and algorithms, and how people say a non-algorithmic timeline is what they want.

01:31:31   Ben is saying that it's been shown time and time again that there is a difference between stated and revealed preference.

01:31:40   Ben was really going on about how people say they don't want the algorithmic timeline, but it's been shown time and time again that they do.

01:31:50   This, to me, feels like one of these replication crisis studies where I don't think it's as simple to say that because people engage with algorithms that they want them.

01:32:06   I agree. I get Ben's point, which is if you put an algorithmic timeline in, it drives engagement. And that has been proven.

01:32:16   It's unequivocal, but I don't want it.

01:32:20   You could argue part of the engagement is that you're scrolling because you're trying to find your stuff and you can't find it.

01:32:24   And that's engagement, but it's bad engagement.

01:32:26   But also, it sucks me in and I don't want that. That's the thing. I, Mike Hurley, don't want that to happen to me.

01:32:34   It makes the product less appealing.

01:32:36   But then I'm less likely to want to use it. So I tried a couple of weeks ago.

01:32:40   Because what is this all like? I didn't want to use TikTok because I didn't want to have to start all over.

01:32:46   Did you say, "What's all this then?"

01:32:49   "What's all this, TikTok?" I said.

01:32:51   So I was trying out Instagram Reels for a bit.

01:32:56   Lime is over Lemus, I should say.

01:32:58   And it did not go in a way that I wanted. It started diverging into areas that I wasn't interested in. Way too much inappropriate content, realistically.

01:33:15   And it's like, "I don't know why I'm here and there's now no way I can seem to get out of this."

01:33:19   "How can I escape this?"

01:33:21   So it quickly revealed to me, I see how someone could get sucked into this.

01:33:29   And they're just like, "Well, I'm just going to keep watching and watching and watching and watching."

01:33:32   Like I know so many people do. And I'm not judging you. I'm saying for myself, I don't want that.

01:33:36   So yes, I could imagine a scenario in which an algorithm like this could capture me and I'm just watching, watching, watching.

01:33:42   But I, Mike Hurley, am saying before that happens, I don't want that.

01:33:46   No matter what my brain ends up doing to me, my logical brain will say, "I don't want that."

01:33:51   And there is a difference between the logical brain and the lizard brain, right?

01:33:55   That will just do whatever it wants because it wants the dopamine rather than a sensible version of me who's like, "No."

01:34:01   It's the same as junk food, right? I want to eat it when it's in front of me or when it's happening.

01:34:08   But realistically, I know I don't want that because I know I don't feel good afterwards.

01:34:11   And so it's like, that's the same. Like I kind of see that like for me, my opinion is like TikTok and Instagram Reels and these kinds of like pure algorithm-driven video things.

01:34:24   They are junk food. And like, so it's like there's no real content to it.

01:34:28   Like I want to be able to choose the people I follow and I watch the videos from them and I like it like in the main feed because I've made that choice.

01:36:36   But this is just like a conveyor belt. Like I always think of Homer Simpson being fed the donuts in hell, right?

01:34:42   That's what this is. Like, and just giving it to me, just giving it to me.

01:34:46   And oh, I just keep taking it because I love the donuts in hell. But like I don't realistically want that.

01:34:53   And so like I don't think it is as simple to say that like because algorithms show engagement, then like that is your revealed preference.

01:35:01   It's like, no, it shows that there is a part of my brain that wants it, but the whole human of who I am, I don't want it.

01:35:08   That's how I feel. So no, I'm not trying it.

01:35:12   So I think when somebody like Ben says it works, like the problem is what does that mean?

01:35:20   And what does it mean for the product? I agree with you, by the way, about it does feel like the endless conveyor belt.

01:35:28   And this isn't good for me, but it presses all the right buttons.

01:35:32   It's a, you know, it's the tobacco industry argument, right?

01:35:37   It's like, well, it's super addictive. Isn't that great?

01:35:39   It's like, well, I mean, it's great that you have a product that's addictive because your people will always come back.

01:35:43   But it's is it great for society and the people? No, but it's good for your business.

01:35:47   I think, though, the broader point would be that, first off, "algorithm" doesn't necessarily mean what it means

01:35:57   as implemented on some of these services. And also different users are different.

01:36:03   I would argue, when you're talking about Threads, for example, like power users, creators,

01:36:09   there are all sorts of people who take control of the product and they're important to the product

01:36:15   because they make the content on it and they curate content and they have followers.

01:36:20   And there are people who use these services who follow, make the effort to follow people and curate their experience.

01:36:28   And so first off, if they make the effort, let them see the results of the effort.

01:36:35   Let them see the stuff they want to see because they made the effort.

01:36:38   Now, a lot of people aren't like that, right?

01:36:41   There are a lot of people who don't know who to follow. Like Twitter.

01:36:43   One of the biggest problems with Twitter is who to follow and how do you get the timeline to be interesting?

01:36:48   And the solution for all of these things, for TikTok and Twitter and Instagram and anything else is you don't have to follow anybody.

01:36:55   We just look at what you like and what you're interested in.

01:36:58   Maybe we ask you to click on some subjects that you're interested in.

01:37:00   We just show you stuff. And whenever you want to look at stuff, we got stuff for you

01:37:04   because it's constantly being generated and we'll just show it to you.

01:37:07   And so I get that that's a problem that is solved by the algorithmic timeline,

01:37:11   but there are some people who don't want that.

01:37:14   And then separately, algorithmic timeline can mean different things to different people.

01:37:19   One of my realizations on Threads is that if I just loaded Threads, I could see people that I know.

01:37:24   But if I made the mistake of reloading it and Instagram is the same way,

01:37:28   the reload signal, I know I mentioned this a couple weeks ago,

01:37:30   the reload signal means I've seen all of this, show me something new.

01:37:33   And what ends up happening is all the stuff from people you know goes away

01:37:36   and it's all replaced with stuff from people you don't.

01:37:38   And I hate that.

01:37:40   Is there value in saying, okay, here are the people you know,

01:37:44   and then here's some other content, whether it's in a separate location

01:37:47   or it's interspersed in your timeline.

01:37:49   I loved Nuzzle, which was a thing that looked at your Twitter lists

01:37:54   or your Twitter timeline or people you follow on Twitter,

01:37:57   like second level and the links that they generated.

01:38:00   There are algorithmic things you can do to say,

01:38:03   here's a thing from a person you don't follow,

01:38:05   but somebody you follow follows them and they liked it.

01:38:08   And this is what Twitter has done.

01:38:10   They liked it or they retweeted it or whatever.

01:38:12   And you see it and it's a lighter version of the algorithm.

01:38:16   Or you do scroll down to the end and there's nothing more.

01:38:20   And the goal of the service is to keep you there.

01:38:22   So they start feeding you other stuff that's related.

01:38:25   Like there are gradations in it,

01:38:28   but I personally prefer to take control of my content.

01:38:34   And the infinite algorithmic scroll makes me, I just, I don't like it.

01:38:40   And like I said, I also really disagree with the idea

01:38:43   that if you're not interacting with the content, it should still count as interest.

01:38:49   I get why they do it, but me pausing on a video should not be a signal.

01:38:53   Me liking a video should be a signal,

01:38:55   but I believe TikTok is always watching.

01:38:58   So I don't know.

01:39:00   If you like it, great.

01:39:03   It's not for me.

01:39:05   And we're there because we know that people like it.

01:39:07   And that's awesome.

01:39:09   But I always would prefer services to offer algorithms

01:39:13   for people who don't want to do the work.

01:39:15   And for the people who want to do a little work

01:39:17   and do a little curation, let them.

01:39:19   Do both. Why not both?

01:39:21   And to Instagram's credit, Threads is supposed to add a view

01:39:25   for your view that you take control of.

01:39:29   That's supposed to be coming.

01:39:31   And that's good because I find Threads unusable now.

01:39:34   I've seen a screenshot of it.

01:39:36   I think it was possibly Adam Mosseri who shared a video

01:39:40   and you could see on his timeline, it looked like the Twitter one,

01:39:44   where you could swipe left and right

01:39:46   between the For You and Following feeds.

01:39:50   On the official Twitter app, I don't know if you've seen this,

01:39:53   but you could swipe left and right.

01:39:55   It's going to be like that, it looks like.

01:39:57   By the way, Adam Mosseri finally came out as an Android fanboy,

01:40:00   so everybody who's waiting for an iPad version of Instagram or Threads,

01:40:03   forget it. It's never going to happen.

01:40:05   The website is the best you can hope for.

01:40:08   Did you really expect that there was...

01:40:11   No, but him just doing a post and saying,

01:40:14   "Android's better than iOS," I was like, "Well, forget it."

01:40:16   He's not somebody who cares about Apple platforms,

01:40:19   so just get ready for the web. And that's fine.

01:40:21   A usable web version would make that site...

01:40:25   I know in our outro we say that I'm on Threads.

01:40:28   I am, but I almost never look at Threads

01:40:31   because I can't get to it through the web.

01:40:34   I'm on a Mac or an iPad,

01:40:36   and I don't like using the iPhone app on the iPad either.

01:40:39   It's terrible.

01:40:41   I'm not convinced that he was being real with it,

01:40:44   because he uses an iPhone.

01:40:46   I don't know. I don't know what to tell you.

01:40:48   I know he said it in that thing, but the guy uses an iPhone.

01:40:51   Yeah, I don't know.

01:40:53   Okay, so he's a troll or whatever.

01:40:56   Maybe he just likes it, but he uses them.

01:40:58   I took that as a signal.

01:41:00   Even the guy in charge is just not interested in Apple's platforms,

01:41:03   and they're only on the iPhone because they have to be.

01:41:05   And it's like, fair enough.

01:41:07   I just want to put a stake in the heart of the hope

01:41:11   that there will ever be anything for Mac or iPad

01:41:14   from these companies, because there's not going to be.

01:41:16   And that's fine, because the web is there.

01:41:18   Just make it so I can use it on the web,

01:41:20   and I will check in a couple of times a day on my computer.

01:41:25   But until then, also, I want to point out,

01:41:28   I know why Apple does it.

01:41:30   I know why they do it,

01:41:32   but using an iPhone app on the iPad is so terrible.

01:41:35   Yeah, I know. I've been using it.

01:41:37   At least I have an iPad Mini, so it's not so bad,

01:41:39   but it's still pretty bad.

01:41:41   I feel like, could you fake it where you put up the right keyboard

01:41:45   instead of the little iPhone keyboard?

01:41:48   Because it's unusable. It's unusable.

01:41:51   Also, I don't think it's the right size class.

01:41:53   Like, they could use the biggest iPhone,

01:41:55   and I don't think they do.

01:41:57   I don't know. I get why they do it.

01:41:59   They're punishing those apps.

01:42:01   It should hurt to use an iPhone app on an iPad,

01:42:04   but I don't like it.

01:42:06   If you would like to send in a question for us to answer

01:42:09   in a future episode, just go to upgradefeedback.com,

01:42:13   and you can send in an Ask Upgrade question.

01:42:15   But you can also send in your feedback

01:42:17   and your follow-up there as well,

01:42:19   including if you have any anonymous information for us.

01:42:22   You can check out Jason's work over at sixcolors.com.

01:42:26   You can hear his podcasts at theincomparable.com

01:42:29   and here on Relay FM.

01:42:31   You can listen to my shows here on Relay FM

01:42:33   and check out my work at cortexbrand.com.

01:42:36   You can find us on Mastodon and Threads.

01:42:38   Jason is at jsnell, J-S-N-E-L-L,

01:42:41   and I am at imyke, I-M-Y-K-E.

01:42:44   You can also find the show on Mastodon.

01:42:46   We are at Upgrade on relayfm.social.

01:42:49   You can watch video clips of the show

01:42:51   on our Mastodon, but also on TikTok and Instagram

01:42:54   and kind of, sometimes a little bit on YouTube,

01:42:57   but we're working on that.

01:42:59   TikTok and Instagram are the places to go.

01:43:01   I did mention it on last week's episode,

01:43:03   but I do really recommend it.

01:43:05   There is a very, very, very funny video

01:43:07   that we put up last week

01:43:09   of us singing Immigrant Song,

01:43:11   which wasn't in the show in its entirety.

01:43:14   It was very funny.

01:43:15   I recommend people go watch it.

01:43:16   I'll put a link to it in the show notes.

01:43:18   Thank you to our members who support us

01:43:20   with Upgrade Plus.

01:43:21   You can go to getupgradeplus.com

01:43:23   and you can sign up there,

01:43:24   and you'll get longer ad-free versions of the show.

01:43:27   And thank you to Ladder, Vitaly, and Factor

01:43:31   for their support of this week's episode.

01:43:33   But most of all, thank you for listening.

01:43:35   Until next week, say goodbye, Jason Snell.

01:43:38   - Goodbye, listeners.

01:43:40   (upbeat music)
