365: Apple's Compromise


00:00:00   [music]

00:00:12   From Relay FM, this is Upgrade, episode 365. Today's show is brought to you by Pingdom, DoorDash, and ExpressVPN.

00:00:21   My name is Myke Hurley and I am joined by Jason Snell. Hi, Jason!

00:00:26   Hi, Myke. How are you?

00:00:27   I'm good. I'm happy to be back. I wanted to extend...

00:00:29   Welcome back!

00:00:30   ...my thanks to you and of course to Julia Alexander for joining the show last week.

00:00:34   It was a really great conversation. I enjoyed it greatly.

00:00:37   I'm glad you did. I haven't spoken to you since you listened. I sent you a link to the file when it was ready and you said, "I will listen on Monday."

00:00:44   As Upgrade listeners would listen.

00:00:45   As intended. I like to be a listener every once in a while. It's a nice experience.

00:00:50   I have a #SnellTalk question. It comes from Ben and Ben wants to know, "When shutting down or rebooting your Mac, do you ask it to reopen windows when you log back in?"

00:00:58   So you keep everything where it was when you restarted.

00:01:01   I didn't know we were going to just jump right into a power user tip.

00:01:05   But here it is. Here's your power user tip for the day.

00:01:08   Okay.

00:01:09   #powerusertip. No. Ben, I think I'm going to use my like wise sage voice.

00:01:17   Ben, you're asking the wrong question.

00:01:22   The right thing to do is to not be given the choice. Hold down the option key and just shut down or restart, and then it doesn't ask you anything.

00:01:34   Power user tip.

00:01:35   But what happens then though?

00:01:39   I think it may... Is reopen a setting or is it just in that dialog box? I think it's a setting?

00:01:50   Yeah. Well, I've just clicked the restart button, which is a horrible thing to have done while we're recording.

00:01:55   Oh, don't. What? No.

00:01:57   No, wait, wait. I have 53 seconds.

00:01:58   No, are we restarting the podcast? From Relay FM, this is Upgrade.

00:02:02   I only have 48 seconds left. It says "Reopen windows when logging back in." That's a checkbox.

00:02:08   Yes. Yes, that's true. I just am wondering if there's any way to do it. I think that is the default when you restart by holding down the option key.

00:02:15   My point is I don't want to be asked.

00:02:18   Right.

00:02:19   So whatever it does, it does because I don't want to be asked. I don't pull down the shutdown or restart menu and have a question asked.

00:02:28   I feel like we're digging too deep into this one. "Do you like your windows to be reopened?" is maybe a better question.

00:02:34   I don't care. I'm just saying all I care is that I shut down or restart immediately when I do it. That's why I hold down the option key.

00:02:40   Whatever happens then is just up to Apple, I guess. I don't care. I literally don't care.

00:02:46   I have a bunch of startup items set anyway. It's fine. I just don't care. That's why I hold down the option key.

00:02:54   People who don't care just have to care about holding down the option key.

00:02:58   Fair enough. If you would like to send in a question for us. Oh, by the way, people in the Discord are freaking out.

00:03:04   I have obviously pressed the cancel button at this point. They're already worried that things are about to show up.

00:03:09   You might restart.

00:03:10   Not restarting. It's okay. If you would like to help us open an episode of Upgrade, just send out a tweet with the hashtag #snelltalk or use ?snelltalk in the Relay FM members Discord.

00:03:21   I have some Apple Store updates for you, Jason.

00:03:24   I actually think this is a very interesting story and I have some thoughts.

00:03:27   But tell the listeners, Myke, what happened on the Apple.com webpage, as it were.

00:03:34   The first Apple Store update is that there is one again.

00:03:37   For many years, I think this was an Angela Ahrendts change, they removed the dedicated Store tab on the Apple.com website and you would just, when you were on any product, you could just click to buy it.

00:03:51   Yep.

00:03:52   Which was a choice and we've got, you know.

00:03:55   I have, okay. Let me just say this. This is a perfect example.

00:04:02   I think that has a reflection in other things that Apple does from time to time that are dumb.

00:04:09   Where Apple's got a really idealized idea, you know, concept for what a thing should be.

00:04:15   And they're like, "Oh, well, it should be like this."

00:04:17   And it's usually spoken in sort of like heady kind of design and information philosophy kind of jargon about like, "Oh, well, the whole..."

00:04:25   In this case, the ridiculous thing that they were trying to do was, "Well, the whole site's really a store and you can buy from any page. So why do you need a store?"

00:04:34   And the answer is, because people want to buy things and they want to click the button to find where the things are to buy.

00:04:38   And we've spent the last few years going, "Well, if I need a Mac accessory, if I need like a cable or something, what do I do?

00:04:45   I guess I go to the Mac page where it's going to try to sell me Macs and then maybe I can click on accessories there.

00:04:49   Or maybe I have to click on a Mac and then like, well, I just want to buy the thing."

00:04:54   And this is, I feel like such an Apple move of saying, "Well, we're just going to abstract this because who needs an actual store? That's not elegant.

00:05:03   Our store is throughout the site."

00:05:05   And it is a beautiful philosophy.

00:05:12   But when it meets the real world, it collapses.

00:05:18   It's like a folding chair.

00:05:20   It just is down because people want to buy it.

00:05:23   Like they go to your site, they're not looking for an experience. They want to buy a cable.

00:05:28   And so you should be able to say, and like every other site on the internet, that's the other part is it collides with the reality of every other site on the internet, right?

00:05:35   Which is to buy things. And Apple's like, "Oh no, no, we're above that. We're above that.

00:05:39   We're an experience with products and then eventually you'll give us money."

00:05:43   But it's bigger than that. And the fact is everybody's trained to just, "Can I just find the thing I want?"

00:05:49   And so they have finally kind of, because somebody at Apple's, it's their job and they're judged based on online sales through the apple.com site, right?

00:05:58   And they're like, "Uh, this isn't working. I want a store page and tab because people can't find where to buy things and we want them to give us their money."

00:06:12   So to me it's just a perfect encapsulation of Apple having these kind of highfalutin ideals about how things should work, colliding with the reality of how things actually work.

00:06:25   I think that this was an extension of the whole "meet at Apple" idea, right?

00:06:30   You remember the whole like Angela Ahrendts thing of like, "We don't call them Apple stores anymore."

00:06:35   It's just like you meet at Apple and then it's like the website itself wouldn't have a store page because the whole experience is there.

00:06:42   I don't know for sure. I think this predates Angela Ahrendts though. I think this is an even earlier thing.

00:06:48   But regardless it is from the same, it's cut from the same cloth, right? Which is, apple.com is an experience and you just wander from thing to thing and then eventually a product will hit you in the face and then you'll buy it.

00:07:00   And that's just not like, no. I mean, I think probably somebody at Apple said, "People are so frustrated with our site that they just go and buy it on Amazon."

00:07:09   Because on Amazon there's just a box and there's products and then you buy a product and you're done.

00:07:13   Whereas ours is more like a little adventure game where like, "How do I buy a product on apple.com?"

00:07:18   Can I not? Do I have to go to Amazon for that? I mean, of course you can.

00:07:22   But they've chosen to hide it and make it more of a mystery to click around. It's like playing Myst or something.

00:07:30   It's, you know, "Oh, it's an adventure. Where is the bag? Where is the checkout button? Well, let's find it. How do I find that lightning cable? Hmm. Well, just click on iPad and see what..."

00:07:44   "Oh, no. No. You have an iPad Pro. It has USB-C. Click somewhere where there's lightning. Click on iPhone." I know it's not that bad, but it's just, it's really frustrating.

00:07:52   Because this is such an obvious glaring thing where Apple just tried to be better than the internet and the internet said, "No, you are on the internet. You need to be what people expect from your website."

00:08:03   So I found an article somewhere that said that it was removed in 2015, and Ahrendts joined in 2014.

00:08:10   Okay. So there you go. We'll lay it down in Angela Ahrendts' lap then.

00:08:15   But I think it does go hand in hand with the whole idea of what she was trying to create. And I think there were good things that came from it and bad things that came from it.

00:08:24   And I think some of that abstraction of like, "We don't have a store. It's not a store," I don't think was right. But some of the design stuff is fantastic.

00:08:32   So here's the thing: Angela Ahrendts was the number two, I think, or maybe she was the CEO, at Burberry.

00:08:42   And so her hire was very much like Apple as a luxury brand, right? I think her hire, we've talked about this before, her hire goes—

00:08:50   It was the right person at that time. That's what they were trying to be.

00:08:53   Yeah, for what they were trying to be. And it goes hand in hand with like making a solid gold Apple watch, right?

00:08:58   It was the idea of what can we learn from these luxury brands, which is funny because Apple stores do better in sales per square foot than luxury brands do, right?

00:09:07   Apple stores do better. So maybe the luxury stores should learn from Apple and not from Apple learning from them.

00:09:13   And this is a little like that too, which is that it's part of this kind of exclusive, it's like, "Oh, well, you know, if you have to ask how expensive it is, you can't afford it," kind of approach that like—

00:09:26   But within Apple, I think, Apple's true if it's really honest with itself. I think Apple's personality as a company is not—

00:09:38   Let me put it this way. It's further away from luxury brand and a little bit closer to hard sales, right?

00:09:47   Like I think Apple cares more about getting your money than maybe it wants to show or admit to itself. And the whole luxury thing was part of that, which is like, "We don't need to do the hard sell."

00:10:02   Remember when the iPhone sales sagged and they suddenly realized that they needed to actually hard sell on iPhones because they had just tried to not?

00:10:11   But that's an example where they're like, "Well, they just turned on a dime because in the end, there is somebody at Apple going, 'Where's my money?'"

00:10:17   So this is like that. This just feels very much like that, which is they like to think that they're above it all, but in the end, they really do want your money.

00:10:26   And I'm okay with that. Like as a user—it's going to be my user story of the day, right?

00:10:34   If you've ever built a website or probably software too, you've had the user story, which is how do you explain the feature you want? And the answer is you have to phrase it as, "As a user of Apple.com, I want to buy something. Where is the store?"

00:10:49   It's pretty simple, right? I want to buy something. If I go to Amazon, I type in, you know, lightning cable Apple or whatever it is, iPad Pro, and hit return, I get everything that they're selling.

00:11:00   And then Apple, it's like, "Hmm, you got to figure it out." I've had multiple friends say, you know, "Where do I go?"

00:11:08   We go to Apple.com, I'm like, "Okay, you got to click to Mac and then you should see..."

00:11:13   So anyway, they got over it and good for them because it was dumb that it went away. And I thought it was dumb at the time.

00:11:21   And thank you for allowing me—thank you, Apple, for bringing this subject back so I could beat it to death.

00:11:26   But good job. You brought back a store tab. You should have never gotten rid of it.

00:11:32   And with this revamped store tab, we have two new products. We have the Magic Keyboard with Touch ID for M1 Macs.

00:11:40   Not new.

00:11:42   Well, new for you to be able to buy it.

00:11:45   If you don't have an iMac, it's new to you. That's right. True story.

00:11:48   They also revamped the Magic Trackpad with its different curves that it has.

00:11:55   That's right. All of the new input devices that were previously only available with the 24-inch M1 iMac are now available, silver only.

00:12:05   Silver only.

00:12:07   Right, because why would you sell a color when you can sell not a color?

00:12:12   But this is great. Like, if you've got a Mac Mini, I think that's—or a docked M1 laptop.

00:12:20   Also, by the way, I was talking to somebody about this who was concerned about buying this because they want a new Apple keyboard,

00:12:27   and they're going to buy an M1 Mac or some other Apple Silicon Mac at some point, but not yet.

00:12:32   And they were concerned about this.

00:12:35   These keyboards work fine. It's only the Touch ID that doesn't work with Intel Macs.

00:12:41   It works as a keyboard just fine.

00:12:43   So if you want to get one now because you need a keyboard and you know eventually you'll get an Apple Silicon Mac,

00:12:52   which you will if you're going to stay with the Mac, don't worry about it. You can get it. It works fine.

00:12:59   It just doesn't do the magic stuff of Touch ID. But that's okay.

00:13:03   Ignore Apple's compatibility. Like, on the website, they have that because they still sell the other one,

00:13:11   and it's just to stop people getting confused. They work. They just don't do the authentication part.

00:13:17   Exactly. That part doesn't work unless you've got an Apple Silicon Mac.

00:13:21   But I have to imagine it was all volume that they were shipping.

00:13:29   They had only made enough to get into the iMacs that they were making, and there was no overage.

00:13:35   They were ramping up iMac production, and they were ramping up keyboard production.

00:13:40   And now they've gotten to the point where they have enough overage, right,

00:13:44   that there are keyboards that don't have iMacs attached to them, that they can do this.

00:13:48   Because it was frustrating for a while there because I definitely heard from people who are,

00:13:54   they have one of those Mac Minis and they love it on external display.

00:13:56   And it's like, it's perfect for that device. And they work. They always worked.

00:14:00   You just couldn't get one. So now you can get one. That's a good thing. It's a very good thing.

00:14:06   And there are new GPUs available for the Intel Mac Pro. They're very expensive.

00:14:12   They're based on AMD's new Radeon Pro series, the W6000.

00:14:19   Extremely expensive. The people who buy them probably don't care so much about how expensive

00:14:23   they are, but it is an example of Apple making new. And these aren't just like,

00:14:27   "Oh, you can buy a card and stick it in." These are MPX modules. They're the whole thing.

00:14:33   And there's, you know, we've talked about the rumor that there's probably a new iteration of

00:14:40   the Mac Pro coming with a new generation Intel Xeon processor. So, I mean, this is good.

00:14:49   I think the idea there is that they wanted to support this Mac Pro and not just kind of

00:14:54   ship it and forget it. So they're still updating its components. And it makes me wonder if it might

00:14:59   be updating components for the Intel Mac Pro for a while, right? Because the people who buy these

00:15:05   things are making a large investment and they, you know, Apple can move forward with Apple Silicon

00:15:14   and still put out MPX modules for the Intel Mac Pro, right? For years. And that, I think,

00:15:22   I hope that's what they do, right? Because the people who are buying these systems, they just

00:15:26   want them to be good and fast and work for them for a long time because they spend a lot of money

00:15:32   on them. - I mean, maybe it's too soon, but to me, this just feels like the Apple Silicon Mac Pro

00:15:41   will support this. That's just how it feels to me. I feel like it's a lot of work to offer so many,

00:15:48   like you have three available. Like it just seems like a lot of work. - Who's to say, given that they

00:15:51   threw away the trashcan Mac Pro after one iteration, you know, and they said they would do better?

00:15:56   You never know. It would be, I mean, it's happened before that Apple's created a whole

00:16:03   connectivity spec, like this MPX module kind of thing, and then thrown it away. But you would hope

00:16:11   that that rumored Apple Silicon Mac Pro that's like a mini Mac Pro would support an MPX module,

00:16:20   if not two, right? You would hope that they would extend this. As for when we see that thing, I mean,

00:16:27   the more that happens with the Intel Mac Pro, the further back I imagine that other product will

00:16:33   exist, which is fine because it seems like it's the hardest engineering challenge for Apple to do

00:16:37   a Mac Pro using their own chips. So maybe that's the very end of the transition process. So end of

00:16:45   next year, maybe for that? - I'm choosing to have faith on the MPX stuff. - Yeah, I mean, for people

00:16:54   who love this stuff and haven't listened to ATP last week, as you might expect, ATP talked about

00:16:58   this an awful lot, and it's all in there, right? Like you would hope that Apple is not essentially

00:17:06   reneging on what they promised pros, which was that they were actually going to stand by and

00:17:10   support these devices. So I think this is Apple making good on that by releasing new GPU modules,

00:17:18   and they're very expensive, but there you go. The whole product is extremely expensive. That's just

00:17:23   what it is. - Yeah, I do wonder why they don't have versions of the consumer graphics card,

00:17:31   like the newer consumer GPUs, like why they always go for the pro stuff, but maybe it's just purely

00:17:37   because they need to make a bunch of money from it, so this is what they go to. Because the new

00:17:41   consumer GPUs are all incredibly powerful. They could make versions of those as well, but they

00:17:46   seem to choose not to. - Well, if I have a criticism of the ATP discussion, it's that it's

00:17:55   because it's John Siracusa and he plays games, it gets skewed toward games. And Mac Pros are not

00:18:00   meant for games. You can do it, but they're not meant for games. They're not meant for boot camp

00:18:05   and games. They're not. They're meant for a very narrow set of business needs that businesses buy

00:18:14   incredibly expensive computers that have, and then spend money on these incredibly expensive cards

00:18:20   to do whatever it is. And I don't even know what all of those things are. Is it 3D rendering? Is it

00:18:28   biotech analysis? I don't know what it is exactly. It's a lot of vertical categories. And so my guess

00:18:37   is that the people who are doing this inside Apple are aware of who their core customers are and what

00:18:43   they want. And what they think they want are this class of GPU. - Or maybe they just can't get any

00:18:50   of them because nobody can get the consumer ones. - Or maybe, I mean, that's what John said about

00:18:55   these cards is that they cost a fortune, but you can get them. And so they're like pricing in the

00:19:00   scarcity of it. That's fair enough. Fair enough. And anyway, I think most people don't care because

00:19:08   most people aren't Mac Pro users, but it is kind of interesting to see how Apple handles this market

00:19:13   and the people who do care, care a lot. - All right, let's handle some upstream headlines.

00:19:17   We've got some news, especially from Apple as well, before we continue with this week's episode,

00:19:21   of course, in upstream, we take a look at some of the news and streaming media and streaming media

00:19:26   services. Apple has acquired the rights to Argylle from director Matthew Vaughn. This is a movie

00:19:32   with a huge cast, including Henry Cavill, Sam Rockwell, Bryce Dallas Howard, Bryan Cranston,

00:19:38   Catherine O'Hara, John Cena, Dua Lipa, and Samuel L. Jackson. Cost Apple $200 million.

00:19:43   - And here, here, Myke, is the key thing is it's just stated outright that the goal of this

00:19:53   is to create a franchise. This is a future intellectual property play. They want this to be

00:20:01   like, people say James Bond, but like, let's say the Bourne movies, right? They want it to be this,

00:20:08   you're not just buying this movie. I get the impression that you're buying into this as a

00:20:13   franchise. - I, yes, of course. The thing that surprises me about this though is I don't really

00:20:20   understand how this movie ended up ever getting in front of streaming services because of that cost,

00:20:26   like this is a massive blockbuster, surely, right? Like if you saw that on a poster,

00:20:33   there's a huge cast. It's just surprising to me. I don't know if it's maybe because of, you know,

00:20:40   concerns and nobody knowing what the future of cinema is gonna be like, et cetera, et cetera,

00:20:46   but I'm still really, I'm just surprised. This isn't a movie that's done, right? It's not like

00:20:51   it's done and then they can't put it in the cinema. - They haven't shot it yet. - No. So,

00:20:56   it's just a surprise to me. - So this is Apple's film, Apple Films, whatever sub-brand,

00:21:05   whatever it is. I'm unclear, I mean, this may be a theatrical debut and then straight to Apple TV+

00:21:12   kind of thing. - Yeah, but if they do that, it will be quick to Apple TV+, right? - Of course.

00:21:16   - You know, so really it's an Apple TV+ thing. Even if they put it in cinemas. - Yeah, but that

00:21:23   might be the future of all cinema, all movies is that you have a very narrow window in theaters.

00:21:29   What Julia was saying last week is run three weeks and then you're done basically. You've made all

00:21:34   your money that you're gonna make. - But that's always gonna be less box office money, right?

00:21:38   - Yeah, I'm fascinated by it. Just, I think one of the untold stories of this era right now

00:21:45   is everybody who doesn't have franchises trying to make franchises because we live in an era where

00:21:52   the big franchises, and Marvel is the biggest at this point, just are machines that throw out

00:21:59   billions of dollars with every release and every company wants, who doesn't want a machine that

00:22:06   you press a button and a billion dollars comes out? That's pretty good. - Yeah, I think that

00:22:11   there's a bit of a, like a fool's errand in this. It's just like, you can't all create Marvel.

00:22:19   Marvel's Marvel and that's that, right? - Yeah, oh, I agree. - Yeah, so I think. - I agree. I think

00:22:25   that there's a good conversation to be had about why you can't do that and you especially can't do

00:22:30   it if you're trying. It's like the watched pot never boils. It's like the franchises happen

00:22:37   and then you take advantage of them and I feel like if I were, this is hilarious, but if I were

00:22:44   in a position where I was acquiring content for a streamer and I was looking for franchises,

00:22:52   I would probably be making, I don't want to say small bets, but like medium bets, not big bets.

00:23:01   Like I would not do what Amazon's doing with Lord of the Rings. - No, that seems like a bad idea to

00:23:06   me. - Because I mean, it is a pre-existing franchise, so I've got, they got me there,

00:23:10   but it is one big swing and as a baseball fan, I will tell you that your percentage chance of

00:23:19   getting a hit in any at-bat is low and the same goes for this kind of stuff and so I would rather

00:23:25   take a bunch of swings and then find the ones that are the hits and cultivate them and try to build

00:23:33   them up, then, you know, which, you know, to be fair, the counterargument is that's what Netflix

00:23:38   has been doing and they really haven't, I mean, they've had a handful of things like this, but

00:23:42   nothing at a huge level. Yeah, I mean, is The Crown a franchise? I mean, they're gonna run out

00:23:48   of time for, unless there's like a future season of, oh man, can you imagine there's a future

00:23:53   season of The Crown that's set in like the 24th century and they've cloned Queen Elizabeth and she

00:23:59   comes back, then The Crown's a franchise, but until then, anyway, I don't know, hey, if this

00:24:06   is the next Bourne series or a new James Bond or something like that or what they didn't say in the

00:24:13   reports and I wonder about is one of the modern ways you do a franchise thing is you plant

00:24:18   characters in your movie who then get their own streaming series, right? Well, yeah, like,

00:24:23   franchise to me isn't just like you have a bunch of good movies, right? Like, I feel like in a

00:24:29   modern parlance of franchises, you have like a universe, right? Like, you can do a bunch of stuff

00:24:35   with it and then maybe like, but I don't know, like that seems... Right, Marvel may be the exception

00:24:43   to the rule, although, I mean, Marvel Star Wars, right, are great examples. There aren't a lot of

00:24:49   James Bond, like there are some, but that's a tough game to play, but I understand why they

00:24:54   want to play it because the reward could be huge. Massive, yep. But in this case, it is just a big,

00:24:59   expensive spy movie, which is, that's fine, it could be really good, great cast and all that,

00:25:04   but yeah, I just, I keep thinking that the modern way you do a franchise is, and somebody's doing

00:25:10   this, I can't remember who it is. I was just about to say it's Netflix with Ryan Gosling

00:25:14   and Chris Evans. They have a spy movie that's coming, it's The Gray Man, it's based on the

00:25:20   book series, The Gray Man, yeah, and they're trying to like, that's their thing. But that's

00:25:25   not what I meant. I meant that I read somewhere that there's a movie coming out that is, oh, it's,

00:25:30   oh, I know what it is, it's, it's, it came out, I think it's Suicide Squad, The Suicide Squad,

00:25:36   which came out this last weekend, and that, that they are already shooting an HBO Max series based

00:25:45   on one of the characters who's in The Suicide Squad. Yeah, I believe it's a bit of a spoiler,

00:25:50   so we won't say who, but yes, they are, they are making a television show based on the characters.

00:25:54   It wasn't a spoiler when nobody knew or could see The Suicide Squad, but now that you can see it.

00:25:58   I only know because I haven't seen the movie, but so like, it's, I knew it already, and then it's

00:26:04   like, yeah. I think, I think you're going to start seeing more of that too, which is these kind of

00:26:08   prefab franchises where they, they're, the whole strategy is we're going to have a film, but we're

00:26:15   also going to have like ancillary characters who are planted to spin off into TV shows so that the

00:26:22   franchise stays in front of people until the next big thing happens and they all come back together,

00:26:26   which I don't know if executed well, and that's always the question with this, this stuff,

00:26:30   if executed well, that could work. It could also be a total failure. And given that Suicide Squad

00:26:35   has not performed well, at least in theaters, the investment that they made in that spinoff TV show,

00:26:40   it could, they could be looking at it now as a waste, or maybe it's a way to salvage

00:26:44   The Suicide Squad by making a part of a bigger thing. I don't know. I don't know,

00:26:49   but it's fascinating to watch. I got my popcorn out.

00:26:53   I'm just going to say that that HBO Max idea of putting all the movies in the service just

00:26:58   seems like it's become more and more of a bad idea no longer. We're out from it. Like, yeah, not good.

00:27:06   Well, you know, Jason Kyler's not going to keep his job. So, you know, it's gonna,

00:27:10   there's going to sweep that one right under the rug and move on to 2022. Yep. Yep. Yep. Yep.

00:27:16   Musical Come From Away will be arriving on Apple TV+ on September 10th. Big surprise to me. It's

00:27:22   going to be a live performance like Hamilton was. I don't know to what extent they've shot this,

00:27:28   like how similar it will be to Hamilton or not. This is a surprise. Like I had no idea that it

00:27:33   existed like as a, as a filmed thing. And not only is it arriving, it comes in a month, which I'm

00:27:38   excited about it because I've wanted to see Come From Away. Cause I, I hear it's very good.

00:27:43   Everybody that I know that loves musicals speaks very highly of this one. So I'm excited about this.

00:27:49   Yeah. And this is a question for, I guess, what would up, upstage. Oh, upstage is beautiful.

00:27:56   Which is. Jason, we are creating a franchise. Oh man. You're right. Where's our money?

00:28:04   Thank you. Thank you for buying those summer fun t-shirts. So upstage, it is,

the finances of Broadway, and the theater in general. Apologies to London,

00:28:18   because the theater is huge in London. The money involved, it's very complicated, right?

00:28:25   And the way you make your money, you spend huge amounts of money on these shows and they're like

00:28:29   swings of the bat too. Sometimes they make it, sometimes they don't. The ones that make it are

00:28:33   the ones that pay off for all the money lost on the ones that didn't. But once you get it,

00:28:37   you take it, you take it from London to New York and then you do traveling and you do, you know,

00:28:44   and you, you franchise it in its own way out and then you're making huge amounts of money. And like

00:28:48   in Hamilton's case, they had multiple national tours in the U.S., plus they were permanently in,

00:28:53   uh, in New York and they were, they did a long run in San Francisco and like all of this stuff goes

00:29:00   on. I wonder if, is this just a COVID effect where all the theaters shut down or when you look at,

00:29:10   we'll see what this performance is, but like when you look at something like Hamilton,

00:29:14   which was a phenomenon, but like, is there money, this is how they probably think of it.

00:29:20   Is there money available to us from people who are never going to go see it in the theater?

00:29:26   Because it doesn't come to them or it's too expensive or whatever. Is there another portion

00:29:32   of theater, not theatrical film, but stage theater, that is the Hamilton-like

00:29:40   filmed stage production with high production values that we can get a lot of money from a

00:29:46   streaming service for. And how much does that cut into our ticket sales of our traveling or does it,

00:29:50   or does it boost it because people become fans? I don't know the answer as somebody who doesn't

00:29:55   go to a lot of theater, although I do go to some. The idea that I could catch a really good-quality

00:30:01   capture of maybe a high-quality original cast performing something at a very

00:30:09   high level, like that appeals to me greatly, but I don't know about the financial part of it.

00:30:14   And I'm curious about that part, whether this is something that will end up benefiting the theater

00:30:21   industry or not. But I think it's great for audiences. Obviously it's not the same as going

00:30:27   to see it in person, but you know, first off you see it in person and then you're done and you

00:30:32   don't get to relive it at all. And people don't tend to go back. I mean, some people do, but most

00:30:38   people don't go back to the theater again and again and again to see it again and again and again.

00:30:41   Yeah. It has to be something special. Says the guy who's seen Hamilton three times, but still.

00:30:45   But Hamilton, I think is the outlier though. Like I've seen Hamilton three times. I'm planning on

00:30:50   going to see it again soon. Just what I want to do. I want to book tickets and I'll go see it.

00:30:55   Right. And Matt in the Discord is making a point that I think is absolutely true,

00:30:59   which is there's an argument to be made at least that you are creating audience for your property

00:31:02   by doing the streaming version. 'Cause now you really ought to see it in person, right? It's

00:31:08   coming to your town. It's like, if you like it on TV, imagine what it's like to be there.

00:31:13   Maybe that's the plan. Like, you run a musical until it starts to decline. You put out a video

00:31:18   version, which you'd make a bunch of money from, and maybe you boost tickets. I don't know.

00:31:23   I don't know, but anyway, it's coming to Apple TV+. I'll watch it. I'm looking

00:31:28   forward to it. Reese Witherspoon has sold Hello Sunshine for $900 million to a media company

00:31:35   backed by private equity group Blackstone. Right. This media group is going to be led by

00:31:41   ex-Disney executives Tom Staggs and Kevin Mayer. Kevin Mayer you may remember as the person who

00:31:47   ran Disney+; everybody thought he was definitely going to be the CEO, but he was passed over,

00:31:53   left and went to TikTok, and then that all imploded. And now here he is.

00:31:56   So yeah, his name wasn't Bob. That was his fatal flaw. So this is fascinating, because

00:32:04   it's like they're working with a private equity group to make a studio, right?

00:32:09   Like, make a big studio from nothing. As Julia and I talked about last week, she talked about

00:32:15   the idea that not everybody needs a streaming service, right? And that maybe it would

00:32:20   be okay if you became a content arms dealer, as she said, and that there would be value in that.

00:32:25   And that maybe something like ViacomCBS would look at what they were doing in two or

00:32:31   three years and be like, "Oh, we'd just be better off selling this stuff to the highest bidder

00:32:35   among the streaming services rather than doing this ourselves," which sort of is what Sony's game is

00:32:40   right now. And I wonder if this is that, kind of, right? Which is: there's an insatiable thirst

00:32:45   for content. And they don't need to create a streaming service. They can just

00:32:50   fulfill the needs of the people who need content on their streaming services.

00:32:57   It also points out, since Apple was supposedly sniffing around Hello Sunshine, that

00:33:02   my guess is that they got outbid, that the private equity group finds more value in

00:33:09   aggregating these studios together than Apple found in sort of, you know, getting some talented

00:33:16   people that they like working with onto their team.

00:33:20   Mm-hmm. Reese Witherspoon will remain on the board, along with current CEO Sarah Harden, and

00:33:27   they're going to continue to oversee operations of Hello Sunshine. And Sky has announced that they

00:33:34   will be the home of Peacock and Paramount Plus in the UK and Europe. This will be at no extra

00:33:40   cost for current subscribers. Paramount Plus will also be available standalone at a later

00:33:45   date, and Peacock has said that it will be ad-supported on Sky, which makes me think

00:33:50   that they may also have a direct to consumer option in the future as well. I think this is

00:33:56   kind of smart from Sky, to be honest. Like, "Hey everyone in America, why don't we just take all

00:34:04   that content for you and we'll give you some money for it?" And, you know, I actually think it's kind

00:34:10   of a smart move. I don't know how I feel about it as a consumer. So Sky is a satellite linear TV

00:34:17   provider, is that right? It's really difficult to describe what they are now. I mean, okay,

00:34:23   just imagine... So this will be presumably in their app on streaming, you'll get Paramount Plus

00:34:28   and Peacock now. Yeah, or on their box. Their box has like a whole interface. Basically, at this

00:34:34   point, Sky is like Comcast and TiVo and a streaming service, right? It's like all of those things,

00:34:42   and no matter what part of it you are a part of, you can get this. So like, we use Now TV,

00:34:48   which is Sky, but it's their streaming thing, and because we're a Now TV subscriber, we'll get

00:34:53   Paramount Plus and Peacock. I see. So they're basically, the US equivalent would be sort of

00:34:57   that they're a cable or satellite provider. They've got a bundle of content. You sign up

00:35:02   for Sky and you get a bundle of stuff that includes linear channels and stuff that's on

00:35:06   demand and all of those things. Yes, everything. And now they're going to be a front for the

00:35:11   American streaming services too. Fascinating. Yeah. Fascinating. I think it's an interesting

00:35:16   play from them. I could imagine HBO doing this as well because HBO and Sky have a very long-standing

00:35:22   relationship, which is why we've never got HBO Go. As pointed out by Tony in the chat,

00:35:30   Comcast owns Sky. So they are Comcast. They're really Comcast. All right. Fascinating.

00:35:36   This is an interesting move for Paramount Plus and Peacock as well, because the idea here is,

00:35:42   how do these services that are especially, in Paramount Plus' case, a bunch of their originals

00:35:46   are not available to them outside of the US and Canada because they sold them off to Netflix and

00:35:51   Amazon, right? But they do want to have a presence. And this also is kind of a nice package deal.

00:35:58   Presumably it means that you're basically buying all the stuff that they're producing

00:36:02   for their service in the US that remains. Peacock is a good example of that. And it all just comes

00:36:08   over, right? So all of those Peacock originals that NBC is building in the US will just be

00:36:14   available to Sky as well. And it gives them an international presence without, like you said,

00:36:19   they will probably build their own offering as well, but they're kind of like doing the bundle.

00:36:24   They're bundling it in before it exists, which is interesting. That's an interesting idea.

00:36:28   This is also a case where these are companies that didn't have a really

00:36:31   fixed international strategy. And so they're figuring it out.

00:36:35   - This episode is brought to you by ExpressVPN. You probably wouldn't take a call in a public

00:36:43   place if there was anybody around you, maybe on speakerphone. You don't want people listening in,

00:36:47   because you care about your privacy. Using the internet without ExpressVPN

00:36:52   is kind of like taking that call, because somebody could eavesdrop if they wanted to.

00:36:56   ISPs, and the operators of Wi-Fi networks you don't control, can see the websites

00:37:00   that you visit. That data could be sold to others who might want to use it to target you

00:37:04   for marketing. Thankfully, you can use ExpressVPN to create a secure encrypted tunnel between your

00:37:10   device and the internet so people can't see your online activity. It's so easy to use.

00:37:14   You just fire up the app, you hit one button, it works on phones, laptops, even routers,

00:37:19   so everyone who shares your Wi-Fi can be automatically protected. It's no surprise

00:37:24   ExpressVPN has been rated number one by CNET, WIRED, and The Verge. I just got back from

00:37:28   traveling. I was in a hotel, hotel Wi-Fi. I had ExpressVPN on the entire time because I don't

00:37:33   control that network. I don't know who controls that network. So I just turned on ExpressVPN on

00:37:38   all my devices. It was great. What was also good is I wanted to be able to watch a TV show that I

00:37:44   couldn't watch because we were not in the UK, on the service

00:37:51   that we pay for. So I could say to ExpressVPN in the app, "Hey, I'm in the UK," and ExpressVPN

00:37:56   could spoof my location and I could watch that show as well. Really great. Loved it.

00:38:01   Secure your online activity by visiting expressvpn.com/upgrade today. That's

00:38:07   expressvpn.com/upgrade and you can get an extra three months for free. That's expressvpn.com/upgrade.

00:38:17   Our thanks to ExpressVPN for their support of this show and Relay FM. Okay, so big topic time for

00:38:23   today's episode. Last week, Apple announced that they are working on two initiatives to combat

00:38:29   child sexual abuse material. How was that said? CSAM? Is that how it's... Yeah, I think that's

00:38:34   what they're calling it. And for people who haven't really heard this term before,

00:38:37   this is what has historically been called child pornography. And in the last few years,

00:38:45   there has been an effort to rename it because of the feeling that that term doesn't get at what is

00:38:54   actually going on here, which is that any sexual images of children are by definition child abuse.

00:39:02   So they don't want people to call it pornography and instead call it sexual abuse material,

00:39:11   child sexual abuse material. I think it's a better phrase because I assume that it can also

00:39:15   encapsulate other things which can be used for this purpose. Right, right. But the idea here is

00:39:21   just to classify it. I mean, words define how people file things in their brains and what

00:39:28   they're trying to do here is say, "You need to take this more seriously. This is not material that

00:39:35   some people are using because it turns them on. This is evidence of a crime," essentially, right?

00:39:40   This is, "These photos are evidence of a crime and should be thought of in that way." So that's why

00:39:45   when this came out, you see CSAM, the acronym used a lot. So they showed off two new features

00:39:54   that are coming with an upcoming software update for iOS. Both of these features are going to be

00:39:59   in the US only at first, possibly coming to other regions in the future but on a case-by-case basis.

00:40:04   This is a very, very large topic with a lot of implications. And so we're going to try and talk

00:40:11   about it like this. I am going to outline the two things. Then we're going to talk about some things

00:40:17   that have been reported on this, some more discussion, some FAQs, some responses from Apple.

00:40:24   And then anything else we have not yet covered, for our own thoughts on these systems.

00:40:31   I'm sure there will be intermixing in the conversation.

00:40:35   - And it starts with the fact that as you mentioned, this is not one thing, right? Apple

00:40:41   announced sort of two very distinct things and put them in the same bucket because it's a child

00:40:49   safety bucket. But they are very different technologies that do different things. And

00:40:55   I think it doesn't do anybody any good to conflate them.

00:41:00   - And of course, this is quite a sensitive topic, right? So if this stuff is not good for you,

00:41:07   skip it, right? We have chapters. You can skip this conversation.

00:41:11   And of course, as well... - Some are fun.

00:41:14   - Yeah, it's not fun at all today. And of course, because this is so sensitive and complicated,

00:41:20   we are going to try our best to have nuanced and thoughtful discussion about this. But we will not

00:41:29   be perfect about it because it's so complicated, right? I just want to say that upfront before we

00:41:35   start digging in. So the first part is probably the easier to get your head around, but I don't

00:41:42   think perfect. Communication safety. This is for the Messages app on iOS and the Mac. This system

00:41:50   is intended to, in some cases, warn parents if their child views content that is deemed

00:41:56   as sexually explicit. This will be determined by on-device analysis powered by machine learning.

00:42:03   If a photo is determined to be explicit, it will be blurred out. Now, determining the sexual

00:42:12   explicitness of these images is completely divorced from the CSAM detection stuff.

00:42:18   This is a machine learning model. - Yeah, that's it. It's a machine learning model

00:42:23   that's basically saying, "Is this sexually explicit content?" It runs on device. And then there's this

00:42:32   interception, which is not a blocking either. It's an interception and a warning with different

00:42:39   things that happen based on different age groups. - And if somebody tries to view one of these

00:42:44   blurred images (a child, that is; this is in an Apple iCloud Family, your account is deemed a

00:42:50   child's, it can be turned on, et cetera), they will be shown a warning, like a set of warning screens

00:42:55   that Apple's put on their website, telling them the content can be harmful. If a child is under 13,

00:43:02   so 12 and under, their parents can be alerted if the image is viewed or sent to someone else.

00:43:09   - And that's a parental option. The parent would turn that option on, and then there would be this

00:43:15   warning. And basically the idea there is somebody sent you something, you should probably tell your

00:43:21   parents. If you wanna see it, your parent will be alerted. And that's for 12 and under.

00:43:24   - Now for 13 to 18, 'cause that's where it ends at 18, the individual will see the warnings,

00:43:33   but there's no parental notification of that. - Right, so a lot of the hot takes when this

00:43:39   first was announced were: this is Apple basically saying you can't send nudes if you're a teenager.

00:43:44   Teenagers sending nudes to each other are gonna run afoul of this. And it's interesting

00:43:49   that Apple has actually built this in. It's like, no, no. And in fact, what this feature is,

00:43:55   if you're a teenager is if somebody sends you something unsolicited, it fuzzes it out,

00:44:06   so you don't have to be prompted with it. You don't have to see it if you don't wanna see it.

00:44:11   And if you do wanna see it, then you get to see it, which is an interesting combination that you

00:44:16   could also view as being sort of like for teens, it's, you know, who sent this to you? Is it

00:44:22   somebody who you want to see or not? And if not, then you don't have to see it. It'll get fuzzed

00:44:28   out and you can just tell them to go away or block them or report them to somebody in a position of

00:44:36   authority to get them in trouble, whatever it is. But if it's something you wanna see,

00:44:40   my understanding is that's it. You just say, okay, I'll see it. And your parents don't get told,

00:44:45   none of that happens. - There's no logging of it or anything like that between the ages of 13 to 18.
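[Editor's note: the age-tier behavior the hosts just walked through can be sketched in a few lines of code. This is purely an illustrative reconstruction of the flow as described on the show, not Apple's implementation; every name and type here is hypothetical, and the on-device ML classifier is treated as a black box that has already flagged an image as explicit.]

```python
# Illustrative sketch of the Communication Safety flow as described on the
# show. All names, types, and thresholds are hypothetical; the on-device ML
# classifier is assumed to have already flagged the image as explicit.

from dataclasses import dataclass


@dataclass
class Account:
    age: int                  # age tier from the iCloud Family configuration
    parental_alerts_on: bool  # parent opt-in; only meaningful under 13


def handle_flagged_image(account: Account, chose_to_view: bool) -> dict:
    """What the device does with an image the classifier flagged as explicit."""
    if account.age >= 18:
        # Adults are entirely out of scope for this feature.
        return {"blurred": False, "warning_shown": False, "parent_notified": False}

    # Any child account: the image is blurred and a warning screen is shown.
    result = {"blurred": True, "warning_shown": True, "parent_notified": False}

    if account.age < 13 and account.parental_alerts_on and chose_to_view:
        # Under-13 only, and only if a parent opted in: viewing (or sending)
        # the image past the warning notifies the parent.
        result["parent_notified"] = True

    # Ages 13 to 18: warning only; nothing is notified or logged.
    return result
```

The asymmetry the hosts emphasize is visible here: the notification path exists only in the under-13 branch, behind a parental opt-in, while 13-to-18 accounts get the warning and nothing else.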

00:44:49   Now the CSAM detection is the much bigger part of this. So again, so everything we've just said,

00:44:58   that's one thing, forget about all that now for this. These are completely different.

00:45:02   They are not related in any way other than the fact that children are involved. That is where

00:45:07   it ends. - Right. Before we kind of close up on this first one, I'll just say this is an interesting

00:45:12   feature that I'm actually, and maybe it's because they're afraid that people are gonna conflate this

00:45:18   even more. I think it's interesting that Apple hasn't made this a feature for adults to just say,

00:45:27   to do this same feature, which is like, if somebody sends me an unsolicited, you don't know,

00:45:34   they don't know whether it's solicited or not. So why don't you just fuzz it all out? And then if I

00:45:38   wanna see it, I will tap to see it. It's like a machine learning based filter, but they're not

00:45:44   even doing that. They're like, no, this is a child protection feature. That's all it is. - You know

00:45:48   what, actually, just so we don't mix things up, let me give my thoughts on this part, because I

00:45:53   don't think we're gonna come back to this otherwise. - Yeah, I think so. - I kind of have,

00:45:58   like this is the easiest one to have feelings about, like on the face of it, decent system,

00:46:03   provided that it's implemented well. But I do also have some concerns about it. Like,

00:46:08   what is going to be considered explicit and how is this determined? Like just a machine learning

00:46:14   model. Like, it is weird that Apple have been so forthcoming with the second part and how that's

00:46:20   determined, and I feel like this one is not very well explained. Like, I've seen some concern from

00:46:25   members of the LGBTQ community that there are existing systems and models that

00:46:34   overcategorize things from these communities as explicit, even when they're not. And so, like,

00:46:41   I can understand how if you're in those communities that you could be concerned,

00:46:46   considering the fact that Apple's not being very forthcoming with this. - I would say the classic

00:46:50   one is Facebook banning pictures of nursing mothers. - Yeah. - Right, which is not sexually

00:46:58   explicit in any way, but they basically have a machine learning model for breasts. And they're

00:47:08   like, "Whoop, there they are." And it's like, "Yeah, but no little machine, no, it's not like

00:47:13   that." And this is the lesson that we've all had to learn over the last few years, which is

00:47:17   a machine learning model is only as good as how it's trained. And if it's trained with biases,

00:47:21   the biases will be in the model. So that is a question. It seems less harmful here in the

00:47:26   sense that what it's gonna generate are false positives or it's gonna miss things. - Well,

00:47:33   as I say, again, and I don't know enough about this, but I've seen people saying it, so I will

00:47:39   listen to what they have to say, right? Like if you are a part of the LGBTQ community, right,

00:47:47   and you've not come out to a family member, you know what I mean? Like, there are

00:47:55   potential consequences, depending on how this is trained, where you could be saying

00:48:02   something to someone that you didn't want to. Like, it's complicated again. It's just complicated,

00:48:07   right? - I don't know about that because of the 13 to 18 thing, but yes, I guess that's true that

00:48:14   at the younger age, if that material was flagged and then notified a parent,

00:48:21   yeah, it's all a very sensitive subject. - It's complicated. - I will say that, and this is not,

00:48:27   and we're gonna get to the rest of it in a minute, but like this is not an endorsement of Apple

00:48:30   and what it chose, 'cause it may have made bad decisions. We can argue about that,

00:48:35   about whether this is good or bad. And a lot of people, very smart, thoughtful people have taken

00:48:42   different sides on this. And I think that's instructive about how hard a subject this is.

00:48:47   But I will say this, which is when this was announced, there were so many knee-jerk hot takes

00:48:52   that were, I can't believe Apple didn't think about X. And when you look at the details here,

00:48:58   it's very clear that Apple thought a lot about this. And this is a very carefully constructed

00:49:04   system. You may not agree with it, but I think it's worth at least acknowledging that the people

00:49:10   who built these features at Apple seem to have thought a lot about the ways that they could be

00:49:17   misused and have tried to build in features to make that not the case. We can, again, we can

00:49:24   debate whether they actually succeeded or not, but I think that it would be a mistake to say

00:49:27   they didn't think about these issues because I'm sure they did. They may have made good decisions

00:49:32   or bad decisions after they thought about it, but this bears the imprint of a lot of debate

00:49:38   and discussion and a kind of a careful choice about what features got implemented. - Yeah. And

00:49:44   then there's the whole angle of control over a child, right? And it's tricky, right? Because Apple can't

00:49:54   make that kind of situation any different than it is, right? Like if a family member is controlling

00:50:01   a child, if they're going to misuse this, they'll change the age in the iCloud Family, that kind of stuff.

00:50:09   And so it's like, I can understand how people can say, "Well, that's not Apple's responsibility."

00:50:14   However, there is also this element of later on where Apple is also kind of considering itself

00:50:22   a part of law enforcement now. So it's like you can be my protector, but also not. And it's like...

00:50:29   - And the truth is that every tool of control that gets built can be misused. And so the argument is

00:50:39   — and this goes for this whole thing, so we'll get back here — but the argument is,

00:50:44   do you build the tools? Or if you know abuse is going on, do you refuse to build the tools?

00:50:50   Which means that abuse that was going on will continue to go on. And it can be a very difficult

00:50:56   choice to make. So every bit of Apple's parental control features can be abused by a parent,

00:51:05   right? A parent can turn off all the features on their kid's phone, and then the kids will try to

00:51:14   find ways around them. And so on one level, I look at this and I think, "Well, this is a tool that

00:51:18   could be abused." But I also look at this and think, "This is also a tool that could be subverted."

00:51:22   And so that's why it's complicated, right? Because whenever a parent is limiting a child's access to

00:51:31   something on their device, that's a tool that a good parent can use for good and a bad parent

00:51:37   can use for bad. And as the toolmaker, Apple is put in this difficult position of wanting to

00:51:42   provide tools for good parents to protect their children, but they know that

00:51:51   every tool that they make has the potential to also be misused. And it's a very unpleasant place

00:51:56   to be, if you ask me. Talking about control and teenagers and et cetera, et cetera, I also would

00:52:03   be concerned that this feature set would drive teenagers away from using iMessage, as they may

00:52:10   feel that their parents are going to be spying on them, no matter what age they are. Yeah, I mean,

00:52:14   it is — the idea is 13 to 18 aren't, but that's true. I saw some arguments — But also as well,

00:52:19   like, you know, I could imagine being 16 or 17 and getting that prompt and feeling like my phone is

00:52:25   talking down to me. Sure, I get it. I get it. I just — I don't think — I did see the argument when

00:52:32   this came out of somebody saying that this was a bad move for Apple, essentially because it was

00:52:36   going to drive people to other chat platforms. I'm like, you know what? No, I'm not going to buy that

00:52:41   one. Like, imagine Apple — imagine the stories about Apple choosing to not protect children

00:52:49   from the fear of losing them to WhatsApp, right? But I do think that the 13 to 18 — the 13 to 18

00:52:58   prompt should look different to the ones that Apple have shown. Yeah, I mean, I think I agree

00:53:04   with that. Like, the important point is that nobody gets notified, and by conflating the

00:53:08   under-13 tier with the 13-to-18 tier, the more you do that, the worse it seems for the teenagers. But

00:53:17   teenagers are pretty clever, like I said. They may go away from this or they may know,

00:53:22   but they will — I think they'll figure it out. The other part of this that feels something — you

00:53:30   know, like, ultimately, I feel like if I was a parent, I'm not a parent, right? So,

00:53:33   you know, bear that in mind. I feel like it's something that I, in some instances, would want,

00:53:41   right, to try and help make sure that my child was making the right decisions or at least had

00:53:45   a second to think, or was able to have second thoughts. It's definitely not perfect. And there

00:53:50   are some lines about privacy, which is, you know, interesting and strange, that, like, adults

00:53:57   can do whatever they want, but not kids. Yeah, I would say there's a debate about privacy

00:54:05   expectations for children, right? Like, theoretically, children have no expectation of

00:54:10   privacy because, like, legally, they don't. However, I would argue that that is — that may

00:54:19   be true, but I have some questions about the parenting choices. And everybody has different

00:54:24   parenting choices, right? So Lauren and I were just talking about this

00:54:30   because she had a friend in high school and college whose parents were very strict. And

00:54:38   he did stuff like he bought a motorcycle from a friend and he parked it around the corner from

00:54:42   their house when he was home from college so that they didn't know that he had it.

00:54:48   And my thought was, well, that'll show you how good it is to be a super strict parent. What it

00:54:54   means is it teaches your kids to lie to you and hide things from you because there's no trust

00:54:59   there anymore and they just have to go around you, right? And that's just — again, everybody's going

00:55:03   to have a different parenting philosophy, but that struck me. And I think when we talk about this,

00:55:08   it's a similar thing, which is, do children have an expectation of privacy? No, but I think that

00:55:15   you — as a good parent, you should give them some space to be themselves and to do things that you

00:55:20   don't need to, you know, go through their correspondence. I had — when I graduated from

00:55:25   high school, my mom made me a book of high school memories and things, and it was pictures and stuff.

00:55:34   But in doing it, what I found is that she went into a box of my private, like,

00:55:41   photos and letters and stuff from friends and my girlfriend at the time.

00:55:49   And she expected that she did this nice thing for me and that I should thank her for it.

00:55:56   And my response was, this is a colossal invasion of my privacy, even as well-intentioned as it was.

00:56:04   So do children have an expectation of privacy? I think they do. I don't think it's legal,

00:56:12   but I think it's kind of moral. And so that's what strikes me about this feature. And we're

00:56:17   sort of like, this is feature one, and then there's the other big feature. But like,

00:56:20   while we're here, one thing that this is doing is saying, parents, we are going to protect — we are

00:56:28   going to look for really bad things, or maybe bad things if you think they're bad, on your youngest

00:56:34   children's devices because we know you probably can't or won't, or we don't want you to have to.

00:56:42   And that's interesting. It does lead you down a path, potentially, of building more features

00:56:51   that are about the device watching the kids instead of the parent. And I don't think Apple

00:56:59   intends to go here, but it's an interesting question philosophically. Like, are you building

00:57:05   a machine learning, strict kind of parental state around the kid if you turn on a bunch of features

00:57:12   like this? Or are you giving your kid space by setting these features and letting the kid and

00:57:20   the machine deal with it instead of you having to pore through every bit of content that they

00:57:25   go through to make sure it's okay? And again, I don't think there's a clear answer there,

00:57:30   but it's an interesting question. Like, having it be machine learning based means the parents

00:57:34   don't have to police this, which is good because I think most parents won't police this. Just in

00:57:41   reality, parents are very busy and most of them are not going to ask their kids to hand over their

00:57:45   phones and have them scroll through everything, and the kids are going to find a way around them

00:57:50   seeing what they want to see anyway, right? That happens. But I think it's interesting to think

00:57:57   about the expectation of privacy and whether adding a machine learning element in reassures

00:58:03   parents. Is that a better kind of scrutiny of a kid than direct parental scrutiny? I don't know.

00:58:11   So Alex Stamos, who works for the Stanford Internet Observatory, had a really good thread

00:58:16   about all of this stuff, but there was one part of it that relates to the communication safety

00:58:23   segment that I thought was interesting, which is, and I've seen other people criticize Apple for

00:58:28   this too, of like- And just to be clear, this is the Alex who was the head of security at Facebook

00:58:32   for many years and said many interesting things while at Facebook. His track record is very

00:58:39   interesting, but this is what he does now for a living at Stanford is think about stuff like this.

00:58:45   And this is something I've seen other people say too, of like, that maybe this system has some

00:58:52   interesting parts to it but probably isn't enough, and it's weird the way that Apple have rolled it

00:58:57   out to be so focused on what it is. So what Stamos said is that he would love to see Apple create

00:59:05   robust reporting in iMessage, slowly roll out client-side machine learning to prompt the user to

00:59:10   report abusive materials, and staff a child safety team to investigate the worst reports. And I would

00:59:16   also say, as you did, I don't know why anyone couldn't report things that they didn't want

00:59:21   to see in iMessage. This is, again, it's kind of a tangential point a little bit, but it leaps off

00:59:27   of this feature, which is Apple could do more, and I think this is overall, we're going to get back

00:59:34   to this about why this is going on. And Ben Thompson wrote a really good piece about it

00:59:38   today, Monday, as we record this, which is: there are choices Apple made about how

00:59:45   they built this up, and Apple is in a position where it can sort of choose where to intervene

00:59:49   and where not to, where somebody like Facebook can't. But this is a really good point, which is

00:59:54   Apple has really gotten away with not having to do what Facebook and Twitter have to do in terms of

01:00:01   iMessage, right? Apple just is like, "Hey, everybody, you can block people if you want,

01:00:06   but it's just whatever." And it's like, well, okay, but if somebody is sending awful material

01:00:12   to somebody, could you report them in iMessage? Are they violating a term of service? Could you

01:00:17   do that? Right now, you can't. And so this is what he's suggesting here is that what if you

01:00:23   build a system where you build a reporting framework and a safety framework for iMessage,

01:00:27   you use the machine learning to buttress it by like flagging things and saying,

01:00:31   "Do you want to report this? You can report this as abuse," whether it's language-based or

01:00:36   photo-based or whatever. And then his idea is you have a child safety team that investigates

01:00:43   if a child says that they're being abused. All interesting points about how Apple could have

01:00:50   approached this and thus far has not. All right, this episode is brought to you by

01:00:53   Pingdom from SolarWinds. Today's internet users expect a fast web experience.

01:00:58   No matter how targeted your marketing content or how sleek your website is,

01:01:02   they'll bounce if a page is loading too slowly. But with real user monitoring from Pingdom,

01:01:06   you can discover how website performance affects your users' experiences, so you can take action

01:01:11   before your business is impacted, for as low as $10 a month. Whether your visitors are dispersed

01:01:16   around the world or across browsers, devices, and platforms, Pingdom will help you identify

01:01:21   bottlenecks, troubleshoot performance, and make informed optimizations.

01:01:25   Real user monitoring is an event-based solution, so it is built for scalability,

01:01:29   which means you can monitor millions of page views, not just sample data, at an affordable

01:01:34   price. Get live site performance visibility today with real user monitoring from Pingdom.

01:01:40   Go to pingdom.com/relayfm and you'll get a 30-day free trial, no credit card required.

01:01:46   Then when you're ready to buy, use the code "upgrade" at checkout and you will get an amazing

01:01:50   30% off your first invoice. That's pingdom.com/relayfm and the code "upgrade" at checkout.

01:01:56   Our thanks to Pingdom from SolarWinds for their support of this show and Relay FM.

01:02:01   So now let's talk about CSAM detection. This is a new technology that will allow Apple to scan

01:02:09   for known CSAM images stored in iCloud photos. This allows them to report instances to the

01:02:15   National Center for Missing and Exploited Children, which is abbreviated to NCMEC,

01:02:20   I believe, which will work with law enforcement in the US. Apple will not be scanning the images

01:02:27   themselves in the cloud. Instead, they perform on-device matching using a database of image

01:02:32   hashes. So just a bunch of code, basically. Then before an image is uploaded, it's scanned against

01:02:38   this. So a hash is made of an image and it's scanned against this list of hashes. There's

01:02:43   like this whole cryptographic way of doing it. Don't worry about the details. Not important for

01:02:47   this conversation, I think. If a match is found, it creates something called a cryptographic safety

01:02:52   voucher, which is then uploaded alongside the image as it goes up to iCloud. Apple say they

01:02:58   cannot interpret these vouchers so they don't know that they exist unless an account, an individual

01:03:02   account, passes a threshold of known CSAM content. This threshold is not stated, but Apple say it's

01:03:09   set in such a way that there is a one in one trillion chance per year of incorrectly flagging

01:03:15   an account. Once the threshold is exceeded, Apple will manually review it to confirm a match is

01:03:20   correct and then disables the user and notifies NCMEC, and therefore, you know, US

01:03:27   law enforcement. So a few details before we move on with this. First off,

01:03:36   it's happening on device. This is part of the confusion. It's happening on device, but only

01:03:41   at upload time to iCloud photos. So we're in this very weird situation where having one of these

01:03:50   photos on your device doesn't do anything. This is not what Apple, I would say, could have built,

01:03:59   which is something that looks at all images on a device and does this. It isn't doing that.

01:04:06   It is only doing it if you're sending it to Apple's iCloud server. Before it does that,

01:04:10   it runs this check. And, for people who are curious, the hashes all come

01:04:15   from NCMEC. They're the only, I believe, organization in the US that's allowed to possess

01:04:20   these images that are fundamentally illegal. So they can run code on them and generate these

01:04:27   hashes that Apple is using. The safety voucher thing is important because people are like,

01:04:32   "Well, what this means is that I'm going to take a picture as an adult, maybe a young adult,

01:04:39   and it's a nude picture and it is going to flag this and then somebody at Apple is going to look

01:04:48   at it." And now people at Apple, just like the Siri thing, people at Apple are literally looking at my

01:04:53   nude pictures. That's not what's happening for a few reasons. One is you've got to have multiple

01:05:02   versions. They have to match the hash, which is, my understanding is, very difficult to do

01:05:08   if it isn't the image. It's literally looking for that image or an image of that image,

01:05:12   a distortion of the images that are in the database that are known by the authorities

01:05:17   to be the CSAM content. So first off, you've got to have a lot of these. One false positive is not

01:05:22   going to do it. And second, my understanding is when the threshold is passed and Apple manually

01:05:28   reviews it, I believe Apple is actually manually reviewing a low resolution preview image.

01:05:35   So it's not super clear, but it should be clear enough for them to verify that it actually matches

01:05:43   the image and then passes that on. So again, this is one of those cases where, not saying there

01:05:48   couldn't be a false positive, but Apple seems to have worked very hard to try to avoid false

01:05:53   positives and they're using a system that shouldn't flag anything that isn't already in the

01:05:58   NCMEC database. So that's the idea here. I didn't know that part about the low resolution images.
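The flow Jason just walked through, hash on device at upload time, attach a voucher, surface nothing until an account crosses a threshold, can be sketched in a few lines of Python. This is purely illustrative: it uses SHA-256 where Apple's system uses a perceptual "NeuralHash" (so near-duplicates match), and the database and threshold here are made up.

```python
import hashlib

# Stand-in for the NCMEC-supplied hash database. The real hashes are
# perceptual, so a distorted copy of an image still matches; SHA-256
# here only illustrates the matching flow.
KNOWN_HASHES = {hashlib.sha256(b"known-image-%d" % i).hexdigest() for i in range(3)}

# Apple has not published the real threshold; 2 is purely illustrative.
THRESHOLD = 2

def safety_voucher(image_bytes):
    """Run the check on-device, only at iCloud upload time, and attach
    the result as a voucher alongside the uploaded image. (Apple's real
    vouchers are cryptographic and unreadable below the threshold.)"""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {"matched": digest in KNOWN_HASHES}

def needs_human_review(vouchers):
    """Nothing on an account is surfaced until the number of matching
    vouchers crosses the threshold; only then does a human review the
    low-resolution previews."""
    return sum(v["matched"] for v in vouchers) >= THRESHOLD
```

The property Apple leans on is visible here: one match alone does nothing; only an account that accumulates matches past the threshold ever gets looked at.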

01:06:02   Yeah. I just thought that they were reviewing the hashes. Yeah, no, I think they

01:06:07   look at, they get the low-res preview image, is my understanding. So if it's something

01:06:15   that for some reason bizarrely comes across as a false positive and keeping in mind, it would have

01:06:20   to trigger lots of false positives to get to this point, which is unlikely, which is why they say

01:06:26   it's one in a trillion per year. Then they would look, and presumably whoever is paid by Apple to look at

01:06:33   these matches would look at the low resolution preview and be like, oh, that's not this at all

01:06:37   and mark it, and nothing would happen. So they're trying to build a system where,

01:06:42   essentially, you really need to upload a large amount of known

01:06:47   CSAM imagery to iCloud to trigger this. And I would make the argument that

01:06:53   how many people are they really going to catch with this feature? This is what I don't understand.

01:06:57   I'm so angry about this. The answer is dumb people, but there are a lot of dumb people like

01:07:01   criminals are dumb. There are a lot of dumb people, but yes, it is a very constrained thing. Yes.
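The "one in a trillion per year" claim discussed above comes from stacking a threshold on top of an already-low per-image false-match rate. A back-of-the-envelope binomial sketch shows the effect; every number here is hypothetical, since Apple has published neither its per-image rate nor its threshold.

```python
from math import comb

def flag_probability(p, n, k):
    """Chance an account with n uploaded photos accrues at least k
    *false* matches, if each photo independently false-matches with
    probability p: one minus the binomial CDF below k."""
    return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

# Hypothetical numbers: a one-in-a-million per-image false-match rate
# and 10,000 photos uploaded in a year.
per_image_rate = 1e-6
photos_per_year = 10_000

one_match = flag_probability(per_image_rate, photos_per_year, 1)    # ~1%
ten_matches = flag_probability(per_image_rate, photos_per_year, 10) # vanishingly small
```

With these made-up inputs, a single false match per year is roughly a 1% event, but requiring ten of them pushes the account-level probability below anything double-precision floats can even represent, which is the shape of the argument behind Apple's figure.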

01:07:07   Why, why are they doing it this way? So, okay. So what annoys me is this is happening on device,

01:07:13   right? So all of the identification of these horrible images are happening on device on your

01:07:21   iPhone or your iPad. Right? So the device knows if it's found something. Yeah. But it won't tell

01:07:29   Apple and therefore the authorities unless that image is uploaded to iCloud. Just by the way,

01:07:35   all of Apple's stuff where they're like, if you choose to upload this image to iCloud,

01:07:39   you don't do that. It just happens automatically. It's either on or off. So anybody that... I'm sorry,

01:07:45   I'm imagining a dialogue box that comes up and says, this seems to be CSAM content. Would you

01:07:50   like to upload it? Right? If you, if this is your thing, if this is somebody's thing,

01:07:57   it makes me shiver to even say that like you just turn off iCloud. Okay. So again,

01:08:05   again, people are dumb and it will catch dumb people, but you're right. Like why give such

01:08:12   an easy out? And this is, this is the thing that I am fascinated by, which is Apple theoretically

01:08:21   could do this to every image in your photo library, or even I think maybe every image

01:08:26   that's displayed using standard Apple functionality that app developers can use, but certainly every

01:08:34   photo in your photo library. And it could, so if somebody has iCloud photo library turned off and

01:08:38   they import a big database of CSAM content into their iPad's photos, nothing will happen.

01:08:46   Apple could make it that all of those photos when they're added to the photo library are scanned.

01:08:51   And that even if you're not syncing to iCloud, it sends a note to Apple that basically turns you in

01:08:56   and says, this is bad. And this person has bad things on it. And they have chosen not to do that.

01:09:01   And this is a fascinating question because it shows you that Apple drew the line at this

01:09:06   particular point. And the question is, why did Apple draw the line at this particular point?

01:09:12   And there are a lot of theories out there. I was going to mention this at the end, but I'll throw

01:09:16   it in now. One of the thoughts is that Apple is drawing the line here because Apple really wants

01:09:22   to turn on iCloud backup encryption. And the problem with that is iCloud backups are currently not

01:09:32   encrypted. Apple can decrypt them if legal authorities want them to. All your stuff on

01:09:37   your device is yours, but if you back it up to iCloud, Apple and the authorities can look at the

01:09:42   backup. And one theory is that Apple has placed this where it is so that Apple can then encrypt

01:09:51   your whole photo library in the cloud, inaccessible to authorities, but still make the ability to flag

01:10:03   CSAM content. That's a theory. But if that theory is not right, then so be it. But I think it's

01:10:09   interesting to ask the question, why here? Because Apple could absolutely, and I'm sure somebody has

01:10:16   framed it this way, and if they haven't, they will soon, because this is how Apple gets covered.

01:10:20   I'm sure somebody will say at some point, Apple's okay with you putting CSAM content on your devices

01:10:26   as long as you don't put it on their servers. That is one, I think not very generous,

01:10:33   statement that you could make about where they chose to nestle this in the system. You could

01:10:38   also say on the positive side, Apple put an automatic scanner for stuff in your phone.

01:10:47   It's for bad stuff, but if you don't like the idea that Apple's scanning for stuff,

01:10:50   turn off iCloud photos and then Apple won't scan your stuff anymore. And that this is the choice

01:10:56   that they're giving you is you can have bad stuff on your phone, but you can't put it on our servers.

01:11:00   - I don't disagree with any of that. - And there are legal issues and quasi-legal issues,

01:11:10   right? Sometimes, and we talked about this in the context of other Apple stuff and legislation and

01:11:14   all that. Sometimes the move you make is because of legislation, like when GDPR happened and

01:11:18   everybody's like, "Oh boy, got to add a bunch of boxes that say, can I look at your cookies

01:11:22   and whatever." Right? But there's also the preemptive stuff, which is behind the scenes.

01:11:27   Is it like, you know, you can't turn this feature on because we're going to come at you with this

01:11:33   law or this regulation or whatever. And it sounds like there's some legislation brewing, you know,

01:11:37   that the EU is moving on some of this stuff and the UK may be moving on some of this stuff. And

01:11:41   then ultimately the US is going to be moving on some of this stuff. And that Apple felt that they

01:11:45   needed to build something or potentially the theory that they want to encrypt iCloud backups

01:11:50   more broadly because they think it's better if it's encrypted and law enforcement can't get to it.

01:11:56   But in order to do that, they've got to throw them a bone. And this is the bone, which is Apple

01:12:02   is going to scan for the bad stuff before it goes into the cloud encrypted. But it's just like,

01:12:09   they obviously... I don't want to ask the cynical question, which is that Apple's doing this to look good,

01:12:20   because I think it's true that Apple is doing this to stop this from happening on its services. But

01:12:27   the way they're doing it seems to be much more about iCloud and stopping the bad stuff from

01:12:34   reaching Apple servers than it is about stopping the bad stuff period. And see, this is like,

01:12:40   I mean, we haven't even gone into the backdoor conversation really yet and we will, like,

01:12:44   don't worry. That's coming. We'll slide down that slippery slope. It's coming. We're at the top of

01:12:48   the slippery slope now. I just kind of feel like this is like wanting to have your cake and eat it

01:12:58   too, kind of. It's like, we want to make the system because it's the right thing to do. But we also

01:13:05   don't want to have to deal with all of it. And I don't know. This part of it to me is like,

01:13:12   I agree with everything you have said, but it still makes me uncomfortable.

01:13:17   Oh, it makes me uncomfortable too. I just, I want to delineate here that there is a very specific,

01:13:22   if we're going to talk, as some people have said about how, you know, potentially monstrous

01:13:27   something like this is to have a, like I was saying about kids stuff, have a monitor running

01:13:32   a machine learning based monitor, looking at all the content in your device. Two ways to look at

01:13:38   it. One is, it's Big Brother, but Big Brother is automated. The other way to view it is

01:13:46   it's good because it means people aren't looking at your device. It's just software. And you could,

01:13:52   you can make both arguments. And if you take them down the slippery slope of time and in a,

01:13:57   you know, infinite timescale and all of that, they may be the same. Or, you know, maybe

01:14:02   it's actually worse, right? Because the machine never gets tired. The machine can look at

01:14:05   everything and you can't slide anything by the machine like you can by a human being.

01:14:09   But it is, I think, important to note what Apple has chosen to do and not do here. 'Cause

01:14:18   could Apple have built this feature and deployed it when the photo comes in instead of when the

01:14:25   photo gets uploaded to iCloud? And the answer is absolutely yes. And they chose not to. And

01:14:29   'cause here's the interesting thing, you just said something. If you're a kid receiving an image,

01:14:36   they think that it's worth checking it when it comes in and alerting the parent, right?

01:14:40   But if it's this stuff, it's like, Oh no, we won't alert immediately. Like when it comes,

01:14:46   like when it arrives or when it's been sent or when it's been downloaded or saved,

01:14:50   it's only if that person decides they want to back it up. And it's so, it's so weird.

01:14:56   So, so get ready for the argument that I think, again, I don't know if I agree with it or not.

01:15:01   I might. This is the challenge, right? And this actually came up in that

01:15:08   Twitter thread by Alex Stamos, which is there are so many people who want to do hot takes and the

01:15:13   two big hot takes are, yay, Apple is stopping CSAM and protecting kids. And boo, Apple is creating

01:15:23   surveillance devices that will ultimately watch everything you do on your phone and can be misused

01:15:27   by bad guys and authoritarian governments or whatever, right? Those are the two hot takes.

01:15:32   But the truth is that it's harder than that because both of those things are potentially true,

01:15:37   right? Like, and so when somebody comes out and says, Apple is okay with CSAM, as long as you

01:15:44   don't put it on their servers, that is, that is true. That is a choice that they made. And are

01:15:48   they really okay with it? No, but I suspect that Apple is trying to adhere to the letter of the law

01:15:56   or threats from law enforcement about it going to their servers. And that's why they built this

01:16:03   feature while not putting it everywhere on your phone because they're worried about the other

01:16:08   argument, which is you're now spying on everything I do on my phone. So they've tried to square the

01:16:13   circle here. They've done the King Solomon thing, right? It's like, we're going to go right in the

01:16:21   middle and nobody's going to be happy, right? Because we're not catching everything, but we're

01:16:24   also not- - The other thing is, like, my phone is spying on everything I do, whether or not it tells

01:16:30   anyone down the line. - It's true. It's true. Right. So Apple has to, and this is why platform owners

01:16:36   in general, whether you're an OS vendor or whether you're a social media vendor or a cloud storage or

01:16:41   whatever it is, this is the line they have to walk, which is, you know, you build features and

01:16:49   they are helpful to people, but they also increase your data profile and can be misused. This is the

01:16:55   story of 21st-century tech, right? And so you've got to make your choices about where you're going

01:17:03   to draw the line. And this is a very clear, I think, example of Apple making this choice,

01:17:09   which is, okay, we're going to draw the line at putting it on iCloud. And again, they could draw

01:17:17   the line, they could not do the feature or they could draw the line much earlier in the process.

01:17:21   And neither of those things are things that they did. But why? I don't know. I mean, my guess is

01:17:26   external pressure is why, but they haven't said that, right? Because it's PR. Instead, it's like,

01:17:32   yay, we did this. And NCMEC came out with a statement that was like, yay, Apple did this.

01:17:36   And then predictably, EFF came out, the Electronic Frontier Foundation came out with a,

01:17:41   boo, this is big brother. And like, you could have predicted it all. Like, it's very obviously

01:17:46   what's going on here, but it's more complicated than they're saying. - Before we get into the

01:17:52   backdoor discussion, let me read a few segments from an FAQ that Apple published, I think,

01:17:59   yesterday, Sunday. You know, it's been a few days for this stuff to continue to spiral out of

01:18:04   control. And so they've published a document where they're attempting to try and calm people down.

01:18:09   And there were three points that I wanted to read a little bit from just to help frame some of this

01:18:14   discussion we've had and we're about to have. Question, can the CSAM detection system in

01:18:19   iCloud Photos be used to detect things other than CSAM? Apple says, our process is designed

01:18:25   to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only

01:18:30   works with CSAM image hashes provided by NCMEC and other child safety organizations. There is

01:18:35   no automated reporting to law enforcement and Apple conducts human review before making a

01:18:39   report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in

01:18:45   iCloud Photos. - Right, so, and again, just to be clear here, this is not machine learning detecting

01:18:52   CSAM content. This is comparing, they have a, or NCMEC has a giant database. - A library, kind of.

01:19:01   - That they have taken from offenders who build these libraries of this content. And all this

01:19:07   feature is doing is matching that database. So it's not, it's only gonna match if it sees something

01:19:14   that looks like something that was in that database. It's not saying I'm looking for body

01:19:18   parts, I'm looking for whatever it is. It's not doing that. It's I'm trying to match the known

01:19:24   illegal CSAM content. - Question, could governments force Apple to add non-CSAM images to the hash

01:19:32   list? Apple will refuse any such demands. We have faced demands to build and deploy government

01:19:38   mandated changes that degrade the privacy of users before and have steadfastly refused those demands.

01:19:45   I will come back to this point in a minute. We will continue to refuse them in the future.

01:19:50   Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not

01:19:56   accede to any government's request to expand it. This one is like, let's come back to it.

01:20:04   Right, we'll come back to it because I have so many problems with that statement. - So many

01:20:08   thoughts. - Question, can non-CSAM images be injected into the system to flag accounts for

01:20:16   things other than CSAM? Our process is designed to prevent that from happening. The set of image

01:20:21   hashes used for matching are from known existing images of CSAM that have been made available to

01:20:28   child safety organizations by law enforcement. Apple does not add to the set of known CSAM image

01:20:33   hashes. So I have a couple of point thoughts on this, right? One, it's like, okay,

01:20:40   we're just all going to accept that the US government is given the correct list of stuff,

01:20:47   right? Because we're all just assuming. Because everybody points and says China, right? Or

01:20:55   whatever, insert your country here. And a lot of this is just the assumption that what comes from

01:21:04   NCMEC is 100% on the up and up. We don't know that, nobody knows that except the people putting

01:21:12   the images into the lists, right? Like I don't think it's fair to say that the US government or

01:21:20   any government in the world can be 100% trusted and that for some reason, just because Apple's

01:21:27   in America, we will just be like, perfect government, no problem, right? And I don't

01:21:32   think it's as simple to say as that. - It may be that NCMEC has a system, I don't know anything

01:21:37   about this organization. It may be that it has a system that has oversight and that is part of

01:21:43   international law enforcement groups and there is oversight. But yes, your point is,

01:21:47   before we even get to authoritarian States, let's just say there's a terrorist attack on the US.

01:21:55   And they think that there's evidence, basically they want to take a bunch of known circulating

01:22:05   terrorist imagery that's been coming out of the terrorist home base wherever it is.

01:22:11   And they want to insert those hashes in the NCMEC database, right? - The Patriot Act did a bunch of

01:22:16   really terrible stuff. - Yeah, so that's the argument for any country, but you could even

01:22:22   say it in the US is, Apple doesn't see anything except the hashes. So the question would be,

01:22:28   would, and honestly, I think this is a larger issue as well, which is stopping the abuse of

01:22:35   children is kind of a shield, or it may be a kind of shield, let me put it that way.

01:22:43   That although the CIA and the NSA, or the FBI, whoever, might want to

01:22:52   insert hashes of known terrorism images into the CSAM database that's kept by NCMEC

01:23:02   specifically to run an operation that will find those terrorists who are using iPhones.

01:23:06   Right, okay, all right. The risk is that the story is going to come out that the government

01:23:13   exploited the efforts of people who are trying to stop child exploitation for their own uses,

01:23:20   which is pretty bad, like, right? That's pretty bad. - Right, but like if they say,

01:23:24   but terrorists though. - Well, yeah, and this is what I was going to say about other authoritarian

01:23:30   regimes, 'cause we got to deal with the Apple will refuse any such demands line here.

01:23:35   Which is like, I love that they said it, good for them. - Thanks. - But it doesn't really take

01:23:46   a lot to imagine China saying, we have our own image database, here are the hashes,

01:23:54   they need to be processed in China, your people who look at the positives that come out have to

01:24:00   be in China. The counter argument right now is that China, your iCloud backup is not encrypted

01:24:06   and it's on a server that's run by a company that's basically run by the Chinese government,

01:24:09   so they can look at your photos anyway. And maybe they're scanning them, who knows? But let's just

01:24:14   force, we'll use China as a proxy, it could be somebody else, it could be Kazakhstan. - China's

01:24:19   a good one because of that exact thing that you just said, right? That what Apple considered to

01:24:23   be so incredibly important that it's all encrypted and blah, blah, blah, like the iCloud backups,

01:24:28   they just allowed for China to say where those are stored. And it's the only country in the world

01:24:35   where Apple is not storing the iCloud backups on their own servers that they control. - Exactly.

01:24:40   And I would assume that if Apple does ultimately turn on encrypted iCloud backups, it probably

01:24:46   won't be turned on in China, that's my guess. So anyway, my point here is- - Yeah, and that won't

01:24:50   be by their decision, right? - Right. An authoritarian government, China, somebody else,

01:24:56   whatever, could say, "Okay, well, we're gonna do this and we're gonna provide you with a list of

01:25:00   hashes." But the list of hashes is not actually, even if they say it is, it's not actually just

01:25:05   child abuse imagery, it's not just CSAM, it's known images circulating in questionable political

01:25:16   groups and it flags them. That's the argument here. So let's say that China, for example,

01:25:21   comes to Apple and says, "We're gonna do this." Apple at that point either says Apple abides by

01:25:27   the laws in every country in which Apple participates, which is what they always say.

01:25:31   It's what they said when they added the, "Would you like to add these Russian apps to your phone

01:25:35   at startup in Russia?" It's what they say about China. It's like, "We follow the rules of the

01:25:40   local countries." Apple will refuse any such demands. And this gets back to my prior point,

01:25:45   which is the shield of child abuse is all Apple has here, which is to say,

01:25:51   if China wanted to use this feature for something other than child abuse, the story would be

01:25:59   China subverts attempts to stop child abuse in order to do whatever it wants to do, stop other

01:26:07   unrest in China. Is that enough? I don't think it is. I don't think the Chinese government would

01:26:13   necessarily care, but that's kind of it. If the Chinese government wants to put Apple on the spot,

01:26:20   Apple will either need to agree or Apple will need to basically pull the iPhone out of China and

01:26:24   lose a huge amount of money. Now, I think when we talk about Apple in China, and this is a whole

01:26:28   other big topic, but I think when we talk about Apple in China, what we often do is give China too

01:26:35   much power and Apple not enough. The truth is Apple being in China is really good for China,

01:26:40   too. It is a point of pride. The people in China love Apple's products. It's a point of

01:26:48   pride that Apple builds and assembles its products in China. It's a two-way street.

01:26:52   China doesn't want Apple out of the country, but this would be Apple will refuse any such demands.

01:26:58   It's like they're laying it down there, but what if those demands happen? What if it happens and

01:27:03   you have to abandon a market, China, Russia, wherever, you have to abandon a market because

01:27:08   the local regime says, "We got a hash of images for you and we want you to scan for it."

01:27:13   -Between Apple and China, though, we all know who's blinking first.

01:27:16   -I mean, at this point, yeah, I think, I mean, depending on,

01:27:19   yeah, yeah, I think so. I think so. You never know.

01:27:24   -Like, yes, they need each other, but China will get on just fine without Apple. They just will.

01:27:30   -Yeah. Well, you know, I would argue Apple will ultimately get on fine without China,

01:27:35   but they could really hurt for a while if they can't, you know, be participating in China.

01:27:40   -Like, make their products? Yeah, that's going to be pretty bad for a bit.

01:27:43   -Yeah, it'll be tough. So this is why the bluster here is fascinating to me. Apple will refuse any

01:27:48   such demands. They're basically saying, "Go ahead and call our bluff. Go ahead and take this feature

01:27:54   that's about protecting children and turn it into a tool for an authoritarian state to analyze its

01:28:00   citizens. Go ahead and try us." The problem is I can think of one country that could go ahead and

01:28:06   try them, and it would be very difficult for them to refuse that demand. -I think the thing that

01:28:11   frustrates me quite a bit, like, and again, like, I'm just looking at this from my, like, common

01:28:17   sense look at everything that's being said and written about. Apple say, like, the

01:28:22   only thing this technology can be used for is CSAM detection. -That's not true. -But that's a lie,

01:28:28   right? Because all it's doing is like looking at the hashes. -Yes. -But you can hash anything.
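Myke's point here, that the matching layer only ever sees hashes, not what they represent, is easy to make concrete. Using the same illustrative SHA-256 stand-in for a perceptual hash (and placeholder byte strings, not real content), any collection of images yields a list the identical pipeline can match against; the restriction to CSAM lives in policy, not in the code.

```python
import hashlib

def build_hash_list(images):
    """The matcher never sees what the images depict -- any collection
    of image bytes produces a usable hash list."""
    return {hashlib.sha256(img).hexdigest() for img in images}

def matches(image_bytes, hash_list):
    """Identical matching logic regardless of which list is supplied."""
    return hashlib.sha256(image_bytes).hexdigest() in hash_list

# A child-safety database and a hypothetical database of dissident
# imagery are indistinguishable to this code; only the provenance of
# the hash list differs.
safety_list = build_hash_list([b"known-csam-placeholder"])
meme_list = build_hash_list([b"tank-man.jpg", b"winnie-the-pooh-meme.png"])
```

Swap in a different hash list and the same scanner flags a different category of image, which is exactly the scenario the China discussion below turns on.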

01:28:36   -Exactly. -And I just do not find it acceptable to say this. -Their hedge against it is the human

01:28:42   review, right? But again, if the human review is in a place or for, you know, is subverted itself

01:28:48   in any way, then you're done, right? The technology can be used for whatever. It is built to only be

01:28:54   used for this. But I think that's absolutely right. Now, what this does, because it's about hashes,

01:28:59   it's not gonna, like, use an ML model to find, you know, people who are speaking out against the

01:29:05   government. But if you've got a bunch of photos that are circulating in your, you know, subversive

01:29:14   circles in your country, you put those in, right? You put those in the memes and, like, the example

01:29:21   people give is, like, the tank man image in Tiananmen Square or Winnie the Pooh memes,

01:29:28   stuff like that, right, in China. They put that stuff in there, and they're gonna, and basically,

01:29:33   the idea would be, we can find people who are not thinking properly about the regime, and we

01:29:39   can capture them and do something to them. And, like, this technology could do that if it was used

01:29:46   in that way. And what Apple's really saying is, by policy, we are not going to use it that way,

01:29:52   which is not the same as it can't be used that way. - And that's exactly what it is, right?

01:29:57   This isn't a technological enforcement. It's a policy enforcement. And I don't think, personally,

01:30:05   that's good enough. And this is where I struggle so much on this. I cannot tell you how much I want

01:30:14   the people that circulate this kind of imagery to be removed from society and given the help that

01:30:20   they need, right? And I know that maybe some people would find that even, like, that second part of

01:30:27   what I said to be weird, but I feel like you've gotta do both parts of this, I think, 'cause,

01:30:33   I don't know, it's tricky, right? - Well, this is the, again, I'm gonna bring us back to the

01:30:40   spectrum, right, which is catching bad people and tools to spy on a mass population by,

01:30:50   I've been saying authoritarian regimes, but you mentioned the Patriot Act. It's, like,

01:30:54   by anyone, by any government, for any reason. And those are, they seem like polar opposites,

01:31:04   but the poles wrap around. 'Cause essentially what you're doing is saying, "Society has deemed this

01:31:10   kind of material bad, and we wanna look at what people have on their devices and find them if

01:31:18   they're uploading this stuff and stop the bad people." And, like, it's all, and then it's all

01:31:28   about how it's used, which is why all the slippery slope arguments exist, right? This is the Edward

01:31:32   Snowden, you know, statement that he made, which was, "No matter how well intentioned," and I think

01:31:37   that's right, 'cause I think it is well-intentioned, "Apple is rolling out mass surveillance."

01:31:41   And it's like, okay, it's a little overheated 'cause of the way it's done with the hashes,

01:31:46   but, like, it is, you know, it can be used for good and evil. It's just a tool,

01:31:55   and you built it for good. It can be used for evil. I will go back to why they built it where

01:32:01   they did. I feel like this is Apple's compromise. Apple's compromise is, "Don't use iCloud, and we

01:32:11   won't spy on you." That's the compromise at this point. Now, you could argue, like, "Well, what

01:32:16   will happen if a government said, 'We want you to scan everything that goes in your device.'" And I

01:32:20   do actually think that Apple would walk away at that point. I do think that there are limits to

01:32:24   what somebody, even China, that has the most leverage over Apple, I do think that there are

01:32:28   limits to what even China could make Apple do with its products. But that's why I think they

01:32:35   positioned it where they have, is if it does ultimately get subverted, there's still an out,

01:32:43   which is, "Don't sync it with the cloud." Unfortunately, that's also an out for the

01:32:49   people who use CSAM content in their photo library. So again, and this is the, I think

01:32:56   it's the struggle maybe even of our era between authoritarianism and people who want freedom from

01:33:02   big groups, is this, which is, we can stop crime and make everybody happier by having a panopticon,

01:33:16   having everybody, everything that everybody does is watched. And don't worry, it won't be people.

01:33:21   I just read a book about this, actually, a novel that I don't recommend to anybody because it's

01:33:26   very long and very dense, but I loved it, called, I'll mention it if you want to inflict it on

01:33:32   yourself. It's Gnomon by Nick Harkaway. It's 700 pages and super dense and I loved it. But

01:33:38   what it's about in part, because it's a very long, dense, Pynchon-esque kind of novel,

01:33:44   is about the UK in the future being a machine learning police state. And the idea is there's

01:33:51   no longer people watching you, but the machine is watching everyone everywhere. And isn't it great?

01:34:00   Everybody's happier. The machine can stop crime and the machine can give you advice about how

01:34:05   to be happier and all of that. Well, yes, but also if that machine, that machine can, whatever

01:34:11   that machine has decided is bad, can't happen anymore. That's the ultimate slippery slope

01:34:18   argument here and I see it. And it's a tough one because the more freedom you give, it's like Apple

01:34:26   with the FBI. Like the more freedom you give, law enforcement's like, but no, we want to see

01:34:31   because we need to find the bad people. And the counter argument is, yeah, you say you want to

01:34:35   find the bad people, but who's going to stop you from finding other people? And maybe these people

01:34:40   aren't bad. Maybe you have a new set of bad people who aren't bad, but you want to find them anyway,

01:34:45   for your reasons. Like that's, this is the struggle I think of our era, both politically

01:34:51   and technologically. I don't want this to exist, right? Like I don't want this stuff

01:34:58   to exist in the world. I don't want it to remain unchecked. CSAM, right? You don't want to,

01:35:05   like the idea, the idea that these devices are being used as a safe harbor for this kind of

01:35:11   material. Yeah. I don't want that, right? No, who, you know, nobody does. That's the risk. Nobody does.

01:35:19   But I think it's really tricky to balance this against the potential of the security of every

01:35:27   single iPhone user on the planet. Because like, this is a slippery slope. Like this is just

01:35:36   a start. Like why would this be the only thing? Why would this be the only thing that is imaginable?

01:35:46   My understanding, by the way, another thing that I've seen in these stories is there's actually

01:35:51   kind of an understanding that lots of other cloud photo and storage services, they're already doing

01:35:58   this. They're already scanning. Apparently Apple was already doing it, right? Like there was a

01:36:02   report that somebody at Apple said they scan uploaded images. So the idea here is that if you encrypt them,

01:36:07   then you need to scan them before you do it. But like, this is not a new thing. It's not like Apple is the

01:36:11   first crack and now the dam is going to burst. This has been going on, right? Um, it's not new.

01:36:17   This stuff, this, this stuff has been scanned, but I think the people at places like NCMEC,

01:36:22   what they would say is that they're trying to eliminate more safe harbors for this stuff,

01:36:28   and that this is a place where stuff is getting stored. To which I would counter: yeah,

01:36:33   but are they really uploading it to iCloud? Yeah. But like Apple's created a safe harbor.

01:36:37   It's called your device, right? Like you can keep it on your device and no one will ever know about

01:36:43   it. But like, my point is like, this is the first time this has happened. I could imagine

01:36:48   a couple of years ago, uh, us saying, and Apple saying, we would never do something like this.

01:36:54   Right. I feel like that's not unfair to say. Like, if you go back to the FBI,

01:36:59   San Bernardino thing, I feel like Apple of then would never have created a backdoor into

01:37:03   their devices. That was the whole point of that. We don't create any backdoors.

01:37:08   It's not, so this isn't a backdoor, but it is a, I mean, unless you view the,

01:37:16   the NCMEC hashes as a backdoor, in which case it kind of is. But, um, yeah, I think,

01:37:24   look in the end, we don't know why Apple's doing this. Although we have lots of suggestions that

01:37:28   there is something happening here. They're motivated probably by some sort of external

01:37:31   threat that the idea, they either want to do something that they can't do until they build

01:37:35   this or they know that they're going to be required to build this or something like it.

01:37:39   And they want to build it, I would argue, my guess is, preemptively, in what Apple considers the

01:37:44   right way, instead of being told to build it a way that they're not comfortable with. That seems like

01:37:48   a very Apple thing to do, which is like, we're going to mandate that you do it this way. And

01:37:51   Apple's response is, wait, wait, wait, wait, wait, let us come up with a better way of doing

01:37:56   what you want that we feel keeps things private. And whoever is behind this is like, okay, you know,

01:38:02   all right, that's fine. So like they're doing it their way. Um, but the problem that we always get

01:38:10   back to, and I think this is fundamentally, and there's no answer to this is Apple has built a

01:38:14   tool that is being used for good, but tools can be misused. That's it. This is coming off that,

01:38:22   what is it? Pegasus, Pegatron... Pegasus, right? No, I'm being serious, is it Pegasus? That spying

01:38:30   software thing? Uh, Pegasus anyway, uh, it's Pegasus. Yeah. Pegatron is a different thing,

01:38:41   right? It's a Taiwanese manufacturer, like, uh, like, uh, what's the one that

01:38:46   Apple uses? Foxconn. Anyway, Pegasus. Yes. Wasn't it expected that it was completely impossible

01:38:54   for anyone to do that to an iPhone? Yeah. It comes back to my, like this thing that I have,

01:38:59   I think I've said it a bunch of times before. If a human makes it,

01:39:03   a human can break it. It's as simple as that. Right. There are always holes in these systems

01:39:10   and that's just like another part of it that makes me uncomfortable. There's now this thing

01:39:13   that can look at every photo. Now, you can tell me what Apple wants to put into it.

01:39:19   Fine. But there's a thing that can look at every photo and it can assess them and it can put a

01:39:24   little cryptographic signature on it. Here's another way where again, I think all these

01:39:29   arguments are valid and, and we need to consider all of them, but I will throw this out there,

01:39:34   which is I think maybe the difference here is that Apple is telling everybody.

01:39:40   Yeah. Yeah. Right. Better to say than not to say. Right. Because here's, here's the thing.

01:39:44   There's a lot of surveillance going on already in a lot of different ways. And a lot of companies are,

01:39:51   are, are, uh, complying to do it. So on one level, it's kind of refreshing that Apple's like,

01:39:59   this is what we're doing. I wish they would have done it differently though. Right. Like

01:40:03   we said it already, like they really bungled this one. They should have done these two things

01:40:07   separately. It would have made it a lot easier. Well, whenever you have to post an FAQ days after

01:40:12   you made your announcement, because there's been a whole, like you blew it. Like, yeah,

01:40:16   we didn't even get into like the way this was rolled out and the fact that they put,

01:40:20   that NCMEC put out this press release, that's just like this incredible, like patting itself

01:40:24   on the back and saying like, you know, people who don't like this are just furthering crime. And

01:40:29   it's like, Oh boy, uh, who are these people? But like, yeah, the rollout was bad. It was used too.

01:40:36   It's bad. It's very, very bad. Very bad. This is a, this is a hard thing. Well,

01:40:40   yeah. Well, in modern times, if you have had to create an FAQ because Qs have been F'd,

01:40:46   right. Cause FAQs now tend to be created before, right. Cause it's just like, I assume

01:40:54   you anticipate the questions. No, that's, that's, that's, this is, we've learned a new,

01:40:59   this is a new little upgrade, a tidbit for everybody, which is, you know, if you have

01:41:03   to build an FAQ, your rollout was F'd. That's, that's what it is. The F's have been A,

01:41:08   cause they just frequently anticipate the questions. To the Q. But the other thing that

01:41:16   has really made me uncomfortable in the last few days is realizing how much power these technology

01:41:21   companies have in our lives now that they are actually law enforcement. They're not just

01:41:25   computer manufacturers anymore. Apple is attempting to enforce law, right. That they are doing this,

01:41:32   whether, and this is the way that Apple has decided to enforce the law. They have not been told they

01:41:38   have to do it this way. They've been told they have to do something. Apple's interpretation is we

01:41:42   will enforce laws this way. And it's like, Oh my God, thank you police. Like it's like, Oh,

01:41:49   all right. Like all of the technology companies are enforcing laws in the ways that they want

01:41:54   to enforce them and then pass that information over to law enforcement. The truth is,

01:41:58   this is a consequence of the fact that our law enforcement system is based on the real world

01:42:05   and they patrol our streets and they visit our houses and they go knock on the front door and

01:42:14   they do whatever they, or they knock it down. All of those things, right? The problem is,

01:42:18   is that so much of life now, maybe even most, but certainly a lot of life now is in servers in the

01:42:28   cloud and devices on, you know, on the internet and in our devices. And the problem is that our policing

01:42:34   isn't made for that. Our laws aren't made for that. This is something we talked about with the FBI

01:42:38   stuff with the San Bernardino shootings, right? Like we, and if you, if you followed any of the

01:42:46   GamerGate stuff, like people would make reports to police and the police are like, I don't know,

01:42:53   it's the internet. We're not, we don't, we don't police the internet here. And it's like, I know,

01:42:58   yeah, I know you don't. But, but that's the problem is, is nobody does and somebody needs to.

01:43:04   And, and you probably should be the ones to do it because you're law enforcement, but you're not.

01:43:09   And what it ends up being is these territories that are so important to our society,

01:43:14   the owner quote unquote of, you know, to a certain degree, anyway, builder slash owner,

01:43:20   depending on where you are in the chain is a tech company. And so we're put in this position where

01:43:25   it's like, okay, you said that, that Apple is now law enforcement. It's like sort of, or you could

01:43:31   say they're the owner of a large amount of real estate that law enforcement has decided they're

01:43:36   that, you know, they need to patrol and Apple can't refuse them because, and they may not have

01:43:43   wanted to be that, but that's what they are. And that goes for all of that goes for Apple and

01:43:47   Google and Facebook and everybody else. Like they don't want to be, well, I should say they don't

01:43:54   want the responsibility of being the owners and operators of a huge portion of the territory of

01:44:02   our lives, but they've, but they've made a lot of money from being that. And this is the other part

01:44:07   of it is that they actually do have to have the responsibility for this stuff. And the law

01:44:11   enforcement agencies are going to come to them and these thorny problems are going to happen and

01:44:16   they can't run away from it. So this is an interesting example, whatever you think of it,

01:44:20   of Apple trying to find a way through that is not so bad. But I think we hear a lot from the,

01:44:30   because this is kind of a win for law enforcement. We are hearing a lot from people like Edward

01:44:34   Snowden and the EFF. Again, watch for it. Somebody is going to say that this is a victory for people

01:44:41   who use CSAM because Apple's not scanning everything on your device and there's an easy way

01:44:45   to turn it off. That will also be an argument. And we may in the tech side, we're not hearing

01:44:49   that argument, but mark my words, that is going to be an argument that this doesn't go far enough.

01:44:54   And that argument will always be there, which is why there's always the potential for

01:44:59   tools like this to be used in ways in which they weren't intended. This episode of upgrade is

01:45:05   brought to you by DoorDash. Dinner? Check. Deodorant? Check. You got that morning pick me

01:45:10   up? Check. Everything you need, wherever you need it with DoorDash. DoorDash connects you

01:45:15   with the restaurants you love right now, right to your door and also the grocery essentials that you

01:45:20   need too. You can get drinks, snacks, and household items delivered in under an hour, as well as maybe

01:45:26   your dinner or your lunch. Ordering is easy. You just open the DoorDash app, choose what you want

01:45:30   from where you want, and your items will be left safely outside your door with their contactless

01:45:35   delivery drop-off setting. With over 300,000 partners in the US, Puerto Rico, Canada,

01:45:41   and Australia, you can support your local neighborhood go-tos or your favorite national

01:45:47   restaurant chains like Popeyes, Chipotle, and the Cheesecake Factory. Jason Snell, can you tell

01:45:53   Upgradians about your DoorDash experiences? Sure. Also shout out to Puerto Rico, which is

01:45:59   the United States, but I like that they mentioned Puerto Rico too, because

01:46:04   sometimes they get left out, sometimes they don't. I hear from Puerto Rico every now and then, they're

01:46:08   like, "Hey, don't forget us." I don't forget you. The DoorDash stuff is great because, as I've said,

01:46:16   my method is don't order hungry. It's a classic. You order in advance, it shows up at your door.

01:46:21   My daughter's driven for DoorDash. It was great during the pandemic. Some places, pandemic's still

01:46:25   rising high. You don't want to go outside. You don't want to see people. Bring it on,

01:46:31   whatever you want. We have great restaurants in our town. Also places that we don't usually go to

01:46:36   because they're a little bit further away, and DoorDash will handle that too. It's like, "Yeah,

01:46:40   I'll drive down the street, but I'm not going to drive up the freeway a couple of exits."

01:46:43   DoorDash will handle that too, so super convenient. For a limited time, listeners of this show can get

01:46:48   25% off and zero delivery fees on their first order of $15 or more. You just download the

01:46:54   DoorDash app and you enter one of these codes. If you're in the U.S., it's UPGRADE2021. If you're

01:46:59   in Australia, it's UPGRADEAUS. That's 25% off, up to $10 in value and zero delivery fees on your

01:47:06   first order. When you download the DoorDash app from the App Store, enter the code UPGRADE2021

01:47:11   if you're in the U.S. and UPGRADEAUS if you're in Australia. One last time, UPGRADE2021 for the U.S.,

01:47:17   UPGRADEAUS for Australia. For 25% off your first order with DoorDash, subject to change,

01:47:23   terms apply. Our thanks to DoorDash for their support of this show and Relay FM.

01:47:29   Let's do a palate cleanser of a few #askupgrade questions before we round out today's episode.

01:47:35   The first comes from JD who asks, "What feature of Monterey do you think that you'll be using the most

01:47:40   even if you don't like it?" Do you have any thoughts? So my initial thing on this,

01:47:46   I've not used Monterey yet. My beta experience is still maintained just to my iPads. I've not yet

01:47:51   put it on my phone yet. The reason I haven't put iOS 15 on my iPhone

01:47:58   is that if Apple continues to change Safari and I never have to deal with the problems that people

01:48:03   like Federico are going through in trying to use Safari, then that'll be great for me. I will have

01:48:08   never had to endure what is happening with Safari on iOS. However, I know that already on my iPad,

01:48:16   I love tab groups. I think it's a great feature. I have it set up really well. I like using it.

01:48:22   And I feel like on Monterey it's going to be just as useful as it is on my iPad. So that is a

01:48:29   feature that I know I am going to really appreciate and enjoy from Monterey. Shortcuts. Shortcuts.

01:48:36   Oh, I forgot about shortcuts. I forgot about shortcuts. Yes, shortcuts too. I'm going to use

01:48:41   that all the time. I can't wait. I was talking with a developer friend of mine who makes a really

01:48:46   great app that I use and love very much. And he was saying that he wanted to put shortcuts into

01:48:51   the app and because it was a Catalyst app, it's already done. So he's very excited about that.

01:48:57   And that wasn't James Thomson? It wasn't James Thomson. No. You would have mentioned it if it

01:49:01   was, I assume, because he's a friend of the show and listener to the show. Yeah. And I spill all

01:49:04   of James' secrets. You know, that's true. James is doing some really interesting stuff right now

01:49:11   that I'm very excited about, which would appear to be multiplayer in Dice by PCalc. Yes. Yeah.

01:49:18   That's really, we, I talked to him about that a long time ago and he was like, that, that is very

01:49:23   hard. I don't think I'm ever going to do that. And then all of a sudden he tweets a thing, which is

01:49:26   like, oh, look, I'm using Game Center to have a shared table where people can roll dice. I'm like,

01:49:31   oh my God, there it is. Plus he's doing, he's built all that AR stuff. So you can roll dice on

01:49:37   like a real table. I did that and they'll even fall off the table. The dice will, like real dice,

01:49:42   fall off the table. That is so impressive, by the way. It's amazing. If you haven't checked

01:49:46   that out, the AR mode in Dice by PCalc, I think it's still in beta. Like it's, it's in the app,

01:49:51   but it's, you know, James is still working on it. If you have a LIDAR sensor on a device,

01:49:57   it's really incredible that you can, you can, and so there's a few things I like about it. One,

01:50:02   you can have like the dice tray on the, on a table, throw the dice, the dice can jump out the

01:50:07   dice tray. That's a setting that you can turn on and it will fall off the table. And then also if

01:50:11   you throw a lot of dice down and you bring your iPhone down to the AR, you can push the dice

01:50:15   around with your phone. It's bananas. It's so good. I love it. Check out Dice by PCalc.

01:50:21   Sure. Not a sponsor, just a friend of the show and not the one Myke was talking about.

01:50:26   No, but maybe. It wasn't, but probably also applies. Matt wants to know, would you want

01:50:33   Apple to make a multi-socket mains adapter? This is, there is a huge third party market for this,

01:50:39   but would you have thought- Hello, England. That's like electrical outlet.

01:50:44   I don't know. I don't know where Matt's from, this person. I don't know. And I don't think I

01:50:48   would call it mains. Mains is not a word that yeah, I would ever use. Power adapter. Sorry.

01:50:54   I know that it's impossible for Americans to understand what I'm saying. So I will say power

01:50:59   adapter instead. Thank you. There is a huge third party market for this, but you would have thought

01:51:04   Apple would maybe want a slice of the pie. You got a Mac book, iPad, iPhone, AirPods, Apple watch,

01:51:09   probably going to need to plug at least a couple of them in to power. To power, to the mains.

01:51:14   Maybe Matt's on a ship or something. Like you have to tap into the mains and hoist the mains sail,

01:51:23   which is an electrical sail, I believe that's how that works. I'm going to let you get this out of

01:51:27   your system and then eventually we'll get to answer the question. I just bought another one of these

01:51:32   Belkin, I think it's Belkin adapters where it's a big brick with a bunch of ports on it.

01:51:38   I think the reason Apple wouldn't make it is because they, I'm not sure they could add a

01:51:45   lot of value and because they're kind of inelegant because it's just a whole bunch of cords coming

01:51:49   off of them and they prefer these sort of like slightly more elegant flat things. Although they

01:51:54   did make that weird, you know, inelegant charger thing. But I don't know. What I wanted, I'm

01:52:02   surprised they haven't made it just because those things seem to sell pretty well and they could

01:52:06   make one that was, you know, priced much higher than the others and sell it in the Apple store

01:52:09   and then I probably wouldn't buy it because there were cheaper ones. I don't know. I think it might

01:52:16   come down to that Apple's got other fish to fry and that this, they can't see how this is going

01:52:21   to be better than just letting the Belkins of the world make these things. I just bought a great

01:52:26   product that I'm very happy about for this kind of purpose. It's made by Anker and it's one of those

01:52:32   GaN chargers. So they're like way more powerful and small. And Apple isn't using this technology yet.

01:52:39   I think that they may wait until they can do this kind of thing where you can have much more powerful

01:52:43   chargers in a smaller form factor. I have a couple of those things that look like the little square

01:52:48   chargers that they do for the iPhone in the US but it's USB-C and it's got way more power. And the

01:52:55   reason I bought this is because I wanted one thing that I could charge an iPhone, an Apple Watch,

01:53:03   and an iPad Pro from. And you can do that with these things. So I mean, I don't know if they

01:53:08   would do this but I am at least looking forward to the day when Apple gets on the GaN train.

01:53:13   Not that they would ever include that in the box, you know, because they don't do that anymore.

01:53:17   Right. But like the super awesome charging thingy. Yeah. I mean, don't get me wrong. I've got one

01:53:24   that goes into the wall that is like not on a plug but the whole brick just goes into the wall.

01:53:29   That's got a USB-C and one USB-A, I want to say. But like, I could see Apple making a product

01:53:38   like that that sort of like charge all your things at once. But again, can they really add value?

01:53:43   I'm not sure they can. Maybe one of the reasons they stopped putting the chargers in the box is so

01:53:47   they could move to this technology. Maybe. Maybe. Amalie asks, can I get an official ruling on

01:53:52   wearing my Summer of Fun merchandise in the fall? It started to arrive. I've been very happy to see

01:53:57   Upgradians taking pictures and sending them to us. Yes. Tank tops are out there now. Very good.

01:54:03   I mean, what I'll say is, summer goes on longer than you'd think, right? In the Northern Hemisphere

01:54:10   it goes on until the middle of September, toward the end of September. So there's more time out

01:54:14   there. And I would say really, if the Summer of Fun keeps you warm in the fall and the winter,

01:54:21   then, you know, Summer of Fun lives on in your heart. Summer of Fun's a state of mind, man.

01:54:26   Yeah, that's right. Also, it's very hot here in October and I will consider it the Summer of Fun

01:54:33   even then. So there. Thank you so much to everybody who sent in a #AskUpgrade question. If you would

01:54:38   like to do so, just send out a tweet with the hashtag #AskUpgrade or use ?askupgrade

01:54:42   in the Relay FM members' Discord, which you get access to if you sign up for Upgrade Plus.

01:54:46   Go to GetUpgradePlus.com and for $5 a month or $50 a year, you will get access to tons of great

01:54:53   benefits for being a Relay FM member and also ad-free longer versions of every single episode

01:54:59   of Upgrade. Thank you to everybody that helps support the show by doing so and also thanks to

01:55:04   DoorDash and Pingdom and ExpressVPN for the support of this show. Before we go, let me tell

01:55:10   you about another show here on Relay FM. Material's hosts, Andy Ihnatko and Florence Ion, are veteran

01:55:15   technology journalists with plenty to say about what's going on at Google. Follow Google's journey

01:55:20   with them at Relay.fm/material or search for material wherever you get your podcasts.

01:55:26   If you want to find Jason online, you can go to SixColors.com. You can also find Jason, he's @jsnell

01:55:32   on Twitter, J-S-N-E-L-L. I am @imyke, I-M-Y-K-E, and Jason and I host many shows here on Relay FM

01:55:40   as well if you're looking for something to listen to. If you made it through this entire episode,

01:55:45   thank you so much for listening. I know it was a difficult one. Fun will hopefully resume next week

01:55:52   on Upgrade. Thanks so much for listening. Until then, say goodbye Jason Snell.

01:55:57   - Goodbye everybody.

01:56:11   (upbeat music)
