443: A Storm of Asterisks


00:00:00   I don't know, I feel like 2021 has been such a whirlwind, which I'll take over the complete

00:00:04   shitshow that was 2020, but sure, but that's setting a pretty low bar. I mean, that's not really like,

00:00:10   it's like this is, this is better than like actual hell, like in a fire, literally. Yes. Like the

00:00:20   actual world of actual hell, like where you are just burning in, you know, a hellscape forever,

00:00:25   like an actual fire, your body's being torn apart, you're being tortured, you know, you have to

00:00:30   listen to Dave Matthews Band, all of that. Oh, can you imagine? So 2020 was indeed not as bad as that,

00:00:35   but yeah, definitely not a ton better. So I have to issue a plea. Can somebody at Apple please fix

00:00:46   the wonderful Crash Test Dummies album God Shuffled His Feet in Apple Music?

00:00:52   I know we've been down on Apple Music recently. Oh, I have so many thoughts, which I'm not going

00:00:57   to get into now. We have too much to talk about, but God, what a piece of crap Apple Music is.

00:01:00   I know, just normally I don't have too many problems with it because I'm a pretty light user

00:01:04   of it really, but I just, I asked my HomePods this morning to play the Crash Test Dummies album

00:01:10   God Shuffled His Feet, the one with, you know, that one, it's a big album from the, from the 90s.

00:01:15   Very well done, very well done, Marco. Thank you. If I had more time, I would do it the right speed,

00:01:18   but anyway, you're all listening at 2X anyway. This is one of my favorite albums. It's, it's,

00:01:23   liking Crash Test Dummies is, is like one of the weirdest things you can, you can be if you're not

00:01:27   Canadian. Like they were, they were really big in Canada. They were not at all big in the US except

00:01:31   for that one song. And if you like Crash Test Dummies, it's a very weird band to like because

00:01:35   every album is radically different than the other albums. And what they did after this, they had like

00:01:41   one album I liked, A Worm's Life, and then everything after that, I'm like, I'm out.

00:01:45   It got really weird. But anyway, this album, it's again, one of my favorite albums. And I listen,

00:01:52   I asked Siri, you know, "Hey, play, play this album." And that worked. You don't, you don't

00:01:56   have to like do anything weird to have it play a whole album in order. Like you can just say,

00:01:59   "Play the album named blah, blah, blah." And it says, "Okay." So great, plays the first two tracks.

00:02:03   Great. The third track, it switches to a live version of it. And then like the next, like a few

00:02:11   tracks in a row were the live version from some live album I've never heard of and don't own.

00:02:17   Then after that, it switched back to the studio version for a track or two,

00:02:21   then back to a live version, and then back to the studio version for the last track.

00:02:24   Cool.

00:02:25   Now, the really funny part of this was when I looked, like, on my iPad at the Now Playing in

00:02:31   Control Center for what's going on on your HomePods, which, I love this integration. I've

00:02:35   said it before, this is one of the best reasons to use Apple Music and AirPlay 2, because you

00:02:39   get this integration that's wonderful where you can interact with what's playing on your

00:02:43   HomePods or whatever from your phone and from any iPhone or iPad on your network, which is great.

00:02:48   Anyway, so I checked that and it's showing the studio album as the now playing. So Apple Music

00:02:55   doesn't think it's playing a live version, but it totally is. And then just before the show,

00:02:59   I'm like, "Let me just double check. Maybe this was something weird today." So just before the

00:03:02   show, I went on my phone to see like, what does my phone version of Apple Music think that it's

00:03:06   playing? And it had the same problem where it was playing a mix of live versions and studio versions,

00:03:14   but it was a different set of tracks that was wrong on the phone versus what was wrong from

00:03:21   the HomePod earlier today. So please, Apple, I know there's got to be someone who works on

00:03:26   Apple Music who either likes this really weird band like the way I do, or maybe just is Canadian

00:03:33   and therefore is more likely to care about this band. But please fix the Crash Test Dummies album

00:03:38   because it's a really good album. And this is a really weird thing to be broken. I did also,

00:03:43   I even checked Spotify to see like, maybe they did some kind of weird reissue of it for weird

00:03:48   contractual reasons. Nope, Spotify version is perfect. Of course it is. And I own this CD. I

00:03:54   ripped the CD into iTunes forever ago. I have it on my computer, but there's no way, because I have

00:04:01   it in iTunes/Music on the Mac. I have my version there, so it plays correctly because it's local

00:04:07   files. And I have iTunes Match and I have Apple Music. But apparently there's no way for me to

00:04:13   play my copy of it on my phone anymore. Like I can only play the Apple Music copy, which is broken.

00:04:19   So please, Apple, fix the Crash Test Dummies. You know, we sometimes make fun of the fact that

00:04:27   our App Store apps have like an artist and album field because, like, they repurposed the iTunes

00:04:32   Music Store to make the App Store, right? And the underlying database schema dates back to

00:04:37   iTunes and all that stuff. And it's kind of weird and awkward. Sometimes I think about when looking

00:04:42   at Apple Music or hearing complaints about it, just dealing with my own thing, that the iTunes

00:04:47   Music Store was purpose-built to be a music store. So it can't use the excuse of like, well, we were

00:04:54   just retrofitting onto an existing system we have for e-commerce essentially, right? And I don't know

00:04:59   about you, but I've been in the position many, many times across my career when I'm called upon

00:05:05   to essentially create a data model for a thing that doesn't exist. And if I was making Apple's

00:05:11   music store, granted, nobody can see the future and know whether it's going to be big or not or

00:05:14   whatever, but if I was given that task, "Hey, we're going to sell music over the internet. We need a

00:05:20   data model for this." It's kind of like the USB connector that I complain so much about: if you're

00:05:25   tasked with making a connector, spend five minutes with a whiteboard thinking about what are the

00:05:29   attributes of a good connector and write them down and see if you can hit some of those. I don't think

00:05:34   it's over-engineering or over-designing to think about, when making the iTunes Music Store in the

00:05:39   year that it was made, concepts that lead to, that could potentially lead to the problem you have

00:05:45   here. Like, for example, albums are released and then sometimes there is a remaster or a re-release

00:05:54   or an anniversary edition. Also, sometimes artists have "best of" collections, which include songs

00:05:59   from various albums, right? And I feel like one brainstorming session with anybody who has any

00:06:07   interaction with music will lead you to those things. It's not a huge schema. It's not thousands of

00:06:15   columns and dozens of tables that are interrelated. You could fit it on a whiteboard, but concepts

00:06:21   like that are super important. I run into this all the time because I have lots of versions of U2

00:06:25   albums. Maybe the iTunes store knows this, but if the iTunes store understands that my three

00:06:31   different copies of the Joshua Tree are in fact different versions of the original Joshua Tree

00:06:37   album from 1987, it is not apparent that iTunes understands that. But it's a really important

00:06:43   concept because then not only can you display that information and understand it, but then you can

00:06:49   avoid mistakes like this by saying, "Okay, you're playing this album. If you don't give me any other

00:06:55   information, I'll play the 1987 Joshua Tree, right? If you ask for the other ones, I'll play

00:06:59   that. But if I'm playing the 1987 Joshua Tree, just play the tracks from the 1987 version. Don't

00:07:10   get confused and switch to the remaster or the 30th anniversary edition or like just play it.

00:07:16   That's how you can tell. Don't try to match them up by like track name or title, or especially if

00:07:23   the remasters are just also called The Joshua Tree. I'm not asking a lot; again, for the people who sat

00:07:28   down to make this, that's got to come up in the first brainstorming session because it's a concept

00:07:32   that exists. And if you build that into the data model from day one, it makes writing the app so

00:07:37   much easier because say if someone's trying to debug this from Apple Music or whatever,

00:07:40   it can be confusing because the track names are the same and maybe the album name is the same.

00:07:44   And maybe especially with iTunes Match, where it's trying to look at your files and match them up

00:07:48   with the ones they have records of, and it's hard to know which one they're matching against.

00:07:51   This kind of metadata really helps. And so I do actually wonder what is the underlying data model

00:07:57   and how limited and dumb is it that errors like this come up all the time and there's apparently

00:08:02   no recourse for us to like, you know, fix it by changing the metadata.
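
The schema point being made here can be sketched concretely. Below is a hypothetical toy model, and in no way Apple's actual data model: every release is an "edition" of one canonical album, and tracks belong to an edition, so a request to play an album can never interleave studio and live versions. All of the type and field names are invented for illustration.

```python
from dataclasses import dataclass

# Toy sketch of the edition-aware schema described above. The point is the
# shape of the relationships: Album is the canonical work, Edition is one
# release of it, and Track belongs to exactly one Edition.

@dataclass(frozen=True)
class Album:
    album_id: int
    artist: str
    title: str            # e.g. "The Joshua Tree"

@dataclass(frozen=True)
class Edition:
    edition_id: int
    album_id: int         # foreign key to the canonical Album
    label: str            # e.g. "Original", "Remaster", "30th Anniversary"
    year: int
    is_default: bool      # what a plain "play the album" request means

@dataclass(frozen=True)
class Track:
    track_id: int
    edition_id: int       # tracks key off the edition, never the album title
    number: int
    title: str

def tracks_for_playback(albums, editions, tracks, artist, title, label=None):
    """Resolve a request like "play The Joshua Tree" to one edition's tracks."""
    album = next(a for a in albums if a.artist == artist and a.title == title)
    candidates = [e for e in editions if e.album_id == album.album_id]
    if label is not None:
        edition = next(e for e in candidates if e.label == label)
    else:
        edition = next(e for e in candidates if e.is_default)
    # Because tracks reference edition_id rather than being matched by name,
    # the playlist can never mix versions from different editions.
    return sorted((t for t in tracks if t.edition_id == edition.edition_id),
                  key=lambda t: t.number)
```

With a model like this, asking for the album with no edition specified always resolves to the default edition, and nothing is ever matched across editions by track title, which is exactly the failure mode being complained about.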

00:08:06   That's very true. Since we're all filing Apple Music radars, let me file one verbally as well.

00:08:12   I was listening to the aforementioned Illusion is Nowadays, the new album, I believe it was on

00:08:16   my computer the other day. And it would play most of the album until there was about 45 seconds of a

00:08:25   song left. And then the audio stopped. It's still playing, allegedly, but the audio stopped. The

00:08:33   timer is still, or the counter, the play counter, whatever, the time is still ticking up. And no

00:08:39   music is coming out of my computer speakers. I advance to the next track immediately. Music is

00:08:44   coming from my speakers again. And then until about 45 seconds before the track ends and then

00:08:47   it all stops. I'm on wired Ethernet on a symmetric gigabit connection. There is no reason that this

00:08:55   should not be working, but here we are. So yeah, Apple Music not going well for Casey right now.

00:08:59   I'm just going to say that and I will try to leave it at that because we have a lot to talk about,

00:09:03   starting with some follow-up. Jon, this first piece of follow-up is for you.

00:09:07   Are you just trying to avoid pronouncing Tatsuhiko Miyagawa's first name and

00:09:10   last name?

00:09:11   That is exactly correct because I did not have the time to practice and I thought,

00:09:14   you know what, this is what you put in, you can do it. Thanks, bud.

00:09:17   I know this person from the internet and Perl. So I had to practice. All right, so

00:09:22   last show I was trying to think of some kind of interview where some Apple executive tried to give

00:09:28   an explanation of why there is no weather or calculator app on the iPad. And apparently it

00:09:33   was an interview with Craig Federighi by MKBHD. We will have a link in the show notes to the

00:09:39   timestamp offset where you can hear his answer. And I had said last show that it wasn't a very

00:09:44   good answer. It's not. I mean, it's a, you know, public relations answer where you have to try to

00:09:49   make a reason that makes you seem good. And CFED's answer was like, well, we don't want to do those

00:09:54   apps unless we can do like something really special. Like we have a really good idea. We

00:09:58   really want to do them right and well. And on the one hand, it makes you think like you wouldn't say

00:10:02   that if you're a savvy Apple executive, you wouldn't say that unless there was actually some

00:10:05   kind of internal project to make a really good fancy iPad weather app and calculator app. Because

00:10:12   otherwise it sounds like, oh, we never want, we didn't want to do it unless we could do something

00:10:15   really special. You're setting yourself up for criticism if you ever release one that's just an

00:10:19   enlarged version. Cause what do you say then? So it makes me think that maybe there actually is

00:10:23   a very low priority project or two inside Apple to make these versions of the apps.

00:10:27   But the second problem with the answer of course is people don't care if it's something special

00:10:30   with the iPad, just make the app so it exists. Like just make, make the iPhone app bigger. It's fine.

00:10:35   Like people just want it to be there, especially calculator. Like we really want to do something

00:10:39   special. Oh really? With the calculator? How about having buttons you can press to add numbers

00:10:42   together? Like it's not rocket science. Well, and I feel like that's kind of a BS excuse too,

00:10:46   because you look at something like, like the Clock app. Originally there was no Clock app on the iPad;

00:10:51   that came later. Did they do something really special with it? Yeah, they just blew up the iPhone

00:10:56   version. It's fine. Like there's nothing, which is fine. Right? Like that's what we needed. Yes.

00:11:00   It's like, you don't need to do it. Like that's, that, that to me was a BS excuse. And the funniest

00:11:04   thing was they just redid their weather app for iOS 15 and there isn't an iPad version of that.

00:11:10   And they made it really cool. And I think if you took the iOS 15 weather app and just made it

00:11:15   bigger, it would still be a really cool weather app. It's not like it gets worse. Like I understand

00:11:19   the idea of like, Oh, I mean, especially back in the early days, it was like, if you can't think

00:11:25   of some way to add a sidebar to your app on the iPad, you're not really going iPad native. Like

00:11:29   don't just take your phone app and stretch it. Like it was a criticism of a lot of the Android

00:11:32   tablet apps. It was like, Oh, it's just the phone app and bigger. And that's true. You shouldn't

00:11:36   just take your phone app and make it bigger, but it's also true that people come to expect a

00:11:41   certain baseline set of functionality. Apple has trained them to expect this because it's available

00:11:45   on the phone. Uh, and at a certain point, it's better to have a calculator app than to have a

00:11:50   really fancy one that takes advantage of the screen space and has like scientific calculations

00:11:54   and reverse Polish notation and 10 memories and a graphing function. Like that's great if you want

00:11:59   to make that app, but you can also just make the calculator and have it be a little bit bigger and

00:12:03   people will be fine with that. Again, they're getting it for free with the, with the iPad.

00:12:07   You know, I, if you can't think of some way to put it in like a sidebar or like a persistent tape in

00:12:13   your calculator, it's okay just to make for the 1.0 a big calculator app and the weather app,

00:12:18   like I said, I think the graphics and fidelity and layout lend themselves well to an iPad size

00:12:23   screen. Yeah. Look at weather space dot. It looks just like Apple's but bigger. Right. Apple can

00:12:28   make theirs rotate to landscape and just blow it out of the water. You know, I think I've made this

00:12:32   joke already, but you know, if only Apple had some sort of cross platform framework that they

00:12:38   already wrote the weather app refresh in, in order to put it on the iPad. Like imagine if they used,

00:12:46   you know, like some sort of Swift thing that was built for user interfaces. I don't know why you're

00:12:51   trying to make this joke. You realize the iPad and the iPhone both use UIKit. Like that already is

00:12:56   the cross-platform framework. Like they have three different, like they can use UIKit, they can use

00:13:01   UIKit plus Catalyst on the Mac and they can use SwiftUI. Like they have so many options. Yeah,

00:13:05   exactly. They can even use Electron here. That's getting popular. We'll get to that later. Hey,

00:13:08   all right. Moving, moving right along. God, we are way behind already and we're only 20 minutes in.

00:13:13   All right. Uh, the AMD W6000-whatever video cards, or workstation cards, nobody cares, 'cause it's

00:13:19   Mac Pro stuff, moving right along. Oh, we so care. There's just one item of Mac stuff. Okay. So

00:13:24   there was some debate last time about whether Apple's graphics cards, these new AMD fancy ones

00:13:30   are the quote-unquote workstation cards and that's why they're so expensive. So, uh,

00:13:34   Hishnash says the AMD Pro W-whatever cards, the W apparently stands for workstation,

00:13:39   get the pro drivers. This unlocks some driver features and pathways when running Windows

00:13:43   on a Mac. When running Windows on a Mac Pro with a W card, you have access to these pathways as

00:13:48   well. So my question is like, okay, that's great. I understand that you get access to

00:13:51   more features in the Windows drivers, but is it actually a different card? Are there any hardware

00:13:56   differences? And so Guillaume L'Huel says it is indeed the same GPU that's used in the gaming cards

00:14:01   with the same performance. So there's not a hard, it's not like an entirely different GPU. It's the

00:14:06   same GPU, and Hishnash says there's possibly some binning, and he's not sure if the memory controllers

00:14:12   are validated for 32 gigs on the cheaper version of this, the non-workstation one.

00:14:17   But I think it's mostly segmentation by AMD. Apple will be paying AMD a lot more for these

00:14:21   GPUs than a gaming OEM would, due to the Pro W driver support in Windows.

00:14:26   So it seems like, okay, these are the quote unquote workstation cards,

00:14:30   but the only thing that's workstation about them is that when you run Windows,

00:14:34   you get to use the workstation drivers, which expose new functionality. When you're running

00:14:39   macOS, is there literally any difference? 'Cause if the hardware is the same and the driver is the

00:14:44   same, it's very confusing. And again, you can put the non-workstation AMD 6800 or 6900 into a Mac Pro

00:14:50   and it will use, I think, the same drivers as the workstation one. That's the open question of

00:14:54   whether Apple has special workstation drivers or whatever. Um, so a little bit more on this,

00:14:57   comparing the W6800 to the W6800X, like the PC workstation one and the Mac workstation one:

00:15:04   they seem identical except for a slight clock drop. The W6800 is advertised at 17 teraflops

00:15:09   versus Apple's being just 16. The W6800 on the PC is $2,100, and so Apple's price of $2,800 is not

00:15:17   that extreme given Thunderbolt, et cetera. And again, they're charging you more, both on Mac and

00:15:22   PC, for the W card, for the exact same hardware, as far as we've been able to determine, except

00:15:27   that on Windows, you get to use better drivers, which expose more of that hardware to Windows.

00:15:32   And I think, and you know, more memory and possibly a higher grade of memory. I don't know,

00:15:36   but you can get, I think you can get the, the gaming 6800 with 32 gigs, I'm not entirely sure.

00:15:43   That's, that's the question of what mix of hardware you get. Maybe you can get a cheaper memory

00:15:46   controller, but like the fact that the GPU itself, it used to be that you'd get an entirely different

00:15:50   GPU. Like it would be a different chip that had different features in it. It was often worse in

00:15:54   games and better in workstation-type stuff, but this is the same GPU. It's just that the features

00:16:02   are hidden behind a software thing on Windows only. And who knows what it's like on the Mac. So

00:16:06   it doesn't make me feel that much better, but anyway, these are expensive cards.

00:16:10   Yeah. And the moral of the story is that Apple is not marking up a $600 card to $3,000. They're

00:16:16   marking up a $2,200 card to $3,000. And that first markup was happening at AMD's

00:16:21   level, not Apple's level. Yeah. AMD is marking up a $600 card to a $2,000 card or whatever.
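
For concreteness, the two levels of markup being described work out roughly like this. The dollar amounts are the approximate round numbers from the discussion, not exact prices:

```python
# Rough arithmetic for the two-level markup described above, using the
# ballpark round-number prices quoted in the discussion.

def markup_multiple(cost, price):
    # How many times the lower price the higher price represents.
    return price / cost

gaming_card = 600       # ballpark price of the equivalent gaming card
amd_pro_w = 2_200       # ballpark price of AMD's Pro W variant on the PC side
apple_card = 3_000      # ballpark Apple price for the card in a Mac Pro

amd_level = markup_multiple(gaming_card, amd_pro_w)    # AMD's markup, ~3.7x
apple_level = markup_multiple(amd_pro_w, apple_card)   # Apple's markup, ~1.4x
```

Framed this way, most of the multiple over the gaming card happens at AMD's level, not Apple's.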

00:16:27   Although it's not like, again, like you can, AMD has like reference implementations, but I think

00:16:32   you can just buy the GPU from AMD and then build your own card. Anyway, the GPU market is confusing

00:16:38   and scary. Oh, speaking of confusing and scary, what is that? Daisy, I assume? I don't think

00:16:43   Hops can make that sound. Yeah. She's a terrifying beast. She does not like the GPU market. No, no.

00:16:50   Or maybe she just doesn't like talking about the Mac Pro. Maybe that's the problem.

00:16:54   We are sponsored this week by Memberful. Monetize your passion with membership and build sustainable

00:17:02   recurring revenue from your audience. Memberful quite simply lets you sell subscriptions to your

00:17:09   audience, whatever you want to deliver to them, whether it's custom perks, custom, maybe custom

00:17:14   podcast episodes, stuff like that. Memberful is the platform to do that with. It's super easy for

00:17:19   you to set up. It integrates with all the tools you probably already use, and they let you control

00:17:25   everything about it. You have full control of the branding. You have full control over your audience

00:17:30   and you have full control over your membership. Payments even go directly to your own Stripe

00:17:35   account. And of course they have everything you might need beyond that. So things like

00:17:40   dashboard analytics, member management features, free trials, gift subscriptions, all of this is

00:17:46   available on Memberful. It's used by some of the biggest creators on the web for good reason.

00:17:51   They align their goals with your goals. They don't want to like lock you in. They don't want to play

00:17:56   tricks or anything. They want to help you make money from your audience. That way you can sustain

00:18:00   your revenue. You can have really lasting audience income. And it's really, really nice to have this.

00:18:06   We have this here and frankly, I kind of wish I'd built on Memberful some days.

00:18:10   There are a lot of times that I wish I didn't have to build and maintain it myself. And

00:18:15   were I building a new system today, I would definitely give Memberful a very strong look

00:18:19   because you can get started for free. There's no credit card required. Again, it integrates with

00:18:24   everything. They have great support if you need it, although you probably won't because it's super

00:18:27   easy. See for yourself at memberful.com/ATP. Get started. Again, there's no credit card required

00:18:34   to get started. Memberful.com/ATP. Sell memberships to your audience to build sustainable recurring

00:18:41   revenue with Memberful. Thank you so much to Memberful for sponsoring our show.

00:18:45   Adrian writes, with regard to bug bounties, and I think we theorized on the show, or I don't

00:18:54   remember how it came up, but why doesn't Apple pay bigger bug bounties? They have more money than God.

00:19:01   Why not just pay all the money for really good bug bounties? And Adrian writes, "Apple can't just pay

00:19:06   bananas bug bounties because if they did, all the internal bug hunters would quit and make more

00:19:10   doing the same job from the outside. It's a delicate balance and bug hunters have to want to

00:19:15   do the right thing for it to work." I do agree with this and this does make sense. But then again,

00:19:19   Apple, like developers and employees, get a lot of tools and a lot of information that an external

00:19:25   person wouldn't get. And I know nothing about hunting for bugs, but it seems to me like that

00:19:30   would still be attractive if money is not your only driving force in the world, which for most

00:19:35   people probably is. I mean, they get health insurance and a salary and even if they don't find any bugs,

00:19:41   they keep getting paychecks. Like it's, I don't think it's an apples to apples comparison here.

00:19:47   The people who are finding them in the outside world, it's kind of like trying to win the lottery.

00:19:53   Whereas getting a job on a security team at Apple is a much different financial and life arrangement

00:19:59   that is much more attractive to some people than being outside Apple and competing with the rest

00:20:03   of the world in the hopes that you'll find a bug bounty that then you can convince Apple to pay you

00:20:07   for. Yeah, I also, I don't like this argument. And first of all, I think we heard this argument from

00:20:12   inside Apple, because we heard this from a number of different people, from a number of different

00:20:17   names and stuff and through a number of different avenues of contacting us. And so this kind of

00:20:22   feels like we actually hit the right people with our rant last time. But to me, it's, they're

00:20:28   saying like, well, if Apple paid higher bug bounties, then we'd have to, then the internal

00:20:34   people would quit because they make more on the outside. Well, pay the internal people more.

00:20:38   Yeah, like that's not the only option here. Like you could, like if the market value of finding

00:20:44   these is so high that some company in some other country wants to sell it to Saudi Arabia or

00:20:51   whatever for a million dollars, like if the value is so high, then you kind of have to pay it,

00:20:58   whatever it takes. And so if you, if it, if it takes paying the internal bug hunters enough

00:21:03   that they aren't tempted to quit and play the lottery, as John was just saying, like

00:21:06   if they can just make good money internally, well, that's the market for that. Apple is in a very

00:21:12   high profile position in the world and they have created, you know, through their success, you

00:21:17   know, and good for them, they've created a very high value for the, for exploits of their system.

00:21:24   And so if the value of an exploit is a million dollars or $2 million or whatever it is,

00:21:29   who cares how they have to pay for it, who they have to pay, what like, they should be,

00:21:35   they should still be the ones paying for it, not some random, you know, exploit company that's

00:21:39   going to sell it to a creepy government. I mean, you can do what they do with salespeople,

00:21:42   right? So you give them a decent salary, but you say, Hey, if you find one of these bugs,

00:21:46   we just pay the bounty to you. Like it happens for salespeople all the time. Or I don't even know if

00:21:50   there's a base salary half the time for salespeople. It's like, if you make lots of big sales,

00:21:53   you get lots of money. Like the, it's like, the question is how valuable is this to Apple? And

00:21:57   whatever that number is, pay it to whoever finds the bug. And I think the internal people, you can

00:22:01   adjust and say, okay, well, the internal people get health insurance and benefits and a regular

00:22:05   salary. But also if an internal person hits the jackpot and finds some kernel bug, or even maybe

00:22:09   the whole team does it, like give them the money that you would have given the external person. Like, this

00:22:13   is a solvable problem. You know, this is one of the few cases where Apple having tons of money

00:22:17   actually does help solve this problem. It's not so easy in other cases. Apple should just hire

00:22:21   all the people, especially if Apple's being stupid about remote, which they're still kind of being

00:22:25   stupid about. It's not that easy to turn money into talent, but in this case, money actually

00:22:30   does solve this problem and Apple has a lot of it. And so like, you know, I don't, again, I don't

00:22:35   think you have to, you don't have to make them exactly the same because I think there are real

00:22:40   tangible benefits to be a salaried Apple employee, like say stock benefits, like things that the bug

00:22:45   bounty people don't get, but you just have to make them competitive and comparable. That's all.
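
Putting rough numbers on that argument makes the trade-off visible. Every figure below is invented purely to illustrate the shape of the comparison; none of these are real Apple salaries or bounty amounts:

```python
# Back-of-the-envelope sketch of the incentive argument above: a salaried
# internal bug hunter vs. an external researcher "playing the lottery".
# All numbers are made up for illustration.

def external_expected(bounty, p_find):
    # Outside researcher: expected yearly income is payout times the
    # probability of landing a qualifying bug that year.
    return bounty * p_find

def internal_expected(salary, benefits, bounty, p_find):
    # Internal hunter: guaranteed salary and benefits, plus (as suggested
    # above) the same per-bug payout when they do find one.
    return salary + benefits + bounty * p_find

outside = external_expected(bounty=1_000_000, p_find=0.2)
inside = internal_expected(salary=250_000, benefits=75_000,
                           bounty=1_000_000, p_find=0.2)

# If the internal bonus matches the external bounty, staying inside
# strictly dominates the outside lottery under these assumptions.
assert inside > outside
```

The design point is that the two packages don't have to be identical, only comparable: as long as the internal bonus tracks the external bounty, the guaranteed salary tips the expected value toward staying inside.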

00:22:50   And then for the external people, like we said last week, make it easy for them to get paid,

00:22:53   make it so that everybody says, hey, if you find a bug, totally go to Apple because you get paid

00:22:57   quickly and conveniently because that's the way you get people to send you bugs. Exactly. The

00:23:00   reputation Apple should have amongst the security community is that if you find something broken

00:23:06   about iOS that you can go to Apple and get paid well and easily and correctly like that,

00:23:12   that should be the reputation that they develop. They don't have it now. And that's, that's a bad

00:23:17   thing, but I, that's what they should be developing. And if they have to end up, you know, paying their

00:23:21   internal bug hunters more, fine. That's just, that's part of, part of how you get to that end state.

00:23:26   They can do it. It's fine. No one has ever said Apple pays way too much money to its employees.

00:23:31   I've never heard anybody ever say that. So I think they can afford to, you know, raise the salary of

00:23:37   this department if they have to and raise the bug bounties if they have to, like they can totally

00:23:41   do that. And the fact is that's what the market values these at. And so whatever the market values

00:23:46   them at, Apple should be willing to outbid anybody else in the market. Yep. Definitely agree. Some

00:23:51   quick iCloud Photo Library follow-up. It's funny, unlike Apple Music, which I feel like is, you know,

00:23:58   nails on a chalkboard every time I use it, I still am mostly enjoying iCloud Photo Library, but

00:24:04   it's not perfect because, guess what, it's Apple web services. So, on my laptop, I tried to do an import

00:24:11   of some photos into iCloud Photo Library and it hung. By that I mean, like, the

00:24:16   Photos app was still working. It's just, they never got uploaded. After days, after reboots, after Ethernet,

00:24:21   after Wi-Fi, didn't matter. They never got uploaded. So I thought, okay, fine. On the laptop anyway,

00:24:26   I have the, you know, photos repository in its own, not partition, but you know what I'm saying,

00:24:31   like volume or whatever the technical term is. Sorry, Jon. And so I just tossed the volume,

00:24:36   rebuilt it and created a new iCloud Photo Library this time, or excuse me, created a new local photo

00:24:42   library. This time it actually synced very quickly, which I was quite happy about. But now I have not

00:24:47   gotten any new pictures since the fourth. And as we record this, it's the 11th. It's just frozen in

00:24:52   time on August 4th. Wonderful. Great. Thanks. Thanks so much guys. And then secondly, I went to start

00:25:00   fiddling around with smart albums, which the concept of smart albums, I really like.

00:25:04   In fact, I keep meaning to, I haven't done it yet, but I loved your idea, Jon, of setting a smart

00:25:09   album for "the person is Declan, but the time the picture was taken was before he was born." Like I

00:25:15   haven't, I haven't done this yet, but I love that idea. I think it was a great idea. And I started,

00:25:18   for example, um, doing, trying to like have a smart album for pictures taken by my drone.

00:25:25   And there were a couple other things I was trying to do. And I feel like there are just not that

00:25:28   many smart album, like filtering options. And yes, I think I could have handled the drone or I may

00:25:34   have already done that or whatever, but I forget what it was off the top of my head. And I want to

00:25:38   kind of try to keep this short. So I'm just gonna move on, but I really wish there were more options

00:25:42   for smart albums for things you could filter by. And maybe that's just me, but please. And thank

00:25:46   you. Yeah. One way you can work around that is, uh, to use the criteria that are there to

00:25:52   search for photos and then apply keywords to them and then use that keyword for filtering. You know

00:25:57   what I mean? Like you can, you can make your own criteria, essentially, because you can make any

00:26:01   number of keywords. So in the little keywords interface, Command-K, add as many keywords as you

00:26:05   want and use the existing smart album features to find the photos that you want to apply those

00:26:10   keywords to, and then use those keywords in your smart albums. It's a little bit of a workaround,

00:26:14   but I I'm really a big fan of keywords since you can make them up at any time and apply them to

00:26:19   any photos you want. They really help organize things. And of course you can apply multiple to

00:26:22   the same photo. So it's a little bit tedious sometimes to apply them, but like I said,

00:26:26   finding them in big batches and applying them, uh, usually goes a long way and you can always

00:26:30   amend them later by removing and adding. Totally, I fully endorse, Casey, if you're doing this,

00:26:35   assigning keyboard shortcuts to the keywords. So you can press a single letter to, uh, to assign

00:26:42   a keyword or remove it like an unmodified keystroke. So you can just type like I type D,

00:26:47   which is for Daisy, my dog, there's also my dog. Um, and I can go through photos and really quickly,

00:26:53   like, you know, select a range and hit D, these are all Daisy, or select the photo and hit D

00:26:58   to remove the Daisy tag. Cause it's misidentified. You know what I mean? Um, obviously you run out of

00:27:02   keys, but it's kind of like using vi. Like these are not Command-D, not Control-D, not Option-D, just

00:27:07   plain D. Um, and, by the way, uh, another thing, yeah, I don't know how you'd figure this

00:27:11   out. I just assume everyone knows cause I use it all the time, but people probably don't.

00:27:14   Those shortcuts only work when the keywords floating palette is visible. So you won't be

00:27:20   accidentally hitting the keyboard to like, Oh, I just labeled all my photos accidentally cause

00:27:24   my elbow hit the keyboard, right? Those key shortcuts only work after you've hit command

00:27:29   K and made the floating keywords palette visible. So you can make it visible, shove it off to the

00:27:33   side and then just select photos and just hit the thing. And it's actually pretty quick.

00:27:37   It's just doing a SQLite update under the covers, I'm pretty sure. So it's actually

00:27:41   pretty fast to remove them. It has some visual feedback. You can see like it turning red when

00:27:45   it removes the Daisy keyword, and showing Daisy in white or whatever when it adds it. Give it a try.
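To see why this workaround amounts to having your own custom smart album criteria, here's a rough sketch in Python. Everything in it, the photo records, the names, the dates, is made up for illustration; Photos doesn't expose anything like this directly, this is just the filtering logic:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Photo:
    # Hypothetical stand-in for a record in a Photos library.
    people: set = field(default_factory=set)
    taken: date = date(2000, 1, 1)
    keywords: set = field(default_factory=set)

def smart_album(photos, predicate):
    # A smart album is essentially a saved filter over the whole library.
    return [p for p in photos if predicate(p)]

library = [
    Photo(people={"Declan"}, taken=date(2004, 6, 1)),
    Photo(people={"Declan"}, taken=date(2016, 3, 2), keywords={"drone"}),
    Photo(taken=date(2019, 9, 9), keywords={"drone"}),
]

# John's idea: person is Declan AND taken before he was born
# (the birthday here is invented).
declan_birthday = date(2005, 1, 1)
before_birth = smart_album(
    library, lambda p: "Declan" in p.people and p.taken < declan_birthday
)

# The keyword workaround: tag photos once, then any smart album
# can filter on that tag as if it were a built-in criterion.
drone_shots = smart_album(library, lambda p: "drone" in p.keywords)
```

The point is that a keyword is just a user-defined boolean criterion: once it's on a photo, any number of smart albums can filter on it, which is how you route around the limited built-in filter list.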

00:27:51   We are sponsored this week by ExpressVPN. Every time you connect to an unencrypted network,

00:27:58   cafes, hotels, airports, anybody on that same network can gain access to anything you send

00:28:05   or receive on the network that's unencrypted. And you know, we all use HTTPS wherever we can,

00:28:10   and lots of things are secure, but it's not necessarily always everything. It doesn't take

00:28:16   much knowledge to intercept that traffic for anybody, any hacker, anybody with bad

00:28:21   intentions and even ISPs are doing stuff like injecting ads into unencrypted pages and everything.

00:28:26   It's kind of a mess out there. So ExpressVPN is a great way to secure your connection, to use a

00:28:33   wonderful trustworthy VPN in places where you have to use someone else's network or an untrusted

00:28:39   network. It's kind of like, you know, encryption insurance for you. It ensures that all traffic

00:28:43   coming out of your devices is encrypted. And that way the people operating your network can't

00:28:48   intercept it. They can't see it. They can't inject ads into it, whatever they want to do.

00:28:52   ExpressVPN is very simple to use. If you're going to use a VPN, this is a very, very well-rated one

00:28:58   by lots of people. You don't have to take my word for it. Look up the reviews, see for yourself.

00:29:02   ExpressVPN is very highly rated. If you're going to use a VPN for whatever reasons you might have

00:29:07   to use one, ExpressVPN is a great choice. It's simple to use. It works on all your devices.

00:29:13   It's a single app you install. You click one button to get protected. That's it. It's super

00:29:18   easy. Secure your online data today at expressvpn.com/atp. That's expressvpn.com/atp.

00:29:28   There you can get three months free with a one-year package. Once again, expressvpn.com/atp.

00:29:34   Thank you to ExpressVPN for sponsoring our show.

00:29:36   Buckle up. Here we go. Let me start by saying, if you are the kind of person that listens to

00:29:45   this in front of your children, that's awesome. And hi kids, we're so happy that you listen to us,

00:29:50   but not this time. This time, I strongly encourage you to use your chapter skip functionality in

00:29:57   Overcast, or whatever not-as-good-as-Overcast podcast client you're using, and maybe skip

00:30:03   this chapter until after the kids are in bed. You probably know where this is going, but we'd like

00:30:09   to talk about Apple's new child safety features. So there's not going to be like swear words or

00:30:14   anything like that, but obviously the content from here on out, we're going to assume only adults

00:30:19   are listening. So please be careful. That being said, so Apple announced sometime, I think around

00:30:26   the time we recorded last week, or maybe shortly thereafter. It was like an hour after we released

00:30:30   the show. Okay, there you go. Apple released or announced some new child safety features,

00:30:36   and there's a whole landing page at apple.com/child-safety. And there are basically three

00:30:44   major features. And I think in part, because they were all announced simultaneously, there's a lot

00:30:50   of confusion, including for me as to what happens, where and when, and what all these are about. So

00:30:56   we're going to try as much for ourselves as for all of you to try to break this down and make sense

00:31:01   of it. So let me start with the like 50,000 foot view. And so here again, there are three major

00:31:06   components that Apple has announced. Number one, the messages app will use on-device machine learning

00:31:12   to warn about sensitive content while keeping private communications unreadable by Apple. And

00:31:17   we'll dive a little deeper into this in a moment. Number two, iOS and iPadOS will use new

00:31:24   applications of cryptography to help limit the spread of child... help me with this. Child

00:31:30   sexual abuse material. CSAM. Yep, I wondered if that was the first time we'd said it. Yeah,

00:31:34   it's what used to be called child pornography. And this is now like the new modern, more inclusive,

00:31:39   I think, term for. Or more accurate. Yeah, child abuse material. Right. So,

00:31:44   let me start from the top. iOS and iPadOS will use new applications of cryptography to help limit the

00:31:49   spread of CSAM online while designing for user privacy. CSAM detection will help Apple provide

00:31:55   valuable information to law enforcement on collections of CSAM in iCloud Photos. Here

00:31:59   again, there's a lot to dive into on that one, which is probably where we're going to spend most

00:32:03   of our time here in a moment. Then finally, the third one, updates to Siri and search provide

00:32:08   parents and children expanded information and help if they encounter unsafe situations.

00:32:12   Siri and search will also intervene when users try to search for CSAM related topics.

00:32:16   So, that's the broad overview. Three things, some stuff on device with messages, some stuff that's

00:32:24   working in concert between what's on your device and what's on Apple servers for photos, and then

00:32:31   finally, presumably almost entirely server-side, updates to Siri and search. So, that is the broad

00:32:38   overview. Gentlemen, I can keep going deeper, but do you want to jump in now with any tidbits?

00:32:42   I think we should start with messages one. I know you said you thought we'd spend more time on the

00:32:47   photos one, but the more I read up on this, the more I think the messages one is actually a little

00:32:52   bit of a more difficult situation. And by the way, no one seems to talk about the Siri and search

00:32:56   thing, but I think that is also related to this. Maybe I'll try to fold it into this discussion.

00:33:01   So, the messages one, that description is vague, like, "Oh, on device machine learning to warn about

00:33:06   sensitive content." What is it actually doing? So, what it's doing is it's trying to see if kids send

00:33:12   or receive sexually explicit material by detecting that on device. And then when it detects it,

00:33:20   depending on what the situation is, it pops up some kind of dialogue to the person who is sending

00:33:26   or receiving and gives them a bunch of options. Now, Gruber had a good explanation of these

00:33:32   features with more detail on his website, and we'll link to that. So, the first thing to know

00:33:36   about the messages thing is this only applies for children in an iCloud family account. So,

00:33:41   if you are not a child in an iCloud family account, I think Apple defines child as like,

00:33:46   I don't know when it stops. This feature, I believe it's only up to 13.

00:33:51   Well, there's caveats, but anyway, so if you're not a child in an iCloud family,

00:33:56   this feature doesn't exist for you, whatever. And even if it does apply to you, you need to

00:34:01   explicitly opt in. So, your kids won't be opted into this without you doing it. It's an opt-in

00:34:07   type of thing, right? So, how does it work? If you send or receive an explicit image,

00:34:13   you get a warning about the image. I don't know what the warning says. I think there's been some

00:34:19   screenshots of it, but it's aimed at younger kids, and you have two options at that point.

00:34:26   You can ignore the warning, and if you are under 12 years old, according to what Apple knows of

00:34:32   your age because you're in the iCloud family account, it says basically to the under 12-year-old,

00:34:37   if you choose to either continue to send or continue to receive this image that we're not

00:34:41   yet showing you, and you're under 12, we want you to know that we're going to notify your parents.

00:34:45   So, the kids, in theory, are told like, you can continue and you can do what you're doing,

00:34:51   but just so you know, we're going to send your parents a notification about it, right?

00:34:54   If you're older than 12, there's no parental notification thing at all. It just says,

00:35:00   "Hey, are you sure you want to do this?" and the kids can just say yes, right?
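Piecing together what was just described, the logic amounts to a small decision tree: children in an iCloud Family only, opt-in only, warning first, and the parental notification only in the 12-and-under case. Here's a sketch of that flow; the function name, arguments, and return shape are all invented for illustration, not anything Apple has published:

```python
def handle_flagged_image(age, in_icloud_family, feature_enabled, user_proceeds):
    """Sketch of the Messages warning flow for a flagged image.

    Returns whether the image is shown and whether the parents are
    notified. (A hypothetical model of the flow, not Apple's code.)
    """
    # The feature only applies to children in an iCloud Family account,
    # and a parent has to explicitly opt the child in.
    if not (in_icloud_family and feature_enabled):
        return {"shown": True, "parents_notified": False}

    # Everyone covered gets the warning first; proceeding is always
    # the child's choice, and backing out notifies nobody.
    if not user_proceeds:
        return {"shown": False, "parents_notified": False}

    # Only children 12 and under trigger the parental notification,
    # and they're told that before they choose to proceed.
    return {"shown": True, "parents_notified": age <= 12}
```

So in this model, a 10-year-old who taps through gets their parents notified, while a 15-year-old just gets the "are you sure?" dialog and nothing else happens.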

00:35:03   Yeah, for what it's worth, I actually thought the verbiage that Apple cited on their child safety

00:35:08   page is very good and worth reading. Now, obviously, I'm no expert in this, but I thought it was good.

00:35:12   So, if you were receiving an image that has sensitive content, it says,

00:35:17   you know, huge thinking emoji, this could be sensitive to view, are you sure? And then it

00:35:21   has like three basically bullets after that. Sensitive photos and videos show the private

00:35:25   body parts that you would cover with bathing suits. It's not your fault, but sensitive photos

00:35:29   and videos can be used to hurt you. The person in this might not want it seen, it could have

00:35:33   been shared without them knowing, and it says, "I'm sure" or "not now" with "not now" being the

00:35:38   obvious default. And then there's a second dialogue, you know, it's your choice, but your

00:35:43   parents want to know you're safe. And again, three bullets. If you decide to view this, your parents

00:35:48   will get a notification to make sure you're okay. Don't share anything you don't want to. Talk to

00:35:51   someone you trust if you feel pressured. If you're not alone, you can always get help here,

00:35:55   and it appears that here's a hyperlink, and then the two options are don't view photo, which is the

00:36:00   default, and view photo. So, when you read this, you can kind of see the target

00:36:04   audience in your mind. A kid under 12 who's involved in either sending or receiving these

00:36:09   things, there's lots of dangerous situations in which it would be good if there was some

00:36:15   intervention of someone warning or, you know, like, when you're picturing the ideal scenario,

00:36:22   you're like, these are all good things. But of course, when you're designing any feature

00:36:26   like this, any feature between parents and children, it is always fraught, because not all

00:36:33   parents are good parents, and not all children are in a safe situation. Like this feature,

00:36:37   I'm not going to say this feature assumes that all kids are in a safe situation, because it doesn't,

00:36:41   Apple does a bunch of stuff to mitigate this. For example, Apple doesn't immediately notify

00:36:46   the parents without telling the kids, because if you just assumed, oh, all parents are good,

00:36:50   and all children are in a safe situation, why have this whole dance of letting the kid opt out of

00:36:55   the warning? What kid is going to read that and choose to notify their parents? That warning

00:36:59   undercuts the whole feature, doesn't it? That choice to bail out and avoid the notification

00:37:05   to the parents exists, at least in part, because Apple knows that not all parents are great parents.

00:37:11   And not all kids are in safe situations, right? The difficult balance of this feature,

00:37:15   and the reason why I think it's actually trickier to think about, is how do you, like,

00:37:22   does this increase the chance that a child reveals something in an unsafe parent-child relationship

00:37:30   that makes that situation worse? There are many parents that will have a bad reaction to knowing

00:37:33   that their kids are viewing any kind of sexually explicit images, especially if they're sexually

00:37:37   explicit images that are not aligned with the sexuality that the parent thinks the kid should

00:37:43   have, let's say, right? You can't just assume that all parents are there to save and protect

00:37:48   their children, or that all parents' idea of protection matches what Apple's idea of protection

00:37:52   is, right? And you would say, okay, well, those kids just can do the thing where they don't notify

00:37:58   the parents. Everything's fine, right? These are kids under 12. How many kids have you seen tap

00:38:03   through dialogue boxes without reading the text? Right? And I will add, on top of that,

00:38:11   even an 11- and 12-year-old can be, depending on the situation, if it's two 12-year-olds swapping

00:38:18   naked pictures of each other who are, like, in a relationship or whatever, those kids may be

00:38:23   highly motivated to see that picture. And kids don't always make the best choices, right? A

00:38:29   12-year-old kid may not necessarily make the "best choices," as in, I know my parents are

00:38:35   going to be notified, but I'm going to take the risk. You know, there's a reason children who are

00:38:40   12 years old aren't allowed to vote or drive cars and stuff like this. They're still growing. They're

00:38:46   still learning, right? So even in the best of situations, this feature can lead to harms that

00:38:52   would otherwise not happen. Now, this is why it's so difficult to think about this. You say, well,

00:38:58   should we just do nothing? Should there be no features that help a healthy parent-child

00:39:04   relationship? Think of Marco putting his Apple Watch on his son so he knows where he is.

00:39:08   Features like that can be abused by parents who are not good parents to their children,

00:39:14   to kids who are not in a safe situation. Location tracking can be used as a form of oppression.

00:39:19   It's not how Marco's using it, not how most parents are using it, but should that feature

00:39:23   not exist because it can be abused? Every time Apple adds a feature like this, you can see

00:39:29   some thought and some part of the design going into the notion that we have to mitigate against

00:39:35   the worst-case scenario. But it's difficult to argue that none of these features should ever

00:39:42   exist because there is a benefit to them, and you're trying to balance the potential harm with

00:39:47   the potential benefit. In a case like this where we're trying to deal with child sexual abuse,

00:39:53   the harm is so terrible that to do nothing, to me, feels worse than to try to do something.

00:40:02   But when you try to do something, you do have to, A, try to mitigate against harms that you can

00:40:08   imagine might happen, which I think Apple's doing, and B, accept feedback from the world and your

00:40:13   customers about how you might be able to improve the situation by mitigating that harm in a better

00:40:18   way. I'm not full of great ideas for this. That's why I think a lot of people have difficulty talking

00:40:23   about this topic because if anyone is talking about this topic and they're like, "There is an

00:40:28   obvious solution that Apple should have done that is so much better than what they did, and they

00:40:31   should just do it," I'm suspicious of that. Because unless they're extremists and they say, "Well,

00:40:36   Apple should never include any features that have anything to do with parents and children because

00:40:41   any harm is worse than nothing," like the extremist sort of, and we'll get to that with the photos

00:40:45   thing of just like freedom over everything, kind of the EFF thing where if you are a lobbying

00:40:50   organization where you are staking out one end of a spectrum, there is a place for organizations

00:40:56   like that. I mean, I like the EFF. I donate to them, but I always know that the position they're

00:41:00   going to stake out is the most extreme in favor of freedom. It doesn't mean I always agree with them,

00:41:06   but I feel like that force needs to be there to counteract the other force, which is naked

00:41:12   authoritarianism. We have plenty of that in the world, right? So those two extremes need to

00:41:17   fight it out, and I'm way more towards the EFF side of the spectrum to be clear, way, way,

00:41:21   way closer. But they're always going to say, "This feature shouldn't exist at all." I don't agree

00:41:27   with that, but I also agree that it's super hard to do this feature in a way that doesn't accidentally

00:41:31   end up harming a bunch of kids that would otherwise not be harmed, either on purpose or by accident,

00:41:36   because now, with this feature, you know, I don't want to say bad parents, but, like,

00:41:44   children who are in an unsafe situation are now in more

00:41:48   danger because of it. Previously, there was no way to accidentally

00:41:53   hit a button and notify your parents that you're doing something you know is going to make your

00:41:56   life worse, right? And now there is. But the reason this exists is because there is other

00:42:01   harm that we're trying to stop as well. So I have real trouble figuring out how to feel about this

00:42:08   feature. Right now, I kind of feel like trying to do something is better than doing nothing.

00:42:13   But I do hope Apple iterates on this, and I do believe that there can be a better way

00:42:18   to implement this with even more safety for kids in bad situations.

00:42:22   I mean, this, first of all, like this giant disclaimer from, from at least me here and

00:42:27   probably you two as well. It's hard for me to talk about stuff like this, because this is, like,

00:42:34   the horrible dark world of child sexual abuse and this, all this stuff that this is trying to

00:42:42   prevent or, you know, find at least. We are not experts in this world. We are fortunate enough

00:42:48   that we haven't had to be. And this is like, it's such a terrible, like, set of things that happens

00:42:56   here. And again, like, we're lucky that we're not experts, but because we have a tech podcast,

00:43:02   we, and because tech is so big and it encompasses so much of the world, stuff like this lands on

00:43:07   our feet of like, well, this is what our audience expects us to be talking about this week. It's

00:43:11   very relevant. And so here we are. And I feel like many of you out there are kind of put in the same

00:43:17   position, like, as consumers of tech news and Apple news, and, you know, or just being Apple

00:43:22   fans and being enthusiasts of this stuff. Like, this stuff comes up and all of a sudden we all

00:43:26   have to take a crash course in what all this stuff means. What is that, what is going on in the world

00:43:30   out there? You know, what, what problems and solutions already exist? What have people already

00:43:36   been doing? What have companies already been doing? So we're in unfamiliar territory here,

00:43:40   fortunately to a large degree. So please forgive us if we, you know, miss some aspect of this or

00:43:46   stumble over parts of this, because it's very uncomfortable to even be thinking about this

00:43:51   stuff. Cause it's, it's so like, you know, actual sexual abuse is so horrific. I think as we get to

00:43:58   in a minute, when we talk about the, the CSAM scanning feature, it has special treatment in

00:44:02   society because it is so horrific. Like it's such a special case in so many ways of how we treat

00:44:10   things. So anyway, all of that being said, and we'll get, we'll get back to that other part in

00:44:15   a minute. All that being said, the, you know, the messages, you know, nudity sensor, basically,

00:44:21   it seems like they've done a pretty decent job of avoiding most of the problems with the parameters

00:44:27   they put in place with this feature. If the feature went up to 18, I think that would be

00:44:32   much more problematic because, you know, there's, I think everyone can agree that you don't really

00:44:38   want nine year olds sharing nude photos with each other. But people have different definitions of

00:44:44   like things like age of consent and everything. As you get closer to 18, like you could argue,

00:44:48   many people do argue: if a 17-year-old girl takes a picture of herself on her phone, should she

00:44:55   be arrested for possession of underage nudes? Like that's, and that has happened. And there's,

00:45:01   there's all sorts of weird ways in which that can be overly oppressive to women or to queer youth.

00:45:07   And so obviously any feature involving like people's ability to take and share pictures

00:45:14   of themselves runs into serious problems in practice if it's like, you know, older teenagers,

00:45:21   necessarily. So by keeping it to younger children, you avoid a lot of those murky areas.

00:45:26   Well, the flip side of that, though, is that young kids are also the most likely to misunderstand or

00:45:31   not really get the consequences of what the dialogue box is trying to tell them. That's why

00:45:36   the dialogue is worded to try to like the bathing suit area thing. It's worded and aimed at younger

00:45:40   kids, but they're exactly the ones that are the least equipped to really, truly understand the

00:45:44   consequences and also probably the most likely to tap through them really quick. And the second

00:45:49   side of that is, you know, abuse and sort of grooming by older predators happens to 16 to

00:45:56   17 year olds all the time too. So there's some people who are more expert in this field who have

00:46:01   criticized Apple's targeting, saying most of the sort of sex trafficking and grooming that is

00:46:08   happening is not happening to nine year olds, but it's actually more of a problem in the older teens.

00:46:12   And so the situation like, we all because it's so horrific, we all tend to think of like, Oh,

00:46:18   what are the normal situations? A 17 year old couple are like sending each other nude pictures,

00:46:22   and we don't want to get in the way of that because it's just normal growing up stuff, right?

00:46:25   But what about the, you know, the much, much older sexual predator, either posing as a teen or not

00:46:34   even posing as a teen, but you know, grooming a 16 or 17 year old, it's just as bad as the child

00:46:44   situation. There's so many variety of ways that these things can be abused. And the tool that we

00:46:50   have to deal with this, this, you know, we should get into the tech aspect of this for a second.

00:46:54   This is sort of just machine learning: hey, this picture that is either about to be received

00:46:59   or about to be sent. Does it look sexually explicit? And that's just kind of a best guess.

00:47:05   And that's the only tool we have. We don't have any other context. There is no machine learning

00:47:09   trying to suss out: is this a conversation between a predator and prey? Is this a conversation between

00:47:14   two kids who are a couple? As far as Apple has told us, there's none of that. It is literally

00:47:20   only this one thing, photo coming in, photo coming out, ML model, look at photo, tell me yes, no,

00:47:26   is it sexually explicit? Such a blunt instrument that has no awareness of this other stuff.
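To put the "blunt instrument" point in concrete terms: the model's entire output is one per-image score, thresholded into a yes/no, with no sender, no conversation, no context. A toy sketch, where the scores, the threshold, and the image names are completely invented:

```python
def flag_image(score, threshold=0.8):
    # The classifier sees one image and emits one probability;
    # that single number is the entire input to the warning flow.
    return score >= threshold

# An imperfect model: an innocuous bowl of cookie dough happens to
# score above the line, the dog photo doesn't, and a genuinely
# explicit photo can land just under the threshold and sail through.
scores = {"cookie_dough_bowl": 0.83, "dog_photo": 0.05, "explicit_photo": 0.79}
flags = {name: flag_image(s) for name, s in scores.items()}
```

Both failure modes in that made-up example, the false positive and the false negative, come straight from the fact that a thresholded score is all the system has to go on.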

00:47:31   And it's hard enough to solve this problem because you know, all the, like the pictures of like,

00:47:35   you know, someone's baking cookies and they take a picture of the bowl and it's sexually

00:47:40   explicit because it's like, it looks, you know, like machine learning is not perfect. This is

00:47:44   straight up, hey, machine learning, here's an arbitrary picture. Tell me whether it's sexually

00:47:48   explicit. And it's not super accurate. So we also have to account for all the cases where some poor

00:47:55   kid or teenager is going to be faced with this dialogue, especially on an incoming picture and go,

00:47:59   why did this person send me this? And it's just like a picture of their dog, right? Because their dog

00:48:05   is determined to be sexually explicit, right? So the tech involved in this thing also makes it

00:48:11   somewhat fraught. And I think like, you know, Marco, from your perspective, like, oh, it's,

00:48:16   it's easier for the older kids and harder for the younger in some aspects, but also in some aspects

00:48:20   it's the reverse. And like, you really just have to go through all the different scenarios. And it

00:48:24   probably also helps to have experts in this field, too. Like, you know, I read a few things from

00:48:28   experts saying, like, here's where the bulk of the problem is. And even though this is scarier,

00:48:32   this happens, it's kind of like the whole thing of like, you're probably not going to be like,

00:48:36   murdered by a stranger, most likely, especially if you're a woman, you'd be murdered by, you know,

00:48:40   a person you're in a relationship with, or someone you know, or a family member,

00:48:43   it's depressing to think about, but like, the fear of murder from a stranger, you know, or a

00:48:48   shark attack or whatever is so much out of proportion to what's actually going to kill you,

00:48:52   which is usually much more mundane. Right. And so I'm sure something like that also applies to

00:48:58   all the child sexual abuse stuff, and experts in the field could probably help Apple

00:49:03   better target this. But when your only tool in this particular feature is

00:49:07   this machine learning model, your options are limited.

00:49:11   Yeah, yeah, very much so. I mean,

00:49:13   it's such a tough thing, like you guys said, you know, you want to prevent this and in Apple's case,

00:49:20   not only do you want to prevent it, but you want to do it with some semblance of privacy, you know,

00:49:24   you don't want to be beaming these images to some server like Google probably would, I honestly

00:49:30   don't know how they handle it. But you know, you don't want to be beaming every image that

00:49:35   you receive via iMessage to some server to verify whether or not it has CSAM in it. It's a very

00:49:42   difficult problem to solve. And Apple's made it more difficult by insisting on having it be

00:49:47   as private as they possibly can, which, in my opinion, is something they should be applauded

00:49:52   for. But it's challenging. This gets us into the next feature, which was like, oh, this is privacy

00:49:56   preserving, it doesn't break end to end encryption on messages, right? Because it's only like,

00:50:00   obviously, when a message arrives on your phone, something has to decrypt it. Otherwise, you can't

00:50:04   read it. Right. So and if we do it on device, if we do the machine learning model on your device,

00:50:09   like it was encrypted, and then encrypted the whole way across, and only right before it gets to

00:50:13   your eyeballs, when we have to decrypt it anyway, at that point, we'll do the machine learning

00:50:16   thing. So it's privacy preserving, right? From one point of view, yes, if what you mean

00:50:21   by privacy is we didn't compromise the privacy in transit: no one snooping over the internet is

00:50:26   going to be able to see that picture and grab it because it's end to end encrypted. Another aspect

00:50:30   of privacy is, hey, Apple, don't tell my parents about this. That's privacy, too. If you're a kid,

00:50:35   not ratting you out to your parents is a form of privacy, right? And so yes, granted, the dialogue

00:50:40   tells you it's going to do that, and so on and so forth. But you're putting a lot of weight on

00:50:44   people being able to correctly read and understand dialogues and by the way, tap the right button.

00:50:48   Right? Previously, before this feature, there was no feature in messages that could potentially rat

00:50:53   you out to your parents with an errant click on a dialog box, right? And now there is.

00:50:59   And from a child's perspective, that's not privacy preserving at all, right? From an abstract kind of

00:51:05   like, oh, random people snooping on internet routers can't see your picture. Great, that's

00:51:09   great. But what I care about is my parents finding out, and now suddenly there's a possibility where

00:51:12   that didn't happen before. And of course, on the parent side, it's, oh, if a predator is trying to,

00:51:17   you know, groom my 12 year old, I really want to know about that, Apple. And so there's just so

00:51:22   many conflicting stakeholders in this soup that it's very difficult to come up with a, you know,

00:51:28   other than again, the extreme position of like, you should just never do anything about this.

00:51:31   Right? And that seems like a clean solution until you think, all right, so we should do nothing about

00:51:35   child sexual abuse. It's like, well, don't do this. Well, what should we do? Oh, now it's a hard

00:51:40   question. I don't like it. Like, so Apple's trying to do something and we'll get to why probably in

00:51:43   a little bit, but anything you do is flawed in some way. Well, and I think, I mean, let's get to

00:51:50   that now. I think one of the things that I've learned listening to, you know, other people's

00:51:54   podcasts about this, by the way, I can strongly recommend this week's episode of Decoder with

00:51:59   Nilay Patel. He had a couple of experts on in this area. And I guess, so this podcast, you know,

00:52:05   Decoder, every episode of it is like, you know, some CEO or, you know, chief officer of

00:52:12   some company, and they mostly sound like they're going to be really boring. But then when

00:52:17   I listen to them, it's like, I learned something cool or it's much more interesting than I thought

00:52:22   every single episode, like literally every episode I've heard, which is most of them, it always ends

00:52:26   up being worth it. Even if it sounds like from the title and description, like it might not be very

00:52:30   exciting or it might be a company you don't care about. Anyway, so this week's episode of Decoder

00:52:35   with Nilay Patel, very, very good, because he had two experts on in this area and I learned a

00:52:41   lot from that, that I didn't hear in a lot of other places. So I can strongly recommend listening to

00:52:45   that. If you, if you want to hear from people who actually know what they're talking about in this

00:52:49   area, you will learn a lot, I promise. But yeah, anyway, one of the big things that we've all

00:52:54   learned, at least I sure have, I didn't know this before, is that almost all of the major,

00:53:03   you know, tech cloud slash service companies are doing various forms of CSAM scanning

00:53:11   and reporting and everything like every big company you can imagine, like Dropbox,

00:53:15   you know, Facebook, Microsoft, like everyone's doing this. And one of the reasons why they're

00:53:20   doing this is because they have to by law in many countries, including in the US. And so I think

00:53:27   part of the reason why Apple is doing this is that they have been facing increasing pressure from

00:53:34   the law and, and from law enforcement agencies. And there's, there's a big history here of,

00:53:40   you know, Apple trying to make their devices very private and trying to give users lots of

00:53:46   strong encryption tools to use for their data and for their content, while sometimes being at odds

00:53:53   with what law enforcement wants to be available to them. You know, there was the obvious example of the

00:53:58   famous San Bernardino shooter case, where, you know, the government wanted Apple to unlock a

00:54:05   phone and Apple basically said no. And Tim Cook made some terrible analogies about cancer. But,

00:54:10   you know, for the most part, his argument, once you got past those terrible analogies, was fairly

00:54:14   sound as to why they shouldn't do that. But anyway, you know, part of what makes this complicated

00:54:21   is that we in the tech business, we operated for so long, kind of skating by, under the radar of most

00:54:30   governments and legislators, they couldn't keep up with us. They didn't understand what we were

00:54:34   doing. And they kind of left us alone to a large degree for a very long time as we developed the

00:54:40   tech business. And I think those days are long over now, like that now governments have gotten

00:54:47   a clue of how powerful tech is. They don't like parts of it. And they intervene now to a much

00:54:54   larger degree with legislation and pressure and legal, you know, threats or actions than they did

00:54:59   in the past. So we as computer people are accustomed to tech companies being able to do whatever they

00:55:05   wanted, and us being able to have these devices that we could do whatever we wanted on. And

00:55:10   largely the law was not enforced, or didn't expand to cover tech stuff. And so we got used to this

00:55:17   freedom of like, my device is mine, the government can't tell me what my phone can and can't do or

00:55:22   whatever. That era has been chipped away at over the last several years, at least. And

00:55:28   now, all the tech companies are under much greater pressure from the governments that they either

00:55:34   operate directly in, or at least have to sell their products to for, you know, for healthy

00:55:38   financial reasons. So there's going to be an increasing amount of government intrusion into

00:55:45   tech. Some of that, like some of the antitrust proposals, which I know there was a big one today,

00:55:49   we're probably not going to get to it today, because it just happened and we have a lot of

00:55:52   stuff to talk about today. But some of that stuff will be good. But a lot of this stuff will be,

00:55:56   well, we have to comply with this law now. And some of those are going to be good laws that we

00:56:01   agree with. And some of them are not. And it's going to get messy. It's already getting messy.

00:56:06   And it's going to get messier as the tech companies have to, like, bow to pressure from or just

00:56:13   comply with the laws in the jurisdictions that they operate in. One correction: are you

00:56:18   sure about the thing where they have to scan? I'm pretty sure they don't

00:56:21   have to scan in the US. What they have to do is report it if they find it, but they don't have to

00:56:25   go looking for it. I believe that's right. But there are UK and EU laws that are

00:56:32   coming down the pike that potentially will say you have to scan for it. So in some ways,

00:56:37   well, let's just finish up the motivation for this thing. Some of the motivation might be

00:56:42   that those laws are coming and you might have to comply with it anyway, so we should do it.

00:56:47   Another part of the motivation is, and by the way, all these features we're talking about are

00:56:51   US only at this point. So even those EU and UK laws, those aren't relevant, but Apple

00:56:57   will potentially expand this to other countries on a country by country basis, according to them.

00:57:02   The other thing is that if the US ever has a law like this, Apple, and this is what Apple says in

00:57:08   their interviews. Matthew Panzarino had a good interview with Apple's head of privacy

00:57:13   about this. The Apple answer is, the reason we're doing this now is because we figured out a way to

00:57:18   do it that is "privacy preserving." And we'll talk about the photo scanning and what their meaning of

00:57:24   that is. But what they're saying is these other companies that are doing it, like Facebook and

00:57:28   Microsoft and so on and so forth, they do it the brute force way of like, "Hey, we have access to

00:57:32   all your photos that are stored on our servers. It's our servers, they're on our cloud services.

00:57:36   We're just going to scan them there. And if we find anything, we're going to report it." Because

00:57:39   the law in the US is if you find it, you have to report it. But they're actively looking for it.

00:57:43   They're scanning all your photos on the server side because they have them. Apple could do that

00:57:47   too, but Apple apparently considers that not privacy preserving. And the Apple side of privacy

00:57:52   really hits on this thing of saying, "Well, you know, it's much worse when you scan on the server side

00:57:56   because it's more opaque and you can't tell what we're doing. And we could decide to just scan one

00:58:00   person's thing because they're under scrutiny and all these sorts of other things." Apple is very

00:58:05   big in their messaging to say, "That is not from Apple's perspective privacy preserving. What is

00:58:09   more privacy preserving is if we do it on device." And we'll talk about that feature in a second or

00:58:13   whatever. But Apple's story is, "Hey, the reason we're doing this now is not because we're afraid

00:58:17   of regulations coming down or whatever. It's because we found a way to do it that is privacy

00:58:21   preserving according to our definition of privacy preserving." But surely part of Apple's motivation

00:58:26   is that Apple knows that whenever there is an attack on Apple's privacy preserving features,

00:58:32   like the San Bernardino thing of the FBI or whatever saying, "Apple, this is a terrible

00:58:36   terrorist. You need to let us have a backdoor on all your iPhones because terrorism is bad."

00:58:40   That's not a good situation to be in and Apple has to make the difficult argument that we're not in

00:58:46   favor of terrorism, but we also don't want to put a backdoor on all our devices because there's no

00:58:50   such thing as a backdoor that can only be used by the good guys. It's an argument that tech people

00:58:54   understand, but it's hard to understand when emotions are high and terrorism is involved.

00:58:59   Same exact thing with child sexual abuse. If it's a child sexual abuse situation, you can say,

00:59:03   "Apple, I know you said you don't want to include a backdoor for some reason, but child sexual abuse,

00:59:08   you have to do it for the children." So features like this, where you can say, "We found a way

00:59:14   to do this without backdooring every single iPhone," is a great defense when the time comes

00:59:19   when someone says, "Oh, just like in the movies, this kid has been kidnapped by the boogeyman,"

00:59:24   and some scenario that never happens in real life. "A stranger has kidnapped a beautiful,

00:59:28   innocent child and you need to unlock this phone to get it and Apple, you need to let this happen,"

00:59:32   or whatever. Features like this that hopefully catch the boogeyman before they kidnap a kid by

00:59:37   detecting the fact that they're downloading CSAM and stuff, done in a way that doesn't require

00:59:43   putting in a backdoor that "only the good guys can use" or some other technical fantasy that

00:59:48   doesn't actually exist, is a great way for Apple to be prepared when, like Marco said,

00:59:53   those regulations start coming in the US. It's not a free-for-all anymore. It's probably part

00:59:59   of the same reason that Facebook and Microsoft and Google and all those things do their own

01:00:03   CSAM scanning server-side. Just say, "Look, we're already doing a thing that will help with this

01:00:08   terrible situation, so please don't ask us to backdoor our encryption," or "Please don't outlaw

01:00:12   end-to-end encryption," or all sorts of other much worse policies that will actually make everyone

01:00:17   less safe but that are politically appealing to people who don't understand the tech.

01:00:21   Yeah, so let's talk about iCloud Photo Library. So like I'd said, again, the summary is that iOS

01:00:30   and iPadOS will use new applications of cryptography to help limit the spread of

01:00:34   CSAM online. While designing for user privacy, CSAM detection will help Apple provide valuable

01:00:38   information to law enforcement on collections of CSAM in iCloud Photos. So let's start off.

01:00:43   If you are not using iCloud Photos, this does not apply to you. It's as simple as that.

01:00:49   Before moving on from that point, that's another thing that a lot of people will bring up,

01:00:54   which is, "Oh, well, then there's no point in this feature," because all the nefarious

01:00:59   child sex abuse predators will just read that and say, "Aha, I'm safe from Apple. I just won't

01:01:04   use iCloud Photo Library." Why would Apple announce the way to avoid this feature? It's totally

01:01:11   pointless. All it will ever do is catch innocent people because no guilty person will ever use it.

01:01:15   If you look at the CSAM scanning that Facebook and all these other big companies do,

01:01:22   and you see how many instances of it they catch every year—I think the Facebook number was 20

01:01:28   million reported last year. Oh my god. And it's not like it's a secret information that Facebook

01:01:35   does this scanning, right? So you would think, "Well, if Facebook announces to the world that

01:01:40   they do this scanning, why would anyone who's a child sexual predator use Facebook?" People

01:01:48   do things that don't make a lot of sense, but it's—we'll get to this in a little bit—saying,

01:01:56   "I just won't use Facebook. I just won't use Google. I just won't use Apple. I just won't

01:02:00   use iCloud Photo Library." Yes, in theory, if you were the world's biggest criminal mastermind,

01:02:06   you could avoid all these things, right? But practically speaking, it's very difficult to

01:02:12   essentially avoid using the internet and the major players on the internet. And practically speaking,

01:02:17   20 million cases caught by Facebook shows that they don't avoid it. They do it, and we catch them.

01:02:24   And that's why features like this, even though there's a way to work around them, still have

01:02:30   value in catching criminals. If you caught zero of them per year, we'd have to rethink this,

01:02:36   but 20 million per year at Facebook is a big number. And by the way, Apple, which prior to

01:02:42   these features was not actively doing anything to catch this stuff, reported something like 200 last

01:02:48   year. And who knows how they found those 200, right? Maybe they were reported or something like

01:02:52   that. But when Facebook is doing 20 million and Apple is doing 200, I feel like that shows that

01:02:59   Apple needs to do more. Hence these features that we're talking about. So there

01:03:04   is this next feature. So yes, it's only if you use iCloud Photo Library. If you don't use iCloud Photo

01:03:08   Library, none of this stuff ever happens, but that doesn't mean that no one will ever be caught by

01:03:13   this. Right. So I tried to do some deeper reading into the mechanics of how this works and I did

01:03:20   some, but my eyes glazed over for some of it. I didn't get through it all. So I have tried to do

01:03:26   some research and I have failed. So call me John Siracusa, but we will try to cite what we can and

01:03:31   people who have done a little more research than us. And certainly, like, you know, Marco's disclaimer

01:03:35   earlier, you know, I am not a cryptographic expert. In fact, a lot of it is way above my head. So I'm

01:03:41   trying my darndest to understand this a little better, but I need a little more time to get it

01:03:45   a hundred percent right. But with that said, mostly quoting Gruber's really good summary.

01:03:50   So for iCloud Photos, the CSAM detection for iCloud Photos only applies to images that are being

01:03:55   sent to iCloud Photo Library. Like I said earlier, if you don't use iCloud Photo Library, no images

01:04:00   on your devices are fingerprinted. Photos are compared on device to a list of known CSAM from

01:04:08   NCMEC, N-C-M-E-C, which is the National Center for Missing and Exploited Children. So let me

01:04:12   unpack that sentence. So NCMEC, the National Center for Missing and Exploited Children,

01:04:17   they do keep a database or repository of some sort, if I understand correctly, of CSAM. And

01:04:24   they are the only organization, they're the only people that are legally allowed to do that here

01:04:29   in the United States. And that's because they're the people in charge of trying to prevent it and

01:04:33   fight it. And so, from my understanding is, Apple, and I'm filling in a couple of blanks here,

01:04:38   but Apple will provide some sort of tool to NCMEC to scan all their files and their database. These

01:04:44   are things that they know are bad. This known, you know, sexually exploitative material, child sexual

01:04:50   abuse material, whatever. They will scan all that and that will generate a bunch of hashes.

01:04:54   So basically a bunch of numbers. And they'll be post-processed a little bit by Apple, the hashes,

01:04:59   that is, not the photos. And that generates a whole bunch of hashes, again, so these are numbers that

01:05:08   Apple can then use to compare your photos to. So the idea is, and I'm dramatically oversimplifying,

01:05:15   but let's say there's a CSAM picture of whatever, doesn't matter what the specifics are, and it

01:05:22   yields a number of 42. Now, obviously these numbers are way longer than that, but let's just say

01:05:27   it yields the number 42. Well, if I had a picture on my phone that also yielded 42 as the hash,

01:05:34   as that unique number, and it should do this, by the way, even if I make it grayscale, even if I,

01:05:40   you know, twist it upside down or whatever the case may be, because it's doing some semantic

01:05:44   processing and some other things. But one way or another, if I end up with a photo that ends up with

01:05:49   a hash of 42, and NCMEC has provided a photo and scanned it using Apple's tool and provided the

01:05:55   hash of 42 to Apple, then uh-oh, we've got a match and things will carry on from there. But before I

01:06:00   go on any further... And when you say a match, by the way, you're not saying this is a similar

01:06:05   picture, this is a picture of a similar thing, it is literally the same picture. Plus or minus,

01:06:09   like you said, zooming, cropping, grayscale, blurring, like, but basically what it's trying

01:06:15   to say is this is literally the same picture. Like it's not like saying, oh, this is a picture

01:06:18   of an apple. It's like, no, this is the exact picture of an apple that's in the CSAM database,

01:06:23   right? It is the exact picture. So there are a finite number of pictures that this is trying

01:06:29   to detect. It is the database provided by NCMEC. I don't know how many pictures it is, but that's

01:06:34   it. Those are all the pictures that it's ever going to find. It's not going to find a picture

01:06:37   that's not in that database. And if it finds one, it's not saying this is a similar picture,

01:06:41   a picture of the same thing, or even a picture of the same person or anything like that. It is

01:06:44   saying this is literally that picture. So it is extremely limited in that if it's not on the

01:06:50   NCMEC database, it will never be detected if this system is working correctly, right?
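
To make the "same picture, not similar picture" idea concrete, here is a toy sketch of perceptual-hash matching. This is emphatically not Apple's NeuralHash, whose internals aren't public; it's a minimal "average hash" showing the general property described here: small edits barely change the hash, while a genuinely different image produces a distant hash. The grid size and distance cutoff are invented for illustration.

```python
# Toy perceptual hash, NOT Apple's NeuralHash. The idea: downsample the
# image to a tiny grid, then record whether each cell is brighter than
# the overall mean. Minor edits flip few or no bits, so near-identical
# images land within a small Hamming distance of each other.

def average_hash(pixels, grid=4):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // grid, w // grid
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            block = [pixels[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return ''.join('1' if c > mean else '0' for c in cells)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def matches(candidate_hash, known_hashes, max_distance=2):
    # A match means "near-identical to a specific known image,"
    # not "a picture of a similar subject."
    return any(hamming(candidate_hash, k) <= max_distance for k in known_hashes)
```

A lightly edited copy of a known image still matches, while an unrelated image does not, which is the whole point: a system like this can only flag approximate copies of images already in the database.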

01:06:55   That "if" is a disclaimer we'll get to in a little bit, but that's what this thing is

01:07:00   attempting to do. Right. This is in contrast, mind you, to the messages stuff, the iMessage stuff we

01:07:06   were talking about earlier, where that is trying to say, oh, that looks like a body part covered

01:07:10   up by a bathing suit. That is something we should figure, you know, that that's something we should

01:07:14   alert you about. This is different. This is exactly what John said. This is not, oh, that looks like

01:07:19   a body part covered by bathing suit. It's no, no, no. It's this picture matches whatever picture

01:07:25   is in that CSAM database. And Apple doesn't get the CSAM database because not only do they not

01:07:30   want it, I'm quite sure, but it is illegal for them to have it. All they are getting is

01:07:34   the list of hashes generated by it, presumably by some tool that Apple provides.

01:07:39   So the thing is, though, you just one match isn't enough. Nothing happens if there's one match.

01:07:46   There is some threshold. Nobody knows what that threshold is. That's probably for several different

01:07:51   reasons. Probably so, you know, like if we all knew that the threshold was 20, then some nefarious

01:07:57   individual could keep 19 photos on their phone and they'd be fine. But we don't know if the threshold

01:08:02   is 20 or two or two million or whatever. So one way or another, one match isn't enough to trigger

01:08:08   any action. There is this threshold and we don't know what that threshold is, but eventually that

01:08:15   threshold will be reached. And, and again, I'm totally making this up, but just to make discussion

01:08:19   easier, let's say it's 20. And so once 20 pictures are hit, then at that point, the cryptographic

01:08:27   protections that are built around these, um, these, I forget what they call them off the top of

01:08:31   my head. Safety vouchers. That's actually, before we even get to the threshold part, that's

01:08:35   an important point. When one of these matches is found, one of these safety

01:08:39   vouchers is sent to Apple, but Apple itself can't decrypt it to do anything with it until the

01:08:45   threshold is met. Like there's a bunch of cryptographic stuff, which like Casey said,

01:08:47   is probably over all of our heads, that makes that possible, using cryptography to say, okay,

01:08:52   when we find a hit, we'll send the safety voucher to Apple, but Apple cannot do anything with that

01:08:59   safety voucher. They can't tell what the original picture was. They can't tell which picture it

01:09:02   matched. They can't do anything with it until the threshold is reached. And when the threshold

01:09:06   is reached, then at that point, Apple has 20 safety vouchers from this person's phone.

01:09:10   And at that point, then because of the cryptographic stuff,

01:09:14   then they can actually decode them and say, now we need to actually look at these pictures. And

01:09:18   so that brings us to the next step. Yeah. It's kind of like the world's most complicated and

01:09:22   worst RAID array. Like, we need a certain number of these before we can decrypt any of them.
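
The "can't decrypt any of them until you have enough of them" property is a real cryptographic primitive called threshold secret sharing. Below is a minimal Shamir-style sketch of that general technique, not Apple's actual voucher protocol, which layers more machinery on top; the field size and parameters are chosen purely for illustration.

```python
# Shamir secret sharing sketch: split a secret (think: a per-account
# decryption key) into shares so that any `threshold` shares recover it,
# while fewer reveal essentially nothing. Illustrative only.

import random

PRIME = 2**127 - 1  # prime modulus for the share arithmetic

def make_shares(secret, threshold, count):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total
```

With a threshold of 3, any three shares reconstruct the secret exactly, while two shares interpolate to a wrong value, mirroring how the vouchers are useless to the server below the threshold. (Requires Python 3.8+ for the modular inverse via `pow(den, -1, PRIME)`.)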

01:09:29   Honestly, like from a technical point of view, that's a really cool idea. Like that is very

01:09:34   clever. They do a bunch of other privacy preserving stuff that again, if you can understand the

01:09:38   cryptographic stuff, where they will intentionally seed it with false

01:09:42   information. So there's no way to pick out people who are

01:09:47   potentially bad actors until the threshold is reached. They do a bunch of stuff

01:09:52   to try to be privacy preserving, because as we've learned,

01:09:58   even just from metadata, just knowing that like safety vouchers are arriving could be

01:10:03   some information that could be used to determine something. So they intentionally seed in some

01:10:07   information to add noise to the thing. But the whole point is even Apple, even under

01:10:12   threat of law, if someone subpoenaed them and said, we demand that you decrypt these safety

01:10:17   vouchers and show what these pictures are. Apple literally can't do it because of math until the

01:10:22   threshold is reached. Right. Which is very cool. And again, that 20 number that

01:10:26   we're using is just made up. We have no idea what the threshold is. But the threshold

01:10:31   is designed such that, and now this is a quote from Apple, it provides "an extremely high level

01:10:37   of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging

01:10:44   a given account." Now mind you, that's not incorrectly flagging a photo, that's incorrectly

01:10:47   flagging an entire account. So if you're to believe Apple, whatever that threshold is,

01:10:52   be it 20 or 200 or 2 million or whatever, there is less than a one in one trillion chance that any one

01:10:58   of the three of us, or anyone else for that matter, will have an oops and get our account flagged,

01:11:02   even if it shouldn't be. So this, this thing reveals some information about this because

01:11:06   we just got done saying like the whole point of this algorithm is to try to tell, is this the

01:11:10   same picture as that accounting for things like zooming, cropping, rotating, color changing,

01:11:16   stuff like that. So when I say, oh, accounting for those changes, it's clear that it's not a

01:11:21   byte-for-byte comparison, because that wouldn't survive any of those changes, right? Obviously there is

01:11:26   some amount of, I don't know if you call it machine learning, but some amount of processing that is

01:11:30   done to try to determine if this picture is the quote unquote same picture as that one, even if

01:11:36   it's been converted to black and white, even if it's been zoomed a little bit, even if the crop

01:11:40   is slightly different, even if a new section of it was blurred out, even if it has some words stamped

01:11:44   on it, you know what I mean? Like a human could tell if they're the same picture, but for computers,

01:11:48   it's harder. Like, a human can tell, oh, this is the same picture, it's just rotated a

01:11:52   little bit and zoomed. We can do that pretty easily, but computers have a harder time with it.

01:11:55   Right? So this one in a trillion chance thing, and the fact that there's a threshold at all,

01:12:01   is telling us this algorithm is not a hundred percent accurate when it comes to determining

01:12:06   if this picture is the same as the other one, because if it was, you wouldn't need a threshold,

01:12:09   right? It's not like they're trying to say you're allowed to have some number of CSAM

01:12:14   on your computer. That's not what they're saying with this threshold, like, oh, it's okay if you

01:12:16   have a little, but if you have a lot, we're going to report you to the law. It's because this

01:12:21   algorithm is not a hundred percent accurate. Right. And, you know,

01:12:26   obviously having a false positive is really bad. So to try to avoid a false positive,

01:12:30   Apple has done the math and said, we're going to make this threshold, and we're going to make it so

01:12:34   it's really, really hard to have a false positive. And there's two strategies

01:12:39   in that. One, the consequences of a false positive could be devastating to the person involved

01:12:44   in it. You do not want to be reported to law enforcement for having CSAM when you actually have

01:12:49   none because of some stupid algorithm. Right. That is super harmful and Apple would never want to do

01:12:55   that. Right. And the second thing is, since the algorithm is not a hundred percent accurate,

01:13:02   it's in Apple's interest to make sure they catch

01:13:09   the most egregious offenders, right. The whole point of this is

01:13:14   to catch the people doing the bad thing. I'm going to, I don't know much about this field,

01:13:19   but I'm going to say it's probably unlikely that people who are doing this have one picture,

01:13:23   right. They probably have more than one. So we, again, we don't know what the threshold is,

01:13:27   but by putting the threshold like this, hopefully they can avoid any false positives and also pretty

01:13:34   much catch everybody who's doing this. Again, it depends on the threshold. If the threshold is a

01:13:38   million photos, maybe this is not a great feature. Uh, but if the threshold is 10, you're probably

01:13:42   going to catch all the people. Right. Like, again, "why don't they just keep nine?" Or, if

01:13:47   we found out the secret threshold, people could just keep one under it? See also Facebook catching

01:13:51   20 million people. Like, that's not the way criminality works, and there is no system

01:13:57   that can catch the master criminals. Right. Because they just won't use the internet and they'll

01:14:00   be safe. They'll live in a shack in the woods. Like there's always some out, right. We're just

01:14:04   trying to do something which is better than nothing in this case. So, yeah. So the

01:14:08   unreliability of this needs to be a factor; the threshold, that's the way to think about this.

01:14:12   Right. And Apple's calculations presumably are well-founded, but there are lots of reasons

01:14:20   people are nervous about this, which we'll start enumerating

01:14:24   now, I think. But one of them is that it is an algorithm, and despite the fact

01:14:30   that Apple says it's one in a trillion, that's not necessarily reassuring. Now the next backstop on

01:14:34   that is when you hit the threshold and Apple can finally decrypt the safety vouchers. It doesn't

01:14:39   report you to the police at that point. What happens at that point is Apple has people

01:14:44   whose job is terrible: actual human beings then have to look at the photos and do a

01:14:51   final human-powered confirmation that yes, these really are the same photos, right? That these

01:14:56   really are, I mean, not the same, but that they really are CSAM and not a

01:15:00   picture of someone's dog, right? Human being has to make that determination. That's not a fun job.

01:15:07   But that is the backstop, saying, okay, at that point, after a human looks at it, after it's

01:15:12   passed the threshold, it's done all the things. Once it passes the threshold, they get, I think,

01:15:16   a lower-resolution version of it. They don't even get the full version of it, but they

01:15:20   get enough of it so a human can look at it, because Apple can finally decode it now that it

01:15:24   passed the threshold. They look at it, they make a determination. This is, by the way, after the one

01:15:29   in a trillion, after the one in a trillion, then a human looks at it. So even if you fall into the

01:15:34   one in a trillion thing, if it turns out not to be one in a trillion but one in a hundred million,

01:15:37   then a human has to look at it. They make the determination. If it turns out to be CSAM,

01:15:41   they report you to the authorities, because it's US law that they have to do that anyway, right?

01:15:46   Because now they have found it. A human has confirmed it and they report it.

01:15:49   Unfortunately for Apple, even this is not particularly reassuring to a lot of people, because

01:15:55   anyone who has gone through app review knows that ostensibly a human looks at every app in app review,

01:15:59   and we've all seen rejections from app review that prove that having a human look at something is not

01:16:05   necessarily a guarantee that something sane or logical will happen. Now you would hope

01:16:08   that the people doing this job have a higher threshold of reporting someone to the police

01:16:15   for child sexual abuse material than rejecting your app because they think you didn't put a

01:16:20   button in the right place. I would also hope they don't have such a volume to deal with.

01:16:24   Well, we'll get to that in a little bit cause that actually might not be true.

01:16:28   Given the, uh, how little they detected so far and what might be lurking in there,

01:16:33   it may actually be a terrible situation. But like, this is not Apple's, well, it's not

01:16:38   entirely Apple's fault, but there is a perception that, uh, you know,

01:16:42   especially within tech community that's thinking about this from a tech and privacy perspective,

01:16:46   that that doesn't actually make me feel that much better because my experience with humans at Apple

01:16:52   is not reassuring in this regard. Now I think that's probably just a sort of

01:16:57   gut reaction to past experiences that I hope has almost no bearing on this situation. Um,

01:17:02   because it seems like app review is complicated. Uh, human being looking at a picture and

01:17:08   determining whether it's child sexual abuse material seems less complicated to me. It seems

01:17:12   more of an open-and-shut type of thing. I don't think a picture of your dog is going to be accidentally

01:17:19   flagged as CSAM by an inattentive reviewer. I really hope not. Right. Um, but so why, you know,

01:17:27   so why does this feature make people upset? Why was this feature getting most of the press and

01:17:32   complaints, uh, aside from the messages feature above and beyond the messages one? Why is this

01:17:36   the one that bugs everybody? Um, I think part of it is that it applies to, uh, adults. It's not

01:17:43   just kids because, you know, who's on the internet arguing about this, probably not 12 year olds,

01:17:47   but it's a bunch of adults. And this one does apply to adults if you use iCloud photo library.

01:17:51   Right. So that's one aspect. The other ones that I just talked about are like,

01:17:55   well, Apple says it's one in a trillion, but who knows what it really is. It's not a deterministic

01:18:00   algorithm, or it's not a, it's not an algorithm that anyone really understands. So it's some

01:18:04   form of machine learning and it's kind of fuzzy and it's not a hundred percent accurate. That's

01:18:07   the thresholds that make me nervous. And the humans as the backstop don't make me feel better.
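For a sense of how a threshold turns a fuzzy per-image matcher into a tiny per-account error rate, here's a back-of-the-envelope binomial model. The per-image rate, library size, and threshold below are made-up illustrative numbers, not Apple's actual figures.

```python
from math import comb

def account_false_positive_prob(per_image_rate, num_photos, threshold):
    """Probability that an account holding num_photos innocent photos
    accumulates at least `threshold` spurious matches, assuming each
    photo false-matches independently (a simplifying assumption).
    Exact binomial tail sum; fine for modest num_photos."""
    p = per_image_rate
    return sum(
        comb(num_photos, k) * p**k * (1 - p) ** (num_photos - k)
        for k in range(threshold, num_photos + 1)
    )

# With an (assumed) one-in-a-million per-image false-match rate and a
# 1,000-photo library, requiring 30 matches makes a false flag
# astronomically unlikely, while a threshold of 1 would flag roughly
# one innocent account in a thousand.
whole_account = account_false_positive_prob(1e-6, 1000, 30)
single_match = account_false_positive_prob(1e-6, 1000, 1)
```

Under this toy model, the threshold, not the per-image matcher, is what drives the combined rate down to something like the "one in a trillion" Apple claims.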

01:18:11   So there's some worry about being flagged unjustly, despite all of the backstops that

01:18:16   Apple's put into it. Um, one of the more fundamental underlying discomforts with this entire system

01:18:22   is that it feels, I'm going to say unjust, un-American, not in keeping with the American

01:18:30   justice system because people have some expectation and you know, part of the constitution, the fourth

01:18:37   amendment or whatever, but like in, in the US anyway, there is a sense that if you are looking

01:18:46   into something in my life, there has to be some reason I'm suspected of a crime. So you look at

01:18:52   my bank records to see if I've been laundering money. You know, uh, like I, I, you think I have

01:18:59   stolen merchandise because there's a, you know, someone who matches my description caught on a

01:19:03   security camera, stealing something from a store. So you have a warrant to search my house, right?

01:19:08   That is generally the way our criminal justice system works. That if there is some suspicion

01:19:14   that you have done a thing, you have to convince a judge that we think this person might do this

01:19:19   thing. Therefore we need to search something and you get a search warrant and you look into it.

01:19:22   Right? The other side of that is where you just watch everybody all the time.

01:19:28   And that way, if anyone does anything wrong, you catch them. And that's what we call surveillance.

01:19:35   And this feature does not have the concept of probable cause or any of these type of things.

01:19:43   It's surveillance. It is watching every single picture on every single person's phone all the

01:19:51   time. Now, Apple isn't the US government. You know, there, there is not the same

01:19:57   situation at all, but from a sort of emotional feel and justice perspective, it feels like I am

01:20:04   now being surveilled that everybody is being surveilled. That we, that everything we're doing

01:20:09   is being watched just in case we ever do something criminal. Again, the messages feature is exactly

01:20:14   the same, but it's like, Oh, that's kids only. It only applies to kids. It doesn't apply to me.

01:20:18   I don't have to worry about that. But every photo that is sent and received by people

01:20:22   who are under the age limit of that messages feature, every single photo has that ML thing run against

01:20:28   it. If you've opted in, right? And same thing with this thing. If you're using iCloud photo library,

01:20:32   every single one of your photos going into iCloud photo library has this applied to it.

01:20:37   And for some people, especially people who are sort of security conscious and looking at, or

01:20:41   privacy conscious and looking at this through the broader lens of, you know, what seems fair and

01:20:48   just in a technological world, this doesn't feel good. It doesn't feel good to know that you are

01:20:52   constantly being surveilled just in case you do something wrong, and everyone trots out all that

01:20:56   stuff. Also, if you're not doing child sexual abuse stuff, you have nothing to worry about.

01:21:00   If you have nothing to hide, it's okay for the East Germans to listen in on your phone calls. Right?

01:21:04   Like, again, Apple is not the government. It is not a direct comparison, but it feels similar.

01:21:10   People don't like the idea that you're being surveilled. Setting aside the fact that like,

01:21:14   you know, the CSAM scanning is going on for every photo uploaded to Facebook, every photo put into

01:21:18   Google Drive, every photo put into Microsoft OneDrive, like that's also surveillance because

01:21:24   they're not discriminating. They're not saying, oh, this person might be a criminal. We're going

01:21:27   to scan. They just scan everybody's because that's what computers do. But it feels like surveillance

to people. And this gets back to the Apple argument of like, oh, we didn't want to do this until we

01:21:38   could do it in a privacy preserving way. But by doing it in this quote unquote privacy preserving

01:21:43   way, it still feels like surveillance. No, they're not scanning it on the server, but they're still

01:21:47   scanning every picture for everybody. They're just doing it on the client. And Apple can't even make

01:21:51   the argument of like, oh, we can't even see your photos because they can because Apple doesn't do

01:21:55   any encryption on the like iCloud backups and iCloud photo library. Like in the end,

01:22:02   if you back up to iCloud, Apple can get at those backups. And so some people are making the argument

01:22:07   that this feature is a precursor to Apple finally providing end-to-end encryption for iCloud photo

01:22:11   backups. Again, more arguments about like, well, the criminals just won't do iCloud backups. Like,

01:22:15   you know, they will and they do because they're people and some of them won't, but most of them

01:22:21   will, right? Because it's just law of averages. Anyway, if this is a precursor to end-to-end

01:22:26   encrypting iCloud backups, great. But if it's not, it doesn't feel any more privacy preserving. And

01:22:33   I say feels specifically here than scanning on the server side. Apple's argument is that

01:22:38   it is more privacy preserving because the scanning happens on your device and everyone gets the same

01:22:43   OS with the same database of NCMEC images. And you can prove that cryptographically and you can

01:22:47   look at the bytes and you can be clear that we're not targeting an individual and so on and so forth.

01:22:51   But in the end, Apple is still saying, hey, every single person who has an iPhone who uses iCloud

01:22:56   photo library, your device that you hold in your hand is looking at every single one of your photos.

01:23:01   And as Ben Thompson pointed out in his thing today, a lot of people feel like their own phone,

01:23:07   quote unquote, spying on them somehow feels worse than, you know, sending it to a server

01:23:13   where it gets scanned by Apple, right? Because you feel like you're being betrayed by the physical

01:23:17   thing you hold in your hand. Like even though it's not actually worse, it's the same thing, right?

01:23:21   And in many ways it is more secure for it to be happening on device and not, you know, not sending

01:23:27   it unencrypted across the wire or letting Apple see it or all those other things, setting aside

01:23:30   the iCloud backup issue. But it feels worse. And this is a lot of Apple's problem with this feature

01:23:35   that every part of it, Apple is making the argument that this is, preserves your privacy better.

01:23:42   But to the average person, when you explain it to them, it feels worse than the alternatives

01:23:48   that Apple says are worse. But in the end, all of them are essentially some form of surveillance

01:23:53   by your device, which is common practice and is the way that we use computers to try to

01:23:58   catch criminals in this particular situation, which I don't know. Again, you know, people

01:24:05   who don't like this, okay, what should we do instead? Well, there's always the people who say,

01:24:07   let's do nothing. I'm not in favor of that solution. And neither is pretty much anyone

01:24:13   else in the tech industry. And if those laws come through that says Apple has to scan, they need a

01:24:17   solution. And if you needed a solution, this one is in keeping with Apple's values, which is we'd

01:24:22   rather do it on device. We don't want to, you know, we don't want to compromise our end-to-end

01:24:27   encryption where it exists. In theory, Apple, this leaves Apple free to do end-to-end encrypted iCloud

01:24:32   backups at any point in the future while still being able to say to government regulators,

01:24:36   hey, we're still scanning for CSAM, right? You can't make us unencrypt our end-to-end encrypted

01:24:42   backups because that's not stopping us from doing the thing you want us to do, you know,

01:24:45   save the children and all that stuff. But from a feel perspective, I don't think this feels great

01:24:53   to a lot of people. I mean, for me, the more I sit with this and the more we learn about some of the

01:24:59   details, I'm put a little bit more at ease about it. You know, and the more I think about a lot of

01:25:06   it, you know, it's tricky because as you mentioned earlier, like, CSAM is a special case, both morally

01:25:13   for most people, but also legally in most jurisdictions that, you know, normally you can

01:25:19   take a picture of, in most cases, whatever you want, and it's generally not going to be illegal

01:25:26   for you to even possess that picture. You know, there's things like copyright infringement that

01:25:30   could be a problem or, you know, other issues, but like, for the most part, most types of data

01:25:36   are not themselves like totally illegal to even possess, whereas this is. And I think the more

01:25:44   life experience or perspective you have, the more reasonable you think that is. Like,

01:25:50   when, if you know at all how horrific this kind of stuff can be, then yeah, you kind of realize

01:25:56   why it should be illegal. It's, you know, to even possess it. So Apple then is in a tough position

01:26:03   because they sell, you know, a billion devices that have really good encryption built in and

01:26:10   really good privacy built in. And it gives their customers the ability to do a lot of illegal

01:26:16   things to great effect. A lot of that you can, you can, you know, look the other way and say,

01:26:21   well, you know, it's out of our hands. It's not our problem. And, you know, the good outweighs

01:26:25   the bad. But there is this, like, there's always this big exception that what if you

01:26:30   enable child abuse? That's a pretty horrible thing. And in this case, if you look at the way

01:26:38   they designed this feature, you know, we'll talk about potential future motives in a minute,

01:26:42   but if you look at the way they designed this feature, they didn't design it to prevent the

01:26:48   iPhone camera from capturing CSAM images. They didn't prevent other apps from transmitting

01:26:57   them back and forth. You know, they, they, like, you know, we mentioned earlier about the,

01:27:01   the iMessage feature. I think a lot of kids are going to be using other services to do that kind

01:27:05   of thing, not just iMessage, but anyway, they did roughly the bare minimum they could do

01:27:11   to keep themselves out of hot water with the CSAM scanning feature. Like the iMessage,

01:27:17   the iMessage thing, that's a little bit different, but they, what they did was keep themselves out of

01:27:23   possession of data that's illegal to possess. So in that way, they clearly did a very narrow thing

01:27:31   here. Well, they're not keeping themselves out of possession because surely they possess tons of it

01:27:36   now. And they're not going to find that tons of it unless it is uploaded or downloaded from iCloud

01:27:40   photos. Do we know if they're going to be retroactively scanning existing libraries? I

01:27:44   would assume they are. They won't, they won't be because that's the whole point. They're not doing

01:27:47   server-side scanning. Now they will scan it if it comes to or from a phone, which may allow them

01:27:52   to scan it. But like, I mean, if it goes off the device because you're optimizing storage and you pull

01:27:56   it back in, but Apple has been very explicit that they are not scanning it server-side.

01:28:00   Eventually they'll probably get it all because if it's in an iCloud photo library and you

01:28:04   load up a new phone or even just scroll a whole bunch or, you know, like things go back and forth

01:28:08   from iCloud photo library all the time to the phone. And every time something goes back and

01:28:12   forth to any device, a Mac, a phone, I don't know if it includes the Mac or it's just iPadOS and iPhoneOS. But

01:28:16   anyway, anytime it transfers, it is then scanned on that device. But they're explicitly not saying,

01:28:24   oh, and by the way, we're going to go through our back catalog of all the, of all these iCloud

01:28:27   phone backups that we have access to because we have the keys and scan all the photos, right? So

01:28:31   Apple will undoubtedly continue to be in possession of CSAM just as they are at this moment.

01:28:37   But going forward, they are trying to catch any collection of it that starts to exist or that is

01:28:44   newly downloaded to a new phone or a new iPad or whatever. Yeah, that makes sense. All right. But

01:28:48   anyway, I think, you know, they, they are clearly trying to, for the most part in, in most ways,

01:28:56   still let your device be your device. In this case, they are, they're basically mostly protecting

01:29:02   themselves from being in possession of data that's illegal to possess. And so I'm, I'm a little bit

01:29:09   heartened. Did we figure out if that's a word or not? It is. I don't know why you doubt this. It

01:29:14   is. I'm a little bit heartened that they've done this in a relatively narrow way. You know,

01:29:19   there's lots of ways that governments have applied pressure to tech companies that, that I think are,

01:29:27   are a little bit more overreaching. Like for instance, try scanning a picture of a $100 bill

01:29:33   or any, you know, Euro bank note or, you know, any, any modern bank note, try scanning it and

01:29:38   open it up in Photoshop. See how far you get. Oh, that, that actually brings up the other

01:29:42   big objection to this, which is the slippery slope thing having to do with governments.

01:29:45   So we just described the feature. It's the NCMEC database. It's the comparison against things.

01:29:50   One of the things people jumped on really early was like, first of all, how does stuff get into

01:29:55   the NCMEC database. Cause if it's totally opaque and Apple doesn't even get to know what's in there,

01:29:59   you're just trusting NCMEC. What if someone, you know, from some company says here,

01:30:03   put in this picture of our, you know, copyrighted image into your NCMEC database. So then we'll

01:30:08   know if anyone shares our copyrighted image or whatever. And the second thing is that's just

01:30:13   one database. Next, it's going to be a database of anything, you know, movie companies will be

01:30:18   putting in databases of movies and trailers. And we're just going to find every, you know,

01:30:23   it's going to all be copyright infringement and patents. And also this stuff will just be like,

01:30:26   Apple would just take anyone's database and just compare against and all this stuff. There's lots

01:30:30   of slippery slope arguments there. Apple for what it's worth has explicitly said, no, Apple itself

01:30:36   is not adding stuff to the database. It's not letting anyone else add stuff to the database.

01:30:39   NCMEC's entire purpose in life is not to allow random companies like Disney to add pictures

01:30:44   of Iron Man to the database because they don't want people sharing pictures of Iron Man. Like

01:30:47   it is very, very narrowly defined, right? The second part of the, and Apple says that they'll,

01:30:52   you know, like that's the intended function of this feature, right? Second part is, okay, well,

01:30:57   but the government can make Apple do all sorts of things. And in fact, the government can make Apple

01:31:00   not tell people about it. So what if the government makes Apple add pictures of secret, like Pentagon

01:31:07   documents, so they don't want it to be leaked or whatever, and we need, we want it to be leaked

01:31:10   because they show like, you know, abuses in Abu Ghraib or whatever, right? The government can make

01:31:15   Apple do that. And then the government can make Apple not say anything about it. All right. So

01:31:20   the solution to the government being able to force companies to do things that we don't like,

01:31:25   when you live in ostensibly a democracy, plus or minus the voter suppression and

01:31:31   gerrymandering and all the other terrible things that afflict this country, is that we change the

01:31:35   government and the government changes the laws. Then we have things in the constitution that

01:31:40   prevent, you know, like there was a whole big argument about how the fourth amendment

01:31:43   would prevent any sort of evidence, you know, gathered in this way from being admissible in

01:31:48   court or whatever. But anyway, in the US, in theory, I'm being buried under a storm of asterisks

01:31:54   here. They're just falling from the sky, just burying me under a pile of asterisks. Yeah, I know.

01:31:59   But, but anyway, in the US, in theory, we have a mechanism to stop that from happening. But

01:32:05   what it comes down to is, yes, companies are subject to the government that runs the country

01:32:12   in which they operate. And in the US, they're subject to the US government. And the US government has a

01:32:15   bunch of terrible laws. And it's very difficult to change those terrible laws. And we know all that,

01:32:19   but that is that situation. But then one step up from that is, okay, let's say you're okay with the

01:32:24   US and you think they're not going to do anything too terrible. What about in China? Well, I have

01:32:28   some bad news, as we've discussed in the past. Apple has a China problem and the world has a

01:32:33   China problem. And part of that problem is that China already has access to everything that Apple

01:32:38   does in China, because China has made Apple put all their stuff in Chinese data centers where

01:32:43   China holds the keys, right? That's not a problem Apple can solve. The only way they can solve it

01:32:50   is say we're either going to be in China and do what Chinese law dictates, which is essentially

01:32:54   give China access to everything, which is what the situation currently is, or we don't do business

01:32:59   in China, which is some other what some other companies have chosen to do. So that's the

01:33:03   conversation you need to have there, which is like, first of all, China doesn't need to stick

01:33:07   things in the NCMEC database. They have access to everything because they're an oppressive

01:33:11   authoritarian regime, right? They've already done that. They probably have way better systems in

01:33:15   this for, you know, keeping track of the dissidents and doing all terrible things that they do, right?

01:33:19   That's terrible. That's also not a problem Apple can solve. And it's not made worse by this feature.

01:33:25   So like so many things, if you don't trust your government to not do oppressive authoritarian

01:33:32   things, nothing the technology company that operates in your in your country can do will

01:33:38   fix that. Like Apple can't fix the US government except for through lobbying and all the other ways

01:33:42   they can fix it. But again, as per all the asterisks that are falling down from the sky from

01:33:46   Marco. Government problems unfortunately need to have government solutions. So this,

01:33:55   the reason technology is so difficult to regulate is because the issues are complicated and

01:33:59   nuanced. And there's lots of you know, we have to do this because terrorism or save the children or

01:34:04   whatever. So we need backdoors in online encryption. And we continue to fight that as tech

01:34:08   savvy voters and consumers. But I think the most salient point here is that regardless of

01:34:15   your dim view of the US government, and I think we all share that we can say that in the US,

01:34:21   our ability to change what the government can and can't do is way better than in China. And as we

01:34:29   said at the top of this program, this policy is only in effect in the US. So if you see this,

01:34:34   and you think this is terrible, the government can make Apple do all sorts of sneaky things.

01:34:37   A I would say, yeah, the government could already make Apple do all sorts of things

01:34:42   and and force them not to tell you about it. This has already happened and will continue to happen.

01:34:46   And if you don't like that, vote for people who want to change that. That's the only stupid tool

01:34:51   we have to change that. No, you know, there is no complaining on Twitter about Apple policy that is

01:34:56   going to change that. Because Apple, believe me, Apple does not like being told to do something

01:35:00   by the government, and also being told that they can't tell anyone about it. Apple doesn't

01:35:04   like that either. Right? So if you don't like that, and if you feel bad about that,

01:35:09   let's change the laws related to that. And again, in theory, the Constitution is some form of a

01:35:16   backstop against the most egregious offenses, because our certain rights are very difficult

01:35:20   to change without a constitutional amendment and yada yada yada, right? And then if you're worried

01:35:25   that Apple is going to let China do whatever they want, they already are, sorry, right? And if you're

01:35:30   worried that Apple is going to let some other country do whatever they want, this eventually

comes down to the foundation of trust that we've talked about when talking about many features

01:35:38   in the past, which is in the end, you have to trust your OS vendor or your platform vendor with

01:35:44   something because no matter what they do, like, oh, we have end to end encryption. Somebody writes

01:35:48   the app that implements end to end encryption. And if you don't trust the person who's writing the app,

01:35:54   even if it's open source, oh, I trust them because I can see the source code. Oh, really? You audited

all those lines of your source code? If that was true, Heartbleed wouldn't have happened, right?

01:36:01   In the end, you have to have some baseline level of trust of the person who is

01:36:05   implementing your encrypted system, even if you agree with all of the, you know, the way it's

01:36:10   supposed to work. That's what it always comes down to. Do you trust Apple to not secretly put pictures

01:36:17   of Mickey Mouse and Iron Man into the database and find people who are illegally copying,

01:36:21   like movie trailers or something stupid, right? You either do or you don't. And if you don't trust

them, whoever you do trust, buy your phone from them instead, right? That's what it comes down to.

01:36:30   Because yes, Apple, like, whatever their encryption setup is, in the end, the messages

01:36:34   app eventually has access to all of your messages. The mail app eventually, because it shows you them

01:36:38   on the screen. Like they're in memory on the phone. Like the phone could be doing whatever it wants.

01:36:44   Like it doesn't matter about all this encryption, provable security or whatever. Something has to

01:36:48   decrypt them and send the information to your eyeballs. And the people who write that software,

01:36:53   you have to have some amount of trust in them because somebody has access to it. And it's not

01:36:58   just you, it's the person who writes that app. And that's Apple in this case. So if you find yourself

01:37:02   spiraling down a trust hole here and being like, I can't trust Apple to run this CSAM system because

01:37:08   they could do anything. Yes, they can do anything. China can do anything. The US government can do

01:37:13   almost anything, right? That's true. But each one of those stages is like, what can I do to change

01:37:20   the government in China? What can I do to change the government in the US? And do I trust Apple

01:37:24   to do something that's in my interest? On the plus column for Apple, they have proven in the past

01:37:30   that they will resist US government pressure to do a thing that would be essentially a PR win. Oh,

01:37:36   Apple is so great, they unlocked that terrorist's phone for the FBI. Apple refused to do that,

01:37:41   despite the fact that to many people it made them look bad. Oh, does Apple side with the terrorist?

01:37:45   Do you enjoy, you know, the San Bernardino killer? Is that your number one customer? You want to

01:37:50   protect that person? Because there is a higher principle. So if you're worried that Apple would

01:37:55   never do that, they have at least once and probably more times proven that they will do that. If you're

01:38:00   worried that Apple is being forced by the government to do things and not say anything

01:38:02   about them, yeah, that's probably happening. But nothing about what Apple implements can really

01:38:08   prevent that. You could say, oh, if Apple didn't implement this feature, then they wouldn't have

01:38:11   to bow to government pressure. No, because once the government can make you do stuff and not say

01:38:14   anything about it, there are very few limits on that. Like, and again, iCloud backups are not

01:38:20   end to end encrypted. So already, the government can probably force Apple to give them that

01:38:24   information not saying anything about it. Right. So I kind of understand the argument that tech

01:38:30   companies shouldn't implement features like these because it makes it easier for the government to

01:38:34   demand they do things. But I don't really buy into it too much. Because if your problem is that the

01:38:39   government can make a company do a thing, the solution is not tech companies should never

01:38:44   implement features, because government can make them use the features for nefarious purposes. The

01:38:48   solution is, it shouldn't be legal for the government to use these to make companies do

01:38:52   these things for nefarious purposes. And in general, it's not except for in these quote,

01:38:56   unquote, extreme circumstances, 9/11, never forget, where these laws can be used and abused to make

01:39:03   companies do things because of terrorism, because of child sexual abuse, and so on and so forth.

01:39:07   And then finally, as we've been discussing the whole time, sometimes, as they say at the

01:39:11   beginning of the show, certain crimes are particularly heinous. Like I'm not getting

01:39:15   the quote right. That's law and order SVU for people who are not getting the reference I'm

01:39:19   trying to make. Sometimes there is what is the worst of the worst of the worst thing that society

01:39:24   treats differently for reasons we all agree on. And in those particular cases, I think it is worth

01:39:30   it to try to do something rather than doing nothing because you think the nothing will somehow

01:39:35   protect you against an oppressive government, slightly, and I don't think it will. So as with

01:39:41   the messages feature, if this feature works as designed, I think it is a reasonable compromise

01:39:47   and is way, way, way better than the nothing that Apple had been doing before this.

01:39:51   And I think it's way, way, way better than

01:39:54   what governments might eventually force them to do if they hadn't done this.

01:39:58   Yes, exactly.

01:39:59   Yeah, my first reaction to this was, this is garbage. And the more I read on it, the

01:40:06   more my reaction and my thoughts on it are calmed down. I still, I think maybe, Jon, you're slightly,

01:40:15   I don't know, underselling is the best word I can come up with. But I understand, I really,

01:40:21   really, really understand, certainly after 2016 through 2020, I understand better than I ever have

01:40:28   that we, it is easy for us to lose control. This already sounds bad, but it's easy for us to lose

01:40:36   control of our government. And by that, I mean rational humans. And so when one cannot fundamentally

trust your own government, which has probably been true my entire life, but it's only felt

01:40:49   true in the last five-ish years, particularly 2016 through 2020. When one can't trust their

01:40:57   own government, then it makes it hard to trust that they won't compel Apple to do this. And

01:41:06   ultimately, as much as Apple will say, no, we will refuse, we will not capitulate, we will never allow

01:41:12   this to happen. Even with that said, ultimately, when it comes down to it, the government has

01:41:19   guns and bombs. And not that they would literally bomb Apple, but like if the government really went

01:41:24   that haywire and really wanted to win this argument, they will win the argument. There is no

01:41:30   ifs, ands, or buts about it. And the reason I think everyone's worried, including me, although by and

01:41:35   large, I'm not too upset about this anymore. But the reason anyone, everyone is worried is that

01:41:40   before there was no real mechanism that we knew of to scan your photos for content, justified or not,

01:41:49   that someone has deemed inappropriate. There was though, because Apple has access to all

01:41:54   your iCloud backups. If the government came to you and said, hey, we want you to scan all of

Casey Liss's photos, they could totally do it right now without this feature. Like that's what I'm saying.

01:42:02   That this doesn't add any, you know what I mean? And that's, that's where we get to the end-to-end

01:42:06   encrypted backup thing of like closing that door. But right now that door is not closed. Like, so

01:42:10   like I understand the argument, like if you add a feature, it makes it easier for the government

01:42:14   to make you do a thing, but the thing that the government would make you do, they can already

01:42:17   make Apple do it. And they have been able to make, and in fact, they have actually done it. I'm

01:42:21   pretty sure the government has used law to get access to people's iCloud backups, right. With

01:42:27   or without letting Apple tell you that it's happening, right. They do it all the time.

01:42:31   That's already technically possible, right. The unlocking of the phone is like, oh, we just want

01:42:35   to see what's on that phone. But if it was in an iCloud backup, they would have had access to it.

01:42:38   Right. So like, I, I know what people are saying of like, if you implement this feature,

01:42:43   the government can force you to do it. But I think the solution to that, like, I don't think this

01:42:47   strategy of we'll just never implement features that could be abused by the government is a good

01:42:51   one because almost any feature can be abused by the government and lots of useful features can be

01:42:55   abused by the government. The solution to government abuse is government. Like, is, you know,

01:43:00   part of the reason the constitution exists and the whole argument that I saw in some article or

01:43:05   whatever of like, would the fourth amendment allow you to submit as evidence in any kind of criminal

01:43:11   trial information gained by forcing Apple to scan things like, you know, secretly or whatever. And

01:43:18   like, you know, like that's, that's the reason we have courts and the constitution and our laws and

01:43:22   the fourth amendment to try to protect against those kinds of abuses to try to protect against

01:43:26   the government saying how we're just gonna, the government's gonna listen to everyone's phone

01:43:29   calls. Oh yeah. Does that sound familiar to anybody? Yep. 9/11. Never forget. Like that the,

01:43:36   this is a problem, but when I see this problem, I don't think this is a problem that needs to be

01:43:41   solved by tech companies. It's not, it's, it's a problem that tech companies have to live with. And I

01:43:45   get that argument, but it really just sort of makes me even more hardened to fight against

01:43:50   stupid laws, stupid politicians that appoint stupid judges through stupid processes that

01:43:56   don't respect the will of the people. Like there's plenty of problems here,

01:43:59   but the way I feel like attacking them is not through the tech stack and living within those

01:44:04   limits. I feel like this specific feature of the NCMEC database and scanning for CSAM on devices

01:44:12   against a collection of data that the government already has access to is not a feature that,

01:44:17   that worsens the situation. Like I feel like, even acknowledging that yes, our government is bad,

01:44:22   it doesn't give the government access to anything they didn't already have access to.
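The mechanism being argued about here, matching photos against a fixed database of known-image hashes, can be sketched in a few lines. This is a simplified stand-in, not Apple's implementation: the real system uses a perceptual hash (NeuralHash) plus threshold secret sharing rather than a plain cryptographic hash, and every name below is made up for illustration.

```python
import hashlib

def make_blocklist(known_images):
    # Hash each known image; only these exact hashes can ever match.
    return {hashlib.sha256(img).hexdigest() for img in known_images}

def is_flagged(photo_bytes, blocklist):
    # A photo is flagged only if its hash is already in the database.
    return hashlib.sha256(photo_bytes).hexdigest() in blocklist

blocklist = make_blocklist([b"known-image-one", b"known-image-two"])

assert is_flagged(b"known-image-one", blocklist)
# Anything not previously entered into the database passes untouched,
# so the scanner only ever finds images someone put on the list.
assert not is_flagged(b"some-other-photo", blocklist)
```

The narrowness Jon leans on is visible in the sketch: this kind of scanner cannot answer "who has a picture of X" unless a hash of that exact image was added to the list beforehand.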

01:44:27   I don't, I, I, I do see what you're saying. I don't think I entirely agree. I think the rub

01:44:32   for me is that yes, the government could say scan Casey's photos for such and such imagery and

01:44:41   presumably right now, because you two jerks made me join iCloud photo library,

01:44:46   then it is hypothetically possible. Sure. Your photos were all over Google before

01:44:51   they're scanning everything. That's even worse. Yeah. Your

01:44:53   photos have already been fully scanned by Google. Oh, absolutely. Well, hopefully not anymore. Well,

01:44:58   I guess it never really dies. We were getting sidetracked. Did you empty your trash yet?

01:45:02   Yes, I did. So, um, so the thing is, is that there was, you could argue and nobody really

01:45:11   knows, but you could argue that while it is easy for the government to say, scan all of

01:45:15   Casey's photos and look for such and such, I would assume maybe ignorantly, maybe naively,

01:45:21   that it is less easy or it was until iOS 15 less easy for the government to say, Hey,

01:45:26   I would like to know across all Apple users in the United States who has a picture of Trump or

01:45:33   whatever. And now there is clearly a mechanism that Apple claims would never be used for this,

01:45:40   that, you know, uh, NCMEC or whoever they are, they would never, would never give us an

01:45:45   image like that. But the, the technical solution to show me all of the iPhones with a picture of,

01:45:54   of Trump on it, they could hypothetically do that now in a far easier way than they ever could

01:46:00   before. And to what you said earlier, do you not remember when the U.S. government was listening

01:46:04   into every single phone call in the entire United States? Does that not ring a bell? Like,

01:46:09   do not underestimate the government's ability to do like, you know, well, they could target,

01:46:14   they could tap my phone, but they're not going to listen to all the phone calls and that would be

01:46:18   crazy. No, they will. Like, the government can absolutely look at every photo in every iCloud

01:46:24   backup if they want to; they can look at every photo going across the entire industry. Like that's

01:46:28   the power of our, that's our tax dollars at work. Are we making our own little oppressive regime

01:46:33   under the guise of fear-mongering for terrorism? Uh, that's, those are all terrible things that

01:46:37   have happened in our country and are probably still happening. Exactly. Right. And it's,

01:46:43   you know, that's again, the difference between surveillance for like technology enables

01:46:47   surveillance. Like, there's plenty of sci-fi on this, right? Without technology, you have to

01:46:52   look at just the one person, but technology is like, you know what? We can just look at everything

01:46:56   all the time. Why don't we try that? Like, that's why so many sci-fi stories have to do with the

01:47:00   techno dystopia where, you know, the panopticon where you're being watched all the time. That's

01:47:04   not possible with humans. It's very possible with computers. And so, you know, again, with

01:47:08   the discomfort, Apple's solution is essentially surveillance, private surveillance by a private

01:47:13   company of private people's stuff, right? Uh, but government also does surveillance and thanks to

01:47:18   technology, they can also do it on a mass level, right? And, you know, so if, if the government,

01:47:24   for all we know, the government is already doing this without Apple's knowledge, that's another

01:47:26   thing that our wonderful government does sometimes, see the phone tapping or whatever. Um, but, you

01:47:32   know, and it's not, again, it's not a human listening. It's machines processing. That's always

01:47:35   the way it is, the magic of computers. But like, that's why I think you have to look at these in

01:47:39   terms of capabilities. If you are tasked with searching all, uh, photos for every U.S. citizen,

01:47:45   your go-to is not, let's get something into the NCMEC database, right? Your go-to is not,

01:47:51   aha, finally Apple invented this feature. We'll finally have our opening. No, you've long since

01:47:55   implemented your own solution to this that is not Apple specific, that is not Google specific,

01:48:00   that is not Microsoft specific, that spans the entire internet and has nothing to do with any

01:48:04   specific feature a tech company had to build for you, right? And, um, like there's all sorts of

01:48:10   conspiracy theories. You can think about how that might be done. But like, that's where I get to:

01:48:14   you really need to look at the specific features. Does this specific feature make it more likely

01:48:19   that this bad thing is going to happen? And this specific feature, in this specific case, I think

01:48:24   doesn't because it doesn't provide any new capabilities and it doesn't even make it any

01:48:28   easier. In fact, it's harder because of the limitations of this database and exact matches

01:48:32   and so on and so forth. It's easier to just scan everything for, you know, anything you want

01:48:38   with your own scanning technique, not being as strict as saying it has to be in this fixed

01:48:41   database or whatever, and not doing it client side; scan them all server side using whatever logic you want,

01:48:46   right? Look for whatever you want. You're not limited by this feature. This feature is too

01:48:50   limiting to be useful as a government tool. Government has much better tools already at

01:48:54   their disposal that I feel like they would prefer, which is why this specific feature doesn't bother

01:48:58   me. The broader question of like, why is Apple implementing essentially surveillance features

01:49:04   is slightly bothersome, but I think that is mostly explained by the fact that they're doing this for

01:49:10   essentially trying to be narrowly targeted, as Marco was saying before, narrowly targeted

01:49:14   to their own apps and the worst case scenario, the thing everyone agrees is awful, that has

01:49:17   specialized law already written for it. And so if you're going to be comforted by any of the

01:49:21   narrowness, this has all the narrowness you could possibly imagine. Yeah. And to be clear, and Casey,

01:49:26   I agree with your concerns for the most part. I think we all saw how big the mountain of asterisks

01:49:35   on our government was over the last, and not even just from 2016 to 2020, but I would even say a lot

01:49:41   of that happened from 2000 to 2016 as well. It happened much longer than that, but it started

01:49:45   affecting white men recently, so we all know. Yeah. I mean, that's the truth of like, if you

01:49:52   think you have distrust of government or distrust that government's going to do things that are in

01:49:55   your best interest, you're very lucky if you just had that realization in the last decade or so.

01:49:59   Most Americans have had that realization for way longer. Yeah, exactly. And so,

01:50:03   I have a slightly more, I guess, defeatist view on this, but I think that it enables a more

01:50:10   clever solution, which is, I think people keep saying like, "Well, this is okay in the US,

01:50:16   but what happens if China gets a hold of this?" No, it's not okay in the US either. It's not okay

01:50:21   for the government to have widespread surveillance powers in the US either. And we have seen over and

01:50:28   over again how the freedoms that we claim to have, the justice system that we claim works and is

01:50:36   reasonably impartial, the oversight the government agencies are supposed to have over each other,

01:50:42   the checks and balances, we've seen how all of that can just be forgotten and made exceptions

01:50:48   for at the drop of a hat. And it isn't even just like one or two bad presidents that get us there.

01:50:54   We have so many exceptions on all those freedoms and protections that we think we allegedly have.

01:51:02   And the reality is, we have a government that largely does whatever it wants, and that when

01:51:09   bad actors come in, they are able to get a lot of bad stuff through our system. I mean, geez,

01:51:18   it's, I'm always reminded of like how, imagine how dark things would have gotten in the last four

01:51:22   years if they were actually competent. They were like incredibly cruel and mean-spirited,

01:51:28   but they weren't very competent. Imagine if they were also competent, like how much damage could

01:51:33   have been done. - It'd be like the Reagan years. (laughs)

01:51:37   - So my point is, if you desire or if you need for whatever you are trying to do, if you need

01:51:46   super privacy, if you wanna have private conversations, say, about the government,

01:51:50   you are not doing yourself any favors by having those conversations on public cloud services

01:51:58   that are not end-to-end encrypted. There's lots of arguments whether iCloud should have end-to-end

01:52:04   encryption for everything. iMessage is by default, obviously there's the issue of what happens when

01:52:09   it backs up itself to iCloud, which is, I forget if that's the default now, but that can be turned

01:52:14   off and it was off for a long time before it existed. Anyway, the point is, with

01:52:20   governments that are ill-intentioned, which over an infinite time scale is gonna be every

01:52:26   government at some point, if you want your data to be above government surveillance, you have to

01:52:34   take a purely technical approach to that and use things like strong encryption and even then,

01:52:39   hope that the NSA hasn't broken that encryption very easily in ways that you don't know about yet.

01:52:43   - Or intentionally weakened it, you didn't even know, but they weakened it from the beginning.

01:52:46   I love those conspiracy theories and some of those you look at and you're like,

01:52:49   "It doesn't make me feel good." - Yeah, exactly. But yeah, so the point is,

01:52:54   if you want to get out of the potential for governments to abuse their power and for Apple

01:53:02   to abuse its power and to work together to try to get you, the way you do that is using technological

01:53:10   measures, using encryption and stuff that is beyond, where you are protected, again,

01:53:16   assuming that it's good encryption and it hasn't been cracked or broken or sabotaged,

01:53:20   you are protected by math and logic and provable things, not just policies.
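"Protected by math" has a concrete meaning, which a toy example can show. This is only an illustration, nothing any of these companies actually ships: a one-time pad built from Python's standard library. With a truly random, same-length, never-reused key, the ciphertext is unreadable without the key as a matter of information theory, regardless of policy or law.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each message byte with a random key byte.
    # Encryption and decryption are the same operation.
    assert len(data) == len(key)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))  # must be random and never reused

ciphertext = xor_bytes(message, key)
assert xor_bytes(ciphertext, key) == message  # the key holder can decrypt
```

Real end-to-end systems use key exchange and authenticated ciphers rather than one-time pads, but the property Marco is describing is the same: whoever lacks the key is stopped by math, not by a promise.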

01:53:25   - I don't know about that, but the law outweighs math though. Remember when there were

01:53:30   export restrictions on strong encryption and the PlayStation 2 couldn't be exported or

01:53:36   whatever? That's the threat we keep getting: if you don't do anything, the government will

01:53:43   make some terrible law outlawing end-to-end encryption. So yeah, math is the protection

01:53:48   against, that's why Apple can refuse the FBI's thing, is like, "We literally can't do that."

01:53:52   It's physically impossible because of math, right? But the government can always come back and say,

01:53:56   "Oh yeah, guess what? End-to-end encryption is illegal now." And that's super bad. So in the end,

01:54:01   the solution to all this all has to be a government-powered solution. In the interim, when

01:54:06   we are protected by whatever crumbling foundation remains of our constitution that protects our

01:54:12   supposedly inalienable rights, as upheld by a bunch of lifetime appointed judges who got there

01:54:19   by an incredibly corrupt, terrible process, and many of them are themselves terrible people,

01:54:24   hopefully we protect enough of our foundational rights so that, let's say, if the government

01:54:30   makes a terrible law that makes it impossible to provide any kind of secure communication,

01:54:35   that that would be shown to be unconstitutional by someone who isn't an originalist.

01:54:39   If the founders never knew about encryption, this must be legal.

01:54:42   Marco's point, though, about talking about it on public clouds or whatever,

01:54:48   gets to this really good aspect of this whole discussion that was brought up by the person who

01:54:54   runs the Pinboard service and Twitter account. I forget this person's name.

01:54:57   Maciej Cegłowski? Yeah, there you go. I'm just going to read these three tweets because

01:55:02   basically it summarizes it better than I could. This is directly from the Pinboard tweets.

01:55:06   "The governance problem here is that we have six or seven giant companies that can make

01:55:09   unilateral decisions with enormous social impact and no way of influencing those decisions beyond

01:55:14   asking nicely for them to come talk to the affected parties before they act."

01:55:17   So this is the problem of like, "Oh, also if you don't trust Apple, maybe you should try Google.

01:55:23   Or if you don't trust Google, maybe you should try Microsoft. Or if you don't trust Microsoft,

01:55:26   I'm running out of places to get my phone real quick."

01:55:30   Didn't Stallman try to make a phone or something? Whatever.

01:55:32   Large tech companies at this point can do things with their policy. Let's say Apple

01:55:39   implemented a bunch of these policies for child safety and they were much worse. They were super

01:55:43   harmful and they did a much worse job of trying to balance the concerns, the chance of false

01:55:51   positives was really high, and it just looked like it was going to be a disaster.

01:55:55   You don't have a lot of recourse as a consumer because these companies get so big and so powerful

01:56:01   and they all tend to do similar things. See all the other companies that are doing the server

01:56:05   side scanning. That if you really don't like what they're doing because they're not government

01:56:10   entities, you can't vote them out, and your quote-unquote voting with your wallet has a limited

01:56:16   effect on them unless you can get millions and millions and millions and millions of other people

01:56:20   to do the same thing. In the end, people tend to need in the modern era cell phones to just live

01:56:27   their life and if there are only a few sources of those cell phones and all those sources agree

01:56:32   that they're all going to do a thing and you don't like it, the idea of "I just want to have a cell

01:56:36   phone" is very difficult to convince millions and millions of other people to also do to the degree

01:56:41   that it affects them. We've talked about this before. Why is it bad to have a small number

01:56:46   of giant companies that control important aspects of our life? In general, it's bad. So continuing

01:56:51   the pinboard tweets, "The way we find out about these technology impacts is by rolling them out

01:56:55   worldwide and then seeing what social political changes result." See also social networking,

01:56:59   Facebook, so on and so forth. Sorry, I'm adding some commentary. Hope you can see which parts are

01:57:04   mine. "It's certainly a bracing way to run experiments with no institutional review board

01:57:09   to bog everything down with pessimism and bureaucracy." So it's important to note like

01:57:13   yeah private companies can do things more efficiently in these regards and sometimes

01:57:18   it is better to not, like, this shouldn't be done through one governmental agency. Innovation is

01:57:22   the reason we have all these great things from Apple and Microsoft and Google and all that good

01:57:26   stuff right. So continuing from pinboard, "But the problem is there's no way to close the loop right

01:57:32   now. To make it so that Apple or Facebook or Google inflicts huge social harm, their bottom

01:57:36   line suffers or their execs go to jail or they lose all their customers. Profits accrue while

01:57:40   social impacts are externalized." So say you start a social network, you know, originally to try to

01:57:46   rate which girls in your school are hot or not and eventually you end up fomenting genocide halfway

01:57:51   across the earth. Does that affect your bottom line? Are you harmed by that? I guess it's a

01:57:57   press relations issue. We can probably smooth that over but when you get to the size of Facebook,

01:58:02   if you accidentally foment genocide, like the loop is not closed. Those are the externalized

01:58:10   harms but your stock doesn't suddenly drop in half and you don't get fired. Nobody goes to jail.

01:58:16   Maybe you get brought in front of congress and they yell at you a little bit while you say that

01:58:20   you can't remember or are just trying to do the right thing. But this is yet, like, you know,

01:58:26   I know we just got done talking about how we think Apple is mostly trying to do the right thing here.

01:58:29   It's important for technology companies to do something but let's not lose sight of the fact

01:58:33   that having gigantic incredibly powerful, a small number of gigantic incredibly powerful tech

01:58:39   companies is itself its own problem independent of the problem of trying to have a government.

01:58:43   Because as bad as the government system is, we have even less control collectively over what

01:58:50   these companies do. In some ways you may think we have more because, like, oh, the citizens can

01:58:55   make or break these companies. But practically speaking, especially in areas that have technical

01:59:00   nuance, it has proven very difficult for consumer sentiment to close the loop. To say, hey company,

01:59:08   if you do a bad thing, you will be punished in a way that makes you motivated to not do bad things

01:59:14   in the future. That loop tends to only work in terms of like products that explode in your hands

01:59:19   or like, you know, supporting the worst of the worst possible politicians with your donations.

01:59:25   But in general, if you do something and there's like a third order effect, again, if you make

01:59:29   Facebook and it accidentally foments genocide, most people are like, yeah, but that wasn't really

01:59:34   Facebook's fault. And the genocide people were going to do a genocide anyway. And Facebook is

01:59:38   trying to stop it. And like, like the loop is not closed there. Right. And so if there's something

01:59:44   that all these big phone companies are doing with their phones that you don't like,

01:59:48   it's not actually that easy to change that, especially if you don't like it,

01:59:54   but no one else cares. Like the people like you may be listening to this and saying,

01:59:58   I'm never going to buy an Apple phone. They're spying on me. And so is Google and so is Facebook

02:00:02   or whatever. But just try getting one of your friends who's not into the tech world to listen

02:00:05   this far into this podcast. In general, what we've seen from technology stuff like this is that

02:00:12   people just don't care. Like they just want their phone to work. They just want to do their things

02:00:16   as long as it doesn't bother them. As long as they're not falsely flagged for child sexual

02:00:20   abuse material, they mostly don't care. So trying to affect the policies of these companies by

02:00:27   rallying the people to refuse to buy Apple phones or Google phones or Microsoft phones or

02:00:33   Android phones of any maker is really, really difficult because, to paraphrase Singles,

02:00:38   people love their phones. We are sponsored this week by Burrow, the company setting a new

02:00:45   standard in furniture. Burrow has timeless American mid-century and contemporary Scandinavian styles

02:00:52   for their furniture. It's easy to move and comes in modular designs. You can get it upstairs.

02:00:56   You can mix and match different pieces and stuff. And it's all made from premium durable materials,

02:01:01   including responsibly forested hardwood, top grain Italian leather, reinforced metal hardware,

02:01:07   and everything you might expect out of high quality furniture. Their in-house design takes

02:01:12   a research driven approach to make sure their furniture fits your lifestyle. This translates

02:01:17   to things like a simple mounting guide for the index wall shelves they sell, a tool free assembly

02:01:22   process, and of course, a modern convenient shopping experience. They got rid of the far away

02:01:29   warehouse stores, the high pressure showrooms, and they replaced them with modern, easy to use

02:01:34   online shopping. Of course, that's what you want these days. You get to create and customize your

02:01:38   own furniture without leaving your house. And there's free shipping for all. Every order,

02:01:45   no matter how big or how small, is delivered directly to your door for free. This can save

02:01:50   easily up to $100 or more when it comes to big stuff like couches. And all this is backed with

02:01:55   Burrow's world-class service. You know, everyone needs a little help sometimes. The Burrow team

02:02:00   is always available to lend a hand from custom orders to delivery scheduling, whatever you might

02:02:05   need. So listeners can get $75 off your first order at burrow.com/atp. That's B-U-R-R-O-W

02:02:15   burrow.com/atp for $75 off your first order. burrow.com/atp. Thank you so much to Burrow

02:02:24   for sponsoring our show.

02:02:26   [Music]

02:02:29   Leilo Vargas writes, "Hello friends. What's your current thinking on Bitcoin and crypto in general?

02:02:33   I think I never heard you talking about nerd money. Do you hold any, without disclosing any

02:02:38   amounts, any project in particular that you like? Thanks friends." So a couple of things. First of

02:02:43   all, let me try to be brief, which we never successfully do on this show. My thoughts on

02:02:49   crypto are, you know, I think the heat death of the universe is coming fast enough without crypto.

02:02:54   Let's not accelerate it. But with that said, one of you, probably Jon, added two delightful links

02:03:00   to the show notes, which is en.wikipedia.org/wiki/Ponzi_scheme and /Pyramid_scheme, which made

02:03:09   me laugh more than I am comfortable admitting when I saw those in the show notes a day or two back.

02:03:13   So Jon, would you like to explain the relevance here? Yeah, Jon, do you hold any Bitcoin?

02:03:19   So we did actually talk about this on a past show. Not that long ago, too. I put these links in there

02:03:25   just because it's fun to, like, if you read the little summary on pyramid scheme, you would read

02:03:29   it and say, okay, technically Bitcoin isn't a pyramid scheme because pyramid scheme is a

02:03:34   business model that recruits members via a promise of payments or services for enrolling other members

02:03:38   into the scheme. It's like, that's not how Bitcoin works. You don't get Bitcoins for recruiting other

02:03:42   people into Bitcoin. So that's, it's not really a pyramid scheme. So let's look at Ponzi scheme. Is

02:03:46   that what it is? A Ponzi scheme is a form of fraud that lures investors and pays profits to earlier

02:03:50   investors with funds from more recent investors. It's like, well, that's not how Bitcoin works. When

02:03:53   new people invest in Bitcoin, their money doesn't go to the early investors, like,

02:03:57   like directly like it does in a Ponzi scheme. The reason I put these links in here, though,

02:04:01   is that although Bitcoin technically isn't exactly like the technical definition of a pyramid scheme

02:04:09   and technically isn't exactly like the definition of a Ponzi scheme, it operates very much like them

02:04:14   in that thus far, the only value inherent in Bitcoin is based on the speculation that the

02:04:21   price of everyone's Bitcoin will go up. And so getting more people to invest in Bitcoin and

02:04:27   therefore making Bitcoin look more desirable does in fact benefit the early investors and

02:04:32   quote unquote recruiting people into getting Bitcoin, just like in a pyramid scheme, does in

02:04:37   fact raise the value of the people who already have Bitcoin and were in earlier, right? Setting

02:04:43   that aside, we've talked about in the past the mathematical foundations of like, oh,

02:04:46   isn't it cool that you can have two people who don't trust each other have an exchange of money

02:04:50   without a central party mediating it, right? That technology is interesting. Unfortunately,

02:04:55   it uses a lot of energy and is really slow and doesn't have good concurrency and has all sorts

02:04:59   of other problems that have to do with it, which makes it not that interesting for many problems,

02:05:04   except for buying heroin. It's a great way for criminals that don't trust each other to

02:05:08   exchange money in a way that's not observable by governments. So there is a use case for Bitcoin.

02:05:13   It just happens to be a terrible one. If you are a criminal and don't want to use the banking

02:05:17   system because you're doing something criminal, Bitcoin is a great way to do that. So what Bitcoin

02:05:22   has enabled is a huge explosion in ransomware, because guess what? You can get paid for

02:05:26   ransomware anonymously through Bitcoin. It's way easier than trying to get money into,

02:05:31   because think of what you have to do with ransomware without Bitcoin. You have to get

02:05:34   someone to transfer money into like a numbered account in like Switzerland or something. It's

02:05:39   like way more complicated. Bitcoin is so much easier. So that's why there is a huge explosion

02:05:43   in ransomware. So what do I think about cryptocurrency that uses proof of work

02:05:47   or even the ones that don't like? Yeah, proof of stake is the new one.

02:05:51   Yeah, proof of stake is slightly better for the environment. But the bottom line is like

02:05:54   lots of bad uses are enabled. Most of the people who are into it, and the reason you see so much

02:05:59   evangelism is because the more people they can get to get into Bitcoin, the higher the value of

02:06:03   Bitcoin goes up and that helps them with their investment. And those are all the earmarks of

02:06:07   a pyramid scheme or a Ponzi scheme, even if it's not technically exactly the same thing. So are

02:06:12   people getting rich off of Bitcoin? Yeah, people get rich off of pyramid schemes and Ponzi schemes all

02:06:16   the time. That's why they exist, because they make people rich. But they're not a great thing to get

02:06:21   into. And the whole thing about Bitcoin is like, well, if you had said, you know, people thought

02:06:25   it was about to reach the tipping point five years ago, but if you had heeded that advice and not

02:06:29   invested, you wouldn't be rich like I am now. That's true. That's true of Ponzi schemes and

02:06:34   pyramid schemes too. But it doesn't make me excited to get into them because I am not planning on

02:06:40   ransomware-ing anything. I'm not trying to buy heroin. And I do not have confidence that were

02:06:46   I to put my life savings into some kind of cryptocurrency, that I would not be the last

02:06:51   person left holding the bag, but that I instead would be one of those early investors who gets rich off

02:06:55   of it. So if you have gotten rich off it, congratulations. Good job. But if you have not

02:07:03   invested in Bitcoin, I would suggest that it is not a particularly safe place to put your life

02:07:08   savings, given that no one really knows how and when this story will end. But most people are

02:07:14   pretty confident that it will end in some way. And when it does end, you don't want to be the one,

02:07:19   you know, left holding the bag. You don't want to be the one playing musical chairs who has no place

02:07:23   to sit down when all the other people cash out, the early people, if they haven't already,

02:07:26   and you're left with a bunch of Bitcoin that becomes not worth quite that much. And if you

02:07:30   were wondering if Bitcoin really has great utility and worth in the world, look at what people do

02:07:34   with it, which is they like to exchange it for what I would call real nerd money, which is actual

02:07:38   money that you can use to buy things. Wow. A couple of quick thoughts here. First of all, I think it

02:07:45   was ATP 424 from April where we discussed this. I put a link to that in the show notes. And

02:07:50   additionally, I do think as much as I snark on Bitcoin and crypto, I do think, and John, you

02:07:56   alluded to this earlier, the mathematics behind it, or the principle of the mathematics behind it,

02:08:01   I think are fascinating and very clever and very cool. And I've talked about this a couple of times,

02:08:07   but there's a really good video by Blue31Brown or something like that. I forget the name of this

02:08:12   person. 3Blue1Brown. I was close. It's like a 25 minute video or something like that, but it is

02:08:18   extremely well done and builds up from like, "Hey, how do you and a couple of roommates figure out

02:08:24   how to settle up bills if you don't trust each other?" And it basically ends up with Bitcoin. So

02:08:30   as a solution to a problem, I think it's very clever and very interesting as something that

02:08:36   is using incredible amounts of power and is extraordinarily inefficient by and large,

02:08:41   and is surely going to create a lot of email for us that we don't want. Not a fan.
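
The mechanism behind both the cleverness Casey admires and the power bill he doesn't is proof of work: appending to the shared ledger requires brute-forcing a hash, so writing history is expensive while checking it stays cheap. A toy sketch in Python (real Bitcoin hashes a binary block header with double SHA-256 and compares against a numeric target; the leading-zeros string version here is purely illustrative):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so the block's SHA-256 digest starts with
    `difficulty` leading hex zeros (a stand-in for Bitcoin's target)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # finding this took many hashes; checking takes one
        nonce += 1

nonce = mine("casey owes marco $5", difficulty=4)
check = hashlib.sha256(f"casey owes marco $5:{nonce}".encode()).hexdigest()
assert check.startswith("0000")
```

Each extra required zero multiplies the expected number of hashes by 16, which is the knob that makes block creation arbitrarily expensive, and why mining profitability reduces to hash rate versus electricity cost.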

02:08:46   This is an example of externalities, right? So the technology is there and it's like, "Oh,

02:08:51   and the externality is that we just essentially made it profitable to burn energy?" Because as long as

02:08:58   you make more in Bitcoin than you spend in electricity, it is a profitable endeavor. So the

02:09:03   unintended externality of these cool systems for having sort of zero trust, no middle party

02:09:11   relationship to be able to exchange things, the externality is, "Wait a second. I think you just

02:09:16   made it profitable to burn electricity." And they did. And people do. And it makes sense from a

02:09:21   financial perspective, but from an earth perspective of like, "So what value are you creating?"

02:09:28   "Well, it's kind of like a pyramid scheme and some people get rich." "Okay. And then the cost is?"

02:09:32   "What? How much CO2 emissions?" "Oh, we only use solar power or spare energy." "I'm not quite sure

02:09:37   about that." The real question for all that is like, "Okay, look, if Bitcoin didn't exist,

02:09:42   would that coal have been burnt? Where would that energy go?" Obviously, the silly ones are like,

02:09:49   "There was a shutdown power plant and Bitcoin miners bought the power plant and turned it on,

02:09:53   and all it does is run Bitcoin stuff all day. And by the way, it's making everyone's gaming cards

02:09:57   more expensive. Can't we at least agree on that?" Well, even nerds should be able to say,

02:10:02   "It's not good that we can't get GPUs to play games." Games produce value in the form of

02:10:07   happiness in the world, right? And people get paid to make games. Like it's an actual economy.

02:10:13   Bitcoin, for the most part, does not produce any value except for speculative investors making

02:10:17   money at the expense of later investors, and maybe some cool technical papers that help someone get

02:10:23   their PhD. Well, that sounds like a Ponzi scheme. Yeah, and I think like, to me, like, multiple

02:10:28   parts of this are offensive to me. Like, first of all, I am in agreement with Casey, and I think,

02:11:35   John, that, you know, the technological concepts of shared work like this, the idea of blockchain

02:10:43   verification of transactions, that's a really cool set of technologies and approaches, and it's very

02:10:49   clever and it's fascinating. But I think what that has enabled, if you look at the total, like the

02:10:55   net good and bad that have been enabled by cryptocurrency, I think the bad dramatically

02:11:00   outweighs the good. It's not even close. The killer app of Bitcoin is literally ransomware,

02:11:06   right? Exactly. And possibly also drugs, but I mostly just hear about the ransomware in the

02:11:10   circles I travel in. Like, the net net is not good for the world. For individuals, it might be great.

02:11:15   Some people who got rich, it's great for them, but for the world, it's super negative at this

02:11:20   point. It's not even close. Yeah, and then my second major problem with Bitcoin is the people.

02:11:26   Now, I know we're going to hear from a few of them, and I'm going to tell you right now,

02:11:31   I don't like you. Because here's, and if you write to me and say bad stuff, I won't care.

02:11:37   I don't like you because what you most likely are if you're into Bitcoin. So A, you are very likely

02:11:44   to be a person who is willing to make the world a slightly worse place, whether it's through carbon

02:11:50   emissions or through participating in a system that enables a lot of illegal and damaging activity,

02:11:55   whatever it is, like you're willing to make the world a little bit worse place to make a buck.

02:11:59   And that tends to attract not the best people to that area. Now, when you combine that factor with,

02:12:07   okay, I know I am a privileged white man in tech, but can I use the word tech bros?

02:12:13   I think so. I think so. You're old enough now that I think you're allowed,

02:12:18   now you're an old white man in tech, so you can say tech bros.

02:12:20   And I don't think I ever was a tech bro necessarily. I was near...

02:12:24   You totally, you 100% were. No, no, I was near that area, but I don't think I was ever,

02:12:31   I wasn't like, you know, one of those people who would like go on stage at TechCrunch Disrupt and

02:12:36   pitch my startup that's going to change the world. That was never me.

02:12:39   Yeah, but you were in a startup that changed the world, so...

02:12:41   Not, no, I was in a startup. Yeah, I don't know if it changed the world. Anyway,

02:12:44   I wouldn't say, I would never... Change the world of porn, okay?

02:12:47   I would never claim that. And that actually mostly happened after I was gone for the record.

02:12:50   Anyway, so, anyway, so I, the world of tech bros is a world that I don't usually get along with

02:13:01   very well. It's all those people who the Silicon Valley TV show makes fun of and they don't think

02:13:07   it's funny. Like it's like, it's that crowd, right? So what Bitcoin and cryptocurrency in general,

02:13:13   the kind of people that attracts, it's the combination of tech bros, which are largely a

02:13:21   pretty terrible group of people in, you know, some relatively minor ways, but mostly it's still

02:13:26   terrible people. The intersection of tech bros with a far worse group of people, finance bros.

02:13:33   Oh, and living in the New York Metro area, I see a lot of these people. Oh my God, they're the worst.

02:13:41   So when you combine tech bros with finance bros... And libertarians.

02:13:45   Yeah, and libertarians. That's the whole other thing. When you combine these groups of people,

02:13:50   and especially the prospect of making money pulls in the finance bros and converts some

02:13:55   of the finance bros into wannabe tech bros. And so the intersection of this produces just the

02:14:01   worst group of people ever. Like you do not... And oh, and of course all the, you know, profiteering

02:14:06   people who will burn carbon to make a little bit of money. So like, this is the collection of these

02:14:11   people is just the worst people. And so cryptocurrency as a thing, while I think it's an

02:14:16   interesting concept, the realities of the kinds of people who are mostly into it and the kinds of

02:14:21   people it attracts and the kind of usage it attracts are so terrible. And both from like an

02:14:27   annoying point of view and from, you know, from a world damage point of view. So it's just, it's a

02:14:33   terrible thing that it has actually produced. And it's, these are all the worst people that have

02:14:38   invaded our industry and taken over all the GPUs and everything. It's like, they're making the tech

02:14:44   industry worse. And they're like, you know, they're a burden on us. They're a burden on the world.

02:14:50   Like I just, I don't see any benefit to it. So to answer the question, I don't hold any

02:14:56   cryptocurrency. And I'm not a big fan. I'm not as harsh as Marco in that like, when like,

02:15:03   I think I talked about this before when we first talked about Bitcoin, like when a new technology

02:15:06   comes out, it's natural for nerds to be curious in it. So if you like got a bunch of Bitcoin,

02:15:11   especially because you thought it was a cool technical thing or whatever and play with it.

02:15:14   Hey, especially if you made a bunch of money off of it because you mined Bitcoin back when

02:15:18   it was easy and they became worth a lot of money, great, more power to you. Like, especially in the

02:15:23   beginning, it wasn't clear how this was going to turn out. It's like any new technology. And

02:15:26   as tech enthusiasts, we're interested in new technologies, right? I mean, when Bitcoin first

02:15:30   came out, I downloaded the software and tried like mining for it. I never actually got any Bitcoin,

02:15:34   so I don't have any, I have never owned any, but it's a technical curiosity. And so if you

02:15:39   became rich off Bitcoin, I say more power to you. You found a way, hopefully you use that money for

02:15:45   something good. You use it to have a happy life and to support your community and your family.

02:15:50   Like kudos, right? But what Marco is talking about is like at this point, as of today, 2021,

02:15:55   correct? The footprint of cryptocurrency and understanding what it is, what it's good for,

02:16:00   what it's not good for, and what you have to do to make money off of it, is much more clear now

02:16:05   than it was. And so I would say if you have Bitcoin, I'd be looking to make the most

02:16:11   advantageous exit possible. And I would say that if you're super

02:16:17   enthusiastic about the sort of utopian possibilities of cryptocurrency, try to come

02:16:22   up with one that's better than the ones that we have now, which to the credit of a lot of people

02:16:25   involved in this, they do. That's why proof of stake exists instead of proof of work, right?

02:16:30   People are trying to improve it, but Bitcoin gets all the press because it's like the one that sort

02:16:34   of broke through is the most popular. It has a lot of mystique around it. And when a lot of people

02:16:40   say cryptocurrency, what they really mean is Bitcoin and Bitcoin has a lot of bad externalities.

02:16:46   And I would not suggest anyone get into it. If you got rich off it, great. But at this point, it's

02:16:51   not great. If you're trying to improve it or do something better, that's good.

02:16:55   But at this point, like at this point, like these, you know, I'm going to make a cryptocurrency and

02:16:59   I'm going to convince a celebrity to endorse it because they don't understand the tech,

02:17:02   but I'll just tell them that it'll make the money and actually will because the celebrity will give

02:17:06   it publicity. And then the early people who have most of the coins will make money. And like,

02:17:10   it's just another way to scam people out of money, to scam investors out of money. It's

02:17:14   a tale as old as time. This is not new: what you see happening with Bitcoin always happens with

02:17:18   financial instruments. Like look at the various financial crashes caused by those tech bros

02:17:21   that Marco doesn't like, right? It's just that now there's a different angle on it. And that is just

02:17:26   generally distasteful and bad. I will say though, I do have some cryptocurrency. Some in the early

02:17:31   days of crypto, some cryptocurrency company was giving out free cryptocurrency for signing up to

02:17:35   their website. And I did that and I got free cryptocurrency, which I still have. And it just

02:17:42   sits there as a number and it's not a very big number. But I'd never do anything with it or look

02:17:46   at it because it's not worth enough money for me to cash out, right? And if it is someday worth

02:17:54   enough money to cash out, I'll cash out and be like one of those people who says, oh, great,

02:17:57   you got rich off cryptocurrency, but it's probably never going to be worth any money. So I just

02:18:02   ignore it. But I do have some of it. I did actually have to like declare it on my taxes as an asset or

02:18:07   whatever, because it's above, like, whatever the $200 limit or something, and we had to have our

02:18:11   accountant go through all this or whatever. So it's an official thing that I own and I look at

02:18:16   it and if it ever becomes worth millions of dollars, you can bet your butt I'm going to

02:18:18   cash out of it and take that millions of dollars. But I got it for free and it's not a thing that I

02:18:23   use as an investment instrument. I do not use it to do any transactions. I don't do anything having

02:18:28   anything to do with crypto. Richie Haroonian writes, I know the clever trick to limit iCloud

02:18:34   photo library disk usage by creating a separate APFS volume or disk image. Recently I noticed

02:18:38   that messages was using almost 100 gigs on my 256 gig SSD. That is not desirable. I did a bit of

02:18:45   research, but couldn't find a similar trick to limit messages disk usage. I think it's a little

02:18:48   more complicated since message attachments are somewhere under the library folder. Any insight

02:18:52   here? I don't have any insight, although I thought the messages, like the actual text of messages was

02:18:58   in like a series of SQLite databases if I remember right. And I think that Richie is right that the

02:19:03   attachments are stored somewhere semi-predictable, but no, I have no useful insight here.

02:19:07   You could probably find the folder that they are being stored in deep within library, whatever.

02:19:14   You could probably use a symlink trick to symlink that into a disk image that is limited or an APFS

02:19:21   volume, however you want to do it. So that's the first thing I would try. But also I would also ask

02:19:28   does Richie not use iCloud for message attachments? Because message attachments could be very big. It

02:19:34   is in some ways a photo library. Actually I don't think we even heard whether those are being scanned

02:19:39   for CSAM. Interesting. I'm kind of surprised we didn't hear that, actually.

02:19:45   But if you store your messages attachments in iCloud, I bet it offloads them pretty soon when

02:19:53   you're low on space and so it probably doesn't keep that big of a cache. Because I use iCloud

02:19:57   for iMessage attachments. As I scroll up through messages, it has to page them in and load them off

02:20:01   the network after a while because it's not keeping all the attachments locally. Whereas

02:20:06   before iCloud photo library, I remember that was always a big chunk of my iPhone storage space.

02:20:11   You'd see messages and it would be like 5 gigs, 10 gigs, whatever on your phone because it was

02:20:16   all those attachments historically over time. So I would suggest if Richie does not use iCloud

02:20:22   photo storage or iCloud storage to store message attachments, I would suggest trying that or

02:20:27   considering that if this is going to be a big problem. Like if you can't just get rid of these

02:20:31   attachments because they actually are like the only copies of them. But if this is just some

02:20:38   disk quota not being enforced well and they are being paged off as they get older, I would attempt

02:20:43   some kind of symlink trick into a disk image. I think it's pretty brave to try the symlink trick.

02:20:47   I generally don't want to mess with, especially now with like the containerization and the various

02:20:52   container folders, don't want to mess with the internal structures of these iCloud powered apps

02:20:57   just because, you know, it's not as straightforward as it used to be. The library folder used to be

02:21:02   much more tractable but now with the advent of the container system with iCloud stuff it's a little

02:21:07   bit sketchier. The problem with messages is the problem with a lot of Apple apps in that it's

02:21:12   quote-unquote "supposed" to manage your disk space in an intelligent manner by purging things that

02:21:17   are safely ensconced in the server, which to Marco's point means that you basically should enable

02:21:21   the iCloud messages sync thing, which means the government will be able to look at all your

02:21:25   messages in your iCloud backups too, which is nice. But yeah, that's the consequences of that. And

02:21:30   Apple's solution to this in recent years has been one, the iCloud messages thing, which helps solve

02:21:36   this problem if it operates correctly. If it doesn't there's nothing you can really do except

02:21:39   cross your fingers and hope that crap gets purged. But two, they added with some surprising amount of

02:21:45   fanfare a couple years ago the ability to tell messages to trim off attachments older than some

02:21:52   date, right? Because this was a big problem on a lot of people's phones. They were filling their

02:21:55   phones with message attachments. Eventually you just fill it, right? So your choices are either

02:22:00   get that stuff into the cloud so you can purge it from your phone and not lose it or delete it from

02:22:05   your phone. And Apple did both. They came up with, you know, messages in the cloud feature, that's

02:22:09   the cloud version, and they also came up with features in the messages app that will let you

02:22:12   delete that crap. They're doing that in reminders now too. It used to be that reminders would just

02:22:18   pile up forever, which reminders are obviously tinier, they're not like photos, but eventually

02:22:22   after, you know, 10, 15 years of the iPhone people have a lot of reminders too. So the features to

02:22:27   delete them will clean it up. Ironically, the thing that deletes your data, like oh delete all

02:22:31   attachments older than a year, that will probably actually clean your space up as soon as you

02:22:35   activate it. Whereas the iMessage in the cloud thing, you activate it and then you just wait,

02:22:40   I guess, and hope that something eventually purges crap from your phone. But yeah,

02:22:46   the solutions are not great, but those, I think those are the solutions for you.
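
For anyone tempted by the symlink route anyway, the shape of it mirrors the iCloud Photos trick from the question: a sparse APFS disk image whose size is capped, the attachments folder moved onto it, and a symlink left at the old path. Here is a Python sketch that only assembles the commands for inspection; the `~/Library/Messages/Attachments` location is the usual one, but the image name, volume name, and 20 GB cap are assumptions, and, per John's warning, quit Messages and have a backup before trying anything like this:

```python
# Sketch of the symlink-into-a-capped-disk-image trick for Messages
# attachments. Paths, volume name, and cap are assumptions.
import subprocess
from pathlib import Path

ATTACHMENTS = Path.home() / "Library/Messages/Attachments"
IMAGE = Path.home() / "MessagesCache.sparseimage"
MOUNT = Path("/Volumes/MessagesCache")

def build_commands(cap_gb: int = 20) -> list[list[str]]:
    """Return the steps without running them, so they can be inspected."""
    return [
        # 1. A sparse APFS image only consumes space as it fills, but can
        #    never grow past cap_gb: that's the quota.
        ["hdiutil", "create", "-size", f"{cap_gb}g", "-fs", "APFS",
         "-type", "SPARSE", "-volname", "MessagesCache", str(IMAGE)],
        # 2. Mount the image.
        ["hdiutil", "attach", str(IMAGE)],
        # 3. Move the existing attachments onto the capped volume.
        ["mv", str(ATTACHMENTS), str(MOUNT / "Attachments")],
        # 4. Leave a symlink at the old path so Messages still finds them.
        ["ln", "-s", str(MOUNT / "Attachments"), str(ATTACHMENTS)],
    ]

def run_all() -> None:
    """Actually execute the steps. Quit Messages first."""
    for cmd in build_commands():
        subprocess.run(cmd, check=True)
```

The other caveat is that the image has to be mounted before Messages launches, and any macOS update that tightens containerization could stop honoring the symlink.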

02:22:50   And finally, Andrew Nelson writes, "What camera or lens should I rent for a Disney World trip?

02:22:56   I want good at low light, great and fast autofocus, some water resistance in case of

02:23:01   unexpected rain, better sharpness and bokeh than the iPhone 11 Pro, easy to use, and good battery.

02:23:08   Andrew does not care about long zoom, RAW or touching up pictures." I will answer this because

02:23:13   I can be very quick. What you want is your iPhone 11 Pro because having just gone to Disney World

02:23:19   a couple of years ago, we went in late 2019, I did bring my big camera and on a couple of occasions

02:23:26   it was very useful. But by and large, and maybe it's just because my big camera, which is an

02:23:31   Olympus OM-D E-M10 Mark III, I think I have that right. Maybe it's just my particular big camera

02:23:38   and that's the crux of the issue, so maybe I'm being unfair. But in my opinion, the iPhone,

02:23:44   particularly with HDR, which I'm waiting for John to jump in and tell me that his big camera does

02:23:50   all these things, but the HDR on the iPhone is really impressive, particularly for outdoor shots

02:23:54   where you're trying to get a decent sky that's not blown out to smithereens as well as your subject

02:23:59   matter. Plus the low light on my iPhone is actually quite a bit better than it is on my Olympus here,

02:24:04   which is where John is going to say, "Oh, not so fast." But in so many ways, it was just

02:24:09   a pain in the butt to carry anything bigger than an iPhone onto rides or anywhere else.

02:24:14   So even though I did have my big camera with me pretty much always, I should have actually done

02:24:20   and looked and seen how many pictures I took with each, but my gut tells me 80% of the pictures I

02:24:26   took on the most recent Disney World trip I had were with my iPhone. And in fact, it was either

02:24:31   10 or 11, whatever was current at the time in late 2019. And almost probably 20% at most were taken

02:24:38   with the big camera. And I think even that is optimistic. I think it was probably like 90/10.

02:24:42   I'll come back to John since you have more Disney experience than Marco. Marco,

02:24:47   do you have any thoughts on this real quick? - Yeah, it was funny. So because Andrew wanted,

02:24:53   you know, you look at the list of wants, it's everything iPhones are great at. Low light,

02:24:57   fast, good autofocus, water resistance, ease of use, battery, and then Andrew says, "Don't care,

02:25:05   long zoom, raw, and touching up pics." So initially I'm like, "Well, okay, just iPhone really." But

02:25:13   unfortunately in the middle of the want list, Andrew says, "Better sharpness and bokeh than

02:25:18   iPhone 11 Pro." Okay, so first, I mean, the smart ass answer is go rent or buy an iPhone 12 Pro Max,

02:25:26   which is honestly probably the best answer if you actually just don't want to use your iPhone.

02:25:33   So I have a couple of alternatives here. So if Andrew insists, you know, because see,

02:25:40   not caring about long zoom and raw, that to me really says like, "All right, you want pictures

02:25:46   that are just great right out of the camera without a lot of effort, you want an iPhone." But if you

02:25:52   actually want significantly more resolution and better, like, actual optical background blur,

02:25:58   then, you know, better than what an iPhone can do with its weird simulated background thing that

02:26:00   blurs your ears off, here are some good options. So from cheapest to most expensive. And also

02:26:07   Andrew says rent, which is good. So from cheapest to most expensive. The cheapest option is still

02:26:13   just use your iPhone, but get a nice lens that you can clip onto it. A decent telephoto lens that

02:26:20   gives you like a, you know, two to four x kind of zoom range. I don't really know what's out there

02:26:26   in this area, but that will give you better background blur, because that's the principle

02:26:32   of how those optics work. You get really good background blur if you use a very like, you know,

02:26:37   the longest telephoto lens you can get, and you get as close to the subject as possible, then you

02:26:42   will get really good background blur. And there's other factors, of course, but that's what's going

02:26:46   to be relevant here. And that's, you know, those lenses are like, you know, 30 to $50 for the

02:26:50   various clip on things. I know that the Moment case and lens assembly together is a little more

02:26:58   expensive, but tends to be pretty quality. I even have like, we have Tiff wanted a macro lens to

02:27:04   photograph our butterfly caterpillars that we were raising here. Don't worry about it. And

02:27:11   you know, we tried different options. And I just went on Amazon and just found one that was well

02:27:15   reviewed. And it was like 30 bucks. And it's like a clip on thing. So you just clip it onto the

02:27:19   phone, you align it on top of the main camera of the cluster. And it just works. And that was great.
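
Marco's rule of thumb, the longest telephoto you can get focused as close to the subject as you can manage, falls straight out of thin-lens optics: for a background at infinity, the blur disc on the sensor is roughly b = f^2 / (N * (s - f)) for focal length f, f-number N, and subject distance s. A quick sketch with illustrative numbers (the specific lenses here are assumptions, not measurements of any particular phone or clip-on):

```python
def blur_disc_mm(focal_mm: float, f_number: float, subject_m: float) -> float:
    """Diameter (mm, on the sensor) of the blur disc cast by a background
    at infinity, thin-lens approximation: b = f^2 / (N * (s - f))."""
    s_mm = subject_m * 1000.0
    return focal_mm ** 2 / (f_number * (s_mm - focal_mm))

# A phone-sized wide lens vs. a real telephoto, both focused 2 m away:
phone = blur_disc_mm(focal_mm=6, f_number=1.8, subject_m=2.0)
tele = blur_disc_mm(focal_mm=85, f_number=2.0, subject_m=2.0)
# Longer focal length wins by orders of magnitude, which is why a real
# telephoto blurs backgrounds the iPhone has to fake computationally.
```

This ignores sensor size and print scaling, but the direction is the point: blur grows with the square of the focal length and shrinks with subject distance, so the telephoto beats the phone-sized wide lens by orders of magnitude.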

02:27:25   And it was inexpensive. So 30 bucks, you get something like that. But get like a telephoto

02:27:29   lens and that that'll give you what you want. Otherwise, use your iPhone. Now, the next most

02:27:34   expensive option is to actually, you know, do what Andrew asked for, and actually rent a camera lens,

02:27:40   I would say because zoom is not one of Andrew's priorities, I would say get a fixed lens compact

02:27:48   camera. And again, rentals make this easier. Now the water resistance thing makes some of this

02:27:56   a little trickier. So I will say, rent a camera that is not water resistant, hope it doesn't rain,

02:28:02   and get the insurance plan. Because, you know, LensRentals, and pretty much anywhere else

02:28:08   you can rent a camera, will have

02:28:13   some kind of, you know, somewhat pricey insurance plan you could add on that will cover all risk. So

02:28:19   you can drop it in the ocean and you won't be responsible for all of it or some of it or

02:28:23   whatever. So I would say rent whatever you want, and get the insurance and then water resistance

02:28:29   is kind of checked off the list. Okay, so as for what you want, what I would suggest having never

02:28:34   used either of these cameras is at the low end, the Fuji X100F, because it's a fixed lens camera,

02:28:41   Fuji I found is very, very good at getting really good pictures right out of the camera with no

02:28:48   post processing whatsoever. They have really good JPEG rendering, usually really good color rendering,

02:28:54   it's just it's very, very good for low effort, good shots. And while I've never owned this

02:29:01   particular camera, Fuji cameras tend to have very good reviews for things like basic usability,

02:29:05   ergonomics, stuff like that. It's also reasonably compact. And yet it is going to give you a

02:29:11   significantly better optical setup than you can get from an iPhone for things like total resolution

02:29:18   and background blur ability optically. And then the high end option, you know, that's,

02:29:23   LensRentals has that for about $83 a day or a week. So the high end option for about three times

02:29:29   that is the Leica Q2. I have never owned a Leica camera. I have rented Leica cameras before. I have

02:29:38   briefly used a Q1; I have not used a Q2. But the Q series of Leica cameras is delightful to use.

02:29:46   They're extraordinarily expensive to buy. But if you're going to be renting one for a short trip,

02:29:52   it's you know, 250 bucks plus whatever they want for the for the insurance. So you're probably

02:29:57   looking at, you know, $300, $350. So again, not cheap. And you're really getting close to, you know,

02:30:03   just the buy-an-iPhone-12-Pro-Max territory. But what you get with the Leica cameras in

02:30:10   my experience is, again, really good JPEGs right out of the box with not a lot of messing

02:30:17   around. You do have amazing optics, amazing resolution, you have, you know, great ability

02:30:22   to have good blur, even at even with this relatively wide angle lens. And they're just

02:30:28   fun to use. They're very fast and responsive. And that's something that's really hard to find in

02:30:33   full frame cameras. But here it is: the Leica Q2 has that. So anyway, that's my

02:30:38   recommendation. But again, I would go with Casey and suggest just getting like maybe a fun little

02:30:46   clip on lens for your iPhone and a battery pack might be the better approach. For what it's worth.

02:30:52   At lensrentals.com, which both Marco and I have used in the past, and although they've never

02:30:56   sponsored, I definitely recommend them. They're excellent. The LensCap Plus coverage,

02:31:01   which is the most expensive, I don't know exactly what it covers. For the Leica Q2, it's $60. So that

02:31:06   brings the rental price from $257 for a week to $317 per week, which is not cheap. And like you

02:31:15   said, we're talking about, at this point, you know, why not just buy yourself a new iPhone. But

02:31:19   I do understand what you're saying. And I do like the idea of what you're saying there, Marco,

02:31:23   but I stand by the iPhone as the way to go. John, what do you think?

02:31:27   So this list of criteria is a little bit odd, because it doesn't have any kind of weighting.

02:31:32   So both of you said like, Oh, the iPhone is good at low light. That's true, as long as your subject

02:31:37   is not moving. The way the iPhone gets good at low light is by taking 100 pictures and combining

02:31:43   together into one picture. If you're trying to take a picture of a kid running through some dimly

02:31:48   lit ride, you're going to get nothing because the sensor on the iPhone is tiny, it does not gather

02:31:52   a lot of light; computational photography is doing all the heavy lifting on low light. Now that said,

02:31:57   maybe you think your subject won't be moving and all you want to take is pictures of people

02:32:00   standing and smiling in front of things, then the iPhone is good at low light again, congratulations,

02:32:04   right. But good at low light, like, it's listed first, but if that is your

02:32:09   number one priority, to actually get a camera that is good at low light, you need a much bigger

02:32:16   sensor. And because this list isn't prioritized, it's like, okay, but how good at low light? Like,

02:32:22   do you need a full frame camera? Do you want medium format? And when I get into stuff like this,

02:32:26   there's one thing that wasn't listed, which is like, reasonable size is listed nowhere. So,

02:32:33   like, it really opens the door to like, do you want to carry a gigantic 50 pound camera? I don't think

02:32:38   you do, but you didn't list it on your wants. So it's hard for me to say what I should recommend,

02:32:43   because you're kind of saying like, like, what this list says to me is, I don't actually mind

02:32:48   if it's kind of a big camera, like a size and portability and convenience like that wasn't

02:32:53   listed, right? The next item, great fast autofocus. This is where I start to get into the cameras I

02:32:58   have the most experience with. Sony has one of the best if not generally agreed upon to be the best,

02:33:03   great fast autofocuses in the entire industry across almost their entire camera line. It's

02:33:09   really, really good about finding the thing you want to focus on and latching on to it really,

02:33:14   really quickly and not letting go. That's like the major selling point of the software side of

02:33:20   the Sony cameras. It's really, really good, right? Lots of Sony cameras are water resistant,

02:33:25   and then better sharpness and bokeh than iPhone 11 Pro. Yes, iPhone 12 is the snarky answer,

02:33:30   but like, really, it's not that much better. That makes me think you want a real camera,

02:33:35   because if you want actual optical depth of field, you need actual optics, which means you need an

02:33:40   actual camera. And so given that, given that you didn't say like that, like, it's not super

02:33:47   important to have the smallest, lightest thing. I'm setting aside the cameras that Marco

02:33:51   recommended, which are just, like, the little compact all-in-one non-interchangeable lens cameras,

02:33:55   because that wasn't listed in the criteria. So why would you pick that camera unless compactness is

02:33:59   one of your priorities, which leads me to considering the Sonys that I have the most

02:34:04   experience with. And especially if you're willing to rent, I would say that makes it even easier.

02:34:08   Now you say long zoom is not important, and I know from experience of taking pictures at Disney

02:34:13   World, long zoom is not important. But having a prime lens can be limiting, because you won't know

02:34:21   what focal length to pick. Maybe you want a big picture of, like, the big ride, like, oh,

02:34:26   here's Space Mountain or the Matterhorn or whatever. And then another situation, maybe you

02:34:30   want a picture of just your kid, right? You probably need some kind of zoom range to say,

02:34:36   this is a wide shot versus this is a tighter shot, right? So it's going to be really difficult to

02:34:41   pick a single focal length. So what I'm saying is, get an interchangeable lens camera with a pretty

02:34:48   big sensor and a decent lens that has a reasonable zoom range, not a long zoom, it's not going to zoom

02:34:54   in probably any farther than you know, an average camera, but you really want that range. Maybe

02:34:58   you'll even find yourself in a cramped situation where you want to take a picture of your family,

02:35:01   and they're all in front of you, and you're like a foot away, and you want to get the whole family

02:35:04   in. Now you need a wider angle. Right? So kind of like the range that the iPhone does is a reasonable

02:35:10   range. But I think the iPhone falls a little bit short for getting a picture of your kid on the

02:35:15   Dumbo ride, because they might be far away from you, like the little barriers of where you have

02:35:18   to get the picture from, right? So you need some kind of zoom range. So my main recommendation,

02:35:23   and you know, this is based on my experience, which is admittedly limited, but it's also

02:35:29   the reason I bought this camera: if you're going to rent, get the Sony

02:35:33   a6600, which like there are better, cheaper options if you're going to buy, but if you're

02:35:37   going to rent, it's probably not that much more expensive to rent the 6600 than the 6500 or 6400 or

02:35:42   6100. So get the 6600. It comes with the amazing fast autofocus system. It is weather resistant

02:35:48   slash water resistant, right? So it actually is kind of weather sealed. And so is the

02:35:53   lens I'm going to recommend you get for it and get the Tamron 17 to 70 lens, which has a great zoom

02:35:58   range, is an amazing lens, and is weather sealed. And it's not that big, but the sensor is way bigger

02:36:04   than an iPhone. It has way better low light performance than the iPhone with any subject

02:36:09   that moves in any way, including your hand shaking, right? Because the sensor is so much bigger. And

02:36:14   the step up from that I would say is the a7c, which is a full frame sensor, same exact size body,

02:36:19   same great autofocus system, same weather resistance. And you can get the same exact

02:36:23   Tamron 17 to 70. Actually, no, it's not full frame. You can get the full frame equivalent of that lens

02:36:27   from a different manufacturer and use that on the full frame 7C. But the camera that is literally

02:36:35   sitting on my desk here right now, the a6600 with the Tamron 17 to 70 will absolutely cover all of

02:36:40   your actual photography needs. And it will take way better pictures than any of the cameras

02:36:45   recommended so far at a similar price. Thanks to our sponsors this week, ExpressVPN, Memberful,

02:36:51   and Burrow. And thanks to our members who support us directly. You can join at atp.fm/join. Thanks,

02:36:58   everybody. We'll talk to you next week. Now the show is over. They didn't even mean to begin

02:37:08   because it was accidental. It was accidental. John didn't do any research. Marco and Casey

02:37:17   wouldn't let him because it was accidental. It was accidental. And you can find the show notes at

02:37:26   atp.fm. And if you're into Twitter, you can follow them at C-A-S-E-Y-L-I-S-S. So that's Casey Liss

02:37:39   M-A-R-C-O-A-R-M-E-N-T, Marco Arment, S-I-R-A-C-U-S-A, Syracusa. It's accidental. They didn't mean to.

02:37:56   Accidental. Accidental. Tech podcast. So long. Tamron. Is that what you said? Yeah.

02:38:08   I've not heard of that. 17 to 70. All right. I'm trying to get some. Yeah. Sigma and Tamron are

02:38:14   like the two big, like kind of third party lens makers for most of the SLRs and so. Gotcha. Yeah.

02:38:19   The 17 to 70 is like, I should have recommended, I mean, I'm still on the show, so it's fine.

02:38:24   The Sony 16 to 55 is actually a better lens, but it costs twice as much. But if you're again,

02:38:30   if you're renting, maybe that doesn't make a difference. So consider that as well. I mentioned

02:38:34   the Tamron just because it has a slightly bigger range and it's cheaper. And if that factors in

02:38:38   at all to the rental, then do that. But the Sony 16 to 55 is actually slightly better,

02:38:45   tiny bit more compact. And if you're renting, it's probably like only five bucks more or something.
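An aside on the "full frame equivalent" John mentioned earlier for the APS-C Tamron 17 to 70: the conversion is just the crop factor. A minimal sketch, assuming Sony's 1.5x APS-C crop; the numbers are illustrative, not a recommendation:

```python
# "Full frame equivalent" of an APS-C lens: multiply the focal length
# (field of view) and the f-number (depth-of-field look) by the crop
# factor. 1.5 is Sony's APS-C crop; all values here are illustrative.

CROP_FACTOR = 1.5

def full_frame_equivalent(focal_mm, f_number):
    """(focal length, f-number) a full-frame lens needs to match the
    field of view and depth of field of this APS-C lens."""
    return focal_mm * CROP_FACTOR, f_number * CROP_FACTOR

wide_end = full_frame_equivalent(17, 2.8)  # ~ (25.5, 4.2)
tele_end = full_frame_equivalent(70, 2.8)  # ~ (105.0, 4.2)
```

So the 17 to 70 on the a6600 frames roughly like a 25.5-105mm f/4.2 would on the full-frame a7c, which is why the a7c needs a different lens to cover the same look.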

02:38:50   See, I don't know how you would want to lug around a full frame like Sony interchangeable setup.

02:38:54   Well, they didn't, they didn't list compact size. Like they didn't say it has to be small enough

02:38:59   to fit in my thing or whatever. And having lugged around a camera of this exact size on an extended

02:39:04   Disney vacation, I can say it wasn't that bad. Like, like these are compact cameras. They're

02:39:09   small. Like the, the a7c is the same size body. They're small for interchangeable lens

02:39:13   cameras, but they're not small compared to an iPhone. But I don't think they're that bad

02:39:17   to lug around, even in the million degree heat, even with like a backpack on and everything. I did it,

02:39:22   I was fine. I survived. And so if you're not going to list compact size, then you're going to get

02:39:26   recommended larger cameras. It's not like I'm recommending a gigantic, you know, Canon SLR

02:39:31   full frame that, like, weighs seven times as much. Right? Yeah, that's fair. But I don't know.

02:39:37   I mean, this is why like, I ultimately like, I think Casey's experience of just mostly using

02:39:41   the iPhone is worth heeding. Like it's so like the iPhone is so good as a vacation camera for

02:39:48   most people's priorities. Like John, John, you are, I think much more willing to lug around a

02:39:55   camera than most people are. But, but Andrew specifically is trying to say not an iPhone with

02:40:01   this criteria. They want better sharpness, better bokeh. Like they want actual optical depth of

02:40:06   field and they know they're not going to get that with an iPhone. So they're set and they're talking

02:40:08   about renting, right? So they're obviously saying they might as well have just said, don't recommend

02:40:13   me an iPhone because they know what the iPhone is. It's a known quantity; it has its qualities. And even

02:40:17   though for most people, it probably does everything you need it to do. Andrew is specifically asking,

02:40:20   I want better pictures than I would get with an iPhone. You want real optical depth of field,

02:40:25   you get a real camera. That's what you'll get. I totally get you. And I'm glad both of you made

02:40:29   those recommendations, but ultimately I feel like it is worth hearing someone say it might be worth

02:40:35   just saving your money and sticking with the thing that's most convenient. Well, what I said last time

02:40:39   is look, they're going to have their iPhone with them anyway. So if there is a situation in which

02:40:43   you don't want to have the big camera or you think the iPhone would take a better picture, just use

02:40:47   the iPhone. Like I didn't not take iPhone pictures on my Disney vacation. Of course I had my iPhone

02:40:51   with me. I had both, right? You're going to have your phone with you anyway. Like, it's not like

02:40:55   you're going to say, I got a real camera, so I don't need to bring my phone. Of course you're

02:40:58   going to bring your phone. Everybody brings their phones. It's so the government can surveil you. No,

02:41:01   that's not why. People love their phones. So you're going to have the phone anyway,

02:41:06   so you're not giving up the phone, right? You're just adding to it. And I, you know,

02:41:10   again, this question specifically looks to me like someone who says, I want pictures of my vacation

02:41:17   that don't look like they were taken on a phone. So, got to have a camera for that.
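John's point that optical depth of field requires real optics can be roughed out with the standard thin-lens formulas. This is only a sketch: the focal lengths, f-numbers, and circle-of-confusion values below are assumed ballpark figures, not specs from the show:

```python
# Thin-lens depth-of-field sketch: why a bigger sensor with real glass
# produces a shallower in-focus zone (more background blur) than a phone.
# All numbers are illustrative assumptions.

def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance (standard approximation), in millimetres."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

def depth_of_field_mm(focal_mm, f_number, coc_mm, subject_mm):
    """Total in-focus zone around a subject, or None if the far limit
    is infinity (subject at or beyond the hyperfocal distance)."""
    h = hyperfocal_mm(focal_mm, f_number, coc_mm)
    if subject_mm >= h:
        return None
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return far - near

subject = 2000.0  # subject 2 m away

# Phone-ish tiny sensor: ~5.7 mm lens at f/1.8, small circle of confusion.
phone = depth_of_field_mm(5.7, 1.8, 0.002, subject)   # ~0.9 m in focus

# APS-C: ~35 mm lens at f/2.8 framing the subject about the same.
apsc = depth_of_field_mm(35.0, 2.8, 0.019, subject)   # ~0.35 m in focus
```

Under these assumptions the APS-C setup's in-focus zone is roughly a third of the phone's at the same subject distance — the optical falloff the phone has to fake computationally.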

02:41:21   Yeah. We recently had some friends visit and one of our friends uses a small, I believe it's a Fuji,

02:41:30   like a small, I think it's a micro four thirds camera. And the photos she was able to take on it

02:41:36   were noticeably better than the iPhone photos, but not necessarily in the like, you know,

02:41:43   massive amounts of sharpness. Like that's not what I noticed about them. What I noticed about them

02:41:48   was that they just had a different color tone. Like just like the way that the camera rendered tones

02:41:54   and colors and skin tone and the color science is what they call it in the biz, right? Like color

02:42:00   science, which is so weird when I read it, but that's what they use in reviews.

02:42:03   And I wouldn't necessarily even say better. It was just different. And that was refreshing. Like

02:42:08   after seeing mostly only iPhone pictures myself for a very long time, to have a few pictures that

02:42:14   were in the, in this like group photo library we had from the trip that were taken by like a quote,

02:42:21   real camera, they looked noticeably different and it was just a refreshing thing to see. And I think

02:42:28   in some ways they were better technically. In some ways the iPhone pictures are easier to take, you

02:42:33   know, good pictures with, but it was interesting like seeing what another camera could do. And it

02:42:40   was nice, you know, the way the iPhone renders colors and contrast and stuff, it's very, you know,

02:42:48   scientifically optimized. It's like when you eat at like a fast food place, you know, like this has

02:42:54   been optimized by flavor scientists to like maximally taste exactly the way it's supposed to,

02:42:59   you know, eating like a Dorito. It's like, this is like all flavor science has gone in here, but

02:43:04   then you have like different foods sometimes that have different priorities and it's refreshing and

02:43:09   it's good and it's different. That's how the color rendering of this camera was.

02:43:12   I think it's the same as those analogies in another way. And that the iPhone photos are

02:43:17   processed, just like processed food. Like the reason they look the way they do is you're

02:43:21   starting with garbage and you really have to process it to make it appealing. Whereas the

02:43:26   ones that kind of look different are starting with a better product, as in a less noisy image

02:43:30   from the sensor. Like the iPhone is doing a lot of work. And so the iPhone pictures look the way

02:43:34   they do because the raw material they're starting with is just total garbage. And the

02:43:37   computational stuff is working overtime to combine them, denoise them, contour them, do the HDR

02:43:43   stuff, and it's amazing. Don't get me wrong. That's amazing. That's why we like the iPhone,

02:43:47   but that's why they come out so good because the phone does all that stuff. But the regular camera

02:43:52   can do so much less and just say, look, our raw material off the sensor is 10 times better.

02:43:56   We don't have to do that much processing. And you know, even for things like the colors,

02:44:00   a lot of the colors, I'm not saying they're synthesized, all the colors are synthesized

02:44:03   from various sensor readings, but like you're getting more raw material to work with from a

02:44:10   camera with a big sensor and big glass and all that. So you don't have to grind over it as much.

02:44:15   You can allow it to sort of come through as is more. And that lets you have,

02:44:18   I'm sure different kinds of quote unquote color science. Whereas the phone has to do tons of heavy

02:44:26   lifting and multiple exposure and exposure bracketing and combining to get what it thinks

02:44:31   is a representation of what's in front of it. Like, I'm not going to say that the big camera

02:44:35   looks quote unquote more natural, but like you said, it can, it can look different to you because

02:44:39   bottom line is it has been through a very different pipeline to get to the final form.
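The multi-frame combining John describes has a simple statistical core: averaging N noisy reads of the same scene cuts random noise by roughly the square root of N. A toy, hedged sketch — one made-up pixel with Gaussian noise, nothing like Apple's actual pipeline:

```python
# Toy model of "combine and denoise": averaging a burst of noisy frames
# shrinks random noise by about sqrt(N). Illustrative numbers only.
import random
import statistics

random.seed(42)
TRUE_VALUE = 100.0   # the "real" brightness of one pixel
NOISE = 10.0         # per-frame sensor noise (standard deviation)
FRAMES = 9           # burst length

def noisy_frame():
    return TRUE_VALUE + random.gauss(0, NOISE)

# Spread of single frames vs. spread of 9-frame burst averages,
# each estimated from 2000 trials.
singles = [noisy_frame() for _ in range(2000)]
bursts = [statistics.fmean(noisy_frame() for _ in range(FRAMES))
          for _ in range(2000)]

single_noise = statistics.stdev(singles)  # ~10
burst_noise = statistics.stdev(bursts)    # ~10 / sqrt(9), i.e. ~3.3
```

That factor-of-three noise reduction from a nine-frame burst is the raw material a tiny phone sensor is missing, recovered in software.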

02:44:44   It's funny coming off a beach vacation a couple of weeks ago and actually a day trip to the beach,

02:44:50   literally today, I brought the big camera with me today. I brought my GoPro with me and I brought,

02:44:57   of course, my phone with me. And the only thing I really took pictures on today happened to be

02:45:02   the GoPro, which is a terrible still camera. Like it's, it's truly bad, but I was in the water

02:45:08   and I certainly don't want to bring my big camera in there. I do, John, the same thing you do. So

02:45:12   I'm not like, I'm not absolutely opposed to it, but generally speaking, I try to avoid it if I can.

02:45:17   I have lightly cracked the back of my iPhone. And so I don't want to get that wet because I

02:45:23   am never again going caseless. Caseless! And so I was left with the GoPro because I was in the

02:45:27   water a lot today. When I was on the beach trip, I did use the big camera a lot. And it's so

02:45:33   frustrating. I've probably said this before. It's so frustrating because I'll look at the pictures

02:45:38   that the big camera took. And in terms of like having a proper telephoto lens so I can get close

02:45:44   to my subject without actually being close to them. And in terms of the bokeh, even on a,

02:45:49   I think my zoom is an f/2.8 and my prime is like an f/1.4 or something like that. And I almost

02:45:53   never put the prime on anymore because I'm trying to get close to like a moving child or whatever

02:45:58   the case may be. Or just a far away child more than anything else. And I look at these photos

02:46:03   and the bokeh is great. And, you know, I think the color is pretty good, although I'm not, you know,

02:46:07   a particularly astute critic of these things. But then I'll look at the sky and like the sky is

02:46:12   completely blown out. And so I miss the HDR of the iPhone. And then I think about how I have to go and

02:46:17   like post-process all these to put geotags in because I'm not a monster like you, Jon.
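As an aside, the geotagging chore Casey mentions is mostly a units problem: EXIF stores GPS positions as degree/minute/second rationals plus a hemisphere letter, so decimal coordinates have to be converted before a tool such as exiftool writes them. A hedged sketch of just the conversion, with made-up example coordinates:

```python
# Decimal degrees -> EXIF-style GPS: (degrees, minutes, seconds) as
# exact rationals plus a hemisphere reference letter. Writing the tags
# into a file needs a separate tool (e.g. exiftool); this is the math.
from fractions import Fraction

def to_exif_gps(decimal_deg, is_latitude):
    """Return ((deg, min, sec) as Fractions, hemisphere ref letter)."""
    refs = ("N", "S") if is_latitude else ("E", "W")
    ref = refs[0] if decimal_deg >= 0 else refs[1]
    value = abs(Fraction(decimal_deg).limit_denominator(10**6))
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = (value - degrees - Fraction(minutes, 60)) * 3600
    return (Fraction(degrees), Fraction(minutes), seconds), ref

# Illustrative coordinates only (roughly central Florida).
lat_dms, lat_ref = to_exif_gps(28.3772, is_latitude=True)    # 28° 22' 37.92" N
lon_dms, lon_ref = to_exif_gps(-81.5707, is_latitude=False)  # 81° 34' 14.52" W
```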

02:46:21   And I wish I had the iPhone. And so... Are you shooting on auto? No, I'm shooting aperture

02:46:25   priority. Well, what do you have your aperture set for? Usually like between two and four,

02:46:30   generally speaking. I don't know why your sky is blown out as much as it is when people are outdoors

02:46:34   on a sunny day. I feel like it's not a challenging HDR situation where you should be able to get

02:46:38   reasonable balance. Well, because I don't have HDR. I don't have HDR at all in this camera.

02:46:42   I know, but I'm saying even without HDR, like it's not, it doesn't seem like it would be a super

02:46:47   challenging situation to have a reasonably good exposure on the person's face that's in sunlight

02:46:51   and the sky that's behind them. Well, and also I'm firing these from the hip, so to speak, in the

02:46:56   sense that, you know, I'm not, I'm not doing hours and hours of, oh, that's exaggerating. You know

02:47:00   what I'm saying? Like a lot of calibration. No, no, no processing just right off the, right off the

02:47:04   camera. I mean, just like, I don't know enough about photography to know how, what, what you

02:47:08   might need to change other than it seems like you're overexposing a little bit. But if you,

02:47:11   if the faces, I don't know, you'd have to look at a specific picture. All I can say is like,

02:47:14   I take a lot of pictures of people at the beach and having the sky blown out behind people is not

02:47:18   usually a problem for me. And I am not doing anything particularly fancy with my camera.

02:47:22   And I think you have a better camera than I by a fairly large margin, but like I dropped in the

02:47:26   chat in, in our super secret private text channel, or private Slack channel. I don't want to put

02:47:31   these on the show notes. And I apologize for that because it has pictures of the kids, which I

02:47:36   mostly try to keep off the internet now. But if you look at the first couple of pictures, they are

02:47:41   shot on the big camera and you can tell because the subject's super close. And then you look at

02:47:45   the next couple of pictures and maybe you wouldn't, I would say the sky's blown out in the ones in the

02:47:50   big camera and maybe you wouldn't, but certainly without question, the sky on the pictures taken

02:47:57   with my phone is far better exposed than the ones taken with the big camera. And perhaps that's user

02:48:03   error on my part. Well, that's not the same sky. It's totally, it's framed totally differently.

02:48:07   No, I, I, I agree with Casey. Where the iPhone really excels is, first of all, an area I

02:48:14   forgot to mention: video. Like it's nearly impossible for lay people who are not really good

02:48:21   video shooters to get better video from any other camera than you get out of an iPhone with, with no

02:48:26   effort whatsoever. So that's part number one. But I would even say a lot of that actually extends to

02:48:30   photos now too, like what you get photo wise out of an iPhone, especially in regards to dynamic

02:48:42   range. Uh, and you know, whether it's the built in HDR stuff or just, you know,

02:48:50   various other ways that it processes dynamic range, it's so far ahead of what any

02:48:50   standalone camera does. Now there's reasons for that. You know, people who really know what they're

02:48:54   doing with standalone cameras can, you know, capture the much better data from the much

02:48:58   better optics and much better sensor and can typically do a good amount of post-processing

02:49:03   on it to do things like, you know, expose to the left or expose to the right, whichever one it

02:49:07   is where you, you basically expose for the highlights not to be blown out. And the result is

02:49:12   your shadows are super dark right out of the camera. But then in post you raise up the shadow

02:49:17   detail with all these amazing, you know, sensor dynamic ranges that we have nowadays with the

02:49:22   standalone cameras. But that all takes work and skills and talent that many of us don't have or

02:49:28   don't have time for. Right. And so what you get out of an iPhone for dynamic range is so much

02:49:36   better and more pleasing and more usable. And typically you get more dynamic range detail

02:49:41   because it's so hard for most people to use standalone cameras to capture things like

02:49:48   a bright sunny sky with anything else in the frame. Yeah. And so what I'm driving at in a

02:49:54   roundabout way is, and the pictures I've shown Marco and John are not the greatest representations

02:49:59   of, you know, like really excellent pictures that my phone has taken. Actually, John, I forgot to

02:50:04   show you, I did take a single bird picture for you since you were apparently spamming all of Instagram

02:50:09   with 300 bird pictures while you were on your beach vacation. The reason I put all those pictures is

02:50:14   they're not pictures of people who might want to have their pictures shown. So birds don't complain.

02:50:18   That's true. But anyways, you know, there are examples of pictures I took with my big camera

02:50:23   and also the day I stumbled on just now was a relatively overcast day. So in many ways, I'm not

02:50:28   giving you a great example, but you know, I would get out the big camera, particularly in zoom situations.

02:50:34   And I would think to myself, man, I'm so glad I brought the big camera, but then about half the

02:50:39   time I would think, wow, the sky is blown out, man. I kind of wish I had the iPhone for this. Oh, I got

02:50:43   to geotag everything now. I kind of wish I had the iPhone for this. So the big camera definitely has

02:50:48   space in my life and that's why I still bring it. But as I've said many times over the last couple

02:50:53   of years, as the iPhone gets better and better, if it wasn't for just having such better glass on this

02:50:59   camera, I don't think I would ever use it. But to get that really decent bokeh, to get some, I would

02:51:05   argue in some cases, some much better color, I really do need to get out the big camera. And that's

02:51:10   not really a complaint, but it's just, it's wild to me how in just a few years, again, we've said this

02:51:15   many times on the show, in just a few years, we've gone from, yeah, we'll use the iPhone in a pinch to,

02:51:22   yeah, I'll use the big camera when I like really want to get a really good picture and God, what a

02:51:27   pain in the ass it is. It's just such an unbelievable change from the way it used to be. And that's a

02:51:33   good thing in the grand scheme of things. But as someone who wants to be an ever better amateur

02:51:40   photographer, I feel like it is limiting for me to only use my iPhone, which is also not really true

02:51:46   because you can get incredible shots from an iPhone if you work at it. But I don't know, it's just a

02:51:49   very odd place to be, that here it was, I had the big camera with me and I had people that I wanted

02:51:54   to take pictures of with my big camera, including not only my family, but the family that we were

02:51:59   visiting with. But I ended up just using a friggin GoPro because that was the most convenient tool

02:52:04   for that particular work.

02:52:05   [beeping]
