PodSearch

The Talk Show

275: ‘Fake Faces’ With Glenn Fleishman

 

00:00:00   Oh man, Glenn Fleishman, you're back on the show. We have so much to talk about.

00:00:04   We'll never get to all of it, but we can try.

00:00:08   [laughter]

00:00:10   It's a good problem to have.

00:00:12   Thanks for having me back.

00:00:14   It's a quiet week.

00:00:16   What do you want to start with? I say we start with the...

00:00:18   Jeopardy! Jeopardy!

00:00:20   So, number one, I knew this.

00:00:22   I don't know that we've ever really spoken about it at length, and if we have, I've forgotten about it.

00:00:27   But you were on Jeopardy!

00:00:29   When? When were you on?

00:00:32   2000... I taped in August 2012. My episodes went up

00:00:35   October 2012 and I won! I won two episodes by the skin of my teeth. That's how I describe it.

00:00:42   I still had my... now I'm too old. It's a young man's game, young person's game.

00:00:46   But I won two episodes and over $30,000

00:00:50   Which is kind of neat because my whole life people had said you should go on Jeopardy!

00:00:54   You seem to remember a lot of trivia and things that aren't important

00:00:57   Yes, I do

00:00:59   And it's, well, you've got to have both. It's

00:01:01   at least two skills. It's more, you know, it's like any kind of athletic competition. It's

00:01:07   multivariate. You obviously need to know the trivia, but you've also got to be fast. I

00:01:13   don't, I don't have the eye.

00:01:15   There's no way, even if I got like an exception and got to buzz in ten seconds late,

00:01:21   that I would get shut out on Jeopardy. I would get a couple in.

00:01:26   But I'm too slow, and I've always been too slow in my mind.

00:01:29   You saw that thing?

00:01:31   There's an article a few months ago about how they've finally done the metabolic testing on Grandmaster chess players

00:01:37   Mmm, and yeah, those folks, they burn like thousands of calories a day. Yeah, for years people in the chess world said,

00:01:43   you know, this may seem not that demanding, but in fact, you know,

00:01:48   we can lose 15 pounds over a tournament.

00:01:50   Everyone's like, yeah, yeah, you little nerds or whatever.

00:01:52   So anyway, they finally started doing

00:01:54   some of the real lab testing.

00:01:56   And these guys are burning like they're doing

00:01:58   the Tour de France practically.

00:02:00   And so they now, the best chess grandmasters,

00:02:03   now have a nutritionist.

00:02:04   They have a plan for like, you know,

00:02:06   carbo-loading or protein-loading.

00:02:08   They work out like athletes between all the chess practice

00:02:12   they do with their coaches and themselves.

00:02:14   And it makes a huge, I mean, it's like,

00:02:16   you need the physical edge to play that game. And I think that's like wonderful and hilarious that

00:02:22   it took this long for anyone to actually believe them.

00:02:25   I totally saw that. I did see that story. I'll put it in the show notes. I

00:02:30   swear to God, I just wrote it down. But the basic idea is that there are

00:02:37   comparisons between the human brain and our computers today, and the idea of, you know, how do we

00:02:45   get AI to work like a human brain, blah, blah, blah. It's, you know, it's a

00:02:49   decades-old thing and part of science fiction. But at some fundamental level, the brain is like

00:02:55   a computer. And the harder it works, the more energy it consumes. It totally consumes calories

00:03:03   to intensely concentrate on something such as, for example, grandmaster-level chess,

00:03:10   which, you know, I don't see how anybody could dispute that a game

00:03:16   like chess or Go or something like that

00:03:19   takes intense concentration.

00:03:23   And it consumes serious amounts of calories, like you said,

00:03:30   like over the course of a week-long or 10-day match they could lose 15 pounds.

00:03:35   Yeah, they need they need guys with panniers coming up and like handing them energy bars.

00:03:40   - Yeah.

00:03:41   - And, I mean, it is really,

00:03:42   I think it was like five,

00:03:43   I wanna say it was five or 6,000 calories a day,

00:03:46   you know, a normal human adult person

00:03:49   is somewhere in the two to 3,000 calorie range,

00:03:51   what you consume with, you know, normal activity.

00:03:53   So it is getting into this like a high athletic thing.

00:03:57   So I think about that with "Jeopardy!"

00:03:59   like "Jeopardy!" is a very funny game, right?

00:04:00   Because at its face value, when you play it at home,

00:04:04   it looks like a trivia game with some timing.

00:04:07   And then when you play it in reality,

00:04:09   And I kind of prepped for it when I went in.

00:04:12   I'd read a bunch of books by people.

00:04:14   I always cite Bob Harris's "Prisoner of Trebekistan,"

00:04:18   which is a wonderful book because it's half memoir.

00:04:21   And Bob is a very interesting and lovely fellow.

00:04:23   And he's done a lot of interesting things in his life,

00:04:25   including writing for TV and being involved

00:04:28   in microloans and whatever.

00:04:29   But this book is like, it looks like it's a story

00:04:31   about winning on Jeopardy and a strategy guide.

00:04:33   And then it becomes sort of memoir.

00:04:35   And so anyway, from that book,

00:04:37   I'd say a big chunk of why I won two games was strategies that he described in the book

00:04:42   that I wouldn't have known otherwise.

00:04:44   He understood how to bet.

00:04:46   And that was... I remember Arthur Chu, who had a good run, I think a 15-day run, a few years ago.

00:04:51   Arthur Chu drove people nuts because he jumped all over the board, which has now become

00:04:56   more of a standard playing style and had been a style occasionally before him.

00:05:00   But he really understood how to bet.

00:05:02   And James Holzhauer was a demon at betting,

00:05:06   and I had no idea.

00:05:08   So I went in there from that book having a,

00:05:10   so you need to know how to wager,

00:05:12   you need to have fast reflexes to buzz in on time,

00:05:15   you need to be able to have that fast memory recall.

00:05:17   So you see it, you know you know it,

00:05:19   you buzz in, and you can produce

00:05:20   the sounds from your mouth.

00:05:22   And do it in this incredibly, you know,

00:05:24   in front of an audience, a live studio audience.

00:05:27   And it's like, anyway, here's the funny thing.

00:05:31   I describe the experience as being like the, you know,

00:05:35   the Willy Wonka movie, the original one with Mike Teevee.

00:05:38   And he gets onto the TV set and he like stands in the TV

00:05:42   and he's beamed into the TV and he's tiny.

00:05:45   When you walk out onto the "Jeopardy" set, it is exactly,

00:05:48   I mean, not like I thought there was CG or something,

00:05:50   but it is exactly like you see.

00:05:52   So it is weirdly like walking into a TV set.

00:05:56   Well, here's the funny part.

00:05:57   Last year, Paris Themmen, the guy who played Mike Teevee

00:06:00   in the movie, was on Jeopardy.

00:06:02   - Really? I didn't know that.

00:06:04   - He didn't win.

00:06:05   Alex did not ask him about the movie,

00:06:08   but everyone's like, wait a minute,

00:06:10   that's Mike Teevee up there.

00:06:12   And his wife had actually played previously on Jeopardy

00:06:15   and I believe won a little bit.

00:06:16   - One of the things that I've always enjoyed

00:06:21   and these two shows have been on, man,

00:06:24   almost as long as I can remember,

00:06:27   Jeopardy and Wheel of Fortune and they're paired. I think they're paired in syndication markets. Yeah

00:06:33   I don't know if there's any... I've never been aware of a market where they're not.

00:06:36   Where I live here in the Philadelphia market, they've always been on, Jeopardy at seven, Wheel of Fortune at 7:30.

00:06:45   We're backwards. Oh, are you really? Yeah, we get Wheel first, then Jeopardy. Isn't that weird? Oh.

00:06:52   Well, I've always thought that that order made sense, because you watch Jeopardy first and then you feel dumb, and then you watch Wheel

00:06:59   of Fortune and you feel really smart.

00:07:02   Because, you know, it's easy to shout out the answer

00:07:09   before the contestants can, because, you know, they don't have to buzz in, there's no reflexes.

00:07:15   They can just they just say I would like to solve the puzzle

00:07:19   Right, and then they guess. So, you know, if it's pretty obvious and you're, you know,

00:07:24   good at that sort of

00:07:26   puzzle solving, you know, it's a lot easier to feel smart watching Wheel of Fortune than watching Jeopardy.

00:07:32   So I always felt that the Jeopardy-then-Wheel of Fortune order

00:07:35   is the right order. Yeah, and you get to kind of calm down as opposed to key yourself up.

00:07:41   So the other thing, some of the things I've read about, and I've watched for years,

00:07:46   never was tempted to go on. Like I said, I mean, you know,

00:07:48   I was on the quiz bowl team, or whatever they called it, in high school, and did okay at that level.

00:07:53   But, you know, Jeopardy is obviously... it's like saying you played high school basketball versus, you know,

00:07:58   you played the pros, or even in college, you know.

00:08:02   You know, there's a definite difference between being a recreational high school quiz bowl player and, you know, the A-league.

00:08:11   But the one thing I didn't know, and I'm still not clear about it, is when you're allowed to buzz in. Oh.

00:08:17   Yeah, yeah, yeah. So, and this is the funny thing, I read this story recently.

00:08:24   I think it was while this Greatest of All Time tournament was going on. Someone wrote a clever story,

00:08:28   I think it might have been by a former Jeopardy contestant, that

00:08:32   Jeopardy is the quietest show of any game show you can think of. I'd never thought of it.

00:08:37   I've watched the show my whole life,

00:08:38   I was on the show, and I never thought of it. And they said, really, the buzzing device,

00:08:43   as we call it, what Jeopardy calls a signaling device, it doesn't buzz.

00:08:46   There's no sound when you click in. There's a little sound that plays if your time runs out for a question.

00:08:53   Right, and the Daily Double thing does that,

00:08:55   you know, kind of like a special TV sound, but otherwise, it's really pretty quiet, and then there's the Jeopardy music.

00:09:00   I was like, wow, that's wild. So you don't have a lot of... there's very little in the way of aural cues for timing.

00:09:06   So when you're standing behind the podium, you're looking at the game board and when they show the game board at home

00:09:11   they frame it so you can't see that on the left and right are a set of lights and

00:09:16   Alex reads the clue and

00:09:20   The moment he stops, when he reaches the end of the clue,

00:09:23   one of the producers of the show, who is sitting at the stage watching and listening to him, hits an unlock button.

00:09:29   When that producer hits the unlock button, and it is typically the same person over a season,

00:09:35   I think it's been one person for many years now,

00:09:38   The lights on both sides of the board go out

00:09:41   so that you know you can, or they light up.

00:09:43   Isn't that funny?

00:09:44   I can't remember if they go out or light up,

00:09:46   but I think they light up, sorry.

00:09:47   They light up to indicate that you can ring in.

00:09:49   And at that point you can buzz in.

00:09:51   If you buzz in before the signals are unlocked,

00:09:55   then you are locked out for something like,

00:09:57   I think it's a half a second,

00:09:58   but you know, it's substantial in that world.
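00:09:59   [The unlock-then-lockout mechanics described here can be sketched as a tiny state machine. This is a toy model, not anything official: the class and method names are invented for illustration, and the half-second penalty is Glenn's recollection, not a confirmed figure.]

```python
class SignalingDevice:
    """Toy model of the ring-in rule described above: buzzing before the
    producer unlocks the system earns a brief lockout penalty.
    All names here are invented for illustration."""

    LOCKOUT = 0.5  # seconds of penalty for an early buzz (assumed figure)

    def __init__(self):
        self.unlocked = False        # producer hasn't hit the unlock button yet
        self.locked_out_until = 0.0  # time until which this player is penalized

    def unlock(self):
        """Producer enables signaling when the host finishes the clue."""
        self.unlocked = True

    def buzz(self, now):
        """Return True only for a valid ring-in at time `now` (seconds)."""
        if now < self.locked_out_until:
            return False  # still serving a lockout penalty
        if not self.unlocked:
            self.locked_out_until = now + self.LOCKOUT  # buzzed early
            return False
        return True
```

Buzzing at t=0.0, before the unlock, fails and starts the penalty; a second buzz at t=0.2 fails even if the producer unlocked at t=0.1; a buzz at t=0.6 finally registers.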

00:10:01   So some Jeopardy players listen to Alex's cadence

00:10:05   and time themselves to figure out when he's done.

00:10:08   And the producer is going to click the button

00:10:10   and they time it to that.

00:10:12   Other people watch for the lights.

00:10:15   So, like, Ken Jennings is an Alex's-cadence person.

00:10:19   James Holzhauer said he follows the looking

00:10:23   at the lights thing.

00:10:24   So you can't say like one or the other gives

00:10:25   you the championship skills.

00:10:27   Clearly, because the two best players in Jeopardy's

00:10:30   history, basically, two of the very best players,

00:10:33   have different strategies.

00:10:34   But that's the thing, because you get locked out.

00:10:36   Apparently I think in the very early Jeopardy!

00:10:40   games maybe in the first season because it started in the 60s

00:10:44   you could ring in at any time, and apparently it

00:10:47   was very frustrating, like the host didn't like it.

00:10:48   It was Art Fleming in those days and I think

00:10:51   it did not test that well because people wanted

00:10:53   to hear the whole question because it's meant

00:10:55   for home viewers.

00:10:56   And if you don't hear the whole question, then you don't

00:10:59   have time to think of an answer. If someone's in,

00:11:00   it's like, what was that? You know, that's it.

00:11:03   The show was too fast and weird. Yeah, I always assumed, because it doesn't make a

00:11:09   buzz or a bleep or anything when somebody buzzes in. And it's interesting, because most game shows throughout history with some kind of

00:11:18   buzzer-clicker thing, you know, make some kind of noise. I think of

00:11:22   Family Feud, you know, which has the big red thing that you slap your hand on.

00:11:27   But

00:11:30   the gist of those

00:11:33   games, it makes sense that there's a sound effect. Whereas on Jeopardy,

00:11:37   I would guess

00:11:40   easily

00:11:42   95% of questions get at least one

00:11:45   guess. Oh, and there's another sound effect, there's a sound effect when nobody,

00:11:49   When nobody takes a guess at a question. Oh, yeah, it's like

00:11:54   Yeah

00:11:58   But because every question does get at least one answer,

00:12:02   it makes sense that there's no buzz, because it would be a non-stop string of buzzes.

00:12:07   You know the buzz isn't helpful if it's every single thing, you know, it's almost like

00:12:11   the way that, in movies, when people are typing on computers, every time they click the mouse or type a key, it beeps.

00:12:20   Right, like people do the "I'm hacking your computer" thing.

00:12:26   It doesn't make any sense. It would drive you crazy if your computer beeped every time you typed a key.

00:12:30   Yeah, I think it would be very distracting if every time you buzzed in on Jeopardy it made a noise or something.

00:12:36   But as a viewer on TV, I have assumed for decades

00:12:40   that you were allowed to

00:12:43   buzz in whenever you wanted to but

00:12:46   Alex Trebek would continue reading the question regardless and then when he's done reading they cut to

00:12:54   the podium, and whoever buzzed in first, at whatever point,

00:12:58   theirs is the podium that's lit up, and they get to answer. I did not realize that you can't,

00:13:03   you can't click in, you can't buzz in whenever you want to. I guess buzz is the wrong word.

00:13:08   You can't click in until Trebek is done reading. I really didn't realize that

00:13:13   But that's interesting because it it gives you more time

00:13:17   I always thought that there must be a tremendous advantage to people who could read fast because

00:13:24   they could buzz in, but there's not. You have at least as long as it takes Trebek to

00:13:29   read the question. I had no idea. And I kind of feel like there should be some

00:13:36   indication of that on the show. I mean, it's, you know, too late now, I guess, to change it, but...

00:13:40   Yeah, I don't think I understood it either, yeah, 'cause they never discuss it on the show,

00:13:46   right, you only know it if you read about it.

00:13:50   But, you know, I guess they never decided to explain it, because at the beginning of every

00:13:56   episode Alex could have said,

00:13:58   "Contestants can't ring in until the question is finished being read." And it's possible, I don't remember, it's possible

00:14:02   they said that in decades past and stopped because everyone knew, but, you know, newer viewers... but I don't remember ever

00:14:08   hearing that and it becomes this wild thing because

00:14:12   When Ken Jennings went on his winning streak in 2004,

00:14:18   After I forget how long he had been playing,

00:14:20   how many games he'd won.

00:14:22   And the producers are like, this is terrible.

00:14:23   I mean, this is great, but it's also terrible.

00:14:25   No one is ever going to beat a guy

00:14:27   'cause he has now played more games of Jeopardy

00:14:29   than anyone.

00:14:30   Like remember that until,

00:14:32   it was just before the season he was on,

00:14:34   you could only win five games in a row.

00:14:36   Then you got a car, or two cars.

00:14:38   It was the end of the season

00:14:40   and they didn't have a choice, they'd give you two cars.

00:14:42   So Bob Harris got two cars, Brad Rutter got two cars.

00:14:46   You could sell them, and you paid the tax on them.

00:14:48   Then some people sold them.

00:14:49   But so Ken had, you know, on game six or seven,

00:14:53   if you exclude the tournament play,

00:14:55   he had played the most regular games of Jeopardy

00:14:57   of anyone ever.

00:14:58   So after 20 games, he has massive amounts of experience

00:15:02   that nobody else playing him can match.

00:15:04   So the producers started to do more training.

00:15:07   So, and this is well in advance of when I got there in 2012,

00:15:11   when you get there in the morning, there's a bunch,

00:15:13   there's like a couple hours of orientation.

00:15:15   "Oh, John, can I tell you, the funniest thing is,

00:15:17   "you know those incredibly awkward stories,

00:15:19   "the little Alex chit chat?"

00:15:20   - Yeah, yeah, yeah.

00:15:21   - Okay, so this is the most important part of the show.

00:15:23   - Before Double Jeopardy.

00:15:25   - Before Double Jeopardy.

00:15:27   You spend so long.

00:15:28   So when you apply to the show,

00:15:29   you take a test, online test, anyone could take it.

00:15:31   Everyone should sign up for this

00:15:33   if you have any trivia interest,

00:15:34   'cause it's totally painless online.

00:15:37   And they test like 100,000 people with each test.

00:15:39   Then they select a few thousand to do in-person interviews.

00:15:42   About 400 people a year, I think, go on the show,

00:15:45   if I do the math right.

00:15:46   And so once you get to the audition stage,

00:15:49   before you even go in in person to audition

00:15:51   in your city or wherever, they're like,

00:15:54   "Write up three stories of the kind you would hear on Jeopardy,

00:15:56   like an anecdote about your life, something that's interesting."

00:15:58   Then you do the audition, then they make you

00:15:59   tell those stories, then they ask you for more stories.

00:16:02   Then when you get to Culver City, California,

00:16:04   to the Sony Pictures Studios, you spend so much time,

00:16:08   it is like the most time you spend at the show

00:16:10   is crafting your terrible little anecdote for chit chat

00:16:14   'cause it's the hardest thing to get people

00:16:16   to have a little story.

00:16:18   So they work on that part and it's distracting too,

00:16:20   which is good.

00:16:21   But then the other thing is you play several games.

00:16:24   So they do, I think we spent an hour or two in orientation,

00:16:27   making sure everyone understands all the law involved

00:16:30   'cause there's a lot of game show law

00:16:31   since the quiz show scandal

00:16:33   and how you can, if you feel like something is unfair,

00:16:35   there's a representative there

00:16:36   that you can talk to directly to complain.

00:16:40   and that sometimes results in changes.

00:16:41   And it's very rare, but it does.

00:16:44   Anyway, so then you go on stage

00:16:46   and one of the contestant coordinators plays Alex

00:16:49   and you run through with the podiums,

00:16:51   they sweep people through, they run through categories

00:16:53   and the producer who unlocks the signaling system

00:16:57   is actually there.

00:16:58   And before Ken Jennings, that person wasn't there,

00:17:00   just anyone was managing it, and they decided

00:17:03   it was such an advantage that Ken had learned the timing

00:17:06   between the question and the producer so well

00:17:08   that they had to have that person there, or it was, you know, too unfair

00:17:13   for other people playing.

00:17:14   Interesting.

00:17:15   But in the end, Ken Jennings lost in a Final Jeopardy.

00:17:18   So he didn't lose on regular gameplay.

00:17:21   Hmm.

00:17:22   I'm trying to think... Oh, I have to ask, this is the sort of thing I care about.

00:17:26   Like the, as they call it, signal indicator,

00:17:30   Does it have a good feel in your hand?

00:17:31   Is it is it a good feeling device?

00:17:33   Does it have a good click to it?

00:17:35   Yeah, it's um, it's

00:17:37   I'm trying to think what it feels like.

00:17:41   'Cause usually it doesn't click, you depress it.

00:17:44   And you can, and some people have the style

00:17:46   that you press it a lot in a row

00:17:47   so that if you get in too early,

00:17:50   you can still get in past the next person

00:17:53   who hasn't signaled, if that makes sense.

00:17:55   - Get in at the quickest possible time after that.

00:17:58   - Yeah, I'm trying to remember.

00:17:59   That's so funny when you say that.

00:18:00   I don't think, they tell you to practice with a pen

00:18:03   and I did that.

00:18:03   And actually at the audition,

00:18:05   they give you a cheapo Jeopardy branded pen

00:18:07   And I took that home and I sat there and practiced with that as the clicker.

00:18:10   And that's a traditional thing.

00:18:11   Serious, you know, like super-high-level players who go back to tournaments, some of them

00:18:16   have actually built training systems,

00:18:18   or had them built for them, often by people who work in the game show, you

00:18:23   know, device-building industry.

00:18:24   They will buy a clicking system or a buzzer system to test,

00:18:29   So they're able to work on their skills.

00:18:32   But it is athletic in that sense, you know: if you can shave half a millisecond off your

00:18:37   timing, then you could come in first. So, I mean, this is the other thing: even when

00:18:41   you're not playing at tournament level, like the GOAT tournament,

00:18:45   everybody on stage has been auditioned, so that most of the time, say,

00:18:51   two or three people will know the answer to any given question. And some of them

00:18:55   are a little harder, and only one person or nobody knows, or the person who knows

00:18:58   doesn't ring in, they're not secure enough. But a lot of the time it's two, and

00:19:03   often all three, and you're just working on timing to get in there. So if your

00:19:10   nearly every question and still easily lose the game.

00:19:12   Yeah, I totally understand that. That's fascinating. Also, what

00:19:18   about the touchscreen, where, I guess, you use it twice.

00:19:24   First you write your name, and so your name is written there,

00:19:28   you know, in white pixels on a blue background. And

00:19:31   then for Final Jeopardy, you use it to write your answer.

00:19:36   What is the user interface for that?

00:19:39   Oh, it's, it's terrible. It's really low resolution. It's really hard to do it. So when you see people with these really horribly written things in Final Jeopardy, they're barely legible. They should really... I mean, I think the technology has come a long way. I mean, in 2012 maybe it was still, you know, acceptable, but in 2020, I think they should have upgraded those to slightly higher resolution and quality, perhaps.

00:20:06   I was actually really impressed. James Holzhauer would draw a different design, I think, almost every night after he started winning.

00:20:10   Then I was like, man, he must have practiced with some system that used a stylus and, you know, a crummy old input thing, because he really actually got some good legible designs on his name.

00:20:23   Before Final Jeopardy... now, for those of you, I mean, I'm making a big assumption here that people know what Jeopardy is.

00:20:32   And I guess if you're listening from outside the US, maybe you don't.

00:20:36   Sorry, but it's a really fun game.

00:20:37   But there's two rounds where you answer trivia questions

00:20:41   and there's three contestants who buzz in.

00:20:43   But then in the final round, it's just one question.

00:20:47   All they tell you in advance is the category,

00:20:49   which often is some kind of pun and doesn't give much away.

00:20:53   Sometimes you have a good guess what it's going to be about, sometimes not.

00:20:57   And then each contestant secretly wagers some amount of the money

00:21:02   they've already won, all or nothing, on whether they get the answer correct, and they do it in written form,

00:21:08   of course, because, you know... and then they reveal them one at a time, because otherwise, you know, how else would you do it?

00:21:13   Do they give you like pen and paper so you can like work out the math because there's a lot, you know

00:21:18   There's a lot of strategy involved. Like if you're in first place

00:21:22   You basically want to bet enough so that if you're right,

00:21:28   You'll win no matter what the second place contestant bet

00:21:32   So if the second place contestant bets every dollar they have you want to make sure that if you're right you still come out ahead

00:21:38   But you probably don't want to bet everything because then if everybody's wrong

00:21:42   you want to still have the most money, because that's ultimately all that matters.

00:21:47   Do they give you a pen and paper to work that out? Yeah, see, John, I can tell that you gamble, because, oh yeah,

00:21:53   You get it. You totally get it. That is the crux of it.

00:21:57   And I've seen people lose games because they didn't do all the math.

00:22:01   And yeah, they do. They literally give you a pencil and paper.
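00:22:04   [The wagering arithmetic John lays out above can be sketched with a couple of hypothetical helper functions. The names and example dollar amounts are made up; the cover-bet rule itself is the standard one he describes.]

```python
def leader_cover_bet(leader, second):
    """Smallest wager that guarantees the leader wins if they answer
    correctly, even when second place doubles their score."""
    return max(0, 2 * second - leader + 1)

def leader_if_wrong(leader, second):
    """What the leader keeps after missing on the cover bet."""
    return leader - leader_cover_bet(leader, second)

# Made-up example: leader has $20,000, second place has $12,000.
# Cover bet: 2 * 12,000 - 20,000 + 1 = $4,001.
# A miss leaves $15,999, which still wins if second place also bet big and missed.
print(leader_cover_bet(20000, 12000))  # 4001
print(leader_if_wrong(20000, 12000))   # 15999
```

This is the trade-off John points out: betting more than the cover bet gains nothing against a second-place double-up, while betting less than everything keeps you ahead in the everybody-misses case.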

00:22:04   And the thing they don't show you, this is the only thing, you know, they say,

00:22:07   Jeopardy is not edited for answers. They sometimes edit it for outcomes.

00:22:12   They edit it for timing. So sometimes it seems like people are playing boom, boom, boom.

00:22:16   It's because the episode went a little long.

00:22:18   I mean, they have a time for each round and they'll actually pull out

00:22:23   people's, like, pauses between answering, or between buzzing and answering, whatever.

00:22:28   So sometimes you're like, God, people are playing super fast.

00:22:31   It's like, no, no, that's, that's the edit.

00:22:32   But here's the thing they don't show you: Alex exposes the Final Jeopardy

00:22:36   category. And we were watching during the Greatest of All Time tournament and

00:22:41   our whole family was watching, and the category was something like Presidents and

00:22:45   Bibles. And we're like, Oh, what the, how do you, what? All right. Okay, sure.

00:22:50   Something like that. Anyway, so after this,

00:22:52   Then they go to commercial break.

00:22:54   So on set, since they're obviously, it's not a live show,

00:22:58   they give you literally as much time as you need.

00:23:00   They put in partitions between contestants before you start,

00:23:05   and then they give you as much time as you need

00:23:07   to make the wager.

00:23:08   So sometimes I think it might've been five plus minutes,

00:23:12   and they kind of push you along if it's taking a bit,

00:23:14   'cause the audience is there, it's restless,

00:23:15   and they won't give you forever.

00:23:16   But it's not like you have 10 seconds

00:23:18   or 30 seconds to make the wager.

00:23:19   So in my, was it my second game, I...

00:23:23   - Well, in other words, basically,

00:23:26   the basic thing you're trying to say

00:23:27   is you have more than the commercial break,

00:23:29   which is probably like 90 seconds or two minutes.

00:23:33   - Yeah, so you can have five minutes, whatever.

00:23:34   They're so nice.

00:23:35   The contestant coordinators, they bring out water to you,

00:23:39   they check on how you're doing, the whole show,

00:23:41   I don't think I've said this so far,

00:23:42   the whole show is the most humane experience.

00:23:45   They do such good hiring.

00:23:46   Everybody there is such a lovely person.

00:23:49   I remember having a great time with the sound guy,

00:23:51   with the makeup people.

00:23:54   You know, just, I went back to a taping a few months later,

00:23:57   one of the contestant coordinators saw me in the audience.

00:23:59   They got me tickets.

00:24:00   They gave me a VIP, like, friend-of-Jeopardy sticker.

00:24:04   One of the contestant coordinators sees me,

00:24:05   comes up, gives me a big hug.

00:24:07   And I'm like, oh, it's such a lovely show.

00:24:10   And I don't know how much that comes through,

00:24:12   but they really do everything they can

00:24:14   to make the contestants feel good.

00:24:15   So anyway, so the final Jeopardy thing,

00:24:18   my second game I was very unsure about the category and in the end I got the

00:24:22   wrong answer. I was not the number one, I don't think I had the most cash, so I had

00:24:28   to bet so that I would be left with enough money if I was wrong and the

00:24:33   person who had the most cash was wrong and the person who had the third most

00:24:37   cash was right, and that is what happened. The number one and number two person,

00:24:42   this other woman and myself, were both wrong; the third person bet everything

00:24:48   and was right and I beat him and he was so peeved.

00:24:51   He kind of kept looking at the screen

00:24:53   'cause he's like, he didn't really,

00:24:54   I mean, so I didn't bet the maximum,

00:24:56   which is what you'd expect

00:24:58   because I only had to bet enough to beat...

00:25:00   I knew I couldn't beat the number one person

00:25:02   if she bet everything.

00:25:03   So anyway, she bet the maximum,

00:25:05   she needed to beat my maximum.

00:25:06   So I bet strategically

00:25:08   and that is the only reason I won that game.

00:25:10   - Right, there's a strategy there

00:25:11   where you're thinking, it's not so much,

00:25:13   how can I get the highest dollar amount?

00:25:16   it's what can I do here that gives me the most likely chance of winning? And

00:25:23   in some cases, if you're behind, your most likely chance of winning is, what if the

00:25:28   people ahead of me get it wrong? Yeah, and then, what if we all get something wrong?

00:25:33   And James Holzhauer's 34th game, when he lost, people went nuts, because he bet at a level where

00:25:41   he's there like he's throwing the game. Why do you throw the game? They get tired of playing.

00:25:44   Like, no. He bet on the assumption, against the woman who ultimately beat him,

00:25:49   that he needed to have enough money left to beat the number three player, who in the

00:25:55   actual event, I think, was wrong.

00:25:57   But if that player had the right answer, he would have, like, twenty-two thousand

00:26:01   dollars, and James needed to wind up with twenty-two thousand and one.

00:26:04   So he had to bet that conservatively so he could still beat her, the number one player, if she was wrong.

00:26:09   Anyway. But people went crazy.

00:26:11   They were like, oh my God, why did he throw the game? Like, no, no, no.

00:26:13   Let's do the math here: there are like eight possible outcomes, and he bet correctly for every outcome in which he could win.

00:26:21   yeah, and
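That outcome enumeration is easy to sketch in code. This is a hypothetical illustration only; the scores and wagers below are made up to mirror the shape of Glenn's story, not numbers from any actual game:

```python
from itertools import product

def final_jeopardy_outcomes(scores, wagers):
    """Enumerate all 2^3 right/wrong combinations for three players and
    return, for each combination, the index(es) of the winner(s)."""
    results = {}
    for outcome in product([True, False], repeat=3):
        finals = [s + w if right else s - w
                  for s, w, right in zip(scores, wagers, outcome)]
        top = max(finals)
        results[outcome] = [i for i, f in enumerate(finals) if f == top]
    return results

# Hypothetical scores: leader, second place, third place.
scores = [20000, 15000, 7000]
# Leader covers second's all-in double; second bets small enough to stay
# above third's all-in double even when wrong; third bets everything.
wagers = [10001, 999, 7000]
out = final_jeopardy_outcomes(scores, wagers)

# The scenario from the story: first and second wrong, third right.
# Second place still wins (14001 vs. 9999 and 14000).
assert out[(False, False, True)] == [1]
```

Had second place bet the maximum instead, that same outcome would flip to the third player, which is exactly the trap the small strategic wager avoids.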

00:26:23   So anyway... the greatest of all... I think that wraps up your first-hand experience on it.

00:26:28   Oh, yeah, and that leads us to this greatest of all time

00:26:31   tournament which is

00:26:34   fascinating. So the three contestants

00:26:36   were... I had this in a screenshot somewhere... it was Ken Jennings,

00:26:43   who won 74 times in a row; what's the guy's name from Philly? James...

00:26:53   Brett... Brad Rutter. Brad Rutter, yeah. Who's not as well known. Right, not as well

00:26:59   known, because he originally played back in the five-time-champions-and-you're-out era,

00:27:05   but then wound up making his claim to greatest of all time by coming on their

00:27:12   tournaments of champions where they would bring back five-time champions and

00:27:17   play in, like, a tournament. And he, according to the Jeopardy bio, up until

00:27:22   now had never lost to a human being before; I guess he lost to the IBM Watson

00:27:26   one time in another one of these gimmicky special events. Yeah, he won, I

00:27:32   think it was... because there was a Tournament of Champions, and Jeopardy in the

00:27:35   early 2000s was still popular enough that they did some primetime specials.

00:27:39   And this one is the first one they've done in many years.

00:27:42   And so they do like the ultimate tournament,

00:27:44   the tournament one, and then they did some tournament

00:27:45   of decades, the blah, blah, blah.

00:27:47   So he won, I think either he won four

00:27:50   and was on the winning team for the fifth.

00:27:53   So he won like a million dollars,

00:27:54   two million dollars, a million dollars, you know,

00:27:57   300, whatever it was, and even got,

00:28:00   I think it was 200 grand from the IBM Watson thing.

00:28:03   So he racked up, I think it was $4.4 million

00:28:07   in Jeopardy play after his appearances, because he won like 100 grand, including the car value, on the original appearance, and then like 4.3 million after that. And he beat Ken Jennings in four of those outings.

00:28:23   Four point seven million dollars total. That's right, that includes the last tournament. So before that... and so Brad Rutter was for a while the winningest; he'd won the most money of any person in game show history, American game show history. Then Ken briefly got ahead of him because Ken won some money on some non-Jeopardy game shows, then Brad won another tournament or two. And now Ken is the winningest game show contestant of all time.

00:28:52   And then the third is the guy who

00:28:55   shot to fame, I think, over the summer... just in the last year, late spring: James Holzhauer,

00:29:02   who

00:29:04   Really shook things up by playing in an unprecedented style

00:29:08   and

00:29:10   His claim to the best of all time is that he has

00:29:13   all 15 of the top single-game winnings.

00:29:18   So in other words: how much can you win in one single 30-minute match of Jeopardy?

00:29:24   Spots 1 through 15 on the list are all James Holzhauer, and

00:29:29   It really felt for a while like he had broken the game

00:29:33   But I always thought, while he was on his run, I was like, this is genius. And

00:29:39   it happens in pro sports all the time where somebody... it doesn't really...

00:29:46   or I shouldn't say all the time, but occasionally, like, a coach or a team will come up with a strategy,

00:29:52   yeah, that nobody had ever seen before. Like, when you and I were young...

00:29:57   I don't know if you're... you're probably not... I know you're not a big sports guy, right?

00:30:00   But I like the sports math. But there is... when I was in high school,

00:30:06   so this is like around 1990, 1991. There was... a typical men's college...

00:30:12   well, actually men's and women's scoring is pretty close, but in college basketball, that's a 40-minute game,

00:30:17   it used to be a 45-second shot clock, and a typical score of

00:30:23   top teams

00:30:25   Is somewhere, you know, like a winning score is often somewhere around 70 to 80 points, you know

00:30:31   And you know that you know, there's a lot of variation. That's just the way it goes

00:30:36   so you might see a 50 to 55 game once in a while and

00:30:40   sometimes there's high-scoring teams that get close to 90 and

00:30:44   For a college game a hundred points is a lot. I mean that that means you were really really off the charts

00:30:52   Just because it's only 40 minutes whereas NBA plays a 48 minute game and therefore those extra it's you know, it's

00:31:00   That's significantly longer and they have a shorter shot clock and they're better shooters because that's the NBA versus college

00:31:08   So, you know, NBA typically, you don't win games with fewer than 100 points.

00:31:13   But in the 90s, there was this right around 1990-91, there was this team called Loyola

00:31:19   Marymount, and they were scoring like 160, 170 points a game because they played like their

00:31:28   pants were on fire all 40 minutes.

00:31:32   So like imagine like the way that you would coach the team.

00:31:35   You don't have to be you don't have to really know much about basketball.

00:31:37   You just have to just have a basic idea of the game.

00:31:39   But imagine you're down by six points

00:31:42   with a minute and a half to play.

00:31:44   The way that you would instruct your team to play

00:31:46   is full court press, cover them from end to end,

00:31:50   and try to trap them.

00:31:52   Don't just sit there and wait for them to shoot

00:31:54   like in a typical basketball game.

00:31:56   Do whatever we can on defense.

00:31:58   Double team whoever has the ball.

00:31:59   Gamble everything.

00:32:00   Gamble everything on every play defensively.

00:32:03   And as soon as we get the ball, no matter what we do,

00:32:06   Try to get a shot off as quickly as we can.

00:32:09   Don't dribble down the court, right?

00:32:10   If you're not a sports fan, but you know basketball,

00:32:13   you know, the other team scores, they give it to a guy

00:32:16   and he dribbles the ball down the court

00:32:18   and then they'd start passing it around

00:32:19   until somebody gets a shot.

00:32:21   Loyola Marymount, there was no dribbling down the court.

00:32:23   You always, you passed it down the court.

00:32:25   - Yeah, yeah.

00:32:27   - And they would--

00:32:28   - Also, it's a super fast game that way.

00:32:29   - Right, but it felt like,

00:32:31   and they never won the NCAA championship,

00:32:33   but they were very good and they did go far.

00:32:36   There was a terrible tragedy, actually, where there was a guy named...

00:32:39   I mean, I hate to bring it all down, but there was a player on their team who,

00:32:43   in their best season, at the peak of their powers... a guy named Hank Gathers actually died of a heart attack at practice.

00:32:49   Oh, I remember that. That was one of those first prominent ones where people with an unknown heart condition... Yeah, exactly.

00:32:55   Yeah, I remember that

00:32:58   Yeah, I didn't know the team, but I remember him, yeah. And

00:33:04   Well, I forget his teammate's name, but there were two guys on the team who were better than everybody else, and his teammate...

00:33:11   I'll try to get it for the show notes...

00:33:13   but, like, then they had to go to the NCAA tournament, and

00:33:17   in honor of him... the guy was right-handed, but Hank Gathers was left-handed,

00:33:22   so his teammate decided he would shoot all his free throws left-handed.

00:33:25   Which sounds like a recipe for, like... hey, you know, maybe wear an armband or something.

00:33:30   Yeah, but he shot like 85% from the free throw line left-handed. He's that good. But anyway.

00:33:37   anyway

00:33:39   It just looked like you'd be watching ESPN and it just seemed crazy like

00:33:43   175 points or something like that. It doesn't happen and

00:33:47   Holzhauer's Jeopardy totals just seemed like that, it just seemed like that.

00:33:53   You're not supposed to you're not supposed to score

00:33:58   thousand dollars in a day in Jeopardy.

00:34:00   Or 133,000.

00:34:03   And I remember watching that episode and just, you know, my brain is melting because, I mean,

00:34:09   so Roger Craig had the previous one day total.

00:34:11   I mean, Ken Jennings had won, I think, 75 grand in one day.

00:34:15   And that had been a little bit of an outlier for Ken.

00:34:17   Then Roger Craig, who is one of us, he's a computer programmer, I think database specialist,

00:34:24   And to prep for Jeopardy, he sucks down the J! Archive, which is a user-created, constantly

00:34:30   updated database of every Jeopardy clue and game and statistics.

00:34:34   So I can find all my games there.

00:34:36   People can. So he sucked it down, you know, which is fine.

00:34:39   And he wrote software to to relentlessly train himself on all the common Jeopardy clues.

00:34:46   So he went through and just was like, you know, slashing and burning.

00:34:50   He knew a lot of information and he bet super aggressively, but he only won I think

00:34:54   nine games, because he bet it all and lost, you know. He... and James Holzhauer...

00:35:00   Holzhauer had

00:35:02   the knowledge. Like, that guy... I cannot believe

00:35:04   how quickly that guy knows the answer to every goddamn thing in the world.

00:35:09   Yeah, like I don't think I've ever met anybody who just has that much packed in his head. Yeah

00:35:14   By the way, update here: the producers have told me that

00:35:18   Hank Gathers' teammate was Bo Kimble. That was the fellow's name, who shot left-handed. Anyway, I'll put it in the show notes.

00:35:24   There's an article speculating how they would play the game. Yeah, so while Holzhauer was playing, by the way,

00:35:29   I didn't know anything about him yet. And there are some secret corners of the internet in which Jeopardy contestants talk to each other.

00:35:35   There are corners for everything. And so a friend of mine tells me, hey, I got cast on the show... a guy,

00:35:41   you know, in Seattle. Says, I'm going down to tape. And

00:35:43   He comes back, and I'm like, I know, I'm not gonna ask you how it went.

00:35:46   He said he said I had a great time or whatever. I'm like, oh, okay

00:35:48   So the show airs. It turns out he was playing in the third game that Holzhauer played,

00:35:54   and he made an incredibly strong showing, and

00:35:58   he and Holzhauer were contending neck and neck, and my friend wound up with like

00:36:02   17,000 bucks at the end of double jeopardy

00:36:06   But Holzhauer had gotten, like, I think two Double Jeopardy... two daily doubles.

00:36:11   So Holzhauer finishes up with like forty-two thousand dollars.

00:36:13   But, you know, my friend had... like, they were neck and neck, and in any other game...

00:36:17   My friend was playing at, like, a master level, and he was playing against the grandmaster, and my friend would have won,

00:36:23   you know, five or ten or fifteen shows, I think, if he hadn't been playing against Holzhauer. That's unbelievable.

00:36:28   Talk about bad luck. Well, the other weird thing about Holzhauer is that

00:36:33   he apparently a part of his

00:36:36   Preparation was that he

00:36:40   He apparently really studied where they put the daily doubles because they're not completely random.

00:36:48   And he strategically would play for them.

00:36:53   And I guess what he would do is avoid them at first to build up a bit of money so that he'd have something to wager.

00:37:00   And then strategically hunt for the daily doubles by jumping all over the board.

and he would just tend to push it all in and just try to double his

00:37:11   money and build these insurmountable leads you know and there were times when

00:37:15   the matches they were obviously over you know 20 minutes into the show yeah

00:37:19   because he'd already hit two of the daily doubles doubled his money both

times, and there wasn't enough money on the board for anybody to catch up. The basic

00:37:33   math is that to have a chance to win, you have to have at least half of

00:37:36   what the person in first place has. So if the person in first place already has

00:37:41   $30,000, you need at least $15,000 to have a chance, because otherwise, you know,

00:37:47   the person in front is not going to bet enough... you know, unless they're a

00:37:52   maniac, they're not gonna bet enough to risk losing the game on Final Jeopardy. Yeah.
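That "at least half" cutoff is simple arithmetic; here's a quick sketch, using the $30,000 figure from the conversation as the example:

```python
def is_runaway(leader_score, chaser_score):
    """Going into Final Jeopardy, the game is locked ('a runaway') when
    even an all-in double by the chaser can't pass a leader who wagers $0."""
    return leader_score > 2 * chaser_score

# With $30,000 on top, $15,000 can still force a tie by doubling up,
# but anything under half leaves no path to win:
assert not is_runaway(30000, 15000)
assert is_runaway(30000, 14999)
```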

Yeah. So the amazing part is, Holzhauer was going to Final Jeopardy with so

00:38:00   much money that he could bet, and still win if he was wrong, and still get all the way, you know,

00:38:06   to these $133,000 totals. I mean, a typical game of Jeopardy! finishes with the winner getting

somewhere between, I don't know, like twelve and thirty thousand dollars. Geoff Duncan, longtime

00:38:16   technology writer, musician, another Seattle guy, he was also on. He did not play James

00:38:22   Holzhauer. He won over thirty thousand dollars. He only won one game but he

00:38:28   blew through that and won like 31 grand in one day.

00:38:31   And we're all like, "Wow, it was great!"

00:38:33   What is up with TidBITS contributors

00:38:35   and winning on Jeopardy?

00:38:36   I know, we now have the highest

00:38:38   percentage of any publication's members.

00:38:42   It is funny though.

00:38:43   I, if you go, if I go to certain kinds of events

00:38:47   that are for like quirky or artistic people

00:38:49   or certain kinds of tech things, there are always,

00:38:52   I went to a Kickstarter thing,

00:38:53   and I think there were two other Jeopardy contestants,

including another winner, who were among the attendees. It's just

00:39:00   a kind of personality.

00:39:01   So anyway, they held this tournament, the greatest of all

00:39:04   time with those three fellows, Ken Jennings, who won the most

episodes; Brad Rutter, who had won the most total money over

00:39:11   the years; and Holzhauer, who had the single-game records.

00:39:16   And they created a primetime special and the rules, I thought

00:39:21   the rules were very well designed. The idea was that

each episode would be two Jeopardy matches. So instead of a 30-minute, you know, one

00:41:30   time through Jeopardy, it was two. And those two Jeopardy matches combined, you know, whatever

00:41:38   you finished with, that would give you the winner of the day. And the first person

00:39:43   who won three days would be crowned the greatest of all time. And so it was indeterminate how

long the tournament would last. In theory it could have lasted seven matches, if

00:39:54   they'd all won twice, and then the last one would be, you know, would be when

00:39:59   whoever won would hit three. Or it could have lasted only three, you know.

And my fear was that... and you asked me before it started, or maybe it had started,

00:40:09   but I told you I hadn't started watching it. And by the way, if you want to watch,

00:40:14   You could pause the podcast now

00:40:16   Because we haven't spoiled who won yet

00:40:19   It is totally worth watching if you really want to you could pause this show right now

00:40:25   It'll it'll sit there in your podcast player and you could take a couple of nights and watch it

They're all on Hulu. It's on Hulu. It is on Hulu. So anybody who has Hulu can get those episodes.

00:40:37   It is really good. And so if if you've been listening to me and Glenn and you think this sounds great

00:40:42   I would like to watch it before we tell you who won

00:40:46   You could pause this podcast right now and go watch on Hulu over the next

00:40:51   Couple of nights or whenever you have time to watch Hulu. It's a lot of fun

00:40:55   but

00:40:58   from now forward

we'll reveal spoilers, so, as we say, fire off the spoiler horn.

00:41:04   All right

You asked me who I was rooting for, and I was rooting for Ken Jennings, because... I'm not a daily

00:41:11   Jeopardy! watcher. I don't watch all the time. I watched a lot more when I was younger and when I lived, you know...

00:41:16   I mean really young, I mean, like, geez, you know, living at home.

00:41:20   So, you know, we're talking like 30 years ago.

00:41:23   I

But I did watch Jennings during his streak, and then once Holzhauer started,

00:41:31   I started watching, and I was fascinated. And it makes sense to me that Holzhauer is, you know...

00:41:35   his profession is that he's apparently a professional gambler,

00:41:40   Which I know a lot of people think is like a myth that there are people who could like live in Vegas and be professional

00:41:46   gamblers

Well, the blackjack angle is probably... it's really tough.

00:41:53   I mean, you have to have a big stake is the thing, a big stake, and you get marginal returns.

00:41:57   Yeah, if you know how to play it right, you don't make a ton of money. But poker, poker is real.

00:42:02   I forget, does Holzhauer play poker, or does he only bet sports? I know he says he bets sports, and sports is tough.

I think he does mostly sports. I think he's generally pretty good at it. As you can tell, he's undeterred.

00:42:14   I mean, I think you have to be yeah

00:42:16   I think you have to pursue a strategy if you're a professional right you pursue a strategy and you don't second-guess yourself all the

00:42:22   time because you're trying to get a certain level of

00:42:24   Consistent return and it might only be like 3% right like a grocery store or something

But, yeah, comped rooms... you get other kinds of free things along the way besides money.

Yeah, the basic idea, when you bet pro sports, is you have to bet a hundred and ten dollars to win a hundred.

00:42:41   And that 10% difference is the vig. So if you bet a hundred times,

00:42:47   and you bet $100 each time, you win $100...

00:42:50   or if you win, you know...

00:42:53   if it's 50/50, you just lose

00:42:56   $110 every time you lose and win $100 every time you win, and that 10% eventually kills you. You have to win...

00:43:03   that's if you win 50% of the time. You have to win, I think it's

00:43:07   52.4% of the time, to break even. Really, it doesn't sound that much higher than 50%,
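The break-even rate for a standard -110 line falls out of a one-line expected-value calculation; here's a quick sanity check, not something from the show:

```python
# Standard American sports line: risk $110 to win $100.
risk, to_win = 110, 100

# Break even when expected value per bet is zero:
#   p * to_win - (1 - p) * risk = 0   =>   p = risk / (risk + to_win)
break_even = risk / (risk + to_win)
print(f"{break_even:.1%}")  # 52.4%
```

So a bettor who wins only a coin flip's worth of games bleeds the vig on every loss; the whole profession lives in the narrow band above that 52.4% line.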

00:43:15   Which is why people like me love to bet on sports

00:43:20   but in practice

They're dastardly good... the people who set the betting lines, they're dastardly good at

00:43:30   Setting them in a way that more or less makes it 50/50, you know

They're not trying to... Vegas isn't trying to beat everybody all the time. That's impossible.

00:43:39   All they want is for half the people to win and half the people to lose and they just collect that 10%

And it's a lot of money. But, you know,

00:43:46   if you're good enough to pick the games where the line is

00:43:52   mathematically off, where you see that, you know...

For the Super Bowl, the Kansas City Chiefs, last I checked, are currently favored by

00:44:02   1.5 points over the San Francisco 49ers. If you've got, like, a mathematical model for

how... you know, based on the season stats,

00:44:12   you know, probably geared towards their more recent games and their more recent results, and

00:44:18   you also follow along and you know who's injured, et cetera,

00:44:22   and you can, you know, figure out that the line really ought to be that Kansas City, or that San Francisco, should be favored by

00:44:30   one point... well, then you realize it's a really good bet,

mathematically, to bet on San Francisco. That doesn't mean your bet is going to win.

00:44:37   It just means, you know... and if you do that more often than not, maybe you can win

00:44:43   57% of the time and stay ahead of the vig, something like that.

Definitely possible. And you watch Holzhauer play Jeopardy, and I believe it.

00:44:52   But poker... poker is the other game where you can easily be a professional gambler,

00:44:56   Because you're not playing against the house. You're playing against other players and

00:45:00   Yeah, they just take a small cut out of the pot out of every pot and that's it and you just have to be both

00:45:08   Better than other players and better enough than the typical other players to compensate for the you know

00:45:15   Like three or four percent that the house takes out of every pot
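That rake works just like the sportsbook's vig, only smaller. A rough model of the edge it demands, with the three-to-four-percent figure from the conversation plugged in (a sketch, not a real poker economics model):

```python
def edge_to_break_even(rake):
    """If the house removes the fraction `rake` from every pot you win,
    you must win gross pots worth 1 / (1 - rake) of the money you put in
    just to break even; the excess is the edge needed over the table."""
    return 1 / (1 - rake) - 1

# A 4% rake demands roughly a 4.2% gross edge before any actual profit;
# a 3% rake, roughly 3.1%.
assert round(edge_to_break_even(0.04), 3) == 0.042
assert round(edge_to_break_even(0.03), 3) == 0.031
```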

00:45:20   Definitely a lot of professional poker players. Anyway, he

00:45:25   definitely played like a gambler, not like a quiz show

00:45:28   contestant.

Yeah, he was just incredible. I mean, basically he was

00:45:34   like a machine. He was relentless. He's not very

00:45:38   emotional, because he's in work mode, right? So people gave him

there was some criticism, like, boy, he seems like, like

00:45:45   an uncaring person or whatever. And I'd be like, well, no, he's

00:45:49   got a game face. And this is a job. He's come in here thinking about this. And he kind of

00:45:53   talked about it that way. His plan was to win as much money as he could. And he did

00:45:57   and not that he didn't have a good time, I don't think but you know, the way he talks

about his family... he kept putting in birth dates as the wagers; he was writing in messages.

You know, he talked about his grandmother, a Japanese immigrant who didn't speak much

00:46:12   English, and how he promised her he'd go on the show, and, you know, they watched it

00:46:16   together. Like, you know, he's actually I think, a very sweet

00:46:18   guy. Yeah, who comes across as many people like us in this

00:46:23   technology world. Sometimes we come across as a little in our

00:46:26   heads. And I was very sympathetic with the way he came

00:46:30   off. But he really it was a job and he came in every day and he

00:46:33   just went bam, bam, bam, bam. And if he lost, it would be

00:46:36   like, whatever, because he just he just was so good at racking

00:46:41   up the money. He'd already won all the money you could possibly

00:46:43   want. So at that point, I mean, I think he talked about that a

00:46:46   little bit. It's like, how do you go all in? When you can lose

00:46:49   everything? It's like, well, if you no longer care, if it's a

wager, and you're not looking at it as dollars, it's like a one or

00:46:54   zero, it's not a, you know, $50,000. Because it's fake

00:46:58   money. It's not real money until you take it home. Right. And I

00:47:00   think and I think he had played that strategy better than anyone

00:47:04   has ever been able to play.

Yeah. So, my fear... I was rooting for Ken Jennings, not strongly,

00:47:09   because they're all interesting people, and they all seem

00:47:12   very likable. And it's just such an interesting premise. But if I had to say I was rooting for

00:47:17   someone in advance, it was Ken Jennings. But my fear was that Holzhauer was just going to

win every, you know... just clean the plate. It just seemed like there was a chance there that

00:47:28   he's just unstoppable and that he's too good on the buzzer. Even against someone like Ken Jennings,

00:47:34   who, you know, obviously, like you said, has tremendous buzzer experience. But I think what

How did it go? So it was Ken Jennings won first.

00:47:45   Yeah, by like 200 points for the game.

00:47:47   Yes. Right.

00:47:49   Two games. Right.

00:47:49   And it was like, oh, it's going to be like that.

00:47:51   And then James won the second game by a pretty good margin.

00:47:55   Right. And then Ken kind of demolished James in the third episode.

00:48:01   And then the fourth, it was kind of the same thing.

00:48:03   Like there were a couple tricky questions in the fourth where Ken could have lost practically everything and really tanked.

James would have won by a huge amount, but Ken got the answer correct on some big bets. Yeah.

And he said later, he said, if I'm playing against James Holzhauer, I have to play James

00:48:19   Holzhauer style. So he would go all in, in a way that Ken typically hadn't in earlier play. Right.

And it worked for him. Yeah. And Ken didn't seem to have as much success in previous,

00:48:30   like, Tournament of Champions things, compared to Brad Rutter. And Rutter got more or less swept.

00:48:35   He was really not competitive in any of the four days. So anyway, Ken Jennings was the winner.

00:48:42   He won the first third and fourth

00:48:44   But on the second Holzhauer creamed everybody

And a couple of Jennings' wins really came down to, like, getting one answer correct,

00:48:55   push, and...

00:48:58   Alex even...

00:49:00   Alex Trebek even called him on it, where on a lot of his answers

00:49:03   he was really answering like a question, like... and

00:49:07   it... you know, Ken Jennings has been on the show enough times that Trebek

00:49:12   knows that he's not, like, doing it for dramatic effect; he was making his best guess on a lot, and got them

Right. Yeah, like the nullification one.

00:49:23   Yeah, yeah, states' rights. And I was like... we're all, you know...

00:49:27   I think that all four of us in the family were watching and you know, the kids are actively studying some of this stuff

00:49:31   So it's a little fresher to them than my wife and I.

00:49:33   And we're all like,

00:49:34   "Is it? Was that this or that? Was it…?"

But like, I think maybe my older kid said,

00:49:38   "Nollification?"

I forget, somebody in the house, I think, said it.

And Jennings said,

00:49:42   "Nollification?"

00:49:43   We're all like, "Oh it doesn't sound right."

00:49:44   And it's like, "Yep, that was it." Like, "Oh my God."

00:49:46   But it was great TV and it had huge ratings too.

00:49:50   It was one of the,

00:49:51   I think the top rated show in its slot.

00:49:54   It did very well. All four nights.

00:49:56   It was a great idea.

00:49:57   It was fun. It was fun.

Now, Brad... I wrote a piece; Motherboard asked me if I'd write something up after Game 3, before Game 4, and I did some statistical analysis.

00:50:07   And Brad Rutter was only answering...

00:50:10   he only buzzed in half as frequently as the other two, to start with.

00:50:14   He was wrong about 11 percent of the time.

00:50:17   And Ken was wrong slightly more than...

I'm sorry, James was wrong slightly more than Ken.

00:50:23   And Ken was right on every daily double he got in the first three games.

So you could see, just from the standpoint of play over three games, it looked like Ken was playing just... I mean,

00:50:34   this is kind of the, the gambling thing: Ken was playing just slightly better, not

00:50:39   crazily better, but just enough, and that edge meant that every time he won on a daily double he was able to

00:50:45   leverage it ahead. I don't know how old they all are. Oh...

Ken... I forget the exact ages. They're about 10 years apart in age.

00:50:55   Holzhauer is the youngest, in his 30s; Brad Rutter is about five years older; and Ken's about five years older still.

00:51:01   And I think there's about a ten-year age spread, and Ken is the oldest.

00:51:04   And he was talking about, you know, kind of hanging up his spurs, like, I'm an old man,

00:51:09   and being 43 or something, 45, I don't know.

00:51:13   He said, I don't know if I'll do it... and now he's said he's not gonna play anymore.

00:51:15   Like, he said, this was... I can't top this. I'm going out, you know, handing it over to the next. Thanks.

Rutter even said, during one of the, hey,

00:51:24   let's-talk-to-the-champions things, that he could tell that he's lost a step since his, you know, Jeopardy prime,

00:51:30   15 years ago, or I guess close to... I don't know, 20 years ago, whenever it was.

Yeah, the tournaments. I mean, he's played tournaments more recently, but the tournament play is different, because you're beating...

00:51:40   I mean, even though you're going head-to-head against someone like Ken Jennings, you kind of work your way up.

00:51:44   So you're not playing every game against the best people in the world, or the two other best.

00:51:48   I mean, the way I look at it is, Brad Rutter had never played against the two best players in Jeopardy history, only one

00:51:54   at a time, not against two of them. And, you know, I couldn't win a single game today. I mean, I was stumped, like, the entire first

00:52:01   game board of the first game of this run. I was like, what? Oh my god.

00:52:07   And the second I was better but like I couldn't I couldn't win a game today. I just don't I don't have the speed

00:52:12   Yeah, or necessarily the recall

Yeah, and the thing that really shows how smart these guys are, and how good they are, were the couple of days they had

00:52:20   the categories where it's like

some kind of, like, two different things, and you have to put them together into a pun. Oh my god, that one.

00:52:28   Yeah, that was hilarious

I think I could get some of those if you gave me a half an hour, and for one of them it would be, boom...

00:52:35   they're like, buzz, that's it.

How does your brain work? Anyway, I had a great time watching.

00:52:41   alright, let me take a break here and

and tell everybody about Clear.

00:52:47   Clear makes your life safer, simpler, and more secure.

00:52:52   With Clear, your eyes and your fingertips

00:52:54   get you through security faster

00:52:56   at airports, stadiums, and other venues.

00:52:59   Never run to your gate again.

00:53:01   Clear helps you get through security

00:53:04   with the tap of a finger

00:53:05   so you can get to your gate faster

00:53:07   and reduce pre-flight stress.

00:53:10   You are your ID.

00:53:12   Start getting through security with a tap.

00:53:16   Clear replaces the need for physical ID cards,

00:53:18   using your eyes and fingertips to get you through security

00:53:22   because you, your biometrics, are the best ID out there.

00:53:27   Reminds me a lot of 2001.

00:53:29   I just went to see it again last year

00:53:30   when it was released into theaters.

00:53:32   I had forced my son to wait.

00:53:35   He'd never seen the movie before.

00:53:36   And Heywood Floyd, top security expert

00:53:39   for the US government when he gets to the moon base.

00:53:41   Clavius, how does he get through security?

00:53:44   With his biometrics, a little retinal scan,

00:53:47   looks through his eyes.

00:53:48   Well, that's what clear is like.

00:53:49   It's just like 2001 has finally caught up with us.

00:53:52   You go to the airport, you sail right through security.

00:53:55   What you do, here's how you do it.

00:53:57   You create your account online

00:53:59   before you go to the airport.

00:54:00   Once you get to the airport, go to the line for Clear,

00:54:04   and a clear ambassador will be there

00:54:06   to help you finish the process of signing up,

00:54:08   and then you can immediately use Clear.

00:54:12   Clear helps you get through security faster

00:54:13   in 65 plus airports, but it's not just airports.

00:54:17   Stadiums and other venues too.

00:54:19   A whole bunch of stadiums, Major League Baseball,

00:54:22   NFL Football, these leagues now have policies

00:54:25   that implement airport style security

00:54:27   before you get in a stadium.

00:54:29   It's a big pain in the butt, but guess what?

00:54:33   If they have Clear, you sail right through,

00:54:35   just like you did in the old days

00:54:36   before there was onerous security like that at stadiums.

00:54:39   And they have family plans.

00:54:42   If you're traveling with your family,

00:54:43   up to three family members can be added

00:54:45   at a discounted rate.

00:54:46   And kids under 18 are free when traveling

00:54:50   with just one clear member.

00:54:53   Totally free if you have kids under 18.

00:54:55   Clear is the absolute best way

00:54:57   to get through airport security.

00:54:59   And it works great with TSA PreCheck too.

00:55:03   Right now, listeners of the talk show

00:55:05   can get their first two months of clear absolutely free.

00:55:10   Go to clearme.com, C-L-E-A-R-M-E.com, clearme.com/talkshow

00:55:15   and use that code talk show.

00:55:21   That's clearme.com/talkshow with code talk show

00:55:26   and you will get two free months of Clear.

00:55:29   Go sign up, sail through the airport

00:55:31   like you never have before.

00:55:33   My thanks to Clear for sponsoring The Talk Show. Next topic.

00:55:37   What do you think? iPhone encryption never, never gets old. Does it?

00:55:40   I've been writing about it a lot. I, I really,

00:55:44   really feel like it's the sort of thing that needs to be nipped in the bud.

00:55:49   Like it's not worth waiting until it,

00:55:52   if to me the DOJ gets its way and gets legislation proposed to limit it.

00:56:00   Like, it really needs to be nipped in the bud as a bad idea.

00:56:06   I think people who listen to the show and follow my site

00:56:09   know the basic story.

00:56:10   But basically, there's this Pensacola shooter, this guy

00:56:14   who shot up a Navy base.

00:56:16   And he had two iPhones and shot a couple of people.

00:56:21   And he's dead now.

00:56:25   But the federal government would like to get access to his phones.

00:56:28   And Apple has provided as much information from his iCloud

00:56:34   accounts as they could, apparently measuring in the

00:56:37   gigabytes. But Attorney General William Barr is raising hay

00:56:43   that it's not right, according to him, that the

00:56:47   FBI, that law enforcement officials, can't just, quote unquote,

00:56:50   "get into" iPhones, you know, when they have a court

00:56:56   order, etc., etc. And Apple's answer is more or less we can't,

00:57:00   we can't help you. Although they haven't definitively said,

00:57:04   can't, I guess, you know. And, and there's some ambiguity over

00:57:10   that. And I feel like I've, I've over the years, I've lost track

00:57:14   of exactly how this works. And what literally what we know can

00:57:18   and can't be done. You know what I mean?

00:57:21   Well, yeah, because the fundamental thing is that,

00:57:26   starting with, I mean, the reason you've lost track is the 5s, starting with when Apple built the Secure Enclave

00:57:34   in as a separate chip into the iPhone architecture, they effectively abrogated their ability to

00:57:41   extract information, like the passcode, right, that would decrypt the phone. Because they

00:57:47   designed the system around pushing all of the generation of key material that would allow

00:57:54   decryption, or access, onto each device. So the issue is, and I think this came up with the San Bernardino

00:58:02   shooters a few years ago, and it previously came up, is Apple had said, we'd have to create what they

00:58:08   would call a government OS; we'd have to create a special OS and be able to load it on the machine. And

00:58:12   this OS would allow more rapid calculation; it would bypass the limits on lockouts

00:58:22   that Apple started to add, you know, alongside the Secure Enclave. So there's

00:58:29   like multiple issues. One is, the current iOS and iPadOS hardware architecture is designed

00:58:35   to prevent rapid-fire, brute-force cracking. So you can't just enter a bunch of different

00:58:41   passcodes. So there was a Gray... not GrayBar... GrayLock? Yeah, that's the one, right. They and

00:58:48   maybe some other folks had ostensibly, allegedly, developed systems

00:58:51   Grayshift. It's Grayshift, and their product is GrayKey. GrayKey is the little, sort of like, Mac

00:59:00   mini sized (that's it, thank you) device. Yeah, Cellebrite and Grayshift are the companies, at least the two,

00:59:07   that claim to be able to break in. And they seem to have been able to bypass hardware and software,

00:59:13   well, I guess it has to be baked-in hardware protections, and protections in the certified OS,

00:59:19   in a way that would allow them to then do brute-force cracking and crack shorter keys.

00:59:24   And as you know, as the audience I'm sure knows, the longer key is, or the more complicated it is,

00:59:29   you know, a four digit key can be solved very quickly, a pin.

00:59:32   Six digit takes quite a bit longer.

00:59:34   If you do something like mine, a 20-character mixed alphanumeric,

00:59:39   the heat death of the universe couldn't come too soon.

00:59:41   But you know, really, it's years or something to go through all the iterations

00:59:45   as rapidly as the software could take them with no throttle in place.

00:59:49   So Apple said, hey, in the San Bernardino case, we're not going to make a government OS, we don't want to be compelled to do this, that would bypass the brute-force cracking limits, is my recollection, right? And then the other part was, we can't give you the key, because we built our system so that we don't have access to it. It's locked away. You'd have to break open the Secure Enclave in, like, a clean lab, and even then the chip would probably self-destruct. I mean, they didn't say that, but that's the way these chips are built. You shouldn't be able to extract information from them, even

01:00:19   with like scanning electron microscopes, right? The issue

01:00:22   today?

01:00:23   Well, I think if my understanding is correct, and

01:00:27   this is complicated enough where I feel like I'm uncertain, but

01:00:30   tell me if you agree, I think the best way to think about it is

01:00:33   that there's two, I don't want to say operating systems, let's

01:00:38   just say systems on an iPhone, and it just goes for iPad two.

01:00:42   There's iOS, which is what is the traditional operating system

01:00:48   that runs the device as a computer as we know it.

01:00:53   I think with the CPU and RAM and its storage

01:00:57   and input output and USB support

01:01:01   through the lightning connector

01:01:03   or the USB-C port on an iPad Pro.

01:01:06   And it's an operating system that runs on a computer

01:01:09   and Apple does its best to keep that secure.

01:01:13   They also have at a hardware level,

01:01:17   I believe it is true that that an iOS device will not accept an operating system that has not been signed by Apple.

01:01:28   That that's correct. And there was this jailbreak, if you jailbreak a device, you're essentially figuring out a way to bypass to take a modifying, take a bug and exploit the bug to run arbitrary code.

01:01:40   And then that code does something to defeat Apple, but but you can't really create your own

01:01:46   version of

01:01:49   iOS and get it to install on an iOS device without having it be signed by Apple and

01:01:54   When I say signed by Apple there was a story

01:01:57   I was looking for today and couldn't find it off to see if I can find it

01:02:01   but there was a story a couple of years ago that

01:02:03   to sign a

01:02:06   release of iOS it is it really is like something out of a Mission Impossible

01:02:12   Movie, you know, there was a question about whether people were trying to subvert the engineers at Apple

01:02:18   There's only a few people at Apple

01:02:19   It's a very small number who have like the fingerprint and retinal and whatever stuff that allows them to access

01:02:26   The thing that allows them to sign right the operating system

01:02:30   And so if you subvert those people could you then conceivably right?

01:02:34   So even who they are is kept secret and that was a concern I think you know and it you know

01:02:40   You think like I would you know, it's nothing really works like that where there's a secure underground vault

01:02:45   with with locked doors and guards and you know, you know like

01:02:51   You know and people believe it for like a nuclear submarine were to two members of the crew

01:02:58   More than what is possibly arms distance away, you know, like let's say at least

01:03:03   ten feet away have to put keys that they keep with them around their neck 24

01:03:10   hours a day and simultaneously turn the two keys so two officers have to turn

01:03:16   keys that couldn't possibly be done you know just physically by by one person

01:03:22   there's a system like that in place where it like to Apple employees have to

01:03:27   use physical objects I don't know if they're keys they're probably you know

01:03:31   a more complicated than like, you know, a physical key, but

01:03:35   yeah, it's like, it's an encrypted, it's a device itself

01:03:38   that contains encrypted information that has to be

01:03:40   inserted, right. And they both have to do it in a room and can

01:03:44   only be done in a room whose location Apple literally won't

01:03:48   won't disclose the actual location. I mean, presumably

01:03:52   it's somewhere in Cupertino on one of Apple's campuses, you

01:03:56   know, but who knows? I mean, just for practicality sake, you

01:04:00   know, you would assume it's somewhere in Cupertino, and

01:04:03   perhaps they've moved it from the old Infinite Loop campus to

01:04:06   the new one. But I honestly believe it's actually

01:04:09   underground. Only a handful of Apple employees have access,

01:04:14   there's obviously some kind of plan in place for what happens

01:04:16   if one of the people involved, you know, it proverbially gets

01:04:21   hit by a bus. But it really is that difficult. It,

01:04:26   you know, really sounds like something out of a Mission

01:04:28   Impossible or James Bond movie or something like that. So, just to get a version of iOS

01:04:33   that these devices will accept: at a hardware level, they really won't install it unless they can

01:04:38   check the signature.

01:04:39   This is a real thing in a there's a bunch of there's a bunch of industries, there's a bunch of

01:04:45   kinds of things around the world that are like this. I mean, this is one of the most critical

01:04:49   because it's device specific. And I assume Google has a very similar Yeah, thing. I remember a few

01:04:55   years ago, I had to look it up while you're talking because I want to remember the detail is there was

01:04:59   a ceremony in which to sign the root for DNS. So you know how there's all these roots, .com and

01:05:07   whatever different domains, but there's also root servers that are the root of authority for all of

01:05:13   DNS. There's a bunch of them run redundantly around the world. Well, in order to sign the root

01:05:17   certificate to start the process to get a secure and encrypted form of DNS underway, they had to

01:05:24   get a bunch of people together. They all got together and they all had to be in the same

01:05:28   physical place to sign these keys with the specific cryptographic hardware that they

01:05:36   owned. And I was talking to the folks at Let's Encrypt, who offer free SSL

01:05:44   certificates, or HTTPS certificates, for websites; I was talking to their CEO last year. And he said one

01:05:50   of the most expensive things they own is a device that is designed to protect their root. And, you

01:05:57   know, it's tamper resistant, it's got various safeguards, it's in a secure facility that has to

01:06:02   meet certain international standards. And this is real, it's not casual. Now, I think,

01:06:07   years ago, it was much more casual, and now it is certainly not. Right. In

01:06:12   addition to the fact that the iOS devices at a hardware level will not accept a version of iOS

01:06:20   that is not cryptographically, provably signed by Apple.

01:06:23   I also believe and I think you would know, correct me if I'm wrong,

01:06:28   that you cannot

01:06:30   update an iOS device offline.

01:06:36   It must be online

01:06:40   to call back to Apple,

01:06:46   over SSL, securely, and make sure that Apple is still agreeing to sign this OS. So that if

01:06:53   there's a known bad version of iOS, if Apple releases a version of iOS and then after it's

01:07:00   out realizes we know that this version is compromised in some way by a bug, we're going

01:07:06   to revoke the signing certificate for that. And then there's no way to get that version of iOS

01:07:12   onto a device because it still before it'll go it has to call home to Apple securely and

01:07:19   Apple has to say yes, that is still a valid signed version of iOS.
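[Editor's note: the revocation flow described above can be sketched as a toy model. This is purely illustrative, not Apple's actual protocol, which uses cryptographic signatures and a personalization server rather than a hash lookup; the build names and image bytes here are made up.]

```python
import hashlib

# Toy model of "verify, then phone home" (NOT Apple's real protocol):
# the device hashes the downloaded image, then asks a trusted server
# whether that exact build is still authorized, so a known-bad release
# can be refused even after it has been downloaded.
AUTHORIZED_BUILDS = {  # server-side table; an entry is removed on revocation
    "ios-13.3": hashlib.sha256(b"ios-13.3-image").hexdigest(),
}

def server_still_authorizes(build: str, digest: str) -> bool:
    """The 'call home' step: is this exact image still signed and valid?"""
    return AUTHORIZED_BUILDS.get(build) == digest

def device_install(build: str, image: bytes) -> str:
    digest = hashlib.sha256(image).hexdigest()
    if server_still_authorizes(build, digest):
        return "installing"
    return "refused: build unknown or signing revoked"

print(device_install("ios-13.3", b"ios-13.3-image"))  # installing
print(device_install("ios-13.3", b"tampered-image"))  # refused: build unknown or signing revoked
```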

01:07:23   No, you're totally right.

01:07:25   And I don't think I thought about it that way.

01:07:26   But it's correct.

01:07:27   Mac OS, you can still update offline with an installer that's signed and so forth, right?

01:07:32   But there are revocation processes.

01:07:34   But I think if your Mac were not on the internet, I believe it's still feasible. I think so too.

01:07:38   Yeah, so whenever you see, if you're using iTunes to restore, well, an image, I think that's because Apple still acknowledges that there

01:07:45   are, there are

01:07:47   scenarios. Oh, yeah, where people have Macs that are in rooms where, you know,

01:07:53   internet access isn't allowed. Oh

01:07:55   Yeah, no that makes total. Yeah, that's that's right

01:07:58   And yeah, but so every iPhone and iPad when you install an update directly from the device

01:08:03   It has that verifying stage, which may seem mystifying, because didn't it just download it? It's like, yes, it did.

01:08:08   But a cracker could conceivably force a device to sideload. I mean, there's, you know, management tools.

01:08:15   Maybe they could get an update that's not legit.

01:08:18   So it still has to phone home.

01:08:20   All of this is to say that Apple takes the security of iOS, and the ability to only sign known good versions of

01:08:27   iOS, as seriously as they can. I don't know what more they can do other than to continue to hunt bugs and continue to look for better ways,

01:08:35   but it literally is to the best of their abilities to keep iOS as

01:08:40   secure as possible

01:08:42   At every step of the way that said there are known exploits in iOS

01:08:48   That allow people and especially older versions. What's the name of that one that came out in the bootloader?

01:08:54   Which is unfixable?

01:08:56   There's a low-level part of the operating system. It's in, this is only for older devices.

01:09:02   Yeah, it was fixed, and I think the last version affected was the iPhone X.

01:09:07   Checkm8.

01:09:11   C-H-E-C-K-M and then the number eight, because whoever found it, you know, is a hacker and names things that way.

01:09:20   but there's a

01:09:22   There's a bug in the bootloader of every iPhone up to the iPhone X generation, so I guess the iPhone 8 would be included,

01:09:31   which unfortunately compromises iOS in a way, and Apple can't fix it, because the bootloader is

01:09:38   ROM.

01:09:40   To exploit it, you know, you need physical proximity, physical access to the device.

01:09:46   Yeah, you do have to have physical access to the device

01:09:48   But anyway that said they take as much they do as much as they can to keep it secure

01:09:53   They fixed that bootloader issue in the iPhone

01:09:55   XS and 11.

01:09:59   But people can still get into iOS and Apple fixes those bugs when they as soon as they know about them

01:10:06   but getting into iOS does not get you into the data on the device because the

01:10:13   The encryption of the device is that second system I alluded to earlier

01:10:18   So there's iOS and it's not like Apple takes iOS security lightly

01:10:22   They take it as seriously as they can but then there's the secure enclave and the secure enclave

01:10:27   isn't just a secure storage device. It's a system. It is, in effect, its own computer.

01:10:35   and

01:10:39   Now, like, one of the things that they changed, I believe, is,

01:10:45   so, like, there's an 80 millisecond delay.

01:10:51   Anyway, when you type in your passcode, your password, passcode, whatever you have, to unlock your phone,

01:10:57   iOS takes the input. You tapping 1-2-3-4 on the keypad on the screen is obviously

01:11:09   handled by iOS, but then iOS takes that, passes it to the Secure Enclave, and says, here's a

01:11:16   guess at the passcode to the encryption. It's the secure

01:11:22   enclave that takes that guess and computes whether or not it's

01:11:29   valid. And Apple has designed it, I don't think originally, I

01:11:33   think it's one of the changes since the 5s. And I know that

01:11:35   the Secure Enclave, your guess was right, it definitely

01:11:38   coincides with the 5s, which is biometric authentication.

01:11:43   Because they offered encryption of, yes, because they offered, that's it. Before that, you could

01:11:48   encrypt your phone, but it was a slow process. Right. And that was added in some iOS. But it's

01:11:53   with the that's exactly what the fingerprint sensor, they very wisely. And I think this is,

01:11:57   you know, not to criticize other smartphone makers, but whatever. I think Apple, from the very

01:12:03   start said, we're not going to store any biometric information in a way that is retrievable in any

01:12:08   fashion. So it's all one way into secure enclave. And then over time, they've expanded that to

01:12:12   include like Apple pay and face ID and a bunch of other stuff. It's in and third party apps can throw

01:12:19   stuff into the Secure Enclave, right, as well; they have the same advantage. I don't know the inside story

01:12:24   of the timing of that. And you know, I would love to know it if anybody out there does know it. But

01:12:30   if it's true, that Apple, Apple, in theory might have had a touch ID fingerprint sensor

01:12:40   ready to go a year, two years before the five s. But there,

01:12:46   they were never going to use it until they also had the secure

01:12:50   enclave. So in theory, I don't know which took longer to

01:12:53   develop. Maybe they had this, I think it's quite possible that

01:12:57   they might have had the fingerprint sensor ready to go

01:13:01   before the secure enclave because I think if the secure

01:13:04   enclave was ready to go a year or two years earlier, they might

01:13:08   have included it even before they had biometric

01:13:11   authentication because it still would have been useful. Right?

01:13:13   It still would have helped increase it. I suspect that the

01:13:17   fingerprint sensor and touch ID that was introduced with the

01:13:20   five s was waiting for the secure enclave. It's also true

01:13:24   that the five s was the first to go 64 bit on the on the a series

01:13:28   chip and maybe that plays into it too. I don't know. But there

01:13:31   were a couple of things that all had to come together for that to

01:13:34   work. But I do know for a fact, talking to

01:13:38   people at Apple off the record, that there was no way they were

01:13:41   ever going to do the fingerprint sensor before they had the the

01:13:45   secure enclave, because there's just no way that they would

01:13:48   store any biometric information without it.

01:13:51   Right. I think that's exactly right is I would not be

01:13:53   surprised the fingerprint sensor was among the more advanced ones

01:13:56   at the time, it was more accurate use the you know, the

01:13:59   capacitance and so forth. But I don't think anything they did

01:14:02   was totally like so, you know, what's the right word?

01:14:07   Such a breakthrough that it wasn't,

01:14:09   you know, that they were waiting.

01:14:10   I'm sure they perfected it, they had more time,

01:14:12   but the Secure Enclave, that marked a huge change.

01:14:15   I mean, I was writing something for,

01:14:18   about the YubiKey and the, for Fast Company.

01:14:22   YubiKey finally was able to release a Lightning version

01:14:25   of its external authenticating device,

01:14:29   'cause Apple didn't support that until,

01:14:31   actually pretty recently in Safari for Mac OS

01:14:34   and then they added it to iOS,

01:14:36   but it seemed like it was coming to iOS.

01:14:37   So these guys did it.

01:14:39   But so the Yubikey is part of the,

01:14:41   it's essentially an external secure enclave at some level.

01:14:45   It gives you some of the same protections

01:14:47   that you get from a device-based secure chip in a token,

01:14:52   even though the token conceivably can be moved around.

01:14:56   And anyway, the thing is, I think as I was researching this,

01:15:00   I realized there had been this huge sea change

01:15:02   is before Secure Enclave,

01:15:04   the encryption was the top bar and verification,

01:15:08   and there might've been some hardware steps,

01:15:10   but they were work aroundable.

01:15:13   And then Secure Enclave set the bar that everyone,

01:15:16   Microsoft now encourages in many,

01:15:19   you can buy PCs, a large number of PCs

01:15:21   that have the equivalent of a Secure Enclave in it.

01:15:23   Google had to shift to it ultimately

01:15:25   so they certify devices with it.

01:15:28   and the web authentication standard that got promoted,

01:15:31   the whole FIDO2 standard is kind of built around

01:15:34   having a one-way chip that can only produce

01:15:37   sort of outputs based on inputs after it's been set up.

01:15:41   So anyway, it was a big and important change

01:15:44   and no one should want us to go backwards from that

01:15:47   'cause it protects people at every level,

01:15:49   protects vast amounts of personal and commercial information

01:15:52   and those people in government who rely on it

01:15:55   for security as well or as part of a security system.

01:15:57   I do believe it is true that the, call it firmware, and firmware seems like an outdated term to

01:16:03   me, you know, but let's call it the firmware that runs the secure enclave.

01:16:08   I believe that Apple in a software update to iOS can update the firmware on the secure

01:16:14   enclave.

01:16:15   So there's aspects of the secure enclave that could that's correct, that could get a bug

01:16:20   fix if necessary.

01:16:22   the

01:16:24   The 80 millisecond, and I always forget this, a millisecond is a thousandth of a second, so 80 milliseconds is

01:16:31   8/100 of a second, so

01:16:35   12 per second, give or take. Great. 12 times 8 is 9...

01:16:39   What is 12 times 8?

01:16:43   96.

01:16:46   I had to do my trick. My trick for 12 times 8 is 8 times 10 plus 8 times 2.

01:16:53   So that's 96.

01:16:55   So, you know, roughly 12 per second.

01:16:58   Which doesn't you know sounds pretty fast?

01:17:02   but that that 80 millisecond

01:17:05   Computation time for taking iOS saying to the secure enclave. Here's a password guess 1 2 3 4

01:17:13   Secure enclave takes it

01:17:21   does the computation. That computation runs in hardware, and that 80 milliseconds is fixed.

01:17:21   So there's no way that somebody could force a firmware update to the secure enclave to say hey

01:17:28   How about you reduce the time to zero, you know, how about you do it as fast as you can?

01:17:33   That can't be done. That's actually

01:17:36   implemented in hardware, and so 12 per second is the most guesses you can do. And so basically the way these products that we know

01:17:43   about work, Cellebrite, the Gray...

01:17:46   I keep forgetting which is which. GrayKey is the product

01:17:49   from Grayshift, the company. They jailbreak iOS

01:17:54   to get in so that they can talk to the secure enclave

01:17:59   because there's no way to talk to the secure enclave

01:18:01   from outside the iPhone.

01:18:03   And then once they're in, they install, they run software

01:18:08   and the software talks to the secure enclave

01:18:11   and starts running guesses as fast as it can.
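[Editor's note: to put numbers on "as fast as it can": the roughly 80 millisecond per-attempt cost discussed above caps any such tool. A back-of-the-envelope sketch:]

```python
# Sketch of the hardware throttle discussed above: each passcode check
# costs ~80 ms in the Secure Enclave, so even unthrottled guessing
# software tops out at 1000 / 80 = 12.5 guesses per second.
ATTEMPT_MS = 80  # approximate hardware-enforced cost of one check

def max_guesses_per_second(attempt_ms: int = ATTEMPT_MS) -> float:
    return 1000 / attempt_ms

def seconds_to_try(n_guesses: int) -> float:
    """Wall-clock time to run n_guesses back to back."""
    return n_guesses / max_guesses_per_second()

print(max_guesses_per_second())      # 12.5
print(seconds_to_try(10_000) / 60)   # all 10,000 four-digit PINs: ~13.3 minutes
```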

01:18:16   And then once they find it, then they,

01:18:20   like there was a story I linked to it a couple of days ago

01:18:23   on Daring Fireball where somebody leaked

01:18:25   to a security researcher two years ago,

01:18:28   how the gray key device works.

01:18:31   - Oh yeah, yeah.

01:18:32   - And they even had photographs.

01:18:34   They even had some pictures of the software running

01:18:36   and it runs and then when it finds the right passcode,

01:18:39   displays it on screen, and then you can unplug it from the device, and then, I

01:18:45   guess, restart the iPhone and then enter the passcode that the device tells

01:18:49   you. But even that software can only do 12 guesses per second, and

01:18:57   like you mentioned I think before we started recording that that doesn't mean

01:19:01   that they start zero zero zero one zero zero zero or you know if they know it's

01:19:06   six, because I guess you know how long the passcode is, just by,

01:19:10   iOS gives up how long it is, right?

01:19:13   Let me double-check, if I turn on my phone... if you have a four or six digit passcode,

01:19:19   I think, yeah, it discloses that, right? Yeah, 'cause it shows you how many circles. But I chose an alphanumeric passcode, though.

01:19:25   If you choose that option, even if you have a short passcode,

01:19:29   Then it's impossible to know how long it is anymore

01:19:31   which is what I did at some point when a security person

01:19:34   like Rich Mogul or someone else said,

01:19:36   "Yeah, six digits is probably not adequate."

01:19:37   I was like, "Oh, okay, making the change now."

01:19:40   Yeah, that's right.

01:19:42   So this is more true for,

01:19:44   or more useful for breaking into databases

01:19:48   that are hijacked online because,

01:19:50   and this is changing as well,

01:19:51   but a lot of databases, the ones that were encrypted

01:19:54   were using a very weak algorithm

01:19:57   that was very easy to brute force through.

01:19:59   SHA-1, and that's been updated to later versions,

01:20:01   a lot of databases.

01:20:03   And then they've added another technique

01:20:04   in a lot of online, you know, good online sites,

01:20:07   so that, no, cracking a password once

01:20:10   doesn't help you crack the same password

01:20:11   used by everyone on the same database.

01:20:14   They're all seeded in a slightly different way.

01:20:17   Salted is the word.
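[Editor's note: a minimal sketch of what salting buys you, with bare SHA-256 as a stand-in; real systems should use a deliberately slow function like bcrypt, scrypt, or Argon2.]

```python
import hashlib
import os

# Toy illustration of salting: the same password stored for two users
# yields different digests, so one precomputed (rainbow) table can't
# crack every account at once.
def store(password: str) -> tuple[bytes, str]:
    salt = os.urandom(16)  # fresh random salt per account
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    return salt, digest

def verify(password: str, salt: bytes, digest: str) -> bool:
    return hashlib.sha256(salt + password.encode()).hexdigest() == digest

salt_a, hash_a = store("hunter2")
salt_b, hash_b = store("hunter2")
print(hash_a != hash_b)                   # True: same password, different hashes
print(verify("hunter2", salt_a, hash_a))  # True
```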

01:20:18   But anyway, so those people,

01:20:20   the people who are trying to crack exfiltrated databases,

01:20:25   They have these huge, you know, not rainbow tables,

01:20:30   we used to call them, those still exist

01:20:32   that were just, you know, essentially pre-cracked versions

01:20:35   or pre-computed versions of passwords

01:20:39   going through certain algorithms like SHA-1.

01:20:41   But what they have now is there are well-known passwords.

01:20:45   There's passwords that have been extracted.

01:20:46   There's like a, I can't remember if you go to-

01:20:48   - I think that's more likely 'cause I-

01:20:50   - Have I Been Pwned?

01:20:51   I think that the SHA-1 computed versions of passwords

01:20:56   are probably not that useful

01:20:58   and certainly not useful getting--

01:20:58   - Not very useful anymore.

01:21:02   - Right, because everybody's gonna salt them some way.

01:21:03   And so they're not going,

01:21:06   it's gonna be the hash of the password,

01:21:09   the salt, and then SHA-1 computed.

01:21:11   And if you don't know that salt,

01:21:13   knowing that list isn't gonna help.

01:21:13   - Right, but I think this is still used in the same way

01:21:16   is that because of the computational overhead

01:21:19   now being higher, people trying to crack those databases,

01:21:21   they have like, you know, they know,

01:21:23   so it's Have I Been Pwned,

01:21:24   which I recommend everyone sign up at.

01:21:27   They have a list of, what is it, a billion?

01:21:29   I don't know, it's hundreds of millions of passwords

01:21:31   that have been leaked in the clear online.

01:21:33   They don't associate them with accounts,

01:21:35   but researchers and others can see,

01:21:37   can get access essentially to a huge set of passwords

01:21:41   people have used.

01:21:42   So the first passwords that are checked

01:21:44   are the shortest ones from that list

01:21:46   that have most commonly appeared.

01:21:47   And then there's those dictionary words,

01:21:49   there's all these other things.

01:21:50   So when you're brute forcing,

01:21:52   you don't start with zero, zero, zero, zero, zero, zero,

01:21:55   zero, one, whatever, you use common letters and numbers.

01:21:58   And then at some point you go into the brute force thing

01:22:00   after everything common turns out not to work.

01:22:03   - Right, and I'm sure there's a similar list

01:22:06   of six digit passcodes, just numeric by frequency.

01:22:11   And so instead of starting at all zeros,

01:22:14   you start at some combination.

01:22:17   And I forget what I was using recently.

01:22:20   Uh, it wasn't that recent that I'll remember, but something where I had to

01:22:24   enter a, create a numeric passcode.

01:22:27   Oh, I entered one and it said that one is commonly used and I didn't enter one, two,

01:22:33   three, four, and it was something that I didn't really care much about, which is

01:22:38   why I don't even know what it was that I guessed, but let's say I just went to the

01:22:42   four corners, right?

01:22:44   So I did four corners in order or something like that.

01:22:48   And it said that one is frequently used.

01:22:49   Get, you know, try something else.

01:22:52   But anyway, Jack Nicas, I'll put this link in the show notes, I just linked to it on

01:22:57   Daring Fireball too, had a good story, basically describing what we're trying to describe here,

01:23:03   which is basically, how does this actually work?

01:23:05   And in the context of this law enforcement campaign, does the FBI really need Apple's

01:23:10   help?

01:23:13   He listed some numbers he got from talking to security researchers, who helped him out

01:23:19   to get this story together, in a Twitter thread.

01:23:24   Average time it would take to guess different passcodes, depending on length: four numbers,

01:23:28   seven minutes; six numbers, 11 hours; eight numbers, which I don't think is an option

01:23:33   on iPhone, 46 days.

01:23:35   You could do it, but you'd have to use the alphanumeric option.

01:23:39   Right?

01:23:40   Yeah, right.

01:23:41   10 numbers would be 12.5 years. Now if you use alphanumeric passphrases: a four-character

01:23:50   alphanumeric passphrase, if you use letters and numbers, seven days; six characters, 72 years;

01:23:58   eight characters, 276,000 years. Yeah, mine is like 20 characters. Yes, probably unnecessary.
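Those figures check out against the roughly 80-millisecond-per-guess floor the secure enclave imposes, which comes up later in the conversation. A quick back-of-the-envelope in Python; the 62-symbol alphabet (upper- and lowercase letters plus digits) is my assumption for "letters and numbers":

```python
SECONDS_PER_GUESS = 0.080  # ~80 ms per try, enforced in hardware

def avg_crack_seconds(symbols, length):
    """Average brute-force time: half the keyspace times the per-guess cost."""
    return (symbols ** length) * SECONDS_PER_GUESS / 2

minutes_4_digits = avg_crack_seconds(10, 4) / 60              # ~6.7 minutes
hours_6_digits   = avg_crack_seconds(10, 6) / 3600            # ~11.1 hours
days_8_digits    = avg_crack_seconds(10, 8) / 86400           # ~46.3 days
years_10_digits  = avg_crack_seconds(10, 10) / (86400 * 365)  # ~12.7 years
days_4_alnum     = avg_crack_seconds(62, 4) / 86400           # ~6.8 days
years_6_alnum    = avg_crack_seconds(62, 6) / (86400 * 365)   # ~72 years
years_8_alnum    = avg_crack_seconds(62, 8) / (86400 * 365)   # ~277,000 years
```

Every figure lands within rounding of the numbers quoted above, which suggests the researchers assumed exactly this model.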

01:24:06   If these numbers are right, that's unnecessary. Here's the thing: the entropy that he's talking

01:24:10   about there is if you're choosing actual random alphanumeric. So if you pick from the entire set,

01:24:16   eight characters, and it's, you know, capital Z, lowercase b, eight, exclamation point. Mine isn't

01:24:21   that. I use the method that a lot of researchers promote and that you can do in

01:24:25   1Password now, the diceware method. I have three words that have never appeared together in

01:24:31   any work of literature, and they're long enough, so each of them is not a common word, and each

01:24:38   of them is easy for me to remember, because I remember three things instead of, you know,

01:24:42   eight random things. So I can type it in. And it's critical for me for a passphrase

01:24:47   that I can remember it mnemonically, because I don't have it stored

01:24:51   anywhere, because it's my passphrase, so I have to be able to remember it. I mean, I've got it stored in

01:24:56   some careful ways. But anyway, because the amount of entropy isn't as high,

01:25:03   mine is probably crackable in, oh, I don't know, a hundred thousand years, because

01:25:07   it's got real words in it. But still good enough. Maybe a million years. Good enough.
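For what it's worth, the arithmetic on a diceware-style passphrase looks like this. A sketch assuming the standard 7,776-word Diceware list, the same 80 ms per guess, and an attacker who knows the scheme and only has to search word combinations; Glenn's actual words and list may well differ:

```python
import math

WORDLIST_SIZE = 7776       # standard Diceware list: 6**5 words
SECONDS_PER_GUESS = 0.080  # hardware-enforced delay per try

def entropy_bits(num_words, wordlist=WORDLIST_SIZE):
    """Each independently chosen word adds log2(wordlist) bits."""
    return num_words * math.log2(wordlist)

def avg_crack_years(bits):
    """Average time to search half of a 2**bits keyspace, in years."""
    return (2 ** bits) * SECONDS_PER_GUESS / 2 / (86400 * 365)

three_word_bits = entropy_bits(3)                     # ~38.8 bits
three_word_years = avg_crack_years(three_word_bits)   # ~600 years
```

That comes out well short of a hundred thousand years under these worst-case assumptions, but an attacker who doesn't know the scheme can't narrow the search to word combinations at all, and longer, uncommon words off a bigger effective list push the number way up.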

01:25:10   Well, and it's a balance between, you know, convenience and security. I mean, how much

01:25:19   do you really care if somebody who had one of these devices could get your phone? I mean,

01:25:25   for an awful lot of people, they don't. Like my mom uses just a six-digit passcode.

01:25:34   She'd probably use four, really. Honestly, if somebody got her phone, the only

01:25:43   thing she'd be upset about is that she needs a new phone. I mean, there's literally nothing.

01:25:47   You know, somebody could read all of her email and she just doesn't care much. I mean, then

01:25:52   the other extreme is, like, you remember, was it two-plus years ago, in the United Arab Emirates,

01:25:58   a human rights activist that somebody burned three separate zero-days against?

01:26:05   Yeah, yeah.

01:26:06   And you're like, oh my God, the amount of resources a government entity has to throw.

01:26:11   So that's one extreme, three zero-days all at once, versus "I don't have my recipe,

01:26:17   my favorite recipe for ricotta pudding anymore." You know, there's a range of

01:26:21   security. So your mom could use four, but six is great because six deters anyone

01:26:27   casual. There's no casual forensic or personal cracking

01:26:34   system that would allow someone to get your mom's phone. She would have to piss someone off at a

01:26:40   federal national security level, probably, or mass murder level. So, unless she's planning to

01:26:46   do that, which I hope she's not. She's good. But anyway, long story short, if you're listening,

01:26:51   and you're paranoid about this, which is reasonable, and just if it bothers you,

01:26:55   If you just wake up in the middle of, you know,

01:26:58   at three in the morning and it just bothers you thinking

01:27:00   about this idea that there are devices out there

01:27:03   that they can plug your iPhone into and hit a button

01:27:06   and 12 hours later, reveal your passcode.

01:27:09   If it just makes you uneasy,

01:27:13   switching to the alphanumeric passphrase

01:27:16   and even making it, you know, like two words,

01:27:19   two words separated by a space or a dash

01:27:23   an underscore, or something, will literally make your data impregnable at the iPhone level. I mean,

01:27:33   unless, you know, they'd face odds greater than the number of subatomic particles

01:27:40   in the universe to one to get it in one shot. Yeah, I mean, quantum computing might put the lie to

01:27:45   all this. Well, no, but not with an iOS device. No, right, you're totally

01:27:49   right. So, you know, there have been experiments in the past at extracting the

01:27:53   secure enclave and trying to do things like use electron microscopy, right? Other kinds of

01:27:59   things. But I think even at that level, Apple, obviously they've got an amazing

01:28:03   chip group. They're constantly improving the secure enclave. And it is resistant.

01:28:09   And when someone develops an attack there, or a theoretical one, they're working on it.

01:28:12   I've been talking to a lot of people on Twitter recently and emailing some readers

01:28:17   as I write about this, and I wasn't certain, I think I used to know this better. I'm probably

01:28:22   gonna forget it again, hopefully, if this whole thing dies down again. The thing you have

01:28:26   to remember is that cracking iOS doesn't crack your data at all. You have to get the secure

01:28:33   enclave, and to everybody's public knowledge, every public security expert's knowledge, the

01:28:40   secure enclave on the iPhone has not been cracked. And on recent ones, there's that 80

01:28:46   millisecond computation time for a guess. You know, your iPhone today

01:28:54   could be found 40 years from now, when we have quantum computers that are

01:29:01   as much faster than today's as today's computers are than the ones from 40 years ago, the

01:29:09   Apollo program, and it still wouldn't help them crack it if they still have to go

01:29:16   through the secure enclave, because that 80-millisecond-per-guess thing is still there.

01:29:19   Now there might be, you know, who knows, does the NSA have a crack for the secure

01:29:26   enclave? Does a foreign government have it? We don't know. But to everybody's

01:29:33   knowledge publicly, that hasn't happened. I think if anyone has it,

01:29:38   it's a nuclear bomb, right? Because they can't deploy it. Because if they deployed it,

01:29:45   and it was ever discovered, you know, right, it's the end, and then Apple

01:29:52   will fix it. I mean, there's always an arms race about this kind of thing. But that was why

01:29:56   the three zero-day thing, I remember it so distinctly, is because once you burn these things,

01:30:01   they're gone forever, and the system gets better. There's always gonna be flaws and

01:30:05   things, but it's not like they're always going to be the same category of flaw. There's not always

01:30:10   gonna be ones that allow, I mean, at some point, iOS could be effectively impenetrable. I

01:30:17   mean, I know that's like saying the Titanic is unsinkable. But to all intents and purposes,

01:30:21   they have figured out enough ways to close down enough avenues that even the most determined

01:30:27   government funded whatever cannot find, you know, a really effective way and they only find

01:30:32   little things around the edges. That's not infeasible.

01:30:35   And you know, the secure enclave is very, very different. And

01:30:40   if you understand it, even at the layperson's version,

01:30:44   the level that I do, it's easy to see how much more defensible

01:30:49   it is from Apple's perspective than iOS. Because iOS has

01:30:54   Lightning input, which also takes USB, which is obviously

01:30:59   some part of the jailbreaks that the GrayKey installed, because

01:31:02   what we know about GrayKey is, you plug it in via Lightning cable. And it

01:31:06   obviously has networking, so there could be bugs in the networking stack. There

01:31:12   was there was an exploit recently somebody wrote about where there was an

01:31:16   iMessage exploit, where a carefully crafted URL if tapped in iMessage would

01:31:23   exploit it. You know, there's all sorts of vectors like that, because iOS from a

01:31:28   functional level has to take input in so many ways, whereas the secure enclave is literally physically connected to the hardware

01:31:35   one way. You know, there's literally one way in, and there's no other. You know, it's like,

01:31:42   you know, it's like defending a building with doors and windows and

01:31:49   roof access and

01:31:52   antennas, versus like the underground bunker in,

01:31:57   you know, the Rocky Mountains, where there's one door and a giant rocky mountain above it and

01:32:05   a, you know, 20-story elevator going down underground to get to, you know,

01:32:11   the secret bunker where the president will live, you know, in a nuclear war.

01:32:14   All right. Let me take a break here and thank our next sponsor to our good friends at hover

01:32:19   Hey

01:32:21   Hover is

01:32:23   a jumping off point for a ton of entrepreneurs,

01:32:25   and they want you to start your business

01:32:27   with a new domain name.

01:32:28   They have over 300 domain name extensions,

01:32:31   the top level domains.

01:32:33   Everybody knows dotcom, dotnet, dot org.

01:32:35   They've got over 300 of those to choose from

01:32:38   when you wanna build your brand online.

01:32:40   No matter what you wanna build,

01:32:41   there's a domain name waiting for it.

01:32:42   You'll find excellent tech support available

01:32:46   to answer any questions you may have.

01:32:47   Their support team doesn't upsell you.

01:32:49   They only work hard to help you get online.

01:32:52   They have free WHOIS privacy protection.

01:32:55   Man, the domain name companies that charge you

01:32:59   for privacy protection, what a racket.

01:33:01   They have a super clean, well-designed user experience

01:33:06   and user interface and monthly sales

01:33:09   on popular top-level domains.

01:33:10   It's hard to see why anybody would go somewhere else.

01:33:15   It's such a popular choice for getting a new domain name.

01:33:20   One of my favorite features is the guessing.

01:33:23   So if you enter your intended domain name,

01:33:26   dot whatever, and dot whatever isn't available,

01:33:29   they give you a tremendous number of options

01:33:31   to get something similar.

01:33:33   They offer, like I said before,

01:33:35   they offer best in class customer technical support,

01:33:37   answer any questions you want.

01:33:39   Super good security, like Glenn and I were just talking

01:33:42   about all these security things.

01:33:43   One of the things you certainly don't wanna have hijacked

01:33:45   is a domain name.

01:33:46   Hover has you covered on that regard.

01:33:49   best in class, locked in.

01:33:52   You can make changes when you want to,

01:33:54   but nobody else is gonna be able to

01:33:56   jimmy their way into your account

01:33:58   and steal your domain name.

01:34:00   And like I said, over 300 domain name extensions

01:34:04   to choose from.

01:34:04   So grab your next domain name at hover.com/talkshow.

01:34:09   The other thing they offer, let me just say this,

01:34:15   The other thing they offer is a tremendous user experience

01:34:19   for transferring existing domain names

01:34:21   from another registrar.

01:34:23   So if you have a registrar, you already have,

01:34:25   even if you don't need a new domain name,

01:34:27   if you just wanna move your existing domain names

01:34:29   from some crummy registrar to the best, which is Hover,

01:34:34   they've got you covered on that regard too.

01:34:35   Really, really easy.

01:34:36   If you're putting it off

01:34:37   because you think it's just a huge pain in the butt,

01:34:40   don't worry about it.

01:34:41   Really, they've got you covered.

01:34:43   So go to hover.com/talkshow, get a 10% discount

01:34:47   with their referral link.

01:34:49   And that's good for all new purchases.

01:34:52   Make a name for yourself with Hover.

01:34:55   Next on my list was,

01:35:02   well, what do you wanna talk about next?

01:35:05   Let me ask you.

01:35:06   - Would you like to talk about fake faces?

01:35:07   I love this topic.

01:35:08   - Yeah, let's do this.

01:35:09   Let's do this.

01:35:10   - It's been going on for a bit.

01:35:11   So I wrote about this last year,

01:35:13   and I certainly wasn't the first by any means,

01:35:16   but I got kinda excited about the topic

01:35:18   because it's a little contrarian at one level

01:35:21   and now it is blowing up and I'm kind of excited

01:35:26   to see people covering it.

01:35:27   So I think it's really important.

01:35:28   So what it was is Nvidia has a research department,

01:35:31   of course, and they're always working on interesting stuff

01:35:33   they can do with their super fast GPUs

01:35:36   'cause all these graphics companies

01:35:37   are selling computational power now

01:35:40   to security researchers and other folks

01:35:42   and, of course,

01:35:44   strangely, also graphics professionals and gamers.

01:35:47   And they came up with this thing,

01:35:49   if I'm remembering it right, it's called StyleGAN.

01:35:51   And GAN is generative adversarial networks,

01:35:56   that's what it's called.

01:35:58   Okay, so what it is is you pit machine learning algorithms

01:36:02   against each other to get closer and closer

01:36:06   to a desired outcome.
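That adversarial loop can be caricatured in one dimension: a "generator" with a single knob tries to match real data, while a "discriminator" measures the statistic that currently separates real samples from fakes. This is deliberately a toy, nothing like StyleGAN's networks; every number in it is invented for illustration:

```python
import random

random.seed(0)
REAL_MEAN = 5.0  # the "real data" is just numbers clustered around 5
real = [random.gauss(REAL_MEAN, 0.1) for _ in range(100)]

gen_mean = 0.0  # the generator's only parameter
for _ in range(200):
    fakes = [random.gauss(gen_mean, 0.1) for _ in range(100)]
    # "Discriminator": the statistic that currently separates real from fake
    gap = sum(real) / len(real) - sum(fakes) / len(fakes)
    # "Generator": move output toward whatever the discriminator calls real
    gen_mean += 0.1 * gap

# gen_mean has converged near REAL_MEAN: the fakes now resemble the real data
```

A real GAN trains two neural networks against each other with gradients; here the discriminator is reduced to a single separating statistic so the whole loop fits in a dozen lines, but the pit-them-against-each-other structure is the same.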

01:36:08   And what they figured out a way to do

01:36:09   was to use what they called a style-based generator

01:36:14   to use adversarial networks to make fake faces.

01:36:19   So you feed in real human faces in profile,

01:36:23   and over time, the algorithm evolved enough

01:36:25   and improved enough that they released a version of it

01:36:27   and they've improved it since,

01:36:29   that makes fake static profile shots

01:36:32   that look often exceedingly real.

01:36:35   They look just like a real person,

01:36:37   and you can dial in characteristics.

01:36:41   So I like brown hair, red hair,

01:36:42   a different kind of smile or face.

01:36:44   So there's not a ton of control, but there is some.

01:36:47   And every time you run the algorithm-

01:36:48   - What about age? I presume you can-

01:36:50   - Yes, to some extent.

01:36:52   And you can, I forget if age,

01:36:54   no, I think that is a factor.

01:36:55   So age and gender, hair color, eye color, glasses,

01:36:59   all these things.

01:37:00   And the first pass came out in, I wanna say in 2018,

01:37:05   and it made a little bit of a stir

01:37:07   because the video for it was just outrageous.

01:37:11   It kind of like blew people away.

01:37:12   They'd done it for one of the big graphics conferences

01:37:16   or something and you were like, oh my,

01:37:19   and they showed in real time, like not in real time,

01:37:21   but they showed the input faces

01:37:22   and then all the output faces

01:37:24   and they would zoom in and show more faces

01:37:26   and it was just like, oh my God.

01:37:28   So a lot of the focus in machine learning based generation

01:37:33   of material has been around like deepfakes,

01:37:36   which are typically looking at video.

01:37:38   So you create a video with enough different

01:37:41   pictures of somebody or video of somebody

01:37:43   and you can merge them with porn

01:37:45   or you can make Obama, sorry, Barack Obama

01:37:50   say he loves Osama Bin Laden or whatever.

01:37:53   And so that is very disturbing

01:37:55   because the video can look very realistic

01:37:58   and while it can be pretty funky and easy to tell

01:38:00   for an informed person now,

01:38:01   it's gonna get better and better.

01:38:02   But I thought there wasn't enough attention being paid

01:38:05   to the still face version because it looks so good already.

01:38:10   So what happened is I wrote this piece last year

01:38:13   and talked to these researchers

01:38:14   at the University of Washington

01:38:15   that had built a page you can go to

01:38:17   and they made a little test.

01:38:18   So they put a real photo next to a generated one

01:38:21   and ask you which one is real.

01:38:23   And you know, you'd be surprised,

01:38:26   you'd run 20 of them through or something

01:38:28   and you'd be like, man, I was wrong like 75% of the time.

01:38:32   The fake ones look more real than these real ones.

01:38:34   So these algorithms continue to improve.

01:38:36   And there's a good article recently,

01:38:39   there's actually been a series this year already

01:38:41   about two aspects of it.

01:38:44   One, or I guess three, right?

01:38:46   A company is now offering this as a commercial service.

01:38:49   So you go to them to get essentially model release free,

01:38:54   royalty free images you can use as stock photography.

01:38:58   - Right.

01:38:59   - So there's no real person,

01:39:00   so you don't have to get a release.

01:39:01   You can use the picture any way you want.

01:39:03   The second is businesses that are then creating these for,

01:39:08   or sorry, businesses that are using fake images now

01:39:11   from that company and others to populate sites

01:39:14   so they look like they're full of real people.

01:39:17   And the third is, there was a story just recently

01:39:22   about Facebook, which is looking at this now,

01:39:23   because it was actually an issue.

01:39:23   It used to be that you could reverse image search, right?

01:39:26   This ties into another story you're interested in too,

01:39:30   but you could reverse image search and find,

01:39:31   that fake avatar on Twitter. Oh, that's been used all over the place. 1000 sites. Well, these, they

01:39:36   can't reverse image search them, because they're uniquely generated. So Facebook had to remove a

01:39:40   bunch of profiles with fake pics. So I'm sort of fascinated by this, because it is so good already,

01:39:46   that we are already in the crisis of reality about fake still pictures of people's faces.
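The reverse-image-search point is worth unpacking: those systems match perceptual fingerprints, so a photo that has circulated anywhere turns up near-duplicates even after resizing or recompression, while a freshly generated face matches nothing. A toy average-hash sketch, with made-up 2×2 "images" standing in for real thumbnails:

```python
def average_hash(pixels):
    """1 bit per pixel: set if the pixel is brighter than the image mean.
    Real systems use fancier fingerprints, but the idea is the same."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(int(p > mean) for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original     = [[10, 200], [30, 220]]   # a photo already on the web
recompressed = [[12, 198], [28, 225]]   # same photo, re-saved/resized
brand_new    = [[200, 10], [220, 30]]   # a uniquely generated image

# The altered copy still matches exactly; the generated image doesn't
match_dist = hamming(average_hash(original), average_hash(recompressed))  # 0
novel_dist = hamming(average_hash(original), average_hash(brand_new))     # 4
```

Small pixel changes don't flip the brighter-than-average bits, which is why a recirculated photo is findable and a one-of-a-kind GAN face is not.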

01:39:53   Yeah, and the one you know, like using them for royalty free

01:40:01   stock photography, it I don't know who would be bothered by

01:40:06   that other than someone who already owns, you know, a legit

01:40:10   stock photography thing. And that's their business. I mean,

01:40:13   in terms of morality, that to me strikes me as, you know, fine,

01:40:18   like that's using technology to solve a problem. And if you want

01:40:22   to create an advertisement that makes it look like there is a 35 year old woman who's very

01:40:30   happy with her banking account, you know, for your ad to say sign up for our banks,

01:40:37   you know, low interest checking, whatever.

01:40:41   Fine, right.

01:40:44   But using it to create the illusion of people who don't really exist is really

01:40:51   bothersome. And it really plays into the flooding of the zone, for lack of a

01:41:00   better term, of misinformation in the social network age. You know, to create thousands, tens of

01:41:11   thousands, I mean, what's the limit, right? Once you're talking about a computer, you know, it's

01:41:15   just like spam, right? Like, imagine if you got as much physical mail as you get spam. I want to talk

01:41:19   about spam later, but let me literally just take a look here at my

01:41:25   junk mailbox, which I go through and empty. Right now, for my email

01:41:30   accounts, and I have Mail set to throw out the old ones on a periodic basis,

01:41:35   I've got 4,000 pieces of spam. I mean, just one address alone, let me see here,

01:41:44   from my main address I have 11,000 pieces of spam, and it only goes back to

01:41:51   December 21st. So again, because I have Mail configured to throw out month-old

01:41:58   spam, in the last month I've gotten eleven hundred and fifty pieces of spam

01:42:03   to one address. Imagine if you got that much junk mail, you know, it'd be crazy. I

01:42:09   mean, I feel like we get so much junk mail and marketing mail as it is, but

01:42:12   that's, you know, six, seven pieces a day, because there's an actual cost

01:42:18   to actual postage. You know, once you're talking digital, there's just no price. I mean, there

01:42:23   there have been AstroTurf campaigns, it's, you know, for a long time, where you try to

01:42:31   create the illusion of more people supporting some political issue or something, than actually

01:42:37   exist. But in the digital age, it's just another

01:42:47   ballpark. You can create thousands. And the way that, like,

01:42:50   algorithms work, the ones that promote things, you know, all

01:42:54   of a sudden you get stuff in your newsfeed on Facebook. And

01:42:58   Twitter, you know, does a lot of this now, too. I don't really

01:43:00   see stuff that seems fake in my Twitter. And I don't use

01:43:03   Facebook. But it's obviously a problem for a lot of people in

01:43:06   in terms of pushing a lot of this political propaganda,

01:43:11   having an army of fake Facebook accounts is,

01:43:15   it's not like a resource problem for Facebook.

01:43:19   Facebook isn't trying to get rid of these accounts

01:43:21   because it's using up computing power

01:43:24   and running up their electricity bill and their storage bill.

01:43:27   It's ruining the integrity of their service.

01:43:30   And as down on Facebook as I am,

01:43:34   I

01:43:36   do, you know,

01:43:38   and I think there's a lot of ethical problems at that company, I really think that they want their

01:43:46   algorithms to be

01:43:48   as they intend them to be.

01:43:55   You know, they have an interest in not having them be overrun by fake accounts.

01:44:00   Right, and having a profile picture that is unique is,

01:44:04   you know, something that can be done. And, you know, I've seen it when

01:44:11   fake accounts have been exposed over the years.

01:44:15   I can't think of an example off the top of my head that I could put in the show notes,

01:44:18   but maybe I can find one. But I've seen it where people have exposed it and

01:44:21   found that the profile picture is from, like, a stock photography service.

01:44:26   You know, yeah, this is what led me to this article. I was interested in the topic,

01:44:30   but I thought it had been well covered. And then some folks in the Bitcoin world

01:44:34   said they were being contacted by somebody who said she was a journalist.

01:44:38   They looked at the profile photo. There were very few traces of who she was, they

01:44:44   could not find any, and she was contacting people privately, and everyone was getting a little nervous.

01:44:49   Was she pulling a scam? Was she working for some, you know, exchange or whatever? And someone's like, you know,

01:44:55   this is a fake photo, because, and they enlarged it, look at these characteristics. This was generated by the StyleGAN

01:45:01   software. And it almost certainly was, because in the first version there was some artifacting, there were some

01:45:09   techniques, you could tell. But you couldn't

01:45:12   reverse image search it, because that's such an easy way to tell initially, and if it doesn't work, you're like, well,

01:45:19   this is probably a legitimate photo that someone posted. Like, no, now you can't do that. What the scientists,

01:45:25   or academics, at University of Washington told me,

01:45:28   they said there's two actually really easy tells.

01:45:31   And this, I thought this was brilliant

01:45:33   and it will change over time.

01:45:34   The fake generated faces, you will never,

01:45:37   the current states of the algorithms,

01:45:39   and this may continue indefinitely

01:45:42   because of how the faces are generated,

01:45:44   they can only produce that face once.

01:45:46   So you can use the same inputs

01:45:47   and produce exactly the same face a second time,

01:45:50   but you can't produce other versions of that face

01:45:53   in different scenarios.

01:45:54   So you can't, if you want to say,

01:45:57   send me your left profile,

01:45:58   'cause I'm seeing your right profile,

01:45:59   they can't.

01:46:01   They're typically not yet in scenes.

01:46:03   So you're not standing with a background

01:46:06   in some kind of environment, you can't hold up,

01:46:08   I mean, you could hold up today's paper,

01:46:09   but you know, that kind of thing.

01:46:11   Actually, what was hilarious about this research too,

01:46:13   is they've shown examples of taking like shots from Ikea

01:46:17   of rooms full of furniture, and they can train them.

01:46:21   It just produces random rooms full of very realistic looking sets of furniture that don't

01:46:27   exist.

01:46:28   And it is, you can run these algorithms online and you're just looking through it and it

01:46:31   is, there's something so weird about seeing the real world.

01:46:35   Like faces are bad, but the real world being like your objects of material possessions

01:46:39   being generated algorithmically from other photos is, is like beyond surreal.

01:46:45   It's, you know, and it's, you know, it's one of my big fears as we go forward.

01:46:55   And one of the things that I feel like we're already inundated with misinformation.

01:47:01   And one of the things that makes me so pessimistic, honestly, about democracy, you know, going

01:47:08   forward.

01:47:09   I don't mean to be overdramatic, but it's a big frustration for my wife in particular

01:47:16   that when video of some scandalous thing comes out, that it resonates so much more than just

01:47:31   somebody's, you know, what ought to be taken equally as proof of it.

01:47:37   And one example that comes to mind would be the, the Access Hollywood Trump tape that dropped a

01:47:47   month before the election four years ago. You know, this is the famous tape where Trump is

01:47:53   saying that, you know, when you're famous, you can do whatever you want. You can grab them by the pussy.

01:47:58   And that tape came out.

01:48:06   And you know, again, and some people will say, well, who cares?

01:48:08   He won anyway, but it hurt him.

01:48:10   It hurt him severely at the, you know, in, you know, polling, you know, if that tape

01:48:16   had never come out, he would have won by more, most likely maybe, you know, maybe half a

01:48:21   percent who knows, but these elections have been so close that a half a percent in a few

01:48:25   states makes a huge difference, right?

01:48:27   the tape matters. And my wife was so angry, not after the election, but after

01:48:33   that tape dropped when this became this huge, huge news cycle consuming scandal

01:48:41   that's, you know, and had people saying, "Oh my god, this is so bad he needs to

01:48:45   drop out of the race." And my wife's response, her anger was, "There is

01:48:49   absolutely nothing on that tape that we did not know for a fact about Donald

01:48:54   Trump and the way he behaves around women and the way he thinks about them.

01:48:59   And, and, you know, there were dozens and dozens of stories of exactly the

01:49:03   same sort of thing from over the years, from decades ago.

01:49:06   And so it angered her. And my take is, you just don't understand the psychological

01:49:12   effect that like a video tape has, you know, and, and there are so many stories,

01:49:17   you know, it, it's a huge trend of the last decade for the better, you know,

01:49:21   it's, for lack of a better catch-all phrase, the whole MeToo movement, but just

01:49:26   the whole holding accountable men, mostly, but anybody behaving either

01:49:36   inappropriately or, even worse, illegally. Mm-hmm. But it's when tapes come out, and

01:49:45   I say tape, you know, videos, but you know, some celebrity who comes out and there's

01:49:49   something caught on tape, it is so much worse. There was, oh man, what's his name,

01:49:57   Ray Lewis, he was a football player. Oh no, Ray Rice. Oh, oh wait, Ray Lewis was

01:50:04   the murderer, right? Alright, well, we can leave this in, Caleb. Alright, I just want

01:50:10   to get, at least I'm not maligning someone. Ray Lewis was, Ray Rice was the

01:50:18   running back for the Ravens and he, you know, assaulted his wife badly. And, you know, there

01:50:26   there were, you know, going to be repercussions. But then all of a sudden,

01:50:30   it happened at a hotel, in an elevator, and somebody who worked at the hotel,

01:50:35   the elevator had security cameras, you know, no surprise, leaked the tape.

01:50:41   And the tape showed him more or less coldcocking his wife, or his fiancée, I guess.

01:50:47   Yeah, I remember that video. It was absolutely horrific. I honestly, if you don't have the

01:50:53   stomach for it, don't even look it up. It is truly a horrific video. I mean, it's a

01:50:58   professional football player in the prime of his athleticism, just hitting a woman as hard as he

01:51:05   can. And just, I mean, she just falls like a sack of potatoes. But it was only when the video came

01:51:11   out that he really suffered the consequences he should have, whereas it should have been evident

01:51:16   to anybody paying attention that he was guilty. There was no need for a video, there shouldn't

01:51:23   have been a need for a videotape for him to be suspended forever.

01:51:26   But we're seeing this exactly, this is exactly what's going on with the impeachment process

01:51:31   right now, is that there are 100,000 pieces of circumstantial evidence right now that

01:51:38   absolutely paint the picture that Trump was directing, you know, the withholding of

01:51:44   aid from Ukraine, and so forth. And actually, at this point, I mean, there's no

01:51:48   direct thing of Trump sending an email, or a horribly Sharpie-written note, or, excuse me,

01:51:53   or audio or whatever of him saying, we need to get the Bidens, let's withhold aid, we're going to

01:51:58   pressure Ukraine, but it is absolutely painted. But, you know, would it make a difference not to

01:52:05   anybody who has a staked-out political opinion that Trump can do no wrong, or that he's

01:52:11   being railroaded by the deep state. However, the thing that's interesting, the only difference

01:52:15   here is that weirdly, his party has mostly decided to say, well, okay, that's what happened,

01:52:21   but it's not actually impeachable. It's not illegal. And that's strange. Usually you need

01:52:25   video to get to that part. But I think the amount of circumstantial evidence finally

01:52:29   crossed what is usually a video or even audiotape threshold. Because, and I totally agree

01:52:35   with you, when people see something with their own eyes, it hits in a very different way than

01:52:40   even still photos, right? So on the one hand, it's not right in an ethical or logical

01:52:47   sense that the repercussions are worse when there's tape of Ray Rice

01:52:52   cold-cocking his wife, when he should have been found guilty otherwise, but at least when the tape came out, the right thing happened.

01:52:59   Now, in a world where these deepfake videos are easily produced and are indistinguishable

01:53:09   from a

01:53:11   legit video, I

01:53:13   worry that we lose that. And then of course the flip side is it's easy to create a fake one, right?

01:53:17   It's easy to create one where some political candidate is there on tape,

01:53:23   looking like it's the security camera from an elevator, and they think they're in private.

01:53:29   So that, you know, would explain why they're caught talking on tape, and you say, hey, here's somebody from, you know,

01:53:35   the Hilton in,

01:53:37   you know, somewhere, and they released this tape to us, and it shows a

01:53:41   presidential candidate talking about taking bribes from China or something like that.

01:53:46   Well, and there's the tape, and it's completely fake.

01:53:48   You know the Nancy Pelosi one that happened where it showed her as if she were, you know, it slowed it down, right?

01:53:53   It made it, yeah, it made it seem like she was drunk, you know. Yeah, and Facebook

01:53:58   I think wouldn't take that down

01:53:59   They refused to take it down, ultimately, because it wasn't altered in a way that they considered substantive,

01:54:04   although it was obvious. Right. I thought they did take it down eventually.

01:54:08   Could you use it in political ads, where you could include it? And it wasn't well done, really.

01:54:15   You know,

01:54:17   it was funny as a gag, if it had been presented as a gag, in the same way.

01:54:23   Like, I'm sure you did

01:54:25   the same thing?

01:54:26   Like when we were kids, you would take a tape recorder and record your voice, and certain tape recorders would let you play

01:54:33   it back slower or faster. And then all of a sudden, you sounded like a chipmunk

01:54:38   or you sounded, you know, like, we're talking like this. And we would, you know, as a kid,

01:54:44   I would play for hours with that. It was the craziest thing in the world. We loved

01:54:48   it, but it didn't really sound realistic. The Nancy Pelosi thing was better

01:54:53   than that. But anybody who, you know, got sucked into thinking it was real was seeing

01:54:58   what they wanted to believe or was naive. But enough people thought it was real. And the people

01:55:04   who made it didn't make it as a gag. They made it as propaganda. And it caught fire enough, you know,

01:55:12   and it was really poorly done. You know, it's so easy to imagine a much better done version.
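The tape-recorder trick described above, and the Pelosi slowdown, are the same basic operation: resampling without any pitch correction, so playing slower also lowers the pitch. A minimal, purely illustrative Python sketch (the sample rate, tone, and slowdown factor are all invented; no real audio library is used):

```python
# Toy demo: naive resampling stretches playback AND drops pitch together,
# like a slowed tape. All values here are invented for illustration.
import math

RATE = 8000  # samples per second

def tone(freq_hz, seconds):
    """Generate a sampled sine wave standing in for a voice."""
    n = int(RATE * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / RATE) for i in range(n)]

def slow_down(samples, factor):
    """Stretch playback by `factor` via linear interpolation;
    with no pitch correction, pitch drops by the same factor."""
    out = []
    for j in range(int(len(samples) * factor)):
        pos = j / factor
        i = int(pos)
        frac = pos - i
        nxt = samples[min(i + 1, len(samples) - 1)]
        out.append(samples[i] * (1 - frac) + nxt * frac)
    return out

def dominant_freq(samples):
    """Rough pitch estimate via positive-going zero crossings."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings * RATE / len(samples)

voice = tone(440, 0.5)           # roughly 440 Hz
slowed = slow_down(voice, 1.33)  # played at ~75% speed
print(dominant_freq(voice))      # close to 440
print(dominant_freq(slowed))     # lower pitch, and a longer clip
```

Real "slowed drunk" edits work the same way on video: stretch the timeline without resampling the audio's pitch back up, and the voice drops and slurs.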

01:55:19   I remember I had this great conversation. This is like '95, like, a long time ago.

01:55:27   I was at some conference, oh, I think it was an early, like, web design conference that I was working on.

01:55:33   And we had the webmaster for Netscape, that'll take you back, who attended, who was a super nice guy.

01:55:39   I'm blanking on his name. Super nice guy. Great things to say.

01:55:41   And I was chatting with him at some point and he said, you know, we're so obsessed with video and we keep thinking about like, when can we stream videos?

01:55:48   Like 95, right? So when can we stream video that's big enough that people will accept it and like it and whatever.

01:55:55   And he said, and then I was on this panel with somebody from Penthouse, and they said,

01:55:58   you know, we can stream like 240 by 320 at five frames per second in grayscale.

01:56:04   Our audience is delighted.

01:56:05   And he said, I had to really rethink like that.

01:56:09   And I was thinking of that with the Pelosi thing where it's like, you know, we could

01:56:13   be, we're in an era of real deepfakes, and they could create, you know, an incredible

01:56:18   simulation that could show Hillary Clinton shooting Vince Foster or something horrific.

01:56:23   Right.

01:56:24   They'll probably do that to whoever the leading Democratic candidate is, and probably the Republican,

01:56:27   Trump. There'll probably be

01:56:30   concocted videos, because they're so easy to do, as we get into the general election.

01:56:33   But I was like, wow, at some level all you need is the fidelity of that, like, streaming porn.

01:56:38   You need the Nancy Pelosi one, where they just slowed it down, and people are like, oh my god, she must really be drunk. Like, no.

01:56:44   No. How do we get people to have more skepticism? But at the same time, I think you're absolutely right that video

01:56:49   conveys a kind of truth

01:56:52   that

01:56:53   bypasses normal,

01:56:55   you know, cognitive processes.

01:56:57   And I really, really think, I see it, it is going to be a huge problem.

01:57:03   I am certain of it. I just don't know what the repercussions are gonna be. I mean,

01:57:09   I

01:57:11   think the best-case scenario is that

01:57:16   video and audio

01:57:19   proof of X no longer holds the cachet. It'll take years, though, for that to happen, and that

01:57:27   people will start to treat with equal skepticism

01:57:30   video of you know a political candidate

01:57:34   drunkenly groping somebody in an elevator the same way they would if somebody just said, and

01:57:41   was quoted as saying, hey, this person

01:57:46   drunkenly groped somebody in an elevator, right, that the video doesn't give it any more credibility

01:57:51   than the quote, and that you have to trust the source, right? It gets back to, you know,

01:57:57   the older days where you've really got to build trusted sources. And, you know, Joe Random Twitter

01:58:03   account or Jane Random Facebook account doesn't count. That's the optimistic view.

01:58:09   The pessimistic view is that everybody believes the worst of everything and nobody believes

01:58:14   anything. And nobody trusts anything. How's that for a

01:58:18   lighthearted take?

01:58:20   Can I segue us into, what it feels like is, you put this on

01:58:25   the list of story topics, and I was like, this is, like, exactly the

01:58:29   perfect complement to this. Yes, yes. Which is,

01:58:33   well, the facial recognition one, 'The Secretive Company That

01:58:38   Might End Privacy as We Know It,' which is a great headline, and

01:58:41   Kashmir Hill is an incredible reporter. She's now worked across, I can't think how many publications in the modern era.

01:58:48   So, journalists,

01:58:49   as a side note, just a parenthetical: to tie in, the last time friend of the show Dan Frommer was on this show, Dan

01:58:57   pointed out that

01:58:59   because he's doing his own thing now with the

01:59:02   The New Consumer, I believe it's called. Yeah, sorry, Dan, but I think it's The New Consumer, which is awesome.

01:59:09   And it's he's doing his own thing, but that his observation was that

01:59:13   The trend towards journalists and writer types and pundit types like me

01:59:19   doing their own thing has sort of gone the other way, and the New York Times and Bloomberg and a few other top publications, and maybe

01:59:26   The Washington Post I would throw in, are sucking in all the good talent. I

01:59:31   would absolutely put Kashmir Hill in that list. I think her most recent thing that I remember reading was at Gizmodo,

01:59:39   where, I think she had done, I forget what it was, but a series where

01:59:45   she did the, like, I'm gonna get off Facebook. Yes, I'm gonna stop

01:59:49   using all of the big five tech company products. Yeah. And what was that like? And it was like

01:59:53   she was living in a desert. Right? It was amazing. What happens? What happens if you

01:59:58   don't use Google, Amazon, Facebook? I forget if Apple was on the list or not. I mean, in rotation,

02:00:04   but it was, yeah. I was looking her up, because she's worked across

02:00:07   several publications, and all her pieces are must-reads. She does those, like, I would say, almost consumer-style investigations: what are the companies learning about us, and how do we extract ourselves? And then she does the others, which are, what happens if I put myself in this situation, what do companies find out about me? Yeah. And this is, I think, a little bit of a mix of that. So, I have been saying, and I don't want to puff myself up, because I'm not saying things

02:00:37   anyone else hasn't said. But a few years ago, when I started seeing the effectiveness of AI-based

02:00:42   transcription in its early stages. So I've been using Trint, which was very inexpensive;

02:00:48   they just switched to a subscription model that is cheaper overall but more

02:00:52   expensive for ad hoc use, like how I used it. And it was pretty good when it started; it's gotten better.

02:00:57   And it's not as good as, you know, paid transcription services with people who are, you

02:01:01   know, professionals and looking at everything, but it's very good for AI-based stuff. And then

02:01:07   there were some people throwing together some podcasts, like, cheap-and-dirty, AI-based or

02:01:12   language-learning podcast transcription stuff. And I was like, wow, this is totally amazing.

02:01:18   And I knew voice recognition, of course, with Siri, and Google Assistant, all these other

02:01:22   things had gotten really good, right? So I'm aware of that. And then when I started seeing

02:01:26   the really cheap, bad stuff, I had that innovator's-dilemma feeling, where you're like, oh yeah,

02:01:31   this is where you see something really bad and really cheap, and it eats the bottom of the

02:01:35   market, but then it gets better and it eats from the bottom up. And I thought, oh, here's the point

02:01:39   at which so maybe I don't know three or four years ago I started tweeting at some point everything

02:01:45   every picture of you that's available publicly online every piece of audio in which you speak

02:01:50   every video in which you appear will be categorized and labeled with you and someone will be able to

02:01:56   search and find every instance of you in any public source that you have ever posted, and it's

02:02:01   just a matter of time. It's not an if, it's a when. And over

02:02:05   time that's become closer and closer. And this story is

02:02:09   basically we're now over that bar. We're now in the next

02:02:11   level. It is now a thing. It's still images, but we're moving

02:02:15   to video is next, right? But there's a

02:02:18   startup I'd never heard of, to be truthful, Clearview AI, and they

02:02:23   have apparently tried to index every

02:02:27   publicly available photo that they can find on the internet.

02:02:31   I guess billions. I think they're saying three billion, three billion images. Yeah.

02:02:37   That's what the company claims, that they've scraped from Facebook, YouTube,

02:02:40   Venmo, millions of other websites, Flickr, I'm sure, you know, if you're talking about three billion. Mm-hmm.

02:02:49   And so you take a photo of somebody, you have no idea who they are, and you throw it at

02:02:56   Clearview, and Clearview comes back and says, here's who it is,

02:03:00   and that it's fast and that they have hooks

02:03:03   for augmented reality already.

02:03:06   So in theory, you could get the,

02:03:08   somebody mentioned Terminator,

02:03:11   I'm sure it's been in a bunch of movies,

02:03:13   but the idea that you could be wearing

02:03:15   some kind of AR goggles or glasses

02:03:17   and as you walk down the street,

02:03:18   I mean, depending on how fast it is,

02:03:20   but we know how fast Google search has gotten

02:03:22   for searching the index,

02:03:23   but that you can just look at somebody on the sidewalk

02:03:26   and they'll know who you are.

02:03:30   You could certainly imagine, absolutely imagine, how interesting this would be to retailers,

02:03:35   because as you enter a physical retail shop, it's certainly easy to capture a photo

02:03:39   of the people who are coming in.

02:03:41   Most of them already have cameras for security purposes anyway.

02:03:46   And you could go into a store of any, you know, pick your type of bookstore or whatever.

02:03:52   And I mean, a bookstore is probably a bad example.

02:03:55   But you know, some sort of store where you get greeted by a salesperson.

02:04:00   say a jewelry store or something like that. And they could already know, you know, and

02:04:04   get like some kind of estimate as to, you know, well, does this person, you know, maybe

02:04:09   is, you know, get a guess as to how much money you make or something like that and start

02:04:14   steering you in a direction that you know, and you have no idea that they've done this

02:04:19   and that the salesperson already knows something about you. I mean, there's just all sorts

02:04:24   of ways that this can go bad.

02:04:26   Yeah, there's already so many ways that's happening.

02:04:29   And it defeats human nature's instinct that as you're walking down the street, you're

02:04:33   anonymous, you know.

02:04:36   That the only people who recognize you are the people who know you.

02:04:39   Well, this ties in a little bit with, I saw, I forget if he's a security researcher or

02:04:44   not, somebody tweeting about: I've got a cold, I was just thinking about going out to get

02:04:48   cold medicine, and Facebook or Google just showed me cold medicine ads, and I hadn't been

02:04:52   searching on it.

02:04:53   And I'd read recently, you know, there's a myth that Facebook is analyzing audio all the time when you're running the app or in the background.

02:04:59   And, you know, I don't want to put anything past Facebook, but I don't have the app installed on my phone.

02:05:05   And I was off chatting with someone at a cocktail party.

02:05:09   I met someone who knew a late friend of mine.

02:05:11   I talked about this person for years.

02:05:12   We talked about her, about how much we missed her.

02:05:14   I come home, I open up Facebook and there is an ad for her former business.

02:05:18   And I'm like, what in God's name?

02:05:20   So I know it wasn't listening to me.

02:05:22   Right.

02:05:22   And I know it wasn't fed that way. The explanation that I read recently, and I think this wasn't Kashmir Hill, it was another New York Times reporter, Claire, I'm blanking on her name, it'll come to me, but she co-wrote a piece recently which described that, because of all this location information that's being gathered, it doesn't matter. They don't have to listen to you. They know that you just walked to the store, and what you bought at checkout, and that you came home, because they can gather, from your app,

02:05:52   all your location information, they can associate the

02:05:55   purchasing with your unique phone. And even if you don't do

02:05:58   it, if 10 people in your apartment building have just

02:05:59   gone and bought cold medicine, and come back, they're like,

02:06:02   "Oh, there's a bunch of colds in that building, we're going to

02:06:04   show him cold medicine ads." And I was like, "Oh, that makes

02:06:08   total sense. They've got complete surveillance on us." So

02:06:10   of course, they don't have to know that I have a cold. They

02:06:14   know that a lot of people around me have bought cold medicine and

02:06:16   come back to the same location.
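The co-location inference Glenn describes can be sketched in a few lines. This is a hypothetical illustration only, not any company's actual pipeline; the device IDs, buildings, category, and threshold are all invented:

```python
# Hypothetical sketch: if enough devices sharing a home location all buy
# cold medicine, an ad system can target everyone at that location,
# without listening to anyone. All data below is invented.
from collections import defaultdict

def targets_for(purchases, home_of, category="cold_medicine", threshold=3):
    """Return home locations where at least `threshold` residents
    bought items in `category`, i.e. everyone there gets the ad."""
    buyers_per_home = defaultdict(set)
    for device_id, bought_category in purchases:
        if bought_category == category:
            buyers_per_home[home_of[device_id]].add(device_id)
    return {home for home, buyers in buyers_per_home.items()
            if len(buyers) >= threshold}

home_of = {f"phone{i}": "building_A" for i in range(10)}
home_of["phone99"] = "building_B"

# Ten residents of building A buy cold medicine; one person elsewhere does.
purchases = [(f"phone{i}", "cold_medicine") for i in range(10)]
purchases.append(("phone99", "cold_medicine"))

print(targets_for(purchases, home_of))  # → {'building_A'}
```

The point of the sketch is that no microphone is involved: location plus purchase records per device identifier is enough to make the ad feel like eavesdropping.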

02:06:18   Ah, and people are so paranoid about it, too, because people do

02:06:21   pattern matching. And rightly so. People are so paranoid about

02:06:25   this, like my phone is always listening to me and then does

02:06:28   things that it could only know if it was listening to me. I was

02:06:33   at a big family gathering, somebody had a birthday over the

02:06:37   weekend, we were at a big family gathering over the weekend. And

02:06:43   one of Amy's aunts asked me, I forget how we got talking

02:06:49   about iPhones, but you know, they know what I do. And they

02:06:52   don't typically I certainly don't talk Apple stuff to them.

02:06:55   But if they have questions, you know, they asked me, you know,

02:06:57   and she said to me, let me ask you this. And it was about this

02:07:03   topic of the phone listening, and she has an iPhone. And she

02:07:07   said that the one day at work, her and her friends were Oh, it

02:07:13   got to the point it was somebody else brought up the fact that

02:07:16   they have the feature on where the his his commute his schedule

02:07:22   is very rigid. He does, you know, takes his kids to school

02:07:24   and then goes to work. And it says and he's just amazed by the

02:07:27   fact that he gets in and says, Hey, it'll take 17 minutes to

02:07:30   drive your kid to school. And he goes, Damn it, if it doesn't

02:07:33   take 17 minutes, and then it says it's gonna take you 23

02:07:36   minutes to get to work. And 23 minutes later, he's pulling into

02:07:39   work. And he's like, it's awesome, but it's a little

02:07:42   spooky. And then Amy's aunt said, Well, let me ask you this

02:07:46   that one day at work, and she works with a bunch of other

02:07:49   women, she said they were just really having a good time and

02:07:51   just, not like they were doing something that they'd get in

02:07:56   trouble for, but they were, I don't know, laughing and

02:07:59   laughing, and they just got off, you know, just goofing

02:08:01   around at the end of the day at work, and really, really having

02:08:05   a good time laughing. And then she leaves to go home and her

02:08:08   iPhone randomly gave her directions to, like, the

02:08:15   local comedy club in town. And she said, and I've never been

02:08:20   there. I've never gone there in my life. I didn't ask for

02:08:22   directions there. And it just said, it'll take you, you know,

02:08:25   18 minutes to drive to the Comedy Store in whatever

02:08:28   Pennsylvania. And she got freaked out that it was because

02:08:33   her and her colleagues had been laughing it up the last half

02:08:36   hour at work. And I was like, that's really weird that it

02:08:40   would offer you directions there, that you've never even

02:08:43   been there. But that's not how it happened. It wasn't

02:08:45   because you and your friends were laughing. But I

02:08:51   don't blame her. She's obviously not a technical person at

02:08:54   all. She knows these stories. She's probably seen other spooky

02:08:57   things happen where you do x, y and z and then all of a sudden,

02:09:01   Facebook or whoever is, you know, showing you ads or giving

02:09:04   you suggestions for something that they seemingly would have to know. You know,

02:09:09   that's obviously not what happened. But it's totally natural that as a human

02:09:15   being, you would draw those two together, right? Like that's the way human beings

02:09:19   minds work. Anyway,

02:09:21   Well, but I think it fits into that possibility, too: it actually

02:09:25   could be that when you have a bunch of people together, and you're staying late,

02:09:29   and people are searching for certain things on their phones, right, the

02:09:33   algorithm of whatever provides this kind of stuff says, hey, there's a bunch of people, here's a thing, you know. These kinds of connections are

02:09:39   what's scary, actually, is how

02:09:42   predictable we are, rather than that it's listening to us.

02:09:46   Hey, shocker among shockers.

02:09:48   Guess who's one of the bad financial backers of this Clearview company, which is already working with law enforcement, by the way.

02:09:54   That's who their customers are at this point, with this reverse image search.

02:09:57   I knew this before I read this story. When I read the paragraph that said

02:10:01   investors included, I was like, Peter Thiel, right?

02:10:04   Shocker among shockers, but anyway an unknown number of law enforcement agencies are already using this

02:10:13   I don't know what there is that we can do about it

02:10:16   And then that ties into another story that I just ran into the other day, where Nelson Minar,

02:10:21   who's been blogging forever,

02:10:24   great person,

02:10:27   brought up this search engine I had never heard of,

02:10:31   called, oh, Yandex. Yeah, yeah, Yandex. Yandex is a Russian, it's a search engine in Russia.

02:10:41   And in fact, there was just a story also about them, because they apparently do not engage

02:10:47   in the same behavior, or apply the same restrictions, that other search engines do globally around

02:10:52   child pornography. And it's become a huge issue. There was just a big story about that a few days ago.

02:10:57   Well, one of the things Nelson showed was that they do reverse image search on faces,

02:11:04   which is something that Google has claimed for years it could do, and that they chose not to

02:11:10   because of the obvious privacy implications.

02:11:13   And Nelson's blog post, I'll put it in the show notes, he took a picture of himself.

02:11:19   It works best if you take, I would just say offhand, I think everybody will know what I mean,

02:11:26   like a fake-ID picture, like the type of picture that you would take of yourself

02:11:31   if you were going to submit your own new picture for a driver's license or a passport, you know,

02:11:35   mostly your head, a blank background, that type of selfie. Take one of those, submit it as just,

02:11:44   you know, put it on your computer and then go to their image search. And it says there's like a

02:11:48   camera button for like, hey, do you want to, you know, start and instead of searching for words,

02:11:52   search based on a starting image. Click that button, upload your selfie, and his came back

02:12:00   with, I don't know, it's like they give you, like, 12 results, but the first two or

02:12:05   three were definitely him. Yeah. And the other people did look like him. I tried it with one.

02:12:09   I showed you the screenshot. I have to I guess I'll put the screenshot in the show notes if I can.

02:12:14   I don't think there's anything revealing. It's pretty good. I think the top four

02:12:20   pictures were from, I think, my iPhone review from two years ago.

02:12:25   I think it was two years ago where I included some pictures of me that my wife had taken with portrait mode of the

02:12:32   iPhone camera. So if you had that picture of me that I started with, and had no idea who I was,

02:12:39   but just had that picture, if you sent it to Yandex,

02:12:42   the first four or five results, and most of the

02:12:47   first eight, are me, and then you could click through to the origin and figure out, okay, that's John Gruber.

02:12:53   The cool thing though is you can see what you'd look like with a lot of different mustaches

02:12:57   The ones that are wrong are close enough, not like you would think, is that John Gruber,

02:13:10   but they certainly, you know, you could see why, a

02:13:13   distant family resemblance. Well, or at least, show me people who look like this guy. But it is absolutely,

02:13:19   whoa, eye-opening. I have never seen this before: I can send a totally anonymous photo to this site and half a second later

02:13:28   Find out that it's John Gruber
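For what it's worth, a reverse face search like this generally boils down to nearest-neighbor search over face embeddings: every indexed photo is reduced to a vector, and a query photo is matched by similarity. That assumption, and everything in this sketch (the URLs, the tiny hand-made vectors), is illustrative only; real systems like Clearview's or Yandex's use learned neural-network embeddings over billions of images:

```python
# Toy sketch of embedding-based face search: rank indexed photos by
# cosine similarity to a query embedding. Vectors and URLs are invented.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Invented index: image URL -> face embedding.
index = {
    "site-a/portrait1.jpg": [0.9, 0.1, 0.3],
    "site-b/selfie.jpg":    [0.88, 0.12, 0.28],  # same person, similar vector
    "site-c/stranger.jpg":  [0.1, 0.9, 0.5],
}

def search(query_embedding, k=2):
    """Return the k most similar indexed photos, best match first."""
    ranked = sorted(index.items(),
                    key=lambda item: cosine(query_embedding, item[1]),
                    reverse=True)
    return [url for url, _ in ranked[:k]]

print(search([0.9, 0.1, 0.3]))  # both photos of the same person rank first
```

This is also why the near-misses look like relatives: nearby vectors are faces that share features, whether or not they're the same person.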

02:13:31   Yeah, I just now tested a couple for myself, and it's not as great, but it did find, oh,

02:13:36   I don't know, like six or seven different ones for two different shots. Are any of them you?

02:13:42   Yeah, they're of me. I'm sorry, it found a bunch, and found six, seven that were me. Oh, okay. Yeah, so

02:13:48   Somebody who started with your starting point would definitely figure out that's Glenn Fleishman. Oh, yeah

02:13:53   I mean, in one case, I mean, you're saying that the misses weren't as

02:13:57   Glenn Fleishman-y. They were, I mean, that's a good way to find relatives, like,

02:14:03   they're actually reasonable. I would say in a dark room you might think...

02:14:11   No, it was very it's very funny

02:14:13   Yeah, because I've been using reverse image search for a long time, and I never thought about,

02:14:19   I never thought about the fact that it was actually relatively difficult to do with a face image, you know,

02:14:25   and of course that was actually by intent. So...

02:14:27   Well, it's a real eye-opener and I don't know what we can do

02:14:31   I mean this seems like this is toothpaste that can't go back in the tube, you know, it's a genie that's out of the bottle

02:14:36   I mean, what are we gonna do? And is this, does

02:14:41   anybody else outside of Yandex, who's been sitting on this out of fear, you know, that they don't want to be the first,

02:14:50   because of the backlash that might happen...

02:14:52   You know

02:14:55   Once this sort of thing is out, it's open season.

02:14:57   You know, how long now until Google flips the switch that turns it on, and how much better will Google's version be?

02:15:04   Although, I would actually argue, I am not usually,

02:15:08   let's see, how do you say this? It's not that I'm opposed to regulation or necessarily in favor of it.

02:15:13   And I think there's a lot of things that regulation can't handle. But I love regulation as a tool

02:15:18   To make things fairer and level the playing field when money or power distort things, right?

02:15:24   So, I mean, that's a I don't know where that falls on the political spectrum

02:15:28   But, like, I don't think that, you know, there's all this talk about, like, you know,

02:15:31   what kind of regulation should you put on tech companies? Obviously, there should be

02:15:36   some, more and different than there is now, but not necessarily the kind that Congress wants to do. But with this particular thing,

02:15:42   I actually think because

02:15:44   the greatest value of it is going to be in industries that are

02:15:48   working in the US, and above all US companies, that regulating this,

02:15:53   making it illegal for police departments, or law enforcement officers, to rely on these kinds of databases,

02:16:00   and making it illegal to sell it as a service, or strictly regulating it in some fashion,

02:16:04   I think it might have an impact. Because even though the toothpaste is out of the tube, and you can go to Yandex or other international sites

02:16:10   potentially and do the same thing,

02:16:12   it could not be used legally. Like, if a police officer uses it as the basis, or I

02:16:16   should say one of the bases, of obtaining a

02:16:18   warrant, as is discussed in the story, that it can't be the sole basis to go to a judge,

02:16:23   then, if that's illegal, then the case, you know,

02:16:25   if it's determined, the case gets thrown out, or a judge refuses to accept it because of the rules of

02:16:31   issuing warrants, or Google being told, you know, maybe they

02:16:35   would want to get in the business, and then are trying to

02:16:38   comply with the law and wouldn't or this company that's in the

02:16:41   article might be forced to not offer its services anymore. So I

02:16:45   think regulation might be a tool to, you know, like, you can't

02:16:51   put the toothpaste back in, but you can make the toothpaste not

02:16:54   as valuable to sell basically.

02:16:56   All right. How are we doing on time? We have time for one more

02:17:00   segment on email spam, and then we're out.

02:17:02   Oh, I have just a real-time correction really quickly: the author I was thinking of at

02:17:06   the New York Times is Claire Cain Miller, who is a very fine reporter, although she's

02:17:10   moved on from technology reporting.

02:17:12   The person I actually meant was Nicole Perlroth, another excellent security reporter

02:17:16   at the New York Times, which seems to have a ton.

02:17:19   But the story I think everyone should read, and we don't have to cover this, you may have

02:17:23   talked about this too, is 'One Nation, Tracked,' which was actually produced by the New

02:17:28   York Times opinion section as a reported piece. And that's the

02:17:32   one that actually had the whole thing about following people and

02:17:35   associating purchases and locations. That's by Stuart

02:17:38   Thompson and Charlie Warzel, who are also fine people. They

02:17:41   have great, great people there. So anyway, that's

02:17:44   just my correction.

02:17:45   You know, that was a good story. I did link to that, the 'One

02:17:49   Nation, Tracked' piece. We don't have to go into it in depth. That was

02:17:51   where they were talking about these shadowy companies that

02:17:53   collect location data that they obtain through means we can't really be sure of, but

02:18:00   I think it's mostly through frameworks that third-party apps include

02:18:06   in the app. That's, yeah, exactly. And then the app asks you, like, let's say it's a weather

02:18:11   app. And the weather app says, can we have your location data to show you the weather?

02:18:16   And you say yes, because that makes sense. And then they also include this framework

02:18:21   from a location-collection company, and the framework takes your data, and a unique identifier of

02:18:30   some sort, and tracks your data as you go about.

02:18:36   And then they correlate it to purchases or something else about you and they can build

02:18:40   a profile that shows, here's a person who bounces between Brooklyn and Manhattan, and the hours

02:18:46   clearly show that you live in Brooklyn and you work in lower Manhattan. And here's

02:18:52   where you go, and all this creepy stuff. Here's somebody going to a building that isn't

02:18:56   associated with the CIA, right? But they seem to be traveling overseas, and then going between,

02:19:01   you know, they track police officers, law enforcement, and they identified a celebrity

02:19:07   based on a date they found like, like a concert where they're, you know, a date and a time

02:19:12   and a location of the concert, and a phone that was going to a hotel. And then, I forget who the artist was, but they contacted her, you know, and asked for her permission, and she gave them an on-the-record interview and said, yeah, that's definitely me, that was where I was staying, that was my concert. That's creepy AF.
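The home-and-work inference described here is mechanically simple once you have timestamped pings per device identifier. A hedged sketch with fabricated data, and no claim that this matches any vendor's actual algorithm; the hour cutoffs and place names are invented:

```python
# Hypothetical sketch: the place a device sits during overnight hours is a
# good guess for "home"; during business hours, for "work". Data is made up.
from collections import Counter

def infer_home_work(pings):
    """pings: list of (hour_of_day, place) tuples for one device id."""
    night = Counter(place for hour, place in pings if hour >= 22 or hour < 6)
    day = Counter(place for hour, place in pings if 9 <= hour < 17)
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

pings = [(h, "Brooklyn") for h in (23, 0, 1, 2, 5)] + \
        [(h, "Lower Manhattan") for h in (10, 11, 13, 15)] + \
        [(19, "Midtown")]  # an evening outing

print(infer_home_work(pings))  # → ('Brooklyn', 'Lower Manhattan')
```

The celebrity case in the story is the same idea run in reverse: one known time-and-place (the concert) is enough to pick a single device track out of the dataset.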

02:19:32   So Apple put that new feature in, and for location,

02:19:36   it used to be that an app could request permanent permission within the app. Right now

02:19:41   you cannot, and this is obviously the reason. I wondered why Apple had tightened the screws, and apparently all the ad networks and

02:19:49   advertising companies and software companies are

02:19:53   peeved about this, because it reduces their ability to do targeted ads and make nearly as much money, I gather.

02:19:58   But this is exactly why. So, you know, why does a weather app need to know my location all the time?

02:20:03   I'll let it use it now, but you have to go into

02:20:05   privacy settings in iOS and iPadOS

02:20:08   to change that, to make it permanent. People get indignant when they were able to make money doing X

02:20:16   and then they're no longer able to make money doing X. They get peeved, and not just peeved,

02:20:24   Because you know you get peeved if you made money doing X for any reason and no longer can make the money

02:20:29   But they get indignant about it and feel as though

02:20:32   They have been wronged

02:20:35   In a way that if X had never been allowed they wouldn't feel

02:20:40   wronged and so I wasn't gonna go off on this topic, but

02:20:45   There was congressional testimony this week, and the people at Tile

02:20:52   testified against Apple. It was a congressional investigation into whether the major,

02:20:58   you know, I guess probably the big five:

02:21:00   Facebook, Amazon, Apple, Google...

02:21:03   Who am I missing?

02:21:06   Microsoft.

02:21:09   ...whether they are abusing their

02:21:11   size in anti-competitive ways. And Tile testified that Apple had

02:21:20   anti-competitively moved against them by

02:21:22   including the Find My feature in iPhones, because now it's building in the ability to find your lost phones.

02:21:30   So you don't need to use a Tile product to do it. Which, to me, seems like a nonsense argument.

02:21:36   Well, not nonsense,

02:21:38   but it just seems like barking up the wrong tree. Like, that's a great feature, and you really can't argue that you should have to

02:21:43   buy a third-party physical object to get it, to be fair. And they're already

02:21:50   preemptively complaining about Apple's supposed Tile-like tracker

02:21:54   dinguses that have been rumored for a year but haven't even been announced, and yet they're already complaining to Congress that, you know,

02:22:03   It's anti-competitive for them to build it into the system

02:22:06   and you know and

02:22:09   complaints from other people, specifically about the thing you just said: that iOS 13

02:22:15   removed the ability for an app to ask you to grant it always-on

02:22:20   location tracking, even though the feature is still there. You just have to go to the Settings app,

02:22:26   Privacy, Location, and then find the app, and there, if the app does want always-on

02:22:34   location tracking,

02:22:36   that's where you can grant it. And I do think that the app is allowed to have a shortcut that jumps you there.

02:22:43   That's for sure, because I know I've tapped those and it lands right in the right spot.

02:22:47   But you do have to jump out of the app into the Settings app to do it.

02:22:51   And so, the fact that it's more than just giving you

02:22:56   instructions for how to get four levels deep into the settings, or three levels,

02:23:01   whatever it is, into the Settings app to do it, the fact that they let you

02:23:05   tap a button that jumps you there to grant it,

02:23:08   I think is a reasonable compromise, in my opinion, for what has turned out to be a very severely

02:23:16   privacy-invasive feature.

02:23:18   And then the other thing that I think that they are very annoyed about is the new feature in iOS

02:23:24   I don't think they mentioned it because I don't think there's any way

02:23:27   I don't think there's any way they can mention it without them looking bad

02:23:31   Like like the angle that people always take against Apple and sometimes it's true, right?

02:23:36   Like, there absolutely are good anti-competitive arguments against Apple, and a lot of them. I think the biggest is the

02:23:43   insistence upon taking 30 or 15 percent of App Store

02:23:49   transactions, and that an app like an e-book reader...

02:23:53   I mean, you know, pity poor Amazon with Kindle.

02:23:57   But like the fact that the Kindle app can't just have a button that jumps you to

02:24:02   Safari to buy your Kindle books in Safari... you can't even have that button in your app. That, to me, is

02:24:09   anti-competitive, and I really think that that's the sort of thing the government could really, you know,

02:24:13   make some change on, force Apple to bend on, because I don't think it's right.

02:24:17   I think it is anti-competitive. And you can't even mention it. You can't even have a thing in your app that says, go to, oh,

02:24:24   you know, my content store. Or, like, you know, again,

02:24:29   Netflix is a huge company with billions in revenue, but Netflix can't even tell you, go to Netflix.com

02:24:34   to sign up for Netflix, let alone making it a hotlink.

02:24:38   I uploaded an e-book to the Apple bookstore a few years ago: The Magazine, the one I ran, the magazine I bought from

02:24:45   Marco. I did a complete archive of the entire run of it,

02:24:49   uploaded it as a book. Apple's fine selling it, you know, not that many copies,

02:24:53   but I just wanted to make sure it was out there. And then I get this thing,

02:24:56   it's like, we have rejected your book and you need to make this change. And the change was:

02:24:59   there was one embedded link to an Amazon item in a 1,200-page e-book.

02:25:07   That's crazy. So I think that's anti-competitive. So I'm not saying Apple is, like... I'm not some kind of blind Apple loyalist

02:25:15   saying Apple has never abused its competitive position. I think the App Store and that 30 percent, 15 percent

02:25:23   cut, the "we get some kind of money from everything that goes through and you can't even link to stuff"

02:25:27   thing, that's something. But this always-on location thing is clearly a move that they've done for

02:25:33   privacy. It is not a power move to sell more of their own location-tracking tiles. But I

02:25:41   think the other change in iOS 13 that these people are really peeved about is the periodic reminders

02:25:47   for apps that are using your location in the background.

02:25:49   I love that.

02:25:51   It is it is super great.

02:25:56   The map and everything.

02:25:57   Yeah, and it's a map and everything it shows you.

02:26:00   So it shows you... so one app that I am happy to grant always-on location access to is Dark Sky,

02:26:08   the weather app that gives you warnings for upcoming precipitation or other weather events.

02:26:16   You get it, you know, every week or so. I think it's somewhat random, or the algorithm behind it is such that you can't just time it.

02:26:23   It's not like every Tuesday at 3 o'clock I get this reminder.

02:26:26   But you know every week or two I get a reminder that says

02:26:30   Hey dark sky has been using your location in the background

02:26:34   Nine times over the last

02:26:38   whatever period of time, and then the map shows with little markers where they are.

02:26:44   And of course every single one of them is at my house, because I don't go anywhere.

02:26:48   But there it is popping up, saying, come on. Right, exactly.

02:26:52   But I love that but the people who are you know using your location data

02:26:57   apps that might, you know, who knows... like games or something, some free game, and it just says, hey,

02:27:04   we want to use your location data for something, and maybe you OK'd it.

02:27:07   And then all of a sudden you find out that this weird game that you haven't played in three months has been using your location

02:27:12   in the background, do you want to stop it?

02:27:14   And you don't even have to go to the app.

02:27:16   You just say, oh yeah, I don't want that.

02:27:17   And you click a button and it goes away.

02:27:19   - This is great.

02:27:20   It's not, it'd be difficult to argue

02:27:23   this is anti-competitive unless Apple were collecting

02:27:25   and selling the information in the same way.

02:27:28   And because of its privacy stance, you know,

02:27:30   and sometimes it slips up and sometimes they have

02:27:32   to apologize and sometimes they have to change the software.

02:27:35   Sometimes they don't realize the consequences,

02:27:37   but Apple is not enforcing this.

02:27:39   I mean, Google and some others might claim Apple

02:27:42   has put this in place to hobble the advertising market to force app developers to route more

02:27:50   money through with like in-app purchases or higher app prices that Apple gets a cut of.

02:27:55   There is an argument for that, but you can take two different apps that have the same

02:27:59   abilities and functions and one is doing horrible tracking that you have to block and the other

02:28:03   is not. It's just doing normal advertising that is not full of horrible trackers. And

02:28:08   those two exist side by side and Apple isn't doing anything preferential between them if

02:28:12   they both meet the rules. What it's saying is these random apps can't track you all the

02:28:16   time in the background without explicit user intervention and it's not trying to make money

02:28:22   off that itself. That would be anti-competitive. Did you also see, though, the Bluetooth one?

02:28:27   That's the one that got me. Do you get the Bluetooth pop-up? "This app would like to use your Bluetooth

02:28:31   settings," or Bluetooth.

02:28:33   I did for a little bit, but I don't think I have many apps that are trying to do it.

02:28:38   I've seen it.

02:28:39   - There were some weird ones.

02:28:39   CNN app and some other apps.

02:28:42   And I was like, I don't know why these would need,

02:28:44   there's apparently there's an innocent explanation about,

02:28:47   oh, I don't know.

02:28:48   There's something,

02:28:49   it's not actually trying to communicate with your devices,

02:28:51   but it has to use some library or whatever.

02:28:53   But I, yeah, when I first fired up,

02:28:56   iOS 13, I think I had like 20 different apps

02:28:59   asking for Bluetooth access,

02:29:01   and none of them except one podcast app needed it.

02:29:04   - I've heard that some of that

02:29:05   was for location tracking too though.

02:29:07   So like when you'd enter like a retail building or something like that, they might, yeah,

02:29:11   yeah, they might have the iBeacons.

02:29:13   And then if the app is still running in the background, because you just checked CNN, it would check in. You know, I really do think... I don't think Apple added that idly, you know, just for kicks.

02:29:25   No, I think they were seeing, yeah, I think there was a standard library for iBeacon and similar things that was being used broadly.

02:29:32   Although iBeacon, I don't know how widely deployed that was, but I wouldn't be surprised if retailers

02:29:37   have Bluetooth beacons that are not for information but for tracking, and you get tracked through that mechanism to provide location

02:29:44   information but yeah

02:29:46   So I don't want to run the CNN app and have it

02:29:48   Be doing Bluetooth tracking of me because they happen to include that library if that's what was going on

02:29:53   It's absolutely shocking to me Glenn that you're on the show and we're running long

02:29:57   But I got to take a break here and thank our third and final sponsor and I have one more major topic

02:30:02   I wanted to talk about. But first, I want to thank our friends at Squarespace. Hey,

02:30:09   it's still New Year. I know Larry David thinks it's too late to say Happy New Year, but I

02:30:15   still say Happy New Year to you. And New Year is when people naturally it's just human instinct.

02:30:22   You set new goals, you make resolutions, might be a time to start a new business, start a

02:30:27   new side gig, launch a new creative project. Well, anything that needs a

02:30:33   website for your new project, your new job, your new business, whatever it is

02:30:37   you're doing, start it at Squarespace. That's my suggestion. Make your next move

02:30:42   there. Go to Squarespace. You get a free trial that lasts 30 days. You can build your

02:30:47   entire website in their WYSIWYG tools right there in the browser. Choose from

02:30:54   tons of award-winning, professionally designed templates that scale from

02:30:59   mobile to desktop in all sorts of styles. You can customize them completely with

02:31:07   your own brand, your own style, in any way you want. You add the features to the

02:31:12   site that you want. If you need a store, add a store. It takes care of all the

02:31:17   commerce, all the security stuff for you. If you need a gallery, because what you're

02:31:21   building is some sort of portfolio for your design work of some sort, they've got that for you. You

02:31:26   want to have a blog, you want to host a podcast right on your site, they've got all of that right

02:31:31   there. And for the stuff like a blog or a podcast that gets updated regularly, you can post right

02:31:37   through the Squarespace interface as you create new entries in your site. Everything really

02:31:45   soup to nuts right there in Squarespace, including also award-winning technical support.

02:31:51   So next time you need to make a new website or somebody you know comes to you and says,

02:31:58   "Hey, can you help me make a new website?" Send them to Squarespace.

02:32:01   It's the easiest way to get started. 30 days free and with special code "Talkshow" at checkout,

02:32:10   you get 10% off. Everything is intuitive, easy to use. Start your free trial today.

02:32:15   Go to squarespace.com/talkshow, squarespace.com/talkshow,

02:32:20   and remember that code, "talkshow", 29 or 30 days later when your free trial's up,

02:32:24   and you'll get 10% off your first purchase,

02:32:26   which it could be prepaid for an entire year.

02:32:29   You get, like, an entire month free. Over a month.

02:32:33   Squarespace.com/talkshow.

02:32:35   I thank Squarespace for their continuing support of the talk show. All right.

02:32:39   My last topic of the day, and you and I were talking about this offline,

02:32:43   because we ran into it, which is that, is it my imagination or is spam filtering for email

02:32:52   getting worse? I missed emails in the last week from you where you were asking, "Hey, do you want

02:33:00   to record the show?" Which, yes, I did. Here we are. And Jason Snell, I missed all, he does the

02:33:10   annual report card, you know, the Apple annual report

02:33:13   card, which I love, and I love to participate in, because, A, I love

02:33:18   to hear the answers of everybody, but, B, it's like a

02:33:20   free column idea for Daring Fireball, by just putting my

02:33:23   entire answers in there. And of course, he sent the original

02:33:27   thing out a month ago, and I thought, I'll do it after the

02:33:30   new year. And then all of his reminder emails to me, like, "last

02:33:33   call" for those of you... You know, maybe the reason it went

02:33:36   to spam was because it was a message that wasn't just to John Gruber, "Hey Gruber,

02:33:43   you know, what the fuck, send it in." It was like to, you know, the BCC recipient lists

02:33:49   of everybody he had asked to participate who hadn't yet submitted an answer. I guess that's

02:33:55   why, but there is absolutely no way that emails from you, from an email address you've been

02:34:02   using as long as I've known you and Jason Snell using an email address that

02:34:07   I've known for I think 20 years maybe more to my main email address that I've

02:34:15   been using since I registered the daringfireball.net domain in 2002. Like, this...

02:34:20   it's not like they're going from or to new email addresses and it's not like

02:34:26   the contents of any of them I mean again the one thing about Jason's is that it

02:34:30   it was, like, to unknown recipients, but that really shouldn't have sent it to spam. And yours,

02:34:35   I have no idea.

02:34:36   No, zero.

02:34:37   I did ask if you wanted to buy some Cialis, too.

02:34:39   No, you did not.

02:34:41   I have noticed this recently, but for a while and my email currently and for years now has

02:34:47   been backed by Gmail.

02:34:49   I don't use a gmail.com address as my address, but my main email accounts are

02:34:54   backed by separate Gmail accounts.

02:34:56   The main reason I did was for spam filtering

02:35:00   which I had found originally was really excellent, so excellent that I had fallen out of the habit of eyeballing

02:35:07   my spam mailbox.

02:35:09   But recently I have found that I absolutely have to do the old-fashioned thing and visually scan it.

02:35:18   Like I told you, it's, you know, at least

02:35:23   2,000 emails a month total across all my addresses combined, and I have to sit there and visually scan it, and things jump

02:35:31   out. I have missed, in addition to ones from you and Jason,

02:35:34   sponsorship inquiries, you know, from companies that want to sponsor

02:35:39   Daring Fireball, you know, actual business opportunities.

02:35:43   Callback to the start of this episode: the Jeopardy! audition notice was in my spam filter. I almost deleted it.

02:35:50   I was like, oh, spam from somebody claiming to be Jeopardy! I'm like, oh my god, wait. And if I had deleted that...

02:35:55   Hey, that would have been a life experience and thirty-something thousand dollars lost. So...

02:36:00   Yeah, I don't know if it's gotten worse now, that's a good question

02:36:04   I don't know how you measure that but you would think after all these years that Google would have

02:36:08   automatically whitelisted senders you'd corresponded with, you know, where you hadn't marked it as spam for that long.

02:36:15   It's the same addresses,

02:36:16   even if they're caching it, as they should be, in your own account.
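The automatic-whitelisting idea John is describing, where mail from any address you've previously written to bypasses the spam filter, is simple enough to sketch. This is a toy illustration under that assumption, not how Gmail's actual (non-public) logic works:

```python
# Toy sketch of correspondent whitelisting: any sender you've previously
# written to skips spam filtering. Illustrative only; Gmail's real
# behavior is not public.

def build_allowlist(outgoing_recipients):
    """Collect normalized addresses from your own sent mail."""
    return {addr.strip().lower() for addr in outgoing_recipients}

def bypasses_spam_filter(sender, allowlist):
    """True if the sender is someone you've written to before."""
    return sender.strip().lower() in allowlist
```

Normalizing case and whitespace matters here: the same correspondent can show up as `Glenn@...` in one message and `glenn@...` in another, and both should count as known.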

02:36:19   I use Fastmail, which I've used for many, many years,

02:36:23   to back up my accounts, and they have a pretty aggressive anti-spam strategy

02:36:29   in terms of dealing with their own customers, and they seem to operate really good filters.

02:36:34   Like, I don't get nearly as much spam as you do, and they have SpamAssassin behind that,

02:36:39   and I put some rules in place for the most egregious stuff.

02:36:42   But even then, you know, some stuff winds up in their junk

02:36:44   filters, some stuff winds up in... I use SpamSieve on my Mac. But what I find is that Google in

02:36:52   particular, Gmail at periods of time is more aggressive. So and I don't know if it's just me,

02:36:58   it seems like it is often. But there's been points where like for six months, every email just about

02:37:03   I sent to someone on Gmail, I had to follow up with, like, a text because it went straight to their

02:37:08   spam filter. And then it stopped. And then it was fine for long periods of time. And just recently,

02:37:13   Yours is not the only message I've sent to someone using a gmail account where I think it wound up in

02:37:18   in the spam filter

02:37:21   And I find that there are some that I correct and the correction doesn't stick.

02:37:26   So it should be whitelisting. Why isn't it listening to your whitelist?

02:37:31   I don't know. I always was under the impression that all you have to do to tell Gmail that

02:37:35   Like, I'm using the Apple Mail app, not Gmail's app,

02:37:40   but that if you move a message from junk to your inbox, it's saying this is not junk,

02:37:46   because I don't know how else they could interpret that. But, like, Dave Winer started sending out

02:37:52   daily summaries of his Scripting News,

02:37:57   and I guess I should unsubscribe, because I follow Scripting News in RSS and that's where I would naturally read it.

02:38:04   But I'm just I'm just intrigued by how he's using an email newsletter

02:38:09   It's just, like, a recurring theme for the last four months, and I'm not gonna say I'm on the cusp of

02:38:15   doing a Daring Fireball email newsletter,

02:38:18   but I've noted the resurgence in email newsletters,

02:38:23   and I talked to Dan Frommer about it, and his main thing is a newsletter, and

02:38:27   Ben Thompson's main thing is the newsletter, The Daily Update at Stratechery.

02:38:32   Paid and free, there's a resurgence of interest in newsletters.

02:38:38   And one of my reasons... you know, I've been saying this over and over again:

02:38:43   I have this theory that the reason people really like reading email newsletters is

02:38:47   You click on the newsletter and you read it and you scroll down and you're done, and nobody's popping up anything that

02:38:54   covers the email and says, will you subscribe to our newsletter, or will you subscribe to this, or we're collecting cookies,

02:39:03   okay or no? I mean, I have told the goddamn Guardian that I'm okay with their cookies a

02:39:09   thousand freaking times. Every time I go to their website,

02:39:12   I have to click three things on most websites just to read the goddamn article. The fact that I can just click on an email

02:39:19   and there it is,

02:39:21   unobstructed, and I scroll down and then I'm done, and that's it.

02:39:24   I really think that's part of the resurgence

02:39:26   But anyway, so I want to keep getting Dave's daily updates and they keep going to junk

02:39:31   I don't know what to do to make it stop. No, I know this from, you know... For many years,

02:39:36   some weird little software I wrote powered the TidBITS mailing list, which has tens of thousands of people on it.

02:39:42   And as you know, TidBITS is the longest continuously

02:39:46   distributed

02:39:48   email newsletter on the internet, we're pretty sure. Like, it seems like there was one that might have been longer, and it's gone.

02:39:54   So they've got a lot of experience dealing with spam filtering. Adam is very experienced at understanding what makes things go wrong,

02:40:00   and now, at long last, is using a service, so he's not running it himself. But I had

02:40:06   written this thing that

02:40:08   did some

02:40:09   sophisticated, or so I thought, bounce processing, and

02:40:13   it seemed to not trigger spam filters, and TidBITS ran off that for, like, 10 years,

02:40:17   I think and even then it would just be a mystery like one week

02:40:20   We would do nothing differently, and, like, all AOL users would be

02:40:25   unsubscribed or blocked, or we'd get a thousand bounces from one outfit. And I'm still seeing it.

02:40:30   So I use Fastmail, as I said,

02:40:32   and I get TidBITS, and half the time it winds up being categorized as sort of light spam. So I see it.

02:40:39   It's in my spam folder; it's not deleted with a high score, which I've got set. And I'm thinking, how is TidBITS...?

02:40:45   I mean, I've received a

02:40:47   thousand issues of TidBITS,

02:40:49   most of those at my Fastmail address, or at Fastmail's servers, and it's absolutely compliant. And Adam sends out huge amounts of email

02:40:56   every week so he knows and

02:40:59   It's a mystery. I was telling you in our show notes here

02:41:04   our pre-show discussion: I think it was the late '90s that I wrote an article saying

02:41:09   I'm really concerned about email balkanization because spam filters and other things might prevent people from sending email from one place to another

02:41:17   I'm like, "Well, it's still going on. No one's figured out a way."

02:41:20   I mean, there were all those proposals, some of which are used,

02:41:24   that do different ways of validating domains and senders

02:41:28   and some of it's public key-based and some of it's in DNS.

02:41:32   And this was supposed to be a way to reduce spam by having some kind of verified path.

02:41:36   And even if spammers used it, the people who were verified would be able to

02:41:41   have an identity that was cryptographically proven or matched.

02:41:45   So you could say, well, this domain is cool, because we can confirm all this email actually was issued by this domain, because they've got the private key, and we'll whitelist this whole domain.

02:41:54   So a spammer might use their own domain and have their own private key, and we know it's verifiably from the spammer, but they can't fake being from this verified domain.

02:42:04   And, you know, email is still so disparate. There's no centralized anything or authority, which is great, because then you don't get advertising overlays and cookie requests and all that. But it also means that all these proposals to improve the way in which, like, ham can be identified as ham have not matured, this many years into it.

02:42:29   Right. And so much of mine, what makes going through it

02:42:33   manually and just visually scanning it so tedious, is that

02:42:38   at least, I would guess, 98% of it is so obviously spam, you know,

02:42:46   like, any human being with 100% certainty would say this is spam

02:42:51   without ever opening it, just looking at the sender and

02:42:55   the subject. I mean, literally, I'm looking at it right now.

02:42:59   Here's one from ZZY 712712712. The next one's from Zing Z66IXHLVBM@iekh. I mean, I don't even know how they think it's gonna work. I don't know who in the world would open this even if they didn't have any.

02:43:20   Why has machine learning not solved this? These are all readily identifiable.

02:43:23   Five straight ones that are all just random, random strings of letters, and then the next one is from Late Night Peeing.

02:43:30   I'm not making this up.

02:43:34   That's not my favorite Late Night show.

02:43:35   No, it's America's least favorite Late Night show.

02:43:39   That's not Conan.

02:43:41   Late Night Peeing.

02:43:43   It's below Conan's.

02:43:44   Yeah, it's...

02:43:46   It's like...

02:43:48   I

02:43:50   don't know what to do about it. I don't know, but I would opt into some sort of system.

02:43:57   I don't know, it's too late. You know what I mean? It's like...

02:43:59   It's email. Like, in some ways, like I said, email is having a resurgence in terms of newsletters,

02:44:08   but at a fundamental level, like, the initial optimism of the early internet, of everything being open and

02:44:16   anybody... yeah, you know, email has just suffered under the weight of that forever.

02:44:22   You know that anybody with an SMTP server can send email to anybody with an email address and it just goes there

02:44:31   Yeah, somebody's gonna solve it, I think. But I mean, these are the trade-offs, right?

02:44:38   There are always extremes. So, you know, Facebook is unreliable:

02:44:42   You can't think that anything you post there gets delivered to people unless you pay for you know

02:44:48   boosting it, and even then you don't know. So you're gonna have

02:44:51   200,000 people liking your page and only 500 of them see a post. Right, that kind of thing goes on. And

02:44:56   email is great because there's no intermediation, except that the spam is so bad that it mixes your ham and your spam

02:45:02   and you're not getting it. Like, these are our extremes.

02:45:05   But I do think that... I don't think we have seen well enough applied

02:45:11   machine learning and training,

02:45:13   because

02:45:15   so much spam is obvious, as you say. I do not get why

02:45:19   this is not more identifiable, and I expect that at some point there will be a breakthrough.

02:45:25   I mean, remember how, before deep learning came in as the most effective technique for

02:45:30   artificial intelligence a few years ago, and made voice recognition, image recognition, and a lot of other things

02:45:37   so much better? Like, you know,

02:45:41   20, 25 percent better got us a lot closer to the hundred percent score. Before deep learning,

02:45:47   remember how terrible everything was? All the recognition was statistically based, all these things, and, you know,

02:45:53   you'd speak and it would give you... "egg freckles" is the, yeah, the Newton joke for handwriting recognition, or whatever.

02:45:59   Well, so deep learning came in and it made a huge difference

02:46:01   But I don't feel like email has made that leap. And I know that at Google there was machine learning...

02:46:06   There was the leap, there was the leap with Bayesian filtering, whatever. Yeah.

02:46:10   But that was really the breakthrough. It was like, holy shit, this works.

02:46:13   Yes, it's trained; it's statistical analysis using certain kinds of, like, measures of frequency. And then what happens is the spam makers

02:46:21   would run their email through the same Bayesian filters and tweak them until they found ones that got lower scores, because they introduced...

02:46:27   You know, that's why you see part of Moby Dick at the bottom of email messages.
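The Bayesian technique they're talking about, scoring each word by how much more often it appears in spam than in ham and then combining the per-word probabilities, can be sketched like this. It's a toy in the spirit of that era's filters, not SpamSieve's or Gmail's actual code:

```python
# Toy Bayesian spam scorer in the spirit of the early-2000s filters
# being described (illustrative; not any real product's implementation).
from collections import Counter

def train(messages):
    """messages: list of (text, is_spam) pairs. Returns per-word spam probabilities."""
    spam_words, ham_words = Counter(), Counter()
    n_spam = n_ham = 0
    for text, is_spam in messages:
        words = set(text.lower().split())
        if is_spam:
            spam_words.update(words)
            n_spam += 1
        else:
            ham_words.update(words)
            n_ham += 1
    probs = {}
    for w in set(spam_words) | set(ham_words):
        p_spam = spam_words[w] / max(n_spam, 1)  # fraction of spam containing w
        p_ham = ham_words[w] / max(n_ham, 1)     # fraction of ham containing w
        probs[w] = p_spam / (p_spam + p_ham)
    return probs

def score(text, probs):
    """Combine per-word probabilities into a single 0..1 spam score."""
    p_spam = p_ham = 1.0
    for w in set(text.lower().split()):
        p = probs.get(w, 0.5)  # unseen words are neutral
        p_spam *= p
        p_ham *= 1.0 - p
    return p_spam / (p_spam + p_ham) if (p_spam + p_ham) else 0.5
```

The Moby Dick trick they mention works against exactly this kind of scorer: padding a message with words common in legitimate mail drags the combined probability back down.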

02:46:30   Yeah, right, and so forth. And I remember, you know, I was working at Bare Bones Software, you know, recent guest Rich Siegel's company.

02:46:38   Which was a great episode. I don't know if you listened, but man, that was one of my favorites.

02:46:41   But I remember when I was there, Mailsmith was still, you know, an active,

02:46:47   successful product. That is still my email client. Is it really? See, the one thing I miss, and man,

02:46:54   do I miss this: using Mailsmith with SpamSieve, you'd get a score,

02:47:00   And then you could sort your spam mailbox by the score and it made visually double-checking it so much easier because

02:47:07   all of the actual ham

02:47:09   that got incorrectly flagged as spam was at the top, and all the obvious stuff was at the bottom.

02:47:14   And I can't figure out any way in

02:47:17   Apple Mail to get anything similar to that, and I don't know what to do about it.

02:47:21   but anyway, but I remember before Bayesian filtering we had like a

02:47:25   like, it was a tech support thing, a

02:47:31   collection of filters, just manually set up: like, if the subject matches this pattern,

02:47:37   or if the sender... you know, a couple of regex patterns based on the headers.

02:47:43   And for a while it really worked well. You could just download this filter and

02:47:49   add it to Mailsmith, and all of a sudden,

02:47:52   you know, your spam problems were mostly solved.

02:47:55   And it was effectively just, like, eight "if the message matches blank" statements.
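A filter set like that boils down to a handful of pattern matches over the message headers, something like the sketch below. The patterns are made up for illustration; they are not the actual filters Bare Bones distributed:

```python
# Toy version of a hand-built, pre-Bayesian filter set: a short list of
# regex rules applied to message headers. Patterns are invented for
# illustration and are not the real Mailsmith filter set.
import re

RULES = [
    ("subject", re.compile(r"(?i)\b(viagra|cialis)\b")),
    ("subject", re.compile(r"(?i)100% (free|guaranteed)")),
    ("from", re.compile(r"^[A-Z0-9]{10,}@")),  # long random local part
]

def matches_spam_rule(headers):
    """headers: dict like {'from': ..., 'subject': ...}. True if any rule hits."""
    return any(rx.search(headers.get(field, "")) for field, rx in RULES)
```

As the conversation notes, this approach is brittle: it works until senders learn the patterns and route around them, which is what pushed filtering toward statistical scoring.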

02:48:02   You know, eventually that fell apart. You know, they found ways around it. And

02:48:07   the Bayesian filtering was a great leap forward. But I feel like you're right. We need the next

02:48:11   one. And I guess it has to be machine learning. But it's not there. I don't know. Maybe I'm nuts.

02:48:18   Maybe it's just me. Maybe it's, you know, now that I've found a couple, I've found a couple

02:48:22   of messages recently over the last like two months in my spam and I'm panicked about it. Maybe it's

02:48:27   just me, but I'm in a Slack with some technical, you know, some friends, and I've asked

02:48:33   other, you know, similarly technically oriented long-term email users, and they've all said the

02:48:39   same thing: either they think it's getting worse or they're convinced it's getting worse.

02:48:44   Oh, the other... I would say another factor could be that spammers are getting better again.

02:48:50   They're getting better at getting through, and every time spammers get better, spam filters get worse, because they produce more false

02:48:56   positives that way. I will say this: I'm not seeing an increase of spam

02:49:02   getting into my inbox. The only problem I'm seeing is good email getting flagged as spam.

02:49:09   Oh, but that can happen: the filters could be improving, so they're not passing more through.

02:49:15   So they're still good on false negatives, but they're bad on false positives

02:49:20   Which is kind of not what you want. Like, I think you'd rather have fewer false positives and more false negatives. Not, like, a thousand

02:49:26   spams dropping in, yeah, but I'd rather get one or two spams to get a hundred percent of my...

02:49:31   Right, better that a...

02:49:36   a hundred guilty men go free than an innocent man be put in jail.
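The tradeoff they're describing, tolerating a few extra spams in the inbox (false negatives) rather than ever losing good mail to the spam folder (false positives), comes down to where the score threshold sits. A toy illustration with made-up scores and labels:

```python
# Toy illustration of the threshold tradeoff discussed above.
# Each message has a hypothetical spam score and its true label.
# Lowering the threshold catches more spam but risks misfiling good
# mail (false positives); raising it lets a little spam through
# (false negatives) but protects good mail.
MESSAGES = [
    (30, "ham"), (45, "ham"), (55, "ham"),     # good mail
    (60, "spam"), (85, "spam"), (95, "spam"),  # junk
]

def count_errors(threshold):
    """Return (false_positives, false_negatives) at a given threshold."""
    false_pos = sum(1 for score, label in MESSAGES
                    if label == "ham" and score >= threshold)  # good mail lost
    false_neg = sum(1 for score, label in MESSAGES
                    if label == "spam" and score < threshold)  # spam in inbox
    return false_pos, false_neg

print(count_errors(50))  # aggressive threshold: loses one good message
print(count_errors(58))  # conservative: clean here, but real scores overlap
```

In this contrived data a threshold exists with zero errors; real score distributions overlap, which is exactly why the speakers would rather shift errors toward false negatives.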

02:49:43   All right

02:49:45   So, Glenn, before I let you off the hook:

02:49:48   you are... people can, if they enjoy your voice on this show,

02:49:53   they can hear you, and this is an odd coincidence, and it really is,

02:50:00   but you've been guest hosting the Election Ride Home podcast. Yeah, that's right.

02:50:06   Excuse me

02:50:08   Chris Higgins was the founding host of it and he's had to step away for family reasons

02:50:13   and everything's okay, but he needs to focus on family. So I am stepping in for a bit to do this

02:50:20   podcast, because, you know, I do a bit of everything in my career. And the folks at Ride Home... Brian

02:50:29   McCullough, I was on his... he's got a great Internet History Podcast, which is great because no one else is

02:50:35   really doing the oral history of the internet comprehensively. I mean, I don't want

02:50:39   to say he's the only one, but I think he's the only one doing it at this level of detail. So he's

02:50:45   talking with early people in the internet, ecommerce companies and whatever. And so I came on a

02:50:50   couple times to to shoot the breeze about some of my early experiences and things and it's great

02:50:55   because I'm like, oh, now I've told him, it's there forever. So he's been doing the Techmeme Ride

02:50:59   Home podcast for a while. And I've substituted on that a few times when he's had vacation or

02:51:05   obligations, jury duty and the like. And so Election Ride Home is just, it's the idea is just a

02:51:11   summary. You know, it's I think, and I have this feeling too, it's very easy to get overwhelmed

02:51:16   because there's too much news. There's a year's worth of news every day now, right? About

02:51:22   everything. And so the idea is, you know, it's a 15 minute podcast about everything that happened

02:51:27   in elections today. And it's very interesting to wake up in the morning and go, okay, kind of

02:51:31   what's what's happening? You know, sometimes there'll be like the presidential debates. I was

02:51:36   covering that last week. And it's like, okay, how do I summarize two hours plus of this in a way

02:51:42   that's useful for people who didn't watch it and might be useful for people who did, but still make

02:51:47   it you know, pithy and bring in some sound bites and, and not have it be facile. Like, here's kind

02:51:51   of what you need to know, like, what is the national discussion ostensibly being today, but

02:51:56   some other times I wake up in the morning, and it's like, you know, there's new impeachment thing,

02:52:00   Pelosi sending the articles to the Senate. Some, you know,

02:52:04   Booker drops out or whatever. And so every day is a new

02:52:07   adventure and mystery. But the goal is to like, make it

02:52:11   approachable. So you put on your headphones on your commute home.

02:52:14   And in 15 minutes, you're done instead of, you know, reading a

02:52:18   thousand... you may still read a thousand stories, but I'm trying to

02:52:21   help people. I feel like it's a, it's a service to keep people

02:52:24   from having to fully immerse themselves in it every day.

02:52:27   And how long is the gig?

02:52:29   It's ongoing. I think they're looking for a permanent host for the show, you know, because I've got a lot of... you know,

02:52:35   I've got a lot of different things I'm working on.

02:52:37   Well, and this also sounds, I know it's like, oh, it's only 15 minutes as opposed to the three hours you and I have been talking right now.

02:52:42   But I can only imagine how much work it is on a daily basis to boil all of it down to 15 minutes.

02:52:49   It would actually probably be less work to do a three-hour podcast where you just ramble on about it,

02:52:54   as opposed to typing it all up.

02:52:56   - It's already a script and everything.

02:52:57   Yeah, it's already a script.

02:52:58   I mean, it's like a four to six hour cycle to do an episode.

02:53:01   And so it's really, it's exciting.

02:53:03   As somebody who's been a freelance writer for a long time,

02:53:06   I've never done daily reporting.

02:53:08   I've only done stuff for weeklies

02:53:10   or sometimes on deadline, overnight or something.

02:53:13   But I did a gig for Fortune from mid 2018 to early 2019

02:53:18   before they got sold to a mysterious Thai billionaire.

02:53:21   No kidding.

02:53:21   No kidding.

02:53:23   They just rebooted with a new design and they, it's a great publication, some great people, great editors I was working with there.

02:53:29   But they were running a kind of daily newsroom with shifts.

02:53:33   And so I was writing kind of breaking news for seven months for a few hours every day.

02:53:38   And that was fascinating to me and a little overwhelming because my pace is, you know, I'm often, I've been describing myself as breaking news from the 19th century because I'm always like, hey, you know, I just discovered some printers were poisoned in printing this newspaper in 1838.

02:53:52   I must write this story. So the Election Ride Home is a great way to stretch my muscles. And it's fun. I feel like it's a little bit of a service thing, giving people, you know, that snippet, because I think a lot of the political podcasts tend to be discussion, and longer, so you have to commit yourself. And there's some of them I really like, like Lovett or Leave It, I'll listen to every week, and a few others. But anyway, it's a very specific remit for this podcast.

02:54:19   I will... I have to disclaim it, but it's just an odd coincidence: the sponsor of the previous episode of this show,

02:54:26   last week's episode with Merlin Mann, was Techmeme Ride Home. So, just another Ride Home podcast.

02:54:32   I actually purposefully juggled the sponsorship schedule so that we wouldn't have

02:54:37   Techmeme Ride Home sponsoring the episode where you, Glenn, talk about Election Ride Home.

02:54:43   But I mean, they sponsored last week's episode.

02:54:46   They're not sponsoring this week's episode. But the Ride Home podcasts are really, really tight,

02:54:50   and it's like your own little private NPR for your area of interest.

02:54:55   So if you would rather hear about election news on your ride home than

02:54:59   tech news on your ride home, Election Ride Home is a great thing.

02:55:03   And if your ride home is 30 minutes, you could listen to both.

02:55:06   There's a celebrity one also. They, you know, they're

02:55:09   not public news; they raised a little money because they want to do a bunch of these.

02:55:15   You know, there are long podcasts, like this one, that get into things, and there are short ones.

02:55:20   I think the news one is interesting, because I often don't listen to

02:55:23   news podcasts, because I prefer to read it.

02:55:27   If, you know, in some areas... like, I would love a security news podcast.

02:55:33   There's just not enough interest. It would just be whatever security news happened that day. Just boil it down for me,

02:55:38   then I'll go and drill in. Like an index for my ears.

02:55:43   The other thing I know you wanted to mention was the Tiny Type Museum.

02:55:47   I did. I want to thank...

02:55:49   I want to thank Daring Fireball listeners, because many of you... well,

02:55:53   I shouldn't say... in relative proportions, not many of you,

02:55:55   but some of you have been great supporters of the project, and we're gonna ship very soon.

02:56:00   I know, this is unheard of: a Kickstarter that ships more or less on time.

02:56:05   But I've been thanking past Glenn. So we did the Kickstarter in February last year.

02:56:11   Anna Robinson and I... she has been making the wooden cases, handmade wooden cases, for the type artifacts that go into it, and

02:56:18   she is nearing, like, the very final stages of production for that. And my basement is full... like, the poor

02:56:26   letter carrier.

02:56:28   Describe it in one line, you know. So, the Tiny Type Museum and Time Capsule:

02:56:31   it's a collection of printing artifacts, like wood type and molds from which type is made, and little pieces of phototype.

02:56:37   And I've got a CD-ROM... P22, the type foundry, it's been around for 20-plus years, they sold me, very inexpensively,

02:56:44   CD-ROMs, so every one is gonna have a 1990s-era

02:56:48   CD-ROM with a personal license to use the font that's on the CD-ROM, if you can read it.

02:56:53   So it's been a hoot. I did a book along with it as well,

02:56:57   and it's being printed... this one's being printed in England right now.

02:57:00   And so that's the one you mentioned... it is actually being printed in letterpress?

02:57:06   Yeah, like, it was set in hot metal in

02:57:10   North Yorkshire, where the printer I work with has a

02:57:15   former employee who has all the typesetting equipment now, and it's being printed in London,

02:57:20   London proper, and then it'll be bound in Germany. And actually, when the Brexit situation was happening, if it had crashed out in October,

02:57:28   I was really worried about how my

02:57:30   printed pages were gonna get to Germany for binding. You might have had to smuggle them out.

02:57:35   Well, are you talking, like, could I fly there? And imagine...

02:57:37   imagine getting busted for smuggling something as

02:57:41   esoteric as... "international book smuggler." So, the postman... this poor letter carrier, post lady...

02:57:49   it's a postal carrier, rather. We have a variety of postal carriers, and they're delivering...

02:57:53   I'm buying lead,

02:57:55   brass,

02:57:57   bronze...

02:58:00   Lightweight stuff! It's like a joke. It's like... the box is one of those tiny USPS ones.

02:58:04   Like, everybody in the United States who deals in this stuff ships with Priority Mail, because it's almost unlimited weight for the size.

02:58:12   So you can see the guy go up the stairs to hand it off, like, oh, I'm sorry,

02:58:15   it's, like, 40 pounds of lead she just gave me. So my basement is full of all the artifacts.

02:58:21   They're gonna get sorted into sets. But...

02:58:23   People can still get in, right? There are... Yes.

02:58:27   The edition is about a hundred, and over 80 of them are now

02:58:31   pre-ordered or sold in the Kickstarter. Yeah, the Tiny Type Museum.

02:58:35   Basically, if you even know how cool it is that Glenn is having the book

02:58:41   physically letterpressed... if you even know what that means,

02:58:44   you should absolutely go check out tinytypemuseum.com. And if not, well, that's probably not for you.

02:58:49   It's been fun, though, because I've had all these great conversations.

02:58:52   There's a lot of older folks, particularly, usually men, who worked in the printing industry or had grandparents who worked in the industry.

02:58:59   And I have these great conversations with people like, "Oh my God, my father used to be a typesetter or someone telling me..."

02:59:05   I have this great... you know, Chris Phin, who used to write for Macworld, who now is at...

02:59:10   DK... DC Thomson? I'm forgetting the name.

02:59:15   It's a major media company in the UK.

02:59:18   And you can still see... they still have a building on Fleet Street.

02:59:21   They're, like, the only... they don't have reporters there. But when you walk on Fleet Street in London, the home of newspapers in London, where there are no newspapers left, they still have DC Thomson. Anyway, Chris Phin recorded a short podcast for me with a colleague of his who was just old enough to have worked on old letterpress, when newspapers were printed by letterpress, and had this story about how they put the breaking news in with this special little thing you would stick into the running

02:59:51   press, with just a little bit of copy that said, "Extra, extra,

02:59:54   breaking news." He... it's so nice... recorded a short podcast,

03:00:00   put it on YouTube, about that. So anyway, it's been great just

03:00:03   talking to folks about this people have fond memories. And

03:00:05   anyway, I appreciate everyone's support on it. It's been a

03:00:08   great year... 2019 was incredible to work on this. So

03:00:11   thank you all.

03:00:11   All right, my thanks to Squarespace for sponsoring this

03:00:15   show. Go to squarespace.com slash talk show, start your own

03:00:17   website. Clear, the absolute best way to get through airport security. And the

03:00:26   first sponsor of the episode, who I can't remember... Hover! That sound... it's coming

03:00:34   to me, Glenn, it's coming to me... Hover. Hover.com slash talk show, where you can

03:00:40   go and register a domain name and get a 10% discount with that link. Thank you.

03:00:44   Glenn, you've been extraordinarily generous with your time and your personal insight into

03:00:49   Jeopardy. What a great show. Thank you very much.

03:00:52   Always a pleasure, John. Thank you very much.

03:00:55   [ Silence ]