243: ‘The God Awful Truth’ With Rich Mogull
00:00:00
- Alright, first-time guest on the show: Rich Mogull. Welcome to the talk show.
- Thanks for having me.
- I famously sometimes don't do a lot of prep for the show, but I do keep a running list of topics, and in the time since my last episode there has been a very consistent theme among the topics that are here to talk about. They relate to computer security and privacy, which are not necessarily two sides of the same coin, but are certainly related. I've been thinking about having you on the show for a long time, and I thought, well, for God's sake, with a big list of security-related topics, why not have Rich on the show now?
- Thanks. Yeah, I mean, I'm excited to be here.
- So for people who aren't familiar with you, what's your background?
- Alcoholism. Taking my kids rock climbing.
- Not at the same time, okay.
- Yeah, not at the same time. So I've been doing security for like 20 years at this point. I was actually an analyst over at Gartner; that's when I first started getting involved with Apple stuff. I spun out and started my own company about 11 years ago, and it's done pretty darn well. I'm doing security advisory services and such, and I also have a new startup that's doing security stuff over in the cloud. So security's been my bread and butter for a while now; it pays the bills.
- And it never, never gets old. It's a growth industry.
- So it's like plastics.
- It is. You know, I always say it's kind of an interesting exercise: what are the fields where you've got a bright future, or at least opportunities ahead? And I would say computer security is one of those. There's all sorts of fields, and it's not funny when industries get disrupted and people lose jobs, et cetera, but security is one that's good to go.
- It's weird to see. When I started, like 15, 20 years ago, it was kind of a small thing. I mean, even for the Apple stuff, we were almost screaming into the void. And now there's things like Russia and China and cars and rockets. So it's been a pretty wild ride.
- Yeah. I don't remember talking about the Russians 20 years ago. I mean, I remember talking about them, but not as it relates to personal security.
00:02:33
- See, we knew: there was the movie Red Dawn. If you watch that, it was a precursor to everything.
- I haven't watched Red Dawn since... I think Red Dawn came out when I was exactly the age of its protagonists. I'm going to guess, without looking at IMDB, that it came out in 1984.
- Probably because, and I think you and I are about the same age, it resonated.
- Let's see how close I am. Let's see. Red Dawn...
- Damn. That's impressive.
- Why in the world? I was just talking to a friend about this. I can remember very specifically weird things from the eighties. I could tell you exactly what year certain World Series were. We were talking about Super Bowls, and I remember that 1984 was the season where Dan Marino broke all sorts of passing records. Why in the world? I don't remember it vaguely as the mid-eighties, but specifically as 1984. What a waste of brain cells. Whereas stuff that happened two, three years ago, I'm like, I don't know, I've got to go look that up. Like, when did the iPhone 8 come out? And I've got to sit there and count on my fingers what year it was.
- Yeah. I can't remember two days ago, but I remember my exact emotional state when I saw the very first preview for the Michael Keaton Batman. It was really exciting. And going back to the eighties: I remember riding my moped around town when I was 16 or 17, after watching Top Gun. Man, I was right there on my crotch-rocket moped, right?
00:04:13
- It is funny, because I remember the Keaton Batman very vividly too, and in hindsight, looking at it, it is so Tim Burton-y and sort of comical and farcical. It almost verges on the border of a musical, like the scene where Nicholson's Joker comes into the restaurant where Vicki Vale is waiting for him, and his henchmen are spray-painting all the artwork and knocking stuff over. It's almost a musical, right? Whereas as a kid, being used to Adam West as what televised Batman looks like, the Keaton Batman looked seriously dark and serious.
- Well, it's like all of those, because you've got Burton, or like Zack Snyder, who's now remade Watchmen three, four, five different times because all of his DC movies look the same, right? But with Burton, I mean, he's more creative than that; it's just that tone. But that was early enough that that tone was new and original.
- Yeah. I still think casting Michael Keaton as Batman is one of the master strokes of casting in movie history. And everybody was like, oh my God, this is ridiculous, how can Mr. Mom be Batman?
- Not only was that genius, but the way they brought him back for Spider-Man: Homecoming, and that role. The maturity and the gravitas he brought to that, and that history for those of us who are old enough. Sorry, I'm nerding out here. I know you want to talk about security.
- But he's a terrific actor. That's the thing: Michael Keaton is just absolutely a terrific actor. And it's one of those things where there's all sorts of serious actors who cannot do comedy, and there's all sorts of actors who can do comedy who can also do serious stuff. Robin Williams would be a terrific example of that; my God, when he got serious, he could be as good as any actor on the planet.
- Then you've got jerks like Chris Hemsworth and Channing Tatum, who are these buff action gods, and they just act really well, and they are funny as shit. I just saw Bad Times at the El Royale, and I'm like, oh Jesus, where did he come from in this movie? It was amazing.
- Yeah, I liked that movie. That was pretty good. It reminded me of a 90s throwback, like the almost theater-like limitation of how few sets there were; the whole thing took place at the hotel. I don't want to spoil it for people who haven't watched it, but it's very good, entertaining.
- My main thing these days is just: can I predict exactly where this movie is going or not? And the answer was no, and so I was interested. I mean, that's why I love The Last Jedi, which is so controversial for a lot of people, because I had no idea where that movie was going till the end.
- Yeah, I didn't love it, but I liked it. If I wanted to write Star Wars, I should have devoted my life to trying to become a writer for Star Wars. There's an awful lot of people who seem to want to write the next Star Wars movie, and it's like, you know, just sit back and enjoy.
- Yep.
00:07:22
- I have one piece of follow-up. On the last episode I mentioned offhandedly (and of course I mention something offhand and it suddenly becomes a big story on MacRumors) that somebody told me, somebody who would know, basically that Apple has been selling Apple TV pretty much since its inception, since the original Apple TV, more or less at cost, that it's not a big profit maker, and that they're selling the HomePod for under cost. This got picked up by MacRumors, and oh my God, did I hear from people who said: no way, Apple doesn't sell anything at less than a high margin, everything they have is super high margin, blah blah blah. I can't prove it. I'm not saying that I know this for a fact; all I'm saying is that a source I trust very much said so. And to me, it makes a lot of sense as to why the HomePod is so much more expensive than its ostensible competition. That's all I can say about it: boy oh boy did I hear it from people who said Apple doesn't sell anything at less than a high margin. And I just think people are underestimating that when 99% of everything they sell is either a Mac, an iPad, or of course an iPhone, something like the HomePod or Apple TV, which sells in relatively minuscule amounts, would have no effect on their margin whatsoever if it was selling at or about cost.
00:08:49
- I mean, I use both of those devices; we're all in on that. And there's such a clear quality difference, not just the build construction, but the reliability versus most of the other things that I've tried, including the Alexas. It's pretty noticeable. It would not surprise me at all if that's the case, at least in the first year or two of manufacturing those.
- Well, and I just think, and the backstory that I've heard on the HomePod makes a lot of sense, is that initially it was meant more as a sort of Apple TV peripheral, where you'd hook it up, or maybe not hook it up with wires, but it was meant to be like the speakers for your Apple TV or something like that. But really, it's another one of those cases where my rule of thumb of "take Apple at its word; it's usually the truth" applies. Their description of it is primarily a speaker, not really a "hey, this is our answer to the talking-devices thing." First and foremost, it's a serious, high-quality speaker system, and then it just happens to have Siri as the interface, secondarily. That's honestly the truth of it. People make a lot of hay over the market share of this thing, but there's no question: there's no way that Apple introduced this thing at $350 and expected it to compete in a unit-sale comparison with Amazon things that cost $60, or Google's low-end Google Home things, Google voice, whatever they call them.
- Well, you write about this all the time, and one of my personal pet peeves is: Apple comes out with a new product, and everyone assumes that because it's not at one-to-one parity with something else that's already on the market, down to price and features, and because it doesn't work exactly the way the thing that was there first worked, there's got to be a flaw with it. And yet they've built all these class-defining devices, and in many cases have taken the time to nurture them through the process. I mean, some of them they drop, like, whatever, what was it, the Hi-Fi stereo thing, and a few others. But it seemed pretty clear that this was a long-term bet from the beginning, on the HomePod at least.
00:11:08
- Have you ever heard my story about Steve Jobs and the iPod Hi-Fi?
- No.
- I had a friend who worked at Apple, and, long story short, he had to have a meeting in Jobs's suite. He got up there, you know, five minutes early, of course, and checked in with Jobs's personal assistant. It was relatively small. I've seen Tim Cook's office; I guess it was his old one at Infinite Loop, not the new one at The Ring, but it was very humble relative to what it could be. He's got a receptionist, or a personal assistant, and then he's got his own personal office, and over on the other side there's a boardroom-type table where the meeting was going to be. But my friend could look right into Jobs's office. I forget what year this was, maybe 2009, 2010. And he said he expected it to be austere and monk-like: nothing in there except a glass desk with one mint-condition iMac and nothing else. Instead, he said, it was a relatively small office, and it was just stacked from floor to ceiling with boxes of the, at that point, discontinued iPod Hi-Fi. Like, they discontinued it, but Jobs loved it so much that he was like, we'll put 40 of them in my office.
- Just like, in the box?
- Yeah, just the retail boxes. That's where they were storing them. And it threw him off, like, I've got to get my head in the game, because this is a big meeting.
- That's awesome. I have no idea what happened to those iPod Hi-Fis. Does Jason still have one?
- He probably does. I don't know. I never bought one, because at that time I never really had any kind of plug-my-iPod-into-a-speaker-system thing going. I don't know why, but I didn't.
- I was the same way, but I worked at a desk, and so I had iTunes and I had computer speakers, and that was fine.
- Yeah, that was sort of my situation: I already had decent enough speakers connected to my computer in my home office, and there was nowhere else in the house where we needed speakers. So I never really got one. But everybody said it had good sound. I don't know. Small consolation.
00:13:48
- All right, let's get started, though. Let's see what's on my list. Well, there is a little bit of other news. WWDC 2019: MacRumors has published their best guess, pretty educated, that it's going to be June 3 to 7 in San Jose. No surprise there. There's sort of a cottage industry in guessing the WWDC dates, because people want to see if they can get a cheaper hotel room, or book it in advance, or whatever. And everybody I know has been guessing June 3 to 7 for months, I mean, going back to December. But MacRumors uncovered that at, whatever the name of the public field is where the annual bash is held, there was already something on the schedule for the Thursday of that week. It even had Apple's name on it, which is sort of a whoops. But the big tell to me, and I didn't realize this, was that O'Reilly is holding something called the Velocity conference the very next week in San Jose, because the only other guess for a normally scheduled WWDC would be the week starting June 10. So I feel like this is a lock. I've actually made a hotel reservation based on this. Do you go to WWDC?
- I usually fly in for like a day, and they'll have me come for the keynote stuff, but I tend not to have the time to stay, because of the security industry stuff that time of year.
- Yeah, I think I usually run into you there.
- Yeah, most years. I'm so glad they moved it, though. I have to go out in a couple of weeks to San Francisco for our big security trade show conference of the year, and hotel rooms for anything within walking distance of Moscone are between 700 and a thousand dollars a night. Oh my God, it's nuts. It actually makes San Jose's 400-to-550 rates look reasonable. I've got friends staying in Oakland for five hundred. And that's the discount: the discount is you stay in Oakland.
- That's probably not bad. I've taken the Muni for that, because I used to fly into Oakland all the time on Southwest. So I know that trip: it's about 30 minutes, and then you get off right on Market Street.
00:16:03
- If you time it right.
- If you time it right, right. I mean, it's not super inconvenient, but it's certainly not super convenient. But yeah, when all the hotels are $700 to start, and the ones that are lower-end in that area are not $700-a-night hotels... Some of the places I've stayed over the years, especially when Macworld was still over there, because I had to pay my own way to those, yeah, were not good. When Apple first moved it to San Jose two years ago... I'm not miscounting, right? We've had two, and this will be the third WWDC back in San Jose, I believe. Speaking of my shoddy modern memory.
- Pretty sure, yeah, because I skipped one, but I'm almost sure. I went to one.
- I remember the first year, two years ago, they even mentioned, hey, this should be more affordable; you know, San Jose should be more affordable compared to San Francisco. And it's like, at this point the hotels in San Jose are more expensive than San Francisco was three years ago, so it's like, what's the point? But then if you price out San Francisco, it's like, oh my God, this is crazy land.
- It depends on what it is. Like, I've had to go out for meetings with Apple on occasion. I haven't done those in a while, but when I was, depending on the amount of warning, it was $100 a night for a halfway decent room, or $500 a night because somebody had some event going on in the Valley.
- Yeah, it definitely varies tremendously. I guess I'm glad it's not in San Francisco anymore. I like San Francisco as a city better than San Jose, but my God, the cost is just off the charts.
- Yeah, I think they both suck. I mean, no offense to everybody listening in the Bay Area, but at least the Moscone area of San Francisco, if I never have to go back there, I'd be fine.
- And you know what sucks? I'll tell you what really sucks: getting from SFO to San Jose. I guess I've got to get an Uber. Last year I made a huge mistake and just got in a cab, and I should have known from the year before: don't do it, because it's like a $110 cab ride.
- It's more with traffic.
- It was ridiculous, and it's not that far, but I guess because they are technically San Francisco cabs, San Jose is, I don't know, you cross some sort of perimeter where this is a normal San Francisco cab ride, and then all of a sudden it's a $110 cab ride, and I already feel like there's a vacuum sucking money out of my pocket.
00:18:48
- You're not flying into San Jose Airport? 'Cause, I mean, it's right there.
- Well, I could in theory; it is extremely close and convenient. But from Philadelphia, even with the hassle of getting from SFO to San Jose, there are no direct flights from Philly. So I'd rather have a direct flight to SFO and deal with getting there than have a stop. I don't know, as I've gotten older, I have become almost allergic to anything other than nonstop flights. Mostly not because it takes longer, but because it doubles your chances of something going wrong, right? A mechanical problem with a plane, weather from the inbound city, weather at the city that you're at; anything can happen to delay it.
- Yeah, I mean, I fly two, three times a month. I'll take the 6 a.m. direct over a 10 a.m. with a connection or something, even if it's less convenient. And never, ever, ever check bags if you have a connecting flight. I mean, that's just...
- That's another one too, right? Because my wife always comes to WWDC to help with the live show and stuff, and so we definitely check bags. And yes, checking bags with a connection, you might as well just kiss your luggage goodbye. It's like rolling the dice.
- Yeah, I've been known to pack spare clothes and toiletries in my carry-on if I have to.
00:20:20
- All right, first topic. The big one, which I almost forgot to put on the list: the Jeff Bezos story with the National Enquirer. When I first started writing about it on Daring Fireball, it was a week old at least, and I had sort of ignored it, because initially I had filed it under personal gossip, which I tend to stay away from. I'm not trying to be holier-than-thou; it's just not my bag. If so-and-so is a famous industry executive and they are having marital strife or running around on their spouse or something like that, that's just not Daring Fireball material. I sort of filed it under that and didn't really give it much thought. But then, wow. I'm sure everybody listening to this knows the basic gist of it: Bezos wrote a post on Medium, which is interesting. I'm sort of anti-Medium in general and really wish more people would have their own blogs, but this was sort of a perfect use of Medium, because I don't think it would have been appropriate at all for him to use either some sort of Amazon blog or to somehow publish an op-ed in the Washington Post, which he owns. I'd go so far as to say both of those would have been very inappropriate, and I'm sure he knows it too. So he had Medium as this neutral platform and could then tweet the link. His Twitter is verified, so his tweet, while it didn't contain the meat of the subject, was the proof that this was definitely Bezos, not just somebody saying they're Jeff Bezos. He more or less came out with email proof that the National Enquirer was trying to extort him into saying things that weren't true, lest they reveal even more personal information: texts and pictures from his girlfriend.
00:22:28
- I'll put my security hat on instead of my movie-fanboy hat. This is a really fascinating story, because you look between the lines, and one of the problems in security is that you get really paranoid and you'll think zebras when you hear those hoofbeats. But then sometimes it is a zebra, and boy, this sounds like a zebra to me. Because the story of the brother of the woman he was having an affair with somehow getting her texts and releasing them that way... maybe he was responsible for it. But the way all of this lines up, and the Trump stuff, and the Saudi Arabia stuff... I suspect it's not the most extreme version of the conspiracy, but I don't know. I like to sit back and wait and see how these stories develop versus jumping to conclusions.
- Right. So the basic story, and it seems as though everybody is in agreement at this point, although there's been no conclusive proof, is that her brother was the one who released these "text messages" (a phrase I'll come back to in a moment) and photos exchanged between the two to the National Enquirer. And it's just bizarre. I mean, what are the odds that Jeff Bezos's girlfriend would have a brother who is a longtime associate of Roger Stone, who's now been indicted, and a well-known Trump supporter?
- The most innocent-looking walking-out-of-jail photo in history.
- With Trump being a vocal and continuous critic of the Washington Post and Bezos's ownership thereof. Whatever you think of Trump politically, he clearly comes at this from the perspective of: if you're a billionaire who owns a newspaper, of course you wield your ownership to pursue personal vendettas. Which is, A, not how the Washington Post should be run, and B, not how Jeff Bezos, by all accounts and appearances, handles his ownership of the Washington Post; editorially, he is completely hands-off. And knowing that, I don't know anybody personally at the Washington Post, but from what I know of their leadership, they wouldn't stand for it. But, like, good luck convincing Donald Trump of that.
00:25:10
- Well, I mean, it's inconceivable to someone like him. Look at the history with AMI and the National Enquirer, which has directly been used that way. So why would he possibly think that Bezos would have ethics or something?
- Right. So Trump doesn't own the National Enquirer, but he's longtime friends with the guy who does, whose name, unbelievably, is David Pecker. And Trump has used his friendship with the owner of the National Enquirer to, A, bury stories damaging to him, famously, and now part of the Mueller investigation, in terms of the payoffs that were made to keep quiet the stories of women Donald Trump had affairs with. What do they call it? Catch and kill, where the National Enquirer would give them $100,000 for the exclusive rights to their story and then just bury it, literally put it in a safe. And B, to use the National Enquirer to slander, in the non-legal, colloquial sense, Donald Trump's enemies. For example, the National Enquirer was the publication that ran, on the front page, on the cover, a story alleging that Ted Cruz's father helped kill JFK. By the way, I found the headline. It's "Bezos Exposes Pecker."
- How could I forget? "Bezos Exposes Pecker." And the most stunning thing to me, other than, I mean, you write that headline, you retire; it doesn't get any better than that. But I can't believe anything the Enquirer publishes is even true. I just assumed that was fiction, like The Onion for... oh, see, I'm disparaging there.
- See, I knew that that's not the case. Like the Weekly World News. I don't even know if they still exist, but in terms of the heyday of supermarket tabloids, the Weekly World News was the one that was like The Onion, where they had Bat Boy and aliens and stuff like that. The National Enquirer, because they deal with real people and could therefore be sued for real money, plays by different standards. And famously, one of the biggest bits of, I don't want to say history-altering, but certainly something that affected the political landscape: they were the publication that revealed that John Edwards was cheating on his wife.
- I forgot about that.
- That was in 2008. Or was he Kerry's vice-presidential nominee in 2004? I believe he was, right?
- I think so. Yeah, Kerry.
- And then in 2008, everybody remembers 2008 as a very close Democratic primary between Hillary Clinton and Barack Obama, but John Edwards was in that, and it was a very serious three-way race for quite a while. I mean, it's certainly not beyond imagination that he could have been vice president under John Kerry, or the presidential nominee in 2008, but they broke the story that he was cheating on his wife.
00:28:22
- You know, there's probably a lot of people like you who think everything in the Enquirer is fake or made up, but it's not. I don't want to swear that everything is the God awful truth, you know, the exact truth, but it's more true than not in terms of what they print, because they're liable for slander and libel.
- Well, and this kind of spun off the conversation: how did they potentially get those texts?
00:28:52
- Right, that is right. And my thing that I wrote about, and I really wish that publications would get more serious about it, is: what format, what medium, were these quote-unquote texts sent as? Most people would think that by a technical definition, a quote-unquote text message is an SMS message, or MMS, whatever you want to say; that's a text. But colloquially, people call pretty much anything that has DMs "texting." And people certainly refer to iMessage messages as quote-unquote texts. It's only natural for people to say that, because one of the reasons iMessage has become so successful, so popular, is that Apple integrated it into the same app that's used for text messaging: Messages. So if you send your actual SMS text messages with the same app as iMessage, it's only natural to refer to them all as texts. But once you get to the technical level of how they got these, boy is there a big difference between iMessage and SMS.
- Yeah, I mean, they are night and day. SMS is completely broken and insecure at this point, and iMessage is, you know, not perfect, but really good. I mean, almost as close as you can get without introducing barriers to usability that would take away that seamless experience.
00:30:37
- And the big one is that iMessage is end-to-end encrypted, where Apple doesn't see the messages. Now, one of the reasons I wanted you on the show: that doesn't mean there's no way to get them from your iCloud account, but they're not stored unencrypted on your iCloud account, and there's no way that Apple sees them. Like, if I send you an iMessage, I type it on my phone, it is encrypted as it leaves my phone, and it isn't decrypted until it gets to your devices.
- Yeah, and it's a really, really interesting way that they set that up, where you've got these key rings. So think of it as a ring of your house keys. Every time you send a message to somebody (and I'm dramatically oversimplifying), every message is encrypted with the key for that person and device. And these are all different, and the encryption is entangled with device IDs, depending on which versions of things you're on and which things you're using. Obviously, it's stronger on iPhones and iPads, and it's going to be a bit weaker on Macs if you've got those synced up. As well as now we've got it, as you said, stored up in iCloud, but encrypted, with no keys up in iCloud. And so it's all device-to-device keys being exchanged.
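[To make the key-ring idea Rich describes concrete, here is a minimal sketch of that per-device fan-out pattern. This is not Apple's actual iMessage implementation; it assumes X25519 key agreement and AES-GCM as stand-ins for Apple's primitives, in Python with the third-party "cryptography" package (pip install cryptography).]

import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def derive_key(private_key, peer_public_key):
    # ECDH shared secret, run through HKDF to get a 256-bit AES key.
    shared = private_key.exchange(peer_public_key)
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"fanout-sketch").derive(shared)

def encrypt_for_device(sender_private, device_public, plaintext):
    # One independently encrypted copy of the message for one device.
    key = derive_key(sender_private, device_public)
    nonce = os.urandom(12)
    return nonce, AESGCM(key).encrypt(nonce, plaintext, None)

# The recipient's "key ring": one keypair per registered device.
# (A real sender would only ever see the public halves.)
devices = {name: X25519PrivateKey.generate() for name in ("iphone", "ipad", "mac")}
sender = X25519PrivateKey.generate()

# Fan out: a separate ciphertext per device public key, so the server
# relaying the envelopes never holds a key that opens any of them.
message = b"see you at six"
envelopes = {name: encrypt_for_device(sender, priv.public_key(), message)
             for name, priv in devices.items()}

# Each device can open only its own copy, using its own private key.
for name, priv in devices.items():
    nonce, ciphertext = envelopes[name]
    key = derive_key(priv, sender.public_key())
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == message

[The point of the design, and of the conversation that follows: compromising the server or an encrypted iCloud copy yields only ciphertext, while compromising any one trusted device yields the plaintext.]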
00:31:52
So in a situation like this, for somebody else to gain access to those: if it's SMS, stupid easy. You just go to the carrier and you get a record of it, or you hack the phone. For iPhone, you either have to hack the device, which we know is possible, but that's kind of nation-state-level stuff for the most part, or very-serious-attacker-level stuff, or you have to have physical access to the device. But even then, I mean, was he doing screenshots and then forwarding them on via email to the Enquirer? Somehow it sounds like the Enquirer has those texts and has those images, which means they weren't just viewed and conveyed verbally to them; they were actually forwarded on, and that's a big deal. That means some deeper level of access, if it was iMessage.
- Yeah. Well, one of the things that is game over is device access, right? So if she, she being Bezos's girlfriend, had a relationship where she trusted her brother enough that he could have her phone, and he either knew her passcode or, God forbid, she doesn't even use a passcode... If he could get into her phone, or could get her to say, "Hey, can I use your phone for a second? Let me look something up," and she trusted him enough for that, it's game over, no matter what you're using.
- If I'm hanging out with the president's public enemy number three, and my brother's friends with his buddy Roger Stone, I'm probably not going...
- I'm not going to let my sister log into my devices. But yeah, that's feasible. And there's no interception, by the way. Maybe he could trick her into adding a device, but then you have to approve that and put in your password. It's doable, but it's really hard; not easy.
- I do wonder, though, how he exported them. I don't want to see the pictures, honestly, but I do want to know how. Even if he's the guilty party, even if that's true, I still want to know how he got the pictures. Did he screenshot them? And if so, how did he get the screenshots off her phone? Did he take pictures of her phone with his phone? Did she leave her Mac unlocked during Thanksgiving? Who knows, right? Because it doesn't have to be her phone; it could be something like that.
00:34:17
- Then there was even a story along those lines in, I forget the title, the latest tell-all book to come out of a former Trump administration official, some low-level guy who was like a speechwriter, who got called into Kellyanne Conway's office. Did you see this story?
- Uh-uh.
- I'll have to look up his name. But basically, he needed to write a speech, and he came into Kellyanne Conway's office, and she said, "Here, just use my MacBook." And she opened her Mac, or it was open or whatever, and he sat there typing this thing on her MacBook Air while she was over at her desk. He's at a table; she's high enough up that she has a fairly spacious office in the West Wing. As he's doing this speechwriting thing, he's seeing all these iMessage alerts come in in Notification Center, and she's totally blabbing inappropriate stuff to the press.
- Oh, I did hear about that, yeah.
- And I don't think... she's not a stupid woman. I just think she easily forgot that she was logged into iMessage on that Mac, and that whatever you're doing in iMessage on your phone is going to be mirrored over there if you're signed into iMessage on your Mac too.
- You know, mistakes happen. I was giving a presentation the other day off my iPad Pro, and Face ID timed out, and I logged in. I'm like, oh shit, I'm screen mirroring. Everybody just saw my password for my device. And I've never done that. I mean, I teach training classes at, like, Black Hat and DEF CON, and I've never done that.
00:36:09
- How did they see it, though, if it wasn't just bullets? Is it because the letters show up before they turn into bullets?
- They show up for a split second before turning into bullets. And I'm like, ah, that's fine. I use different passwords for everything (we'll get to passwords later), and I can change it. It's no big deal, but...
- Yeah. I think it's called Pit of Vipers, or, Team of Vipers: My 500 Extraordinary Days in the Trump White House, by Cliff Sims. I will copy and paste this and put it in the show notes.
- But let's also be clear: we don't know if she was on an iPhone as well. We know Bezos uses predominantly an iPhone, based on his tweets; we don't know if she was. But the thing is, even with SMS, it's not like direct interception of SMS is something that the average person on the street can pull off.
- Right, right. But it's on the table.
00:37:03
- And part of what made Bezos's post extraordinary was that he alleged there was some sort of nation state involved in it at some level, without getting specific, but then said that they were told by representatives of AMI (AMI is the parent company of the National Enquirer) that David Pecker, the owner, was particularly, quote, "apoplectic," very angry, about allegations of, or an investigation into, their relationship with the Saudis. It has since come out, somebody else reported a couple of days ago, that at some point this guy MBS, whatever his full name is, was coming to the United States, and it was going to be a big deal. Somehow AMI printed up a bunch of these "The New Kingdom" magazines, a big, flashy, glossy magazine talking about how great Saudi Arabia is under MBS's new leadership, blah blah blah. Apparently paid for by the Saudis to some degree, to the degree that AMI went to the US Justice Department and asked whether they should register as foreign agents for Saudi Arabia. The Saudis... like, you tell me: if I wanted to go snoop on my neighbors' SMSes, is it within my means? Would I even know where to start, how to intercept their unencrypted SMS or their cell phone calls? No, I don't even know where I would start.
- Well, call me. It's not that... that actually...
- I wasn't gonna say that, but that actually is where I would start. "Hey, Rich, this is gonna sound weird."
(both laughing)
- This is gonna sound weird.
00:38:52
- Can the Saudis do that? Without question. There's no question that the Saudis have it within their capabilities to intercept cell phone calls or SMS messages. And part of this, not to go too far into the Trump area, but this is one reason why a lot of people are very upset that Donald Trump apparently makes a lot of phone calls using an iPhone.
- Well, and let's tie this to a previous security story, which was, and I can't remember the name of it, but it came out that there was that tool that they were remote-exploiting iPhones with.
- Yeah, that is still in use; it just has more limited utility.
- Yeah, some zero-day that gave them persistence and the ability to track things on the device. And there are, you know, a bunch of mercenary types from the US and Israel, which unfortunately do exist, who had come up with that, kind of like the gray box for governments trying to break into phones. And that story broke right before this story. So, really interesting. Like I said, I don't want to rush to judgment, but something absolutely smells off, and it does not smell like this is an Occam's razor, simplest-explanation kind of a thing. I think as this story unfolds, we're going to see things I don't think we're even predicting right now. It could go in so many different directions.
- It is not as simple as: Bezos is a famous public figure, and somebody said to the Enquirer, "Here, take these embarrassing photos of him," and they published them. It's more than that. What more, I don't know. But it's not as simple as "here's some embarrassing images of a famous person and their extramarital affair."
00:40:37
- By the way, I'm very glad you made it so clear that you don't want to see the photos, because I was getting a little worried about that. I've got to say, it also is interesting, and I'm not saying anything that a thousand other people haven't said, but I salute Jeff Bezos for getting in front of this, because he's obviously risking further embarrassment, right? The gist of the National Enquirer's offer to him, whether it amounts to legal extortion or not (and apparently, I've read enough stories about it, that's actually questionable; it's not cut and dried whether what they put into writing amounts to legal extortion), but certainly, in the common sense of the word, they were trying to extort him. Which was to say: shut up, and say that you've concluded there was no political motivation in what we published, and we won't publish the following list of pictures, which we have and have been holding back. That's extortion. Again, I'm not a lawyer; I'm not saying it amounts to extortion under the legal definition of extortion in the United States. But certainly in the common sense of the word: they were asking him to say something he knows isn't true, because apparently Jeff Bezos is under the belief that there was political motivation. So they were asking him to say something he doesn't believe in exchange for something in his favor. Yeah, I'm going to say that's extortion.
- I say extortion, yeah. I mean, it's not even like regular extortion. And it's a huge issue in society, and a growing issue. Like, my kids are younger than yours, and I've got both daughters and a boy, and the conversations I'm going to need to have with them on this, very soon, are not going to be comfortable.
00:42:25
- Yeah, it is weird having a 15-year-old. We've tried to be upfront about it. We've even talked about the Bezos thing, like, "Here, look, don't take pictures." So I had the talk: don't make jokes, don't put anything out there, don't text anything, whether it's a group text or a single text, that you wouldn't want a screenshot of seen by the principal at the school. Seriously, don't do it, because it's going to happen.
- Or in 20, 30 years, when that pops up. Oh my God, I can't even think about it. But I'll just tell one story related to this real quick. I can't tell you how the various associations work, don't get back to me on this one, in large part because I have no clue, because my wife told me and I didn't listen to that part. But there was a high school kid, and some relation of ours knows the mom through, like, three separations removed or something. And this car full of girls shows up banging on the door, and the woman opens up the door, and these girls are like, "We need to talk to your son about child porn." She's like, "What?" And you're thinking what I was thinking when I first heard that part of the story. To cut to the chase for time's sake: a girl had sent him a naked selfie, fully unsolicited. He got it, and this kid's 15, 16, and he's like, whoa, I don't want to be dealing with this, and wiped it right away. And because he spurned her, she was going to get her revenge: pick up his phone, it's child porn, he's got a picture of me on there. Which he didn't, because he had erased it. And needless to say, the mom gave these girls a massive talking-to once the story kind of unfolded there. My wife's response was like, have some self-esteem, man. They weren't even in a relationship, and she sent the picture.
- Yeah, I don't know. I've seen a lot of stories along those lines where it's just two 16-year-olds, and they start sending pictures.
00:44:34
And legally, a picture of a naked 16-year-old is child pornography, but the interaction isn't that. The technology, what's possible for kids to send each other and what they would be tempted to do, is so far ahead of what the law has been written to protect against. I don't know. I mean, I'm so old that literally we had to write our notes on pieces of paper. It sounds goofy now, right? I can't even imagine that that's even a thing anymore. Kids don't send notes to each other on tiny little slips of paper, where you just rip off the corner of a piece of paper and write tiny, with your unbelievably acute 16-year-old eyes, in microfilm-size text. I never got caught exchanging notes as a kid. I came close one time, but never got caught. But it never even occurred to me that, with as small as I would write the note, the teacher wouldn't even be able to read it.
- Yeah, what they're growing up with, and even the socialization, is different. I mean, I've got older nieces and nephews, and then my kids, who are nine, eight, and five. And for them, being social is being on devices.
- Yeah, absolutely.
- Like, my wife and I can't say "go out and play with your friends," because they all play with their friends on their devices now.
- No, it's definitely social. I mean, it's hard to parent, because we don't know how much to say, you know, step away from the device. But if we pull him off his computer, it's not going to make him call up a friend and go to the friend's house. It just means he's not in the group text where they're all playing different games at the same time, but texting and talking to each other. In some ways, it actually makes parenting easier if you accept it, because I'm not schlepping him all over town, and we don't have to worry about where he is, and he doesn't have to come home late.
We know where he is: he's in his room. So in terms of being a lazy parent, it's actually kind of convenient. Whether it's doing any long-term harm to his socialization or whatever, I don't know; I hope it's all right. But I've got to tell you, it's a little bit easier in some ways.
00:46:42
- You know what I'm looking forward to? I guess my kids becoming brooding teenagers, instead of elementary school kids where we have to drive to eight different activities a day per child.
- Yeah, we're hitting the brooding years. They come on quick. It happens fast.
- Happens very fast. It comes on you fast and it leaves you fast. I'm trying to remember what grade it was. I think it was eighth grade. Eighth grade social studies. Mr. Wheeler...
- How do you remember that?
- Well, I told you before we started recording, I have very vivid memories of exactly when and where certain things happened.
00:47:30
Mr. Wheeler had a basement classroom, and he had a classroom with an office, like a little cubby hole. Not a cubby hole, but he had a door behind his desk, and it was a private room for him, not part of his classroom. Not many of the classrooms had this.
- So, his Matt Lauer room?
- Yeah, effectively. Well, he definitely smoked back there, because he would sometimes come out in a way that you might think "they're never gonna know I smoked." But certainly as a teenager, where you have these acute senses, you could smell that he smelled like smoke. And there was no way he went outside, because it wasn't close to an exit. But my friend Mark and I had a note going where we had drawn... the gist of the note was: what's back there, in Mr. Wheeler's back room? And we would draw things. You'd add something, then fold it back up and give it to Mark, and Mark would add something. Like, one of them was a big pile of boogers; you could occasionally catch Mr. Wheeler picking his nose. So there's like a big pile of dots, and then you'd label it with an arrow: "pile of boogers."
- And how many people listen to this podcast? You just name-dropped him picking his nose.
- I'm hoping Mr. Wheeler doesn't listen to it. Sorry, Mr. Kef. He tragically died of ALS three years ago. But I don't even remember the specifics; let's just say that the boogers were the least of the problems. It was a rated-R note. Off the top of my head, I'm going to guess that somebody drew, like, a pile of porno mags. It doesn't matter whether it artistically looked like a pile of magazines; you just label it "pile of porno mags."
- And remember when you used to have to actually go buy your porn in a store? Thinking about generational differences.
- Yeah, those were the days. Well, anyway, I got caught with the note. Mr. Wheeler said, "John, what do you got there?" I had the note in my hand.
00:49:48
And I made an immediate decision. I'm very proud of this; this is why I remember it. I'm calm, cool, and collected under pressure. I took the note, I wadded it up, I put it right in my mouth, with him looking right at me, and I swallowed it.
- I'm gonna need a moment here.
- This is the honest-to-God truth. I wadded the note up with him looking right at me, because I realized there was no other way out of this. I'm either coughing up this note, which was literally maybe the worst possible note (it literally had a diagram of the room behind his classroom with all sorts of awful things), or I can swallow the note. The only way I could get rid of it was to eat it. And it was on a small piece of paper relative to the amount of content in it. It wasn't an entire eight-and-a-half-by-eleven sheet of paper; it was probably about the size of an index card.
- You know, I'm laughing, but that probably saved your ass.
- I wadded it up, put it right in my mouth, chewed it up, swallowed it right there in front of him, and I said, "Nothing." And I don't think I got into any trouble. I believe that he was so amused by the reaction that I escaped any sort of penalty. I really do. And it turned out he was my homeroom teacher. In ninth grade I no longer had him for social studies, but I went to a relatively small public school where it was grade 7 to 12 all in the same building, so I didn't go to a different school between 8th and 9th grade, and I still had him for homeroom. He was a basketball coach, not at our high school, but at a neighboring high school. And as contentious as our relationship was when I was his student, by the next year I was maybe like the pet favorite in homeroom. So we patched it up. But that's my story.
I've got nothing of that level of
00:51:50
◼
►
I don't know what you would do. I guess I guess what you would do is delete your text,
00:51:54
◼
►
right? If somebody catches you with bad text, you can
00:51:56
◼
►
Well, I mean, kids today, like, for a lot of the stuff they're using, you know, Signal-
00:52:01
◼
►
or Wickr-style apps, or the ones that look like it's one app and then there's the
00:52:06
◼
►
hidden text messaging behind it. Yeah. There's a whole list of, uh, if you'll look for it,
00:52:14
◼
►
and I don't know where it is, or I'll pull it up and pop it in the show notes, but there's a list of apps
00:52:18
◼
►
that actually have hidden messengers, where you flick things a certain way, uh, and it looks like
00:52:23
◼
►
some other kind of a game or something more innocuous. So the, the teens that are like
00:52:27
◼
►
under the gun, cause I've had, I've known people that have had issues with troubled kids. So,
00:52:31
◼
►
depression or suicide or drugs and those kinds of things. Unfortunately you get old enough
00:52:36
◼
►
you've got friends with kids like that. And those are the kinds of apps that the smarter kids use,
00:52:41
◼
►
and they all verbally, you know, kind of network those things among each other.
00:52:45
◼
►
So there are ways of doing it that they can't get caught. I think probably most kids just
00:52:50
◼
►
use text message and get caught. But let me take a break here and thank our first sponsor. It's our
00:52:53
◼
►
good friends at LinkedIn. Look, the right hire can make a huge impact on your business. That's why
00:52:59
◼
►
it's important to find the right person. But where do you find that individual? You can post a job
00:53:06
◼
►
on a job board and hope the right person will find your job, but think about it. How often do you
00:53:10
◼
►
hang out on job boards? Don't leave finding someone great to chance when you can post your
00:53:16
◼
►
job to a place where people go every day to make connections, grow their career, discover job
00:53:22
◼
►
opportunities. LinkedIn. Most LinkedIn members haven't recently visited the top job boards,
00:53:29
◼
►
but nine out of 10 members are open to new opportunities. And with 70% of the US workforce
00:53:36
◼
►
on LinkedIn, posting on LinkedIn is the best way to get your job opportunity in front of
00:53:40
◼
►
more of the right people. People who are qualified for the jobs you're looking to fill are right
00:53:47
◼
►
there on LinkedIn right now. It's the best way to find the person who will help you grow
00:53:51
◼
►
your business, and it's why a new hire is made every 10 seconds using LinkedIn. That's phenomenal.
00:53:58
◼
►
So go to LinkedIn.com/talk. Just T-A-L-K. That's the code for the show. And by going to
00:54:08
◼
►
that URL, you will get 50 bucks off your first job listing. That's LinkedIn.com/talk. And you
00:54:17
◼
►
will save 50 bucks off your first job post. Terms and conditions apply. LinkedIn.com/talk.
00:54:24
◼
►
All right, so we need to know. We need to know what Bezos was using. Was it iMessage? Was it SMS?
00:54:33
◼
►
Was it something like Signal or WhatsApp or something like that? I don't know, but I can't
00:54:39
◼
►
wait to find out. I hope we do find out eventually. And I'm also curious, just whatever it was, how
00:54:45
◼
►
those messages got to the inquirer's hands and what format they were. Well, and if there was an
00:54:51
◼
►
obvious route, like, oh, I let my brother use my computer during Thanksgiving. I think
00:54:56
◼
►
we would have known that earlier. I guess at least the investigators would have. Yeah.
00:55:01
◼
►
I mean, I mean, I guess what you could do, you know, off the top of my head, what would
00:55:06
◼
►
I do? I don't know. I mean, and did he know? Was it like an open secret within
00:55:11
◼
►
their, you know, the brother sister relationship that she was having an affair with Bezos?
00:55:16
◼
►
I mean, it's, I'm guessing it was, you know, that somehow they were close enough that he
00:55:20
◼
►
knew and he betrayed her. But I guess if he got access to one of her devices like a Mac
00:55:25
◼
►
or even her phone and he could take screenshots or you know, you can't really forward text
00:55:31
◼
►
messages. You know, there's no forward command like email. I think there actually is, on
00:55:36
◼
►
iPhone. Oh, really? Yeah, yeah. Let me—I'm pretty sure. Let me pull this up. But I'm
00:55:42
◼
►
pretty sure I've forwarded messages. I can't remember how. Yeah. Let's see. Tap and hold,
00:55:49
◼
►
more and then click forward and you can send it to somebody else. So there is a forwarding.
00:55:53
◼
►
I don't see it on iPad. How do you do that? So you select it.
00:55:58
◼
►
Yeah. If you select the message and I just tap. Oh, I see. And then there's a little arrow message.
00:56:02
◼
►
Yeah. Oh, same, same place where the tap back comes.
00:56:06
◼
►
Yeah, yeah, yeah. It's the same little curvy arrow.
00:56:10
◼
►
And I use that all the time with the wife, sending out, like, addresses for, you know,
00:56:13
◼
►
kid drop off and pickups and those kinds of things when we're coordinating.
00:56:16
◼
►
Well, there you go. I just learned something. Well, I guess he could forward them if he knew
00:56:21
◼
►
about that; screenshots would be an easy way. You know, and then, you know, there'd be this trail
00:56:28
◼
►
of those messages between her and him, you know, and then just quick delete those while you
00:56:36
◼
►
still have her device. And then all of a sudden, she's got no sign that he did it. Well, I mean,
00:56:43
◼
►
there's trails here. So the hardest thing for the private investigator to find would be if it was
00:56:49
◼
►
handled at the carrier level and anything within the system there, because first of all,
00:56:55
◼
►
I doubt the guy built an IMSI catcher and was, like, sniffing SMS and doing phone-cloning stuff. I mean,
00:56:59
◼
►
maybe he was, but that's serious. And those are also federal crimes. The other side of it is
00:57:05
◼
►
bringing in the iCloud stuff. Like, if all he did was add a device to the device chain, the first thing I
00:57:11
◼
►
do if I go into a situation like that is I look at all the registered devices because that'll tell
00:57:15
◼
►
me. Now, maybe he took the device off and it would disappear. But also, if multi-factor
00:57:24
◼
►
authentication was set up, you know, you do an interview. I mean, there's no way to sneak that
00:57:28
◼
►
through without punching that code in. So there's, and you can even dig through key
00:57:32
◼
►
chains and stuff if you really got into the deep forensics of this. So, I mean, it should be
00:57:37
◼
►
traceable if it involved one of their devices at all.
00:57:42
◼
►
The other things, you know, and I think it's pretty simple.
00:57:48
◼
►
I think it's as simple as the brother
00:57:49
◼
►
having access to something.
00:57:50
◼
►
But, you know, in terms of how safe are you
00:57:53
◼
►
with your iMessage, number one, I do think everybody,
00:57:57
◼
►
I really do think everybody should have two-factor
00:58:00
◼
►
on their Apple ID accounts.
00:58:03
◼
►
And I don't know at what point, I don't even know,
00:58:05
◼
►
like if you sign up for a new one,
00:58:06
◼
►
Do they even let you do it without two-factor?
00:58:09
◼
►
- You can, 'cause I don't have that turned on
00:58:12
◼
►
on my kids' accounts.
00:58:13
◼
►
- Right, I don't know that my son does either, I don't think.
00:58:16
◼
►
- So I've got that on mine.
00:58:18
◼
►
But it really wants you to put it in.
00:58:21
◼
►
You've got to deliberately make choices
00:58:24
◼
►
to not implement it.
00:58:25
◼
►
- Right, but if you do the iMessage in the cloud,
00:58:30
◼
►
it does open you up to somebody being able to,
00:58:35
◼
►
if they only have your password, being
00:58:37
◼
►
able to read your iMessages if you're not using two-factor.
00:58:42
◼
►
Yeah, well--
00:58:45
◼
►
I haven't logged in--
00:58:46
◼
►
I'm going to log in to iCloud, because I can't remember.
00:58:48
◼
►
I thought it didn't display the messages in iCloud.
00:58:51
◼
►
Yeah, maybe it doesn't.
00:58:53
◼
►
I got no messages.
00:58:54
◼
►
No, there is no messages.
00:58:55
◼
►
So even when you have iMessages in the cloud,
00:58:57
◼
►
you can't read them online.
00:58:59
◼
►
So I guess the other way you could get them
00:59:00
◼
►
would be through the backup.
00:59:04
◼
►
if you could somehow restore somebody's iCloud backup
00:59:08
◼
►
to a device, you could restore the text messages
00:59:13
◼
►
that are in their backup.
00:59:16
◼
►
But I think if you have iMessage in the cloud,
00:59:18
◼
►
then those are no longer backed up in the backup.
00:59:20
◼
►
- No, but if you add the device to your device ring,
00:59:23
◼
►
now the thing is for that,
00:59:24
◼
►
you get a notification on every device.
00:59:26
◼
►
So when you add one, you have to,
00:59:28
◼
►
even if you have two-factor authentication turned off,
00:59:31
◼
►
you get notifications on every other device.
00:59:33
◼
►
Like every time I, I mean, you and I both test stuff,
00:59:36
◼
►
you more than me, every time I like add a new Apple watch
00:59:39
◼
►
or whatever, it's like, oh crap, you know, then.
00:59:42
◼
►
- I can't believe at one point this fall,
00:59:45
◼
►
I couldn't believe that I could still continue
00:59:47
◼
►
to add devices to my iCloud account.
00:59:49
◼
►
'Cause I had like, I still had my year old iPhone,
00:59:52
◼
►
a two year old iPhone, my new iPhone,
00:59:57
◼
►
three iPhones for testing, my Apple watch,
01:00:02
◼
►
my personal Apple Watch, Apple Watch for testing.
01:00:06
◼
►
I mean, it was unbelievable.
01:00:07
◼
►
I mean, it was, but you know, it actually works
01:00:10
◼
►
and it kind of makes sense 'cause you would guess
01:00:11
◼
►
that there are people within Apple who are going
01:00:14
◼
►
through testing devices at a rate that even makes me
01:00:17
◼
►
as a product reviewer, you know,
01:00:20
◼
►
seem like I don't have a lot of devices.
01:00:22
◼
►
- Yeah, I mean, they probably have accounts with, you know,
01:00:25
◼
►
a hundred devices on there just to see what happens.
01:00:27
◼
►
- But boy, when I like at the end of the review season,
01:00:30
◼
►
when I went in and like started like,
01:00:31
◼
►
forget this device, forget this device, forget this.
01:00:34
◼
►
It was kind of cathartic 'cause I,
01:00:37
◼
►
even though I'm just like eliminating them
01:00:38
◼
►
from this list of devices,
01:00:40
◼
►
it felt like I was cleaning up a mess.
01:00:43
◼
►
- I haven't done that spring cleaning
01:00:47
◼
►
and I've got, you know, like you since, you know,
01:00:49
◼
►
iPhone one stuff still in there and half these devices
01:00:52
◼
►
I've handed off to other family members.
01:00:54
◼
►
So they're fully wiped and I know I need
01:00:57
◼
►
to be better about that.
01:00:59
◼
►
I'm, I'm, I'm revealing my passwords. I'm talking about not cleaning my devices.
01:01:03
◼
►
I really sound like a security expert on this thing,
01:01:05
◼
►
but it's like a lot to remember to go through and clean those out.
01:01:09
◼
►
Uh, so what do you, what do you think about, here's the one thing too,
01:01:14
◼
►
about this Bezos story that I would like to come out and if it comes out that
01:01:18
◼
►
they really are just SMS text messages,
01:01:21
◼
►
and I guess it would be technically MMS if, if there were,
01:01:24
◼
►
as there apparently were photos involved,
01:01:27
◼
►
if they were intercepted.
01:01:31
◼
►
And again, it doesn't look like that's the case,
01:01:32
◼
►
but I kind of almost hope that it is
01:01:34
◼
►
just in terms of serving as a public service announcement
01:01:38
◼
►
that SMS and MMS are not end-to-end encrypted
01:01:42
◼
►
and therefore somebody could intercept them over the air.
01:01:46
◼
►
Like it's just a bad state of affairs.
01:01:48
◼
►
And the thing that I would really like to catch on
01:01:51
◼
►
is the idea that this next generation
01:01:55
◼
►
carrier-based texting system, RCS,
01:01:58
◼
►
is also not end-to-end encrypted.
01:02:00
◼
►
And I just think it is nuts that we as a society
01:02:05
◼
►
are going to accept a new standard
01:02:06
◼
►
that does not involve any encryption whatsoever.
01:02:09
◼
►
- We're building foundations of society
01:02:14
◼
►
on text messages at this point.
01:02:16
◼
►
And like in the security industry, it was a huge deal.
01:02:19
◼
►
So NIST, the National Institute of Standards and Technology,
01:02:22
◼
►
came out and said, "SMS doesn't count
01:02:24
◼
►
as two-factor authentication." Like, a year or two ago, they just flat out said it, and people
01:02:29
◼
►
were screaming—a lot of vendors and, like, providers and stuff—that no, you have to allow that. You
01:02:34
◼
►
can't say it's not allowed. Because once it goes into that government standard, when you do your
01:02:38
◼
►
security audits and stuff like the stuff I spend most of my day doing, you know, that becomes a
01:02:43
◼
►
huge issue. And it was interesting because Apple is really early on the edge of circumventing SMS.
01:02:48
◼
►
And using their own mechanism and like way early on that compared to other alternatives. And even
01:02:56
◼
►
some of the alternatives like my bank, I'm not going to tell you who they are, but like I log in
01:03:00
◼
►
and I have two factor turned on and it's every phone and every email address I have registered
01:03:04
◼
►
with them. That's not two-factor—like, sending those messages out—because it's insecure.
01:03:08
◼
►
But to go back to the encryption piece, this is a generational tragic mistake that we're probably
01:03:16
◼
►
going to make at this point. And the carriers don't want to implement it for cost reasons or
01:03:20
◼
►
whatever else. And we do actually have, it would not surprise me if there's government pressure
01:03:26
◼
►
behind the scenes because they want to, but they, they would still be able to build in intercept.
01:03:30
◼
►
I mean, that's the law. They have to have lawful intercept at carrier level stuff. As well as the,
01:03:37
◼
►
like we even saw pressure. There were financial services companies lobbying Congress,
01:03:42
◼
►
not to use the stronger encryption standard in TLS 1.3,
01:03:47
◼
►
because then they couldn't sniff their employees
01:03:50
◼
►
when they did that.
01:03:51
◼
►
It's like, for your convenience,
01:03:53
◼
►
you want to fundamentally make the internet weaker.
01:03:57
◼
►
But that's how people think sometimes.
01:03:59
◼
►
They're the same people that think global warming's a scam.
01:04:03
◼
►
I don't know.
01:04:04
◼
►
- Yeah, well, let's not go down that path.
01:04:07
◼
►
I am not going to brag about my personal security practices
01:04:12
◼
►
'cause I'm sure it lacks in many ways,
01:04:16
◼
►
but I don't reuse passwords and haven't for many years.
01:04:20
◼
►
I mean, at some point I did have my quote unquote
01:04:24
◼
►
standard throwaway password and I still remember it
01:04:28
◼
►
and I don't think I actively use anything that still has it,
01:04:31
◼
►
but it was sort of, you know,
01:04:33
◼
►
anything where I wasn't really concerned about it.
01:04:35
◼
►
Like, ah, if this got hacked, you know,
01:04:37
◼
►
my credit card's not even hooked up to this thing,
01:04:39
◼
►
I don't care, I had a password.
01:04:40
◼
►
I don't do that anymore.
01:04:42
◼
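(A quick technical aside: the reason unique passwords work is that each site gets its own random string, which is the core of what a password manager generates for you. Here's a minimal sketch of that generation step in Python; the length and character set here are hypothetical choices, and real managers layer storage and syncing on top of this.)

```python
import secrets
import string

def unique_password(length: int = 20) -> str:
    # One fresh, random password per site -- never derived from or reused anywhere else.
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(unique_password())  # different output on every call
```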
►
Last year, maybe a year ago, maybe even a little more,
01:04:49
◼
►
I was talking to Maciej Cegłowski,
01:04:54
◼
►
I hope that's how you pronounce his name,
01:04:56
◼
►
but the Pinboard guy.
01:04:58
◼
►
And he spent a lot of last year helping,
01:05:00
◼
►
he spent much of 2018, probably 2017 and 2018,
01:05:06
◼
►
helping Democratic candidates around the country,
01:05:09
◼
►
small grassroots candidates running for Congress
01:05:13
◼
►
and other offices with fundraising
01:05:16
◼
►
and with getting their security together.
01:05:19
◼
►
Because there were an awful lot of security lapses
01:05:22
◼
►
in the 2016 election.
01:05:24
◼
►
I mean, John Podesta's email, his Gmail account was hacked.
01:05:29
◼
►
I mean, it was literally, you know,
01:05:30
◼
►
and it turns out there really wasn't anything
01:05:33
◼
►
scathing in there, but there was stuff that was embarrassing.
01:05:36
◼
►
A couple of Democrats had their Gmail accounts hacked,
01:05:39
◼
►
pretty much because they got spear-phished,
01:05:41
◼
►
from my understanding,
01:05:43
◼
►
where their account was really only protected by a password.
01:05:47
◼
►
They got an email that either they
01:05:49
◼
►
or somebody who worked for them clicked the link,
01:05:52
◼
►
followed through, and entered their Gmail password,
01:05:54
◼
►
and that's game over.
01:05:56
◼
►
Without Two-Factor, they already knew your address
01:06:00
◼
►
'cause they sent you the spear-phishing email,
01:06:02
◼
►
and then you give them your password, and there it is.
01:06:05
◼
►
then they use it, they download all your email,
01:06:07
◼
►
and even if you know it, it's too late, they got it all.
01:06:10
◼
►
So he was helping people with that.
01:06:12
◼
►
And he was in Philadelphia and we had a beer.
01:06:14
◼
►
And after the Matt Honan thing a couple years ago,
01:06:19
◼
►
where Matt Honan, who's now an editor at Buzzfeed,
01:06:22
◼
►
he worked at various other places before,
01:06:25
◼
►
but he more or less had his entire digital life stolen.
01:06:29
◼
►
- And one of the ways that that happened,
01:06:32
◼
►
and it wasn't through entirely lax,
01:06:33
◼
►
It wasn't because he was really doing bad stuff,
01:06:36
◼
►
but one of the ways that it happened was
01:06:38
◼
►
the people trying to hack him took control
01:06:42
◼
►
of his cell phone account at the carrier.
01:06:47
◼
►
I don't know if it was Verizon or AT&T or whoever it was,
01:06:49
◼
►
but they went there and socially engineered,
01:06:52
◼
►
"Hey, I'm Matt.
01:06:53
◼
►
"I lost my phone," or whatever their story was.
01:06:57
◼
►
I need to get a new SIM with my phone,
01:07:01
◼
►
somehow they got a SIM with his phone number. And then all of his two-factor stuff that was sent by
01:07:06
◼
►
SMS went to them instead, and they took his iCloud, and his iCloud was in, you know, his email was in
01:07:13
◼
►
charge of all sorts of other stuff. And all of a sudden, you know, he was in a world of hurt.
01:07:17
◼
►
So I took his advice, and I believe, to the best of my ability, anything that uses SMS as the second
01:07:26
◼
►
factor no longer uses SMS as the second factor. I use something else. Like—I don't use Google's
01:07:33
◼
►
Authenticator app; I use Authy. But it's the same one I use. I'm glad to hear that
01:07:38
◼
►
you use it. But you know, it is a way. How would you describe it? You can probably describe it
01:07:43
◼
►
better than I can. Yeah, we call it OTP, or one-time passwords. There's a—
01:07:49
◼
►
it used to be like this crazy expensive thing. RSA SecurID was the only way to get it, with the
01:07:53
◼
►
little key fobs. It's the same thing in an app where there's some cryptographic synchronization
01:07:58
◼
►
that goes on between the app on your phone and whatever the server is on the backend.
01:08:04
◼
►
They share a couple of keys and then they use time codes. And so every 30 seconds it generates a new code,
01:08:09
◼
►
which leads to a new kind of anxiety because I have to use it multiple times a day. I flick
01:08:13
◼
►
it open and I'm like, Oh shit, three seconds left. Can I type it fast enough or not?
01:08:17
◼
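(For the curious: the scheme Rich is describing is TOTP, the time-based one-time-password standard in RFC 6238 that Authy and Google Authenticator both implement. A minimal sketch in Python, assuming a shared base32 secret like the one encoded in the QR code you scan at enrollment; the demo secret below is hypothetical.)

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    # RFC 6238: HMAC-SHA1 over the number of 30-second periods since the Unix epoch.
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = struct.pack(">Q", int(time.time()) // period)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10**digits
    return str(code).zfill(digits)

secret = "JBSWY3DPEHPK3PXP"  # hypothetical demo secret
print(totp(secret), "-", 30 - int(time.time()) % 30, "seconds left in this window")
```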
►
I hate that. Well, three seconds is no question. If I open it up, and I've got three seconds left,
01:08:24
◼
►
no question. I'll just wait. Yeah, yeah. On the line there. All right. Well,
01:08:29
◼
►
have I been drinking? What's my dexterity like right now? And for me, it's one of the rare
01:08:35
◼
►
things that to me is harder to do on a laptop or desktop with a real keyboard than on a phone,
01:08:43
◼
►
because the numbers are arranged like I can enter a six digit number on a three by three,
01:08:51
◼
►
you know, with the zero at the bottom keypad way faster than I can on a keyboard where they're
01:08:58
◼
►
arranged across in a row one to zero. See, I've got an iMac Pro with the big keyboard,
01:09:03
◼
►
which I don't use for anything except entering those passcodes. So on a laptop,
01:09:09
◼
►
it actually does take me longer. And so if it's like 15, 10 seconds, I'm like,
01:09:13
◼
►
I think I could do this. But if I make one mistake, I'm screwed. Oh, what do I do? But
01:09:16
◼
►
anyway, but yeah, you get like a one time password that lasts 30 seconds at a time that
01:09:21
◼
►
you use with your regular password. Yeah, right. So you enter
01:09:26
◼
►
in, you know, your name at your email.com, your regular password, and then it'll say
01:09:32
◼
►
what, you know, here, enter your six digit, whatever account and you know, you go to,
01:09:38
◼
►
you know, you go to your app that has a different one for every service you register and you
01:09:44
◼
►
get have these series of 30 second one time password. So I switched everything I could
01:09:48
◼
►
to that. You know, and I feel better for it.
01:09:53
◼
►
Yeah, that, or like what Apple does, where there's the out-of-band approach. Like I use a product called
01:09:59
◼
►
Duo Security. It's a commercial tool—Cisco bought them. I've got some friends that work over
01:10:04
◼
►
there. But, uh, I use it for some of my work-related stuff, like one of my VPNs, and it's cool.
01:10:08
◼
►
Cause it sends a push notification that pops up on my Apple watch.
01:10:11
◼
►
And I just click approved, like right there on the watch when I'm logged in, which is slick.
01:10:16
◼
►
And I don't have to kind of type in the code, but it's the same level because it's locked to
01:10:20
◼
►
the device I have on me that I've already authenticated with. Right. And like your
01:10:23
◼
►
Apple Watch is only going to show that if you've already unlocked the Apple Watch through your
01:10:28
◼
►
Apple Watch's passcode or through the connection to your phone. So in other words, if your Apple
01:10:33
◼
►
Watch is just sitting there on your desk, off your wrist, it isn't going to show the alert.
01:10:37
◼
►
You did inspire me. I pulled up my 1Password, because I use, like you, unique passwords. I don't
01:10:42
◼
►
know. Do you use 1Password or LastPass? I don't use any of those things, to be honest.
01:10:46
◼
►
I really just use the built-in keychain stuff. Yeah. Which we, which we have to get to—
01:10:50
◼
►
might be, might be a mistake. Yeah. And I use 1Password. I just looked it up. I've got
01:10:57
◼
►
1,358 unique passwords, going back to 2008.
01:11:02
◼
►
And, uh, although apparently I had no password on my Securosis blog for,
01:11:07
◼
►
at some point, I don't know, cause that's a blank,
01:11:09
◼
►
but it's actually kind of fun to look back: my MobileMe password,
01:11:13
◼
►
my Yahoo from 2008—pre-hack Yahoo. That's good stuff.
01:11:18
◼
►
No, I do have a,
01:11:20
◼
►
I do keep passwords outside the keychain in, uh, you know, Yojimbo,
01:11:24
◼
►
an app from Bare Bones Software that is only for the Mac.
01:11:27
◼
►
- Is that still supported?
01:11:29
◼
►
- Yeah, it's still actively developed.
01:11:30
◼
►
It's-- - Wow, okay.
01:11:32
◼
►
- It hasn't really had a major update in a while,
01:11:34
◼
►
so I don't know, but still works perfectly on Mojave.
01:11:39
◼
►
So I have 637 passwords there.
01:11:42
◼
►
I don't know how many are in my keychain.
01:11:45
◼
►
- And that does have encryption, if I remember.
01:11:46
◼
►
- Oh yeah, yeah, yeah, no, it's, and I, you know,
01:11:49
◼
►
part of it is the satisfaction,
01:11:50
◼
►
I know the people who wrote it, so I know,
01:11:53
◼
►
I know I've, I trust it. It is, uh, you know,
01:11:57
◼
►
and I don't keep that password in my key chain. So that is, you know,
01:12:01
◼
►
that is one of the,
01:12:02
◼
►
that's like one of the few passwords where I have to keep it somewhere outside
01:12:06
◼
►
the keychain.
01:12:07
◼
►
Yeah. My iCloud and my 1Password password are the two, and my system login—
01:12:12
◼
►
I guess those are the three passwords I don't have stored anywhere.
01:12:15
◼
►
My friend, uh, Brent Simmons, uh, collaborator on,
01:12:19
◼
►
on our app Vesper and just a long time friend. And he's been on the show many,
01:12:23
◼
►
many times. But we were on Slack together and a couple of weeks ago, he just wrote,
01:12:29
◼
►
"I can't believe this. I've had the same login password on my Mac for 10 years and I suddenly
01:12:34
◼
►
can't remember it." That has happened to me because it is a muscle memory thing. I type
01:12:43
◼
►
it without really thinking about it. I can think about it right now and I could write
01:12:49
◼
►
it down. But it has happened to me where it's a different sort of memory from just actually
01:12:56
◼
►
typing it on the keyboard with muscle memory than actually thinking about it. And if you
01:13:01
◼
►
actually think about it, sometimes you can like temporarily forget it. And it is, it is terrifying.
01:13:06
◼
►
So I've had to read mine over the phone to my wife. Like I'm traveling and my,
01:13:11
◼
►
she needs something off the computer and I don't have, I'm not having an affair and I don't have
01:13:16
◼
►
dick pics on my computer. So it's okay. I don't care if she gets in there and I can't
01:13:20
◼
►
remember it. So I will pull out a keyboard and just type it in. Yeah. Dictate it as I'm typing
01:13:28
◼
►
it. Type it into a temporary, like, TextEdit document so you can see it and be like,
01:13:32
◼
►
Oh yeah, that actually looks like a goofy password. There you go. No, I'm like that.
01:13:36
◼
►
And it's, you know, and then he, like, he was at work at the Omni Group, and then he, like, went home
01:13:40
◼
►
and he went just like the commute and he went home and then he could just type it and it was like,
01:13:44
◼
►
okay, I got it, you know, but he needed that, like, sort of contextual break. Yeah.
01:13:50
◼
►
Yeah. Well, that's my biggest fear, like getting hit by a bus, and then my wife can't get into
01:13:55
◼
►
stuff. I know because she does. And I have some of those written down and I've got a copy I sent to my
01:13:59
◼
►
lawyer, and, you know—but she, like... my wife's technical. And for some reason, she just doesn't
01:14:04
◼
►
want to use 1Password. Now that is, that is something that I want to set up. I want to set up
01:14:09
◼
►
like, I don't know if it would be a safe deposit box. I don't know if it would be something I give
01:14:13
◼
►
to my lawyer, but some kind of binder of written stuff that exists in a place where there is a,
01:14:18
◼
►
if John is incapacitated, here's everything you need. I don't have that set up. There's an awful
01:14:25
◼
►
lot that is in my brain. And- Well, you got to maintain it because once you set it up,
01:14:30
◼
►
the passwords change. So I'm like, all right, well, set up so she can get into my computer
01:14:35
◼
►
and 1Password. And I just told her, if things get really bad, you call one of my partners from
01:14:39
◼
►
work and like Chris Pepper, because he's a friend and he's super technical and can kind of walk
01:14:44
◼
►
through and figure it out. And maybe they can open up to the hacker network and somebody can get into
01:14:48
◼
►
my stuff if the paperwork is no longer current. Chris Pepper is one of the handful of people who
01:14:54
◼
►
should be an employee of Daring Fireball, but I've never paid him a nickel, but he has reported
01:14:59
◼
►
more typographical errors on Daring Fireball than I could ever count.
01:15:06
◼
►
We put him on retainer at Securosis like eight years ago. And so, for his typo-reporting
01:15:14
◼
►
ability, I just said, just go in and fix it. He just emails constantly. He's an
01:15:18
◼
►
absolute savant. And I agree. I would say his hit rate—number one, he catches all the mistakes,
01:15:24
◼
►
and mistakes, of course, get fixed. And when he has a quibble, like a suggestion for this would
01:15:29
◼
►
be better, I would say his batting average is somewhere between 90 and 95%. And I would say,
01:15:34
◼
►
no more than one out of 10, but probably closer to one out of 20 times, will I actually reject
01:15:40
◼
►
his suggestion and stick with what I have? Yeah. We have him, like, edit some of our more complicated
01:15:46
◼
►
research papers that we do. And occasionally he'll have a miss on those, which changes the context.
01:15:51
◼
►
And he doesn't just edit me. It's my, you know, I've got two partners these days at Securosis and
01:15:56
◼
►
he edits, it's all of us. And now with the new company, the startup, um, I just give him
01:16:00
◼
►
the email of our marketing guy, and so he'll send him all the corrections. So he's pretty good.
01:16:06
◼
►
Before we move on, let's just talk about the various ways that people can send private messages
01:16:11
◼
►
to each other. And I'm curious what you would recommend. I almost exclusively use iMessage,
01:16:17
◼
►
simply because almost everybody I communicate with is on iMessage. And I can't even remember
01:16:28
◼
►
the last time I sent SMS other than when I do get some kind of like confirmation
01:16:34
◼
►
from some service or something, you know, it's all automated. Everything I get by SMS is automated.
01:16:41
◼
►
That is a weird aspect of being in the United States. I totally get it. Every time I write
01:16:48
◼
►
about iMessage, I have readers all around the world and I totally get it that in Asia and Europe,
01:16:54
◼
►
iMessage is not really a big player and WhatsApp is huge and there's other ones, other services
01:17:00
◼
►
in other countries. I totally get it. But here for me personally, iMessage is,
01:17:04
◼
►
everybody I want to talk to is on iMessage. And then second to that, and this is what you and I
01:17:09
◼
►
have used a lot, is Twitter DM. Not because I think Twitter DM is secure, but because everything
01:17:15
◼
►
that goes over Twitter DM, it's almost more like a more convenient form of email. I don't trust it
01:17:21
◼
►
any more than I trust email, which is to say in terms of security, not at all. But, you know,
01:17:29
◼
►
and I have my Twitter DMs open. I did this a couple months ago and I thought it would be
01:17:34
◼
►
a disaster and I'd have to quick hurry, shut the door and close them. And it's actually worked out
01:17:39
◼
►
amazingly well. I don't really get, I get very little spam, maybe like two or three a week.
01:17:45
◼
►
And, you know, the ones that I get that I wouldn't have been able to get before,
01:17:50
◼
►
In other words, DMs from people who I don't follow are very, you know, they're nice. They're
01:17:57
◼
►
just from readers and readers at the website and listeners of the show. And I kind of the
01:18:01
◼
►
thing I kind of extra like about it is the nature of a Twitter DM user interface wise
01:18:08
◼
►
promotes brevity. I mean, no offense to those of you who send very long emails, but email,
01:18:16
◼
►
You know, it's very easy to write a long email and it's very, it's not so easy to write a very
01:18:20
◼
►
long DM. And so getting feedback from listeners and readers in the context of a DM is actually
01:18:26
◼
►
more convenient than an email for me. Yeah, I use iMessage for anything and everything I can. The
01:18:33
◼
►
problem is, I have a wider range of people, I think, and quite a few in particular, like family
01:18:39
◼
►
members and some friends that are not on iMessage. And so then I've got to use SMS for them, but I
01:18:45
◼
►
I don't put anything in there that would be sensitive. But at work I do.
01:18:48
◼
►
So there's times I need to exchange passwords and, you know, not a lot,
01:18:53
◼
►
but I'll just go to iMessage.
01:18:54
◼
►
I trust the encryption on that.
01:18:56
◼
►
Uh, Twitter DM.
01:18:58
◼
►
is the same thing.
01:18:59
◼
►
Number two, I've got multiple ones here with various people, but again, I, I
01:19:04
◼
►
assume it's insecure, especially cause I, um, uh, had some of my stuff hacked.
01:19:09
◼
►
So it wasn't me.
01:19:10
◼
►
It was, there was one of those big exposure things and this is a while ago
01:19:13
◼
►
now. And somebody got into Dan Kaminsky's Twitter, and Dan and I had exchanged some messages about
01:19:20
◼
►
something. And one of the things was not horrible. It was mildly embarrassing. It was something like
01:19:25
◼
►
because it was about somebody else, but it wasn't insulting in any way. But it was enough. I had to
01:19:31
◼
►
go talk to this person who I think was wicked drunk at the time. So he didn't care when I went
01:19:36
◼
►
to him and finally had the conversation, you know. And that reinforced: you can't trust it.
01:19:41
◼
►
I hate to say it. I have to use Facebook stuff. I really try and minimize and compartmentalize it.
01:19:47
◼
►
Like I only use Facebook from my phone. I have it isolated out from like everything else and
01:19:52
◼
►
Facebook Messenger. So I have to use that sometimes for, like, you know, family members and friends
01:19:57
◼
►
that like I don't talk to more than once every five years. And, uh, but the sensitive stuff,
01:20:03
◼
►
iMessage first, and Signal or Wickr would be the two that I would drop back to after that.
01:20:08
◼
►
See, Signal I know, and that is,
01:20:11
◼
►
you know, very well known, but I'm not familiar with Wickr at all. How do you spell it?
01:20:18
◼
►
Wickr's, yeah, W-I—I think it's W-I-C-K-R—or, let me pull it up. Yeah, it's one of those
01:20:23
◼
►
different spellings. So it's W-I-C-K-R. And Wickr Me is the name of the app. I guess they changed
01:20:30
◼
►
the name of the app. A friend of mine started the company—someone I do stuff with at Def Con and
01:20:34
◼
►
everything, which is how I got involved there. So a bunch of the Def Con crew, because I work
01:20:39
◼
►
at Def Con, will use Wickr. But it's one of those two. But I, like, I mean, I use either one of them,
01:20:46
◼
►
like once a year at this point. The thing I don't like about Signal is that it, your ID is your phone
01:20:53
◼
►
number. And I don't like giving out my phone number. Like I've seen a lot of people in the
01:20:58
◼
►
in the press who publicize, you know,
01:21:00
◼
►
you can reach me on Signal privately
01:21:02
◼
►
and here's my phone number.
01:21:04
◼
►
And I don't have, I have secondary phone,
01:21:09
◼
►
I'm certainly not gonna use my main phone number for it,
01:21:12
◼
►
but I have like a, you know,
01:21:15
◼
►
I spend 50 bucks a month for a second SIM
01:21:17
◼
►
that I keep in like an Android phone just for testing
01:21:19
◼
►
and to have another SIM, I could use that, I guess,
01:21:22
◼
►
and I don't really care if it's public,
01:21:23
◼
►
but I don't know, it still feels like an invitation
01:21:26
◼
►
to have my phone ringing off the hook.
01:21:28
◼
►
- Yeah, well, and for me, it's people that I know
01:21:31
◼
►
in the industry and who wanna, they don't trust iMessage
01:21:34
◼
►
or they use Android, which I don't understand
01:21:38
◼
►
any security pros that use Android at this point.
01:21:40
◼
►
You just shouldn't be doing that, but some people do
01:21:44
◼
►
'cause you know, freedom or something, I don't know.
01:21:47
◼
►
And so that's when I have to drop back to Signal.
01:21:51
◼
►
- Do you feel differently about Android?
01:21:52
◼
►
Do you feel differently about Google's,
01:21:53
◼
►
like the Pixel phones?
01:21:55
◼
►
- I would use a Pixel phone if I had to.
01:21:57
◼
►
They're not there yet.
01:21:58
◼
►
And that's the problem.
01:22:00
◼
►
Like, Apple has such a lead because of how tightly
01:22:04
◼
►
integrated the hardware is, and their secure coprocessors
01:22:07
◼
►
are better than the Google Pixel, what they have
01:22:10
◼
►
available in those devices.
01:22:12
◼
►
But yeah, I would be, if I had to use,
01:22:16
◼
►
I tell everybody, if you have to use Android,
01:22:17
◼
►
use a Pixel, I wouldn't really, even the Samsungs,
01:22:19
◼
►
I'm not a fan; that would be like the third choice,
01:22:21
◼
►
would be a Samsung.
01:22:26
◼
►
- All right, let's keep going.
01:22:29
◼
►
So a couple episodes ago, Joanna Stern was on the show,
01:22:33
◼
►
I think it was two episodes ago,
01:22:34
◼
►
and she even mentioned at the end of the show,
01:22:36
◼
►
I said, "What are you working on?"
01:22:36
◼
►
She said she was working on a piece about webcams
01:22:38
◼
►
and webcam security, and then that column dropped last week.
01:22:43
◼
►
And I was intrigued by, so she hired a security researcher
01:22:48
◼
►
to help attack her knowingly to see how he could take control
01:22:55
◼
►
of her webcam. She had an HP notebook running Windows 10 and a MacBook running, presumably,
01:23:03
◼
►
Mojave. I don't know if she said, but a recent version of Mac OS X. I was intrigued by the
01:23:11
◼
►
vectors that were used to try to take control of it, but I disagreed with her advice at the end.
01:23:16
◼
►
Therefore, I wrote a response to it. Probably everybody listening has read it. I assume,
01:23:23
◼
►
I don't know. I don't assume that everybody in life has read the stuff I write at Daring Fireball,
01:23:28
◼
►
but I sort of assume the people who listen to this podcast have. So I won't rehash the whole
01:23:33
◼
►
thing, but I thought it was just very odd. I thought that her column sort of proved that
01:23:39
◼
►
webcams are pretty safe if you're running, you know, recent versions of either Windows or Mac OS
01:23:46
◼
►
10 and pay attention to the default security warnings for things that you've downloaded and
01:23:52
◼
►
things asking for permissions.
01:23:55
◼
►
And yet her advice, she was like,
01:23:57
◼
►
I'm still putting a cover on my webcam thing.
01:24:01
◼
►
- Yeah, I've got friends and stuff who do that.
01:24:03
◼
►
And maybe it's 'cause they haven't just dug into,
01:24:07
◼
►
you know, particularly on Apple, what they do.
01:24:09
◼
►
So first of all, if you're on a recent Mac,
01:24:14
◼
►
and I don't know how many years back,
01:24:15
◼
►
probably about two or three years back,
01:24:17
◼
►
I don't really worry at all.
01:24:18
◼
►
The webcam is tied to the light.
01:24:22
◼
►
And that's done in hardware. Now, I had asked Apple, I think a year or two ago,
01:24:28
◼
►
and I didn't get a response, which was, is it hardware locked or is it a firmware lock?
01:24:35
◼
►
The difference being, if it's hardware locked, there's like some circuit that triggers it.
01:24:39
◼
►
If the camera is on, the light is on. And firmware being there's very close to that,
01:24:45
◼
►
but there's like a little software decision that's made, but technically you could blow through it
01:24:49
◼
►
by compromising the firmware.
01:24:51
◼
►
Either way, that's not something that's happening
01:24:53
◼
►
under normal circumstances.
01:24:54
◼
►
- No, and there was, and it's one of those things
01:24:56
◼
►
where once something bad happens,
01:24:58
◼
►
it lands in people's memory and then they never shake it.
01:25:01
◼
►
And there was an interesting story
01:25:03
◼
►
that the FBI had some sort of exploit,
01:25:06
◼
►
like in the mid-2000s, that could take control
01:25:09
◼
►
of a MacBook's webcam remotely.
01:25:12
◼
►
But that's a long time ago on very different hardware.
01:25:16
◼
►
Yeah. And Windows systems, I don't think most of them have that level of protection,
01:25:21
◼
►
but the operating system is a lot stronger as well these days than it used to be. So,
01:25:27
◼
►
I mean, I actually have an app that'll tell me if my microphone or my camera is on. The
01:25:31
◼
►
main reason I have that, it's one of the ones from Patrick Wardle who does a lot of security
01:25:35
◼
►
research. It's more for the microphone part than the camera part that I've got that attached.
01:25:40
◼
►
And also see what, I'm just curious, you know, what processes are turning on my mic throughout
01:25:44
◼
►
the course of the day. How's Chrome going to screw up my computer today is how I otherwise
01:25:49
◼
►
refer to it. But, um, yeah, Patrick Wardle has a utility. Do you know the name of it
01:25:53
◼
►
off the top of your head? OverSight. OverSight. And I know that the guys who do Little
01:25:58
◼
►
Snitch have a utility too. Oh really? I missed that one. I have Little Snitch on, like, everything
01:26:05
◼
►
I own. It's not part of Little Snitch, but they, uh, they have a utility that does
01:26:09
◼
►
the same thing and gives you a notice when somebody,
01:26:12
◼
►
when anything is accessing the microphone.
01:26:15
◼
►
And that was part of what I wrote was that
01:26:18
◼
►
I don't get this paranoia about the camera.
01:26:20
◼
►
I kind of do, I don't wanna be obtuse.
01:26:24
◼
►
Like I kind of get it that nobody wants
01:26:27
◼
►
to be surreptitiously photographed.
01:26:29
◼
►
And there are people out there doing,
01:26:33
◼
►
with bad intentions, and there are,
01:26:36
◼
►
most people don't really know how a computer works,
01:26:38
◼
►
And so they have no idea whether they can trust that by going to
01:26:42
◼
►
xyz.com, whether xyz.com can access the, you know, can
01:26:48
◼
►
somebody clever who's making this website somehow turn on my
01:26:51
◼
►
webcam? And can they do it without having the light come
01:26:54
◼
►
on? I don't know. So I'm just going to cover it up. I mean, I
01:26:58
◼
►
mean, it is a real issue in a couple of cases. One, depending
01:27:02
◼
►
on the platform. Like, on Windows, I'd be much less confident. And Windows
01:27:06
◼
►
is way more secure than it used to be,
01:27:08
◼
►
but there's not the hardware software ties
01:27:10
◼
►
the way we get with Macs.
01:27:12
◼
►
I mean, that's one of Apple's--
01:27:13
◼
►
I've got a piece I've been meaning to write for like a year,
01:27:15
◼
►
which is Apple's strategic security advantages,
01:27:18
◼
►
and owning the hardware and owning the software
01:27:20
◼
►
is just a massive advantage over everything else.
01:27:22
◼
►
That's why Google Pixel is the most secure
01:27:25
◼
►
of the Android devices, because Google has that control
01:27:28
◼
►
of the entire ecosystem there.
01:27:30
◼
►
So I do worry a little bit more,
01:27:32
◼
►
even though Windows 10 is actually much better
01:27:35
◼
►
than it's ever been and is really,
01:27:38
◼
►
infection rates are a lot lower,
01:27:39
◼
►
it's still potentially more likely.
01:27:41
◼
►
On the Mac side, I don't worry about it at all
01:27:44
◼
►
unless I'm talking to somebody
01:27:45
◼
►
who's on a really old computer.
01:27:47
◼
►
And again, my definition of really old and yours
01:27:49
◼
►
might be different than other people out there.
01:27:52
◼
►
- The Objective Development group
01:27:55
◼
►
is the people who make Little Snitch.
01:27:57
◼
►
They also make the excellent,
01:27:58
◼
►
one of my favorite utilities of all time, LaunchBar.
01:28:03
◼
►
but their app for letting you know when somebody
01:28:07
◼
►
has the microphone or the camera is called Micro Snitch.
01:28:10
◼
►
Anyway, I'll put a link in the show notes.
01:28:12
◼
►
- I'm still hanging on to LaunchBar.
01:28:14
◼
►
It seems like everybody else is going to Alfred and stuff,
01:28:16
◼
►
and I love LaunchBar.
01:28:17
◼
►
- I love LaunchBar, and I've looked at Alfred,
01:28:20
◼
►
and it looks like a great alternative,
01:28:23
◼
►
but I've never seen any,
01:28:27
◼
►
I don't see anything that would make me want to switch,
01:28:29
◼
►
mainly, if only because I've got this crazy muscle memory
01:28:33
◼
►
for LaunchBar. But anyway, but they're two great utilities. I don't know how anybody,
01:28:38
◼
►
a power user goes without them. But yeah, I don't know. I just can't believe that people
01:28:44
◼
►
are so paranoid about the camera. I get it. I get it. You don't want pictures of you,
01:28:48
◼
►
but boy, I sure wouldn't want to be surreptitiously recorded either. And there is no indicator
01:28:54
◼
►
light for the microphone. And quite frankly, to me, it all comes down to the question of
01:29:00
◼
►
do you trust the software running on your device?
01:29:04
◼
►
- And I get it that there's people,
01:29:08
◼
►
and in the old days computers are so complicated now.
01:29:10
◼
►
They're so much more complicated.
01:29:11
◼
►
Like, part of what makes... on a really modern MacBook,
01:29:15
◼
►
the camera for the webcam
01:29:20
◼
►
goes through the T2 security chip.
01:29:24
◼
►
And that's an iOS computer.
01:29:27
◼
►
It's a computer in the computer.
01:29:29
◼
►
And to use a variation of a word used by Jeff Bezos,
01:29:36
◼
►
it's a complexifier.
01:29:38
◼
►
It's really cool that there is an iOS computer running
01:29:42
◼
►
inside your Intel-based Mac computer doing
01:29:46
◼
►
a bunch of security-related stuff,
01:29:48
◼
►
including completely controlling access to the camera
01:29:53
◼
►
so that nothing on the Mac can actually touch the camera.
01:29:56
◼
►
It has to make a request through an API,
01:29:59
◼
►
and the API has to go through the T2, which is running iOS
01:30:02
◼
►
and has a secure enclave and all this other stuff.
01:30:05
◼
►
That's all cool, but it's complex for me to understand,
01:30:10
◼
►
and I have a degree in computer science
01:30:13
◼
►
and spend my days thinking about these things.
01:30:15
◼
►
For a typical person, it's a black box, I get it.
01:30:19
◼
►
- Yeah, I'm not actually sure the T2 covers the camera.
01:30:23
◼
►
I'd actually pulled up that document before that.
01:30:24
◼
►
- I know, it doesn't say so in that document,
01:30:26
◼
►
but I was told by somebody that it does go through there.
01:30:28
◼
►
I don't know.
01:30:29
◼
►
But since it's still just an API call
01:30:33
◼
►
that anything can make,
01:30:35
◼
►
I think that's why Apple doesn't really call it out.
01:30:37
◼
►
It does go through the T2 though.
01:30:39
◼
►
And I think-
01:30:40
◼
►
- 'Cause Face ID uses all the Secure Enclave stuff for sure.
01:30:44
◼
►
- And I think that it's partly just to make it harder
01:30:47
◼
►
for anything on the Mac to access the firmware
01:30:50
◼
►
behind the camera.
01:30:52
◼
►
You can't even get to it because it's through there.
01:30:54
◼
►
So I don't think they brag about it in the white paper,
01:30:57
◼
►
but I know I've been told, I don't know for a fact,
01:31:00
◼
►
but I've been told for a fact by somebody who would know
01:31:02
◼
►
that if you have a Mac with a T2,
01:31:05
◼
►
the webcam access goes through there.
01:31:08
◼
►
- I had thought that, but I was right.
01:31:11
◼
►
I flat out pulled up the paper
01:31:12
◼
►
because I knew we were gonna talk about this.
01:31:14
◼
►
Like, oh, it's not in there.
01:31:15
◼
►
- So the cool thing that it does that the T2 ones do
01:31:18
◼
►
is that the microphone, and it is a physical disconnect,
01:31:21
◼
►
when you close the lid on a modern MacBook that has a T2,
01:31:26
◼
►
the microphone is physically disconnected.
01:31:28
◼
►
So that there is, you know, there is,
01:31:30
◼
►
the microphone actually doesn't even have
01:31:32
◼
►
an electronic connection, you know,
01:31:33
◼
►
it doesn't even get power when the lid is closed.
01:31:35
◼
►
It's actually physically disconnected.
01:31:38
◼
►
So you only have, you know,
01:31:39
◼
►
your microphone can theoretically only be used
01:31:41
◼
►
when the lid is open.
01:31:42
◼
►
- You know, and that's also really good
01:31:45
◼
►
because I don't know how many like conference calls
01:31:47
◼
►
you're on, people are always closing their laptop lids
01:31:49
◼
►
and not realizing they're not running
01:31:50
◼
►
the microphone on their headset and it stops that annoyance.
01:31:54
◼
►
Well, you know what? All right. So let me go through. Since I wrote my rejoinder to Joanna's
01:32:00
◼
►
piece on webcams, let me, you know, I heard from readers, you know, all very polite. And it's one
01:32:05
◼
►
of those things where, you know, we can agree to disagree. And it actually makes me very happy
01:32:10
◼
►
in this era of, of everything turns into a shouting argument and nobody listens to the
01:32:16
◼
►
the other side, where it was all reasonable and it made me very happy about the level
01:32:21
◼
►
of discourse between me and my readers. But one of the reasons that readers said, "Well,
01:32:30
◼
►
I get what you're saying. I don't disagree with anything you're saying, but I can't
01:32:35
◼
►
live without a webcam cover," like one of these little stick-on things with a switch,
01:32:40
◼
►
'cause I'm on conference calls all the time.
01:32:43
◼
►
And I want, you know,
01:32:44
◼
►
I just want to know that I'm not being photographed
01:32:49
◼
►
when I don't think I'm being photographed.
01:32:50
◼
►
Totally reasonable.
01:32:51
◼
►
So you know, that to me isn't voodoo.
01:32:54
◼
►
It's not a, I have no idea what an attacker could do,
01:32:57
◼
►
and so who knows what's running on my computer.
01:32:59
◼
►
It is, I know exactly what I'm running.
01:33:00
◼
►
I'm running whatever, you know,
01:33:03
◼
►
video conferencing software,
01:33:05
◼
►
and I want to physically control
01:33:08
◼
►
when I know that I'm being photographed or not.
01:33:10
◼
►
And that's totally reasonable.
01:33:12
◼
►
- Yeah, and that goes back to the software trust.
01:33:14
◼
►
Like everything I've got running,
01:33:15
◼
►
I've gotta open it and turn it on,
01:33:17
◼
►
but I know there's things that people use in enterprises
01:33:19
◼
►
or whatever where it could pop it open
01:33:21
◼
►
or they accidentally have it on when they join a call.
01:33:23
◼
►
Yeah, I'm cool with that.
01:33:26
◼
►
To be honest, there's something to be said for placebos.
01:33:30
◼
►
It makes you feel better.
01:33:32
◼
►
My wife gives me crap all the time
01:33:34
◼
►
because I'm a scientific skeptic.
01:33:36
◼
►
I'm a paramedic; I did bio in college. Like, somebody comes out with the latest,
01:33:41
◼
►
I'm doing this diet or that diet or this or that or chiropractic or whatever. It's like,
01:33:46
◼
►
yeah, none of that shit works, but I'm not going to tell you because it's not going to improve
01:33:49
◼
►
anything and it's causing no harm to you to do that. Joanna's article though, I didn't like
01:33:54
◼
►
because—and she is an exceptional journalist, like, I feel like I don't like to criticize
01:34:00
◼
►
that, you know, somebody who's that good at what she does, but all those articles,
01:34:06
◼
►
and this is a bigger issue, it's like really easy to fall into this trap when you're working
01:34:11
◼
►
with a security researcher hacker type, and they may not even realize they're doing it,
01:34:16
◼
►
where you want them to show you something, they come up with a way to do it, but you got to do
01:34:21
◼
►
like the four or five manual steps to allow it. And she detailed all of those out. What I've
01:34:26
◼
►
learned over the years, though, is people lose that context when they read those articles.
01:34:29
◼
►
- Right, right.
01:34:30
◼
►
- So they read the headline, the first graph, and the last graph,
01:34:32
◼
►
and then they forget everything in the middle.
01:34:34
◼
►
- Right, that was, I don't think it was the very ending,
01:34:37
◼
►
but it was like the ending of part one of my thing,
01:34:39
◼
►
which is basically people are gonna take away from this,
01:34:41
◼
►
the Wall Street Journal says,
01:34:42
◼
►
you should cover up your webcam.
01:34:45
◼
►
- And that to me is the wrong conclusion, right?
01:34:47
◼
►
And the steps she had to jump through,
01:34:49
◼
►
and, to Microsoft's credit,
01:34:53
◼
►
to me it was equally convoluted on the Windows machine
01:34:57
◼
►
with the modern Windows 10 and Windows Defender running,
01:35:01
◼
►
you had to ignore some pretty,
01:35:04
◼
►
some pretty really big warnings, yeah,
01:35:07
◼
►
really big warnings you had to click through.
01:35:10
◼
►
And the guise under which her Mac's camera was compromised
01:35:15
◼
►
was somebody applying for a job that she was looking for
01:35:20
◼
►
to help with video production, and sent her a resume
01:35:24
◼
►
that required LibreOffice.
01:35:25
◼
►
Number one, who sends the resume in that format?
01:35:27
◼
►
I mean, who doesn't--
01:35:29
◼
►
Somebody who doesn't want the job?
01:35:31
◼
►
Right, and once opened, required decreasing the level
01:35:36
◼
►
of macro security in LibreOffice to a point
01:35:39
◼
►
where LibreOffice says, are you sure this is really insecure?
01:35:42
◼
►
I mean, whose resume has--
01:35:45
◼
►
A has macros, period, but B has macros
01:35:47
◼
►
that require you to decrease the default security settings.
01:35:50
◼
►
And then even then, it still said,
01:35:53
◼
►
are you going to allow this app access to your camera? Cancel or okay?
01:35:57
◼
►
Yeah, and so when we had this thing on, the light went on.
01:36:03
◼
►
there was still, I haven't seen any proof. And again,
01:36:06
◼
►
maybe there is a way and maybe it's a deep dark secret on the dark web and it's
01:36:10
◼
►
tightly held secret that there's some kind of way to get the Mac book or modern
01:36:14
◼
►
Mac book camera on without having the light come on. But if there is,
01:36:17
◼
►
I haven't seen it.
01:36:19
◼
►
You know, it's a category of something
01:36:21
◼
►
that we call stunt hacking.
01:36:23
◼
►
And stunt hacking is where you do
01:36:25
◼
►
something big and exploitative, frequently to get attention.
01:36:29
◼
►
Now, this was not directly that case.
01:36:32
◼
►
And I want to be really clear about that,
01:36:34
◼
►
because the researcher wasn't like trying to trick her
01:36:37
◼
►
or make it out to be bigger than it was.
01:36:39
◼
►
This was a legitimate collaboration
01:36:40
◼
►
between the journalist and the researcher.
01:36:42
◼
►
And he's like, yeah, here's the technique.
01:36:44
◼
►
So it's not really stunt hacking in the same way.
01:36:48
◼
►
And sometimes stunt hacking can be good.
01:36:50
◼
►
So when Charlie Miller and Chris Valasek hacked cars,
01:36:53
◼
►
because that was legit, it was sensationalistic,
01:36:57
◼
►
but it also woke people up to an issue
01:36:58
◼
►
that was legitimately being ignored.
01:37:02
◼
►
The problem is when it goes bad.
01:37:04
◼
►
And that's why I'm very sensitive
01:37:06
◼
►
to these kinds of articles,
01:37:07
◼
►
because the average reader's not gonna know the difference.
01:37:10
◼
►
Years ago, when TJX got hacked,
01:37:12
◼
►
I actually was working with 60 Minutes.
01:37:15
◼
►
I got introduced to a 60 Minutes producer
01:37:17
◼
►
to do a whole piece on that kind of hacking.
01:37:20
◼
►
And if you remember, that was like,
01:37:21
◼
►
they were using WEP for their network.
01:37:23
◼
►
And so the bad guys war-drove, used the old—
01:37:27
◼
►
for those who don't remember,
01:37:28
◼
►
WEP was the not-very-secure Wi-Fi standard encryption—
01:37:33
◼
►
broke into their network
01:37:35
◼
►
and they were able to sniff the credit cards
01:37:36
◼
►
and got many millions.
01:37:38
◼
►
And I remember it was like, it was enticing
01:37:40
◼
►
'cause I'm driving around with this producer
01:37:42
◼
►
and he flew into Phoenix where I'm living these days
01:37:45
◼
►
or those days.
01:37:46
◼
►
And I'm driving, I think it was outside of a Home Depot, and I'm like, yep.
01:37:50
◼
►
I'm doing war driving.
01:37:51
◼
►
Like, yep, that's insecure.
01:37:52
◼
►
That's insecure.
01:37:53
◼
►
And he really, he's like talking about where the cameras are going to be placed
01:37:56
◼
►
and where Lesley Stahl is going to sit and all this other stuff.
01:37:59
◼
►
And he wanted me to, like, guarantee that that was insecure so that she
01:38:03
◼
►
could walk into the office with the tape and show them, you know, have
01:38:06
◼
►
that, have that gotcha moment.
01:38:07
◼
►
And I looked at the guy, I'm like, I'm not breaking the law to be on TV.
01:38:11
◼
►
Like, is that secure behind the scenes?
01:38:15
◼
►
I'm not going to fake it and say that it was definitively insecure because yeah, could
01:38:21
◼
►
I break that wireless network easy?
01:38:23
◼
►
Like at that point it was so easy, anybody could do it.
01:38:26
◼
►
But behind the scenes, maybe they have those connections encrypted and I couldn't do anything
01:38:31
◼
►
Like I don't know.
01:38:32
◼
►
I'm not going to put my reputation on the line.
01:38:34
◼
►
And they ended up doing that story like nine months later with other people. And I
01:38:39
◼
►
was, I don't need that kind of fame.
01:38:41
◼
►
So I was totally fine with it.
01:38:42
◼
►
It was really interesting because there's different desires—
01:38:47
◼
►
the journalists, you need the page views and everything.
01:38:49
◼
►
Some researchers do want to build up their reputations,
01:38:52
◼
►
but it's a fine line when you start writing these pieces
01:38:55
◼
►
without the right context.
01:38:56
◼
►
- Is it sort of like, you know, like, hey,
01:38:58
◼
►
like in the physical world, like, hey,
01:38:59
◼
►
the front door is open, I can just open this front door.
01:39:02
◼
►
And I don't know if I can go in there
01:39:05
◼
►
and steal money from their cash register.
01:39:07
◼
►
I don't know if the cash register is locked or not,
01:39:09
◼
►
but I'm not going into the store.
01:39:11
◼
►
I'm not going into their store while it's closed in the middle of the night just to see.
01:39:15
◼
►
You know, it's even more like I walk and I look and I see the door cracked open or the
01:39:21
◼
►
window cracked open like without even touching it. Yeah. Yeah. It's that.
01:39:25
◼
►
But I'm not going to pop it open and go in.
01:39:27
◼
►
No, because fucking Robocop might be in there. Like I'm not. Yeah, exactly. Now,
01:39:33
◼
►
admittedly, I think they probably had horrible security behind that. But um,
01:39:37
◼
►
and fully compromisable, but I can't guarantee it. I'm sure as hell not going on national
01:39:44
◼
►
television and saying I could.
01:39:45
◼
►
I will say this. I heard from somebody who knows, I mentioned a few episodes ago that
01:39:50
◼
►
I'd mentioned that famously everybody seems to think Mark Zuckerberg uses a piece of
01:39:56
◼
►
tape over his webcam because there was a picture of him once a couple years ago sitting in front
01:40:00
◼
►
of a MacBook with a piece of tape over the webcam and a piece of tape over the microphone. And I
01:40:05
◼
►
I mentioned, I don't know which episode it was,
01:40:07
◼
►
I mentioned a few episodes ago that,
01:40:09
◼
►
I don't know if that was actually,
01:40:12
◼
►
nobody's ever confirmed that that's his device.
01:40:13
◼
►
Somebody who works at Facebook wrote to me and said,
01:40:15
◼
►
"I can confirm that Mark Zuckerberg
01:40:17
◼
►
does use a webcam cover."
01:40:20
◼
►
So there it is, Mark Zuckerberg does.
01:40:22
◼
►
But he said it's probably not even his choice.
01:40:24
◼
►
He's got like a team of like eight security people
01:40:26
◼
►
and he just does whatever they say.
01:40:28
◼
►
But I will say this, a piece of tape over your microphone
01:40:31
◼
►
does not block the microphone.
01:40:34
◼
►
And I encourage anybody who thinks that it does to try it.
01:40:37
◼
►
Put a piece of tape over your microphone.
01:40:39
◼
►
- Has somebody done that?
01:40:40
◼
►
- I've seen people do it.
01:40:42
◼
►
So the thing that I've seen that you can buy on Amazon
01:40:45
◼
►
is you can buy, I don't know what they call it,
01:40:48
◼
►
but you can buy like a dummy microphone
01:40:51
◼
►
that plugs into the microphone jack
01:40:53
◼
►
and it doesn't actually do anything.
01:40:57
◼
►
But when you have a microphone plugged
01:40:58
◼
►
in the microphone jack, the MacBook defaults
01:41:00
◼
►
to using that external microphone as the microphone.
01:41:03
◼
►
And so it's just like a dummy plug.
01:41:07
◼
►
- That would actually work to block your microphone.
01:41:09
◼
►
So if you actually do, for whatever reason,
01:41:11
◼
►
want to block your Mac's microphone by default,
01:41:15
◼
►
I would suggest buying one of those.
01:41:17
◼
►
They cost exactly as much as you think.
01:41:18
◼
►
They're like $4 at Amazon.
01:41:21
◼
►
But buy one of those.
01:41:22
◼
►
Don't think you can cover your microphone
01:41:24
◼
►
with a piece of tape.
01:41:25
◼
►
It doesn't work.
01:41:26
◼
►
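If you want to verify that trick on your own Mac, here is a minimal sketch in Python, assuming a recent macOS where system_profiler supports JSON output; the exact key names in the report vary by OS release, so it matches loosely rather than relying on specific fields:

    import json
    import subprocess

    # Ask macOS for its audio device inventory as JSON.
    # system_profiler and the SPAudioDataType data type are real;
    # the JSON key names differ across macOS releases, so match loosely.
    report = subprocess.run(
        ["system_profiler", "SPAudioDataType", "-json"],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(report.stdout)

    def walk(items, depth=0):
        for item in items:
            name = item.get("_name", "<unnamed>")
            # Surface any fields that look input- or default-related.
            flags = {k: v for k, v in item.items()
                     if "input" in k.lower() or "default" in k.lower()}
            if flags:
                print("  " * depth + f"{name}: {flags}")
            walk(item.get("_items", []), depth + 1)

    walk(data.get("SPAudioDataType", []))

With one of those dummy plugs inserted, the external line-in should show up as the default input device, which is the whole trick.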
- And don't be a target,
01:41:27
◼
►
because if the attacker actually really wants
01:41:30
◼
►
to get your microphone and they know,
01:41:33
◼
►
I mean, they can just look for multiple sound sources
01:41:37
◼
►
- It's just, I'm assuming that's probably not built
01:41:38
◼
►
into average malware, so that makes sense.
01:41:42
◼
►
Basically though, like I said before,
01:41:43
◼
►
it really does come down to,
01:41:44
◼
►
do you trust the software running on your device?
01:41:47
◼
►
And if you don't, I mean, it's like game over.
01:41:50
◼
►
And you know, like, well, I don't know why anybody
01:41:51
◼
►
would trust their keyboard.
01:41:52
◼
►
If you don't, if you think that your computer
01:41:54
◼
►
is running software that could take control of your webcam
01:41:58
◼
►
and could take control of it in a way
01:41:59
◼
►
that wouldn't even show the indicator light,
01:42:02
◼
►
I don't know why you trust your keyboard.
01:42:03
◼
►
Why do you ever, I mean, I don't know how you could live
01:42:06
◼
►
if you don't trust the software on your device.
01:42:09
◼
►
- Yeah, I mean, keyboards have firmware on them
01:42:12
◼
►
and there's actually been stories,
01:42:13
◼
►
there's been versions of those that have been cracked.
01:42:16
◼
►
But I mean, these are like rare.
01:42:18
◼
►
I mean, this is the whole like thing,
01:42:19
◼
►
like what's the real risk to the average person?
01:42:22
◼
►
Like, I mean, you're kind of a target.
01:42:24
◼
►
I'm kind of a target in different ways because of the--
01:42:26
◼
►
- I'm more of a target than an average person.
01:42:28
◼
►
- Yeah, but not a big target.
01:42:30
◼
►
And so our risk assessments are different
01:42:34
◼
►
than like Jeff Bezos or Zuckerberg.
01:42:38
◼
►
And the nature of the kinds of attacks
01:42:40
◼
►
that you're gonna be subject to
01:42:42
◼
►
are gonna be completely different.
01:42:43
◼
►
The average person just getting malware,
01:42:45
◼
►
if that malware can't like pop that microphone right away
01:42:47
◼
►
or get that webcam right away,
01:42:49
◼
►
they're moving on to the next one.
01:42:51
◼
►
All right, let me take a break here
01:42:52
◼
►
and thank our next sponsor, it's our good friends at Away.
01:42:55
◼
►
Look, Away makes some of the best luggage
01:42:58
◼
►
you're ever gonna see.
01:42:59
◼
►
And look, they considered all types of travelers
01:43:02
◼
►
when they made their carry-on,
01:43:03
◼
►
and that's why they make it in two sizes.
01:43:06
◼
►
They have the carry-on, the regular carry-on,
01:43:08
◼
►
and then the bigger carry-on.
01:43:10
◼
►
And they come with an optional ejectable battery.
01:43:14
◼
►
I love this battery, I love it.
01:43:16
◼
►
You just have this big, super long-powered battery
01:43:21
◼
►
right on top of your carry-on suitcase with two USB ports,
01:43:26
◼
►
one of which does USB PD, so you get the extra power,
01:43:28
◼
►
to get like 10 or 12 watts out of it.
01:43:30
◼
►
You just plug a cable in right in your suitcase,
01:43:34
◼
►
plug it in your phone, while you're at the airport,
01:43:36
◼
►
waiting for your seat.
01:43:37
◼
►
So every single seat for me,
01:43:39
◼
►
I can go anywhere in the airport
01:43:40
◼
►
and wherever I decide to sit, I have a charger near me
01:43:43
◼
►
'cause I've got my carry-on with me
01:43:45
◼
►
'cause I have an Away suitcase.
01:43:46
◼
►
I love it, I love this feature of it.
01:43:49
◼
►
It is true that in recent years,
01:43:51
◼
►
airlines have come out with these regulations
01:43:53
◼
►
on lithium ion batteries.
01:43:55
◼
►
That's why the Away battery pops right out.
01:43:58
◼
►
You just push it in, pop it right out.
01:44:00
◼
►
So if you need to gate check your suitcase
01:44:03
◼
►
and it has to go under and you can't put a battery in,
01:44:05
◼
►
it's like, click it out, keep it with you at your seat,
01:44:09
◼
►
and then you can pop it back in
01:44:10
◼
►
when you get your suitcase back.
01:44:12
◼
►
No problem, no tools required, nothing like that.
01:44:16
◼
►
Couldn't be easier.
01:44:17
◼
►
It's really, I recommend a suitcase for that reason,
01:44:20
◼
►
in and of itself.
01:44:21
◼
►
You don't have to carry a separate battery.
01:44:23
◼
►
They've got two types of material.
01:44:27
◼
►
They've got their lightweight, durable German polycarbonate,
01:44:30
◼
►
and now they have suitcases made with a great aluminum alloy.
01:44:35
◼
►
You can charge your phone.
01:44:36
◼
►
Their battery can power your phone,
01:44:38
◼
►
like an iPhone XS or XS Max, like five times,
01:44:42
◼
►
and it's completely TSA compliant.
01:44:44
◼
►
They have a great interior compression system
01:44:48
◼
►
that lets you pack more.
01:44:49
◼
►
They've got this great, it seems so simple,
01:44:51
◼
►
but my old carry-on didn't have anything like it.
01:44:54
◼
►
It's this great little system, and these two straps,
01:44:56
◼
►
You can put button-down shirts in there,
01:44:58
◼
►
put this strap over it,
01:44:59
◼
►
and it keeps the shirts from getting wrinkled up.
01:45:02
◼
►
It's a great system.
01:45:05
◼
►
My shirts used to come out of the suitcase all wrinkled up.
01:45:07
◼
►
Now they don't, I love it.
01:45:09
◼
►
They've got four 360-degree spinner wheels.
01:45:12
◼
►
The wheels are fantastic.
01:45:13
◼
►
I've had my away suitcase for years now.
01:45:15
◼
►
I don't know how many years they've been sponsoring this show
01:45:16
◼
►
but I've had the same suitcase
01:45:18
◼
►
ever since they started sponsoring the show.
01:45:19
◼
►
It still looks brand new.
01:45:21
◼
►
It looks brand new.
01:45:22
◼
►
The wheels still spin like new.
01:45:25
◼
►
We have terminals at Philadelphia
01:45:27
◼
►
where they go downhill or uphill.
01:45:29
◼
►
The wheels spin so good that it actually,
01:45:32
◼
►
I actually have to make sure I hold onto the suitcase right
01:45:34
◼
►
because it would just go flying away with me
01:45:36
◼
►
as it goes downhill.
01:45:37
◼
►
They spin so great.
01:45:38
◼
►
It even comes with a TSA-approved combination lock.
01:45:41
◼
►
It comes with a removable, washable laundry bag.
01:45:47
◼
►
Can't say enough about how great the Away suitcases are.
01:45:50
◼
►
They have other ones that aren't carry-on size,
01:45:52
◼
►
but the carry-on is the one that I live and die by
01:45:54
◼
►
because I like to travel just with the carry-on.
01:45:56
◼
►
I don't like to check baggage.
01:45:57
◼
►
And they have a special offer for listeners of the show.
01:46:01
◼
►
20 bucks off a suitcase by visiting awaytravel.com.
01:46:06
◼
►
That's awaytravel.com/talkshow.
01:46:09
◼
►
Awaytravel.com/talkshow.
01:46:12
◼
►
And just remember this promo code, talkshow20.
01:46:16
◼
►
Talk show, 'cause this is the talk show, 20,
01:46:19
◼
►
'cause you'll save 20 bucks off a suitcase.
01:46:21
◼
►
go to awaytravel.com/talkshow. Use that URL so they know you came from the show. And remember this
01:46:27
◼
►
code, talkshow20, and you'll save 20 bucks. Great product. I would recommend it even if
01:46:34
◼
►
they weren't a sponsor. You know, I've listened to the show. I've not heard that ad. I'm looking
01:46:40
◼
►
at the website now because I travel a lot. I will say this. I wouldn't put gold bullion in
01:46:47
◼
►
a suitcase if it's got a TSA approved lock. I can't. I mean, it's the best you can do. It's the
01:46:53
◼
►
best you're allowed to do is a TSA approved lock. Let's just face it, a TSA approved lock is not
01:46:58
◼
►
really much of a lock. I can't have Rich Mogull on the show and let it go like
01:47:04
◼
►
that. Well, do you remember that story where somebody took a picture of the TSA guy holding up
01:47:09
◼
►
like all the keys on the key ring? And then everybody scanned them. Okay, let's go 3D print all
01:47:13
◼
►
the TSA keys. Oh my God. You can't make this shit up. That was a good one.
01:47:19
◼
►
Yeah. I can't say that I use the lock. Well, but, you know, pretty basic airport security is
01:47:27
◼
►
to, you know, I don't think you can go more than 50 feet without hearing a recorded voice telling
01:47:31
◼
►
you not to leave your bag unattended. I'm like really particular about how I organize the inside
01:47:38
◼
►
of my bags. Yeah. And I travel just cause it's gotta be the same. You spend enough time on the
01:47:43
◼
►
road. You don't want to think about it when you show up someplace at 2 AM,
01:47:46
◼
►
which is why, sorry, I'll stop browsing their website. Now,
01:47:49
◼
►
go back to the interview.
01:47:51
◼
►
Let me think about what else here, let me look at my list.
01:47:55
◼
►
And what else do we have to talk about? We're still, still have a fair amount.
01:47:58
◼
►
Let me say this while we're talking about, I mean trusting that the,
01:48:03
◼
►
the software on your device, there's been a story.
01:48:08
◼
►
It started with Facebook and then it expanded to Google, but,
01:48:12
◼
►
But Facebook for a while had this VPN
01:48:17
◼
►
that they were quote unquote giving away free.
01:48:19
◼
►
It was called Onavo, I always get it wrong,
01:48:24
◼
►
doesn't matter, they had to get rid of it.
01:48:26
◼
►
But basically they were using this VPN ostensibly
01:48:29
◼
►
to provide you with security with a VPN,
01:48:32
◼
►
but they were using it to snoop on all of the VPN traffic
01:48:35
◼
►
to see what people were doing on their phones
01:48:37
◼
►
and using it to figure out,
01:48:38
◼
►
hey, everybody's using this WhatsApp.
01:48:40
◼
►
Even though they want $20 billion, it's worth it, because look at what we can see people doing.
01:48:46
◼
►
Well, Apple put the kibosh on that.
01:48:48
◼
►
And then immediately afterwards, Facebook started distributing an app and paying people,
01:48:55
◼
►
including kids down to 13 years old, like 20 bucks a month or something to install this,
01:49:01
◼
►
which was effectively giving Facebook all of their traffic.
01:49:06
◼
►
And they were just, you think, "Well, that can't go through the app store."
01:49:10
◼
►
Well, it wasn't going through the App Store.
01:49:11
◼
►
They were distributing it as an enterprise beta,
01:49:13
◼
►
which is not what the enterprise beta system was meant for.
01:49:16
◼
►
So TechCrunch uncovered that.
01:49:20
◼
►
Turned out Google was doing something similar
01:49:22
◼
►
and Google got on top of it.
01:49:23
◼
►
And rather than taking Facebook's route of,
01:49:25
◼
►
we didn't think we were doing anything wrong,
01:49:27
◼
►
Google was like, "Ooh, we're sorry."
01:49:31
◼
►
But there was an embarrassing few days for Facebook
01:49:35
◼
►
and I think for Google too,
01:49:36
◼
►
where Apple had revoked their enterprise certificates,
01:49:39
◼
►
which meant that all of their beta software
01:49:42
◼
►
inside the company was inoperative.
01:49:45
◼
►
So even like beta versions of Facebook
01:49:47
◼
►
and I guess Instagram and all sorts of custom apps.
01:49:50
◼
►
- It wasn't just beta,
01:49:51
◼
►
it was their internal app certificate.
01:49:53
◼
►
So any enterprise apps, beta or not.
01:49:55
◼
►
- Yeah, and like Facebook has an app,
01:49:57
◼
►
I forget what they call it,
01:49:58
◼
►
but it's effectively like a separate version,
01:50:00
◼
►
a shadow version of Facebook just for employees
01:50:03
◼
►
where that's how they communicate with each other.
01:50:05
◼
►
All of those things stopped working
01:50:07
◼
►
'cause they revoked these certificates.
01:50:09
◼
►
But it turns out, and I really had no idea about this, but as this story was unfolding
01:50:16
◼
►
and I was writing about it, people would write to me and say, "This is a bigger story than
01:50:19
◼
►
you think," and they'd link to various things.
01:50:24
◼
►
All sorts of companies are, in the present tense, misusing enterprise certificates in
01:50:30
◼
►
the same way to effectively allow what we've always thought wasn't really allowed on the
01:50:35
◼
►
iPhone, which is sideloading, which is loading native apps, not through the App Store. I mean,
01:50:42
◼
►
there's gambling apps, there's porno apps, there's just all sorts of apps that aren't going through
01:50:46
◼
►
the App Store that you can get, you know, just by installing a beta certificate.
01:50:53
◼
►
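For the curious, the enterprise angle is visible right in an app's provisioning profile. Here is a minimal sketch, assuming you have an unpacked iOS app bundle with an embedded.mobileprovision file in it; the security cms decode step and the ProvisionsAllDevices key are standard parts of provisioning profiles, but treat the bundle layout and path as assumptions:

    import plistlib
    import subprocess
    import sys

    # Path to the profile inside an unpacked app bundle, e.g.
    # Payload/SomeApp.app/embedded.mobileprovision (hypothetical path).
    profile_path = sys.argv[1]

    # `security cms -D -i <file>` strips the CMS signature wrapper and
    # prints the profile's plist XML to stdout.
    decoded = subprocess.run(
        ["security", "cms", "-D", "-i", profile_path],
        capture_output=True, check=True,
    )
    profile = plistlib.loads(decoded.stdout)

    print("Team:", profile.get("TeamName"))
    print("Expires:", profile.get("ExpirationDate"))
    # Enterprise (in-house) profiles set ProvisionsAllDevices, which is
    # what lets an app install on any device, not just registered ones.
    if profile.get("ProvisionsAllDevices"):
        print("Enterprise distribution: installs on any device.")
    else:
        print("Not enterprise (App Store, ad hoc, or development).")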
Well, I mean, I knew sideloading was going on. I mean, I work with enterprises. So I see those,
01:50:58
◼
►
you know, all the time as part of their, you know, legitimate internal apps and stuff that they get
01:51:02
◼
►
loaded. And I knew, I've seen this abused actually in malware. So the one kind of
01:51:08
◼
►
tricky malware thing that does work on iOS, if you can pull it off, is if you get
01:51:12
◼
►
someone to deploy one of those device profiles, then they can have full access
01:51:17
◼
►
and sniff all the content on your device at that point, potentially, depending on
01:51:21
◼
►
how all that stuff's configured. I just had no idea it was being used for, you
01:51:25
◼
►
know, all these other side store things. And so it makes me wonder, like, we
01:51:30
◼
►
use Circle here at home to monitor the kids.
01:51:34
◼
►
And it's an app.
01:51:35
◼
►
It's like the Circle by Disney thing.
01:51:37
◼
►
So I got the little puck.
01:51:38
◼
►
And it actually does some things that
01:51:39
◼
►
are normally bad for security, but are usable to monitor
01:51:42
◼
►
kids' activity.
01:51:43
◼
►
So we can limit the amount of YouTube time and stuff
01:51:45
◼
►
that they've got with the younger kids.
01:51:47
◼
►
And they have a version that will work on devices.
01:51:50
◼
►
And I know you have to use a device certificate,
01:51:52
◼
►
and that's basically what these apps do.
01:51:54
◼
►
So it kind of makes me curious if there are lines
01:51:57
◼
►
and if that's considered legitimate or not.
01:51:59
◼
►
they don't hide that that's what it does. So I'm kind of hoping that's considered legit.
01:52:02
◼
►
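On the Mac, at least, you can see which configuration profiles a machine has installed; here is a small sketch wrapping Apple's profiles tool (the command is real, but its output format and required privileges vary by macOS version, so the handling is deliberately loose; on iOS you have to check Settings by hand):

    import subprocess

    # `profiles list` enumerates installed configuration profiles on
    # modern macOS; older releases used `profiles -P` instead.
    result = subprocess.run(["profiles", "list"],
                            capture_output=True, text=True)
    if result.returncode != 0:
        result = subprocess.run(["profiles", "-P"],
                                capture_output=True, text=True)

    for line in result.stdout.splitlines():
        if line.strip():
            print(line)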
Have you ever seen this site Builds, you know, like building, builds.io? And they have the store. I
01:52:10
◼
►
mean, go check this out. I mean, it's, it's unbelievable. I mean, and it's chock full of I
01:52:16
◼
►
mean, it seems like the main thing that builds.io has are game emulators. Like a Game Boy emulator and
01:52:26
◼
►
a PSP emulator and a Super NES emulator, because all these emulators are not allowed in the App
01:52:32
◼
►
Store for copyright reasons. So you can't put like a Nintendo Game Boy emulator, you
01:52:37
◼
►
know, in the App Store because Apple isn't going to let it fly for copyright reasons,
01:52:42
◼
►
because the only way to play the games is with, you know, using game ROMs that you don't
01:52:47
◼
►
have legal right to. But this builds.io is just chock-a-block full of emulators. And
01:52:55
◼
►
you said, Bitcoin miners and stuff, BitTorrent clients, all sorts of stuff. There's some
01:53:02
◼
►
other, I forget, I just linked to it yesterday and they already got shut down after I linked
01:53:06
◼
►
to it. But it's like a whack-a-mole type thing where they, you know, every couple of weeks,
01:53:09
◼
►
they just come up with this. Somebody told me they just come up with a new name and just
01:53:14
◼
►
go to a new site and have a new certificate. And it's just from some random company in
01:53:18
◼
►
China. But effectively, it's just a way to stream pirated
01:53:22
◼
►
movie and TV content. I mean, I had no idea. I mean,
01:53:26
◼
►
this is a giant thing.
01:53:28
◼
►
No, I'm looking at this now and, yeah, I had no
01:53:34
◼
►
idea. Oh, yeah. All these. Yep. It's like MovieBox.
01:53:39
◼
►
Yeah, I mean, it's just rampant. It's absolutely, like,
01:53:42
◼
►
Facebook is far from the exception. And
01:53:44
◼
►
I mean, I'm not, I can't
01:53:47
◼
►
say I'm sympathetic to Facebook, but it almost makes me think that maybe they had an excuse
01:53:51
◼
►
here, like, "Hey, everybody's doing this." There is a rampant market in sideloaded iPhone,
01:53:57
◼
►
iOS native apps that go through the developer certificate system. My question, as I wrote
01:54:06
◼
►
yesterday, I really don't know the answer. Is this something that Apple was blind to,
01:54:10
◼
►
or are they purposefully turning a blind eye to it and they kind of know it's going on
01:54:13
◼
►
and for whatever reason, they accept it?
01:54:16
◼
►
I mean, there's-- I don't know.
01:54:19
◼
►
It's tough to police that.
01:54:20
◼
►
But Apple does have a little bit of money to hire cops.
01:54:26
◼
►
Yeah, I don't know.
01:54:27
◼
►
Certainly, I know enterprises use this
01:54:30
◼
►
in all sorts of different ways.
01:54:32
◼
►
Typically legitimate, but I've seen ones that are kind of
01:54:35
◼
►
on the line in terms of employee monitoring sorts of things.
01:54:38
◼
►
But at that point, that's usually like work devices.
01:54:40
◼
►
They own it.
01:54:40
◼
►
It's not the same thing as a consumer.
01:54:42
◼
►
And I think that's what those certificates were meant for.
01:54:45
◼
►
These ones, looking at that build IO,
01:54:47
◼
►
I had no idea it was that big.
01:54:49
◼
►
I mean, I knew it was happening.
01:54:50
◼
►
I didn't know it was that big.
01:54:51
◼
►
- It does seem like it would be a big job
01:54:55
◼
►
to police this, but it doesn't seem like
01:54:57
◼
►
it would be too big a job for a company of Apple's resources
01:55:02
◼
►
to police with a reasonably sized force.
01:55:05
◼
►
And I've been saying for a while,
01:55:07
◼
►
I suggested a couple of weeks ago,
01:55:08
◼
►
that I think Apple should have
01:55:11
◼
►
effectively like a bunco squad for the App Store,
01:55:15
◼
►
which would be not entirely, you know,
01:55:18
◼
►
in the way that like,
01:55:20
◼
►
hopefully everything that goes through the App Store
01:55:25
◼
►
gets reviewed by the App Store reviewers
01:55:26
◼
►
and they would catch things like
01:55:28
◼
►
blatant copyright violation,
01:55:30
◼
►
like a game that uses Mario as a character
01:55:33
◼
►
that is not from Nintendo.
01:55:34
◼
►
Well, you would hope that Apple's app reviewers
01:55:37
◼
►
would catch stuff like that.
01:55:40
◼
►
But there are all sorts of other ripoffs.
01:55:42
◼
►
There are a ton, there's just a cottage industry
01:55:46
◼
►
in ripoffs where people find popular apps in category X,
01:55:51
◼
►
let's just say weather apps,
01:55:53
◼
►
and they find a popular weather app,
01:55:54
◼
►
and then they make an app that is like a visual clone
01:55:57
◼
►
of a popular app and give it a name that is super similar,
01:56:01
◼
►
and then hope that when you search for that app,
01:56:04
◼
►
their ripoff version shows up high in the results
01:56:09
◼
►
and they make money. And there's all sorts of goofy stuff. Look at the top grossing charts, you
01:56:13
◼
►
can see, it's not that hard to find a bunch of apps that make you think, like, why is this
01:56:17
◼
►
there? You know, like, why are there antivirus utilities that are doing well in the iOS App
01:56:23
◼
►
Store? Like, what in the world? You know, given the sandbox restrictions of App Store
01:56:28
◼
►
apps, what in the world could it do? You know, whatever you want to say about what antivirus software
01:56:32
◼
►
is useful for on the desktop, and how many of them are actually useful and how many
01:56:38
◼
►
of them aren't, on iOS it's almost ridiculous. Why are they
01:56:43
◼
►
doing so well? And so I sort of feel like outside the
01:56:47
◼
►
App Store review system, Apple should have a team that is
01:56:50
◼
►
specifically looking for fraud in the way that a police bunco
01:56:54
◼
►
squad is looking for things like three-card monte games and
01:56:57
◼
►
pickpockets and stuff like that.
01:56:59
◼
►
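The name-cloning part of that is the easy part to screen for mechanically. Purely as an illustration, and not a claim about how Apple's review actually works, a fuzzy string match over submitted names catches a lot of it (the app names below are just examples for the sketch):

    from difflib import SequenceMatcher

    popular = "Carrot Weather"   # a well-known name, used only as an example
    submissions = ["Carrot Weather", "Carrrot Weather", "CarrotWeather+",
                   "Weather Radar Pro", "Chess Tactics"]

    def similarity(a: str, b: str) -> float:
        """Ratio in [0, 1]; 1.0 means identical after normalization."""
        norm = lambda s: "".join(s.lower().split())
        return SequenceMatcher(None, norm(a), norm(b)).ratio()

    for name in submissions:
        score = similarity(popular, name)
        # Suspiciously close to the popular name, but not the app itself.
        if 0.8 <= score < 1.0:
            print(f"flag for review: {name!r} ({score:.2f})")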
I mean, I think whether or not it's going on now, it's
01:57:02
◼
►
inevitable, because I mean, let's think back to the whole
01:57:05
◼
►
reason iOS is so secure. I mean, you know, remember Apple pre iOS security was kind of
01:57:10
◼
►
an obscurity thing and Apple realized they can't grow the market unless people can trust their
01:57:16
◼
►
devices. And if the App Store gets too overloaded, which I think it is approaching, because I
01:57:23
◼
►
run into the same issues. I mean, I install apps all the time and I'm not infrequently,
01:57:27
◼
►
I end up installing the wrong one because it's Weather Plus Plus or whatever, as opposed to,
01:57:32
◼
►
you know, Weather. I mean, you know how Apple works, once the reputation damage hits a particular
01:57:37
◼
►
point, then they're usually on it. I think it's just flown under the radar. And it seems like
01:57:42
◼
►
this last year, last 12 months, it's been kind of becoming part of the social consciousness a little
01:57:49
◼
►
bit more. I would hope so, you know, and then you get to the other aspect of this, which another
01:57:56
◼
►
recent scandal in the App Store was these screen recording frameworks. And I
01:58:01
◼
►
don't know if screen recording is quite the right word.
01:58:04
◼
►
It's not like they were taking movies
01:58:05
◼
►
But, you know, more or less, these are third-party frameworks that you can install, you put them in your app, and
01:58:11
◼
►
you know, let's say you have an app.
01:58:14
◼
►
Then you install this framework and then it'll give you feedback on all of your users on what buttons they pressed and what path they
01:58:22
◼
►
took through, like, the first run and all of this stuff,
01:58:26
◼
►
which, in theory, is useful, but you would think ought to be restricted to your
01:58:33
◼
►
beta testing. And if you were going to use it in production, it should have some kind
01:58:38
◼
►
of warning and opt-in, opt-out type thing before it starts going. And the story was
01:58:45
◼
►
scandalous enough—there's another one that was broken by TechCrunch, to their credit—that
01:58:49
◼
►
you know, Apple last weekend was going through and systematically looking for apps that have
01:58:56
◼
►
these frameworks in them and then sending developers notices that, you know,
01:59:00
◼
►
you have 24 hours to submit an updated app that complies
01:59:03
◼
►
with the terms of the App Store. And again, this gets to a point I wanted to make with
01:59:08
◼
►
you is that this isn't necessarily a security violation, but it's definitely a privacy violation.
01:59:14
◼
►
Because you have reason to expect that the buttons you click as you
01:59:18
◼
►
you go through an app that you've just downloaded
01:59:20
◼
►
aren't being sent back to the developer.
01:59:22
◼
►
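The opt-in being described is cheap to build; here is a toy sketch, with hypothetical names throughout, of an analytics client that records nothing until the user explicitly agrees:

    from dataclasses import dataclass, field

    @dataclass
    class Analytics:
        """Toy consent-gated event recorder (illustrative only)."""
        consented: bool = False
        _queue: list = field(default_factory=list)

        def request_consent(self, user_said_yes: bool) -> None:
            # A real app would show UI here; the default must be "no".
            self.consented = user_said_yes

        def record(self, event: str, **props) -> None:
            if not self.consented:
                return  # drop silently: no buffering, no fingerprinting
            self._queue.append((event, props))

        def flush(self) -> None:
            for event, props in self._queue:
                print("would send:", event, props)  # stand-in for an upload
            self._queue.clear()

    analytics = Analytics()
    analytics.record("first_run", button="next")   # dropped: no consent yet
    analytics.request_consent(user_said_yes=True)
    analytics.record("first_run", button="next")   # now recorded
    analytics.flush()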
- Yeah, both of these issues together
01:59:27
◼
►
are really interesting because they're abuse of the system.
01:59:32
◼
►
So you talk about trusting the software.
01:59:34
◼
►
I actually come from a different perspective.
01:59:36
◼
►
I don't trust all the software I run on my systems,
01:59:38
◼
►
which is why I love iOS, because it compartmentalizes it.
01:59:42
◼
►
So I don't trust Facebook worth a shit.
01:59:45
◼
►
And I have to use it for various family-related things.
01:59:48
◼
►
And I'm in the 501st Legion.
01:59:50
◼
►
That's all Facebook-- or not all Facebook, but a lot of it.
01:59:53
◼
►
The communications is there, so I need to be on that.
01:59:56
◼
►
But I only run it on iOS and with all the restrictions
01:59:59
◼
►
around it, because I trust that Apple has given me
02:00:01
◼
►
a secure compartmentalized platform.
02:00:04
◼
►
So even if I want to use something untrusted,
02:00:06
◼
►
it's in its own container.
02:00:08
◼
►
And the side loading of the apps is one way
02:00:10
◼
►
that can just blow past that.
02:00:12
◼
►
And the screen recording frameworks,
02:00:14
◼
►
it's not as bad because it's still within that app.
02:00:17
◼
►
But if you're not communicating that to the people using it,
02:00:20
◼
►
and in particular, like some apps are fine,
02:00:23
◼
►
but if it's an app that has like private info
02:00:27
◼
►
that you think, I don't know, I don't like that,
02:00:29
◼
►
I just don't like slimy vendor stuff.
02:00:33
◼
►
If they disclosed it to the users, totally different.
02:00:36
◼
►
I mean, then, you know, everybody does clickstream tracking now
02:00:38
◼
►
on their apps, like they're keeping track of everything
02:00:40
◼
►
you click on a web app as it is,
02:00:42
◼
►
but it's different than like full on screen recording
02:00:44
◼
►
in my book, but maybe not.
02:00:47
◼
►
- I don't mean to be holier than thou.
02:00:49
◼
►
I don't, but I had an app in the app store recently,
02:00:54
◼
►
Vesper, the text editor, a little note-taking app.
02:00:57
◼
►
It never in a million years would have occurred to us to do that kind of tracking.
02:01:00
◼
►
And I realize our app is relatively simple
02:01:04
◼
►
and we didn't have big venture capital things to satisfy
02:01:09
◼
►
and we ended up not being successful,
02:01:12
◼
►
so maybe we should have, I don't know.
02:01:14
◼
►
The reason why we failed, though, wasn't because we didn't track what users were tapping and what they were typing.
02:01:21
◼
►
And I don't know, I wouldn't want that information. You know what I mean? Like, I feel like that's
02:01:27
◼
►
something that the industry, I hope, is sort of coming to grips with, is that for a long time now
02:01:34
◼
►
it has been, if you can collect it, collect it, because of course we want this information, we can do something with it.
02:01:40
◼
►
Whereas I feel like people
02:01:43
◼
►
companies, are starting to come to grips with the idea of, we don't want it. We don't want this information on our hands.
02:01:50
◼
►
We don't want it to be there. We don't want to have to protect it
02:01:54
◼
►
Like if we don't have it, we can't be hacked and have it stolen. I
02:01:57
◼
►
wish more people thought like that. Actually, I don't, because it would dramatically affect my income.
02:02:02
◼
►
Right, but everything you collect as a company about your users
02:02:09
◼
►
is a liability in a sense.
02:02:11
◼
►
And maybe you'll come to the conclusion
02:02:13
◼
►
that the benefits of that data outweigh the liability,
02:02:17
◼
►
but you can't overlook that it is a liability
02:02:20
◼
►
because now you've got this thing to protect.
02:02:23
◼
►
Whereas if you don't collect it at all,
02:02:25
◼
►
you've got nothing to lose.
02:02:26
◼
►
- Well, like we have trackers in our product.
02:02:28
◼
►
It's in beta right now, but they are for errors.
02:02:32
◼
►
They are, we track which of the features
02:02:35
◼
►
people are using or not using. It's not a screen-capture kind of thing. If it was a consumer-based app,
02:02:44
◼
►
there's just a level of stuff that I wouldn't want. To be honest, I think a lot of this data
02:02:50
◼
►
correlation and big data science and AI stuff that we hear about, I think most of it is crap.
02:02:54
◼
►
I know Amazon sends everything to Facebook because I hurt my knee. I bought this thing,
02:03:01
◼
►
fitness device to do rehab on it. And I'm in Facebook and I see ads for the thing I just
02:03:06
◼
►
bought on Amazon. You know, I paid for it. I don't need two of them. And it lasts more than
02:03:12
◼
►
like the two months since I've had it. And you know, it's not like one of them is some
02:03:18
◼
►
rinky-dink company. These are two of the big five, right? You bought it on Amazon and the ads are
02:03:23
◼
►
showing up on Facebook. I have not seen a single ad that would actually drive me to click on it
02:03:28
◼
►
and buy something. More often than not, it's stuff I've already bought.
02:03:31
◼
►
Yeah. I recently I don't know, I think I mentioned this a few times recently, but for years,
02:03:38
◼
►
up until just like, maybe like two months ago, I didn't have any ads on Instagram. And I don't know
02:03:44
◼
►
why. I don't know if it's because I literally signed up on the first day
02:03:49
◼
►
that Instagram was public. I wasn't on the beta. But I knew people on the beta. And I thought that
02:03:55
◼
►
That looks like a neat thing.
02:03:56
◼
►
I'm gonna keep my eye open.
02:03:57
◼
►
And then the day that it launched,
02:03:59
◼
►
I signed up 'cause I wanted @Gruber.
02:04:02
◼
►
I don't know if it's because I signed up early.
02:04:05
◼
►
I don't know if it's because I'm slightly internet famous
02:04:08
◼
►
and I got whitelisted for a while.
02:04:10
◼
►
I don't know, but I went years without getting ads.
02:04:12
◼
►
Now I get Instagram ads and I tapped one one time
02:04:16
◼
►
and then every single ad I saw after that
02:04:19
◼
►
was for the exact same thing.
02:04:22
◼
►
And that's not tracking across.
02:04:24
◼
►
It's just within the app, but it is--
02:04:27
◼
►
- Was it Cars for Kids?
02:04:29
◼
►
It was Cars for Kids.
02:04:30
◼
►
- But it's crazy.
02:04:31
◼
►
It's absolutely crazy.
02:04:33
◼
►
But the stuff that happens cross company like that
02:04:37
◼
►
is really maddening.
02:04:39
◼
►
And I don't know how they don't think it's creepy.
02:04:42
◼
►
Like in the real world, if you went into Nordstrom
02:04:47
◼
►
and you were looking at black boots and you thought,
02:04:52
◼
►
- Well, you know what?
02:04:53
◼
►
Maybe I want my wife to give her opinion on this.
02:04:55
◼
►
I like these, but I'm not gonna buy them right now.
02:04:57
◼
►
Okay, thanks.
02:04:58
◼
►
Thanks for letting me try them on.
02:04:59
◼
►
And then you leave and then you go down the street.
02:05:02
◼
►
If somebody, like, in the next store down is like,
02:05:04
◼
►
hey, hey, Rich, you want some black boots?
02:05:06
◼
►
Wouldn't you be freaked out?
02:05:07
◼
►
You'd be freaked out.
02:05:09
◼
►
That would be so weird.
02:05:10
◼
►
- Just yesterday, I had a conversation with somebody
02:05:15
◼
►
and they work at one of the major brands
02:05:17
◼
►
on the security team there.
02:05:18
◼
►
And this is work related stuff.
02:05:21
◼
►
and their big data analytics came up,
02:05:22
◼
►
like the stuff we're talking about right now.
02:05:24
◼
►
And I can't give you any context,
02:05:26
◼
►
like this is like literally household name kind of a thing.
02:05:29
◼
►
And the guy was laughing, he's like,
02:05:31
◼
►
"Yeah, we've not seen a single bump in sales
02:05:35
◼
►
because of this, what I'm guessing
02:05:37
◼
►
is a multimillion dollar program."
02:05:38
◼
►
But it just creeps the shit out of me.
02:05:42
◼
►
- It goes back to the sympathy I have for people
02:05:45
◼
►
who do run webcam blockers, right?
02:05:48
◼
►
because you know something's creepy going on
02:05:51
◼
►
in your computing life where you buy a knee brace at Amazon
02:05:56
◼
►
and all of a sudden Facebook is showing you
02:05:58
◼
►
therapeutic knee braces.
02:06:01
◼
►
You know something creepy's going on.
02:06:03
◼
►
So where do you, you know,
02:06:04
◼
►
you don't understand what's going on.
02:06:06
◼
►
I don't even understand exactly how that connection is made.
02:06:09
◼
►
Why not just play it safe and cover up your webcam?
02:06:11
◼
►
I get it, you know?
02:06:12
◼
►
So I stand behind my rebuttal to Joanna's piece,
02:06:16
◼
►
but I still, I have deep sympathy for the people
02:06:18
◼
►
who just see what's going on in front of their eyes. With stuff
02:06:24
◼
►
like that, I can see why they're like, screw this, I'm covering up. Well, and related, the whole thing,
02:06:29
◼
►
like, was Facebook listening in on conversations and using that to drop ads in, which we don't think
02:06:34
◼
►
they were, right. But it's not implausible. Like, I fully believe if Facebook thought they
02:06:38
◼
►
could get away with it, they would do that. Right. And you know, like I've pointed out several times,
02:06:42
◼
►
the microphone doesn't have an indicator light. So I kind of hope, I mean, you look at Europe,
02:06:49
◼
►
like the GDPR stuff is finally starting to hit over there and actually have an effect,
02:06:53
◼
►
the privacy regulations they have. I don't think we're going to have anything here,
02:06:57
◼
►
but I do have a feeling, I don't know, maybe the optimist in me that, you know, there's kind of
02:07:02
◼
►
the potential for generational shift. I'm okay giving away some of my privacy. Like I go to
02:07:07
◼
►
Disney World, I get the magic band, they track me and my kids and everything we do. And in that
02:07:11
◼
►
context, that isolated context for whatever reason, I've made the informed decision. They're
02:07:16
◼
►
not hiding what they're doing there. Within my computing, I use all Apple everything because
02:07:22
◼
►
when they went all in on privacy, which was only what, like within the last five years that they
02:07:28
◼
►
started really, really putting that into place, knowing that I had a nice safe place to go where
02:07:32
◼
►
my privacy was respected, I'll pay extra money for that. And that's my choice. But the average
02:07:37
◼
►
person doesn't, you know, it's just a market thing, and they don't necessarily understand.
02:07:43
◼
►
Although I think the survey data is starting to show people kind of understand more and
02:07:46
◼
►
they're not okay with it.
02:07:47
◼
►
Yeah, I would say Apple's a big, slow ship, but they started steering in the direction
02:07:55
◼
►
of privacy more, I would say, about five years ago is when they went public with it as something
02:07:59
◼
►
that they touted. But like one of my favorite stories I got from somebody at Apple
02:08:04
◼
►
was on the creation of iMessage, when they had the idea for iMessage and they had
02:08:11
◼
►
the idea that we could do this end-to-end encrypted thing or we could just do our own
02:08:18
◼
►
messaging service and sort of usurp SMS when you're communicating iPhone to iPhone. They
02:08:28
◼
►
clearly had it in mind, Apple device to Apple device.
02:08:32
◼
►
One of the dictums that came down from the very top, I guess, yeah, because when iMessage
02:08:37
◼
►
came out, Steve Jobs was still there. But from Steve Jobs down, it came down: whatever
02:08:43
◼
►
we do, engineer this such that we don't have the messages. We don't want them. We don't
02:08:50
◼
►
have them in any form that's readable. It was never an afterthought. When it was just a notion
02:08:59
◼
►
with no code or diagrams, one of the top-level bullet points of iMessage
02:09:04
◼
►
from the beginning was let's engineer this from the ground up where we never have the
02:09:08
◼
►
plain text of these messages ever. And for the reason of we don't want them, we simply
02:09:15
◼
►
don't want them. There's nothing good can come of us having them. So let's engineer it that way.
02:09:20
◼
►
And if you look back at their hiring history for people
02:09:25
◼
►
on some of the security and privacy team,
02:09:27
◼
►
you can kind of start seeing when people
02:09:29
◼
►
who were really well kind of respected
02:09:32
◼
►
and cared about that stuff started going in as well.
02:09:35
◼
►
When was iMessage?
02:09:37
◼
►
When did that first get, was it with iPhone one?
02:09:39
◼
►
No, it was later. - No, no, it was later.
02:09:41
◼
►
I'm gonna say like 2009 was when it was announced,
02:09:44
◼
►
famously with, oh, maybe I'm conflating it with FaceTime.
02:09:48
◼
►
I don't know.
02:09:49
◼
►
I'm gonna guess 2009, doesn't matter.
02:09:53
◼
►
- Yeah, I mean, it takes a long time,
02:09:54
◼
►
but it's so embedded into their culture now.
02:09:57
◼
►
- If it had happened in the '80s,
02:09:58
◼
►
I could tell you exactly what year it was.
02:10:02
◼
►
But yeah, but that's been, you know,
02:10:05
◼
►
but again, that sort of thing that I've been talking about
02:10:07
◼
►
where they've been conscious for a while
02:10:11
◼
►
of let's not collect anything we don't wanna collect
02:10:15
◼
►
because bad things can happen.
02:10:17
◼
►
- You know, having covered data security,
02:10:20
◼
►
like that was, when I used to work at Gartner,
02:10:22
◼
►
like over a decade ago, that was the part of it
02:10:24
◼
►
that I covered and one of the top recommendations is
02:10:27
◼
►
like don't have data that's gonna create a liability.
02:10:30
◼
►
- All right, here we go, I got the Wikipedia history.
02:10:32
◼
►
iMessage was announced by Scott Forstall
02:10:34
◼
►
at the WWDC 2011 keynote, June 6th, 2011.
02:10:38
◼
►
- Yeah. - So there we go, 2011.
02:10:40
◼
►
Yeah, exactly, what were you, I'm sorry, I interrupted you.
02:10:46
◼
►
- No, no, I was saying like way back
02:10:47
◼
►
when we were advising companies, don't keep data
02:10:50
◼
►
if you don't want that liability,
02:10:51
◼
►
but in particular it's on the marketing side
02:10:54
◼
►
or anybody in the advertising base
02:10:56
◼
►
has just slurped up that stuff forever.
02:10:59
◼
►
And sometimes there's interesting things
02:11:01
◼
►
that kind of bite you in the butt.
02:11:02
◼
►
There was, God, was it Dropbox, I think,
02:11:06
◼
►
had to do a disclosure related to GDPR.
02:11:09
◼
►
Like they weren't tracking the data,
02:11:12
◼
►
but what happened was there was one part of their system
02:11:17
◼
►
that was saving a log file down
02:11:18
◼
►
that people didn't even know was being saved down
02:11:20
◼
►
to that system. And I could be totally wrong if it was Dropbox or Twitter. I mean, it was just one
02:11:27
◼
►
of the big kind of consumer names and they fully disclosed it, closed it down. And a bunch of
02:11:31
◼
►
people started screaming their heads off and it was like, no, this was a legitimate technical
02:11:35
◼
►
error. Like somebody didn't turn off this one log thing. And it was just sitting there in a secure
02:11:39
◼
►
area to begin with. All right, let me take a break here and thank our third and final sponsor. It's
02:11:45
◼
►
our good friends at Squarespace. Look, if you need a new website or if you have an old
02:11:50
◼
►
website that needs to be redone, you should do it at Squarespace. Squarespace has everything
02:11:57
◼
►
you need to host, build, design, update, keep updated a website. Everything from domain
02:12:04
◼
►
name registration to picking from a slew of professionally designed templates, all of
02:12:09
◼
►
which work responsively on everything from phones to big screens on the desktop.
02:12:18
◼
►
Any feature you could want on a website, a store with a catalog and SKUs to do commerce
02:12:24
◼
►
right there, you could do it in Squarespace. A portfolio, like if you're a designer and
02:12:29
◼
►
you want to have a portfolio of your work, a personal portfolio, you can do it right
02:12:33
◼
►
there in Squarespace. Blog, podcast, host it right on Squarespace. Update it. How do
02:12:40
◼
►
you update your blog? Go into Squarespace. That's how you post. Everything is there.
02:12:44
◼
►
It is a terrific service. You can do it all visually. You drag and drop. There's no question
02:12:51
◼
►
when you're making changes to your Squarespace website, what it's going to look like, because
02:12:55
◼
►
what you do is right there in the browser window. And when you drag something from the
02:12:59
◼
►
the left side to the right side. You see it right there. It's what you see as you're designing
02:13:03
◼
►
is exactly what people see when they're visiting your site. It's a great service. They've been
02:13:10
◼
►
sponsoring this show for a long time and they keep sponsoring it because people who listen
02:13:15
◼
►
to the show keep signing up to make new websites at Squarespace. They have great customer support.
02:13:23
◼
►
They have great analytics. I always mention this because I just love it. Their analytics
02:13:28
◼
►
It's such a great interface, such great data presentation.
02:13:32
◼
►
So once you have a Squarespace website,
02:13:34
◼
►
you wanna see where are people coming from,
02:13:36
◼
►
how many people are visiting, where are they coming from,
02:13:38
◼
►
you can check it all out in their built-in analytics.
02:13:42
◼
►
Really, really nice.
02:13:43
◼
►
You can start building your website today
02:13:46
◼
►
at squarespace.com.
02:13:47
◼
►
They have a free trial, so you don't have to pay anything.
02:13:51
◼
►
See if it works for you before you pay,
02:13:53
◼
►
but then when you do pay, remember this code, talk show.
02:13:55
◼
►
Just T-A-L-K-S-H-O-W.
02:13:58
◼
►
Use that at checkout and you get 10% off.
02:14:00
◼
►
And you can pay for up to a year in advance.
02:14:03
◼
►
You can save 10% on the whole year.
02:14:05
◼
►
It's a really great service.
02:14:07
◼
►
They keep sponsoring the show.
02:14:09
◼
►
I can't thank them enough.
02:14:10
◼
►
Go to squarespace.com/talkshow and remember that code.
02:14:13
◼
►
Same thing as the URL slug, talk show,
02:14:16
◼
►
and you'll get 10% off your first purchase.
02:14:18
◼
►
That's squarespace.com/talkshow.
02:14:22
◼
►
All right, we're getting towards the end right here, Rich.
02:14:27
◼
►
There's this keychain stealer story,
02:14:29
◼
►
which I can't let go by without talking about.
02:14:33
◼
►
What the hell happened with this?
02:14:35
◼
►
This seems like a nightmare.
02:14:38
◼
►
- Yeah, again, so this is full on,
02:14:42
◼
►
like an overlap of, I don't know,
02:14:44
◼
►
I'm not gonna say stunt hacking in this one.
02:14:47
◼
►
So a young researcher released a video,
02:14:50
◼
►
had a way to get into the keychain
02:14:52
◼
►
and steal all the passwords on an entire system,
02:14:54
◼
►
so not just the current user.
02:14:56
◼
►
and built a tool to go ahead and do that.
02:14:59
◼
►
The functionality of the tool,
02:15:00
◼
►
like Patrick Wardle we were talking about earlier,
02:15:02
◼
►
the kid sent him the code,
02:15:04
◼
►
he validated that the whole thing worked,
02:15:05
◼
►
put a video out there and said he wasn't going
02:15:07
◼
►
to release any more information
02:15:08
◼
►
because Apple doesn't have a bug bounty program.
02:15:10
◼
►
And this is like- - For the Mac.
02:15:11
◼
►
They don't have a bug bounty program for Mac.
02:15:13
◼
►
- Yeah, for the Mac.
02:15:14
◼
►
And even for iOS, it's limited.
02:15:15
◼
►
- Right, you have to- - And so this is like, yeah.
02:15:18
◼
►
- It's like invite only.
02:15:19
◼
►
- Invite only, and so this is like a full-on firestorm
02:15:23
◼
►
of like disclosure and was it real, was it not real?
02:15:28
◼
►
So I sent you a link, Dan Goodin did an article
02:15:31
◼
►
over at Ars Technica, which is really good that,
02:15:33
◼
►
well, yeah, it was just, you had to have some pretty deep
02:15:35
◼
►
access to the system to make this work anyway.
02:15:38
◼
►
So you would, it is the kind of thing you would put
02:15:40
◼
►
into malware if you were writing malware,
02:15:42
◼
►
but you gotta get that malware to run in the first place
02:15:44
◼
►
to pull those out.
02:15:45
◼
►
And that doesn't work under certain conditions.
02:15:47
◼
►
It only had access to certain, not the iCloud keychain,
02:15:50
◼
►
but the local keychain.
02:15:52
◼
►
kind of a thing.
02:15:53
◼
►
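For context on why that's alarming: keychain items are normally guarded by per-item access control, and even Apple's command-line security tool goes through that access control, which typically means a macOS prompt for a caller that isn't already authorized. A quick sketch of the normal behavior (the commands are real; "ExampleService" is a made-up item name):

    import subprocess

    # List the keychain files in the current search path (reads no secrets).
    subprocess.run(["security", "list-keychains"])

    # Asking for an actual password goes through the item's access
    # control list; an unauthorized caller normally triggers a prompt.
    result = subprocess.run(
        ["security", "find-generic-password", "-s", "ExampleService", "-w"],
        capture_output=True, text=True,
    )
    print(result.stdout or result.stderr)

The exploit being discussed was notable precisely because it reportedly got plaintext out of the login keychain without any such prompt.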
So the issues for me that I thought were fascinating.
02:15:58
◼
►
One is, I get it.
02:15:59
◼
►
It's your vulnerability.
02:16:01
◼
►
It's not really a vulnerability.
02:16:02
◼
►
It's your exploit.
02:16:04
◼
►
And you discovered it.
02:16:05
◼
►
And as a security researcher, it is your right
02:16:08
◼
►
to do with it what you want.
02:16:10
◼
►
But what's the public good here?
02:16:13
◼
►
Releasing a video and saying this is possible
02:16:15
◼
►
and then saying you're not going to disclose stuff
02:16:18
◼
►
unless you get paid.
02:16:19
◼
►
You know what?
02:16:21
◼
►
Go sell it to Saudi Arabia or something,
02:16:24
◼
►
or the US government, preferably, or whatever.
02:16:26
◼
►
If you want to make money off of it,
02:16:28
◼
►
if Apple doesn't have a bounty.
02:16:29
◼
►
Or release the information responsibly to Apple,
02:16:33
◼
►
like notify Apple, let them clean up.
02:16:35
◼
►
Apple used to be kind of dicks about that,
02:16:37
◼
►
now they're pretty good,
02:16:38
◼
►
or much better than they used to be.
02:16:40
◼
►
I've heard much more positive things about,
02:16:43
◼
►
really, really positive things in many cases
02:16:45
◼
►
about researcher relationships.
02:16:47
◼
►
But releasing the video and creating the hype
02:16:50
◼
►
and doing it that way, I don't know,
02:16:52
◼
►
it's just like not, whatever.
02:16:54
◼
►
- In his defense, I mean it's not really a defense,
02:16:57
◼
►
but you know, he is 18, so maybe it's the folly of youth.
02:17:02
◼
►
- Yeah, it's true, and some of that's a culture
02:17:06
◼
►
on the security side, particularly the research exploit side
02:17:11
◼
►
where you have to build up your reputation
02:17:13
◼
►
by kind of releasing stuff in public,
02:17:15
◼
►
and then you get the better, to be blunt,
02:17:17
◼
►
the better job opportunities and stuff in some cases.
02:17:19
◼
►
It's a weird, it's like actors, kind of a meritocracy.
02:17:23
◼
►
It's a meritocracy, but it's not at the same time.
02:17:25
◼
►
- It puts Apple in a weird spot though,
02:17:27
◼
►
because it's like, you know, it certainly,
02:17:29
◼
►
I'm sure it caught the attention of Apple's security team
02:17:31
◼
►
and the keychain team, and they would like to have some,
02:17:35
◼
►
they'd like to see the code and fix it.
02:17:38
◼
►
I don't know that they have enough to go on to fix it.
02:17:41
◼
►
So here's this thing that is,
02:17:44
◼
►
and Patrick Wardle has a good reputation,
02:17:47
◼
►
and was a former NSA consultant or whatever he did.
02:17:52
◼
►
He used to work with the NSA
02:17:53
◼
►
and certainly is highly reputable and vouched for it.
02:17:57
◼
►
Had access to the code and said yes, does what he says.
02:18:00
◼
►
So it's out there, but Apple has no way to fix it
02:18:05
◼
►
other than independently finding the bug that he was using.
02:18:10
◼
►
And from what he showed, it seems very difficult.
02:18:15
◼
►
The proof-of-concept video doesn't really show you the how at all. It just shows you that it works.
02:18:19
◼
►
Yeah. And I mean, Apple can likely, you know, they've got smart people, there's enough details
02:18:26
◼
►
in the blog post. And if you look at the updated post, he's good. You know,
02:18:30
◼
►
he even says, quote, this is not a security bug in OS X. Everything works as designed.
02:18:34
◼
►
This is a post-exploitation technique. And I don't understand what that means. What does that mean?
02:18:41
◼
►
- Yeah, so for compromising a system like this,
02:18:44
◼
►
there's a couple of things you need to do.
02:18:46
◼
►
First thing is you need to find that open door.
02:18:48
◼
►
So that's the vulnerability.
02:18:49
◼
►
So somebody has a crappy lock on their door,
02:18:51
◼
►
the window lock doesn't work.
02:18:53
◼
►
And that's the thing that you can use
02:18:55
◼
►
to get into the system.
02:18:57
◼
►
Sometimes it's tricking the user to install something
02:19:00
◼
►
with privileges.
02:19:01
◼
►
I mean, that's the way all that targeted phishing stuff works.
02:19:04
◼
►
Sometimes it's what we call drive by.
02:19:07
◼
►
If you get like a drive by browser exploit,
02:19:09
◼
►
it means you hit a website and it exploits you.
02:19:12
◼
►
It's harder today, we used to have a lot of those.
02:19:14
◼
►
Or there's some other kind of virus
02:19:17
◼
►
or something that could be transmitted through email,
02:19:21
◼
►
There's a variety of techniques.
02:19:22
◼
►
So you got to get your toes, your fingers,
02:19:25
◼
►
into that system with that initial exploit.
02:19:27
◼
►
Once you get that exploit, or you
02:19:29
◼
►
have to have the vulnerability, then
02:19:31
◼
►
you have to be able to exploit that.
02:19:33
◼
►
And so if you think about System Integrity Protection and ASLR
02:19:37
◼
►
and a variety of--
02:19:39
◼
►
and kernel ASLR, all those things
02:19:41
◼
►
that you see announced at WWDC or in those security papers,
02:19:45
◼
►
that's all to reduce whether that vulnerability is exploitable.
02:19:50
◼
►
So on iOS, all the compartmentalization
02:19:52
◼
►
is, even if there's a malicious app,
02:19:54
◼
►
it doesn't have access to much.
02:19:55
◼
►
It's kind of stuck in its own little sandbox.
02:19:57
◼
►
So it shouldn't be able to affect other apps.
02:19:59
◼
►
So you've got the vulnerability.
02:20:01
◼
►
Then you have to be able to exploit the vulnerability.
02:20:03
◼
►
And then you have post-exploitation.
02:20:06
◼
►
And that depends on what you're doing. So one of the things you likely want to do is escalate your
02:20:10
◼
►
privileges to get up to like administrator or root level. And then there's all sorts of other stuff
02:20:15
◼
►
you do. That's where you install the webcam sniffer and microphone sniffer and keyboard
02:20:21
◼
►
sniffers or Bitcoin miners. That's all post-exploitation. Have the vulnerability, exploit it,
02:20:27
◼
►
get your foothold, then do the bad stuff. And this is part of the bad stuff. So this is post-
02:20:32
◼
►
exploitation, which means you already have to have compromised that system for this thing to work.
02:20:37
◼
►
And it does some things that are bad, but it's not an actual vulnerability on the system. And
02:20:43
◼
►
Apple does have things to limit post-exploitation in various areas of the system. With the T2 chip, you can't
02:20:53
◼
►
exploit and nail the boot ROM. So, for example, one of the things, think about jailbreaks and
02:21:01
◼
►
and tethered versus untethered jailbreaks. So a lot of your listeners probably know the tethered
02:21:06
◼
►
jailbreak, like you got to redo it every time you reboot your phone, because the boot of the system,
02:21:11
◼
►
the secure boot process kicks out whatever you did. It's like running in memory, the moment you
02:21:16
◼
►
power down and power back up, it reboots, it's gone. The T2 chip, for example, on the newer,
02:21:21
◼
►
I mean, admittedly, not a lot of systems have it, brings those same protections to the Mac.
02:21:25
◼
►
And there was earlier stuff to help reduce the persistence of what you could do.
02:21:31
◼
►
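Two of those mitigations are easy to check from the command line on your own Mac; a small sketch using Apple's real csrutil and spctl tools:

    import subprocess

    def check(label: str, cmd: list) -> None:
        out = subprocess.run(cmd, capture_output=True, text=True)
        print(f"{label}: {(out.stdout or out.stderr).strip()}")

    # System Integrity Protection: limits what even root can modify.
    check("SIP", ["csrutil", "status"])

    # Gatekeeper: code-signing policy for downloaded software.
    check("Gatekeeper", ["spctl", "--status"])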
So that's kind of that whole chain of stuff. Yeah. Well, and it won't be that long before
02:21:35
◼
►
I mean, everything like that has to start slow, like the T2, you know, and, you know,
02:21:41
◼
►
five, six years from now isn't that long. And then all of a sudden the vast majority of Macs
02:21:46
◼
►
in active use will have a T2, you know, it's just how it works. You got to get started.
02:21:51
◼
►
It's one of the cool things about Apple, because again, they control both that hardware
02:21:55
◼
►
and software. There are other cryptographic, or sorry, security coprocessors on the market.
02:22:02
◼
►
Like you go to AMD and you read through their stuff, or, I'm sorry, ARM, and the ARM specs have
02:22:07
◼
►
that in there. AMD has chips. And in every case it's this secure, you know, it's a system on chip
02:22:12
◼
►
thing, uh, that's designed specifically and they all have similar functions. Apple's more mature,
02:22:18
◼
►
but because they have both the hardware and the software, they can do cool things like embed
02:22:23
◼
►
certificates for software updates. So you can like completely wipe your Mac and start over
02:22:27
◼
►
using a secure, you know, kind of chain, because Apple's actual company certificate is embedded.
02:22:32
◼
►
Yeah. Last thing I wanted to talk about,
02:22:36
◼
►
and it's not really, I mean, hopefully it fits for good. But there's that
02:22:40
◼
►
group FaceTime bug a couple weeks ago. And that was the other thing that people brought up in
02:22:46
◼
►
response to my, hey, I don't think you need, I really don't think you need a webcam cover.
02:22:51
◼
►
And they were like, well, what about the group FaceTime bug? That just shows, you know,
02:22:55
◼
►
if a bug like that can slip by, I don't trust anything.
02:22:58
◼
►
And I have to say sort of bad timing on Apple's part, like it really was a bad bug.
02:23:07
◼
►
But I can see how it happened. You know, it was just like a weird, it's like a weird bug in the
02:23:14
◼
►
flow of answering a phone call, you know, a group FaceTime call, where it was like accepting the
02:23:20
◼
►
group FaceTime call before you actually accepted it and left it connected.
02:23:25
◼
►
Yeah, I mean…
02:23:28
◼
►
But you can see how that spooks people, right?
02:23:30
◼
►
Yeah. I mean, it's a full legit spook. Like when I read it, I'm like, "Oh crap,
02:23:37
◼
►
I think I was in an airport, because I travel a lot. Do I need to turn my FaceTime off? All right. Well, if
02:23:43
◼
►
somebody calls me on FaceTime other than my wife, then I'll turn FaceTime off."
02:23:46
◼
►
And within an hour it was blocked anyway.
02:23:48
◼
►
I mean, these bugs, even if you're really good at security, they're going to happen sometimes.
02:23:53
◼
►
Apple does miss stuff like everybody. They're much better than they were 10 years ago,
02:24:00
◼
►
15 years ago. And some of these bugs, people get all crazy about some of the lock screen bugs that
02:24:08
◼
►
come up and the lock screen bypasses. Yeah, it's exactly the same sort of thing.
02:24:13
◼
►
Yeah. You can fuzz against some of these situations, and for others, it's actually hard to
02:24:18
◼
►
build a test harness to find all of them. So I don't know the details about how this one slipped
02:24:22
◼
►
through the cracks. But this to me, this was a good security story. Like it was bad that it
02:24:27
◼
►
happened. The reporting and the disclosure stuff was messed up. The kid gets a bug bounty,
02:24:33
◼
►
by the way. So that's cool. And we know Apple is going to fix that. I mean, they publicly
02:24:41
◼
►
apologized for that. And they blocked at the servers like within an hour. Like I had people
02:24:45
◼
►
still on Twitter, like three hours later, saying turn off FaceTime, turn it off. No. And then I'm like,
02:24:50
◼
►
Why would you turn off FaceTime? And they're like, Well, just in case. Okay. I mean,
02:24:55
◼
►
have fun with that, guys. Yeah, I don't know. I wonder how many people have turned off FaceTime
02:25:02
◼
►
and left it off, in light of this. It was never more than a group FaceTime bug. And they did,
02:25:10
◼
►
like you said, I mean, ideally, the family's report would have been somehow escalated and
02:25:17
◼
►
been seen by the right people. And they would have taken action sooner. But it does seem like they
02:25:23
◼
►
were able to disable it at the server side as quickly as you could expect them to once it did
02:25:29
◼
►
escalate. Yeah, and individual FaceTime still worked, which was great. Right. So I mean,
02:25:36
◼
►
there was the escalation, of course, with all of these things. And this ties into the previous
02:25:40
◼
►
story. The whole bug bounty thing comes up, and I will say, I never thought Apple had to do a
02:25:44
◼
►
bug bounty. I think they can be valuable or not valuable; it depends on how you do it.
02:25:48
◼
►
I do think, though, now that Apple's had it out there: if you're going to do it, really do it,
02:25:54
◼
►
and they've half-assed it a little bit. I mean, Microsoft does a similar thing, where there are
02:25:58
◼
►
very definitive bounties for very high-value Windows exploits. But ignoring the Mac, or whatever,
02:26:07
◼
►
You know, that makes it a little tougher story for them
02:26:11
◼
►
to kind of justify that.
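An aside on the fuzzing point Rich made earlier: below is a minimal sketch of what fuzzing a call-signaling flow can look like. The CallSession model and its events are invented stand-ins, not FaceTime's actual logic; the idea is just to replay random event orderings and check an invariant like "the mic is never live before the callee accepts."

```python
# Hedged sketch of fuzzing a call-signaling state machine. CallSession and
# its events are invented stand-ins, not FaceTime's actual logic; the point
# is that a harness can replay random event orderings and assert an invariant.
import random

EVENTS = ["invite", "add_participant", "accept", "decline", "hang_up"]

class CallSession:
    """Toy model of one callee's side of a group call."""
    def __init__(self):
        self.accepted = False
        self.mic_live = False

    def handle(self, event):
        if event == "accept":
            self.accepted = True
            self.mic_live = True
        elif event == "add_participant" and not self.accepted:
            # A Group-FaceTime-style bug would wrongly set self.mic_live here.
            pass
        elif event in ("decline", "hang_up"):
            self.accepted = False
            self.mic_live = False

def fuzz(iterations=10_000, seed=0):
    rng = random.Random(seed)
    for _ in range(iterations):
        session = CallSession()
        trace = [rng.choice(EVENTS) for _ in range(rng.randint(1, 8))]
        for event in trace:
            session.handle(event)
            # Invariant: audio must never flow before the call is accepted.
            assert session.accepted or not session.mic_live, trace

if __name__ == "__main__":
    fuzz()
    print("no invariant violations found")
```

Random event orderings are cheap to generate, which is why a flow bug like this one is fuzzable in principle; the hard part Rich alludes to is modeling the real system faithfully enough that the harness exercises the same code paths.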
02:26:13
◼
►
- All right, last but not least,
02:26:14
◼
►
I wanna talk about Amazon buying Eero.
02:26:16
◼
►
Because Eero, I have to say, is a long time sponsor
02:26:20
◼
►
of this podcast in particular, but Daring Fireball,
02:26:24
◼
►
they've sponsored weeks at Daring Fireball as well.
02:26:27
◼
►
I have Eero Wi-Fi network equipment;
02:26:31
◼
►
that's what gives me Wi-Fi at home.
02:26:35
◼
►
I'm still using them, and
02:26:36
◼
►
I plan to continue using them.
02:26:39
◼
►
But I have to say, I'm a little disappointed by the news.
02:26:43
◼
►
And I want to talk about it in an episode
02:26:46
◼
►
when they're not sponsoring the show.
02:26:48
◼
►
Yeah, I mean, I'm on Ubiquiti myself, not because of privacy
02:26:52
◼
►
concerns, just because of the nature of it, because I could.
02:26:55
◼
►
Yeah, well, Ubiquiti, from what I understand, is higher end.
02:27:00
◼
►
It takes a little bit more expertise to set up,
02:27:03
◼
►
maybe a lot more expertise.
02:27:05
◼
►
It's good stuff.
02:27:06
◼
►
I've heard nothing but good things about Ubiquiti's Wi-Fi stuff. I like the sort of just-plug-
02:27:12
◼
►
it-in-and-forget-it aspect of the Eero thing, because I'm lazy. And, you know,
02:27:19
◼
►
at least till now, I do trust them. And it really is the ease of setup that makes it really
02:27:25
◼
►
easy to recommend to friends and family. I mean, again, Eero is not sponsoring this
02:27:30
◼
►
episode of the show. But I will say their stuff is really, really easy to set up,
02:27:34
◼
►
and their app has a really, really good interface
02:27:38
◼
►
for configuring stuff,
02:27:40
◼
►
anything that a normal person would wanna configure
02:27:44
◼
►
on their WiFi.
02:27:45
◼
►
And I guess the thing that I wrote was that,
02:27:50
◼
►
look, I liked Eero as an independent company,
02:27:54
◼
►
just having a WiFi company
02:27:55
◼
►
that was just doing their own thing.
02:27:57
◼
►
I kinda knew in the back of my head
02:27:58
◼
►
that the end result for them
02:28:01
◼
►
was probably gonna be an acquisition,
02:28:03
◼
►
and I was sort of hoping it was gonna be Apple
02:28:05
◼
►
for privacy reasons, and that maybe that would be
02:28:09
◼
►
Apple's re-entry in the Wi-Fi market.
02:28:11
◼
►
I could think of worse places than Amazon.
02:28:16
◼
►
Could've been Facebook.
02:28:17
◼
►
But the truth is, owning a Wi-Fi base station
02:28:23
◼
►
gives you everything you would want.
02:28:24
◼
►
In terms of that stuff that Facebook was doing
02:28:26
◼
►
with their VPN, in terms of like,
02:28:28
◼
►
we'd like to know what people are doing
02:28:29
◼
►
with their network traffic.
02:28:30
◼
►
Well, if you control the base station,
02:28:32
◼
►
you could do that.
02:28:34
◼
►
- Yeah, I had really mixed feelings.
02:28:37
◼
►
So it was Eero or Ubiquiti
02:28:39
◼
►
when I was deciding on my network.
02:28:41
◼
►
And the main reason I went with Ubiquiti
02:28:43
◼
►
is 'cause I've got the background
02:28:44
◼
►
and I like to tinker with that stuff.
02:28:47
◼
►
I had a different one before that was an Eero competitor
02:28:50
◼
►
that's kind of tanked, Luma,
02:28:52
◼
►
'cause I had known some people over at the company
02:28:55
◼
►
when they started that.
02:28:56
◼
►
So here, put the security hat on:
02:29:02
◼
►
here's a risk assessment.
02:29:04
◼
►
One is, they can't do that, track all that info,
02:29:07
◼
►
without at least notifying you.
02:29:08
◼
►
So you're covered there, or the FTC,
02:29:11
◼
►
or what's left of the federal government, could go after them.
02:29:15
◼
►
And then they would lock themselves out
02:29:16
◼
►
of the European market and stuff where you just can't do that.
02:29:20
◼
►
Two, your cable company or whoever
02:29:23
◼
►
you get your internet from tracks absolutely everything
02:29:25
◼
►
anyway, as does your phone company.
02:29:28
◼
►
Like, that traffic is just there.
02:29:31
◼
►
I've looked at VPNing and I do use a couple of VPNs
02:29:36
◼
►
at times, mostly when I use Facebook,
02:29:41
◼
►
'cause it's just to fuck with Facebook.
02:29:42
◼
►
But even, like, I've got gig Ethernet,
02:29:47
◼
►
and there's not a gig Ethernet VPN supported anywhere
02:29:50
◼
►
I can get my hands on.
02:29:51
◼
►
So there's, like, a sacrifice you've got to make there.
02:29:53
◼
►
So that data is getting out anyway
02:29:55
◼
►
and Amazon or whoever could buy it.
02:29:56
◼
►
So I don't mean to diminish the risk,
02:29:58
◼
►
but the creep factor is really there.
02:30:00
◼
►
And just the fact that that was everybody's first response,
02:30:04
◼
►
like you, when Eero got acquired, it's like,
02:30:07
◼
►
"Damn it, now Amazon's gonna read all my websites."
02:30:09
◼
►
- And I have an Echo in the kitchen, you know,
02:30:14
◼
►
right next to our HomePods.
02:30:16
◼
►
- Same here.
02:30:18
◼
►
- I have one, so it's not like I'm anti putting them in, you know,
02:30:22
◼
►
and I've got another one in my living room,
02:30:24
◼
►
which I really probably could disconnect
02:30:26
◼
►
'cause I don't use it anymore,
02:30:29
◼
►
but more or less just for controlling lights
02:30:31
◼
►
and shades verbally.
02:30:33
◼
►
So it's not like I'm opposed
02:30:35
◼
►
to putting Amazon internet-connected devices in my home.
02:30:39
◼
►
I don't know. But, like I wrote yesterday,
02:30:44
◼
►
there was a, I forget who had the story,
02:30:46
◼
►
but somebody asked Amazon for comment,
02:30:48
◼
►
do you plan to change the terms of service for Eero?
02:30:51
◼
►
And they were like, no, we don't at this time.
02:30:55
◼
►
And I get it, like I even wrote, I get it, I get it.
02:30:58
◼
►
They're not going to say no.
02:30:59
◼
►
They're not gonna, the ink isn't even dry on this.
02:31:02
◼
►
It hasn't even been approved.
02:31:04
◼
►
It's not a finalized acquisition.
02:31:07
◼
►
Although I can't imagine why it wouldn't go through,
02:31:10
◼
►
especially in the Trump administration.
02:31:12
◼
►
And that's not anti-Trumpism,
02:31:14
◼
►
that's just general Republicanism being more amenable,
02:31:17
◼
►
less likely
02:31:25
◼
►
to look askance at an acquisition for anti-competitive reasons. I mean, I would bet big money that
02:31:33
◼
►
this will go through, that Amazon will successfully acquire Eero.
02:31:37
◼
►
Well, I mean, I don't know. And again, I don't think
02:31:46
◼
►
that if you're an existing Eero customer, you have as much to worry about as what does
02:31:49
◼
►
it mean going forward when there are new devices that come with new terms of service? Like I don't
02:31:55
◼
►
really, you know, Amazon's not stupid. And I don't think they're evil. I really don't, you know, so
02:32:01
◼
►
they're not going to do something dumb and, you know, turn your existing Eeros into spying
02:32:07
◼
►
devices without telling you. Well, I mean, the vast majority of my professional work is on
02:32:13
◼
►
Amazon, with Amazon Web Services stuff and doing assessments, building things and everything else.
02:32:18
◼
►
I mean, Amazon, you know, it's another big company. I think what this comes down to in a lot of these
02:32:24
◼
►
privacy-related issues, because, I mean, that's a big chunk of what we've talked about through
02:32:30
◼
►
the last couple hours is like, it's just privacy after privacy after privacy. And
02:32:35
◼
►
it's a really personal thing. You know, it's like, all of our lines are very personal and not
02:32:43
◼
►
always logical. Like I don't do the webcam stickers, but I won't use an Android phone.
02:32:47
◼
►
I minimize Google services and Facebook, even though I have to use Google for a ton of work
02:32:53
◼
►
stuff. And then I'm like, well, the G Suite stuff, they have privacy because you're
02:32:59
◼
►
paying for it. It's enterprise versus like, there's just all these lines there. And it's
02:33:02
◼
►
becoming, you know, hard to navigate. But I think, like with the Eero, you know,
02:33:07
◼
►
it's something you've come to trust and expect to behave a certain way. And it's that
02:33:12
◼
►
fear that that's going to change; it's just totally legitimate. Well, and the flip
02:33:17
◼
►
side of it, the elephant in the room is the fact that Apple has exited this market, the
02:33:21
◼
►
consumer Wi-Fi base station market. You've said at least two or three times during the
02:33:26
◼
►
show that you've gone all in on Apple stuff because you trust them. As a company that
02:33:35
◼
►
more and more promotes itself as a protector of personal privacy, it just feels like an
02:33:39
◼
►
abdication for Apple to exit this market.
02:33:44
◼
►
I can only presume that there are good business reasons
02:33:50
◼
►
for it, that they weren't making significant money
02:33:52
◼
►
with the AirPort base stations.
02:33:55
◼
►
If they were, why in the world would they get out of it?
02:33:59
◼
►
But as a company that wants to promote themselves as,
02:34:02
◼
►
"Hey, get into our ecosystem and we're gonna protect
02:34:06
◼
►
your privacy 'cause we have no interest
02:34:08
◼
►
or financial reason otherwise," boy,
02:34:13
◼
►
if you don't have a base station you can trust,
02:34:15
◼
►
it's all kind of moot anyway, right?
02:34:17
◼
►
Your devices, your Mac, and your iPhone, and your iPad
02:34:21
◼
►
can all be as private as Apple can possibly make them.
02:34:25
◼
►
And if your Wi-Fi network is transmitting everything
02:34:30
◼
►
you do to Amazon, it's all moot.
02:34:34
◼
►
That is the single most perplexing thing in
02:34:37
◼
►
Apple's product line to me, even over, you know, the debates about the MacBooks
02:34:42
◼
►
or the different iPad sizes, and is the iPod touch still around, I mean, things like that. The most
02:34:48
◼
►
perplexing to me is, if we look at Apple's longer-term strategy of TV and HomePod and watches and
02:34:56
◼
►
phones and iPads and everything else within the home. And as they expand out with HomeKit,
02:35:04
◼
►
like, how do they not have the thing that ties all of those together?
02:35:11
◼
►
I mean, it's like the linchpin of it all. And one of the biggest obstacles of getting
02:35:17
◼
►
somebody moved up on technology, with friends and family who aren't good at this stuff, is
02:35:21
◼
►
routers. Like, I can't just point them to something and say it just works. I kind of can
02:35:27
◼
►
now, because Eero, or like the Netgear Orbi and stuff, are much better. But imagine if it had
02:35:33
◼
►
all the password sharing features that we now have between iOS devices. Like, I mean,
02:35:38
◼
►
there's just so much. I don't know why it's not there. Yeah, there's a lot that
02:35:42
◼
►
I could do with a modern one, you know, because the AirPort that they sort of stepped
02:35:48
◼
►
away from was years old at the time anyway; it had been years since it
02:35:52
◼
►
had been significantly updated. And like you said, there's all sorts of stuff they've done
02:35:56
◼
►
recently, like with the password sharing stuff, that, man, if it was built
02:36:02
◼
►
in at the base station level, would be even more convenient. There's all sorts of little
02:36:06
◼
►
things like that. I don't think they did an AirPort update in the whole era of, what was
02:36:13
◼
►
the name of it? It was like a catch-all phrase for this.
02:36:17
◼
►
No, but what's the name for the integrations between, like, iPhone and Mac?
02:36:21
◼
►
Oh, Continuity.
02:36:23
◼
►
Continuity, right. In the era of Continuity. I mean, surely there are ways to do things
02:36:29
◼
►
that could make Continuity even more fluid
02:36:32
◼
►
and less of a delay,
02:36:34
◼
►
like when you go to the sharing sheet on iOS
02:36:37
◼
►
and you wanna open up the current webpage on your Mac,
02:36:40
◼
►
just if the base station is looking for stuff like that,
02:36:43
◼
►
I mean, surely they could make it more convenient.
02:36:46
◼
►
I mean, then the password sharing thing
02:36:47
◼
►
for the Wi-Fi is so cool.
02:36:49
◼
►
It doesn't come up very often,
02:36:51
◼
►
but when it does, I'm like, oh yeah, that's pretty cool.
02:36:54
◼
►
- Yeah, I always close that window
02:36:55
◼
►
because I'm tapping other stuff 'cause it's so cool.
02:36:57
◼
►
I forget it's there.
02:36:59
◼
►
Well, and I mean, some of it is, you know, just getting devices hooked onto the network.
02:37:05
◼
►
Or, man, I just want my iTunes Wi-Fi backups to work. Like,
02:37:14
◼
►
I can't get them to work on any of my iOS devices anymore. And it's probably some networking thing that a
02:37:18
◼
►
tool like that could just deal with for me. Yeah. You know, whether, you
02:37:23
◼
►
know, whether they were making a lot of money on it or not, it just seems baffling to me
02:37:27
◼
►
that they got out of that market. And it's like, I don't know, what
02:37:30
◼
►
is Apple's recommendation for, okay, what should I use for my Wi-Fi? What
02:37:35
◼
►
can Apple say they stand behind: here's what we think you should use instead?
02:37:39
◼
►
Yeah, I mean, again, it's just astounding. It is the glue for their entire ecosystem
02:37:45
◼
►
in the home. And they have nothing for it. Yeah. It seems untenable, to be honest, like,
02:37:51
◼
►
I don't know, you've got all the little birdies. I don't have little birdies anymore. Well,
02:37:54
◼
►
I don't have any little birdies that have said anything about AirPort or anything along those
02:37:59
◼
►
lines. So, you know, if there's something cooking, I have no idea. I hope there is, but
02:38:03
◼
►
it sure looks like they're just leaving it aside, and boy, it really could use Apple.
02:38:09
◼
►
Apple, are you listening? 'Cause really, actually, I mean,
02:38:15
◼
►
I'm all in on this other stuff now, and it's expensive to set up, but for
02:38:19
◼
►
every place else I am, I'd love to have it. Yeah. All right, Rich Mogull, thank you for being on the
02:38:23
◼
►
show. Everybody, your long-standing website and consultancy is Securosis,
02:38:30
◼
►
that's S-E-C-U-R-O-S-I-S dot com. Probably the easiest way to get there, though, is to just Google Rich
02:38:38
◼
►
Mogull, M-O-G-U-L-L. And now, you mentioned at the outset of the
02:38:43
◼
►
show, you've got a new startup you're working at? Tell me about it.
02:38:46
◼
►
It's pretty cool. It's called DisruptOps. So it's cloud operations and security
02:38:51
◼
►
automation. And basically I was talking before about Amazon Web Services. We built a platform that
02:38:57
◼
►
can kind of go in and not just find problems with your environment, but automatically remediate
02:39:02
◼
►
those, integrate with your existing workflows, do things like, "Hey, we found this public S3 bucket.
02:39:07
◼
►
We'll send a notification to the right person. If they don't close it in three days, we shut it down."
02:39:12
◼
►
So it's all these really cool, software-defined workflows and automation to help secure
02:39:20
◼
►
cloud environments, and then also, like, save money. Like, hey, you've got all this stuff
02:39:25
◼
►
you're not using. Why don't you let us shut it down? Stuff like that.
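To give a concrete flavor of that find-and-remediate flow: here is a hedged sketch using boto3 that flags buckets whose ACLs grant public access and then enables S3 Block Public Access on them. It's an illustrative approximation, not DisruptOps' actual product code; real tooling would notify the owner first and only remediate after the escalation window.

```python
# Hedged sketch of a check-and-remediate flow for public S3 buckets.
# This is an illustrative approximation, not DisruptOps' product code.
import boto3

# ACL grantee URIs that mean "everyone" or "any AWS account."
PUBLIC_URIS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def find_public_buckets(s3):
    """Yield names of buckets whose ACL grants access to a public group."""
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") in PUBLIC_URIS:
                yield bucket["Name"]
                break

def remediate(s3, bucket_name):
    """Turn on S3 Block Public Access for the bucket."""
    s3.put_public_access_block(
        Bucket=bucket_name,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )

if __name__ == "__main__":
    s3 = boto3.client("s3")
    for name in find_public_buckets(s3):
        print(f"public bucket found: {name}")
        remediate(s3, name)  # in practice: notify first, escalate on a timer
```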
02:39:27
◼
►
Yeah. I just saw somebody recently who had, like, a photo sharing thing, had an
02:39:32
◼
►
S3 bucket that was public and should not have been public, with guessable names, and people
02:39:39
◼
►
were fishing all sorts of pictures out of there that they shouldn't have. That's a common one.
02:39:43
◼
►
Yeah. And the weird thing is, in a lot of ways,
02:39:48
◼
►
my CEO and I fight on that one all the time. I'm like, it's so easy, we shouldn't even have to talk about it.
02:39:51
◼
►
And he goes, everybody still screws it up. Yeah, I don't know why. But
02:39:55
◼
►
public S3 buckets are a huge one. Yeah. So that's great. So it's disruptops.com.
02:40:01
◼
►
Yep, that's the website. And your title is VP of product?
02:40:05
◼
►
Yeah, actually. Oh, aren't you important? No, not really.
02:40:12
◼
►
It's fun though.
02:40:13
◼
►
- People can also, and perhaps where people listening
02:40:17
◼
►
to the show are most familiar with your name
02:40:19
◼
►
is that you're a long-standing contributor to TidBITS,
02:40:21
◼
►
where you write on security-related issues, at tidbits.com.
02:40:26
◼
►
And on Twitter, rmogull, and that's with two Ls.
02:40:32
◼
►
You know, my wife said,
02:40:35
◼
►
"Who's on the show?"
02:40:36
◼
►
And I said, "Somebody knew Rich Mogul."
02:40:38
◼
►
And she said, that's his real name? And I have to admit, it'd never occurred to me.
02:40:45
◼
►
In all the years I've known you, and I've known you at least 10 years, it never really occurred to me
02:40:49
◼
►
that your name is almost like a comic book character. You know, it never
02:40:54
◼
►
clicked for me until I was, like, a junior in college. 'Cause I'm at a bar, I shit you not,
02:41:00
◼
►
I'm at a bar and I'm hitting on a girl, and I go, hi, I'm Rich. Like, I can tell you the name
02:41:06
◼
►
of the bar, speaking of memory things: it was The Catacombs in Boulder, Colorado.
02:41:11
◼
►
And I'm at the bar and I'm like, hey, I'm Rich. And she looked at me and she goes, that's nice.
02:41:14
◼
►
And I turned away and I'm like, oh, damn it, why did I not figure that out earlier?
02:41:22
◼
►
Jeff Bezos, a rich mogul; you, the Rich Mogull.
02:41:29
◼
►
Yeah, yeah. I don't know. And I look a little like Louis C.K., so I'm in bad company
02:41:34
◼
►
these days. Yeah, keep that on the down-low. All right. Thank you, Rich. I appreciate it.
02:41:39
◼
►
Thanks a lot, John. It was fun.