The Talk Show

162: ‘Special Bullying Venue’ With Glenn Fleishman


00:00:00   Every Mac user is the smuggest bastard in the world.

00:00:03   A virus is never going to affect us, we're immune, whatever.

00:00:06   And the reality of course is some people feel that way.

00:00:08   Most people who've looked at the situation at all know that, um, we're not immune.

00:00:14   We're just really lucky.

00:00:15   It's just the target's been small.

00:00:16   Apple does a good job of, um, pushing people to upgrade.

00:00:19   They support systems back many years.

00:00:21   The target is so small that it's usually not worthwhile for

00:00:25   malware makers to target us.

00:00:27   So our immunity has been that we're insignificant as a target, not necessarily in absolute numbers.

00:00:33   So this new thing is this group at Cisco, Talos, um, they, uh, are one of many security firms.

00:00:40   It's constantly researching for weaknesses, usually to help clients.

00:00:43   They are trying to make sure their clients are protected.

00:00:46   They find something. This fellow, who I, um, had some communication with, Tyler Bohan, uh, found five previously undiscovered,

00:00:55   fairly severe flaws in the way that image format files get parsed.

00:01:01   And the trouble was like three or four of them were reasonably severe, but you had to kind of open a file and whatever.

00:01:06   But there's one related to TIFF. John, you've been using TIFF for your whole life, right?

00:01:10   Since you were born.

00:01:12   That's right.

00:01:12   Ancient format. So there's something in the parser,

00:01:16   if you use a tiled TIFF file and it's formatted in this very specific malicious fashion,

00:01:21   then iOS and ostensibly OS X,

00:01:25   and probably, I think, the other two OSes

00:01:27   that are also patched, tvOS and watchOS.

00:01:30   If it's rendered, you get, in that case,

00:01:33   a buffer overflow,

00:01:35   the usual typical old thing.

00:01:36   And it can allow the potential for malicious code to execute

00:01:40   and take control of the machine and do whatever.
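The bug class being described, a parser trusting a size field declared in the file itself, can be sketched schematically. This is illustrative Python, not Apple's actual ImageIO code; all the names here are invented for the example:

```python
# Schematic sketch of the "trust the file's declared tile size" bug class.
# NOT Apple's ImageIO code -- every name here is invented for illustration.

def parse_tile_naive(buf: bytearray, tile_payload: bytes, declared_len: int):
    """Naive parser: trusts the length declared in the file header.

    In C this unchecked copy writes past the allocation -- a classic
    buffer overflow. In Python the bytearray silently grows past its
    intended size instead, but the logic error is the same.
    """
    buf[:declared_len] = tile_payload[:declared_len]

def parse_tile_checked(buf: bytearray, tile_payload: bytes, declared_len: int):
    """Fixed parser: validates the declared length before copying."""
    if declared_len > len(buf) or declared_len > len(tile_payload):
        raise ValueError("tile length exceeds allocated buffer")
    buf[:declared_len] = tile_payload[:declared_len]

tile_buffer = bytearray(16)   # parser allocates a fixed-size tile buffer
evil = b"A" * 64              # malicious file declares a much larger tile

try:
    parse_tile_checked(tile_buffer, evil, 64)
except ValueError as e:
    print("rejected:", e)
```

The naive version is the memory-corruption recipe; the checked version is what a patched parser does: reject any tile whose declared size doesn't fit the buffer it allocated.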

00:01:42   - Just by displaying the image.

00:01:44   - Yeah, and here's the reason it's considered

00:01:46   particularly insidious.

00:01:47   You could load it with a webpage in Safari,

00:01:49   you could have iMessage open and somebody could,

00:01:53   you know, send you a text message with a TIFF in it.

00:01:55   There's a few other vectors.

00:01:57   Merely by rendering the preview,

00:01:59   it would have to parse the file enough

00:02:01   that this condition could be exploited.

00:02:03   That sounds pretty horrific, right?

00:02:05   And that's why it was considered,

00:02:07   it's considered very severe.

00:02:10   And there's some discussion about how severe it is. Apple's patched it,

00:02:14   they used responsible disclosure,

00:02:16   it's not a zero-day,

00:02:17   there aren't exploits in the wild.

00:02:19   And if you update to all the current OSs,

00:02:23   if you just update to the latest micro release,

00:02:25   you are protected against that particular TIFF exploit

00:02:28   as well as the other four image format related exploits.

00:02:32   - I have so many follow up points already.

00:02:35   - By the way, BMP format, BMP is actually also affected,

00:02:38   which is kind of hilarious.

00:02:39   Like TIFF and BMP, nobody's looking at them

00:02:41   and I think they're just old implementations.

00:02:43   - Let me see if I can keep them in my head.

00:02:46   One of them is, yes, I used to use, as you know,

00:02:49   and I know you have the ink-stained hands as well,

00:02:52   I came from a world of print design,

00:02:54   and in the world of print design in the '90s,

00:02:57   TIFF was the de facto format, at least for line art,

00:03:02   for any kind of bitmap.

00:03:06   Everything went through Photoshop into TIFF

00:03:09   before it went into production.

00:03:11   But it's sort of a notorious file format,

00:03:14   'cause I seem to recall vaguely that over the years,

00:03:16   there have been an awful lot of security issues like this

00:03:21   that have to do with TIFF parsing.

00:03:24   - That is my recollection as well.

00:03:25   I didn't go back and look it up,

00:03:26   but I think it's that old code thing.

00:03:28   Like people wrote TIFF parsers in 1980 something,

00:03:32   no, sorry, early 1990s, or late 1980s maybe,

00:03:34   probably late 1980s.

00:03:35   And the parsers, the basic code is probably mostly unchanged

00:03:39   in a lot of ways for 30 years.

00:03:41   - And it's probably a very difficult format.

00:03:43   It's probably a poor format spec, that's my guess,

00:03:46   and that therefore makes it

00:03:51   hard to write a parser and easy to make mistakes

00:03:56   with memory management in the handling of the parser.

00:03:59   So anyway, all right, here's another point.

00:04:00   Because it's such a weird old format

00:04:02   and it was never really part of the web in any way,

00:04:06   I'm very surprised that iOS and watchOS

00:04:10   even have code to parse it.

00:04:13   I know this is part of the, um, like the Quick

00:04:15   Look thing, though. Like, um, all

00:04:18   the different OSes support Quick Look for, you

00:04:21   know, every major file type, and that includes some

00:04:23   weird image formats and other things that you,

00:04:26   you know, might never be previewing. But you

00:04:28   can preview raw images in some cases, PNGs,

00:04:31   or DNG files, one of the digital formats,

00:04:34   um, all these things. PNG, by the way,

00:04:37   that specification, Portable Network Graphics,

00:04:40   that's kind of TIFF's replacement because it

00:04:41   has all the different kinds of things you need, lossless

00:04:44   compression and different alpha channels, whatever.

00:04:46   In the specification, and I met one of the people

00:04:49   who wrote the spec years ago, in the 1990s,

00:04:51   it says PNG is pronounced 'ping.'

00:04:54   So there's no question.

00:04:55   (laughing)

00:04:57   - How else could you pronounce it?

00:05:00   P-N-G, I guess.

00:05:02   - I don't know.

00:05:02   - The only other way you, yeah,

00:05:03   the only other way I could think you could pronounce it

00:05:05   would be P-N, or just saying the initials, P-N-G.

00:05:09   - I pronounce it 'pang,' I don't know.

00:05:10   Who knows, someone would find a way.

00:05:13   - Ping is a-- - Pinge, ping it.

00:05:15   - Ping is an odd historical success

00:05:18   because a lot of times when people get together in the open

00:05:22   and say, you know, there's even like an XKCD comic about it

00:05:27   where it's like, we have way too many specs

00:05:28   and they all stink.

00:05:29   You know what the solution is?

00:05:31   A new spec.

00:05:32   You just add to the pile, usually,

00:05:35   even if it's a noble goal.

00:05:36   Whereas PNG had the noble goal of saying,

00:05:39   we've got a couple of bitmap image

00:05:43   formats that are out there being used,

00:05:45   and they are all terrible for one reason or another.

00:05:49   On the web we had GIF files

00:05:51   and don't even tell me that you're a 'Jif' person.

00:05:53   - Oh my God. - Are you a 'Jif' person?

00:05:55   - 'Choosy programmers choose Jif.'

00:05:57   That's the--

00:05:58   - It's the Graphics Interchange Format.

00:06:01   Anyway, it's a GIF file.

00:06:07   Just, I mean, horrendous format.

00:06:10   I mean, you can only have 256 colors at a time.

00:06:12   I mean, it's goofy.
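That 256-color ceiling falls straight out of the GIF header: the logical screen descriptor reserves only three bits for the global color table size, so the table tops out at 2^8 entries. A minimal reader following the GIF89a spec (the header bytes here are hand-built for the example):

```python
import struct

def gif_palette_size(data: bytes) -> int:
    """Return the global color table size of a GIF, per the GIF89a spec."""
    if data[:6] not in (b"GIF87a", b"GIF89a"):
        raise ValueError("not a GIF")
    # Logical Screen Descriptor: width, height (little-endian u16), packed byte
    width, height, packed = struct.unpack("<HHB", data[6:11])
    if not packed & 0x80:          # bit 7: global color table flag
        return 0
    # Bits 0-2 give N; the table holds 2**(N+1) entries -- at most
    # 2**8 = 256, hence the famous 256-colors-at-a-time ceiling.
    return 2 ** ((packed & 0x07) + 1)

# hand-built header: a 1x1 GIF89a declaring the maximum-size palette
# (packed byte 0xF7 = table flag set, size bits = 7)
header = b"GIF89a" + struct.pack("<HHBBB", 1, 1, 0xF7, 0, 0)
print(gif_palette_size(header))   # prints 256
```

Three bits of size field, and that's the whole story of the 256-color limit.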

00:06:14   - It was great.

00:06:16   It was great for CompuServe.

00:06:18   - It's juicy.

00:06:19   - It was great for CompuServe

00:06:21   because I was a CompuServe user.

00:06:23   Were you, I can't remember.

00:06:24   Were you a CompuServe user?

00:06:24   - No, I was never on CompuServe.

00:06:26   - Okay, so, for a long time,

00:06:28   I could remember my five-comma-four-digit address.

00:06:31   It begins with seven.

00:06:32   I've totally lost it.

00:06:33   That's one of the problems with aging.

00:06:34   I cannot remember my 1979 CompuServe login.

00:06:37   It makes me sad, but on CompuServe, it was great.

00:06:39   We dialed up.

00:06:40   We had, like, I don't know, 1200-baud modems or 1200 bps modems, something like that.

00:06:44   It was a great compact format when you had that, and it was rendered, you know, rendered

00:06:49   in a way that made sense, like line-at-a-time or interlacing, you know, all

00:06:53   these things. Most computers only had 16-color displays.

00:06:57   So yeah, it was perfect for the day and it's astonishing that it still is great,

00:07:02   but yeah, PNG. And you remember the most boring thing in the world:

00:07:06   the LZW algorithm was patented and Unisys tried to enforce it, and PNG is an outcome

00:07:12   of that patent fight.

00:07:14   The patent expired in, I had to look it up, 2003, because I'd already forgotten.

00:07:18   Um, but that was part of the issue.

00:07:19   Like TIFF used LZW, which meant it had to be licensed.

00:07:22   Uh, GIF was being attacked or potentially Unisys wanted to license it.

00:07:27   Uh, I believe LZW may have been one of the early, uh, one of the issues

00:07:31   with, not business method patents, but software-ish patents,

00:07:34   algorithm patents.

00:07:35   There were some issues there anyway.
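For reference, the algorithm at the center of that patent fight is small enough to sketch. Here's a minimal byte-oriented LZW encoder, illustrative only, not the GIF-specific variant with variable-width codes and clear codes:

```python
def lzw_encode(data: bytes) -> list[int]:
    """Minimal LZW: emit dictionary codes for the longest known prefixes."""
    # Start with a dictionary of all single-byte strings (codes 0-255).
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    out, cur = [], b""
    for b in data:
        nxt = cur + bytes([b])
        if nxt in table:
            cur = nxt                  # extend the current match
        else:
            out.append(table[cur])     # emit code for the longest match
            table[nxt] = next_code     # learn the new string
            next_code += 1
            cur = bytes([b])
    if cur:
        out.append(table[cur])
    return out

codes = lzw_encode(b"ABABABAB")
print(codes)   # repeated pairs collapse into newly learned codes
```

The dictionary-building trick, learning longer strings as it goes, is what made LZW attractive for GIF and TIFF compression, and it's exactly what the Unisys patent covered.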

00:07:37   Yeah. And it came out of that and historically we're getting sidetracked

00:07:39   here, but it's all good stuff.

00:07:40   The, it was, uh, it was a big story locally because

00:07:45   Unisys is a Philadelphia-area company, at least they were, I don't know if they

00:07:48   still are. Um, but the gist of it is that

00:07:53   they owned the patents behind, or patents that were a fundamental part of,

00:07:58   the GIF format.

00:08:01   Uh,

00:08:02   son of a bitch.

00:08:06   And they never enforced it, and, you know, there were

00:08:12   GIFs being displayed and parsed and created,

00:08:16   you know, in every image editing tool, and it was all over the place. And then

00:08:20   Netscape added, you know, put support in to render them in

00:08:26   Mosaic or whatever, you know, whatever the first one of their browsers

00:08:29   was that did images, which was a big deal.

00:08:32   It was actually a big deal when they added image tags

00:08:35   to the web browser.

00:08:37   - Oh my God.

00:08:37   - And they waited until like, all of a sudden,

00:08:42   like when the internet exploded

00:08:43   and everybody was buying, you know,

00:08:47   like Adam Engst's internet book, and everybody is like,

00:08:49   I'm gonna get on the internet

00:08:50   and Bill Gates is writing a memo that we're gonna,

00:08:52   you know, turn the whole, you know,

00:08:54   Microsoft around to the internet, internet, internet.

00:08:57   And then all of a sudden, somebody at Unisys was like,

00:08:59   hey, we own a patent for this.

00:09:00   (laughing)

00:09:02   And the open community responded and said,

00:09:05   "We've gotta create a new format."

00:09:07   And they did it, they did it quickly.

00:09:09   They got support into all the tools quickly.

00:09:12   And PNG took over the world as it was supposed to,

00:09:17   very quickly, and that almost never happens.

00:09:20   - That's why PNG is such a great format in a lot of ways.

00:09:22   I mean, I think it's funny that it didn't actually

00:09:24   ultimately replace everything except JPEG,

00:09:28   but it's just, it's great.

00:09:29   It has all the different kinds, you know, two modes, you can do 24-bit with alpha transparency.

00:09:35   And it's just, it's not quite as compact, I think, as GIF for the same thing.
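The modes being described are visible right in a PNG's IHDR chunk, where a color-type byte distinguishes, per the PNG spec, truecolor (2) from truecolor-with-alpha (6), among others. A minimal sketch, parsing a hand-built header:

```python
import struct

# Color types defined by the PNG specification.
COLOR_TYPES = {0: "grayscale", 2: "truecolor", 3: "indexed",
               4: "grayscale+alpha", 6: "truecolor+alpha"}

def png_mode(data: bytes) -> str:
    """Read a PNG's IHDR chunk and describe its pixel format."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":   # fixed 8-byte PNG signature
        raise ValueError("not a PNG")
    # First chunk: 4-byte big-endian length, then 4-byte type "IHDR"
    length, ctype = struct.unpack(">I4s", data[8:16])
    if ctype != b"IHDR" or length != 13:
        raise ValueError("malformed IHDR")
    # IHDR body: width, height, bit depth, color type, compression,
    # filter, interlace -- all big-endian
    w, h, depth, color, comp, filt, interlace = struct.unpack(">IIBBBBB", data[16:29])
    return f"{w}x{h}, {depth}-bit {COLOR_TYPES[color]}"

# hand-built signature + IHDR for a 1x1, 8-bits-per-channel RGBA image --
# the "24-bit color plus alpha" mode that GIF never had
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 6, 0, 0, 0)
png = b"\x89PNG\r\n\x1a\n" + struct.pack(">I", 13) + b"IHDR" + ihdr
print(png_mode(png))   # prints: 1x1, 8-bit truecolor+alpha
```

Color type 6 with 8-bit depth is the full-color-plus-alpha mode the conversation is pointing at; type 3 (indexed) is PNG's GIF-style palette mode.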

00:09:41   But yeah, patents. This almost came up, you remember, with the committee

00:09:48   that runs the, it was H.264, whatever the underlying patents are there, there's a patent

00:09:53   pool.

00:09:54   But I think at one point there was a question about whether,

00:09:58   if you displayed H.264 video on your site without going through a third-party package

00:10:03   like Adobe Flash, for instance, which wraps it, you might actually owe separate licensing

00:10:08   fees, unlike when using Flash.

00:10:10   One of the reasons that Flash was successful, uh, is because Adobe handled the licensing

00:10:16   for video patents, or at least they said they did.

00:10:18   I think that fact was overlooked because, you know, Flash is so terrible. But it meant

00:10:22   there was no, if you were, you know, CBS or something, if you put it in Flash, your

00:10:26   lawyers must have assured you it's cool.

00:10:28   We don't have any fees.

00:10:30   If you'd used HTML5, as if it existed then, to show,

00:10:34   you know, to directly stream that kind of format,

00:10:37   I think there might've been a patent issue.

00:10:38   And that's been resolved since the patent holders changed it

00:10:41   so they wouldn't, you know, poop in the pool.

00:10:42   - I seem to remember that.

00:10:43   You know, it's always a good sign

00:10:45   when the lawyers are making engineering decisions

00:10:48   (laughing)

00:10:49   for your web technologies.

00:10:51   - It's great, I like that.

00:10:52   - Now what could go wrong?

00:10:55   All right, back to the security issue.

00:10:56   So all of Apple's operating systems

00:11:01   were vulnerable to this threat.

00:11:04   So it's- - Yeah, all the current ones

00:11:06   and also two previous versions of OS X were tested.

00:11:09   It's possible that older versions are also vulnerable

00:11:12   and they just didn't test them.

00:11:14   I don't have clarity on that

00:11:15   because the particular library

00:11:17   may go back a bazillion years,

00:11:19   could be back to 10.6 or 10.0 for all we know.

00:11:25   But there are no actual exploits in the wild, right?

00:11:29   Or at least none that are known.

00:11:31   - None that are known.

00:11:32   It's possible, so I talked to the folks

00:11:34   at the Cisco division directly,

00:11:37   'cause I could not find, I saw this coverage.

00:11:39   So you know how this works.

00:11:40   Something happens related to Apple,

00:11:42   and the news story is Apple computers burst into flames.

00:11:45   That's like the next day.

00:11:46   It's like, well, it was one computer in a lab.

00:11:48   It was under controlled circumstance.

00:11:49   No, no, Apple computers burst into flames.

00:11:51   So Apple releases these updates,

00:11:54   then the engineer at Cisco, or, I'm sorry, Talos

00:11:56   is the division.

00:11:57   Uh, the guy in charge of

00:11:59   this team is credited with the discovery

00:12:01   by Apple.

00:12:02   So it meant he used responsible disclosure:

00:12:04   reported it months ago, Apple patched it all,

00:12:06   everything's cool.

00:12:06   He writes up a very detailed blog post that

00:12:08   explains the severity and then has some details

00:12:11   about exactly what gets dumped.

00:12:12   He puts some, you know, core dump information

00:12:15   in or traces and things.

00:12:16   And, um, people like, you know, The Register

00:12:19   and Forbes and whatever, right?

00:12:20   They say it's Stagefright for Apple, you know,

00:12:23   and Android still is wrestling with Stagefright,

00:12:26   which was MMS-deliverable, among other vectors,

00:12:29   a way to send a maliciously formatted message

00:12:33   to Android 2.2 and every subsequent system,

00:12:37   and the problem with upgrading older Android systems

00:12:39   and on and on, right?

00:12:40   So stage fright remains, it's possible

00:12:42   there's a couple hundred million Android devices

00:12:44   that remain susceptible to Stagefright. It's a big deal.

00:12:47   And there were viruses that were discovered in the wild

00:12:50   within, I think, weeks of the first Stagefright release, and then months later some more, for

00:12:55   devices that couldn't be patched. - This touches on one of my most precious pet themes in what I

00:13:06   write about, like in media criticism, which is false equivalence. - Yeah, yeah. - And it's a huge

00:13:11   issue in politics, and it's definitely an issue in the tech world too. And it's this idea that, to be

00:13:17   fair, or, more cynically, to sensationalize the story, because I think everybody knows

00:13:26   that putting Apple into headlines gets more clicks.

00:13:34   So Android had stage fright or has stage fright, you know, and has this issue that is actually

00:13:39   being exploited in the world.

00:13:41   And there are lots, not most, you know, I'm not, nobody is saying that every Android phone

00:13:47   out there is hacked or even most of them are, but that there are many people with Android

00:13:51   devices that have malware, perhaps even unbeknownst to the user because of this exploit.

00:13:58   Then Apple has this, you know, Cisco discovers this security problem in Apple's operating

00:14:04   systems and it's presented as though it's, you know, like you just said, they're actually

00:14:08   calling it Apple's version of Stagefright, even though there aren't any known exploits.

00:14:12   Right.

00:14:12   - Right, right, it's a great narrative.

00:14:13   And there are similarities except sandboxing and code signing and so forth.

00:14:20   There's a lot of pieces that are different in terms of how Android deals with incoming

00:14:25   everything or, you know, malicious software that's trying to execute on a system.

00:14:28   There's other protections, even if you manage to deliver a payload, you may just crash a process.

00:14:33   Right.

00:14:33   So the, you know, I talked to the engineer, I went back and forth through email.

00:14:37   I had to go through the PR folks and, um, you know, it's like, is there a proof of concept?

00:14:40   Because with Stagefright, the researchers who did that

00:14:43   delivered an effective proof of concept

00:14:45   showing not just that they could crash a process

00:14:47   or overwrite a buffer,

00:14:48   but that they could actually carry out attacks and gain root.

00:14:50   If I recall, on some Android systems you could get root,

00:14:53   and on some you could take control of the microphone

00:14:57   and things like that.

00:14:58   So there were ways to get access to system resources,

00:15:01   even if you couldn't potentially gain root.

00:15:03   And the Talos researcher said,

00:15:05   in effect, we have a provable exploit

00:15:08   that allows us to do nasty stuff with Safari and OS X.

00:15:13   And the reporting had been about MMS,

00:15:16   but they have no exploit for that.

00:15:18   They think there are some major hurdles in the way

00:15:20   that might be able to be overcome.

00:15:22   They weren't focusing on that

00:15:23   because they knew it was gonna be patched.

00:15:24   They focused on something

00:15:25   where they had a path already to do it.

00:15:27   They suspect that the Safari-plus-OS X pathway

00:15:30   would also let them exploit iOS and mobile Safari.

00:15:35   They haven't tested it, but everything seems the same.

00:15:37   They think there may be a few more barriers in the way there, too, that can be overcome.

00:15:40   This is all with the unpatched version.

00:15:42   So, you know, the truth is, going back to the

00:15:45   beginning, it's a severe bug.

00:15:47   You know, a few years ago, Apple had some Wi-Fi bugs where you could drive by and do

00:15:52   terrible things to someone's Wi-Fi network and gain access and, you know, do all kinds

00:15:56   of stuff to an AirPort base station or Macs, or sorry, Macintoshes, if you could

00:15:59   just, you know, just have physical proximity.

00:16:02   And that's especially terrifying because you don't

00:16:04   actually have to be in their house,

00:16:05   you just have to be within range of their Wi-Fi, which could

00:16:08   be, you know, in a car in front of their house.

00:16:10   Yeah.

00:16:12   Just put a high-gain antenna out the window and you can do it.

00:16:14   And those, they were patched, and people had the same argument then,

00:16:17   and this has happened since, I think, a lot of times: how severe

00:16:21   is it if the threat is over?

00:16:22   It's like, well, this is very severe.

00:16:23   We don't know.

00:16:25   It's severe as an exploit.

00:16:26   You know, these things really are doing something really terribly

00:16:29   wrong they shouldn't do and shouldn't be in the code.

00:16:31   That's true.

00:16:32   And conceivably it's a pathway.

00:16:34   The next steps are in the lab, or, you know, malware could be developed that

00:16:38   would then take advantage of it.

00:16:40   And we could find out how severe in practice it is.

00:16:42   In theory, it's very severe, but because of responsible disclosure and Apple

00:16:47   being able to patch it in a timely manner, there's no evidence anything is in the

00:16:50   wild. That said, this is the kind of thing, when the, um, was it, uh, I'm blanking on

00:16:55   the group in Italy that had all its files hacked, Hacking Team, Hacking

00:17:00   Team or something like that.

00:17:01   Yeah.

00:17:01   Those guys.

00:17:02   And it's true of some other outfits that have had leaked information.

00:17:05   You know, there are all these companies, individual researchers and companies that exist to find zero-days and sell them to governments, right?

00:17:11   And governments are also working on similar things.

00:17:13   It's possible things like this are already completely known, maybe even to multiple parties, and they're being deployed against some Iranian official or American official or Chinese official or a company for, you know, for industrial espionage.

00:17:28   and it's being specifically deployed in a very quiet way

00:17:31   in which it allows them to exfiltrate information

00:17:34   or tap communications, but the idea, it's not,

00:17:39   if that's the case, that hasn't been found yet,

00:17:41   maybe it will now the bug is known,

00:17:43   virus signatures will get updated, people will find it,

00:17:45   but so it's not in the wild.

00:17:46   - Yeah, the most obvious source of malware,

00:17:51   the one that we see and hear about most often,

00:17:56   is the stuff that's sort of out in the open,

00:17:59   where it's just almost like, you almost

00:18:03   want to say, more like common criminals, who either find

00:18:07   exploits like this that you can go through a web page

00:18:10   or certainly email, click this link in an email.

00:18:14   And if you're using a certain type of computer that

00:18:16   has a known exploit, just by clicking the link,

00:18:23   you've got malware on your computer.

00:18:25   And what does the malware do?

00:18:26   A lot of times it sets up like a botnet

00:18:28   or something like that.

00:18:29   And it's like a shotgun approach to the crime

00:18:33   where they're trying to just get hundreds and thousands

00:18:35   of random people, they don't even care who you are,

00:18:38   to run this and steal your Bitcoins

00:18:41   or whatever they wanna do.

00:18:42   That's the stuff that we see.

00:18:47   The part that the paranoid part of my mind goes to is the,

00:18:51   well, what about the Chinese government?

00:18:53   The Chinese government would be, you know,

00:18:55   are they employing people to find exploits like this?

00:18:59   Of course they are, right?

00:19:00   I mean, who, does anybody believe they're not?

00:19:03   Does anybody think the NSA doesn't have

00:19:05   really, really smart people doing this exact same thing?

00:19:10   You know, and then there's the companies,

00:19:14   not just governments, but companies like the hacking team

00:19:16   that you mentioned that sell their services,

00:19:19   you know, to governments and stuff like that.

00:19:20   They find these zero-days and then hold them;

00:19:25   instead of letting them loose,

00:19:26   they're like a precious commodity.

00:19:28   I mean, I think last year somebody actually,

00:19:30   like, in public, was saying,

00:19:32   we're gonna pay a million dollars

00:19:35   if you can find an exploit for iOS 9 or something like that.

00:19:38   - Yeah, yeah.

00:19:39   - I mean, a legitimate, I mean, sort of legitimate.

00:19:42   I mean, it's sort of a scummy underside of the world,

00:19:44   but a serious offer, a legitimate offer, for a million dollars

00:19:48   if you could deliver them an exploit that would let them do it.

00:19:52   - Well, jailbreaking often follows that, right?

00:19:55   There's companies that make a lot of money off, um,

00:19:57   third-party app stores for jailbroken, uh, iPhones, I think.

00:20:02   And so there's money to be made if you get the, uh,

00:20:05   exploit first, the jailbreak pathways first; some of

00:20:09   them sell. Jailbreaking is funny.

00:20:11   Jailbreaking went from a very innocent enterprise in the early

00:20:13   days to something that is now all enmeshed in viruses and criminal

00:20:16   enterprise and so forth.

00:20:18   There's still, I'm sure, legitimate people out there doing jailbreaking,

00:20:20   but everything I read about it makes it sound like you don't know when you

00:20:24   download the jailbreak tools what they're going to do.

00:20:26   I was talking to a friend of the show, Craig Hockenberry at WWDC,

00:20:31   and we were laughing,

00:20:34   thinking, like, reminiscing about 10 years ago, when the iPhone was new, and how we

00:20:39   all jailbroke our phones.

00:20:40   We all jailbroke our phones before the first C4 conference,

00:20:47   because somebody had created the Lights Out game.

00:20:54   And it was a really nice game.

00:20:56   Like, it was just like a-- there was no Xcode for iOS yet.

00:21:01   I mean, these were some really smart people.

00:21:03   And there were no public tools at all to create iOS software.

00:21:10   And somehow--

00:21:10   Oh my god.

00:21:11   And Craig, eventually, during the jailbreak era,

00:21:15   got Twitterrific working.

00:21:18   So, like, I don't know, of course,

00:21:19   so of course I jailbroke my phone,

00:21:20   'cause then I had an iPhone with Twitterrific on it.

00:21:22   And this is before you could have apps.

00:21:24   I mean, of course I jail broke.

00:21:26   But it was, it just seemed, it was a lot easier too.

00:21:31   - Innocent days back when it was just us kids

00:21:34   playing around with the phone.

00:21:36   Sidebar, counselor: the EFF has just filed a lawsuit

00:21:41   about section 1201 of the DMCA,

00:21:44   which I'm sure you're aware of.

00:21:45   I saw the announcement.

00:21:46   Yeah.

00:21:46   Uh, so, you know, DMCA, Digital Millennium Copyright Act, which I think

00:21:50   is a largely unconstitutional piece of legislation that has never been fully

00:21:54   tested. If the Supreme Court ever got its hands on it, I gotta say, there's so many

00:21:58   things in there that, uh, privilege commercial speech in

00:22:03   the face of free speech. And whenever I've seen anything like it that's been decided

00:22:07   by high courts, you know, either appeals courts or up to the Supreme Court, it's usually,

00:22:11   even if I don't agree with the decision entirely,

00:22:13   it usually opens up the way for more speech,

00:22:16   meaning, more 'code is speech.'

00:22:17   It's sort of now been encoded at a certain level

00:22:20   of the court systems, even if it's not totally understood.

00:22:23   So one provision of the DMCA is this reverse engineering

00:22:26   thing related to digital rights management.

00:22:28   And if you put DRM on something as a manufacturer,

00:22:31   it's illegal and you can be sent to jail,

00:22:34   for, I'm sorry, not four years,

00:22:36   I forget, I think it's like five years, if you

00:22:39   reverse engineer it, even for yourself, like in the privacy of your own home. FBI

00:22:42   breaks in, you've been breaking DRM and not distributing it, you go to jail.

00:22:46   And, uh, there's a provision called Section 1201, which is the most hilarious thing

00:22:51   in the world: every three years, more or less, uh, the Librarian of Congress

00:22:56   holds hearings, not them personally.

00:22:58   And the last, you know, we just got a new one.

00:23:00   Isn't this awesome?

00:23:00   The Senate actually approved the confirmation. Uh, the new Librarian of

00:23:04   Congress understands technology, has run a library system.

00:23:07   She's the voice of the future.

00:23:09   This is going to be great.

00:23:10   The guy who's been in charge for decades has been kind of a know-nothing

00:23:13   Luddite, just terrible in terms of the technology side, great in terms of books.

00:23:17   Anyway, so the section 1201 hearings, it's a circus.

00:23:21   The process described in the law is terrible.

00:23:24   So the Librarian of Congress created a process where basically people who object

00:23:30   to limitations and want to get them removed temporarily,

00:23:36   only for a three-year period, have to essentially file something like a legal

00:23:39   brief, although it can be in more plain language explaining why there's a

00:23:42   legitimate public interest to be served in providing exemption. And then the

00:23:47   librarian, so they have this circus, they have hearings and people parade through

00:23:50   and they testify. And last time, I think, there were 47 different subgroup

00:23:55   items being presented. And it's, you know, farming, it's companies

00:24:00   like John Deere who have DRM on their tractors, automakers, video game makers,

00:24:05   printer companies, as well as the software and, you know, like iPhone

00:24:10   locking and the rest of it.

00:24:11   And, and then there's people who, you know, file these objections and there's

00:24:14   back and forth, and then the Library of Congress issues a set of rules about

00:24:18   what's going to be exempted, if anything, in the next three year period.

00:24:21   It's a ridiculous process.

00:24:22   So the EFF is suing, basically on the unconstitutionality of the, of this

00:24:28   provision, and if it were struck down or even minimized, it would dramatically

00:24:33   enhance the ability of people to do self-repair, which is, you know, Kyle

00:24:37   Wiens at iFixit, he's been a huge proponent and deeply active in this process.

00:24:41   And, um, you can read a lot of stuff about right to repair that

00:24:44   relates a lot to DRM these days.

00:24:46   Uh, it's a specific provision in the, the DMCA, not DCM.

00:24:54   D-M-C-A, yeah.

00:24:55   Digital, 'cause it was Millennium Copyright. Was it Sonny,

00:24:58   I can't remember if Sonny Bono was involved in that one, but maybe.

00:25:01   - Maybe. (laughs)

00:25:02   - We might have. - The Sonny Bono law also,

00:25:04   but that's a different one.

00:25:05   - Specifically, it more or less outlaws

00:25:08   reverse engineering how DRM works.

00:25:11   And reverse engineering is, you know,

00:25:13   here, you've got this thing, you own it.

00:25:16   Are you allowed to try to figure out how it works?

00:25:18   And that's, you know, I think that's been considered

00:25:21   part of, you know, I guess free speech,

00:25:25   but certainly seems like something that, you know,

00:25:28   I don't know, the engineer in me objects to,

00:25:31   We've got this magic thing called copyright

00:25:35   for our movies and music,

00:25:37   and it gets a special exemption for this

00:25:40   that nothing else has.

00:25:41   - Yeah, and I just learned something,

00:25:44   and that's terrible because it kills innovation.

00:25:46   How did everything interesting that's going on happen?

00:25:49   Some people point to giant corporate research labs.

00:25:52   So many interesting things we're doing in technology

00:25:54   came from people tinkering at little stuff

00:25:57   that they took apart, right?

00:25:58   I saw Kate McKinnon, who I love,

00:25:59   We're going to talk about Ghostbusters later, right?

00:26:01   We'll talk about Ghostbusters.

00:26:02   I have not seen the movie yet though.

00:26:03   We'll talk about the movie.

00:26:04   We'll talk, we're gonna talk about Twitter or whatever.

00:26:05   Okay.

00:26:06   Yeah.

00:26:06   Yeah.

00:26:06   I am one of the, my, one of the few people who still watches SNL.

00:26:09   I watch it and Lynn and I, my wife, we fast forward, we tape it.

00:26:12   We watch it like a few days later.

00:26:14   We fast forward through the bad stuff.

00:26:15   There are a bunch of really great performers.

00:26:18   A lot of people our age have given up on SNL long ago.

00:26:21   Like it's, you know, we're big SNL fans here.

00:26:24   Oh, good.

00:26:24   Yeah.

00:26:24   And it's hit or miss as you know, sometimes you're watching an entire

00:26:27   episode and you're like, I don't even know what happened.

00:26:29   And then another time you're crying for--

00:26:31   - It's always been like that.

00:26:33   It's always been like that.

00:26:34   - You're totally right.

00:26:35   I have this, I have the same memory of,

00:26:36   no, wasn't it always, no it wasn't.

00:26:38   So Kate McKinnon, you love her then 'cause you watch it.

00:26:40   She's an incredible mimic and she's great in "Ghostbusters."

00:26:43   She is such a, I've seen her interviewed.

00:26:45   I just think she is so great.

00:26:47   And we're at the beginning of her breakout part

00:26:49   of her career, like Kristen Wiig already had, right?

00:26:52   This is it.

00:26:52   So Kate McKinnon, I see this interview.

00:26:54   She's on the red carpet for "Ghostbusters."

00:26:55   And this tiny little girl,

00:26:57   who's supposed to be like eight or nine, is doing interviews.

00:27:00   And she asked Kate McKinnon a question and Kate

00:27:02   McKinnon looks so touched and in love and then

00:27:04   looks at her very seriously and gives her this

00:27:06   answer.

00:27:07   And the girl said, what was it like to work

00:27:09   around all this cool technology in the movie?

00:27:11   And Kate McKinnon said, when I was a kid, I

00:27:13   used to love to take things apart, radios,

00:27:15   things like that, look at the circuit boards and

00:27:16   whatever, and in this whole movie, I walk in and

00:27:17   everything is circuit boards.

00:27:19   And it's just, it was a dream.

00:27:21   It was like my childhood again, something like

00:27:23   that.

00:27:23   And I'm thinking that is the kind of thing

00:27:26   that kids today, they don't know,

00:27:28   they're not encouraged to take things apart.

00:27:30   They could actually be violating the law

00:27:32   if they were, you know, on the software side,

00:27:34   to circumvent things.

00:27:35   Literally, these kids could be violating federal law

00:27:38   for doing the stuff that you and I did,

00:27:40   and all, you know, millions and millions of other children.

00:27:41   - It is certainly the case

00:27:43   that the overwhelming majority of all people

00:27:45   have almost no curiosity about how things work.

00:27:47   - Which is fine, which is fine.

00:27:48   - Right, but for the minority of people

00:27:51   who are curious about how things work,

00:27:54   those also tend to be the sort of people

00:27:56   who create new things, you know?

00:27:58   And I mean, I think you could find that

00:28:00   with creative people and, like you said, entertainers.

00:28:03   Even entertainers, it's just like a mindset of,

00:28:06   I would like to take that apart.

00:28:08   (laughs)

00:28:10   It is a very-- - I saw her somewhere

00:28:11   saying something about refinishing tables too.

00:28:12   She's got a crafty aspect to her, she does.

00:28:15   - So my take on it is that I think it's,

00:28:20   I object philosophically to the idea

00:28:22   that you should outlaw being able to take things apart

00:28:24   and figure out how they work,

00:28:26   But on the flip side, I also think that the people

00:28:28   who make things have every right to make them

00:28:30   as difficult to take apart or as, you know,

00:28:33   like in the case with Apple and cell phone encryption,

00:28:38   that if Apple can figure out a way to mathematically

00:28:40   make the contents of a phone effectively unbreakable,

00:28:45   encryption wise, they have the right to do that.

00:28:48   And the NSA has a right to try to find the holes

00:28:51   in their logic.

00:28:52   - Oh, yeah, I don't think manufacturers should be obliged

00:28:55   to make it easy.

00:28:56   And some do that.

00:28:57   I was talking to a company that I can't reveal

00:28:58   that said, because of various regulations,

00:29:01   they're not allowed to promote the fact that

00:29:03   their product is modifiable because it would

00:29:05   actually put them in violation and put them in

00:29:07   a new regulatory framework.

00:29:08   However, they can make their product

00:29:11   modifiable, including the firmware,

00:29:13   and they just can't say anything about it.

00:29:15   So they are actually doing everything they can.

00:29:17   A great example is that I can talk about is

00:29:20   Chumby, which is Bunnie Huang, who's one of the

00:29:22   plaintiffs in the EFF suit that's coming up.

00:29:24   Uh, Bunnie, uh, lives in Singapore.

00:29:28   I think still lives in Singapore.

00:29:29   It goes to Shenzhen all the time.

00:29:31   He's a hands-on designer.

00:29:33   He's been designing an open laptop.

00:29:34   This is too fascinating.

00:29:35   It's not, there's a little ideology in it, but it's also an

00:29:38   incredible technical exercise.

00:29:39   He was one of the people behind Chumby, which was, uh, it was originally

00:29:42   this kind of soft alarm clock that you could make apps for like many years ago.

00:29:47   Yeah.

00:29:47   And it went through a lot of revisions.

00:29:48   It's still out there as a company.

00:29:50   So they left everything in such a state that when the product didn't succeed,

00:29:54   people could keep it alive and then a new company

00:29:57   came in to support it.

00:29:58   And then that company is now making new Chumby

00:29:59   stuff and running the servers.

00:30:01   Um, because enough was open and available.

00:30:04   I don't think the whole thing was open source

00:30:06   and forgetting all the details, but they had

00:30:07   left everything open enough.

00:30:09   And then I think when it was shutting down,

00:30:10   they opened it even further.

00:30:12   And I'm like, that was a wonderful thing.

00:30:14   So as a company, you could choose to do that,

00:30:15   but you can also choose to be, you know, rat

00:30:18   bastards or pursue security, let's say

00:30:20   it, whichever you want and, uh, not make it easy.

00:30:23   That's totally, I mean, I don't, I don't think being, there's a difference there.

00:30:26   Like should Apple allow third-party apps that are not sold from its store?

00:30:30   I should say non-App Store apps.

00:30:31   That's a whole interesting debate.

00:30:32   And there's an argument to be made that they should be required, even if they have

00:30:36   to put hurdles in and switches, and you have to agree, you're going to

00:30:39   void your warranty or whatever.

00:30:41   That's separate from should Apple let its firmware be hackable.

00:30:45   That's a very different situation.

00:30:46   They, they shouldn't pursue people who have broken into it.

00:30:51   That's where the DRM issue lies rather than they should make it easy, which is a sort of philosophical,

00:30:56   ideological situation.

00:30:58   Um, all right.

00:31:00   And that's where I disagree with the, what's his name?

00:31:03   Wiens from, uh, the, uh, MacFixIt.

00:31:06   What's the point?

00:31:07   Yeah.

00:31:07   iFixit.

00:31:08   iFixit.

00:31:08   Like the pentalobe screw thing was a big, that was an inflection point, right?

00:31:11   It's when Apple switched to pentalobe screws.

00:31:14   It was hard to get such a screwdriver then iFixit made it, which is great, but Kyle maintained, I think that, uh,

00:31:20   - Yeah, you're right.

00:31:21   Yeah, he said Apple is doing this to make it hard to repair.

00:31:25   Some people said maybe it's because it's easier

00:31:27   to do machine creation, or rather machine assembly.

00:31:31   That's, who knows?

00:31:32   But it definitely made it harder to repair.

00:31:34   And then, you know, pentalobe screwdrivers got made

00:31:36   and now it's possible.

00:31:37   But I agree that Apple probably does some things

00:31:40   to make it harder for third,

00:31:41   for consumers and third parties to make changes.

00:31:44   And other things, I think it's, they don't care.

00:31:47   They just engineer it

00:31:48   because they know they're gonna repair it

00:31:49   so they don't give a damn if it's hard,

00:31:50   'cause they'll take care of it.

00:31:51   - Right, I think that there's,

00:31:54   it's funny, 'cause that's a perfect example,

00:31:56   and I try to be like this as much as I can.

00:31:59   So I disagree with Kyle on his take on this,

00:32:03   but I'm intrigued by his argument,

00:32:05   and I don't, you know, I don't think that,

00:32:07   I don't say this guy's an idiot.

00:32:09   I disagree with him.

00:32:11   I do believe that he's wrong,

00:32:13   but I always do enjoy reading his pieces, arguing about it.

00:32:16   He's pushing, he pushes the envelope in a way that's good for everybody.

00:32:20   Even if you disagree with him, there's nothing wrong with the idea that Apple could roll back to

00:32:26   Phillips head screws.

00:32:27   Like that doesn't make things worse for anyone.

00:32:28   Even if you disagree with why they switched to pentalobe.

00:32:31   Um, I want to point out that, since we're in a sidebar, we're sort of sidebarring a sidebar, there's a great piece

00:32:35   about warranties in Motherboard last month.

00:32:38   I don't know if you saw this, "How Sony, Microsoft, and Other Gadget Makers Violate Federal Warranty Law."

00:32:42   I am going to send you a URL.

00:32:45   This is a great piece. I never knew this.

00:32:48   Most, all the things that say,

00:32:51   break the seal and you violate warranty

00:32:53   are actually either illegal or unenforceable.

00:32:56   I had no idea.

00:32:57   - I did not know that either.

00:32:58   - 1975 law called the Magnuson-Moss Warranty Act.

00:33:01   Federal law says you can open your electronics

00:33:03   without voiding the warranty,

00:33:04   regardless of what the language of that warranty says.

00:33:06   People should read this 'cause I had my mind blown.

00:33:09   - Every single hard drive I've ever purchased

00:33:10   has a sticker like that somewhere on it.

00:33:14   But I always thought that it was kind of reasonable

00:33:16   for like a hard drive because,

00:33:18   especially in the spinning disk era,

00:33:20   if you get to the point where you open up

00:33:22   and expose the disk and it picks up dust

00:33:25   and then doesn't work right,

00:33:27   well, why shouldn't your warranty be voided?

00:33:28   - Oh yeah, it's not that, but the thing is,

00:33:31   the warranty isn't de facto violated by breaking the seal,

00:33:35   and that's the thing.

00:33:36   Or even doing a repair, the article notes,

00:33:39   and they got some great liability lawyers,

00:33:42   lemon law lawyers, and what they talk about is the manufacturer

00:33:45   has to prove that whatever you did caused the failure.

00:33:48   - Ah, gotcha, that's fair.

00:33:49   - So they can do that.

00:33:50   They can say there's dust on the drive,

00:33:52   there wasn't dust originally,

00:33:53   it worked when you got it, so screw you.

00:33:55   - I see that, going back to Kyle Wiens,

00:33:58   I think his argument would be bolstered

00:34:01   if he stopped attributing malice to Apple

00:34:04   and simply stated why he thinks

00:34:07   these would be better devices with standard screws.

00:34:10   And I know one of his other bugaboos

00:34:12   is the use of glue.

00:34:13   - Oh yeah.

00:34:14   - Just make the argument that this would be a better device

00:34:19   for everybody if they stopped using glue

00:34:23   and stuff like that.

00:34:25   - Oh yeah. - Instead of saying

00:34:26   that they're doing this to make it hard to repair.

00:34:29   Apple does not care about the repair shops.

00:34:31   They don't.

00:34:32   They give no shits.

00:34:33   - It's tiny, yeah.

00:34:35   It's also, this is a case,

00:34:38   this is not a sidebar, I swear to God.

00:36:39   I'm a parentheses nester, as you know.

00:34:40   Um, but Walt Hickey wrote this great piece that I

00:34:43   think he's at 538, I think that's right about,

00:34:46   um, he's, I love his writing and he wrote this

00:34:48   thing about how IMDb, uh, movies, uh, movie scores

00:34:52   are, um, are sunk by male trolls because he did

00:34:57   the analysis and you can see that men more highly,

00:35:01   uh, downvote or give poor ratings to movies that

00:35:04   women like better, and women do not do the

00:35:07   same thing to films

00:35:08   that men like better.

00:35:10   So, you can look at films that more men have seen,

00:35:12   there's gender split, you know, and so forth.

00:35:13   And you can do that analysis and figure it out.
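The kind of gender-split comparison Glenn is describing can be sketched in a few lines. This is an editor's illustration with invented toy votes, not Walt Hickey's actual FiveThirtyEight data or methodology:

```python
# Toy illustration of a gender-split rating analysis (made-up numbers):
# compare mean ratings by voter gender for a single film.
from statistics import mean

# (voter_gender, rating) pairs for one hypothetical female-skewed film
votes = [("m", 3), ("m", 1), ("m", 4), ("f", 8), ("f", 9), ("f", 7)]

def mean_by_gender(votes):
    """Average rating per voter gender."""
    buckets = {}
    for gender, rating in votes:
        buckets.setdefault(gender, []).append(rating)
    return {g: mean(rs) for g, rs in buckets.items()}

split = mean_by_gender(votes)
print(split)                    # per-gender mean ratings
print(split["f"] - split["m"])  # the gap the male voters open up
```

Run over many films, a consistently large male-minus-female gap on female-skewed films, with no mirror-image gap the other way, is the asymmetry the article pointed at.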

00:35:15   And the article is really interesting, but the one

00:35:17   thing I disagree with him on is he was attributing malice.

00:35:20   He said effectively that men are trolling as

00:35:23   opposed to internalizing their own toxic masculinity

00:35:26   and voting stuff down because they thought no one

00:35:28   should see it, sort of the Ghostbusters thing.

00:35:29   Again, like, well, if you haven't seen the movie and

00:35:31   you're voting to three, you're kind of a troll, but

00:35:34   some people may have, may have seen it.

00:35:36   You don't know how many haven't seen it.

00:35:38   haven't seen it, how, have, and whatever.

00:35:40   I feel like the same thing a bit with people

00:35:41   who attribute things to Apple.

00:35:43   If you know, if you could pull it out and say,

00:35:47   I talked to, or an Apple engineer goes public and says,

00:35:50   I worked at Apple for 20 years,

00:35:52   and the things we did to screw repair firms,

00:35:55   then you could say, malice.

00:35:56   - My job was to make it difficult or impossible

00:35:59   to replace your screen at a third-party facility,

00:36:02   because Apple wants the $100 replacement fee.

00:36:06   I was the extra glue engineer at Apple,

00:36:08   continuing glue engineer.

00:36:09   - Before, this is good.

00:36:12   I think I can unwind.

00:36:13   I wanna go back to the security bug.

00:36:17   So all the way back to this responsible,

00:36:24   what is it called, responsible?

00:36:26   - It's like, I don't know, I don't know if this is right,

00:36:27   it's like responsible disclosure policy

00:36:29   is what I'd label it.

00:36:30   - So in some aspect though, it's not responsible

00:36:33   because the only things that are updated

00:36:36   are the very latest update to the OS.

00:36:38   I'm talking to you right now through an iMac

00:36:42   that has not been updated,

00:36:43   because I didn't feel like restarting my Mac

00:36:45   before we started the show.

00:36:46   - Right, right.

00:36:47   - So the Mac I'm talking to you on right now is not updated.

00:36:51   My iPhone is.

00:36:53   I thoroughly doubt that my son's or wife's iPhones are.

00:36:58   I'm pretty sure my iPad is not because I haven't,

00:37:01   oh no, it might because I'm running iOS 10 beta,

00:37:04   so I don't know.

00:37:05   But I have devices, I'm technically adept

00:37:08   and I'm in tune to the news.

00:37:10   And I have devices that aren't updated yet.

00:37:13   And let alone anybody who hasn't updated to El Capitan yet.

00:37:18   If you're running, you know, what was the one before?

00:37:26   - Yosemite, I have to keep looking it up.

00:37:29   - You know, you're still vulnerable.

00:37:34   And I don't know if Apple, you know,

00:37:35   nobody knows 'cause Apple won't say it,

00:37:36   but whether Apple's gonna do a security update

00:37:38   for those things either.

00:37:39   So in some sense, I often wonder

00:37:41   about disclosing these things, you know,

00:37:43   whether it's good or bad.

00:37:46   - Well, yeah, there's a couple of different aspects to it.

00:37:49   If there's a zero day, you know,

00:37:52   which just for the benefit of listeners who don't know,

00:37:53   zero day is an exploit that is known about before something is patched.

00:37:57   So the patch comes out and it's known

00:37:59   that it's in malware that's being used, right?

00:38:03   So it has to be patched immediately

00:38:05   because you have to protect people who are in active danger.

00:38:07   So this is not a zero day.

00:38:09   There's no known attacks in the wild.

00:38:11   I haven't seen anything in the last couple of days

00:38:12   that suggested anyone had managed to exploit this

00:38:15   in any broad way.

00:38:15   And we'd know that 'cause Mac users and iOS users,

00:38:18   if it had been sent out as general malware,

00:38:20   would have been reporting,

00:38:21   I've been hijacked or whatever,

00:38:23   because malware distributors are not subtle, typically.

00:38:26   There'd be ransomware.

00:38:27   I mean, the big thing right now is ransomware,

00:38:28   as you know, I've been writing about that recently.

00:38:30   I got a couple articles on it.

00:38:31   And anyway, so this isn't a zero day,

00:38:35   but this is the tricky part.

00:38:37   Like if you don't think that it's in the wild,

00:38:39   like you're working for a security company,

00:38:40   there's no reports of this coming out,

00:38:42   that you've been analyzing,

00:38:43   you've found it proactively in advance or prospectively.

00:38:47   And the patch has been made and distributed

00:38:50   so anyone can get it has it.

00:38:52   That removes the financial incentive

00:38:53   for any criminal organization to try to find an exploit

00:38:57   because the window is closing so fast.

00:38:59   Like, you know, 25% of it's closed in the first few hours

00:39:03   and Apple pushes updates so heavily

00:39:06   and makes it hard to ignore them,

00:39:07   especially in iOS and other platforms

00:39:10   that the odds of having a target are very small.

00:39:12   So you spend weeks or months developing

00:39:15   the exploit technology to deliver the correct payload.

00:39:18   And by then 93% of people have updated

00:39:21   and you have, you know, a couple hundred thousand people

00:39:22   you have to reach and even spam and phishing

00:39:25   doesn't make it worthwhile to send out the messages

00:39:27   to reach a fraction of them.

00:39:29   So the economics of it are bad when it's not a zero day.

00:39:31   When it's a zero day, you disclose because it's so dangerous

00:39:34   and you want everyone to patch right away.

00:39:36   And the economics there change instantly too,

00:39:38   but there's exploits out there that will be tried

00:39:40   to be put out as fast as they can before people patch.

00:39:44   - Let me take a break here and thank our first sponsor.

00:39:46   I love, this is a brand new sponsor and it is a great app.

00:39:51   It's called Boom.

00:39:52   It's a Mac app from Global Delight.

00:39:54   Have you ever wished that the audio playing on

00:39:56   your Mac could be richer, crisper, and just better. If you like listening to music, movies, videos

00:40:02   through your Mac or any other audio, then you may have searched for ways to boost the volume on your

00:40:08   Mac but haven't found anything yet. Well, that's what Boom does. It's an amazing audio enhancer for

00:40:12   the Mac. It is simple. It's gorgeous. It has great looking UI. These guys at Global Delight have always

00:40:19   done really, really high-end, you know, classic indie Mac developer attention to detail,

00:40:26   and all the icons look great and everything like that. So it's just a volume booster. That's it.

00:40:33   Just it works on a system-wide level, so you don't have to install it like inside apps. It's not a

00:40:39   plug-in or anything like that. And anything you play on your Mac suddenly sounds amplified.

00:40:45   It works with headphones, it works with speakers,

00:40:48   and the best part, it's a MacWorld Best of Show winner.

00:40:52   This is an app that has been renowned.

00:40:54   It really, it sounds almost like snake oil,

00:40:57   but it really does work.

00:40:58   It's won the MacWorld Best of Show

00:41:00   back when MacWorld was a show.

00:41:02   And here's the best part.

00:41:04   Anyway, I keep talking in circles here,

00:41:06   but the best part is right now, for a very limited period,

00:41:09   I don't quite know how limited the period is,

00:41:11   so if you hear it, you better go get it now,

00:41:12   It is 33% off in their store.

00:41:15   It's just $9.99.

00:41:17   It's usually $15.

00:41:19   Where do you go?

00:41:20   Here's their URL.

00:41:21   It's a Bitly URL.

00:41:22   So go to bit.ly/boom2mac.

00:41:28   That's the digit 2.

00:41:30   Bitly/boom2mac.

00:41:34   And you will get more info.

00:41:36   They have a seven-day free trial.

00:41:37   Can you believe that?

00:41:38   Seven-day free trial.

00:41:39   What a-- it's like in the world of the App Store,

00:41:42   it's like you forget about free trials.

00:41:43   Well, guess what?

00:41:44   That's why the Mac is awesome.

00:41:46   Seven day free trial.

00:41:47   Try it, listen to it, see that you like it,

00:41:51   and then you can get it for 33% off.

00:41:53   That's boom two, that's the reason the two is in the URL.

00:41:57   This is version two of the app.

00:41:58   Boom two, 33% off.

00:42:00   Bitly/boom2mac.

00:42:02   It's a great app.

00:42:03   We were talking before about the sort of,

00:42:12   I don't know what you'd call it,

00:42:13   but the idea that Mac users are insufferably smug

00:42:17   on security issues.

00:42:18   - So true.

00:42:20   - And that's another one that's sort of like

00:42:21   a pet issue of mine, is the incessant,

00:42:26   inevitable need to boil everything down to a binary.

00:42:32   It's either this or that.

00:42:35   Either the Mac is completely invulnerable

00:42:38   to malware and security exploits,

00:42:42   or the Mac is every bit as vulnerable and exploitable

00:42:47   as every other system that's out there,

00:42:50   when the truth is in between.

00:42:53   It is nuanced.

00:42:54   You know, and it was a lot, you still see it,

00:42:57   and part of it is that iOS is so spectacularly popular

00:43:02   and such a lucrative target.

00:43:05   But in the old days, pre-iPhone,

00:43:08   it was always, almost always boiled down to an argument

00:43:13   that the Mac, you know, Windows has all this malware

00:43:16   problems, all these issues and all this, you know,

00:43:20   and I don't even think it was snake oil.

00:43:22   I think it was a reasonable thing that, you know,

00:43:24   informed users would agree with,

00:43:26   was that it was considered a standard practice

00:43:31   to install antivirus on your Windows PC.

00:43:35   That if you didn't, you were a fool

00:43:37   and you were probably gonna get exploited.

00:43:38   - Oh yeah, remember when you'd install

00:43:40   a virtual Windows machine on your Mac in Parallels

00:43:42   and then you'd launch it and it would get infected

00:43:44   before you could install the antivirus software?

00:43:46   That happened to me.

00:43:47   - It's no exaggeration.

00:43:50   And then Mac users, informed Mac users like myself

00:43:55   would say, "Well, I don't run any antivirus on my Mac."

00:44:00   And in fact, I don't recommend, my family members do,

00:44:04   and I don't think you should either.

00:44:05   And I don't think you need to.

00:44:08   And then they would say, well, you're an idiot,

00:44:09   because the only reason the Mac doesn't get exploited

00:44:12   is that it's every bit as vulnerable as Windows,

00:44:14   but it's too small for the malware people to care about.

00:44:18   And it's like, you can't disprove that.

00:44:20   That's one of those, it's like,

00:44:21   I don't know what the rhetorical description of that is,

00:44:24   but it's sort of like a straw man argument.

00:44:26   You can't knock it down.

00:44:27   You can't disprove a hypothetical like that, right?

00:44:32   There's no way to prove it otherwise,

00:44:33   except if the Mac got as popular as Windows.

00:44:37   And that sort of happened with iOS, right?

00:44:42   iOS has hundreds of millions,

00:44:45   or I guess a billion active devices.

00:44:47   There might be, now that I think about it,

00:44:50   there might be more iOS devices in use than Windows devices.

00:44:53   I don't know, that might be ridiculous.

00:44:55   - I think there's more Windows,

00:44:55   because Windows things are never,

00:44:57   there's like people running Windows 95 stuff someplace.

00:45:00   - It's in the ballpark though.

00:45:01   probably running air traffic control or something.

00:45:04   - It is in the ballpark.

00:45:05   - Yeah, oh yeah.

00:45:07   - And it didn't happen.

00:45:09   I mean, now there is malware for Mac.

00:45:11   There is malware that attacks iOS,

00:45:13   but it's never been as rampant a problem

00:45:15   as it has been on other platforms.

00:45:18   - Yeah, it's, I've always said,

00:45:21   shouldn't say always, but I've said increasingly,

00:45:23   let's say this time is going by,

00:45:24   I said increasingly like Apple was a weird target

00:45:26   and we had, some of the first viruses

00:45:28   to come to computers, the first widespread worms and things,

00:45:31   were through Macs, right?

00:45:32   We had the one that you stuck a disc in,

00:45:34   it would write it to the floppy disk, I think.

00:45:35   - Yes, I have in my notes here,

00:45:38   do you remember what, do you remember?

00:45:40   There was a time. - It's not gonna be.

00:45:41   - There was a time when I did run antivirus on my Mac

00:45:45   and I did recommend everybody do it.

00:45:47   It was called Disinfectant.

00:45:48   - Yeah, oh, I love that.

00:45:49   Yeah, those guys, yes. - Awesome.

00:45:51   Oh no, it was one guy.

00:45:52   It was like a guy up in Seattle, right?

00:45:54   - Oh, you're so right, 'cause did he charge for it?

00:45:57   - No, it was free. - It was free.

00:45:58   - What was his name? - It was great.

00:45:59   It was as good as the commercial solutions for a long time.

00:46:01   - John Norstad.

00:46:04   - Oh, wow, that's good memory.

00:46:05   I think that's right.

00:46:06   A very nice guy, if I recall, too.

00:46:08   - Really nice guy.

00:46:09   I mean, like, just amazing that it was just,

00:46:14   and it was this free utility you did,

00:46:17   like an init, you would, Norstad, John Norstad.

00:46:21   - I miss inits, sort of.

00:46:23   - Here's his homepage, John Norstad's homepage.

00:46:26   There he is.

00:46:27   - Yay.

00:46:27   - I will put this in the show notes.

00:46:29   John Norstad's homepage.

00:46:30   It was amazing.

00:46:31   It was an init, you ran it,

00:46:34   it seemingly didn't slow your Mac down at all.

00:46:38   It had no adverse effects,

00:46:40   and it was updated on a regular basis

00:46:43   with all the new viruses that were spread around,

00:46:47   and it would identify them and block them.

00:46:49   But yeah, that was like an insidious one.

00:46:50   I remember Drexel was absolutely hit by it.

00:46:53   It was so insidious.

00:46:57   it was a virus that would spread just by inserting a floppy disk into an already infected machine.

00:47:01   Great piece of engineering for the day.

00:47:03   Yeah.

00:47:04   But yeah, so here's my thing.

00:47:05   So Apple didn't necessarily, they're not a,

00:47:08   Apple was never necessarily ahead on any innovation like address space layout randomization, ASLR,

00:47:13   which is a great technique used so that you can't predictably as a malware developer

00:47:18   know where part of the system is going to be located in memory necessarily.

00:47:22   And the more ASLR you do across everything, the harder it is to

00:47:25   target a memory location and have something happen.
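The randomization Glenn describes is easy to observe from user space. Here's a small editor's sketch, assuming a Unix-like system where `ctypes.CDLL(None)` exposes the process's libc symbols:

```python
# Observe ASLR: each fresh process reports where libc's malloc landed.
# With ASLR enabled, the address typically differs from run to run,
# which is exactly what denies malware a predictable jump target.
import subprocess
import sys

SNIPPET = (
    "import ctypes;"
    "libc = ctypes.CDLL(None);"  # symbols of the current process (Unix)
    "print(ctypes.cast(libc.malloc, ctypes.c_void_p).value)"
)

def malloc_address() -> int:
    """Spawn a fresh interpreter and return malloc's load address."""
    out = subprocess.run([sys.executable, "-c", SNIPPET],
                         capture_output=True, text=True, check=True)
    return int(out.stdout.strip())

print(hex(malloc_address()), hex(malloc_address()))  # usually differ
```

If the two printed addresses match, ASLR is likely disabled (or the platform doesn't randomize library bases per process).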

00:47:27   And so that, you know, that Windows, I want to say

00:47:30   Microsoft introduced that years before and they

00:47:32   needed to before OS X.

00:47:34   It was much more necessary.

00:47:35   But Apple wasn't necessarily an innovator in that.

00:47:37   What they were an innovator in was accidental things

00:47:40   that worked, right?

00:47:41   So, Apple going to free updates for its operating

00:47:45   system, you know, first super cheap and then

00:47:48   completely free and all the free incremental ones.

00:47:50   And the way in which you could do an incremental

00:47:52   update in OS X, typically without destroying your system.

00:47:56   I know there were some bad releases and I had problems over the years,

00:47:59   but they created an environment in which people were expected to run updates and typically stay up to date.

00:48:04   You might have been, you know, using 10.6.8, not 10.7 for a while, but then you'd switch over.

00:48:09   And when you looked at over the years, as you looked at the adoption curve,

00:48:13   it's ridiculous compared to any other operating system.

00:48:16   These people move up so fast, and software developers, of course, have the issue of compatibility and all that.

00:48:21   So the, uh, the long tail of older versions by number of people, uh,

00:48:27   number of computers rather, running older versions of OS X, is very small

00:48:31   compared to most other things.

00:48:33   You look at Android, too, Android, and there's this incredible distribution

00:48:36   because Android devices, when they were sold originally, the first

00:48:40   generations, as opposed to iOS, uh, it was very difficult or impossible

00:48:45   to get beyond the version, even the subversion, like 2.2,

00:48:49   that you had installed.

00:48:50   That gives you a great target,

00:48:52   'cause you know there's going to be 100 million devices

00:48:54   out there running Android 2.2

00:48:56   for as long as the devices work.

00:48:58   So you have that target to attack forever.

00:49:01   Android, as opposed to OS X or iOS,

00:49:04   didn't have a good pathway directly to users.

00:49:06   I mean, Windows has this, of course,

00:49:08   as a direct pathway,

00:49:09   and eventually it helped to provide security updates.

00:49:11   So Android users are sort of abandoned,

00:49:14   and Google's been working on this for years

00:49:15   to create an effective way to have a rapid turnaround

00:49:18   for certain kinds of security issues.

00:49:20   Switching to apps is one way.

00:49:22   One of my colleagues over at Greenbot

00:49:24   in the IDG family there, Flo, was telling me the other day

00:49:27   that like Google having a messaging app,

00:49:30   Hangouts and Messenger, I forgot they call it Messenger.

00:49:33   They can update the app.

00:49:34   If there's a security problem in the app,

00:49:35   the app can take care of it,

00:49:37   but they can't update the entire system

00:49:38   'cause someone's running an outdated or unpatchable phone.

00:49:41   So Apple, I think the upgrade cycle

00:49:44   always reduces the target of potential infections,

00:49:47   whether it's with iOS, which is a monolithic,

00:49:49   you know, ecosystem as opposed to Android,

00:49:52   or it's with OS X where they push stuff out.

00:49:55   And it seems like a pretty fast way

00:49:58   to get people up to the next version.

00:49:59   So there's never a lot of people

00:50:01   that you can easily target.

00:50:02   And with Windows, you have people running,

00:50:04   hundreds of millions of people

00:50:05   running pre-Windows 10 versions.

00:50:07   You have hundreds of millions of Android users,

00:50:09   maybe several hundred million

00:50:11   running pre-Marshmallow 6.x versions, right?

00:50:14   So those targets are so lovely.

00:50:17   If you're a malware developer and you're a criminal and you're trying to hit the biggest target, why would you do it for iOS or OS X?

00:50:24   If it's going to be hard or there aren't enough people, it's, again,

00:50:28   if you're doing that shotgun approach of, I want to do the least amount of work and get my scummy little piece of malware on as many devices as possible.

00:50:38   I'm going to send out a billion phishing messages. And I know that of that, I'm going to catch, like, a hundred million Android 2.2

00:50:44   users, or their owners, rather.

00:50:45   It's easy, relatively easy.

00:50:46   Whereas if you're in the business of targeting specific

00:50:51   individuals at the behest of a government agency, it's

00:50:54   totally different.

00:50:54   Yeah.

00:50:55   I mean, look at the threat, you know, the Hacking Team leak that revealed how many

00:50:57   different kinds of attacks there were against, uh, Macs and iOS.

00:51:00   So, and it's a different category.

00:51:02   It's something to be concerned about, but it's also things like Apple, uh,

00:51:05   didn't have a native mail program for the longest time.

00:51:07   Remember, and then they come out with Mail, and Mail's kind of crappy, but when

00:51:10   Apple came out with its own mail program for OS X,

00:51:12   Um, there was already diversity, so you

00:51:14   couldn't, as a malware author, target a

00:51:16   specific popular mail program's problems.

00:51:20   The mail programs weren't deeply integrated,

00:51:22   like they were in Windows.

00:51:23   So you couldn't cause, like, JavaScript to run

00:51:25   in an attached message without someone clicking

00:51:27   on it in Eudora or whatever.

00:51:29   Right.

00:51:29   And then when mail came out, Apple already

00:51:31   knew this was a problem.

00:51:32   So they engineered something that was less

00:51:34   embedded and more separate.

00:51:36   So you could still have stuff that would go

00:51:37   wrong, but it was

00:51:40   a more limited set of activities than what happened

00:51:42   with Outlook and its integration.

00:51:44   I mean, Microsoft's basically spent the last 20 years

00:51:46   pulling out the hooks it built deeply into its systems

00:51:49   that allowed things to happen

00:51:50   when they should have been sandboxed.

00:51:52   - We're so old that I specifically, you say that

00:51:55   and it sounds so old.

00:51:58   - That's right, you remember when they did.

- Macs shipped without an email app.

00:52:02   And I had to double check.

00:52:04   I knew that Apple didn't make one, but I was like, wait,

00:52:06   Did they maybe ship like Netscape's weirdo email

00:52:10   or something?

00:52:11   I was like, no, they didn't ship any of them.

00:52:12   They didn't even ship the one I used.

00:52:13   I was a Eudora user.

00:52:16   - What came after that?

00:52:17   I mean, it was Outlook.

00:52:18   I think I used Outlook for a while in the Office Suite.

00:52:21   I don't know if that was part of, was that part of the deal?

00:52:23   Did Jobs and Gates agree as part of the Microsoft investment

00:52:28   that Apple wasn't going to release its own mail software?

00:52:31   - No, no.

00:52:32   - Okay, because Outlook existed.

00:52:34   So we put Office or they didn't install Eudora

00:52:35   And there were a bunch of other mail programs.

00:52:37   There's still a remarkable number of email programs

00:52:40   available and new ones being developed all the time.

00:52:42   - I don't wanna resort to Google here.

00:52:45   I want either me or you to remember it.

00:52:47   What was the one Jud Spencer's team created?

00:52:51   And they went on to Microsoft

00:52:53   and created the good version of Outlook.

00:52:56   - And it wasn't Note, what was it called?

00:52:58   I don't think I ever used that.

00:53:01   - Guy Kawasaki was a big--

- MailMate?

00:53:04   - No, no.

00:53:05   It was beloved by some people.

00:53:09   - I don't, I don't know.

00:53:10   - Was it a Claris product at one point even?

00:53:12   - Claris had an email product, didn't it?

00:53:14   I don't, you know, I'll tell you, I use Mailsmith today.

00:53:18   I actually wrote a--

00:53:19   - You still do?

00:53:20   - I wrote a pro, I wrote an article for Macworld

00:53:22   recently called Old Software That We All Still Use,

00:53:24   and I got so many lovely comments from people

00:53:26   chipping in with these.

00:53:27   I used Quicken 2007.

00:53:29   I used Mailsmith, which is updated for compatibility.

00:53:32   I use CSSEdit from MacRabbit,

00:53:34   which isn't developed anymore,

00:53:35   for doing CSS tweaking on live sites.

00:53:38   I have this whole set of old software

00:53:40   that's still either getting tiny compatibility updates

00:53:44   or manages to work under the current environment,

00:53:46   and I'll cry when it stops.

00:53:48   And people chimed in with all this software that they use

00:53:50   that's sometimes like 10 plus years old

00:53:53   that they've just been, you know, Levelator.

00:53:55   Levelator, they upgraded it, very nice

00:53:58   of the people involved, to be compatible with El Capitan,

00:54:02   but Levelator I don't think was really changed

00:54:04   for several years.

00:54:05   It's a vital piece of podcast

00:54:07   audio normalization and equalization software.

00:54:10   - All right, we hold the thought.

00:54:12   I did cheat, I Googled

and I went to Jud Spencer's LinkedIn page.

00:54:18   It was Fog City Software.

00:54:20   - Fog, yes.

- And the product was called Emailer.

00:54:24   And then it was purchased by Apple in March, 1996

00:54:27   and became Claris Emailer.

00:54:29   So Apple owned it 'cause Apple owns,

00:54:31   or I guess Claris doesn't exist anymore.

00:54:33   And I guess it's FileMaker.

- It was like spun out and some of it was spun in.

00:54:37   - Apple owned an email client,

00:54:38   but didn't pre-install it on Mac.

00:54:40   - Oh my God.

00:54:41   - Because, and the whole point was that until the,

00:54:46   again, I've long said that to me,

00:54:50   there's really only two eras at Apple.

00:54:53   There was the original era,

00:54:56   and then starting with the NeXT reunification,

00:54:59   that's like modern Apple.

00:55:00   Modern Apple started with the NeXT reunification

00:55:04   and Jobs coming back.

00:55:06   And there's so many things that were different,

00:55:08   but in the original Apple, it was explicit.

00:55:12   It wasn't even implicit.

00:55:14   I think Apple was explicit about it at times

00:55:17   that they didn't want to compete with third party developers.

00:55:22   And so the Mac, it would have,

00:55:26   it had like, what was it called back then?

SimpleText, TeachText, which was the--

00:55:31   - Ah, I can't remember now.

I think it's, I remember SimpleText,

00:55:34   but I think that was later, I think you're right.

- TeachText was like the built-in read-me reader,

00:55:39   but super minimal.

00:55:41   I mean, way more minimally featured

00:55:44   even than TextEdit today.

00:55:46   - I'm sorry, just think about TextEdit,

'cause TextEdit actually has its roots, right, in NeXT.

00:55:51   - Yes, oh, deeply, yeah. - I mean, that is basically,

00:55:53   and TextEdit is a great piece of software no one uses,

and I wrote a Macworld piece about how great it is.

00:55:58   There are some features in it that are invaluable

00:56:00   that you basically cannot easily get

00:56:01   in any other piece of software.

00:56:02   - People do use it.

00:56:04   There are, it's sort of like a secret cult of people

00:56:06   who love, and rightly so,

TextEdit, including people at Apple.

00:56:11   I remember I had a meeting at Apple over a decade ago

00:56:14   when I was at Joyent.

00:56:15   It had nothing to do with Daring Fireball.

00:56:16   It was, you know, Apple wanted to meet with us

00:56:19   and talk about Joyent's technology and blah, blah, blah.

00:56:22   And I noticed that it was the first time, actually,

I met Michael Lopp in person.

- Oh yeah, yeah. - Of Rands in Repose.

00:56:29   He was there, he was an Apple software manager at the time.

00:56:33   And so we didn't know each other, we weren't friends yet,

00:56:36   but we were online, had a little bit of back and forth

simply as he knew Daring Fireball, I knew Rands in Repose.

00:56:43   He comes into the meeting and of course,

00:56:45   he's a very, he's such a minimalist.

00:56:48   He opens up his, I don't know if it was,

00:56:50   it might even have been a PowerBook at the time,

00:56:51   but whatever, MacBook, PowerBook.

00:56:53   Of course there's nothing on screen,

00:56:55   It's just a beautiful desktop picture.

00:56:57   He launches TextEdit and he's got one window on screen

00:57:00   and it's TextEdit and that's the app he used

00:57:02   to take notes for the meeting.

00:57:03   And I was like kind of blown away,

00:57:05   but also not surprised at all.

00:57:07   And he was like, oh, of course.

00:57:08   'Cause he said, you know why I use this app?

00:57:10   I use this app because it is super simple

00:57:12   and it has never once crashed on me.

00:57:14   I've never lost a single letter of anything I've ever typed

00:57:18   in TextEdit.

00:57:19   But anyway, Apple--

00:57:21   - I think this is still the case in Japan, I don't know.

00:57:23   But in Japan, it used to be the less on your business card,

00:57:25   the more important you were.

00:57:26   And then, at least in the '80s,

00:57:27   I remember seeing a cartoon about this

00:57:28   and reading about this, and you know,

when you hand someone a business card in Japan,

00:57:31   you hold it with two hands, and you hand it to them.

00:57:33   It's just, you know, it's got a little bit

00:57:34   of a ceremony about it, used to.

00:57:35   I have no idea what people do today.

00:57:37   And there was a comic strip at the time

by an alt cartoonist, about these two people

00:57:41   competing for a job in Japan and striving to be whatever,

00:57:44   and one of them has a dream, he says,

00:57:45   "I dreamt I met God, and he handed me his business card,

00:57:48   "and it was completely blank."

00:57:49   That's what I'm thinking about

when you're talking about Michael Lopp and the screen.

00:57:53   - So Apple had an email product,

00:57:54   and they didn't even pre-install it, I don't think.

00:57:56   Maybe they did at some point.

00:57:58   - Yeah, so nothing bundled.

00:58:00   They kept their hands off.

00:58:01   And Microsoft integration, bundling,

00:58:03   tying, monopoly-ish tendencies,

00:58:06   always keeping this market locked in for themselves,

00:58:09   whether it's business software, productivity suites,

00:58:12   email, browser, that was their security downfall.

00:58:15   And I think they spent a lot of time backing away from that.

00:58:18   And okay, oh, so I just did this story.

00:58:20   I don't know if this is a sidebar.

00:58:21   We're still talking viruses.

00:58:22   I just did a piece that should be out

00:58:24   by the time this airs, for MIT Technology Review

00:58:26   about some new research, it's been public for months,

00:58:29   a couple research teams at different universities

00:58:32   came up with strategies for fighting ransomware on Windows.

00:58:34   And it was very interesting, and I talked to a bunch,

00:58:36   I talked to McAfee and other folks, and the researchers,

00:58:40   and the thing that's hilarious,

00:58:41   they say, "Well, how do you stop ransomware?

00:58:42   "What do you do?"

00:58:43   And they're like, "Well, you keep your software up to date,

00:58:45   "you install, you run the latest patches,

00:58:47   "you don't run Java, you don't run Flash."

00:58:49   I'm like, "Wait, what about virus software?"

00:58:50   They're like, "That's sort of the last stage.

00:58:52   Like ransomware and most malware now just targets this incredibly low hanging fruit

00:58:57   of which there still remains so much.

00:58:59   Like, so one of the people I spoke to said something like 50% of machines you can

00:59:03   just get into because they're just not protected in any way.

00:59:07   Forget antivirus software.

00:59:09   Um, what's interesting is the developers of these, uh, these

00:59:13   academics, rather; the two different groups took different approaches, but the fact

00:59:16   is ransomware works on user-space files.

00:59:19   So it's actually insidious.

00:59:21   You don't have to gain deep permissions.

00:59:22   Once the payload is dropped and it runs, you know, a lot of ransomware is like scripts,

00:59:26   like PHP or JavaScript, and you double click a Trojan horse that's delivered via email,

00:59:32   and it just starts encrypting files because it doesn't need extra permission.

00:59:35   They're your files, so it's only doing documents, but there's a lot of

00:59:39   tell-tales; there's entropy and all kinds of other stuff they can monitor.
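That entropy tell-tale is easy to make concrete: freshly encrypted bytes look uniformly random, so their Shannon entropy sits near the 8-bits-per-byte maximum, while ordinary documents sit well below it. A minimal sketch follows; the 7.5 threshold is an illustrative guess, not a figure from any shipping product.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: 0.0 for constant data, up to 8.0 for uniform random."""
    if not data:
        return 0.0
    n = len(data)
    # Sum -p * log2(p) over the observed byte frequencies.
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """Crude ransomware tell-tale: a rewritten document whose bytes are
    suddenly near-maximum entropy was probably just encrypted."""
    return shannon_entropy(data) > threshold
```

A monitor built on this idea would compare entropy before and after a process rewrites many documents in a row, which is roughly the behavior the researchers flag.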

00:59:42   So the approaches are really cool, but I was like, oh, that's one reason ransomware has

00:59:46   a money thing at the end, which is very straightforward.

00:59:50   And there's like 6 million unique variants of ransomware out there now,

00:59:54   because the modifications of a bunch of like base families of ransomware,

00:59:58   so many different people are doing it because the money is so easy.

01:00:01   But fundamentally, it has that great advantage.

01:00:04   It's not trying to get into your kernel and do something.

01:00:06   It's not trying to hijack your networking.

01:00:08   It's just trying to take your Word documents and make them unreadable.

01:00:11   Right. It's not really fighting the system.

01:00:13   It's actually going with the flow of the system.

01:00:15   Oh, you're something the user double-clicked,

01:00:17   you have access to all of the user's files.

01:00:19   Yeah, exactly. It's in your doc, you know, my documents, boom, you're done.

01:00:21   Well, it's exactly the sort of thing, it's exactly the sort of reason that Apple, you know,

01:00:26   sandboxed iOS from the start, and, for all the technical problems it causes for honest apps,

01:00:35   why they're so bent toward sandboxing on Mac as well.

01:00:41   Yeah, yeah. And I hear that. There's a technique I learned about that I didn't know:

01:00:45   micro-virtualization is coming. It's a step beyond sandboxing.

01:00:48   Every app runs essentially in its own tiny virtual machine,

01:00:52   which sounds crazy, but Bromium,

01:00:54   B-R-O-M-I-U-M, is one of the companies in the space. I think, uh, was it, uh,

01:00:59   F-Secure maybe has a product. Um,

01:01:01   it's the new thing because a lot of business users are basically only running a

01:01:05   handful of apps.

So running them in virtual environments is transparent to you as a user,

01:01:09   but essentially it's like super sandboxing. Right. It's wild.

01:01:15   Did you see the story I linked to it?

01:01:17   I guess I'll put it in the show notes.

01:01:19   I linked to it earlier this week, or maybe last week,

01:01:21   where there was a variant of ransomware that doesn't actually--

after you pay them, it doesn't give you your files back.

01:01:32   So for anybody who doesn't know, the way ransomware works,

01:01:35   your machine gets hit by ransomware.

01:01:37   The ransomware malware starts running,

01:01:39   and it starts encrypting your files.

01:01:41   And then all of a sudden, you notice it

01:01:43   when you go to open one of your documents and it gives you a dialogue and it says you've been,

01:01:49   you know, you've been hit by ransomware, all of your files are encrypted and they really are

01:01:54   encrypted. And so if you, like, go try to open it in, you know, a text editor or something,

01:01:59   it's just going to be, you know, garbled binary stuff because it's encrypted. Right. And the key

01:02:04   is sent via command and control system. That's the one. It does need a little network access. So

01:02:07   the key is not stored on the device. So you can't just extract the keys. Yeah.

01:02:11   So there's some clever use of encryption there.
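What that clever use of encryption looks like in miniature: the file key exists only in the malware's memory and at the command-and-control server, never on the victim's disk, so forensics on the machine alone can't recover it. This is a deliberately toy model; the XOR "cipher" and the function names are stand-ins for illustration, not real ransomware internals.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for a real cipher: XOR with a repeating key.
    XOR is symmetric, so the same call also decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def simulate_attack(documents: dict[str, bytes]):
    """Encrypt the victim's documents with a key generated in memory.
    Returns what stays on disk and what leaves for the C2 server."""
    key = secrets.token_bytes(32)  # exists only in memory, never written out
    on_disk = {name: xor_cipher(d, key) for name, d in documents.items()}
    sent_to_c2 = key               # the only surviving copy of the key
    return on_disk, sent_to_c2
```

Only whoever holds `sent_to_c2` can run `xor_cipher(ciphertext, key)` to get the plaintext back, which is exactly the leverage the ransom demand depends on.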

01:02:15   And then if you give them money by following steps X, Y, and Z,

01:02:19   who knows, maybe they want Bitcoin.

01:02:21   You have to go buy Bitcoin and then give them the Bitcoin,

01:02:23   or they just want your credit card number or whatever.

01:02:27   Somehow you've got to get the money.

01:02:28   And then you give them money, and then they really do decrypt.

01:02:31   They give you the key and you get your files back.

01:02:34   And there have been cases, high profile cases.

01:02:36   I remember there was a hospital, I think in Los Angeles,

01:02:39   a hospital that got hit by a ransomware.

They spent $17,000, I think. Yeah, to get it unencrypted. They had to. But, you know, there's

sort of a, uh, you know, it's like an old, almost like a political thing, you know, like the

United States government doesn't negotiate with terrorists. And if that's your policy, if you just

state that as your policy up front, we don't negotiate with terrorists, it hopefully acts

as a deterrent to terrorists who would, you know, take people hostage,

01:03:08   because now it's not really true that we don't negotiate with them, but it's the policy. And you

01:03:12   can see the logic of that. And you can see the logic of, "Well, you should never pay these people

for this." But at a certain point, if it was critical enough

information, it might be worth it for you to pay whatever price they're asking,

however distasteful it is to actually give in and give these literal criminals money;

the data that's been encrypted might be more valuable to you than the ransom.

01:03:37   Now, the funny thing is that there's a group,

01:03:42   there's a group that you give them the money

01:03:46   and I'm laughing, but it's terrible

01:03:48   because it's even worse, obviously.

01:03:50   But the thing that makes me laugh about it

01:03:54   is that all of the honest ransomware,

01:03:58   and it's so funny to say that, the honest crooks,

01:04:02   have gotta be furious about this

01:04:04   because if word spreads that even if you pay,

you don't get your files back, it's going to make people less likely to pay.

01:04:09   I have so many things to say about this.

01:04:11   Can I say a few things?

01:04:12   I have so many, I spent several days working on this recently.

01:04:15   So, okay.

01:04:16   So you know how ship ransoming works like piracy, Somali pirates, right?

Planet Money, Planet Money did a great piece about this that talked to one of the

01:04:23   people who does negotiation with pirates to pay their ransom, right?

01:04:28   If you start killing the hostages, piracy stops working.

01:04:32   So it's actually typically not dangerous to be taken captive by pirates, Somalian or whomever,

01:04:40   because it is entirely in the interest of the economic system to negotiate a reasonable fee,

01:04:46   make sure all the hostages are unharmed. They will even sometimes release people for medical

01:04:49   or compassionate reasons. Like it's handled like a business. Then you have the situation where,

01:04:54   I can't remember how long ago this was now, a couple years, where some pirates started killing

people. And then it was like, okay, and all the navies in the world went, screw this.

01:05:01   They start steaming Navy ships in and sort of clean up the problem, which has been a

01:05:05   commercial problem and is now, you know, a human rights one, right?

01:05:09   And I'm thinking the idiots who think it's funny to delete the files or they're too incompetent,

01:05:15   whatever it is, they have so many, I mean, there are so many angry organized criminals

01:05:19   in the world.

These guys could get killed, honestly, if they're tracked down.

01:05:23   That's for sure.

So that's not funny, but it's also like, it actually does exactly that:

it's almost like a disruptive technique

that destroys the value of ransomware.

01:05:31   So in researching this story, I came across

01:05:35   just a few days ago, F-Secure released this

01:05:37   hilarious white paper.

01:05:38   They tested the customer service of major

01:05:41   ransomware packages.

01:05:42   They're like, it has a customer service burden.

01:05:46   I talked to this guy named Sean Sullivan at

01:05:48   F-Secure labs about some background stuff

01:05:50   about ransomware.

01:05:51   And he said, the reason it's gotten so popular

is it's Tor, you know, the routing network.

That's how the ransomware people post websites,

basically, and Bitcoin. It's all Bitcoin.

01:06:01   There's no credit card anymore.

01:06:02   That's the big change.

01:06:03   The ransomware dates back literally decades,

but Bitcoin just makes it, it

01:06:08   facilitates it so much.

01:06:09   The average ransomware demand now has gone up

01:06:13   from a few hundred dollars worth of Bitcoin to

01:06:14   like 600 something dollars, but F-Secure

01:06:17   found you could negotiate with some of them.

They'll run the fee down.

They created like a naive user:

they hired somebody who was not technical to do the communication so they wouldn't give anything

away about, you know, more sophisticated details. And they tested all these:

you can extend the deadlines; the customer service people at the ransomware companies are very

01:06:36   sensitive. In one case, they're like, this was as good as like real customer service. It's like

01:06:40   what you'd get from a software company. They talked you through it. They'll often teach you

01:06:44   how to buy Bitcoin so that you can pay. They treat it like a real customer service burden,

01:06:48   like it's a business and we're here to help you get your files back. So

01:06:51   - Yeah, well, if you think about it,

01:06:53   when do you typically get the best customer service?

01:06:57   Typically, you get better customer service

01:07:00   before you've given them your money.

01:07:02   - Yeah, yeah. - Right?

01:07:03   You get, you know, it's easier and you wait less time

01:07:08   to talk to a salesperson before you've bought something

01:07:10   than when you come back with a problem.

01:07:13   - Yeah, it's just, the whole thing is hilarious.

01:07:15   I mean, it's hilarious and awful.

01:07:16   So Mac users, you know,

there's been a couple ransomware,

software attempts against Mac users.

And again, because of the user-space file issue,

it's possible we will see

phishing-style ransomware

01:07:28   or things that will be minimally capable

01:07:30   because OS X will execute certain kinds of things.

01:07:34   The question is your network access

01:07:36   and some privilege it may need would be harder.

01:07:38   There might need to be an exploit pathway

01:07:39   for a little bit of it,

01:07:40   but it is so much less of a burden

to have some effect.

01:07:45   So we'll see.

01:07:46   So hopefully it won't affect most of us,

01:07:48   but it is, and again,

01:07:50   everything they're saying, it's like,

01:07:51   update your software, use patches, make backups,

have good backups.

01:07:55   You know, I use Backblaze and CrashPlan and local clones.

01:07:59   And I have a deep archive.

01:08:01   So if all my files were encrypted today,

01:08:04   I have 100% of those on,

01:08:06   most of them on Dropbox also,

01:08:08   and in two other, like at least one other place,

01:08:10   I should say, where I have a deep archive

and I could go back to a pre-encrypted version.
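The deep-archive defense described here amounts to versioned, append-only copies: an encrypted rewrite then never destroys the last good version. A minimal sketch, with made-up paths and naming:

```python
import shutil
import time
from pathlib import Path

def archive_then_write(path: Path, new_data: bytes, archive_dir: Path) -> None:
    """Stash the current copy under a timestamped name before overwriting,
    so no write (including a ransomware rewrite) destroys the prior version."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    if path.exists():
        stamp = time.strftime("%Y%m%d-%H%M%S")
        shutil.copy2(path, archive_dir / f"{path.name}.{stamp}")
    path.write_bytes(new_data)

def restore_latest(path: Path, archive_dir: Path) -> None:
    """Roll back to the newest archived copy, e.g. after files get encrypted."""
    newest = sorted(archive_dir.glob(f"{path.name}.*"))[-1]
    shutil.copy2(newest, path)
```

In practice the archive has to live somewhere the malware can't write, which is why the offsite, versioned services like Backblaze and CrashPlan mentioned here matter more than a local folder.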

01:08:14   - Let me take a break here.

01:08:16   Thank our next sponsor.

01:08:18   This is a long time friend of the show,

01:08:20   long time sponsor, Fracture.

01:08:23   Fracture is a photo decor company

01:08:25   that is out to rescue your favorite images

01:08:27   from the digital ether.

01:08:29   They print your photos directly onto glass

01:08:32   and add a laser cut rigid backing

01:08:35   so they are ready to display right out of the box.

01:08:38   It's not like a piece of paper glued to a piece of glass.

01:08:41   They print the photo right on the glass.

01:08:44   They've been a sponsor for years.

01:08:45   I still haven't seen anybody else that does this.

Maybe somebody else does.

01:08:48   I think Fracture's got all this

01:08:49   proprietary stuff down there.

01:08:50   It is an amazing display.

01:08:53   I've always said it is very much like

the way that retina displays look.

01:08:58   Once Apple started fusing the screens to the glass

01:09:01   where it looks like the pixels are on the glass

01:09:02   instead of behind a layer of glass,

01:09:04   that's what Fracture photos look like

01:09:06   except they're completely analog.

01:09:08   It is a great thing to do with your digital photos.

01:09:11   I have thousands, I shoot thousands of photos every year.

The ones that I like best, I get them printed on these Fractures, hang them up around the house.

That's what people used to do, because your photos were printed. You'd shoot photos,

01:09:23   you'd get them back from the photo lab,

01:09:25   you'd take the ones you liked out, you know,

01:09:27   the real keepers, and instead of just putting them back in the envelope, you'd put them in a frame, hang them up.

01:09:32   We don't do that anymore

Because there's like this extra step where you got to get them from your digital archive to being on display.

01:09:38   This is the way to do it. If you're gonna print your photos, take the keepers and send them to Fracture.

01:09:42   They're so good. They have so many sizes, amazing sizes. They have a 60-day happiness guarantee

01:09:50   so that you're sure to love your order. Each fracture is handmade in Gainesville, Florida,

01:09:55   from U.S.-sourced materials in their carbon-neutral factory. It all happens in the U.S.

01:10:01   So for more information and 10% off your first order, visit fractureme.com/podcast.

01:10:10   There's even a special note in here that says, "Note, the URL really ends in 'podcast,' not the name of your podcast."

01:10:18   Because I'll tell you, if they did put that note in here, I would have said,

01:10:21   "FractureMe.com/TheTalkShow," but that's not it. Remember this. It's "FractureMe.com/Podcast."

01:10:27   And then what they want you to do when you place your order, they're going to say, "Where did you

hear about this?" Just mention that you heard it from The Talk Show, and they'll know that you came from here.

01:10:35   It's literally, it is a one question survey.

01:10:40   Where did you hear about Fracture?

01:10:42   So it is the easiest survey you will ever take.

01:10:44   Just remember to tell them.

01:10:45   That helps support the show and it's fractureme.com/podcast.

01:10:50   - Can I talk about-- - You can do whatever you want.

- Well, another brief one, this is like, this isn't a sidebar.

01:10:59   You were writing recently about Amazon

and its inventory, the Birkenstock story.

01:11:03   - That's on my list of topics.

01:11:05   Oh, good. Well, I have something to say about that,

01:11:06   but we can go take your direction.

01:11:08   - I think we were done with security,

01:11:10   so we can totally go on to Amazon fraud.

01:11:12   I've linked just to a couple of recent stories about-

01:11:15   - Oh my God, it's so terrible.

01:11:17   - So there's a couple of aspects to it.

01:11:19   One is the main one that I've been reading,

01:11:23   'cause it seems like it's a little bit new,

01:11:25   is big brands being ripped off

01:11:30   by mostly Chinese counterfeiters,

01:11:33   and then they sell these things through Amazon.

01:11:36   Well, the first one I heard,

01:11:38   this is maybe two years ago,

01:11:39   is that Mophie products, Mophie the battery maker,

01:11:43   they make a whole bunch of battery,

01:11:46   external battery packs and battery cases for phones.

01:11:49   Really interesting company.

01:11:52   I have a couple of their things

01:11:53   and I really, I have to say I like them a lot.

01:11:57   I have a battery, I like them instead of a case,

01:12:00   I like to just have the battery pack

01:12:02   and they make one that has built-in lightning and USB cables,

01:12:06   so you don't need to take cables.

01:12:08   You don't need to have an extra cable.

01:12:09   I really like that product a lot.

01:12:11   It's my favorite external battery thing ever.

01:12:13   But I heard years ago, just like at least two years ago,

01:12:16   don't buy Mophie stuff on Amazon

01:12:18   because there's so much Chinese knockoff crap

01:12:21   that looks like a Mophie product,

01:12:23   but it's really substandard electronics.

01:12:25   The batteries are no good.

01:12:26   They're just crummy products.

01:12:29   If you Google like Mophie Amazon,

01:12:31   you'll find lots of hits about it.

01:12:32   So now just recently this week,

01:12:34   Birkenstock, the sandals, what else do they make?

01:12:37   Just sandals and flip-flops, I guess.

01:12:39   - Yeah, a bunch of stuff.

01:12:39   I was called, by the way,

01:12:40   I was called socks and sandals in college

01:12:42   'cause of my Birkenstock habits, just so you know.

01:12:45   I went to school in the East, grew up in the West,

01:12:47   that's how it goes. (laughs)

01:12:50   - Is pulling out of Amazon in January,

01:12:52   I guess they can't do it immediately

01:12:53   'cause they might have contracts or inventory or something,

01:12:56   but they're so overrun by counterfeiters on Amazon

01:12:59   that they're pulling out.

01:13:00   - Oh yeah, and you love them.

01:13:01   They're like, Amazon basically told them,

01:13:03   if you wanna sell every single thing you sell through us,

01:13:06   then we will fight counterfeiting.

01:13:07   I mean, I'm paraphrasing.

01:13:09   And if you wanna just sell the way you're doing,

01:13:10   then screw you.

01:13:11   I mean, they didn't say it that way, but that's the effect.

01:13:13   That's, you know, that's what Birkenstock claims at least.

01:13:15   - I think I could be wrong on this.

01:13:18   I wrote that, it's sort of like when you search for stuff

01:13:22   on Amazon, it tells you who you're buying it from.

01:13:25   And you know, with a lot of the smaller things,

01:13:28   like when you just go there and buy copy paper,

01:13:33   printer paper, you know, it's like full,

01:13:35   you know, it's just, you're not really buying it

01:13:37   from Amazon, you're buying it from some vendor

01:13:39   that sells through Amazon.

01:13:41   But I was under the impression that even if you wanted

01:13:44   to buy like a Mophie battery pack,

01:13:46   if you buy it directly from Amazon and you're not buying it

01:13:49   from, you know, Joe's battery shop,

01:13:52   that you're, you know, you could be, you know,

01:13:55   you could feel safe that you're getting

01:13:57   the actual Mophie product, but you kind of have to be--

01:14:00   - Oh my God.

01:14:01   - You have to be like a close reader.

01:14:03   How many people who shop at Amazon actually look at

01:14:05   who it's -- whose fulfillment this is?

01:14:09   I don't think most people, most reasonable,

01:14:11   most regular people even know that that's how Amazon works.

01:14:14   I think they just think you're buying it from Amazon

01:14:17   and they don't even look, they just look at prices

01:14:19   and they might be curious about the fact

01:14:21   that the same product is available

01:14:23   at three different prices from Amazon

01:14:25   because it's from three different, you know,

01:14:29   whatever you wanna call them, fulfillment partners.

01:14:32   - Yeah, or maybe through Marketplace

01:14:36   or maybe through their

01:14:36   Fulfillment by Amazon also,

01:14:38   you can send your stuff to Amazon

01:14:40   and they will sell it to their own customers.

01:14:43   - Right.

01:14:44   - So, you know, I think you mentioned this

01:14:45   in one of the pieces that you linked to, the commingling.

01:14:48   So there's like two kinds of,

01:14:49   there's actually like four kinds of fraud.

01:14:51   There are two kinds of major fraud.

01:14:52   One is, oh God, there's so many kinds of fraud.

01:14:55   So commingling is company X ships a product to Amazon and they say, this is exactly the same.

01:15:02   Here's the SKU number.

01:15:03   This is product Y that you already sell like a Cuisinart electric kettle, which I'll explain why I bring that up.

01:15:08   Right.

01:15:08   Amazon does commingling, where they take this inventory,

01:15:12   I don't even know how much it's inspected or whatever, and they put it on their shelves, and they don't care.

01:15:17   Uh, they treat it as a fungible thing.

01:15:20   This thing that came from company X that claims it's product Y, we're putting on the shelves in our warehouse

01:15:25   next to this thing that came from the manufacturer that claims it's product Y, right?

01:15:29   I have heard stories from multiple places, and you can read them publicly too in a lot of these articles, about

01:15:34   companies like Birkenstock -- and Birkenstock has a different problem, which I'll get to -- but that you buy a product: you go to Amazon, you order product

01:15:41   Y, and it comes, and you're like, this is not product Y. You complain to Amazon and they're like, oh, send it back,

01:15:46   we'll ship you a new one. Or you complain to the company: I bought product Y from Amazon.

01:15:50   They're like, we didn't make that. Does it have this and this? No, that's a counterfeit

01:15:53   that was put into Amazon's stream, and we cannot prevent them from

01:15:56   selling them, basically. That's the commingling problem.

01:15:58   Birkenstock has the undercutting problem where people are listing things

01:16:02   basically as the same kind of thing.

01:16:04   They're undercutting Birkenstock's prices and they may or may not

01:16:07   be shipping a Birkenstock.

01:16:08   They're probably shipping that thing -- it could be they're buying

01:16:11   them from some other source, or it could be they are manufacturing

01:16:13   something that is completely counterfeit and selling it.

01:16:16   And, um, they're both huge problems, because in the one case

01:16:20   you have this thing where people are buying, I mean, actually

01:16:21   Birkenstock has the same problem, whether it's a commingled problem or counterfeit

01:16:24   and cheaper -- someone buys it, says Birkenstock screwed me.

01:16:28   They're ready.

01:16:28   The company said we didn't make that.

01:16:29   What'd you buy from Amazon?

01:16:30   That's not ours.

01:16:31   You bought the $80 one, not the $100 one.

01:16:34   We don't sell it for $80.

01:16:35   You bought the sucky one.

01:16:36   Right.

01:16:37   So this just happened to me last night.

01:16:39   Really?

01:16:40   Yeah.

01:16:40   So what was the product?

01:16:42   So, uh, uh, an electric kettle. Love electric kettles.

01:16:45   I've had two, uh, Braun model ones -- or "Brawn," I guess, as we say in America.

01:16:48   Uh, my wife got one for a birthday, like 15 years ago.

01:16:51   It worked fine for several years and then it just died.

01:16:54   So we bought the identical thing.

01:16:56   It's like a $20 one.

01:16:57   It's worked great.

01:16:58   It just died the other day.

01:16:59   Same thing.

01:16:59   Some kind of, you know -- it's got all these contacts, and one breaks.

01:17:02   So I'm like, okay, I will find out what Wirecutter recommends.

01:17:05   My old friends at Wirecutter and The Sweethome -- I was a contractor there for a bit.

01:17:08   I love the people there.

01:17:09   I love the process.

01:17:10   I'm like, all right, well, what one do they recommend?

01:17:11   I'm like, oh God, it's $80.

01:17:13   Am I going to spend $80 on an electric kettle?

01:17:14   I'm like, ah, I don't know.

01:17:15   I'm not made of money.

01:17:16   Uh, Jason Snell has a Breville tea robot,

01:17:20   which is $250, which I've been eyeing for a year now.

01:17:25   I'm not, I can't get it.

01:17:27   Can't get it. - Robot that makes tea?

01:17:29   - It has a basket that lowers robotically and raises.

01:17:32   So it steeps it for the right amount of time.

01:17:35   - I just wanted to clarify that it was tea, the beverage,

01:17:38   and not like some kind of, you know, like a t-shirt,

01:17:41   like the letter, something I wasn't familiar with.

01:17:44   I just wanted to clarify.

01:17:45   - It's a tea brewing robot pot.

01:17:47   It's awesome.

01:17:48   And people love the Breville.

01:17:49   but I'm not ready to spend, I'd love to be able,

01:17:51   if I ever got some great contract or something,

01:17:53   I might say, this is my treat to myself.

01:17:55   I will get a robot that makes tea for me.

01:17:58   You still have to put stuff in the basket,

01:17:59   but it moves it up and down, doesn't oversteep.

01:18:01   So anyway, 80 bucks, I'm like, that seems, I don't know.

01:18:03   So I start doing research and I go to Amazon,

01:18:05   'cause it's usually my first stop.

01:18:07   And there are like 700 electric kettles now.

01:18:10   And many of them have one review,

01:18:11   which makes no sense, right?

01:18:13   Like, what's something with one review and a brand name

01:18:16   I've never heard of in my 48 years on this planet.

01:18:19   And then a bunch of others, I find one, I'm like, this is like 300 reviews.

01:18:22   I've never heard of this thing.

01:18:23   And 97% of them are five stars.

01:18:26   You know how the app store works, same thing on Amazon.

01:18:29   So I'm reading the reviews and it's like, Marty says, this is a great kettle.

01:18:32   This kettle does everything I want it to.

01:18:33   It's great.

01:18:34   Jill says, this kettle is great.

01:18:36   It does everything I want it to.

01:18:37   This kettle is great.

01:18:38   It goes on and on.

01:18:39   And there's a few real five star reviews and then a distribution of other ones.

01:18:43   I'm like, I'm not going to buy that because that's some piece of crap

01:18:46   manufacturer, God knows where that they're, you know, for a finite amount of time,

01:18:50   they're going to push through Amazon and make some hundreds of thousands of

01:18:52   dollars. So I'm like, where are the $20 or $30 good kettles? I can't find Brauns anymore.

01:18:57   Everyone's gone upscale; kettles now cost $50, $60, $80 just to do basic stuff.

01:19:01   So I'm like, all right, I'm going to bite the bullet. I,

01:19:03   I use the kettle two, three times a day. My wife uses it two times a day.

01:19:07   This is a high use thing. I can, I'm going to, I'm going to spend $80. I'm,

01:19:10   I hate it, but I'm going to spend $80. I go to Amazon site and I'm like, all right,

01:19:14   Well, it's 80 bucks.

01:19:15   Okay.

01:19:15   I'm looking through reviews, but this looks pretty good.

01:19:17   And then I look at the seller. It's a Cuisinart, but it doesn't say sold by

01:19:21   Cuisinart. It says sold by, uh, Everything Lucky -- not making this up -- Everything

01:19:29   Lucky is selling the Cuisinart.

01:19:31   And I'm thinking, this, this isn't right.

01:19:33   So I search the Amazon site for the model number.

01:19:35   It's like CPK-17 something.

01:19:38   And I'm like, all right, so where is this?

01:19:41   And then I find another listing for a hundred dollars with, okay.

01:19:44   So the $80 unit, Everyday Lucky -- which I find

01:19:47   out later, as it gets pointed out to me, it says

01:19:48   "Everday Lucky," it's not even spelled right.

01:19:51   Whatever the name, you know, they're mostly --

01:19:52   Everyday Lucky is mostly selling iPhone

01:19:55   cases and this alleged Cuisinart.

01:19:57   So the $80 everyday lucky listing has 2,500

01:20:03   clearly legitimate reviews.

01:20:04   They have managed somehow to

01:20:06   hijack the main listing.

01:20:08   There's a hundred dollar version.

01:20:10   And if you look at all the other people selling

01:20:12   Walmart, Best Buy, they're all selling this model for a hundred dollars.

01:20:15   So I'm like, that's the list price clearly.

01:20:17   And the $100 one on Amazon says sold by, you know, by Cuisinart.

01:20:21   It's being fulfilled by a third party, but it's clearly the legitimate product,

01:20:24   but it has like seven reviews.

01:20:25   So I asked my friends at The Sweethome, like, what's going on here?

01:20:28   They're like, Oh, what happened is I got a response this morning

01:20:31   from Tony over there.

01:20:31   Who's great.

01:20:32   You know, they have all these deals, people and everything.

01:20:33   He says, what happened is, um, the, uh, Amazon ran out of the

01:20:39   stock that it sells directly.

01:20:40   And so they pulled a listing for someone selling it new -- not, you know, used or refurbished, whatever -- and they dropped that in so the listing doesn't drop off.

01:20:50   And so in this case, Everday Lucky was the backup provider in their listing of third-party sellers for this particular model.

01:20:57   It's, it's, I'm looking at it as we speak, it's Everthing Lucky.

01:21:01   Oh, Everthing Lucky, sorry.

01:21:03   Everthing Lucky.

01:21:04   I know, isn't that, and so that, that is my story.

01:21:06   So, I'm like,

01:21:07   It's still there!

01:21:08   Yeah, so, well, no -- so I wake up this morning and Tony has responded at

01:21:11   The Sweethome via Twitter, because he's great. He's like, oh, here's what's going on --

01:21:13   check, it's back.

01:21:14   And I go, I'm like, oh, there's the Amazon listing at $67

01:21:17   now.

01:21:17   Now I feel justified.

01:21:19   I've waited, right?

01:21:19   Do you think that this is the real product though?

01:21:21   Well, well, so here's, so, uh, it may be, but there's no way to know.

01:21:26   Like I'm buying it from a third party seller who has clearly done some magic.

01:21:30   So this morning I placed an order, I paid $67.

01:21:33   Everyone can criticize me for my profligate spending.

01:21:36   And, uh, if you like, please feel free anyway.

01:21:39   So I buy it and I write back to Tony.

01:21:42   I'm like, Hey, the thing you're right.

01:21:43   Swapped back.

01:21:43   And he's like, I just went back and it's there.

01:21:45   Apparently I bought the one unit Amazon had in a warehouse that it fulfilled

01:21:49   itself, and it's back to Everthing Lucky.

01:21:50   So anyway, that's the problem they have.

01:21:52   So I don't know.

01:21:53   So Everthing Lucky could be selling me a legitimate object, and still $20 less than

01:21:58   the retail price at Target, Walmart, et cetera.

01:22:00   However, um, they're getting the advantage of 2,500 positive reviews

01:22:05   And Amazon doesn't vet that that product is actually a new item from Cuisinart.

01:22:10   Yeah. So part of the, part of the problem, the way that, I mean,

01:22:15   this is so insidious, but part of this problem now is if, if it starts to,

01:22:19   if awareness starts to spread that you can't trust stuff through third party

01:22:24   resellers on Amazon,

01:22:25   it hurts all of the honest ones that the system was set up for in the first

01:22:29   place.

01:22:29   - Exactly. It's terrible all around. There's also, like, even yet another variant, which

01:22:34   is that -- um, so those are like the almost legitimate cases, right?

01:22:38   Like, this could absolutely be a fell-off-the-truck thing, or, you know, bought -- like, you

01:22:43   know, one of the things in China is that -- and I've heard this; I don't mean

01:22:46   to tell lies about China,

01:22:47   so I don't know if this is true, but I've read it in

01:22:50   any number of accounts.

01:22:51   I've talked to people that have had stuff made in China.

01:22:53   It's a complaint made by a number of companies about things made in China.

01:22:58   Some factories will gear up products for a given maker.

01:23:01   And during the day they're being supervised, or being made for the maker, you know, whatever the company is, you know, Reebok or whatever. It could be shoes, it could be

01:23:08   Cuisinart kettles. At night they fire up the lines, they make stuff and they sell that themselves. And they're essentially identical.

01:23:15   Sometimes there's labeling changes. So they're the identical unit with it not labeled so they avoid some intellectual property issues.

01:23:20   So the Birkenstock thing, when you go to try to buy a Birkenstock thing and their list price is $100 for whatever and

01:23:27   and there's people selling it for $80, you might be getting the legitimate thing.

01:23:30   It might be made as a factory nighttime job or who knows what, or they're just

01:23:34   doing deeper discounts because however they're acquiring it, they're not

01:23:37   honoring the list price.

01:23:38   The same thing -- the Cuisinart kettle bought from Everthing Lucky could have

01:23:43   been absolutely the same as anything purchased directly from an Amazon

01:23:46   warehouse or from a third-party, you know, uh, authorized Cuisinart seller.

01:23:51   But you don't know. The other problem, though -- like, the other thing is that,

01:23:54   uh, what you were saying before, is the, um, the counterfeit stuff.

01:23:58   That's just knockoffs that are crap and they appear to be listed the same.

01:24:02   So it's, I see this all the time for, you know, all these different products.

01:24:05   I'm looking for, for reviews and things.

01:24:07   You find stuff where clearly the pictures are somehow off.

01:24:11   You buy it, something's not right.

01:24:13   And you're not, you know, sometimes it's listed under a slightly different name.

01:24:16   Sometimes it's listed as exactly the same thing and it's trying

01:24:18   to take advantage of the reputation.

01:24:19   But all of these problems persist because Amazon wants to sell more.

01:24:24   it doesn't want to do tighter inventory control

01:24:26   'cause it costs a lot of money.

01:24:27   It shaves the margins off.

01:24:28   - So I just sent you a link.

01:24:29   This is the 1.8 quart cordless electric tea kettle.

01:24:33   I think this is the product you're talking about.

01:24:36   - Yes, that's right.

01:24:37   - I kind of hate the interface, I have to tell you.

01:24:39   - I know, it's the best, cheapest thing.

01:24:42   Like if you want something better,

01:24:44   you have to spend more.

01:24:45   I know it looks horrible.

01:24:46   The queries are like, "CPK 17 sold by Everything Lucky."

01:24:51   - Now, if you look, I think it's the same product.

01:24:53   there's a thing that says size seven cup.

01:24:56   And the two sizes that are offered is seven cup,

01:25:00   which is $100.

01:25:02   And then the other one, instead of measuring in cups,

01:25:04   it just gives you the dimensions of the box,

01:25:06   9.7 inches by six inches by eight.

01:25:10   And so it- - Oh, they may have managed

01:25:12   to sneak, oh, interesting.

01:25:14   - They've made it look like there's two options

01:25:16   of the same product, but one is measuring by the cup

01:25:20   capacity of how much water you can put in,

01:25:22   and the other one is measured in inches.

01:25:24   - Yeah, and the one-point-eight-quart one,

01:25:25   the seven-cup one, is sold by Card Machine Outlet, Inc.

01:25:29   - Oh, the one, you know, mine says,

01:25:31   oh, it says, mine says it's by Cuisinart.

01:25:34   - Yeah, you scroll down under in stock,

01:25:36   it says, "Ships and Sold By." - Oh, I see.

01:25:39   - I know, isn't it, but so--

01:25:40   - No, mine is sold by Kitchen Capers.

01:25:42   - Oh my God, what happens when I reload?

01:25:43   That's hilarious.

01:25:44   - Mine is from Kitchen Capers.

01:25:46   - Every time I reload,

01:25:47   I'm getting a different answer, I think.

01:25:48   - Oh.

01:25:49   So this one is $100 and Prime is available,

01:25:53   but that $80 one, to me, is suspicious.

01:25:56   The $80 one, and I guess Amazon defaults to it

01:25:58   'cause it's cheapest.

01:26:00   - Yeah, that's exactly it.

01:26:01   - The fact that that's the one by everything lucky,

01:26:03   the fact that it's cheaper makes me think

01:26:05   that it might be fake.

01:26:06   I would actually, if I were gonna buy this right now,

01:26:08   I would actually spend the extra $20

01:26:10   to get the one that says it's by Cuisinart.

01:26:13   - Yeah, except here's the funny thing.

01:26:14   So when I logged in this morning,

01:26:15   Amazon apparently had gotten one on their shelves

01:26:17   that they fulfilled directly,

01:26:18   so it's supposed to be sold by Amazon, 67 bucks.

01:26:21   So I got the deal, I guess I got the one that was $67.

01:26:25   I'm very happy about that.

01:26:25   I'm gonna be able to boil my tea

01:26:26   at all kinds of temperatures.

01:26:27   It's gonna be great. - Now there's another,

01:26:28   there's another type of fraud going on on Amazon,

01:26:31   and this is just comical. - Oh my God, no, come on.

01:26:33   - I linked to somebody on Twitter yesterday

01:26:38   who bought, wanted to get a floor mat,

01:26:41   and they got like a, almost like a mouse pad type mat

01:26:46   with, printed on the mouse pad -- like, it is floor mat size, but printed on the

01:26:51   piece of foam is a screen print of the sort of texture of the full --

01:26:57   I was crying. I was explaining that to my wife this morning 'cause I was saying,

01:27:02   okay, I bought a tea kettle. Here's my story. And she's laughing at it.

01:27:05   I told her about the other one, the cup,

01:27:07   the changing temperature pattern cup one. Right. So the other one,

01:27:12   I guess I'll put a link to my Daring Fireball link piece

01:27:15   in the show notes.

01:27:16   - That was so funny. - And you guys can look

01:27:17   it up there.

01:27:18   But somebody bought, there was a listing on Amazon,

01:27:20   and it showed a picture, two pictures of the same mug.

01:27:24   One was like when it's empty, and it just looks black,

01:27:28   and if you fill it with a hot beverage,

01:27:30   the color will, it's printed with some kind of

01:27:34   temperature-sensitive ink, and it changes and gives you

01:27:37   like a snowy Christmas scene.

01:27:39   And so somebody bought this on Amazon,

01:27:42   And what they got was a mug where somebody had printed that photo of two mugs onto the mug.

01:27:49   So it was a mug, a mug with a, with a photo of two mugs on it.

01:27:57   It's kind of, I actually think it would be a great, now that, now that has become a meme,

01:28:00   that'd be a great gift. I would love to buy a mug with a picture of two mugs on it. I think that sounds great.

01:28:05   Uh, and I guess that the idea with that type of scam is that it is so, I mean, I'm guessing

01:28:13   that this was not a very expensive mug that if it's only like four bucks or six bucks

01:28:18   that people wouldn't even bother to send it back because it's like, what's the, you know,

01:28:23   what's the point?

01:28:24   You know what I mean?

01:28:25   Like you feel like you're ripped off, but at a certain point, it's more of a hassle

01:28:28   to send it back than it is the money's worth.

01:28:30   Can I tell you my perfect Walmart experience? Which is, Walmart Pay just came out

01:28:35   nationwide, they rolled it out.

01:28:36   And this is, so I've been laughing about CurrentC,

01:28:39   which was the, you know, the big retailer system

01:28:41   that was supposed to use 2D codes

01:28:43   and checking accounts and crap.

01:28:45   Been laughing about that for years.

01:28:46   Susie Ochs and I on the Macworld podcast,

01:28:48   every time a CurrentC story comes out,

01:28:49   we make sure to highlight it so we can laugh at it.

01:28:52   Because it sort of, if it had come out before Apple Pay

01:28:55   and then Android Pay,

01:28:56   maybe it would have gotten a little bit of traction,

01:28:58   I don't know, but it didn't, right?

01:28:59   It was like ridiculous.

01:29:01   It's finally been basically shut down.

01:29:02   The MCX consortium --

01:29:05   that's a bunch of these big retailers --

01:29:06   is now focusing on backend stuff, which is great.

01:29:08   So Walmart had been working on its own system,

01:29:11   it's a member of the MCX consortium,

01:29:13   but it had been working on its own variant.

01:29:14   And I saw an announcement about it,

01:29:15   I'm like, you know, this doesn't look as awful.

01:29:18   They don't accept checking account linkages,

01:29:20   so there's less risk of your stuff being hijacked, right?

01:29:23   Where your checking account can be drained

01:29:25   and it's always a pain to get anything fixed.

01:29:26   I just talked to someone the other day,

01:29:28   they had $160,000 taken out of their checking account

01:29:31   after they sold a house.

01:29:33   Took them six months to get it back,

01:29:35   even though they had not authorized anything,

01:29:36   they weren't even scammed, the bank was scammed.

01:29:38   So checking accounts are a pain in the ass.

01:29:41   Credit cards, debit cards, we have protections.

01:29:43   Even gift cards, state-

01:29:45   - That's terrifying.

01:29:46   - Yeah.

01:29:46   - That's positively terrifying.

01:29:48   There are no protections on checking accounts.

01:29:49   Let's all go back to Bitcoin and gold.

01:29:52   So state attorneys general have a lot of control

01:29:55   over gift cards.

01:29:56   There's this, those are state regulated.

01:29:58   And so there's control even there.

01:30:00   So Walmart Pay will let you use,

01:30:01   a Walmart gift card, a debit card, credit card,

01:30:03   and prepaid card, something like that.

01:30:06   And I'm like, well, this is kind of cool.

01:30:07   You still have to use a barcode.

01:30:09   You're scanning a 2D code, but it all looked sensible.

01:30:13   So I pitched to Macworld, well, let me go write about this.

01:30:15   Is it? Sure.

01:30:16   So I find a Walmart, it's about 20 minutes away.

01:30:17   I've been to it; they actually reopened a store

01:30:20   they'd shut down, which is rare for them.

01:30:22   And I go there and first thing in the morning,

01:30:23   the place is totally empty.

01:30:25   No one's even trailing me around to make sure I don't steal.

01:30:27   There's no greeter, nobody checked my receipt when I left.

01:30:29   The place is empty.

01:30:30   We need to buy a cheap clock with a face for our dining room

01:30:34   so that my younger son does not spend an hour and a half

01:30:37   at the table eating the same piece of toast.

01:30:39   So he can get through things in the day.

01:30:41   So some children dawdle, some eat fast.

01:30:44   Anyway, so I find a clock, it costs $6.50.

01:30:46   I'm like, this is great, it's battery operated,

01:30:48   it looks cool, I like the face, whatever.

01:30:49   Do the checkout process.

01:30:50   I actually quite like it.

01:30:51   I'd already set up Walmart Pay.

01:30:53   The app is actually very well designed.

01:30:56   Point of sale system displays a code.

01:30:58   You just open your app, you tap it and you're done.

01:31:00   and it's electronic, I'm like, this is great.

01:31:02   I get home, I unpack it, the clock doesn't work.

01:31:04   (laughing)

01:31:06   So that's the Walmart story.

01:31:09   - I have noticed that it seems like,

01:31:11   I don't know if this was predicted or not,

01:31:15   I actually wanted to talk about Apple Pay.

01:31:18   - Oh yeah.

01:31:18   - It seems to me, and Apple is an interesting,

01:31:22   Apple's usually late to most things,

01:31:24   but every once in a while, they're early on things.

01:31:27   Like for example, Wi-Fi.

01:31:29   Apple, you know, like when Apple introduced the iBook

01:31:33   that had Wi-Fi, they actually had to explain what Wi-Fi was.

01:31:36   And that was the event where Phil Schiller did like a stunt,

01:31:40   like he like jumped off, climbed up a ladder

01:31:43   and jumped 10 feet onto a padded mat

01:31:47   while holding the iBook to prove that it was, you know,

01:31:50   getting the internet over the air.

01:31:51   Like they actually, it was almost like, you know,

01:31:55   the idea that you're getting internet over the air

01:31:56   was such a novelty that they actually felt like they had

01:31:59   to prove it. And it seems to me like Apple Pay is another one of those things where Apple Pay came

01:32:03   out at the right time, like, because it's just maybe it's a local thing here in Philly, but maybe

01:32:09   it's something else, but a whole bunch of chains around here have suddenly started getting the chip

01:32:13   and pin registers. And they, they all seem to work with Apple Pay, even though they don't have Apple

01:32:19   Pay logos yet. Like, so we have a supermarket chain here called Acme. They don't have, there's

01:32:24   There's no Apple Pay logo, but it just says tap to pay or something.

01:32:29   I forget what they all say,

01:32:31   but there's like a little logo that suggests that maybe there's some kind of NFC

01:32:35   thing. And so I've, I tried Apple Pay and it just works.

01:32:38   And Starbucks are local, at least the,

01:32:42   the one I go to now has chip and pin, but a lot of these ones too,

01:32:46   they get the chip and pin. Um,

01:32:48   and then they have a piece of tape there and it says, chip doesn't work yet.

01:32:54   - Yep.

01:32:55   - But at Starbucks, Apple Pay worked and I paid.

01:32:58   - Oh.

01:32:59   - I paid with Apple Pay at Starbucks.

01:33:01   And-- - I think that's new, right?

01:33:03   - It is totally new.

01:33:04   I mean like--

01:33:05   - You may have an early rollout.

01:33:06   - It's like maybe like within the last 10 days,

01:33:08   at least here.

01:33:09   And that the none of, you know,

01:33:12   and it's funny because it like Whole Foods, where I go,

01:33:15   they've had Apple Pay for a while.

01:33:16   They were like a debut partner

01:33:18   and like listed on the slide on the stage.

01:33:22   And I've been using it at Whole Foods for years,

01:33:26   ever since it came out.

01:33:27   So it's not a novelty there, but at Starbucks,

01:33:29   I've gotten like two or three of the clerks

01:33:31   have been like, whoa, what did you just do there?

01:33:33   That was amazing.

01:33:34   But then I went today and it didn't work.

01:33:38   It worked in so far as when I got my iPhone

01:33:41   near the terminal, my credit card came up on the screen

01:33:45   and it read my fingerprint and went, "ching,"

01:33:50   and said done, and then the little hand terminal

01:33:54   said processing, but then the processing never went through

01:33:58   and I forget what it said.

01:33:59   It just like, processing was up way too long,

01:34:01   'cause Apple Pay is very fast usually.

01:34:04   It was processing way too long and then it said like,

01:34:07   payment could not be completed.

01:34:09   And so I had to pay with an actual credit card

01:34:12   like a 20th century person.

01:34:14   - I think my recollection is that NFC is typically like,

01:34:18   It's a protocol.

01:34:18   The payment goes over the NFC protocol.

01:34:21   So even if it isn't supposed to take Apple Pay, if NFC is enabled at all, it will try to do it, but the back end part may not work.

01:34:29   Cause I think that was the deal.

01:34:31   Was that, um, Oh, I forget CVS or something.

01:34:33   One of the MCX partners.

01:34:34   Yes.

01:34:35   Yeah.

01:34:35   At launch, it was like, people were paying with Apple Pay and they're like, oh, it's not supposed to work.

01:34:38   It's like, they literally, they literally did this nationwide.

01:34:43   CVS disconnected their entire -- the ability to pay with any NFC at all, just so that it

01:34:50   wouldn't use Apple Pay, even though they were actually getting the money from Apple Pay.

01:34:54   It wasn't like they were getting ripped off. It was, they didn't want to put, well, they

01:34:57   all had a deal and it was, I just, I just got emailed from a company. I don't want to

01:35:01   mention its name yet because I haven't tested it, but I'm so full of secrets. I'm so full.

01:35:05   No, it's not that they're secret. I don't want to promote them until I see what they

01:35:08   actually do. Uh, they're beta testing it, like a public beta. You have to sign up.

01:35:12   So that's how credible they are.

01:35:13   Okay.

01:35:13   They're great.

01:35:14   Um, and it's a very funny video, of course.

01:35:16   Uh, so, and very informative.

01:35:18   So this, uh, do you remember several years ago, there were some credit cards, uh, that

01:35:23   would let you create an individual card number for every transaction?

01:35:27   Yes.

01:35:27   Yes.

01:35:27   I love that.

01:35:28   I use that and you could set things like this can charge no more than a hundred

01:35:32   dollars a month.

01:35:33   This is a one-time use and should only work until such like all these things.

01:35:35   And the interface was terrible.

01:35:37   It was pre-mobile.

01:35:37   It was awesome.

01:35:38   It was awesome.

01:35:39   And the interface was terrible.

01:35:40   It's pre-mobile.

01:35:40   It was awful.

01:35:41   And online payment was terrible anyway.

01:35:43   So this is an outfit that is doing the same thing with an app.

01:35:46   You sign up, you get a credit card through their partner, which I forget which bank it

01:35:50   is, a major bank, and an 18%, you know, interest, APR.

01:35:55   So it's, you know, it's the kind of card you better pay off because it's not reasonable

01:35:58   otherwise and 1% cash back.

01:36:00   So it's, they got all these parameters on it, but the fact is like for online

01:36:03   transactions, like, they have a physical card you can use

01:36:06   that's got an EMV chip in it.

01:36:10   Um, so you could use that, but when you're paying online, uh, you know,

01:36:14   any kind of transaction you run the app, it generates a unique number with

01:36:17   whatever parameters you want, like one time or whatever, and you use that

01:36:20   one-time number, and so if it gets stolen,

01:36:22   you know who stole it.
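The constraints Glenn describes — a monthly cap, a single-use number — amount to a little data plus an authorization check. A hypothetical sketch of that idea (not the unnamed company's actual API):

```python
from typing import Optional

class VirtualCard:
    """Hypothetical virtual card number with per-card spending rules."""

    def __init__(self, number: str, monthly_limit: Optional[float] = None,
                 single_use: bool = False):
        self.number = number
        self.monthly_limit = monthly_limit  # e.g. "no more than $100 a month"
        self.single_use = single_use        # "this is a one-time use"
        self.spent_this_month = 0.0
        self.uses = 0

    def authorize(self, amount: float) -> bool:
        # A single-use number is dead after its first charge.
        if self.single_use and self.uses > 0:
            return False
        # Decline anything that would blow past the monthly cap.
        if self.monthly_limit is not None and self.spent_this_month + amount > self.monthly_limit:
            return False
        self.spent_this_month += amount
        self.uses += 1
        return True
```

And because each merchant can get its own number, a stolen number identifies exactly who leaked it.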

01:36:23   So I'm going to try.

01:36:24   They gave me an invitation.

01:36:25   I signed up.

01:36:26   I'm going to test my credit rating, but, um, I had three card

01:36:29   numbers stolen this year so far.

01:36:30   It's amazing.

01:36:32   It's been so long for me.

01:36:33   And numbers.

01:36:33   I had a Visa and then an Amex and then another Visa, three different

01:36:36   card issuers, and in every case, I got to tell you, the fraud people have got, I

01:36:41   mean, they used to be good.

01:36:42   I dealt with this in the past a bit, but every call was with someone who was so

01:36:45   crackerjack and like, they're obviously paying people well, they're training them

01:36:48   well.

01:36:49   These people were amusing and fun.

01:36:50   Nobody's like fun to talk to while we're going through all the crap you have to go

01:36:54   through.

01:36:54   And they took care of it. In every case they caught it; in one case, like, a 38-cent

01:36:59   transaction to a charity went through, but everything else didn't.

01:37:02   So the pilot,

01:37:03   yeah, yeah.

01:37:05   That's what the, that's what they do.

01:37:06   I forget, it wasn't me, it was my wife, Amy. I think it was a gas station charge or an online charge.

01:37:12   Uh, like in a bodega, somebody bought, like, a Coke, to see if it's been canceled. Yeah, so they got it.

01:37:18   They, they bought a Coke, and

01:37:20   then they went to buy, like, you know, I don't know, a TV. Yeah.

01:37:24   So in each of these three cases, their fraud

01:37:26   pattern got it, and they're like, in two cases nothing, in one case 38 cents, which was refunded. And

01:37:31   poor charity that got 38 cents they didn't know they were gonna get, and then it's taken back.
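The pattern the issuers caught — a tiny "pilot" charge to verify the card works, followed soon after by a large purchase — can be sketched as a simple rule. Real issuer fraud models are far more sophisticated; this toy just shows the shape of the heuristic:

```python
def flag_test_charge_pattern(charges, small=1.00, large=100.00, window_minutes=60):
    """Flag the card-testing pattern described above: a sub-dollar charge
    (often to a charity) followed shortly by a big out-of-pattern purchase.

    `charges` is a list of (minutes_since_start, amount) tuples.
    Returns (t_small, amt_small, t_large, amt_large) tuples for each hit.
    """
    flagged = []
    for i, (t1, a1) in enumerate(charges):
        if a1 > small:
            continue  # only sub-threshold charges can be the "pilot"
        for t2, a2 in charges[i + 1:]:
            if t2 - t1 <= window_minutes and a2 >= large:
                flagged.append((t1, a1, t2, a2))
                break
    return flagged
```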

01:37:35   But so I'm very interested. And, you know, you think, so, I am so excited for Apple Pay in Safari, not because I am

01:37:41   so excited about Apple Pay,

01:37:43   but I'm like, Apple Pay in Safari is gonna be the beginning of a transition, because Android Pay in

01:37:48   whatever browsers is coming. Like, there're gonna be all these mobile pay options that will now be available through desktop and

01:37:54   mobile web transactions.

01:37:57   The minute any of my cards is enabled, I'm, like, never gonna buy anything from a site that doesn't do Apple Pay in Safari again.

01:38:04   I'm dreading my Amex getting ripped off again.

01:38:08   It was a couple years ago and somebody, some nitwit tried to buy jet skis in Arizona or New Mexico.

01:38:16   Either Arizona or New Mexico.

01:38:17   That's very funny.

01:38:18   But you know, the guy from Amex called me and said, you know,

01:38:23   "You're not trying to buy jet skis in Arizona, are you?"

01:38:26   And I was like, "No, definitely not."

01:38:27   And he goes, "All right, well somebody is sorry."

01:38:29   You know, and like you said, totally on the ball.

01:38:32   Just don't worry about it, but your card is now canceled.

01:38:35   I'm FedExing you a replacement.

01:38:37   It should be there tomorrow.

01:38:39   Really sorry, you know, you know, but you know, and you know, and take a look,

01:38:44   take a look at your next, you know, statement.

01:38:47   And I guess he read some of my recent transactions that were me.

01:38:51   And I was like, yeah, those are all good.

01:38:54   But I'm dreading the one I've had now for a couple of years getting ripped off,

01:38:58   because when my new card came, my last three digits are double-

01:39:03   o-seven. Oh my god. Yeah. I love this card. Custom card numbers, why don't they sell

01:39:07   them? I think if you're a very high, like, you're a whale of a card user or an

01:39:12   investment, I think you can get a number you want, get some lucky numbers on there.

01:39:15   Double-o! Can you believe it? I got double-o-seven. Oh my god, that's so, it's so good. And

01:39:19   I always read it, when I have to read it over the phone, I always say it that way,

01:39:21   like, you know, whatever, whatever, whatever, whatever, whatever, and

01:39:25   then double-o-seven.

01:39:26   (laughs)

01:39:28   - Yeah, I think, I mean, the pain of the thing,

01:39:32   the fact that we're still using unsecured numbers

01:39:34   to do this is sort of hilarious.

01:39:35   Like I've wondered, why can't the credit card companies

01:39:37   be set up to do two factor authentication?

01:39:40   Like I'd be delighted if I went to Amazon,

01:39:42   I punched in my card, or not Amazon,

01:39:43   'cause they have their own whatever.

01:39:45   But I go to a random site X,

01:39:46   and before the transaction goes through,

01:39:48   it texts me a code and I have to enter it.

01:39:50   Like I know the backend systems are ancient and weird

01:39:53   and whatever, but you'd think after this many years,

01:39:55   they could just tack that on and you'd enable it

01:39:58   in your card and if you went to a site

01:39:59   that couldn't do it, they'd say,

01:40:01   you have to try this transaction again

01:40:02   after clicking a link that's being sent to you,

01:40:04   email or some kind of bypass,

01:40:05   but apparently the frictionless nature of e-commerce

01:40:09   has to be emphasized over the amount of fraud.

01:40:11   Fraud at some point becomes so high

01:40:14   that they have to then invent new ways

01:40:15   to prevent against it, but there's some balance there.
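Glenn's wished-for flow — text the cardholder a code before a card-not-present transaction clears — would look something like this sketch (hypothetical names; real step-up schemes like 3-D Secure work differently):

```python
import hmac
import secrets
import time

def issue_challenge():
    """Generate a six-digit code to text to the cardholder, plus its issue time."""
    return f"{secrets.randbelow(10**6):06d}", time.time()

def verify_challenge(expected: str, issued_at: float,
                     submitted: str, ttl_seconds: int = 300) -> bool:
    """Approve only if the code matches and hasn't expired (five-minute window)."""
    if time.time() - issued_at > ttl_seconds:
        return False
    # Constant-time comparison avoids leaking how many digits matched.
    return hmac.compare_digest(expected, submitted)
```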

01:40:18   - So what do we think?

01:40:21   circle back to Amazon.

01:40:24   I think Amazon's gotta clean this mess up.

01:40:26   I think Amazon needs to,

01:40:29   I know that they're operating at a massive scale

01:40:33   and there's sort of an app store-like problem there

01:40:37   where if you've got all of these hundreds of thousands

01:40:39   of products from all these partners

01:40:40   that maybe they can't ever achieve perfection.

01:40:44   But at this point, it seems like nobody's even

01:40:46   watching the door, you know what I mean?

01:40:50   - Yeah, well, they could say there's like,

01:40:51   they could say there's a thousand major brands or 10,000 major brands,

01:40:54   or they could even partner with major brands who want to do this and say,

01:40:57   if your company,

01:40:58   if you're some no-name supplier who ships us a Cuisinart electric kettle,

01:41:02   we're not going to list it as if, we're not going to commingle the inventory,

01:41:05   because you're nobody, right? Or you have to prove your relationship,

01:41:07   or you have to show some provenance, or you have to prove yourself over time.

01:41:10   They give you a, you know, they give you some kind of, um, uh,

01:41:13   payment hold to sort of, do something to vet you before you're allowed to ship

01:41:17   product that would be commingled. They don't, as far as I can tell,

01:41:20   They do a little bit of something.

01:41:22   I know there's issues with how they hold payment

01:41:23   and so forth, but I don't think they have

01:41:25   any real processes in place because it hasn't hit them yet.

01:41:29   But if you have companies like Birkenstock

01:41:31   saying we're more willing to back out

01:41:35   of these relationships, like Birkenstock,

01:41:38   I don't know how many tens of millions of products

01:41:40   they sell online, dollars of products they sell online,

01:41:42   but it's gotta be something.

01:41:43   So them saying basically, they're gonna tell,

01:41:45   I mean, this was a leaked memo, so we don't know,

01:41:47   this is internal stuff,

01:41:48   it wasn't a Birkenstock announcement.

01:41:50   So if you have companies saying,

01:41:51   if you buy a product with our name on it from Amazon,

01:41:56   it is not authorized and it is likely counterfeit,

01:42:00   if it's being sold new, that is, I mean, that's--

01:42:03   - That is serious damage to Amazon's brand, right?

01:42:05   Because-- - Who wants that?

01:42:06   - Retail is largely about trust, in my opinion.

01:42:10   I mean, I guess for some people,

01:42:11   and maybe this is the way Amazon sees it,

01:42:13   maybe, I guess for some people,

01:42:14   retail is largely about price,

01:42:16   and it's all just cheap, cheap price.

01:42:18   And Walmart is sort of built on that.

01:42:20   But I think Walmart has, for people who like Walmart,

01:42:25   there's a certain trust, right?

01:42:29   I think part of it is that people trust

01:42:31   that the prices are gonna be low.

01:42:32   There's no, you know, we don't have,

01:42:34   don't even bother going around town and pricing--

01:42:37   - Exactly. - Pricing your dog food

01:42:39   at the supermarket too.

01:42:41   Just get it at Walmart.

01:42:42   You know it's gonna be as cheap or cheaper.

01:42:44   And I think people know that when you buy

01:42:48   whatever brand dog food at Walmart,

01:42:50   it really is whatever brand dog food.

01:42:53   You know, it's the, you know,

01:42:54   the Cuisinart thing you buy at Walmart is a Cuisinart.

01:42:57   I think it's serious, serious, serious damage

01:43:01   to Amazon's trust, that its reputation is turning.

01:43:04   It's starting to be like eBay,

01:43:06   where it's like, who the hell knows what you're gonna get?

01:43:08   - It's true.

01:43:09   I remember, by the way, Jeff Bezos said this.

01:43:11   I worked for Amazon briefly for like six months in '96, '97.

01:43:15   Was hired by Jeff who I knew

01:43:16   when the company was starting out.

01:43:17   So it was a little bit of a, you know, it wasn't nepotism per se.

01:43:19   I did a great job.

01:43:20   I did a great job.

01:43:21   Did a bunch of stuff.

01:43:22   There's still like, I instituted a bunch of programs.

01:43:24   I feel very happy with my time there.

01:43:25   Like what I got done.

01:43:26   But, uh, Jeff said at some meeting and I, I, 20 years ago, I remember it very distinctly.

01:43:30   He said, um, we're eventually going to become authoritative for price.

01:43:33   And what I mean by authoritative is not that we always have the best price, but that people don't think they need to go anywhere else.

01:43:38   They just assume we do.

01:43:39   Right.

01:43:39   And I was like, and that was, I was like, I thought it was like, huh, I wonder how that'll work out.

01:43:43   It's like, that's what's happening.

01:43:45   So, you know, I run this book price comparison

01:43:46   site called isbn.nu.

01:43:48   It's my ongoing programming experiment and

01:43:50   running a large, like a high traffic site.

01:43:52   It's millions of queries a day.

01:43:53   People just punch an ISBN or you search on a book

01:43:56   and it gives you price results from a dozen,

01:43:58   15 bookstores.
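A price-comparison page like the one Glenn describes is essentially a fan-out: query every store for the same ISBN, drop the stores that fail, and sort by price. A generic sketch of that shape (not isbn.nu's actual code; the store lookup functions are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, List, Optional, Tuple

def fetch_prices(isbn: str,
                 stores: List[Tuple[str, Callable[[str], float]]]
                 ) -> List[Tuple[str, float]]:
    """Query each store's price lookup in parallel, return results cheapest first."""
    def safe_lookup(store: Tuple[str, Callable[[str], float]]) -> Optional[Tuple[str, float]]:
        name, lookup = store
        try:
            return (name, lookup(isbn))
        except Exception:
            return None  # one store being down shouldn't break the page
    with ThreadPoolExecutor(max_workers=max(1, len(stores))) as pool:
        results = [r for r in pool.map(safe_lookup, stores) if r is not None]
    return sorted(results, key=lambda pair: pair[1])
```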

01:43:59   And what's fascinating to me is over time, the

01:44:01   revenue has gone down because people don't

01:44:02   price compare that much.

01:44:04   And most of the revenue used to be Amazon was,

01:44:06   I don't know, like 25% of the revenue.

01:44:08   Now it's like 75% because people come and

01:44:11   they search like, ah, let's get it at Amazon.

01:44:12   I mean, that's kind of the result.

01:44:14   And there was a point at which like most, I

01:44:17   think the majority of my money comes from people

01:44:19   doing textbook searches because then you have

01:44:21   more variety. And, I mean, there's textbooks that

01:44:24   sell, there's people buying textbooks that cost

01:44:26   $1,200, and I get 4 to 8% of that.

01:44:28   I'm like, hooray, but holy crap.

01:44:30   So, uh, so I get a lot of sales in, uh, you

01:44:33   know, in July, August and January, basically

01:44:35   it's kind of a funny pattern I have.

01:44:37   Um, but it's been interesting to watch it

01:44:39   because it's been this gradual change as, as

01:44:41   Amazon has just been assumed to have the best price

01:44:43   over other stores. Like, why would you... I bought a bunch of DVDs and Blu-rays from Barnes & Noble

01:44:48   a few weeks ago, because they had a ridiculous sale. They were doing 40 to 50 percent off everything

01:44:53   already, then they had a 30 percent off coupon you could add with it, and they had free shipping. So I

01:44:58   bought some things for 70 to 80 percent off list price. But I never buy

01:45:04   from bn.com. Um, let me take one final break here, thank our third and final sponsor of the show. It

01:45:10   is our old friends at Casper, a company that makes obsessively engineered mattresses at

01:45:15   shockingly fair prices. I have recently been on vacation. We stayed in two different hotels,

01:45:21   split a little trip, and one hotel had, in my opinion, a terrible mattress. I woke up

01:45:27   every day and I was a little miserable, and then we immediately spent the second half

01:45:31   of the trip in a different hotel, which had a terrific mattress. I thought of Casper and

01:45:37   And I thought, you know, this,

01:45:38   'cause a lot of times if you don't think about it,

01:45:40   you don't get to compare one day after another

01:45:42   what a difference a great mattress can make.

01:45:44   I am one of those people who says,

01:45:46   "Hey, you really do sleep a third of your life.

01:45:48   "It's worth getting a great mattress and a good bed."

01:45:52   You spend so much time in bed,

01:45:53   you probably spend more time in bed

01:45:55   than you do anywhere else.

01:45:57   Why not get a great one?

01:45:58   Well, Casper has created a terrific mattress.

01:46:01   They have an engineering team in-house

01:46:03   that spent thousands of hours

01:46:04   developing their Casper mattress.

01:46:06   It combines springy latex and supportive memory foam

01:46:09   for a sleep surface with just the right sink

01:46:12   and just the right bounce.

01:46:13   I love the fact that Casper just has one type of mattress.

01:46:18   All you do is pick your size.

01:46:19   You just pick a size and that's the mattress.

01:46:21   Because how the hell would you pick?

01:46:22   Like if they had like seven different types of mattresses,

01:46:25   how would you pick?

01:46:26   I don't know how to pick.

01:46:28   I just, I trust that somebody who becomes a mattress

01:46:31   engineer is gonna do the job for me, right?

01:46:34   It's sort of like Apple in that regard,

01:46:35   where they're going to just do it. That's what design is: design is making choices.

01:46:39   well Casper has figured out what they think is the best way to make a mattress

01:46:43   Now maybe you disagree. Maybe you get it, you don't like it. Well, guess what, they

01:46:47   have a hundred-night home trial. So if you buy the thing, take it up the steps,

01:46:52   and, they vacuum-seal these things into the most ridiculously little, you

01:46:56   cannot believe that there is like a queen- or king-size mattress in one of

01:46:59   these boxes. You take it up in a room, follow the directions, it sucks all the air

01:47:04   out of the room to fill it up. And all of a sudden, you've got this little box, now

01:47:07   you've got a queen- or king-size mattress, whatever you need. If you don't like it,

01:47:11   you have a hundred nights, and if you don't, you just go to the website and say,

01:47:15   I'd like my mattress taken back. They give you all your money back, and they take care

01:47:19   of the hassle of getting the mattress out of your house. That's how confident

01:47:24   they are, and how few people actually take them up on it. So if you have any

01:47:27   reluctance to buy a mattress online because you haven't actually sat there

01:47:31   in a gross showroom where other people have laid on the same mattress and poked at it

01:47:36   before, you don't have to worry about it. You can't lose. I have heard from readers,

01:47:41   a lot of readers. It's the craziest thing in the world to me that I have become like

01:47:45   a spokesperson for a mattress company. Among the many things I never thought I would ever

01:47:49   do in life is sell mattresses. But it's so funny to me, but readers write to me and say,

01:47:55   "You know, I just moved. I had to get a new mattress and I got one of the Casper things

01:47:59   and I, you know, expected that they would send it back or whatever." They're like, "This is great.

01:48:02   This is like staying in, like, a best hotel. It is a great mattress."

01:48:06   So get yours today. Go to casper.com/thetalkshow. Casper,

01:48:11   C-A-S-P-E-R, dot com slash the talk show. Use the code "the talk show," with the "the," and you save 50 bucks on your mattress. And the

01:48:19   prices are great. It's $750 for a full, $850 for a queen, $950 for a king. A king-size mattress for $950, you save $50,

01:48:26   it's only 900 bucks.

01:48:27   Most stores, that's like two grand.

01:48:29   Really is.

01:48:30   That's how expensive mattresses are.

01:48:32   So go to casper.com/thetalkshow

01:48:34   the next time you need a mattress.

01:48:36   And maybe consider the fact

01:48:38   that maybe you do need a mattress.

01:48:40   - I want someone to do that experiment

01:48:41   like with Mentos and Pepsi,

01:48:42   where they open a bunch of Casper mattresses in a room

01:48:45   and see if all the air is sucked out and the windows pop in.

01:48:47   Like now I wanna see that.

01:48:48   - Or if people can't breathe.

01:48:51   - Can't breathe, Casper.

01:48:54   Last thing on my agenda for this show was this issue where, uh,

01:48:58   this guy, Milo, uh, I don't know how to pronounce his name.

01:49:02   Yiannopoulos, I believe it is.

01:49:05   Yiannopoulos, um, sort of a, uh, uh,

01:49:08   conservative agitator slash political columnist. Uh,

01:49:12   I'm not quite sure how to describe him for people who aren't familiar with him.

01:49:15   He's the very, very, very successful troll.

01:49:21   very successful, charming in a way that only people with an English accent can ever get away

01:49:26   with while being—he's a younger Boris Johnson for people into politics. Younger, nastier Boris Johnson.

01:49:34   And this relates to what you were talking about before, where there is some subset of—he's

01:49:40   a participant in this new subculture called the alt-right, which is, you know, I don't want to

01:49:48   want to get too much into politics of it, but there is a subset of this movement and

01:49:54   of the internet at large that is, of all the things in the world to be upset about, very

01:50:02   downright angry that the Ghostbusters reboot features an all-women cast of Ghostbusters.

01:50:10   The people actually busting ghosts are four women, when in the original movie, there were four men,

01:50:19   and they are very upset about this. And I don't really know why. It seems very strange to me.

01:50:25   To me, this is a sign that these people have some very...

01:50:31   I think if you're upset that Ghostbusters is now all women, it is a very good sign that you have

01:50:37   some very significant issues with women. Well, this guy, I think I'm being fair here. Somehow,

01:50:47   a week ago, Leslie Jones, also speaking of very talented current SNL cast members.

01:50:53   I have to interrupt you. I love, more than Kate McKinnon, I love Leslie Jones so much, so much,

01:51:00   because she is the kind of comedian you do not see on television. Not just that she's black,

01:51:04   But she's like a statuesque, she's like, I don't know, she's six something.

01:51:08   She's six feet tall.

01:51:09   But she is not a traditional, you know, figure.

01:51:12   She is, she is this large, beautiful, totally outspoken woman with spiky hair who has this

01:51:19   like incredible voice and she has this like intensity that like John Cleese at his best,

01:51:25   when he was like angry John Cleese in the early Monty Pythons and could go red-faced.

01:51:28   Like she has this incredible energy and I love her and I love that she is on SNL.

01:51:34   now. I love that she's in this film. Yeah, I think she's really, I like, she's really great on SNL too.

01:51:40   I love, again, like you said, she's definitely not the sort of comedian that you typically see.

01:51:44   I can't think of anybody else to even compare her to. No, there's no, I mean, she's, yeah,

01:51:48   she's kind of her own. I mean, I don't go to clubs, but she seems to be like her own,

01:51:52   like, unforg, un, like, um, she does not change herself for anybody else. And she got her place

01:51:58   in this show and is doing her thing. Absolutely. Uh, she's a good writer too. She's not just a

01:52:03   a good performer because I, you know, she writes like her own bits when she, my favorites are when

01:52:07   she's on the weekend update desk. And that's really just, you know, those are her bits.

01:52:13   Well, anyway, she's in the Ghostbusters, she's busting ghosts and was on Twitter engaging,

01:52:19   you know, with the fans and somehow just started getting a just a steaming, ceaseless barrage of

01:52:30   at replies and mentions, racist, misogynist, I don't even know, I mean there's got to be even

01:52:37   more things, more offenses, but really some just sick, sick stuff from these gamergate troll types.

01:52:45   And she, you know, engaged with them. She started retweeting some of them, and this Milo guy jumped

01:52:54   in on it, and including, and to me the one I think probably put him over the top, was the one where

01:53:00   where he posted a fake screenshot that made it look as though Leslie Jones herself called

01:53:06   somebody a kike.

01:53:09   And she didn't.

01:53:10   She, I think there's one where she, the manufactured screenshot, which had more than 140 characters,

01:53:16   by the way, which was part of the tell, but who's counting, was accusing him, was like calling,

01:53:21   you know, he's he's openly gay.

01:53:22   He makes a big point of being conservative and very far right and gay and he's just at

01:53:26   the Republican National Convention hosting a party.

01:53:29   very into that, right? So he, the screen capture had something that was essentially making fun of

01:53:34   him for being gay. I forget the exact detail. Uh, so that was part of it too. Like he was saying,

01:53:38   look, I'm being attacked by her. Right. When in fact she's not. And these fake screenshots

01:53:44   were not, uh, there's no way that you could, they're not parody. It's not like when our pal

01:53:50   Darth, you know, posts a picture that makes Trump's hand seem like a Barbie doll size.

01:53:55   You know what I mean? Like, you know, like that's parody. And nobody is, you know,

01:54:01   it's nobody's fool. This is a deliberate attempt to turn his followers, to get them

01:54:07   to actually believe that she was committing, you know, these acts of hateful tweets as well,

01:54:13   which would encourage them to, "Well, hey, if she's going to do that, let's take the gloves off,

01:54:17   you know, now fair is fair. Now we can, you know, go racist and misogynist on her."

01:54:24   And so what happened is this guy, and this guy's been in trouble with Twitter before,

01:54:28   where people have reported him for this sort of abuse and harassment, leading a cadre of

01:54:34   harassers. Before, at one point he was verified. You know, I think it was part of the whole thing.

01:54:40   He is a journalist. He works, you know, he's worked for legitimate publications before. I think he's

01:54:44   the tech correspondent for Breitbart now. If you don't know Breitbart, it's a, well, I think it'll

01:54:52   say everything you need to know is it's a conservative leaning website that was very

01:54:59   early on the pro-Trump. Well, yeah, it's the current state. It's also, if you haven't looked

01:55:04   at Breitbart in a while, I'm not sure you should, but it was founded by Andrew Breitbart, who's like,

01:55:08   was our age and then died. And I know people who liked him, actually like, like friend to friend

01:55:12   to friend liked him very much personally. His site was, oh, he tried to, he'd helped co-start

01:55:17   Huffington Post and then left and started this far-right thing. But it wasn't, it didn't involve,

01:55:22   like, white supremacy and whatever.

01:55:24   It was pretty far right.

01:55:26   And he's the one who found, Eliot Spitzer's, no,

01:55:28   Anthony Weiner's weiner pic,

01:55:30   didn't he have the shot of that, I think, revealed?

01:55:32   - Maybe, I don't know.

01:55:33   - Anyway, so Breitbart now, if you remember

01:55:36   what the site was like when Andrew Breitbart

01:55:38   were live and ran it, this is like something

01:55:40   even so far beyond that, it makes me look back fondly

01:55:43   at the time that Andrew Breitbart ran the site.

01:55:46   - Yeah, when Breitbart ran it,

01:55:48   I was not a regular reader of it,

01:55:49   but I was familiar with it.

01:55:52   It was like something I didn't agree with, but not something that I found offensive,

01:55:57   whereas now, to me, it is borderline offensive.

01:56:00   To me, it comes about as close as you possibly can in today's...

01:56:05   Even if you're anti-quote-unquote "anti-politically correct,"

01:56:10   there are still modicums of discourse that we all agree to.

01:56:16   It comes about as close as you can in the guidelines of modern discourse

01:56:20   to being like white supremacist in my opinion.

01:56:23   Yeah, it crosses into, it almost, I mean, this is, because this is Milo's shtick without

01:56:29   even getting into the political aspect of it, because there are left, left-side extremists

01:56:34   who do the same thing. And we saw a lot of times the left does not get the same criticism

01:56:37   as the right. The right seems to get a lot more coverage when they say extreme things

01:56:41   and they verge into, you know, nativism and white supremacy and things, and you know,

01:56:46   anti-Semitism and so forth. The left has, unfortunately, the far left has an anti-Semitism

01:56:50   and other kinds of extremism and the whole, uh,

01:56:54   Bernie period revealed, unfortunately, that a

01:56:56   subset, not all, not all Bernie Bros, not all

01:56:59   Bernie supporters, but a subset of, uh, Sanders,

01:57:02   the people who alleged to be, may have been good

01:57:05   trolls, may have been deep supporters also

01:57:07   engaged in some pretty severe activity.

01:57:09   And it's they, but they always try, especially,

01:57:12   I mean, Breitbart does a great job of this.

01:57:13   Milo, especially they're trying to come up to the

01:57:15   line of hate speech without crossing to it,

01:57:18   where they get into something that's, that's

01:57:19   actionable, where they could get sued and lose,

01:57:22   not just get sued, but sued and clearly lose,

01:57:24   or in which they would cross some line in which

01:57:27   some aspect of decency would involve, maybe,

01:57:30   a criminal statute. Like, hate speech is not

01:57:33   protected in all cases.

01:57:34   You know, all speech is not absolutely protected

01:57:35   in the United States.

01:57:36   There's been a lot of trials about it, but

01:57:38   certain kinds of hate speech used for incitement,

01:57:40   especially if you're publishing a website in

01:57:42   multiple countries.

01:57:43   Uh, I don't know, I can't remember if Milo

01:57:45   lives in the US or UK or he's back and forth, but,

01:57:47   like, you know, he's a, he's a,

01:57:49   like, he could wind up being, you know, prosecuted

01:57:52   if he crossed certain lines outside the US that he wouldn't here.

01:57:56   Anyway, so like they, but they are knowingly skating up

01:57:59   to the precipice and skating back,

01:58:01   where they will get as far as they can

01:58:02   with the skates hanging over the cliff,

01:58:04   you know, before they skate back.

01:58:07   - Yeah, and I don't think it's any coincidence.

01:58:11   I mean, part of it is that Leslie Jones

01:58:13   was engaging on Twitter,

01:58:15   but I don't think it's any coincidence

01:58:17   that of the four Ghostbusters, they went after the one who's not just a woman, but is also

01:58:23   black. This guy's been in trouble with Twitter before. At one point he was verified. That's

01:58:29   why I mentioned he's a journalist. They removed his verified badge, which is a very odd thing.

01:58:34   I thought that was very unusual. Like I could see why Twitter wants to deal with this guy,

01:58:38   but removing his verified badge was a very strange move to me because I didn't like it.

01:58:42   I didn't like it either because it, it, it plays into this whole notion that having the

01:58:46   verified badge is this mark of prestige, which to me is like nonsense. I don't know. I mean,

01:58:52   I got verified. I didn't ask for it. It was like, somehow what happened is when remember

01:58:57   when Matt Honan got hacked? Oh, yeah, yeah. So Matt Honan, who's now at BuzzFeed got hacked

01:59:05   like two years ago, and he wrote a great story about it. And it was somehow that was like,

01:59:09   you know, it was sort of social engineering where his attacker called Apple and said,

01:59:15   you know, claim to be him and somehow got through a couple of the questions and got

01:59:19   his, you know, Mac.com, his iCloud account, uh, uh, reset. And then once they had his

01:59:27   email account, that was the email account used by his Twitter account. And so his @mat,

01:59:31   you know, it seemed like the target of it was that, because he has this very

01:59:36   short, uh, Twitter handle, @mat. Isn't that his Twitter handle? I think it used to

01:59:41   be.

01:59:42   Yes, that's right. It's super short. They really wanted that. And I have a friend who

01:59:45   has a very short handle, apparently it's part of her name,

01:59:49   but it's four letters and it relates to some programming thing.

01:59:51   So she is regularly harassed by script kiddies, and she knows how to get someone

01:59:55   permanently suspended because of their, uh, doxing attempts.

01:59:58   Yeah. Uh, or yeah,

02:00:00   it's a weird thing where like certain handles are so in demand.

02:00:04   Jessie Char is @jessie on Instagram. Every photo she posts,

02:00:09   there's some girl named Jessie, you know, not the same person, but every time it's,

02:00:12   can I have your Twitter handle?

02:00:14   There's, you know, Dave Rutledge, who works at Meh, our friends at Meh, which I've been

02:00:19   writing for, by the way, which is a fun place to write for.

02:00:21   He's at, he's @_.

02:00:22   Yeah, and his wife is @__, and their child is @___.

02:00:26   Well, his wife's @__ got ripped off, got hijacked somehow, and he eventually, he kind of was

02:00:32   trying to, they were trying to go through channels, they eventually kind of bumped up

02:00:35   like, "Anybody help us?"

02:00:36   And somebody at Twitter was like, "Got your back," and took care of it, but she almost

02:00:39   lost @__.

02:00:40   Ugh.

02:00:41   Well, anyway.

02:00:42   It was terrible.

02:00:43   which is weird because, to me, that isn't like, we're punishing you. It's like,

02:00:48   to me that would be like, we're no longer certain that this account is you.

02:00:52   But they knew it was him. There's no question that it was him. So removing the verified badge

02:00:56   is very strange. It's like,

02:00:58   Well, you know, but the reason they do it though, I think the reason they did it and I, and I have

02:01:02   the same reaction you do too. It's like Twitter shouldn't be anointing people. They originally

02:01:06   started the verified program to help celebrities and some news outlets have a legitimate

02:01:12   account that showed it wasn't being impersonated, that they'd vetted it, right?

02:01:15   And they used to have requirements that you had to have two-factor authentication or some

02:01:19   other protection on your email, or they would ask, and so they vetted entire newsrooms where

02:01:23   the newsroom had shown them or discussed their techniques to prevent email from being hijacked

02:01:28   so that they wouldn't be overtaken.

02:01:30   Especially after things like the AP hijack; a bunch of news outlets got hijacked a few

02:01:33   years ago.

02:01:34   So the reason I think they did it is, if you're verified,

02:01:39   you know, they have tools so that you

02:01:41   see only verified accounts.

02:01:42   So if you're looking at replies, I think this

02:01:44   was to get him to not show up in the replies of

02:01:48   other people who are verified, like celebrities and

02:01:50   whoever. Yeah, I know.

02:01:52   Cause I, you know, I tweeted, so, um, Chrissy

02:01:54   Teigen, uh, the supermodel, I think she is also

02:01:58   awesome.

02:01:59   Like I don't even know her supermodeling career.

02:02:00   Like I've seen pictures of her.

02:02:01   She's, like most supermodels, one of the most

02:02:04   beautiful women on the planet.

02:02:05   Great.

02:02:05   Right.

02:02:05   But I don't, it's fine.

02:02:07   It's beautiful.

02:02:07   Um, Sports Illustrated model, right?

02:02:10   Cover model.

02:02:10   But I didn't follow her because of that.

02:02:12   At some point I noticed people retweeting these

02:02:15   incredibly funny, really direct stuff she's saying

02:02:17   about, you know, not just like feminism or politics,

02:02:20   whatever, but the way in which she has to defend

02:02:22   herself, she's married to John Legend, about people

02:02:25   telling her she doesn't deserve him and he's been

02:02:26   duped and whatever.

02:02:27   She is awesome.

02:02:28   She is so forthright and so great.

02:02:30   She has a cookbook out.

02:02:31   So she tweets something last night.

02:02:34   I wrote something back to her,

02:02:36   just some little thing in passing. I don't think she's going to read it. She favorited it. And I'm

02:02:40   like, "Good God, just, this thing has 4,000 favorites on it. Why would she do that?" I'm

02:02:44   like, "Oh, she's using this filtering. I have a verified tag." So, there is a little bit of a

02:02:49   superpower that is associated with it, is you have more visibility to people who are verified.

02:02:53   So, I'm, you know, some random journalist who works in a daylit basement.

02:02:58   But because of that, she saw it. That, I think, is the power of that blue checkmark.

02:03:04   Yeah, well what happened for people like me and you is after Honan got hacked,

02:03:09   and I might be misremembering the details, but I think it was very clear

02:03:13   it was about his @mat Twitter handle. I think you're totally right. But it was early, and Twitter,

02:03:19   I think as a precaution, thought, well,

02:03:21   maybe it's because he's a tech

02:03:23   reporter. And all of a sudden, in a very short period of time, I think me and you probably got verified right around the

02:03:28   same time. A lot of people at Macworld did.

02:03:31   I had to wait a bit. I actually had to ask them, for a few years. Well, the thing was,

02:03:35   I felt there was this developing thing where a lot of journalists were getting verified, and I'm like, look, I'm a freelancer.

02:03:40   Not having a blue check mark makes me look like I'm not legitimate on Twitter. And I'm like, I don't care either way.

02:03:46   But if you're going to have a system, I want to be in it. And likewise, like, I can direct message people who don't have DMs open, and they can DM me even if I have DMs off, because we're verified.

02:03:57   Like, that's a funny thing. There's a couple little things that are useful.

02:04:01   It's a very strange club.

02:04:03   Yeah, it's a weird club and it's like it's got supermodels,

02:04:06   the President of the United States and you and me. It's great. I love being in this club.

02:04:08   I've told this before and it is incredibly embarrassing, but it's,

02:04:13   uh, my son told his friends at school that I'm verified, and all the girls in the class thought it was the coolest thing.

02:04:20   Oh my God. That's awesome.

02:04:24   They were like, no way. They didn't believe him. They didn't believe him.

02:04:26   So like they, they opened up a Chromebook and loaded my Twitter page and they were like,

02:04:32   that's so cool.

02:04:33   Oh my gosh, that is awesome.

02:04:35   That is terrible.

02:04:36   I think it's just dreadful.

02:04:38   But anyway, Milo Yiannopoulos, they took away his badge.

02:04:43   They had suspended him temporarily for similar incidents in the past and he'd always come

02:04:48   back.

02:04:49   And after this one, right before he was supposed to go into some kind of event in Cleveland

02:04:54   for the Republican National Convention, he received an email that said, "This is it.

02:05:00   Your account is permanently suspended."

02:05:03   One of the weird things about that—I think they did the right thing.

02:05:06   I think this guy was abusing Twitter, and I don't think Twitter should tolerate it.

02:05:11   But it is weird.

02:05:14   There's a weird down-the-memory-hole aspect to it, where once his account is suspended,

02:05:18   all of his tweets are gone.

02:05:19   - Yeah, I know, that's, you know,

02:05:21   so when they do temporary suspensions,

02:05:23   they will often make people agree to delete specific tweets

02:05:27   through an automated process

02:05:28   before they're allowed their account back.

02:05:30   So only those tweets are deleted?

02:05:32   Like, however odious something is,

02:05:33   I'm like, well, this is deleting history.

02:05:35   Now no one knows that he said

02:05:37   these hundred thousand terrible things.

02:05:39   - Yeah, and that was part of it.

02:05:40   And there were, you know,

02:05:41   I wrote briefly about this on "Daring Fireball,"

02:05:43   I support this.

02:05:44   And my take is that his supporters say,

02:05:47   and I've seen this argument with other people,

02:05:49   this is not the only time, but when something like this happens and after instigating this

02:05:54   sort of abuse, they say, well, now that they've suspended his account, obviously

02:06:01   Twitter is a company that censors conservatives and suppresses free speech.

02:06:07   As though the right to harass people on Twitter, and I don't think there can be any argument

02:06:13   that what was done to Leslie Jones was outright harassment.

02:06:16   She seemed genuinely emotionally distressed at what she was seeing and dealing with.

02:06:24   The argument that that should be protected free speech is just nonsense, and it just

02:06:28   shows that these people are, in my opinion, not just racist and misogynist, but that they're

02:06:36   like sociopaths, that they're so emotionally stunted, that

02:06:44   it's very hard. You really can't reason with these

02:06:49   people. And there were other readers. I got some feedback from people

02:06:53   who read Daring Fireball. Very thoughtful. And some people I

02:06:57   know, just random readers, who said, "I'm very

02:07:01   uncomfortable with this. I'm not racist or misogynist.

02:07:04   These weren't people who were Milo Yiannopoulos fans, but they were like, "I'm just very

02:07:08   uncomfortable with Twitter saying that somebody, you know, this is allowed and this isn't.

02:07:16   There should be a free-for-all."

02:07:17   But you can't have a free-for-all.

02:07:18   You just—well, you could.

02:07:20   You could run a service that's a free-for-all, but it's not going to be a pleasant place.

02:07:24   Yeah, among other things.

02:07:25   Twitter has rules of engagement, has terms of service, and he was violating them.

02:07:31   Like, the reason they got criticism about this is

02:07:34   a famous, prominent person

02:07:36   who's already under attack as part of like a

02:07:38   cultural war between, you know, people who may

02:07:41   align with Trump and people who may align with

02:07:43   progressive movements.

02:07:44   Uh, so, and I actually saw Leslie Jones

02:07:47   interviewed by her friend Seth Meyers on his

02:07:49   show, and it was lovely.

02:07:51   And she was talking about this whole situation.

02:07:53   Oh really?

02:07:53   I didn't see that.

02:07:54   Yeah, it was like a four- or five-minute clip.

02:07:56   And one of the things they showed was a bunch of

02:07:57   people who'd sent in to Seth Meyers

02:07:59   these little videos of just, you know,

02:08:02   you're awesome, Leslie, you're so inspirational.

02:08:03   It's like this like little girl and an adult

02:08:05   couple and just like all kinds of people.

02:08:07   It was great.

02:08:07   And she was practically crying cause it was just,

02:08:09   you know, people being so nice about it.

02:08:11   But, um, Seth asked a very good question.

02:08:16   He was like, should this rise to the level?

02:08:18   Like, you and I are kind of well known, like we

02:08:20   have a lot of, you know, whatever, a lot of

02:08:21   followers. And she said, you know,

02:08:23   it should be

02:08:25   for everybody.

02:08:25   No one should go.

02:08:26   Like, she said, if I never

02:08:28   spoke up about this, no one would ever have known it happened to me. But I made a fuss,

02:08:33   right? And if she'd never talked back to people, it still would have affected her,

02:08:38   because it's asymmetric. And I think that's, that's the issue. The question I would say

02:08:42   too, when you look at Milo, like, he's a provocateur. That's his stock-in-trade.

02:08:46   He talks about being a provocateur. He wants to get a reaction, and he's very good at

02:08:50   it. Now, it's unfortunate that what he wants, that he's not, you know, in my mind, funny

02:08:54   or interesting or decent or whatever. I think he's off the charts in terms of, you know,

02:08:59   being practically a sociopath in the way that he acts. He acts consistently in his self-interest

02:09:03   without regard for any standards of morality or decency. And that's, you know, it doesn't matter,

02:09:07   I don't care if he's left or right. I'm not even talking about any political view he has,

02:09:11   it's his behavior. And it's not political correctness when you're specifically trying

02:09:16   to say things that you know will cause, like, political correctness is when you are

02:09:21   told not to say something that is a reasonable statement that is not designed specifically

02:09:26   to harm someone and is actually part of social discourse that needs to occur to improve the

02:09:31   social good. That is the political correctness I've encountered. There are a lot of things, as somebody

02:09:35   who's, you know, left of center, maybe a liberal, lifelong Democratic voter. There are things that

02:09:40   I do not feel comfortable discussing in public because I know I can't discuss in a way without

02:09:45   having a blowback that would be pretty severe, even though I have no bad intent and I want to

02:09:50   talk through an issue as opposed to make statements,

02:09:52   right? So there is, there is a chilling effect

02:09:55   in certain areas that I would call that because

02:09:56   you can't even bring it up. Here's an easy

02:09:59   case, which I can talk about. It's like, look

02:10:00   at Israel, support of Israel. I'm a Jew. My

02:10:03   whole family, I'm married to someone who's not

02:10:05   Jewish, but my whole family back to whatever

02:10:08   is Jewish. I'm very uncomfortable with Israel.

02:10:10   I think they're engaged. I don't, I'm not going

02:10:12   to get into the political stuff, but like I'm

02:10:13   very uncomfortable with Israel. Let's leave it

02:10:14   there, right? It is very difficult to have any

02:10:17   sane and sensible conversation about

02:10:18   Palestinians, Arabs, and Israel with any combination of people, other Jews, non-Jews,

02:10:24   Muslims, whatever, no two people can get together and talk without somebody chiding you for some

02:10:30   opinion about it no matter what your stance is. And there is, that's sort of a problem with race

02:10:35   in America too. It's very difficult to have a discussion because no two people can agree how

02:10:39   to talk about it without essentially accusing each other of engaging in something. That is a form of

02:10:44   Going out of your way to specifically, knowingly inflict emotional harm or inspire threats

02:10:50   against somebody, even when you do it in a way that's plausibly deniable, that is not convincing

02:10:55   at all, there's no question what that is. So, if Milo were super left-wing and thought it was funny

02:11:01   to throw anti-Semitic things at a Jewish actress or something, and I'm not saying

02:11:08   that the left is all anti-Semitic, but it's the closest, you're not going to have left-wing people

02:11:11   go after black people typically, but you know,

02:11:13   that happens too.

02:11:13   You have issues with intersectional

02:11:14   feminism, blah, blah, blah.

02:11:16   I won't go into that.

02:11:16   But anyway, so it doesn't, I don't think it's a

02:11:18   political thing.

02:11:18   He's not raising a conservative point

02:11:20   against Leslie.

02:11:22   He's not raising an issue even about the movie.

02:11:24   Like he has a cultural, the whole gamer gate

02:11:26   movement, alt-right merged together with it.

02:11:29   And even the Ghostbusters thing has to do with

02:11:31   people who were in

02:11:34   what they thought was a majority situation,

02:11:35   never felt the benefits of privilege and now reject

02:11:40   entirely the notion that they have any privilege.

02:11:42   But so there's a structural cultural argument there

02:11:45   that he makes, the basis of which is that Ghostbusters

02:11:48   is a terrible thing because of,

02:11:49   but that is not a conservative viewpoint.

02:11:51   - But he's parlaying off his followers',

02:11:56   and I think this is the point you're trying to make,

02:11:58   actually well-founded concern that with political correctness

02:12:01   the dial right now is set too far,

02:12:04   and that there are things we should be able to have a discussion

02:12:07   about but feel like we can't.

02:12:09   It's funny, but whenever political correctness is raised, it's always about issues where I'm like,

02:12:12   that's not political correctness. That's someone wanting to say something offensive

02:12:15   and not liking the consequences, as opposed to a valid, and when I say valid, everyone will define

02:12:20   valid differently. This is the thing I want to get at with speech too: I wrote this long

02:12:23   screed a few days ago. I wrote like 40 tweets in a tweet storm. I thought I was going to write two.

02:12:27   You? Really?

02:12:28   I know, it's crazy, right? And I got a remarkable amount of response,

02:12:32   but I thought people wouldn't listen. I just had to say it, because it was after the Leslie Jones thing.

02:12:35   And it's like, I think there's a clear difference. I don't study the First Amendment,

02:12:38   so I can't tell you where this sits in law. There are probably people who have this more defined.

02:12:42   There's offensive speech, and then there's like abusive, hate-threatening speech, right? And

02:12:47   hate speech is a difficult thing under the First Amendment because the First Amendment is so broad.

02:12:51   But I think Twitter should not encourage offensive speech, but I think it should allow it. And when I

02:12:56   say offensive, it's things you don't want to hear are offensive. And sometimes those things are very

02:13:00   offensive. If somebody wants to, on their own account, not @-ing me, and not talking about me,

02:13:06   let's say the best case.

02:13:07   Say there are Nazis who want to say,

02:13:09   Jews are terrible.

02:13:10   I think they should all go to a gas oven.

02:13:12   I wish this would happen.

02:13:13   I hate the fact that the world is run by Jews.

02:13:15   If somebody wants to say that, well,

02:13:17   that's very general.

02:13:18   It's awful.

02:13:18   It's offensive.

02:13:19   I don't want to hear it.

02:13:20   I don't have to follow them.

02:13:21   If rather they're using their account to

02:13:24   exchange information, to create, you know, to

02:13:26   organize around notions of hate, to create,

02:13:28   um, not even to create policies, but to, uh, to

02:13:32   threaten other people, to gain strength that

02:13:34   allows them to then practice hate speech

02:13:36   against other people.

02:13:37   Um, or if they're using their Twitter accounts,

02:13:39   even as like coded ways to organize or promote

02:13:42   things that they link to on websites, or that everyone

02:13:45   knows about on a website.

02:13:46   And there's some question with Milo about like what

02:13:48   he's posting on Breitbart versus what he's doing

02:13:49   on Twitter.

02:13:50   And he may be even more careful on Twitter to not

02:13:52   be as, uh, you know, provocative in some ways

02:13:55   as he was on Breitbart. Even, like, that's a

02:13:57   different thing.

02:13:58   So like, I don't, so even if nothing Milo said

02:14:01   specifically, individually, if it weren't sent

02:14:05   to Leslie, was offensive, like vile,

02:14:10   but not actually abusive,

02:14:12   then maybe there's a case to be made about that person,

02:14:14   and there's some gray area there

02:14:16   about saying things you don't wanna hear.

02:14:18   'Cause John, there's tons of things that people say

02:14:19   that you don't wanna hear

02:14:20   that are valid political opinions

02:14:22   that you wouldn't say that person should be banned

02:14:24   unless you actually wanted to decrease free speech.

02:14:27   - Right. - It's a very

02:14:28   different thing.

02:14:29   - And you know, it's a funny thing.

02:14:30   It just isn't, a subset of these people

02:14:34   seem to really think that free speech is,

02:14:37   that it's absolute, that they should be able

02:14:42   to do whatever they want.

02:14:43   Say whatever they want to whoever they want

02:14:45   and if they wanna @ reply Glenn

02:14:47   that you should be put in an oven,

02:14:49   they should be allowed to.

02:14:50   And you should figure out how to filter it

02:14:53   if you don't wanna see it.

02:14:53   - Right. - That it's your problem.

02:14:55   And I say that's hogwash.

02:14:56   That's absolute nonsense.

02:14:59   And I know that these analogies

02:15:02   from between the online world and the real world

02:15:04   always break down.

02:15:05   There's always, 'cause you know,

02:15:07   online is so totally different.

02:15:09   But like my analogy was that nobody would ever

02:15:12   tolerate a restaurant, no sane restaurant

02:15:15   would ever tolerate allowing a person to come in

02:15:18   and harass fellow patrons and just go up and

02:15:21   make disparaging remarks about the color of their skin

02:15:27   or that they seem to be same sex couple

02:15:32   or whatever, for anything.

02:15:35   Why would you allow that?

02:15:36   It would be toxic.

02:15:38   And that's exactly what these storms of people

02:15:42   harassing people on Twitter are all about.

02:15:44   - No, that's what Leslie Jones said.

02:15:46   She said exactly the same thing: if I'm in a restaurant,

02:15:48   - Maybe she read my-- - I'm in a restaurant,

02:15:51   and a couple people are shot in front of me.

02:15:54   That's the restaurant's problem.

02:15:55   That's not my problem.

02:15:56   I wanna eat in the restaurant.

02:15:57   I like the restaurant, I wanna come back.

02:15:59   But if people are gonna be shot there,

02:16:00   I'm not going to go there.

02:16:02   I think there's also this—free speech is not the right to be heard. It's the right to talk.

02:16:09   And it is actually hard to behave antisocially

02:16:14   in the real world, especially

02:16:19   the older you get.

02:16:21   It's, you know, kindergarten,

02:16:23   there's a lot of antisocial behavior.

02:16:25   And as you work your way through high school,

02:16:27   it decreases, but there's, you know,

02:16:29   a lot of people have memories

02:16:30   of a lot of antisocial tormentors in, you know,

02:16:33   middle school and high school.

02:16:35   And then all of a sudden it just magically goes away

02:16:38   when you go to college.

02:16:39   and in the real world, in most venues,

02:16:41   it is extremely difficult to behave antisocially.

02:16:44   There's all sorts of social pressure

02:16:48   and there's physical pressure.

02:16:50   Like if you behave the way some of these people

02:16:51   behave on Twitter in the real world,

02:16:53   there's a very good chance that you'd get punched.

02:16:55   I mean it.

02:16:56   - Oh yeah, yeah. - You would actually

02:16:57   get punched.

02:16:58   So for example, here's an example.

02:17:01   You and I both know an awful lot of,

02:17:04   we've seen it in the last year or two.

02:17:06   We've seen women who get, you know,

02:17:09   stand up and speak out on something that upsets these people

02:17:14   and then they get literally told,

02:17:17   literally threatened with rape or death in a tweet.

02:17:20   At, at, insert, you know, the username.

02:17:23   You know, I know where you live, you're gonna get raped.

02:17:27   If you did that to somebody in real life,

02:17:30   there's a very good chance,

02:17:31   well, you should be, like, arrested.

02:17:34   But if somebody did that to my wife,

02:17:37   I would be prepared to punch the guy.

02:17:40   Because that's a dangerous situation.

02:17:42   People don't do that.

02:17:43   I mean, it just doesn't happen.

02:17:45   So for these people who, for whatever reason,

02:17:48   in their minds, want to behave antisocially,

02:17:51   the online world, and we've known this,

02:17:54   like you said, you've been on CompuServe since 1968.

02:17:57   I mean, the online world has seen this

02:17:58   from the beginning days, that there's antisocial behavior

02:18:01   that you just don't encounter in the real world

02:18:03   because there are no repercussions.

02:18:07   You can get away with it.

02:18:08   You can do it.

02:18:09   So people have gotten into these hordes

02:18:13   that harass people, and they seem to enjoy doing it.

02:18:17   And what they're being told now,

02:18:19   like with this Milo getting kicked off Twitter,

02:18:21   is no, you can't do it.

02:18:22   Well, if they can't do it, then all of a sudden

02:18:25   they don't have an outlet for this antisocial behavior.

02:18:29   And part of it, too, is that you can go off into,

02:18:33   what is it, 4chan or whatever, and just hang out,

02:18:35   hang out with your fellow, you know, sickos,

02:18:39   and do this.

02:18:41   The beauty of Twitter from these weirdos' perspective

02:18:45   is that they can at reply these women and minorities

02:18:49   and get their jollies by knowing that the people

02:18:54   who check their replies are gonna see it.

02:18:56   And then when they engage, it's like they just go bananas

02:19:00   because it's like they know, then it's confirmation.

02:19:02   Like once they knew that Leslie Jones was actually seeing this stuff and reacting to it,

02:19:06   it just made it worse because this is exactly what they want to do.

02:19:09   They want to behave in an extremely, almost sociopathic way,

02:19:14   and they can, and there's no other outlet for it in the world.

02:19:17   And so now that Twitter is standing, you know, so they're standing behind free speech,

02:19:20   but what they really want is they want their special bullying venue

02:19:25   to remain available.

02:19:26   Special bullying venue is probably not what Twitter wants as its motto.

02:19:31   It definitely shouldn't be, but that's the way these guys see Twitter.

02:19:34   That is exactly the way they see Twitter.

02:19:36   There's a high degree of asymmetry in Twitter that could be changed.

02:19:39   And, uh, Randi Harper, you know, has been working on tools. I use her, uh,

02:19:44   what's called the Good Game block list, GGAutoBlocker, um,

02:19:48   with Block Together as the tool. And so you subscribe through Block Together.

02:19:51   There's a web app with a Twitter API integration.

02:19:54   You log into it and then you can add shared lists that you and other

02:19:58   people maintain, and you can set a few throttles and things.

02:20:00   So, block together exists, it's a third party thing,

02:20:02   Twitter allows it, and there are like 12,000 people

02:20:05   on Randi's list that are auto-generated

02:20:07   from people who follow a few major accounts,

02:20:09   which will change, 'cause one of those few major accounts

02:20:11   was Milo's, which was @Nero.

02:20:14   And oh, Twitter, by the way, when it

02:20:16   permanently suspends an account,

02:20:17   that account is dead forever.

02:20:19   @Nero will never be used again.

02:20:20   - Right. - You know,

02:20:21   'til the sun burns out, potentially.

02:20:23   So, anyway, Randi Harper wrote this thing,

02:20:26   someone just asked me about it the other day,

02:20:28   in February. It's a Medium piece.

02:20:29   You can search for Randi Harper, Medium, Twitter ideas, and, uh, find it in the show notes.

02:20:33   I'll put it in, I don't know, whatever, but she had like 23 ideas for Twitter.

02:20:36   She's a developer.

02:20:37   She's been working with their API.

02:20:39   She's met with them.

02:20:40   She knows how everything works on, you know, the outside of the black box.

02:20:44   She knows it as well as anybody trying to develop anti-abuse tools. The stuff she mentioned, some of it's trivial.

02:20:50   Some of it's harder.

02:20:51   She's not sure of the amount of work required for all of it, but those things are still out there.

02:20:57   And so the issue with Twitter isn't so much that people can gang together and abuse; it's that, for you as the party receiving it, it is bad at asymmetrical warfare, where someone can either create 1,000 accounts and tweet at you, or get 100,000 people. Like Twitchy, there's a conservative site, and they will often say, basically, that person is bad, and then all the people who follow Twitchy or go to the website will go off. Say I respond to something Sally Kohn says, you know, I have a little back and forth with her about something,

02:21:27   or whatever, I think I'm being funny.

02:21:28   And I will suddenly get all these people who hate-follow

02:21:30   Sally Kohn.

02:21:31   I've never been targeted by Twitchy yet, as

02:21:33   far as I know.

02:21:33   Uh, and, uh, I get like, you know, a hundred or

02:21:36   200 tweets from people who don't spell and have

02:21:39   flags in their bios and just like, you're

02:21:40   before, blah, blah, blah, blah, blah, blah.

02:21:42   You know, it's like, it's not even intelligible.

02:21:44   It's just like random anger.

02:21:46   There's nothing I can do to stop it except

02:21:48   blocking one at a time, and even the auto block

02:21:50   lists don't help me.

02:21:51   So if there were tools, like if you started

02:21:53   getting a lot of tweets, you could like, if there

02:21:55   were dials and tools, you could just, you know,

02:21:57   there are things that could happen that would,

02:21:58   either temporarily or as a permanent

02:22:02   block, prevent it.

02:22:04   The coarse tool that Twitter is going towards

02:22:05   is that thing I was talking about.

02:22:07   They have an app, I forgot what it's called.

02:22:09   They released, at least, a separate app that's

02:22:09   basically, it's good if you have a verified

02:22:11   account, works fine otherwise, but it's a

02:22:13   little tailored towards being a verified person

02:22:15   with a large number of followers and mostly

02:22:19   posting rather than doing a ton of interaction

02:22:21   with everybody.

02:22:21   Um, I've forgotten the name of it, but, uh, so

02:22:25   So Twitter just the other day said they're going to open a verification to everybody with a

02:22:29   opaque process that I think involves sending a picture ID to them.

02:22:33   So you're going to give them your ID.

02:22:34   So conceivably, this is one step on the road to more people being verified.

02:22:39   And what if the hundred thousand most prolific, engaged Twitter people are all verified?

02:22:44   Well, that can be removed as well.

02:22:46   So maybe Twitter is moving towards a more asymmetrical model where we give up following random people

02:22:51   if they can't be bothered to verify.

02:22:54   I've thought for a long time, Twitter should

02:22:55   offer a basic level of verification where you

02:22:58   had to use a phone number to confirm, even

02:23:00   though phone numbers can be disposable and

02:23:01   whatever, it's a high enough bar: if you're

02:23:04   receiving a text from a phone number,

02:23:06   you know, it's a hassle if that account

02:23:09   gets banned or blocked — it's a hassle to get

02:23:11   another phone number, and you can only get so

02:23:13   many; you can't get 10,000, a hundred

02:23:15   thousand phone numbers. But they could have

02:23:16   some basic level that doesn't involve sending

02:23:18   a photo ID.

02:23:19   And I could say, look, I don't want to deal

02:23:20   with people who are completely anonymous.

02:23:22   I only want to see those people in my timeline.

02:23:23   And I think that's true in general — people don't follow Brianna

02:23:25   Wu and then spew hate; the haters aren't following her.

02:23:28   Typically some people do, some people follow you because they think it

02:23:31   gives them a little extra whatever.

02:23:32   So, I only want to see things in my timeline from people who follow me, because I can

02:23:35   block them — or I could change that.

02:23:37   I mean, there could be a checkbox for people who don't follow me.

02:23:50   Or, for people who don't follow me: if they're retweets from somebody I follow, that's cool.

02:23:56   I'll see retweets from people who I follow, but otherwise only people who have gone through a very basic level of verification

02:24:01   that doesn't cost money. Like, I don't want this to be a first-world/developing-world thing or whatever.
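
The filter being described here — show things from people I follow, people who follow me, retweets by people I follow, or minimally verified accounts — amounts to a small predicate. A hypothetical sketch (the event shape, field names, and `show_in_timeline` helper are invented for illustration; this is not a real Twitter API or feature):

```python
# Hypothetical sketch of the opt-in timeline filter described above.
# A tweet is shown only if its author follows me (so I can block them),
# I follow the author, it was retweeted by someone I follow, or the
# author has passed a free, basic verification step (e.g. a phone number).

def show_in_timeline(tweet, my_following, my_followers):
    author = tweet["author"]
    if author in my_following:                       # people I chose to follow
        return True
    if author in my_followers:                       # they follow me; I can block them
        return True
    if tweet.get("retweeted_by") in my_following:    # vouched for by someone I follow
        return True
    # Fall back to a basic, no-cost verification signal.
    return tweet.get("author_phone_verified", False)
```

The point of a rule like this is that an anonymous throwaway account's @-reply simply never renders, while anyone willing to clear the low verification bar still gets through.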

02:24:06   But there are other things they could do. They could put lots of throttles in; they could be monitoring behavior.

02:24:10   John, you know there's a setting called "filter low-quality tweets" in your account, right?

02:24:15   No, I did not know that.

02:24:16   It's only available to verified people so far. It's been available for like two years.

02:24:19   And if you check that, they use machine learning — I think it's machine learning —

02:24:23   to filter out tweets that look like they aren't very good.

02:24:26   (laughs)

02:24:27   I don't know what it is. I think part of it, obviously, is that I am a white man.

02:24:35   I can do things like write about Milo Yiannopoulos in a very critical way, and — at least, knock on wood, to date —

02:24:44   I've gotten responses on Twitter, but I wouldn't call any of it abuse.

02:24:49   I mean, and there were people who disagreed and there were obviously, you know, people who staunchly disagreed.

02:24:53   And a little bit of it was ugly, but I don't get any kind of mob; it never happens.

02:24:58   Whereas I think if I was, you know, Joan Gruber, there's a very good chance that I would. I really do.

02:25:07   I thought about A/B testing, like creating a fake account that was under the name of a woman with a woman's face,

02:25:14   with someone's permission. Some people have tried this on like, OKCupid, they've gotten a friend's

02:25:17   permission to use their image and just been like, you know, but then it becomes, I mean,

02:25:21   it becomes a stunt. I don't want to do a stunt. Like, women live through this, people of color,

02:25:24   people who are not, you know, cis male, straight, white. It's like, they already have

02:25:30   enough problems that I don't need to pretend to do like racism tourism or something like that.

02:25:35   gender tourism. I should try it, though, because it would be interesting. I actually, just by

02:25:39   coincidence, last night crafted what I think — I don't know how else you'd measure it; I should go

02:25:45   to Favstar — I think I wrote my two most popular tweets of all time last night. No. No, really? Yeah.

02:25:52   Beating my old one — my original all-time best tweet, which was from, like, 2009.

02:26:00   and I said, let me see if I can get it right.

02:26:02   I said-- - I'm looking it up here.

02:26:03   - I don't drink, I don't gamble.

02:26:06   - Oh yeah, I found it.

02:26:10   My one vice, I'm reading your own tweet to you.

02:26:12   - Yeah, read it.

02:26:12   - Okay. "I don't gamble. I don't drink. My one vice is buying a new iPhone

02:26:18   every summer. Well, that and lying about drinking and gambling."

02:26:21   I remember that one, that was great.

02:26:22   That was 2590 days ago.

02:26:24   - Yes, that was previously my all-time favorite tweet.

02:26:28   But last night after--

02:26:29   - Oh my God, I'm looking at your numbers.

02:26:30   This is hilarious.

02:26:31   What are you using to look at them?

02:26:32   I'm using — I like Favstar.

02:26:34   Yeah, how do I do that?

02:26:35   I'm just vain enough.

02:26:36   I'm looking at-- oh, Nick Kristof and my friend Susan Orlean

02:26:40   both retweeted it.

02:26:41   And Chris Hayes from MSNBC retweeted it.

02:26:44   Wait, he did?

02:26:44   Yeah.

02:26:45   Oh, I'm looking at one of them.

02:26:46   Oh, yes, Nick Kristof and Susan--

02:26:47   Nick Kristof, the columnist for The New York Times.

02:26:50   Susan Orlean, staff writer for The New Yorker.

02:26:52   Oh, yeah, Baratunde.

02:26:54   Yeah, it's unbelievable.

02:26:56   Somehow it got into the cycle of top political reporters,

02:26:59   and they all retweeted it.

02:27:00   Well, you're followed by — you have an interesting group of followers that overlaps. And so — this is what happened; I've had some breakouts.

02:27:06   I have like 6,000 retweets, which is like 50 times more than anything I've ever said before, because I made a comment about Brexit.

02:27:14   Because the morning after, I'm like, I'm listening to everything, and I'm like, this isn't going to happen.

02:27:19   Like, there is a plan here.

02:27:21   And I wrote something about basically like Brexit, you know, this could be such political suicide that Brexit's not going to happen.

02:27:29   And it got retweeted like 6,000 times.

02:27:31   Oh my God, did I get interesting responses?

02:27:33   Those were actually interesting.

02:27:34   I got hundreds and hundreds of replies from people.

02:27:37   Some of them thought I was an idiot

02:27:38   and some people thought I, you know,

02:27:39   I'm like, I don't vote in the UK.

02:27:41   I don't have an opinion on this, but I love the UK.

02:27:43   I want it to succeed.

02:27:44   But I also got very informed things by people

02:27:46   who were torn about it or had voted for Remain anyway.

02:27:50   But yeah, yours was-

02:27:51   - This blew that away.

02:27:52   So I have two tweets, one after another,

02:27:54   and I connected them with Tweetbot's, whatever they call it.

02:27:57   Let's play what if.

02:27:59   What if Barack Obama had five children

02:28:02   with three different women?

02:28:04   Immediately followed by what if Hillary Clinton

02:28:06   had five children with three different fathers?

02:28:08   - I thought those were pretty brilliant.

02:28:11   You know what's funny?

02:28:11   And this is not to not credit you with originality and wit,

02:28:14   'cause you have a lot of it, I'm gonna butter you up.

02:28:17   But when I saw that, I was like,

02:28:18   oh my God, has no one tweeted that before?

02:28:21   'Cause the minute you say it, it's absolutely obvious,

02:28:24   'cause they're all, you know, they're on stage,

02:28:25   the five kids are on stage, and you're like,

02:28:27   Holy crap. But, like, nobody had said it as succinctly and with the perfect timing that you had, and so I did my

02:28:34   @-reply. You know, so the first tweet — the one with Barack Obama — has

02:28:40   3,100 retweets, and the Hillary Clinton one has 3,834 retweets.

02:28:45   So even though I have an unusual Twitter account with a large number of followers, I had two tweets that were obviously a little

02:28:56   provocative, presented to an awful lot more people

02:29:00   than who usually read my tweets.

02:29:01   And so my current @-reply stream is —

02:29:04   I can't get to the bottom of it; I can't keep up with it right now.

02:29:07   - It must be out of control.

02:29:07   I'm looking at, yeah, when I look at the people

02:29:09   who've retweeted it, there's a bunch of people in there

02:29:11   that I know, like Mike Monteiro and John Siracusa,

02:29:13   and people who are definitely on the tech side of things.

02:29:16   And then, you know, looking at the top retweets,

02:29:21   it's a lot of just really interesting people

02:29:23   who obviously found this through another thing,

02:29:26   some of them have 50,000, 60,000 followers.

02:29:26   Yeah.

02:29:27   Excuse me.

02:29:28   There's an incredible, I mean, this is what's

02:29:30   interesting about Twitter is like, it makes

02:29:31   everyone have the ability to be a pundit or to be

02:29:34   a standup comedian and reach an audience they

02:29:36   couldn't otherwise.

02:29:37   And it's like, sometimes you feel like you're

02:29:39   shouting into the void.

02:29:39   Like there's times, Max Temkin asked me, you

02:29:41   know, from Cards Against Humanity, he asked

02:29:43   me, I don't know, a while ago, he's like, you

02:29:45   know, everybody who follows you agrees with

02:29:46   what you're saying.

02:29:46   Do you feel compelled to say it?

02:29:48   And he wasn't, he was mildly castigating me

02:29:50   because I had gone on about something.

02:29:51   And, and I was like, ah, you know, I did go on

02:29:53   And I'm like, you know, I don't think everyone

02:29:56   who follows me actually, like, I don't know how

02:29:57   many people who follow me read every tweet.

02:29:59   It's a percentage.

02:30:00   I look at Twitter.

02:30:01   Nobody.

02:30:01   I'm sorry.

02:30:02   That's, I didn't mean to be such a traitor.

02:30:03   I meant more like you can use Twitter analytics

02:30:05   and anyone can log in and you actually get to

02:30:07   see the impressions, like what percentage of

02:30:09   your audience sees it.

02:30:11   And it's fascinating.

02:30:11   It's good.

02:30:12   It's good and bad for the ego too.

02:30:13   Yeah.

02:30:13   Um, we got to wrap up.

02:30:15   Okay.

02:30:15   There's things that I say where I'm glad,

02:30:19   at times, to be able to say: I just need to say

02:30:21   this because I want other people

02:30:23   to know that other people feel it.

02:30:24   - But, bottom line, Twitter has got to get a handle

02:30:30   on this. It cannot just be something that they address

02:30:33   when it's somebody of Leslie Jones's stature.

02:30:36   This needs to be something where everybody feels

02:30:39   like they're not going to be abused.

02:30:40   And I agree, there is a fine line,

02:30:42   and it is worrisome, and for Twitter it's difficult,

02:30:44   but I trust that it can be done, which is:

02:30:47   they should not try to prevent offensive speech,

02:30:52   or ideas that some people find offensive,

02:30:54   from being expressed on Twitter,

02:30:56   but they should absolutely make it so

02:30:58   that people are not going to be attacked.

02:31:01   And that takes a different thing — a human being.

02:31:03   I don't know if you could do it all algorithmically,

02:31:04   I really don't.

02:31:05   I think you can use some algorithms to help,

02:31:08   but I think when somebody reports abuse,

02:31:10   I think somebody at Twitter who has any bit of empathy

02:31:12   can easily discern this is just an offensive idea

02:31:16   versus this is a personal attack on this other user.

02:31:21   - Some of it's nuanced, but when a thousand accounts

02:31:24   with fewer than a hundred followers all tweet

02:31:26   within a few minutes of each other at one account,

02:31:29   I think machines can learn what that means.
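
That signal — many low-follower accounts converging on one target within a short window — is simple enough to sketch, even before any machine learning. The thresholds, event shape, and `make_detector` helper below are invented for illustration, taken only from the numbers mentioned in the conversation; this is not Twitter's actual abuse pipeline:

```python
from collections import defaultdict, deque

# Hypothetical thresholds, from the numbers in the conversation.
LOW_FOLLOWERS = 100   # "accounts with fewer than a hundred followers"
WINDOW_SECS = 300     # "within a few minutes of each other"
PILE_ON_SIZE = 1000   # "a thousand accounts ... at one account"

def make_detector():
    # target -> deque of (timestamp, sender) mentions from low-follower accounts
    recent = defaultdict(deque)

    def on_mention(ts, sender, sender_followers, target):
        """Return True when this mention tips `target` into pile-on territory."""
        if sender_followers >= LOW_FOLLOWERS:
            return False          # established accounts don't feed the signal
        q = recent[target]
        q.append((ts, sender))
        while q and ts - q[0][0] > WINDOW_SECS:
            q.popleft()           # drop mentions that fell out of the window
        senders = {s for _, s in q}
        return len(senders) >= PILE_ON_SIZE

    return on_mention
```

A real system would presumably surface a flag like this to a human reviewer rather than auto-suspend anyone, since the nuanced cases still need judgment.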

02:31:31   - And I would even be in favor of,

02:31:32   in terms of the nuance of, if there's any doubt,

02:31:35   don't suspend the account.

02:31:39   But there are so many cases where it easily

02:31:41   should be recognized.

02:31:42   And the other thing: you can't say, well,

02:31:44   there's a scaling problem there.

02:31:45   Twitter has an enormous headcount.

02:31:47   I don't know how sustainable that is.

02:31:49   I think — we could do a whole segment

02:31:52   on the show about it — why in the world

02:31:53   does Twitter have so many employees?

02:31:54   Because it doesn't seem like there's much new stuff going on.

02:31:58   They certainly have the resources to hire a staff

02:32:00   that can look into this abuse.

02:32:02   And I think it absolutely needs to be done,

02:32:04   and it ties into the Amazon thing,

02:32:05   where it really is hurting Twitter's reputation,

02:32:08   where Twitter is more and more getting a reputation

02:32:10   as a place where if you participate,

02:32:11   you, especially as a woman,

02:32:13   you have a very good chance of being abused.

02:32:18   Twitter spends literally hundreds of millions of dollars a year on R&D.

02:32:23   It's astonishing. I don't know what exactly they count as R&D, but it's a big part of their expense.

02:32:28   So they're working on it. But yep.

02:32:31   Bottom line. Anyway, Glenn, thanks for being here. You've got a new book, new books to promote?

02:32:36   I have — I have a thing, yeah. I got deep into the Slack well.

02:32:40   Do you use Slack? I know that you're on at least one.

02:32:43   I am a regular participant on one Slack.

02:32:46   Yeah — and Slack, it can get — I have six Slacks I'm part of, and some of them, thankfully,

02:32:51   are not very loud, but I like them all. But it can get out of control.

02:32:53   So yeah, our friends at, uh, Take Control Books, Adam and Tonya

02:32:56   Engst, long-time Mac folks, um —

02:32:58   we were talking about, like — we started using it, and it's, uh — it's not quite like Pokémon Go,

02:33:03   it's not that addictive, but boy, it went from zero to, you know, millions really quickly.

02:33:08   And we were complaining amongst ourselves. Like how do we do this? What's this thing?

02:33:12   we're like, oh, if we can't figure this out and we've used a thousand software packages in our

02:33:16   life, perhaps it would be useful to do a book. So I did a book that's Take Control of Slack Basics,

02:33:22   that is for people who are users who want to be able to master it because the online documentation,

02:33:27   it's good. Our friend Matt Haughey — he's, you know, involved in documentation in some aspect;

02:33:31   the MetaFilter founder, great guy, and now Slack's head of documentation or something.

02:33:36   Is it head of documentation? Yeah. I don't know what he is. They're working on their side,

02:33:39   but like they have a different— He writes for Slack.

02:33:41   Yeah, they do a different thing.

02:33:43   Like, online documentation is not what a take control book is about.

02:33:47   They're experientially different things, as you know.

02:33:49   You've worked on manuals, you've worked on other stuff.

02:33:51   And so, I wrote a book that's divided into subjects using channels and messages

02:33:55   and even how to do emoji and so forth.

02:33:57   And the idea is you wake up one morning,

02:33:59   so many people I know this happened to,

02:34:01   you're 30 years old, 40, 50, and

02:34:04   you get a message from the office:

02:34:06   "We're using Slack starting today."

02:34:07   You're like, "Oh, God, not another piece of software."

02:34:09   So the book is partly for people who had sort of Slack thrown at them and other,

02:34:13   you know, also for people who want to master it without having to go through

02:34:17   and sort of discover everything.

02:34:18   There's a lot of stuff in there.

02:34:20   And then I wrote a complementary book.

02:34:21   That's about how to set up a Slack team, which is pretty straightforward in some ways.

02:34:25   There's a lot of detail.

02:34:26   Um, and so if you don't have an IT organization behind you, but you want to

02:34:29   set it up, you want to run it well, you want to make sure people, uh, stay civil.

02:34:33   And how do you keep things correct and keep people from doing, you know, stupid things,

02:34:37   but also just set it up so it's as secure

02:34:40   and sensible as you want.

02:34:41   So I have this great URL.

02:34:43   You can go to takecontrol.com, of course,

02:34:45   takecontrolbooks, rather, .com, of course.

02:34:47   But I have slackhelp.me.

02:34:49   I got this URL and I'm very happy about it.

02:34:51   - Slackhelp.me.

02:34:52   I will absolutely put it in the show notes.

02:34:54   I love, just by the way, I know it's probably not,

02:34:56   maybe you did, I don't know, I know you know Adam,

02:34:59   but I love the new cover design for the Take Control books.

02:35:02   - Oh, it's so great.

02:35:03   Took a lot of years to get

02:35:05   to this illustrative approach.

02:35:07   And then the minute I saw the new designs

02:35:09   when they were in progress, I'm like, oh, this is it.

02:35:11   It's like a discovering, like you're cutting away

02:35:14   at a block of marble and there's the sculpture inside.

02:35:16   - I think it's probably no surprise

02:35:18   that you can guess that you know my taste in graphic design.

02:35:21   But well, one of the things that I really love

02:35:23   about the new cover design is the placement

02:35:26   of the title.

02:35:28   The series title is in relatively small print,

02:35:30   but the topic is in big print, which is right,

02:35:33   because that's what take control books are about.

02:35:35   They're very topical.

02:35:37   They're just like, this is about Slack admin.

02:35:39   So that's nice and big, real nice font,

02:35:42   all caps, which I, of course, like.

02:35:44   Then the author name is down below, real small,

02:35:46   but because it's surrounded by white space,

02:35:48   it is so prominent.

02:35:50   And I love prominence in an author's name.

02:35:54   'Cause to me,

02:35:55   the whole point of the Take Control series

02:35:57   is that they're not just churned out.

02:35:58   These are like the best writers in our racket

02:36:01   who do these things.

02:36:02   - It's fun too.

02:36:03   eBooks after — I mean, I kind of miss print, but, uh, you know, faster turnaround.

02:36:07   Yeah.

02:36:07   Um, yeah.

02:36:08   So Slack Slack's fun.

02:36:09   It's it's fun.

02:36:10   I know people — Jeff Carlson, you know, long-time Mac writer:

02:36:13   he and his wife have a two-person Slack, because it's free.

02:36:15   You can use it free, you know, with some limits, but not, you know, 10,000 people.

02:36:19   We have a Slack group.

02:36:20   We set up — um, if people go to slackhelp.me, there's a link.

02:36:23   Uh, it's a public Slack team that anyone can join who just wants to understand what

02:36:27   Slack is like without having to set one up.

02:36:29   So it's for that. It's called SlackBITS,

02:36:31   and we're having discussions about Mac stuff, but also Slack help.

02:36:34   And it's great — we've got several hundred people who joined; they just wanted a place to go with no overhead.

02:36:38   And, um, because it takes five minutes to set up a Slack team, but you have to set one up and you're dealing with it.

02:36:43   So, um — my very last point — there are a lot more public Slack teams starting,

02:36:49   like this one for Mac admins, there's a bunch of them, I keep hearing about private ones that have hundreds or thousands of people.

02:36:54   If Twitter isn't careful, some of the conversation is going to get drained off to private Slack groups where everything is controlled.

02:37:00   That's a very interesting tie-in. And on the Slack that I'm on, that point has been made before,

02:37:04   too. There's an awful lot of people. I've heard this from other people, but on the board I'm on,

02:37:08   you probably know everybody who's there. It's a bunch of mutual friends. But it's a relatively

02:37:12   small group, maybe like 15 of us. And a whole bunch of them have admitted that they personally

02:37:19   use Twitter a lot less because of this Slack. I'd rather just communicate with this handful of good

02:37:26   friends, people who I would like to go and have dinner and a drink with, then interact with the

02:37:31   world at large on Twitter, just because of, you know, a couple of knuckleheads who make it unpleasant.

02:37:35   It's safe, you know, you know, everybody's part of the group. And even if it's a public group,

02:37:40   there's still an administrator who can kick anybody off. And The Incomparable podcast network —

02:37:44   we have an Incomparable Slack with, I don't know, 50 or 60 people who are involved in the many,

02:37:47   many, many podcasts Jason Snell has now got up at the site.

02:37:53   conversation that I don't trust to do publicly anymore, because there are too

02:37:57   many jerks on Twitter who will respond to it. I have those conversations with

02:38:00   many of the same people I would have them with on Twitter — I just have them privately now. So sad. But

02:38:04   all right, my thanks to you, Glenn. You've been so generous with your time. What a

02:38:07   great conversation — this is a good episode; I really liked it. Remember that

02:38:11   URL: if you do Slack, or you've got any interest in it, go check out the books — they're

02:38:14   really worth it, and Slack definitely has a lot of power-user stuff where you're

02:38:17   gonna be like, "I didn't know I could do that." At slackhelp.me, you'll find

02:38:22   links to both books. My thanks to our sponsors: we've got Casper, where you can

02:38:26   go to buy a mattress; and we've got Fracture, where you can print out your

02:38:29   photos; and Global Delight's Boom, which will make the audio on your Mac sound a

02:38:36   lot better. So my thanks to them.