Hello Internet

Hello Internet Episode One Hundred


01:00:00   jokes and the fact that they joke about

01:00:01   a thing means that it's not serious and

01:00:03   so that they're contributing in this way

01:00:04   like I think that's a path down towards

01:00:07   crazy town and the only result then is

01:00:09   like Oh everybody has to just say things

01:00:11   that are perfectly acceptable online

01:00:14   like the other thing is I also just feel

01:00:17   bad for this guy because his whole life

01:00:22   has been on hold while this trial has

01:00:25   been waiting to happen

01:00:26   and it's been like a year and a half or

01:00:29   something before it has actually gone to

01:00:30   trial and this is a thing where I have

01:00:32   these bad feelings about the Internet

01:00:34   where it's like these little storms

01:00:37   happen and it's like a person becomes an

01:00:39   unperson even before anything official

01:00:43   has occurred so even though this guy

01:00:45   hasn't gone to trial it's like Oh his

01:00:48   whole life has been destroyed for the

01:00:50   past year and a half while this is

01:00:52   occurring because nobody wants to hire

01:00:55   the guy who is waiting for his trial

01:00:58   about inciting hate on the Internet to

01:01:01   go through and it's like Oh even if he

01:01:03   wasn't ever punished like he has still

01:01:06   been totally punished in the meantime

01:01:10   it's like a terrifying thing in society

01:01:13   that that can happen it's like you get

01:01:16   swept up in something and so much of

01:01:19   your life is destroyed and even if in

01:01:21   the end it's like oh we're gonna

01:01:23   overturn that case and we're not gonna

01:01:24   put you in prison and don't mind about

01:01:26   it it's like well the internet never

01:01:29   forgets and now no one will ever want to

01:01:32   associate with you because you are this

01:01:34   person even though like many people

01:01:37   don't think you did anything wrong

01:01:38   like it just it's terrible for UK law

01:01:40   and for people who make anything on the

01:01:43   Internet

01:01:43   in the UK I'm not condoning it but I do

01:01:45   wonder in the back of my mind if it

01:01:47   could be the making of him because he's

01:01:48   a pretty funny guy he could become like

01:01:49   a really funny successful YouTuber now

01:01:52   again I'm not saying he wanted the

01:01:54   attention but now that has happened I

01:01:55   hope it becomes the making of him for

01:01:57   his sake even if like everything turns

01:01:59   up roses for him in the future somehow

01:02:01   it doesn't change the fact that it's

01:02:04   like okay well who are we gonna bring up

01:02:05   next on this obscenity law just start

01:02:08   picking YouTubers in the UK very early

01:02:12   you said like so many people who are

01:02:13   leaving comments on Reddit hmm I don't

01:02:17   know it's like such an incredible

01:02:18   overreaction do you have this fear

01:02:20   because you're worried that it'll affect

01:02:21   you one day because quite often I

01:02:23   I've thought your fears boil down to that

01:02:25   somehow like you're scared one day someone

01:02:27   will take something you've done in the

01:02:28   wrong context and you'll end up in hot

01:02:30   water or are you really just being quite you

01:02:33   know noble here and just worried about

01:02:35   the world in general I don't know how to

01:02:37   say this very well in the context of

01:02:39   this conversation because we've just

01:02:40   been talking about Nazis right but the

01:02:44   ability to express thoughts is extremely

01:02:48   important and the power to suppress what

01:02:55   thoughts people are allowed to express

01:02:57   is terrifyingly powerful it worries me

01:03:03   on a society level right like when you

01:03:07   ask Oh am I personally concerned not

01:03:09   really I produce the most G-rated

01:03:11   content in the world and the podcast has

01:03:15   a million hours of context around it and

01:03:17   I don't think aside from our opening up

01:03:20   the wound of the necessary lies of

01:03:21   society like I don't think we say

01:03:23   anything that's terrible on the show so

01:03:25   I'm not personally concerned but I I

01:03:28   really do feel that there's something

01:03:30   like dangerously corrosive here I always

01:03:35   try to stay away from specific political

01:03:37   examples because I don't think that that

01:03:39   is very instructive but I will mention

01:03:42   one that came up like in a conversation

01:03:44   with someone so this is like in an

01:03:46   American context but the conversation

01:03:48   was around the idea of this person

01:03:52   basically wanted to make it illegal to

01:03:55   deny the fact that climate change was

01:03:57   occurring all right they're like you

01:04:00   shouldn't be able to express this idea

01:04:02   online it should be against the law hmm

01:04:04   the reasoning was it's so potentially

01:04:07   harmful spreading misinformation around

01:04:10   this may be the end of the species right

01:04:13   maybe the end of all life on Earth

01:04:15   they're like this idea that there's no

01:04:18   climate change is so terrible we should

01:04:21   put it into law that people are not

01:04:23   allowed to express it

01:04:25   it's like Oh

01:04:27   okay that's absolutely terrifying even

01:04:29   if you agree with that that opens the

01:04:32   door to the idea that like you're

01:04:34   ceding power to a government to tell

01:04:38   you what you're allowed to think hey

01:04:40   news flash your team isn't in charge

01:04:44   right now so how would you feel if a law

01:04:46   was passed that said oh you're not

01:04:49   allowed to express the idea that the

01:04:51   climate is changing then suddenly is

01:04:53   like oh I don't like that idea at all

01:04:55   like yeah of course this is why like you

01:04:57   have to defend the ability to express

01:04:59   ideas and thoughts in the broadest

01:05:03   possible form and so I feel like any

01:05:06   incursion into this territory is just

01:05:10   like a big danger yeah so that's why

01:05:13   like I feel these stories very strongly

01:05:15   and it's like yes I will agree with you

01:05:17   like there are tiny areas that we can

01:05:20   carve out direct incitement to violence

01:05:22   like attacks on a particular person but

01:05:25   even then I feel like we need to define

01:05:27   that very tightly what do we mean by

01:05:30   that it's like there's a cleanliness

01:05:31   monster that is going through the world

01:05:35   making everything have to be clean and

01:05:38   safe we can't even make jokes about the

01:05:41   Nazis because the Nazis are not clean

01:05:44   and safe right so let's just put some

01:05:46   antiseptic over this and just make it go

01:05:48   away all right and I think it's bad like

01:05:50   I think it's really bad for society and

01:05:53   there's something about it that just

01:05:54   seems like it's getting worse and The

01:05:56   Producers is a perfect example so like

01:05:58   fifteen years after World War Two we can

01:06:00   make an entire movie that makes fun of

01:06:02   the Nazis but 60 years after World War

01:06:05   Two jokes about the Nazis are forbidden

01:06:07   they're so dangerous we can't possibly

01:06:09   even speak them aloud I don't think

01:06:11   that's good for society yeah here's a

01:06:13   topic Brady where I feel like I need

01:06:15   pushback where am I wrong here or do you

01:06:17   agree I do fundamentally agree so I

01:06:19   don't want to be the good person who now

01:06:21   gets up here and talks against free

01:06:23   speech the only thing that I don't like

01:06:25   about the way you argue this point it

01:06:27   doesn't mean I disagree with the point

01:06:28   it's just I don't like your stylistic

01:06:30   argument and that is I always feel like

01:06:32   the people who are like the scumbags of

01:06:35   the world who are putting all the scum

01:06:37   out there that I hate and the dirty

01:06:39   stuff I acknowledge their

01:06:41   right to do it right so I'm on your side

01:06:43   but I sometimes feel you kind of clothe

01:06:46   them in righteous garments when you talk

01:06:48   about defending like their right to

01:06:51   express thoughts and ideas like this is

01:06:54   just people taking a dump everywhere and

01:06:56   I feel like you're kind of using this

01:06:58   argument of let's have a free debate and

01:07:01   you make a lot of these people sound

01:07:04   better than they are I don't deny their

01:07:07   right to exist and do these things but I

01:07:10   wish you didn't make them sound so noble

01:07:12   yeah I mean the reason I talk about

01:07:14   thoughts and ideas is because I feel

01:07:16   like I'm trying to keep it

01:07:18   generic yeah but I think by doing that

01:07:20   you're bringing a lot of scumbags into

01:07:22   the fold of nice people yeah but that's

01:07:24   the problem of letting people express

01:07:28   all of their thoughts and ideas here it

01:07:29   is guess what some people have terrible

01:07:31   thoughts and ideas I don't think these

01:07:33   people who are taking a dump all over the

01:07:35   place aren't expressing thoughts or ideas

01:07:36   it may be like taking a dump but they are

01:07:39   thoughts and ideas they are concepts

01:07:42   expressed with language I don't think

01:07:44   that there is a way to draw a circle

01:07:47   where we can say we all agree these

01:07:50   patterns of words are good and these

01:07:52   patterns of words are terrible no no I

01:07:56   think we can do that I just don't think

01:07:58   we can ban them I think it's fair to sit

01:08:00   around and start weighing things and

01:08:02   saying that's a good idea that's an

01:08:04   interesting thought and idea that's a

01:08:06   smart thing you said you there who just

01:08:08   wrote you're gay that's not a good

01:08:10   thought or idea that's not constructive

01:08:12   you're an idiot but you're allowed to do

01:08:14   it like that's a perfect example right

01:08:15   like the canonical YouTube comment

01:08:16   you're you are gay yeah perhaps there

01:08:19   has been no YouTube comment more

01:08:21   frequently said than that one right

01:08:23   right and it's like yeah I'll sit here

01:08:25   and agree with you like 100 percent like

01:08:26   it adds nothing it's no good it's not

01:08:29   Plato

01:08:29   but I think it's too easy in a

01:08:32   conversation to pick a particular thing

01:08:35   that we can say like oh we all agree

01:08:37   that this is no good

01:08:38   but if you're trying to craft laws I

01:08:41   think it's really hard to craft a law

01:08:45   that can define what do we mean by this

01:08:49   and that then as an additional layer can

01:08:52   be

01:08:54   enforced in a reasonable way and Count

01:08:57   Dankula is a good example of this where

01:08:59   okay if this is the law of the land in

01:09:02   the United Kingdom how is it that like

01:09:04   just this one guy got swept up by this

01:09:09   law and meanwhile like there are actual

01:09:12   Nazis like expressing actual Nazi

01:09:15   thoughts and that's cool like okay well

01:09:19   then you have some kind of bizarre

01:09:22   selective enforcement of this I'd love

01:09:25   to know the story behind this there must

01:09:26   be a bigger story it is amazing isn't it

01:09:29   this guy got done with all the other

01:09:31   stuff that's out there it's weird I

01:09:33   think it's because it was a viral video

01:09:35   right so we're just like it becomes a

01:09:36   bunch of attention and then someone

01:09:38   files a complaint and like now it starts

01:09:40   getting run up the chain I feel like from

01:09:41   my old days as a teacher I understand

01:09:43   very well the concept of like a small

01:09:45   thing the instant someone makes an

01:09:48   official complaint about it it's like oh

01:09:49   god like now it's going to be run up the

01:09:51   chain and like no one can stop the

01:09:53   machine that keeps making this a bigger

01:09:54   deal than it actually is right even if

01:09:56   no one involved in the chain wants to

01:09:58   keep escalating it there's an incentive

01:10:00   at every level to keep escalating it so

01:10:02   I feel like this can just be like the

01:10:04   random weather of the Internet yeah but

01:10:06   so that's why I do want to be clear like

01:10:07   I'm not saying like everybody has an

01:10:09   equal contribution to the conversation

01:10:10   because obviously they don't some people

01:10:12   are really smart and they can contribute

01:10:14   more to conversations some people are

01:10:16   really stupid and they don't have very

01:10:17   much to add at all anyway as I

01:10:19   said I agree with you you just asked for

01:10:20   some pushback and my pushback is the way

01:10:22   you frame your argument I think cloaks

01:10:25   these people in garments of

01:10:26   righteousness that they're entitled to

01:10:29   the garments but I just feel a bit like

01:10:31   let me just say the reason that I do say

01:10:34   it is thoughts and ideas is that just like

01:10:36   when I made those videos about the

01:10:38   voting systems a long time ago that

01:10:40   there's a reason I don't talk about

01:10:43   specific political parties because as

01:10:46   soon as you talk about specific

01:10:48   political parties the only thing people

01:10:50   want is whatever voting system gets

01:10:53   their guys into office right that's the

01:10:55   one that seems the most fair and you

01:10:58   need to have a system where everyone can

01:11:01   agree that the election is fair before

01:11:04   the election takes place when I'm

01:11:06   talking about thoughts and ideas in the

01:11:08   public realm we need to talk about them

01:11:10   in the most abstract way because if we

01:11:14   start talking about like what do we do

01:11:15   with good ideas and what do we do with

01:11:17   bad ideas like the conversation about

01:11:19   how do you manage conversation is

01:11:22   already poisoned because everybody has

01:11:25   in their own head like the idea of what

01:11:27   a bad idea is and then so we should do

01:11:28   bad things to bad ideas

01:11:30   like the idea of like Oh which side of a

01:11:32   debate do you want to not allow to speak

01:11:34   so that's why I try to talk about it in

01:11:36   general terms because I think that is

01:11:37   the only sensible way to try to think

01:11:41   through the topic and yeah of course

01:11:43   whenever you're discussing freedom of

01:11:45   speech and you are defending it you are

01:11:48   by definition having to defend the most

01:11:52   fringe of fringe people that's just the

01:11:54   nature of this conversation and I think

01:11:57   it's why free speech defenders are often

01:12:00   at a bit of a disadvantage because like

01:12:03   by necessity you're like the guy who

01:12:06   has to defend the Nazi joke when it

01:12:09   would be much easier to just join in

01:12:11   with a sanctimonious crowd and be like

01:12:13   he should have phrased it differently

01:12:15   like I don't appreciate that joke at all

01:12:17   and I don't think it's good for the

01:12:18   world I do think he should have phrased

01:12:19   it better but he's very good how many more

01:12:22   people do you have to see step on the

01:12:24   landmine before you think hmm

01:12:26   maybe I won't walk into that minefield

01:12:28   how many people do you have to see get

01:12:30   blown up to take it in a slightly

01:12:31   adjacent way I'm reading a book which is

01:12:34   about the court case that happened with

01:12:35   Gawker a few years ago with the Hulk

01:12:38   Hogan sex tape and Peter Thiel was

01:12:40   funding this lawsuit against Gawker it's

01:12:43   an interesting book so far but part of

01:12:46   what they're talking about as well is

01:12:47   this effect that like gossip columns

01:12:50   have on people's ability to think in

01:12:54   public and it's like oh people start

01:12:57   worrying when there's a gossip column

01:12:59   that's going after the technology sector

01:13:01   that it naturally has people close

01:13:04   ranks much further than they otherwise

01:13:07   would like they're much more reluctant

01:13:09   to express any ideas in public because

01:13:12   they're worried about it being twisted

01:13:14   because like that's what the gossip

01:13:16   column does and that's what I mean when

01:13:19   I say like I think there's something

01:13:20   dangerous in this kind of thing

01:13:22   that is like even if you're a million

01:13:25   miles from making Nazi jokes like you

01:13:30   said you feel like ah the boundary

01:13:33   of acceptable conversation has been

01:13:35   moved and so if I want to stay far from

01:13:38   that boundary I need to move even

01:13:41   farther in to only acceptable things you

01:13:44   know so it's like I want to stay a

01:13:45   hundred metres away from that boundary

01:13:47   and we keep pulling that boundary in

01:13:50   further and further it has more of an

01:13:52   effect than just the people at the very

01:13:54   edge and I just don't think that that's

01:13:56   good for public conversation fair enough

01:13:58   sorry about that I didn't mean to rant

01:14:00   for so long you've passed my guy this

01:14:04   whole time I've been ranting Brady to

01:14:06   the dead fish on my

01:14:09   computer screen I never closed that image

01:14:12   it's still there hello internets this

01:14:16   audio sounds terrible because I have

01:14:19   just gotten into a hotel room I'm on the

01:14:22   last leg of a multi leg trip and without

01:14:26   my regular audio equipment and you know

01:14:28   when you're traveling it's very easy to

01:14:30   find yourself without the stuff that you

01:14:32   really need including for example your

01:14:36   computer files you grab your laptop and

01:14:39   you go but you know what you forgot that

01:14:41   thing that's sitting back at home on

01:14:43   your desktop what are you gonna do in

01:14:46   that scenario well under most

01:14:48   circumstances you're just screwed

01:14:50   there's nothing that you can do but if

01:14:53   you have taken our advice on Hello

01:14:56   Internet and installed Backblaze then

01:15:00   everything is fine because Backblaze in

01:15:03   addition to being the backup service

01:15:05   that you should obviously have on your

01:15:08   computer will also allow you anywhere in

01:15:11   the world to access any of your files

01:15:14   and speaking from experience not only

01:15:16   has Backblaze on multiple times totally

01:15:20   saved my bacon with lost files Backblaze

01:15:23   has been super helpful for grabbing a

01:15:26   file that I just saved in a random spot

01:15:29   on my hard drive that I didn't really think

01:15:31   about and then later needed when I was

01:15:33   away from the computer

01:15:34   once again

01:15:35   I'm going to tell you that if you're

01:15:38   sitting in front of your computer and

01:15:39   you do not have Backblaze installed on

01:15:42   that computer right now you need to open

01:15:45   up your web browser open it up and go to

01:15:48   backblaze.com/hellointernet and

01:15:50   get started there really isn't any

01:15:53   better service for your Mac or PC to

01:15:56   just take care of your backups and allow

01:15:59   you remote access and it's just five

01:16:01   dollars a month five dollars a month for

01:16:05   peace of mind so please go to

01:16:08   backblaze.com/hellointernet

01:16:10   do it for your sake so you can sleep

01:16:12   better at night do it for my sake so

01:16:15   I'll sleep better at night knowing that

01:16:17   your files are protected so once again

01:16:20   go to backblaze.com/hellointernet

01:16:23   that lets Backblaze know that

01:16:25   you came from us and it will get you a

01:16:27   fully featured 15-day free trial for

01:16:31   your computer that's

01:16:33   backblaze.com/hellointernet alright enjoy the

01:16:37   rest of the show I gotta get going so

01:16:42   there was a full-page ad in my copy of

01:16:45   The Times Sunday Times this morning

01:16:49   Mark Zuckerberg kind of apologizing to

01:16:53   the world but in that kind of in a

01:16:56   weaselly way where he's not really

01:16:58   apologizing is also blaming other people

01:17:00   right like the way a spider who's

01:17:02   pretending to be a person would

01:17:03   apologize he's got a terrible signature

01:17:07   by the way that psycho does he mmm well

01:17:10   that's because spiders riding inside of

01:17:12   human suits have a really hard time with

01:17:14   that finger dexterity that's an

01:17:16   interesting theory because it is kind of

01:17:18   spider like in fact I took a photo of it

01:17:20   to send you so I'll show you maybe this

01:17:24   is wrong maybe this is me bullying a

01:17:27   public person but I always just get the

01:17:30   feeling that Mark Zuckerberg is like

01:17:32   there's a spider riding around in a

01:17:33   robot that's shaped like Mark Zuckerberg

01:17:35   this is the feeling like whenever he

01:17:37   talks whenever he moves it's like

01:17:39   there's not a person in there there's a

01:17:40   spider in there here's my question all

01:17:42   right Facebook is in a new controversy

01:17:45   to do with data and stuff like that we

01:17:47   can talk about it in a minute if you

01:17:48   want

01:17:49   but my overall question is because this

01:17:50   this is gonna keep happening to Facebook

01:17:52   because they're a company that seems to

01:17:54   cause a lot of controversy is Facebook

01:17:56   too big to fail now is Facebook gonna be

01:17:59   with us forever

01:18:00   it's interesting if you'd asked me a

01:18:03   month ago I would have said no that I

01:18:06   think Facebook is too big to fail but

01:18:10   Facebook has done an amazing job of

01:18:14   collecting some of the worst PR in the

01:18:17   world in a short period of time just at

01:18:21   the wrong time zeitgeist-wise

01:18:25   people are super receptive to it like I

01:18:28   think the idea that hey maybe all of the

01:18:32   social media is bad for us was really

01:18:35   coming to fruition earlier in the year

01:18:38   and then it's like right around that

01:18:39   time Facebook had some of the worst PR

01:18:41   in the world and then they have this

01:18:43   most recent scandal it just might be a

01:18:48   perfect storm of things to drive a stake

01:18:52   through the heart of Facebook but that

01:18:55   being said I think even if it were to

01:18:58   happen we are at a technological point

01:19:02   in time where there is always going to

01:19:05   be a natural monopoly for a social

01:19:09   network if you strike Facebook down

01:19:11   something else will just grow up and be

01:19:13   stronger in its place and will be

01:19:15   functionally equivalent to Facebook I

01:19:17   mean there has to be one doesn't there

01:19:18   for a social network to actually work

01:19:20   the way it should there sort of has to be

01:19:22   one doesn't there otherwise we're not

01:19:24   networked yeah 100%

01:19:25   I am really convinced that it's not just

01:19:27   Facebook I think that the big tech

01:19:29   companies exist in no small part because

01:19:32   we're at a technological point where

01:19:34   there are these natural monopolies I

01:19:37   think that search and video are the kind

01:19:40   of natural monopolies that Google has

01:19:42   that there will be a super big video

01:19:45   site I think it's almost inevitable I

01:19:47   almost think that like Amazon has

01:19:49   a logistics and server-running kind

01:19:53   of natural monopoly that makes it

01:19:55   almost impossible for any other company

01:19:56   to compete with them I think we really

01:19:58   are in a weird phase of the world where

01:20:01   that is a truth

01:20:02   so these big companies exist and even if

01:20:04   we were to somehow break them up it

01:20:06   would just take a couple of years before

01:20:08   the thing that replaces them is just

01:20:11   as big because everything is pointing in

01:20:13   that direction but yeah you're 100%

01:20:14   right

01:20:15   the social network is the strongest

01:20:18   version of that like definitionally

01:20:20   there almost has to be one do you use

01:20:22   Facebook anymore

01:20:23   I haven't used Facebook in any

01:20:25   meaningful way in years in any

01:20:27   meaningful way but do you use it non

01:20:29   meaningful ways yes I do use it in a non

01:20:31   meaningful way which is that I have a

01:20:33   CGP Grey like business page right

01:20:36   because if you're a public person

01:20:37   Facebook doesn't let you have a personal

01:20:39   webpage you have to have this business

01:20:41   page I don't even know if it still works

01:20:43   but I had set up like an If This

01:20:46   Then That trigger that would just

01:20:47   automatically post stuff from the

01:20:49   website onto that page yeah and so when

01:20:51   I can when I say like I haven't used it

01:20:52   in a meaningful way that's what I mean

01:20:54   like I haven't logged into Facebook in

01:20:56   like six months at a bare minimum yeah

01:20:58   and even then it would be just to like

01:21:00   check on a thing I pretty much checked

01:21:03   out of Facebook the instant they started

01:21:06   that you need to pay us to send out your

01:21:09   posts to the people who follow you on

01:21:11   Facebook thing hmm

01:21:12   that was for me the day Facebook died I

01:21:15   was like okay well I don't use Facebook

01:21:16   on a personal level anyway and so if the

01:21:19   only thing that I use it for is on a

01:21:20   business level and now you're playing

01:21:23   the Mafia game of like oh it's a nice

01:21:24   business page you have there sure would

01:21:26   be a shame if something happened to it

01:21:28   then it's like okay I'm out I'm out I'm

01:21:30   not really gonna play this game I'm not

01:21:32   gonna participate in this so now I

01:21:34   haven't used it personally ever do

01:21:37   you yeah I mean I have pages for all my

01:21:41   different projects that I maintain yeah

01:21:43   I go in there every day and have a look

01:21:45   just to check what my friends are up to

01:21:46   and I don't post a lot of stuff myself

01:21:49   on there unless I'm doing like something

01:21:50   amazing like as amazing as my life is I

01:21:54   don't post about spending three weeks in a

01:21:56   row sitting at a desk editing videos

01:22:00   and that's the social media effect where

01:22:02   everybody's life seems a million times

01:22:03   more amazing than yours people only post the most

01:22:06   amazing things they do and of course

01:22:09   he's contributing to making the world a

01:22:11   worse place by doing that Brady you have

01:22:13   to post your mundane days as well I

01:22:14   think it's a problematic

01:22:16   company but I almost feel like it's too

01:22:19   big to fail because I look at you know

01:22:20   although I don't reach all my followers

01:22:22   because of the Mafia thing you mentioned

01:22:24   I do have you know followings on all

01:22:25   these Facebook pages and it seems like I

01:22:28   can't just throw that away and like

01:22:29   ignore it so I mean I can but you

01:22:33   totally can right I mean I'm willing to

01:22:36   bet that if you dig into your analytics

01:22:38   data on YouTube that Facebook is a much

01:22:40   smaller portion of the views than you

01:22:42   might expect it to be yeah I think

01:22:45   you'd be right if that is the case I

01:22:46   would certainly encourage you to not

01:22:48   waste your time on it because if you're

01:22:50   tracking like how much time do you spend

01:22:52   on Facebook and how many views does that

01:22:54   actually translate into I bet that

01:22:56   equation would show very fast that

01:22:57   Facebook is not worth actually spending

01:22:59   any time on I would definitely say I

01:23:01   don't waste time on it I do probably

01:23:03   waste time on you know looking at my

01:23:05   friend's puppy but I don't waste

01:23:08   business time on it like if there's a new

01:23:09   numberphile video I'll just go in say

01:23:12   hey everyone there's a new numberphile

01:23:13   video I hope you like it mm-hmm

01:23:14   here's the link and then leave and I

01:23:16   won't hang around reading comments and

01:23:18   you know it's perfunctory Facebook

01:23:23   terrible I like this much cuz I could

01:23:26   have expired a thing you've got me

01:23:27   thinking now

01:24:01   we have much to complain about we do

01:24:03   have much to complain about Brady we

01:24:05   may need another 100 episodes Grey oh oh

01:24:09   Brady you can't do that to the people I

01:24:13   can't do that to the people

01:24:13   can't do that to the people

00:00:00   hi this is Derek from Veritasium okay here

00:00:06   we go here we go hmm this is Derek from

00:00:09   Veritasium and you're listening to

00:00:10   the hundredth episode of Hello Internet

00:00:13   maybe it's time to stop making fun of my

00:00:15   name green vs. yellow tennis balls yep

00:00:20   this has become quite the thing yeah

00:00:23   I've seen a little bit of this on my

00:00:25   Twitter timeline so for people who don't

00:00:28   know I'll do the quick background Grey

00:00:30   and his wife had a little marital

00:00:32   discussion about whether tennis balls

00:00:33   were green or yellow which caused Grey

00:00:35   to do a little Twitter poll about it do

00:00:38   you think tennis balls are green or

00:00:39   yellow lots of people got into it got

00:00:41   lots of votes it even got a little bit

00:00:43   of like media coverage which was quite

00:00:45   funny yeah and we discussed it on hello

00:00:47   Internet

00:00:48   and it was really interesting what

00:00:49   happened as a result of the hello

00:00:50   internet discussion there was lots of

00:00:52   Tim-foolery

00:00:53   there were several scientific studies

00:00:54   done I don't know if you've seen the one

00:00:56   that I've popped in the show notes where

00:00:57   someone actually took my advice and

00:01:01   measured the light spectrum from lime

00:01:04   from lemon and from a tennis ball which

00:01:07   it was closer to and so you know they

00:01:08   wrote up a little scientific report and

00:01:10   many a laugh was had and that could have

00:01:13   been the end of it except that one

00:01:16   listener I think he must have found out

00:01:18   that his girlfriend and his girlfriend's

00:01:20   father were going to the tennis so he

00:01:23   decided to tell them I listen to this

00:01:24   podcast they were talking about where

00:01:26   the tennis balls a green or yellow and

00:01:28   the father who is not a Tim who was just

00:01:31   the father of a girl whose boyfriend

00:01:33   listens to hello Internet right was

00:01:35   quite taken by this and the next day he

00:01:37   found himself in a little gaggle of

00:01:39   people who were waiting to meet and get

00:01:41   autographs from quite possibly the

00:01:43   greatest tennis player ever current

00:01:46   world number one tennis player although

00:01:47   he's about to just go back down to

00:01:49   number two but I think he's the greatest

00:01:51   tennis player ever Roger Federer they

00:01:54   were all waiting for Roger Federer to

00:01:55   walk past between whatever matches he

00:01:58   was playing sign a couple of autographs

00:01:59   and whisk away and while he was signing

00:02:02   these autographs this dad calls out

00:02:03   Roger are tennis balls green or yellow

00:02:07   and amongst all the confusion the

00:02:10   question suddenly took Roger

00:02:11   Federer and he looked up and he went

00:02:13   I think he said something along the

00:02:15   lines of they're yellow aren't they mm-hmm and then a

00:02:17   few other people in the crowd went yeah

00:02:18   yeah they're yellow and this guy the

00:02:20   dad who was filming this with his

00:02:22   mobile phone said I think they're green

00:02:24   but lots of other people think they're

00:02:25   yellow and then Roger Federer said no no

00:02:27   they're yellow mm-hmm and then walked

00:02:29   off you know that in itself I think was

00:02:32   a great hello internet moment and I

00:02:35   declared then and there that this Tim's

00:02:38   girlfriend's dad is gonna get a Hello

00:02:40   Internet Medal of Honor is that legal he

00:02:42   doesn't even listen to the show but yeah

00:02:44   I think he's getting one and he's

00:02:45   well within enough degrees

00:02:47   of separation from the initiating

00:02:49   incident that he's gonna get a medal

00:02:51   it's never been written anywhere that

00:02:52   you have to be a listener to get a Medal

00:02:54   of Honor I imagine it helps I mean to be

00:02:56   fair I don't think we have anything

00:02:57   written anywhere about how anything

00:03:00   works but yes that's true we haven't

00:03:02   written down rules about how the medals

00:03:03   work now a lot of people said that I

00:03:05   shouldn't be just giving them

00:03:06   willy-nilly and I needed your approval

00:03:08   but even if you said no to this I would

00:03:10   say too bad he's getting one this is

00:03:13   what happens people maybe could I have

00:03:15   stopped him no of course I could not I could

00:03:18   not would you stop me is the question

00:03:20   you wouldn't veto this one surely this

00:03:22   is a great moment having the greatest

00:03:24   tennis player in history weigh in on

00:03:26   this squabble about the color of

00:03:28   tennis balls I certainly could disagree

00:03:30   with you on this you sent me a video and

00:03:33   from my perspective it's some guy in a

00:03:35   crowd commenting on the tennis ball

00:03:37   thing right it's like oh okay

00:03:39   great I feel like the medals should be

00:03:42   going to actual listeners of the show so

00:03:44   I do feel like I can disagree with you

00:03:46   but I just you seem so tickled by this

00:03:50   that I know there's nothing I can do to

00:03:53   stop you from simply mailing out one of

00:03:55   these medals to a guy who's going to be

00:03:57   very surprised to just get a box with

00:03:59   an award in it it would be the second one

00:04:03   awarded look right I see what you mean

00:04:06   about you have to be a listener right

00:04:08   like you have to be a citizen of a

00:04:09   certain country to win certain honors

00:04:11   but I think medals of Honor generally

00:04:13   are just for the furtherance of the

00:04:15   greatness of hello Internet and this

00:04:17   definitely has furthered the cause in a

00:04:19   very special way to me so I thought it

00:04:21   was fantastic right I thought was

00:04:22   absolutely fantastic I had a right laugh

00:04:24   at it and that could have been it just

00:04:27   so you

00:04:27   understand I know you don't really

00:04:28   appreciate Who Roger Federer is this is

00:04:30   the equivalent of say instead of tennis

00:04:32   balls you and your wife had been

00:04:33   discussing something about mr. chompers

00:04:36   ears you couldn't decide something about

00:04:38   his ears or not mm-hmm and then someone

00:04:41   went and got an official comment from

00:04:43   Charles Darwin about it this is what it

00:04:44   is to me this is a level of greatness in

00:04:47   the field this is the Charles Darwin or

00:04:49   Isaac Newton of tennis hmm he just

00:04:51   happens to be alive this means that I'm

00:04:53   officially right in your eyes is that

00:04:56   how this works well that's another issue

00:04:58   I tell you another thing though about

00:04:59   Roger Federer that you may not

00:05:01   appreciate but you know how there are

00:05:03   certain things on the internet that you

00:05:05   don't mess with because everyone will

00:05:07   jump on you like you know electric

00:05:08   fences because they have really jealous

00:05:11   followers Roger Federer is one of those

00:05:13   things that was about to say can you

00:05:15   give me an example of things that you

00:05:16   don't mess with on the Internet Brady

00:05:18   like what are beehives you are not

00:05:19   supposed to poke with sticks you're

00:05:21   normally the one who tells me so you

00:05:23   don't worry about them than me you're

00:05:25   the one who says no we're not talking

00:05:26   about that I have never said such a

00:05:28   thing Brady criticizing Roger Federer

00:05:30   is such a beehive never criticize Roger

00:05:33   Federer because his fans known as the

00:05:35   fed heads Oh God are very zealous

00:05:38   this is like tennis Logan Paul is

00:05:41   that what this is like I'll make jokes

00:05:42   about Elon Musk till the cows come home

00:05:44   but I'm not saying anything bad about

00:05:45   Roger Federer are his fans so annoying

00:05:48   that you don't want to like him Brady is

00:05:50   that one of these situations no comment

00:05:53   no comment from Brady at all so anyway

00:05:55   after Roger Federer weighed in on the

00:05:59   debate mm-hmm

00:06:00   this story for lack of a better word

00:06:03   completely blew up and went absolutely

00:06:07   everywhere like The Today Show in

00:06:09   America it was in all the newspapers it

00:06:12   was on all the websites it was all over

00:06:14   the BBC tennis balls yellow or green for

00:06:17   like a day and a half or two days became

00:06:19   an absolute obsession a media obsession

00:06:23   any mention of gray and his wife where

00:06:26   it all started or hello Internet had

00:06:28   completely vanished from existence lots

00:06:31   of Tims were upset about this they're

00:06:32   like oh they're not mentioning gray and

00:06:34   they're not mentioning hello Internet like

00:06:35   that's just what happens with these

00:06:36   things yeah but it became absolutely

00:06:38   massive you couldn't look anywhere

00:06:41   without it coming up do you care or you

00:06:43   relieved that you kind of got pushed out

00:06:45   of the story or I mean you kept sending

00:06:49   me more and more links to stories

00:06:52   about where it was being posted and the

00:06:56   number one thing no I don't feel like I

00:06:59   need to be credited that I worded a

00:07:02   tweet about a discussion between my

00:07:04   wife and I about tennis balls like that is

00:07:06   the kind of thing on the internet that

00:07:08   trying to have any kind of ownership of

00:07:10   that is just like an insanity right so I

00:07:13   want to be really clear I would have no

00:07:15   expectations that by the time this thing

00:07:18   gets pattern matched up to Good Morning

00:07:21   America that Good Morning America is like

00:07:23   gray and his wife were having a

00:07:24   discussion about tennis balls right like

00:07:26   I would not expect that but I did have

00:07:28   this different feeling where I kind of

00:07:30   did not like you keeping sending me

00:07:32   these articles where you're like oh look

00:07:33   it's over here now three or four I think

00:07:37   it was three or four hundred it felt

00:07:39   like it would likely continue but you are

00:07:43   doing me a disservice here it was not

00:07:44   that many it was three or four it wasn't

00:07:47   that many but it was also on Twitter

00:07:48   like people just kept sending me like

00:07:50   hey look the thing is over here okay so

00:07:52   there is sort of a way where I feel like

00:07:54   how can I say this thing without

00:07:56   sounding like the world's most jaded

00:07:59   asshole I think there's a thing where

00:08:05   people want you to be like happy that

00:08:07   this thing that you started has spread

00:08:09   far and wide throughout the news media

00:08:11   and my response to this is not a kind of

00:08:14   happiness it's just like enough of these

00:08:17   things have happened over the years I've

00:08:18   seen this happen with other people

00:08:20   enough over the years that this is just

00:08:21   like a pattern and I feel like this is

00:08:24   what the media machine does like they're

00:08:27   looking for things that they know are

00:08:29   going to get them views and attention

00:08:31   and they will just repeat and pattern

00:08:35   amplify that and copy what the

00:08:37   others are doing

00:08:38   yeah and copy the others but they copy

00:08:39   the others in this like battle of the

00:08:41   memes for attention of who's watching TV

00:08:44   and any kind of thing like this where

00:08:45   you can divide people in teams like it's

00:08:48   going to spread but my feeling people

00:08:50   are like oh aren't you excited like it

00:08:52   made the New York Times is like not

00:08:54   exciting

00:08:54   this is like literally the job of these

00:08:58   companies is to find things that will

00:09:00   get people's attention and to pattern

00:09:02   amplify them across time so I almost

00:09:05   felt like oh this like thing that was

00:09:06   fun for the podcast like now it is

00:09:08   everywhere in the world like and it's

00:09:10   just been pattern amplified all over the

00:09:12   place it was a strange thing to see this

00:09:15   spread across the internet but that's

00:09:16   what happens like catchy ideas spread it

00:09:19   would be like definitionally impossible

00:09:22   for them not to spread but that's how

00:09:24   these things go they spread across the

00:09:25   internet very fast and very far just to

00:09:28   be clear I wasn't sending you those

00:09:30   things because I wanted you to be like

00:09:31   excited or happy or proud I was winding

00:09:34   you up basically because I don't know

00:09:37   how much you hate the media so the fact

00:09:39   that like you and your wife just you

00:09:41   know chatting about tennis balls for ten

00:09:43   minutes in your house can make it as a

00:09:45   news story in The Times of London yeah

00:09:47   like I expect you to see that and go oh my

00:09:50   god what world do I live in I'm sending that

00:09:52   to you to like poke the grumpy old man

00:09:54   and get him more upset about news not

00:09:56   like well done you should be proud

00:09:58   you've made it yeah I know that's 100%

00:10:01   what you're doing which is also why

00:10:02   Brady you may have noticed you've got

00:10:03   very few responses from me Brady sends

00:10:07   the articles through and I say

00:10:09   nothing back right no reply because I know

00:10:13   you're trying to wind me up right and

00:10:15   that's why I CC your wife into the

00:10:17   messages as well because she's polite

00:10:19   and replies and she was also getting a

00:10:22   bit wound up and I happened to be away

00:10:24   while this was going on and you were

00:10:25   messaging my wife and I and I was busy

00:10:28   but I came very very close to sending my

00:10:31   wife a message which would have been

00:10:33   something along the lines of like don't

00:10:35   keep replying to Brady or just

00:10:36   encouraging her because she was all like

00:10:38   tweeting right this is

00:10:40   outrageous she got the joke but yeah no

00:10:46   of course of course I know you're trying

00:10:47   to wind me up and it fits exactly in

00:10:50   like the gray checklist of not media

00:10:53   hatred but media sadness disappointment

00:10:57   a little bit of despisal I'm not

00:11:01   angry at the media I'm disappointed yeah

00:11:03   I'm disappointed and next time you want

00:11:06   to tell me about the brave Fourth Estate

00:11:08   protecting

00:11:08   our democracy from all sorts of ills

00:11:10   I'll remind you that they spent lots of

00:11:12   time on whether or not tennis balls are

00:11:14   green or yellow can I give you that I

00:11:16   know we talked about how I will

00:11:19   sometimes cut things that you say from

00:11:21   the show segments that I find boring

00:11:23   sports ball corners sometimes but I have

00:11:26   to say that last episode I did have a

00:11:28   real crisis of confidence of if I should

00:11:31   even put up that section about necessary

00:11:33   lies of society there is something about

00:11:35   it that just felt like I don't know if I

00:11:37   really want to open this door publicly I

00:11:39   really did think about it like should we

00:11:41   put up this little section we're not

00:11:43   really saying anything that's terrible

00:11:44   in here this is not like a brand new

00:11:46   idea that has never been released into

00:11:48   the world but there was something about

00:11:50   it that I felt a little bit

00:11:51   uncomfortable about but once the show

00:11:54   was up then I immediately asked on

00:11:55   Twitter like hey everybody tell me your

00:11:57   necessary lies to keep civilization

00:11:59   together because I'm like well when it's

00:12:00   out here let's just double down and it's

00:12:03   interesting because I think I got back

00:12:04   many terrible answers and that people

00:12:08   would just say things that are like

00:12:10   phrases that you hear that simply aren't

00:12:12   true so I'm trying to think about how to

00:12:15   define more clearly this idea like a

00:12:17   necessary lie of civilization and I

00:12:20   think your one just then as the

00:12:22   transition is totally one like cheaters

00:12:24   never win is a kind of civilization

00:12:27   propaganda that we have to try to get

00:12:29   people on board with even though it's

00:12:30   not true anyway so there were a bunch of

00:12:33   answers that I thought were not really

00:12:35   good or just like aphorisms but I think

00:12:38   the best answer that I got was from

00:12:41   greed grow on reddit and the necessary

00:12:44   lie is that violence is not the answer

00:12:47   or violence is not the solution man

00:12:50   that is a really great necessary lie of

00:12:53   civilization

00:12:54   it's almost definitionally there to keep

00:12:57   civilization together that like

00:12:58   civilization is the resolution of

00:13:02   problems without having to resort to

00:13:05   violence and that feels like a real

00:13:07   ground level kind of agreement that

00:13:10   there are plenty of things that you

00:13:12   could probably try to solve with

00:13:14   violence but we all have to agree

00:13:16   together that like we're not going to

00:13:18   resort to this tool and we have to agree

00:13:22   on these

00:13:22   things together like we're not going to

00:13:25   resort to violence as a solution but

00:13:28   it's like there are plenty of situations

00:13:30   where we use violence as the solution we

00:13:32   have Wars right or it's like we pay

00:13:33   police like to essentially force things

00:13:37   to occur so I really like that as an

00:13:38   answer for a necessary lie of

00:13:40   civilization is like we teach kids like

00:13:42   violence is not the answer violence is

00:13:44   not the solution and it often can be we

00:13:47   just we can't resort to it otherwise

00:13:49   we'll be a bunch of chimps tearing each

00:13:52   other apart

00:13:52   so violence isn't a solution then

00:13:54   because ultimately we tear each other

00:13:56   apart violence is not the solution if

00:13:58   everyone uses violence the violence can

00:14:00   be a solution if a select few people use

00:14:02   it yeah that's really getting to the

00:14:03   core of like we were talking last time

00:14:06   about the psychopathy school of like oh we

00:14:07   pull some people apart and we say like

00:14:09   hey here's some things you might want to

00:14:10   know that are actually true or like you're

00:14:11   gonna be very successful or very

00:14:13   terrible people and being willing to

00:14:16   resort to violence is one of those

00:14:17   things it's like if everybody does this

00:14:19   civilization falls apart which is why

00:14:21   it's a necessary lie but like on an

00:14:23   individual basis it can kind of work the

00:14:25   thing it made me think about which it

00:14:26   can it's like how can I don't even want

00:14:28   to say it out loud when I was in middle

00:14:30   school and like on the edge of being a

00:14:32   kid who was about to be bullied a little

00:14:35   bit and you know the adults tell you

00:14:37   it's like oh I'll ignore the situation

00:14:39   you know don't do anything bullies just

00:14:41   feel bad about themselves which is also

00:14:43   like a weird lie

00:14:44   like you hear all of this advice and my

00:14:46   experience as a kid was nothing stopped

00:14:50   the beginning of bullying faster than

00:14:52   adding a little bit of violence into the

00:14:54   solution what did you do grey because my

00:14:57   question I was gonna ask you was have

00:14:58   you ever used violence as a solution and

00:15:00   I thought there's no way he'd answer

00:15:01   that

00:15:02   and you've just offered it up well your

00:15:06   criminal past is not criminal it's

00:15:08   self-defense

00:15:09   what did you do did you punch someone a

00:15:11   bully did you punch your bully look I

00:15:13   don't even really want to get into the

00:15:14   details but it sounds cooler if you

00:16:17   don't tell anyway because then we can

00:15:19   imagine something more awesome I don't

00:15:20   know look here's the thing the reason

00:15:22   why I would be like a potential bullying

00:15:26   target is because I wasn't like a big

00:15:27   buff kid or anything like shock

00:15:29   surprise younger gray was kind of a

00:15:32   nerd right like total surprise I'm sure

00:15:35   to absolutely everyone

00:15:36   oh you mean he wasn't quarterback at the

00:15:40   school football team yeah that's not the

00:15:42   image of how it was so you just had to

00:15:44   lay down a marker once you lay down a

00:15:47   marker and the trick is you have to be

00:15:50   able to convince the other person that

00:15:52   retaliation will be unreasonably large

00:15:55   again I don't want to say these things

00:15:57   out loud Brady why do I say them into

00:15:58   the microphone but it's just you and me

00:16:00   here when I was a teacher I would just

00:16:02   see it like I had the interesting

00:16:03   experience of teaching at mixed schools

00:16:05   with both boys and girls teaching at

00:16:07   girl only schools and teaching at boy

00:16:10   only schools like it was really

00:16:11   interesting as an adult

00:16:12   to see this like from a different

00:16:13   perspective and it's like oh there's a

00:16:16   couple of lessons that I learned there

00:16:17   like lesson number one is high school is

00:16:21   even more awful like viewing it from the

00:16:23   perspective as an adult than it was as a

00:16:25   kid but it's like oh okay I can see that

00:16:28   none of the boys make it through school

00:16:30   without being subjected to some kind of

00:16:33   physical violence and none of the girls

00:16:35   make it through school without being

00:16:36   subjected to some kind of psychological

00:16:38   violence from each other it's like it's

00:16:40   just so obvious

00:16:41   I like that violence is not the answer

00:16:42   or the solution one I feel like that was

00:16:44   just a like a really really great answer

00:16:47   to this question and I'll be sure to cut

00:16:51   or dramatically cut down this section

00:16:52   don't let me leave it up Brady don't let

00:16:55   me leave it up we're gonna tear the

00:16:56   world apart I don't mean to hello hello

00:16:59   internet listeners is there something

00:17:01   that you want to learn is there a skill

00:17:03   that you want to acquire something you

00:17:05   wish you could do that you can't do now

00:17:07   then you should check out Skillshare

00:17:10   Skillshare is an online learning

00:17:12   community with thousands of classes in

00:17:15   design business technology and more with

00:17:19   Skillshare Premium Membership you get

00:17:21   unlimited access to high-quality classes

00:17:23   on must know topics so you can improve

00:17:26   your skills unlock new opportunities and

00:17:28   do the work you love right now you're

00:17:31   listening to a podcast and I'm willing

00:17:33   to bet that many of you probably want to

00:17:36   start a podcast of your own but you

00:17:38   don't know where to begin and it can be

00:17:40   pretty overwhelming even after having

00:17:43   done this for years I'm constantly

00:17:44   surprised at how fiddly and fussy doing

00:17:47   a podcast can be well

00:17:49   Skillshare has podcasting classes and

00:17:52   not just one thing on podcasting they

00:17:55   cover all of the various ranges from the

00:17:58   technical side about how to get started

00:18:00   to the editing side how to work with

00:18:03   tools like Logic or Adobe to actually

00:18:05   edit the podcast and they have classes

00:18:07   on how to improve your extemporaneous

00:18:10   way of speaking so across the whole

00:18:13   range of everything you could want to

00:18:15   know about how to make a podcast

00:18:17   Skillshare has classes on that so again

00:18:20   an annual subscription to Skillshare is

00:18:22   less than ten dollars a month but right

00:18:25   now if you go to Skillshare.com/hello

00:18:28   the first 1,000 people to sign up

00:18:31   will receive their first two months for

00:18:33   99 cents that's Skillshare.com/hello

00:18:37   and the first 1,000 people will

00:18:40   receive their first two months for 99

00:18:42   cents there are quite a lot of hello

00:18:44   internet listeners so if you are hearing

00:18:46   the sound of my voice right now you

00:18:48   probably want to go and check that out

00:18:51   as soon as you can it gets you a

00:18:53   discount on Skillshare and let's

00:18:55   Skillshare know that you came from our

00:18:57   show so thanks to Skillshare for

00:18:59   supporting hello Internet and thanks to

00:19:01   Skillshare for teaching people the

00:19:03   skills they can use to make their lives

00:19:06   better a thing has happened which we all

00:19:10   know we have been waiting for it to

00:19:12   happen a bad thing that was inevitable

00:19:15   which is we have had now the first death

00:19:20   as a result of a self-driving car as in

00:19:23   hitting a pedestrian because people have

00:19:24   died in their own cars already there

00:19:26   have been a bunch of these like leading

00:19:28   up incidents I think some of which

00:19:30   we've even talked about on the show in

00:19:32   the past where like someone is in a car

00:19:35   and they're not paying attention or like

00:19:37   there's accidents but this is one where

00:19:40   it was

00:19:42   uber testing a like fully autonomous

00:19:45   self-driving car and there was someone

00:19:48   behind the wheel who was just like

00:19:50   keeping an eye on the equipment and a

00:19:52   pedestrian walking a bicycle across the

00:19:55   street got hit by the uber and died

00:20:00   this to me is like the first like

00:20:03   clearest example of someone has died

00:20:06   from a self-driving car that was being

00:20:09   autonomous and it's a car that is

00:20:11   designed to be autonomous in the way

00:20:14   that we're all thinking about as opposed

00:20:16   to the malfunction of a system which is

00:20:19   much more like an assistive driving

00:20:22   system or like a lane maintaining system

00:20:24   yeah what do you think about this as a

00:20:27   moment in time or as a news story that

00:20:30   has occurred it's a significant sort of

00:20:32   moment like in terms of being a

00:20:34   milestone I've seen footage of the

00:20:35   accident mm-hm and I've seen what a lot

00:20:37   of people wrote about the accident

00:20:39   and like to look at it without knowing

00:20:42   all the facts like it does seem like

00:20:44   this person was crossing the road in a

00:20:47   way that wasn't like being super super

00:20:50   cautious like they did seem to be

00:20:53   crossing the road in a way that looked

00:20:55   dangerous but we're also told these

00:20:59   self-driving cars can see everything off

00:21:01   to the sides and know what's going on

00:21:02   and predict these things and so all the

00:21:05   people who are saying well of course no

00:21:07   one could have avoided that no human

00:21:09   could have avoided it how do we expect a

00:21:10   self-driving car to avoid it well I will

00:21:13   hold self-driving cars to a higher

00:21:15   standard but you can only hold them to

00:21:17   so high a standard not that this

00:21:19   person wanted to but

00:21:22   if someone wants to die by jumping in

00:21:24   front of a car they can still do it

00:21:25   with a self-driving car so I think we

00:21:27   have to realize people are still gonna

00:21:29   die at the hands of self-driving cars I

00:21:32   think this was just a a milestone that

00:21:34   was always gonna happen

00:21:35   it'll slow things down it does make me

00:21:38   wonder if when the first person died

00:21:39   from an actual car like you know back in

00:21:42   the olden days whether or not cars being

00:21:45   on the road was stopped for like six

00:21:47   months while people rethought the

00:21:48   situation like was it the same back then

00:21:51   when the first pedestrian got hit by

00:21:52   some five miles an hour

00:21:55   automobile did everyone go no we can't

00:21:58   have these things on the road and go up

00:22:00   in arms well there was no social media

00:22:01   on which to share these stories

00:22:04   who knows how information was spread

00:22:06   back in those days of yore nobody nobody

00:22:08   knows I think it's interesting because

00:22:10   this happened a little while ago and I

00:22:12   remember

00:22:13   when the first thing that could be even

00:22:15   classified as a kind of accident with

00:22:17   any self-driving occurred which i think

00:22:18   was something like a tesla system a

00:22:20   couple years ago on the highway had an

00:22:22   accident that became like the first

00:22:23   story about this I thought it was

00:22:26   interesting to see that that blew over

00:22:28   relatively quickly and I feel like this

00:22:31   story as well at least from the little

00:22:33   bit I've been keeping an eye on it seems

00:22:35   like it's blowing its way through the

00:22:38   news cycle pretty quickly yeah I don't

00:22:40   know if this is actually going to have

00:22:42   any significant effect on the

00:22:45   development of self-driving cars you can

00:22:48   find the video online that shows the

00:22:50   camera in front of the car and I agree

00:22:53   that it's a case where no human driver

00:22:55   could have possibly avoided that

00:22:56   accident it's at nighttime the woman

00:22:59   walking the bicycle is walking across a

00:23:01   highway where there's not a pedestrian

00:23:03   intersection and by the time she appears

00:23:06   in the headlights

00:23:07   the accident is inevitable like there's

00:23:09   no way even if the car slammed on the

00:23:11   brakes immediately there's no way that

00:23:13   you could possibly do it but I feel like

00:23:15   I don't need self-driving cars to be

00:23:20   better than humans to be on the road I

00:23:23   feel like they just need to be

00:23:25   equivalent to humans that is my my point

00:23:29   of like where can we start putting

00:23:31   self-driving cars on the road no I need

00:23:33   to be better ok why do they need to be

00:23:35   better and how much better well

00:23:37   they need to be better cuz we're so bad at

00:23:39   driving like our level was unacceptable

00:23:41   basically but they don't need to be

00:23:44   infallible they need to be more

00:23:46   impressive than humans and how could

00:23:48   they not be because we're such bad

00:23:50   drivers they are already better so I

00:23:52   think this debates already over yeah I

00:23:54   did see someone on Twitter make a joke

00:23:56   or something like a self-driving car has

00:23:58   killed one person today and people

00:24:00   driving cars will kill 3,000 people

00:24:01   today right which is like Oh God it's an

00:24:04   unfair comparison in many ways but I

00:24:06   think it really does make a point of

00:24:08   humans are terrible drivers my feeling

00:24:11   there was like I'm happy with

00:24:12   equivalency because I think the benefits

00:24:15   that you get from the self-driving car

00:24:17   then make that mental calculus of is

00:24:19   this better it makes it clearly better

00:24:21   even if the self-driving cars are only

00:24:23   at human level and I feel like it's

00:24:25   inevitable that they'll obviously keep

00:24:27   improve

00:24:27   over time but you're right that it is a

00:24:30   case where the story about the

00:24:32   self-driving cars is like oh they have

00:24:34   lidar which looks in all directions and

00:24:35   can see everything it does raise the

00:24:38   human expectation that like oh this car

00:24:40   should always be able to see all

00:24:42   pedestrians in all situations I also

00:24:45   wanted to mention the story because

00:24:46   there's a thing that has been on my mind

00:24:48   for a while with self-driving cars which

00:24:52   I find myself wondering or maybe I

00:24:57   should say reevaluating my position on

00:24:59   how fast this is actually going to occur

00:25:01   because when I put out that humans need

00:25:05   not apply video a few years ago it felt

00:25:08   like things were moving very quickly in

00:25:10   the world of self-driving technology and

00:25:12   most people were not aware of it and now

00:25:15   that people know self-driving cars are a

00:25:18   thing and that they exist and people are

00:25:20   sort of waiting for self-driving cars it

00:25:24   feels like there has been just very very

00:25:28   slow progress in this field and I wonder

00:25:33   like is this a technology that is going

00:25:34   through some kind of s-curve where there

00:25:39   was no progress made for a long time and

00:25:41   then like the 2000s things start

00:25:43   to speed up and accelerate and then we

00:25:46   hit like 2015 and it looks like oh man

00:25:49   it's gonna be here any day and then all

00:25:50   of a sudden it starts to really level

00:25:52   off like the rate at which we're making

00:25:53   progress in this area and I do wonder if

00:25:56   it is the case where it's a

00:25:59   technological problem where the last 10

00:26:01   or 15 percent of the problem turns out

00:26:04   to be you know 85% of the actual work

00:26:07   that getting a car to not quite human

00:26:10   level is much much simpler than getting

00:26:14   it to human level like getting it to

00:26:16   that last 10% but I am aware that I feel

00:26:19   like past me expected there to be more

00:26:22   progress in self-driving cars by now

00:26:24   than there actually has been and I feel like

00:26:27   we've only seen the smallest scale tests

00:26:29   of these things so you think this last

00:26:32   10 or 15 percent that we might be going through

00:26:35   at the moment is purely going to be a

00:26:37   technological hurdle we haven't started

00:26:39   running into the social and political

00:26:41   side of things you think it's so

00:26:43   technology now it's funny I didn't

00:26:45   mention it because that is just not a

00:26:47   thing that's even on my radar I really

00:26:48   just don't think that that is going to

00:26:50   be any kind of significant barrier right

00:26:52   I still think it is such a technological

00:26:55   advantage if we can get it there that

00:26:57   there's no way that the places that say

00:27:00   oh we're gonna wait on self-driving cars

00:27:02   I feel like that is such a self-imposed slowdown

00:27:04   that very few places will do that like

00:27:06   it would be like banning cellphones

00:27:08   because we're not quite sure how

00:27:10   addictive they are yet it's like well

00:27:12   okay good luck with that

00:27:13   but the rest of the world is gonna

00:27:15   totally leave you behind

00:27:16   and I think that yeah when we do get the

00:27:18   self-driving cars it'll be the same

00:27:19   thing that if some states or some

00:27:21   country say mmm we're gonna hold off on

00:27:24   this for a little while it's going to be

00:27:25   just such a tremendous economic

00:27:26   advantage that they won't be able to do

00:27:28   so oh yeah

00:27:29   I don't know I wonder if it's just gonna

00:27:31   take a lot longer than I expected I hope

00:27:33   it doesn't but I'm a little worried that

00:27:34   it might well I'm finding every time I

00:27:36   think it's gonna take forever at speeds

00:27:38   up and every time I think it's gonna be

00:27:39   tomorrow it slows down so I've given up

00:27:41   all together I have this feeling almost

00:27:43   like as soon as we publish this episode

00:27:45   again it's going to be like oh and

00:27:47   self-driving cars are here in three

00:27:49   weeks they're on sale right you can

00:27:50   pre-order them now I read an article in

00:27:52   the paper this morning that really

00:27:53   disappointed me about self-driving cars

00:27:55   they hmm they're talking about building

00:27:57   into the interface for self-driving cars

00:28:00   things to keep people's attention

00:28:02   whether it's like you know games of

00:28:04   tetris or things like that so that

00:28:07   people will stay alert so they can take

00:28:09   over the car in an emergency that's not

00:28:11   what I want my self-driving car to be I

00:28:13   want to be in the back reading the paper

00:28:15   or working on my laptop I don't want to

00:28:17   have to be ready to take over in an

00:28:18   emergency that's an excellent point and

00:28:20   I think there is a way where we are

00:28:23   actually in a little bit of a dangerous

00:28:26   like local minimum right now with

00:28:28   self-driving technology because like my

00:28:31   parents bought a new car that has a lot

00:28:33   of drive assistance stuff in it so we

00:28:37   can do things similar to like

00:28:38   what Teslas will do where you can tell

00:28:41   it to stay in a lane and it will

00:28:43   maintain its position within that lane

00:28:46   on a highway even if it goes around the

00:28:47   corner or it will be able to like

00:28:50   if you're trying to minimize the

00:28:52   amount of traffic like it'll keep an

00:28:54   equal distance between the car

00:28:55   in front of it and behind it on the

00:28:57   highway and it will do some limited

00:29:00   accident avoidance stuff and it's very

00:29:03   interesting to drive that and play

00:29:04   around with that but I really do think

00:29:06   it does put humans in the worst

00:29:10   situation where the car is good enough

00:29:14   on the highway that you feel like you

00:29:16   don't need to pay any attention at all

00:29:18   which must dramatically raise the

00:29:21   possibility of an accident like I would

00:29:23   love to see what the numbers are for

00:29:26   what is the accident rate per 100,000

00:29:30   miles when lane maintenance is engaged

00:29:33   on a highway and I wouldn't be the least

00:29:35   bit surprised if like minor traffic

00:29:37   accidents go up because the person feels

00:29:41   like I don't have to really pay

00:29:42   attention I can be doing something else

00:29:44   like and I noticed that behavior in

00:29:46   myself when I was driving that car it's

00:29:49   like oh I'm on the highway I feel much

00:29:51   safer looking at my phone in a way that

00:29:55   if I was driving a car without it I feel

00:29:57   like the phone doesn't exist for me if

00:29:58   I'm driving a car without any assistance

00:30:00   stuff and I found myself like oh maybe I

00:30:03   can just look at that text message and

00:30:05   see what it was about or I want to

00:30:06   change what podcast I'm listening to I

00:30:08   can flip over and do like a fiddly app

00:30:10   switch on my phone right now because

00:30:11   the car's got it on the highway so I

00:30:13   wouldn't be surprised if cars with that

00:30:15   stuff are a little bit worse than cars

00:30:17   without it that's like we need to push

00:30:18   past here to get to exactly what you're

00:30:21   talking about like I can sit in the back

00:30:23   with a laptop and not have to care at

00:30:25   all but if you need to have Tetris on

00:30:28   the dashboard to keep a person mentally

00:30:29   engaged I think that's almost worse than

00:30:32   not having that technology at all there

00:30:35   was a plane crash caused by a pilot

00:30:37   basically saying these things can land

00:30:39   themselves I'll prove it and didn't look

00:30:41   while landing and then crash the plane

00:30:43   and killed a bunch of people pilots no

00:30:44   dares in the cockpit okay no we don't

00:30:48   want that that's no good okay I've been

00:30:51   going through the listener emails you

00:30:53   know what people are doing what people

00:30:54   have to say I know you want to have the

00:30:57   people feel like you're listening to

00:30:59   them you're looking through their

00:31:00   feedback always listening that's a

00:31:03   little much oh no no I know when they

00:31:04   want to be listened to and sometimes not

00:31:06   been

00:31:07   but go message for Brady at gmail the

00:31:10   number four is where these things should

00:31:12   go and I will have a look from time to

00:31:15   time including an email from Ryan which

00:31:19   was so interesting I sent him an email

00:31:21   back asking for some more details and

00:31:23   boy did he provide them he wrote a very

00:31:26   lengthy reply thank you very much Ryan

00:31:28   but the main reason he caught my

00:31:30   attention was he has gotten himself a

00:31:33   tattoo that I thought I should share

00:31:35   with you ah he already had tattoos and

00:31:39   this is like he's kind of added this as

00:31:41   you'll see I think these tattoos are

00:31:43   sort of to reflect his interests sorry I

00:31:46   accidentally just sent them to my wife

00:33:49   who replied with oh my god then I sent them

00:31:55   to you I thought why is grey replying

00:31:57   with oh my god and not just saying it

00:31:58   out loud I've decided that we should do

00:32:01   this podcast via texting Brady there we

00:32:03   go now hopefully you've got them too oh

00:32:05   my god we have a nail and gear tattoo

00:32:09   that is connected to Oh God

00:32:14   okay USB symbol at the other end and

00:32:17   that's a representation of a certain

00:32:19   kind of chemical connection right the

00:32:22   hexagon with the three lines yeah I mean

00:32:23   that kind of looks like a benzene ring

00:32:25   but it can't be that cuz it's got a

00:32:27   whole bunch of other stuff I don't know

00:32:28   what that is but yes so three symbols

00:32:30   connected to each other the nail and

00:32:33   gear the benzene ring and the USB symbol

00:32:36   it's like a chemical compound that we

00:32:38   don't know what it is a USB symbol

00:32:40   coming off and most recently he's added

00:32:42   quite a bold nail and gear to his inner

00:32:45   arm it looks very fresh that nail and

00:32:47   gear yeah that picture you're looking at

00:32:49   was very soon after it was done he says

00:32:51   I really love the podcast I'm a huge fan

00:32:53   of you and gray the HI podcast got me

00:32:55   through some tough times and I want my

00:32:57   tattoos to be a sort of time capsule of

00:32:59   what was important to me at each stage

00:33:01   in my life

00:33:02   it took about half an hour to have done

00:33:03   my friends like the tattoo although some

00:33:06   of them got a bit confused when they

00:33:07   asked me what the nail and gear was and I

00:33:09   went on a five-minute tangent on the

00:33:11   podcast about plane crashes and flags

00:33:14   so this is a thing Brady you send me

00:33:16   these I get nervous with the tattoos

00:33:18   there is definitely a way in which I

00:33:21   obviously think about tattoos in a

00:33:25   different way than people who get

00:33:26   tattoos think about tattoos I think that

00:33:28   way too

00:33:29   yeah yeah and even that comment from

00:33:32   Ryan sums up what I've heard from a lot

00:33:36   of people with tattoos where they want

00:33:37   them as markers of a point in life and I

00:33:43   can't think about tattoos that way I am

00:33:47   tattoo-less as I'm sure will be a

00:33:50   surprise to no one in the world it'll be

00:33:52   a surprise to Ryan cuz he says in his

00:33:53   longer version of the email that he

00:33:55   feels like he's on the same wavelength

00:33:56   as you not about tattoos they may be

00:33:59   just about other things I feel like

00:34:01   there's a lot of pressure all of a

00:34:02   sudden right because when someone puts a

00:34:04   tattoo on their body and they say oh I

00:34:08   love the show right now what happens if

00:34:12   you decide in the future that you're not

00:34:15   such fans of us but nothing's ever gonna

00:34:17   change the fact he was a fan for I know

00:34:19   I know this is exactly it like this is

00:34:21   what I try to understand is this idea of

00:34:24   it is a marker of a time and so the

00:34:27   future is not affecting the thing like

00:34:30   tattoos I feel like are the opposite of

00:34:32   my mental framework because people who

00:34:37   get tattoos it does seem to be much more

00:34:40   past focused like it's a marker of a

00:34:42   thing that happened and of course I feel

00:34:46   like I want nothing to do with the past

00:34:49   it doesn't exist anymore I have no

00:34:51   interest in it it is gone

00:34:53   and it is only about the future so when

00:34:55   I look at tattoos I can't not always

00:34:57   mentally frame them from the perspective

00:35:00   of what will future me think about this

00:35:04   tattoo which then always leads me to the

00:35:06   conclusion of well I don't have any idea

00:35:09   what that guy's going to think about

00:35:10   this and then like well I can't make

00:35:13   decisions for that guy in this way so

00:35:16   I'm like I'm not going to get a tattoo

00:35:17   but when I've spoken to people who have

00:35:21   tattoos and I try to explain my thinking

00:35:22   about them they look at me like I'm

00:35:24   crazy

00:35:25   and I clearly don't understand

00:35:27   so yeah I mean it can go wrong I just

00:35:31   sent you a tattoo that was doing the

00:35:32   rounds quite recently that someone got a

00:35:34   few days before all the news stories

00:35:36   broke about Kevin Spacey and he got a

00:35:38   tattoo of Kevin Spacey on his arm and

00:35:39   then like you know all that stuff hit

00:35:42   the fan it's a bit like what does that

00:35:43   mean now that's the problem there's a

00:35:46   reason that most countries also decide

00:35:48   that they are only going to put dead

00:35:50   people on their money is to avoid this

00:35:52   sort of problem if you don't want to put

00:35:54   a living person on your money and then

00:35:57   that living person does something that

00:35:59   you as a country don't like they start

00:36:02   spilling the necessary lies of

00:36:04   civilization and you're like oh no we

00:36:05   just put them on the $2 bill I don't

00:36:07   think Frank Underwood is a president

00:36:09   he'd want to put on your American money

00:36:10   anyway so I think we were gonna always

00:36:12   gonna be safe on that one well Ryan's

00:36:14   happy with it he thinks it looks good

00:36:16   and I think it looks good it does look

00:36:18   good it does look good and we appreciate

00:36:20   it and we will try to keep this a podcast

00:36:22   that is worthy of your tattoo no

00:36:24   sorry I shouldn't say that that's what

00:36:25   you don't mean to say yes because all

00:36:27   right because look we're not gonna be

00:36:28   beholden to Ryan that's not what's going

00:36:30   to happen right and we make no promises

00:36:32   about the future quality of this show

00:36:34   right

00:36:35   Brady feels happy about it I feel

00:36:37   slightly uncomfortable about it but I am

00:36:39   glad that you are glad that's what

00:36:42   matters you got to make decisions for

00:36:43   you you don't make decisions for me oh

00:36:45   so I had another photo from a viewer a

00:36:49   kind of what you do while listening to

00:36:50   hello internet moment that really

00:36:52   tickled me before oh Audrey that's just

00:36:57   that's terrible she's getting worse

00:36:58   grey she wants to be on the podcast but

00:37:00   she is on the podcast so here's one from

00:37:04   Eden I am a student studying ecology I'm

00:37:08   currently working on research about

00:37:09   differences in animal community

00:37:11   structure that result from external

00:37:13   factors around streams one of the things

00:37:17   this involves is collecting fish and

00:37:19   measuring 24 different parts of the

00:37:21   fish's body which is amazing to me

00:37:23   because I didn't know fish had 24 parts

00:37:24   of their body I tend to listen to

00:37:26   podcasts while doing this as it is

00:37:28   tedious work hello internet is of course

00:37:31   the world's greatest podcast trademark

00:37:34   because I like to listen to you and gray

00:37:36   discuss those pesky flag flag ripples

00:37:38   while I work I've attached

00:37:40   lightly morbid proof of my claim keep up

00:37:43   the work so here's the picture the fish

00:37:47   studier

00:37:48   eden has sent grouse I feel like I can

00:37:51   smell all of those fish on that

00:37:53   measuring tray through this photograph

00:37:55   that is a very fishy photograph nicely

00:37:58   arranged in a pattern we have some dead

00:38:01   fish spelling out an HI and some tiny

00:38:06   little fish around a border making the H

00:38:09   I logo all on a medical examining

00:38:11   tray I love that I love that there are

00:38:13   tiny little baby sardines that

00:38:15   make up the little dots of the border

00:38:17   and the big fish make the letters I was

00:38:19   originally going to say I don't believe

00:38:20   this photograph because I don't see the

00:38:22   mandatory podcast logo in the image and

00:38:25   then it took me a second to resolve what

00:38:27   I was actually looking at it like oh

00:38:28   right the fish are the logo I don't know

00:38:30   what kind of fish they are but they're

00:38:31   freaky looking their eyes are scary I

00:38:33   think that's how things look when

00:38:34   they're dead Brady you know they passed

00:38:36   on to the next realm and their eyes are

00:38:38   all cloudy I've seen dead fish in the

00:38:40   fish market and their eyes were as cold

00:38:42   and dead as those ones I'm very happy

00:38:45   that people listen to the show I'm happy

00:38:46   that it gets them through tedious things

00:38:48   in their life and I'm happy that

00:38:50   somewhere in the world someone arranged

00:38:52   a bunch of fish in a laboratory to spell

00:38:55   out the HI logo sentences you'd never

00:38:57   thought you'd say Brady when you bring

00:38:59   stuff to me especially when you bring

00:39:01   stuff from listeners I never know what's

00:39:03   going to happen and I always feel

00:39:04   slightly apprehensive no [sighs] well yeah

00:39:13   it's something big because there is a

00:39:17   plague that is spreading through the

00:39:20   world Brady and that plague's name is

00:39:23   bitmoji yeah bitmoji yep and I know

00:39:29   you know what bitmoji are huh

00:39:32   because you created one as well recently

00:39:36   hey I've had mine for a while I just

00:39:38   never sent it to you yeah as you know

00:39:41   better yeah I think that's why bitmoji

00:39:45   it's a company that's obviously decided

00:39:47   the regular emoji are not enough and so what

00:39:51   they allow you to do is you can create

00:39:54   a little avatar of yourself and then this

00:39:59   app will allow you to place that avatar

00:40:00   in a bazillion situations so it was like

00:40:04   you're creating a custom emoji for

00:40:07   everything that's possible Under the Sun

00:40:09   would you say that's a fair description

00:40:11   of what the bitmoji are it's kind of

00:40:14   fair the only thing that's not fair

00:40:15   about it and I think this is a problem

00:40:17   with the name maybe is it's not like a

00:40:19   replacement for emojis to me it's just a

00:40:22   different toy it's just a different fun

00:40:24   things yeah they're using the the Apple

00:40:26   sticker method they're not really emoji

00:40:28   like bitmoji is a good name but they

00:40:30   are custom stickers but I think that

00:40:32   when people use bitmoji they often start

00:40:35   to displace the regular emoji so what

00:40:39   you could do so people can better

00:40:40   understand if grey said let's meet at 4

00:40:43   o'clock and maybe I would send him an

00:40:46   emoji of just a thumbs up instead I

00:40:48   could send him this customized a little

00:40:49   Brady that looks just like me riding a

00:40:52   horse saying yes let's do it or

00:40:54   something like that right these things

00:40:56   are awful they're terrible I would love

00:40:59   to say one of you I'm gonna design one

00:41:01   to look like you just to see if I can do

00:41:03   it

00:41:03   do not Brady I mean it no you know I won't

00:41:05   put it out there because it'll look so

00:41:07   much like you that that would be unfair

00:41:08   for spoilers but I'm gonna make one just

00:41:10   as my own little project great have you

00:41:15   tried to make one look like you no of

00:41:17   course I haven't tried to make one look

00:41:19   like me I mean look here's this thing

00:41:20   these bitmoji they're just like a

00:41:22   nightmare from an episode of Black

00:41:24   Mirror these are so close to the little

00:41:28   avatars in Fifteen Million Merits which is

00:41:30   still my favorite black mirror episode I

00:41:33   swear to God when people send them to me

00:41:36   they make me sad they really do like I

00:41:40   was on a thread recently with a bunch of

00:41:42   people and it's like oh this is supposed

00:41:44   to be like a happy event and everybody

00:41:46   is expressing their emotions through the

00:41:48   bitmoji maybe there's something wrong

00:41:50   with me but I feel like it sucks all of

00:41:55   the genuine emotion out of the world

00:41:57   like even emojis regular emojis do this

00:42:00   to some extent like there's a way

00:42:02   that they kind of suck out emotion in

00:42:05   your expression

00:42:07   the bitmoji is like more and more people

00:42:08   are using them and because they're

00:42:10   customizable for every situation you see

00:42:13   them all the time and like they make

00:42:15   me sad and seeing people use them to

00:42:19   express thoughts a lot it's like I hate

00:42:21   them I hate them so much

00:42:23   and they're creepy they're so creepy and

00:42:27   I had to endure this longest hour of my

00:42:31   life when my wife finally got sucked up

00:42:34   in the bitmoji thing and she was

00:42:36   creating her own bitmoji avatar she's

00:42:38   good at though she's good at it

00:42:40   she kept asking me does this one look

00:42:44   more like me whenever she would show me

00:42:49   one it's like that doesn't look like you

00:42:51   that looks like an uncanny valley

00:42:53   nightmare version of you the thing I

00:42:56   don't get as well is that like almost

00:42:58   all the women I know they're bitmoji

00:43:00   look exactly the same whereas the men I

00:43:02   know like that kind of looks like them

00:43:04   and I don't know why that is

00:43:05   I can't figure out why it is but like I

00:43:07   know three or four women who use bitmoji

00:43:09   all look the same it could be any of

00:43:11   them my wife wasn't happy with hers and I

00:43:14   said what do you think of mine she's

00:43:16   like yes yours looks just like you like

00:43:17   angrily she said it has more options or

00:43:21   something but I don't know why this has

00:43:22   happened but it feels like it has

00:43:23   happened I'll tell you that bitmoji app

00:43:25   is not short on options though my wife

00:43:28   was doing this on the iPad so she could

00:43:30   have like the full palette of everything

00:43:32   that was possible

00:43:33   I couldn't believe like the enormous

00:43:36   number of options to be able to pick

00:43:39   different noses oh that we would be

00:43:41   blessed with but 19 noses to go through

00:43:44   right there were hundreds of noses she's

00:43:47   like is this one more like me or is this

00:43:48   one more like me like I don't like any

00:43:49   of them they all make me uncomfortable

00:43:50   it's all squarely in the uncanny valley

00:43:53   please never send this to me ever it

00:43:55   will make me sad I don't like it but

00:43:58   they just keep spreading and people keep

00:43:59   using them I want them to go away but

00:44:02   they won't it's too much like it's too

00:44:04   far I feel like the constraint of only

00:44:07   like 200 regular emojis is almost good

00:44:10   for people it's like it's too adaptable

00:44:13   and then people use it for everything

00:44:15   and it it makes me die a little inside

00:44:18   every time I see one

00:44:20   says the man who portrays himself

00:44:22   publicly in videos with an animated

00:44:24   stick figure with a face doing quirky

00:44:26   things yeah in my youtube videos which

00:44:28   are produced like I'm not when we text

00:44:30   Brady I don't quickly in Inkscape whip

00:44:33   together an exactly correct CGP grey to

00:44:36   be a response to everything that you do

00:44:38   know it's is totally different don't

00:44:40   even try to compare it like those things

00:44:42   are the same and also I don't think the

00:44:44   stick figure has like the creepy creepy

00:44:46   dead-eyed uncanny valley-ness of these

00:44:49   bitmojis don't you agree with me

00:44:51   aren't they unsettling in the way that

00:44:53   they look no I don't agree with you they

00:44:56   don't unsettle me go to bitmoji.com and

00:44:59   look upon the nightmare that stares back

00:45:01   at you from the top of that page I'm

00:45:02   gonna make a CGP grey one I tell you why

00:45:05   I can see it it'd be a green one you

00:45:06   actually physically cannot stop me it's

00:45:09   impossible I can't believe your wife

00:45:11   didn't make one of you that looks

00:45:14   like you because she respected my

00:45:15   desires and she knows never to send me

00:45:19   the bitmoji I don't even like seeing

00:45:20   her using the bitmoji talking with other

00:45:22   people I just like I don't want to see

00:45:24   them they're creepy and they're weird

00:45:26   and my wife respects my desires unlike

00:45:29   my podcast co-host he was probably right

00:45:31   now giggling on his phone making a

00:45:34   bitmoji of me I'm looking at the front

00:45:35   page of the bitmoji website and there's

00:45:37   a one that looks a bit like me maybe

00:45:38   these are random I have no idea which

00:45:40   one you think looks like you that really

00:45:42   had someone the winking guy is that the

00:45:43   one who you think looks like you to the

00:45:46   right the far right

00:45:48   no it doesn't look a bit like you and if

00:45:50   it did look like you it would only look

00:45:52   like you in a nightmare world where

00:45:54   we're all trapped and emotionally

00:45:56   separated from each other riding on

00:45:58   bicycles all day long for pretend points

00:45:59   that's the way it would look like you

00:46:01   and that one on the left is the one that

00:46:02   looks like all the people I know who use

00:46:04   it generic brown hair go bitmoji thumbs

00:46:10   down they're terrifying hello Internet

00:46:14   if you are someone who bills by the hour

00:46:17   here's a quick mental calculation that

00:46:20   might leave you unsettled

00:46:22   how much of your time in a day are you

00:46:25   losing to having to complete annoying

00:46:28   paperwork and admin tasks how about in a

00:46:31   month or in a year

00:46:33   our friends at FreshBooks calculate that

00:46:36   you could get back as many as 192 of

00:46:39   those hours per year by using fresh

00:46:43   books' ridiculously easy-to-use cloud

00:46:45   accounting software here's how it takes

00:46:48   about 30 seconds to send a perfectly

00:46:50   formatted invoice in literally two

00:46:53   clicks you can set yourself up to

00:46:55   receive payments online which means no

00:46:57   more trips to the ATM to deposit a

00:46:59   client cheque which also means you get

00:47:02   to say goodbye to the time vacuum that

00:47:04   is the expense spreadsheet FreshBooks

00:47:07   also has other things to help you save

00:47:09   time like you can take pictures of your

00:47:11   receipts with your phone making claiming

00:47:13   expenses a million times faster and

00:47:16   easier getting back hours to your

00:47:19   business means you can do more business

00:47:22   a hundred and ninety two hours in a year

00:47:25   is about sixteen hours per month that's

00:47:27   a whole lot of your time to get back

00:47:29   simply by using fresh books so if you

00:47:32   bill by the hour in any way and you

00:47:35   listen to hello internet which

00:47:37   definitionally you are doing right now

00:47:39   try out fresh books for a free

00:47:42   unrestricted 30-day trial to do this

00:47:45   just go to freshbooks.com/hello and

00:47:48   enter hello in the how did you hear

00:47:51   about us section this lets FreshBooks

00:47:53   know that you came from us so once again

00:47:55   go to freshbooks.com/hello

00:47:58   and enter hello in the how did you hear

00:48:00   about us section and get back many

00:48:03   billable hours for your business let's

00:48:07   go to a less controversial subject that

00:48:09   won't upset you as much as bitmoji grey

00:48:11   let's talk about Nazis Brady this is

00:48:20   an interesting new story that's been

00:48:22   doing the rounds this caught your eye

00:48:23   did no it did catch my eye I feel like

00:48:26   bitmoji Nazi pugs things going on

00:48:31   with Facebook things going on with

00:48:32   YouTube I'm at a low point with my

00:48:34   thoughts about people and the internet

00:48:36   and what the internet does to people and

00:48:38   vice versa it's all a terrible mess

00:48:41   there has been a sort of conclusion to a

00:48:45   story which to me is a terrifying story

00:48:47   let me try to summarize it is a story

00:48:50   about a Nazi pug as in a little dog hmm

00:48:56   this guy up in Scotland Count Dankula

00:49:00   is his online name I don't know what his

00:49:02   actual name is because Count Dankula is

00:49:04   way easier to remember he made a video

00:49:06   for youtube where he trained his

00:49:12   girlfriend's pug to be excited and do a

00:49:15   little Nazi salutes upon hearing

00:49:18   particular phrases particular phrases

00:49:21   you would expect Nazis to say so he put

00:49:25   together this video and he sets it up as

00:49:30   a joke like it's yeah there is this

00:49:33   little bit of a story that people always

00:49:34   say when they upload videos to YouTube

00:49:36   with like oh I only expected a few of my

00:49:38   friends to see a thing and then like it

00:49:39   went viral and I had no idea that it

00:49:41   would I'm not entirely sure that I

00:49:43   believed that because that's like a

00:49:44   standard story that everybody says like

00:49:46   I kind of think that the guy did expect

00:49:48   it was gonna be a bit of a viral video

00:49:49   but he sets it up in the beginning that

00:49:52   like he is playing a joke on his

00:49:56   girlfriend who the pug belongs to yeah

00:49:58   that the pug is super cute and his

00:50:00   girlfriend is always talking about the

00:50:02   pug and loves the pug so much that like

00:50:04   she's on vacation or something and while

00:50:06   she's gone he's gonna train the dog

00:50:08   right on like Sieg Heil that's going to

00:50:11   put up its little paw in a Nazi salute

00:50:13   which you can almost see the dark humor

00:50:15   of right like this dog's the

00:50:17   apple of her eye and he turns it into

00:50:18   this like yeah it's pretty twisted but

00:50:21   there is that twisted brand of humor

00:50:23   that exists yes there is that twisted

00:50:25   brand of humor that exists and there's

00:50:26   this weird thing where everybody feels

00:50:27   like they need to like distance

00:50:29   themselves a little bit from it like

00:50:30   when I saw this video like it's funny

00:50:32   like it's a funny joke and people have

00:50:35   to pretend like it's not funny but guess

00:50:37   what like part of humor is surprise and

00:50:39   unexpected juxtapositions of things

00:50:42   right and so like he films a shot where

00:50:44   it's like pretending like the dog is

00:50:46   watching a Nazi rally on TV and like

00:50:48   that's a funny shot because of course

00:50:50   like the pug doesn't know anything about

00:50:51   what's going on like he's just setting

00:50:53   up a frame yeah these are jokes and he

00:50:56   sets it up as a joke but what has

00:50:58   happened in the UK is that

00:51:01   the UK has weaker freedom of speech laws

00:51:06   than the United States does and there

00:51:10   are more caveats and this guy Count

00:51:14   Dankula has been convicted of I

00:51:20   forget exactly what it is but it's like

00:51:21   spreading obscene messages over the

00:51:24   Internet he's been found guilty as of

00:51:26   the time of recording the sentencing

00:51:29   hasn't occurred the sentencing is going

00:51:30   to happen in a couple of weeks but he is

00:51:33   facing prison time over a viral video

00:51:37   where he trains a dog to raise its paw

00:51:41   on particular commands and I think this

00:51:44   is just terrifying I agree it's

00:51:47   terrifying and I wonder what he'd

00:51:48   say about it just to be clear though

00:51:49   it's not the fact that the dog raises its paw

00:51:50   that's got him in trouble it's like the

00:51:53   phrase he's repeating isn't it yeah okay

00:51:54   but so can I say the phrase are you

00:51:56   afraid to say the phrase Brady I think

00:51:58   he should not have used that phrase

00:52:00   so one of the phrases that he uses in

00:52:02   the video is he says hi future gray here

00:52:07   stepping in for a moment I wouldn't

00:52:09   normally do this but it is a special

00:52:11   occasion since we recorded this episode

00:52:13   I looked into more of the specifics

00:52:15   around this law and this ruling and haha

00:52:19   turns out that the judge did know it was

00:52:22   a joke but the way the obscenity law is

00:52:25   written in the UK allows judges to

00:52:28   ignore that factor it seems to allow

00:52:30   judges to ignore all context around the

00:52:35   literal words that are spoken in

00:52:37   isolation from everything else which

00:52:40   seems to make it impossible to talk

00:52:42   about the bad ruling of an obscenity

00:52:45   case without also exposing yourself to

00:52:48   the very same obscenity law that you're

00:52:52   talking about which makes this just a

00:52:54   rather breathtaking catch-22 / Brazil

00:52:58   level law so anyway here we are

00:53:01   future gray is going back and cutting

00:53:04   off past gray and thus this very

00:53:06   conversation becomes an example of what

00:53:08   we end up talking about later in the

00:53:10   conversation in the video as he says

00:53:15   and the Pug sits up and looks interested

00:53:18   like he's obviously been giving the dog

00:53:19   treats for that phrase right clearly

00:53:23   this to me is the thing where there's

00:53:25   something I find kind of awful in the

00:53:27   world where everybody's like oh well I

00:53:32   don't agree with the sentiment of that

00:53:35   phrase it's like of course nobody agrees

00:53:37   with the sentiment of that phrase who

00:53:38   agrees with that nobody everybody feels

00:53:41   like of course we want to distance

00:53:42   ourselves from that but the actual

00:53:44   question here is should you be able to

00:53:47   put someone in prison for these actions

00:53:52   prison where we physically remove

00:53:54   someone from society for what they have

00:53:57   done I think this is terrible I think

00:54:01   this is really one of the worst things

00:54:04   in UK law I have come across in the

00:54:08   years that I have lived here because

00:54:11   this is like such an incredible chilling

00:54:14   effect that it comes down to like a

00:54:17   judge thought that this is obscene and

00:54:20   so finds him guilty of communicating

00:54:24   obscenely over the Internet man if

00:54:26   that's the case there are plenty of

00:54:29   other UK youtubers who should like be in

00:54:32   prison over similar kinds of things and

00:54:35   everyone on Reddit yeah like next time

00:54:37   Mel Brooks lands in the UK straight to

00:54:41   prison he's going to go because he made

00:54:43   The Producers an entire movie which

00:54:46   is nothing but Nazi jokes right all the

00:54:50   way through I don't know I find this

00:54:52   genuinely terrifying because it opens

00:54:55   the door to this idea that if the legal

00:55:00   system doesn't like what you say it can

00:55:04   put you in prison and what does it mean

00:55:06   to say that something's obscene this

00:55:07   video where a dude is training a dog to

00:55:10   do the Sieg Heil like I don't think we

00:55:14   should put people in prison for that

00:55:15   whether or not you think it's a funny

00:55:17   joke whether or not you think the guy

00:55:18   should have done it whether or not we

00:55:21   think like oh well I don't agree with

00:55:23   the phrase that he uses in like a

00:55:25   schoolmarm kind of way

00:55:27   like I don't think that that is

00:55:29   justification to put someone in prison

00:55:31   like it has such an incredible chilling

00:55:34   effect on speech I don't know I find it

00:55:38   just deeply deeply upsetting that this

00:55:42   has occurred as always with these things

00:55:45   like you find yourself like ok I have to

00:55:47   defend the guy who like made a video

00:55:48   about a Nazi pug but it's never the

00:55:51   thing like it's the system that it

00:55:53   allows like it is a precedent of putting

00:55:57   people in prison for saying things that

00:56:01   is really dangerous I really don't like

00:56:03   it and it makes me deeply uncomfortable

00:56:05   with the way the laws are in the UK I

00:56:09   agree with you you know whether you

00:56:12   thought it was funny or not I liked the

00:56:13   video it was kind of funny

00:56:15   yeah but you know some people have

00:56:17   problems with that really

00:56:18   confrontational humor like comedians

00:56:19   like Frankie Boyle and people like that

00:56:21   use and some people really enjoy it I

00:56:24   think it's ridiculous I don't think he's

00:56:27   committed a crime and it's an injustice

00:56:31   there are two caveats though there are

00:56:34   two things that I do feel more strongly

00:56:37   about that I think we can be a bit lax

00:56:39   about on the internet that I do think

00:56:41   should be punished one is if you are

00:56:43   harassing an individual hmm and I think

00:56:46   a lot of that happens on the Internet

00:56:48   where people like say things and harass

00:56:52   individuals and people that you know

00:56:53   it's free speech I can say whatever I

00:56:54   want on the Internet and then I think

00:56:56   if you're targeting a person I think

00:56:58   that's unfair and I think we need to

00:57:00   start getting stricter about that

00:57:01   because I don't like seeing people being

00:57:03   harassed like that and the other thing

00:57:06   is if he had been inciting people with

00:57:10   what he was doing if he was trying to

00:57:11   incite you know anti-semitism or

00:57:14   something with the video then I think

00:57:15   maybe it needs to be looked at but he

00:57:17   clearly wasn't doing that so this this

00:57:19   does not apply to this person but I

00:57:21   think if you were making you know

00:57:23   Holocaust jokes with the intent of

00:57:25   inciting racial hatred then that can

00:57:27   start having like real-world

00:57:28   repercussions and people can start

00:57:30   getting hurt and when that happens I

00:57:32   think you know we need to protect people

00:57:35   in society I agree with you there but I

00:57:37   I feel just extraordinarily cautious

00:57:39   towards

00:57:40   this idea individuals I'll agree like if

00:57:43   you are harassing an individual online

00:57:47   there's a place where there are some

00:57:49   kind of repercussions there

00:57:50   although again in our society when you

00:57:53   talk about people who are public figures

00:57:55   all of that just goes out the window and

00:57:57   we're like oh no then it's fine right

00:57:59   yeah you know like if a person's in

00:58:01   public like that screw that like they

00:58:03   don't deserve any protections which like

00:58:05   maybe makes sense and maybe doesn't but

00:58:07   even then it like starts to blur the

00:58:11   boundaries of this like even that is not

00:58:13   as crystal clear and I also agree with

00:58:18   direct incitement to violence but I also

00:58:22   find like in conversations with people

00:58:25   that there's a thing that happens where

00:58:27   people do a kind of mind-reading where

00:58:30   they assume intentions of when people

00:58:34   are saying things and it's like hmm my

00:58:37   feeling is like a direct incitement to

00:58:39   violence is a direct incitement to

00:58:43   violence like it is someone calling like

00:58:45   for an attack on a specific person or a

00:58:48   specific group like I think what this

00:58:51   guy got in trouble for is exactly that

00:58:55   blurry boundary what you're saying where

00:58:56   it's like oh when someone's making jokes

00:58:58   and it's jokes that are to incite

00:58:59   violence like that's kind of clearly

00:59:01   what the judge was thinking in this case

00:59:03   is like oh yeah this guy's making jokes

00:59:05   but he's making jokes and this is like

00:59:07   promoting hatred in the UK you can't pin

00:59:10   that on me of course because I think

00:59:12   that judgment is absolutely ridiculous

00:59:13   oh no yeah no I agree with you I know

00:59:15   that in the case of Count Dankula

00:59:17   that the judge decided it wasn't like a

00:59:19   joke and how anyone could watch that and

00:59:21   not think that was a joke is completely

00:59:23   beyond me I wasn't saying that you were

00:59:24   saying that I'm simply saying that so

00:59:27   you read the section of the law that he

00:59:28   gets convicted under and it's one of

00:59:30   these things where it's like oh it's

00:59:31   illegal to communicate like obscene

00:59:33   language across the internet what does

00:59:35   that mean and that's why I think like

00:59:37   the standard has to be incredibly high

00:59:39   like a person has made a statement that

00:59:42   any child could agree is like a direct

00:59:47   threat against a particular person and

00:59:50   it's not like oh well this person is

00:59:54   contributing to like a general overall

00:59:57   feeling in society this way like through

00:59:59   there