H.I. #52: 20,000 Years of Torment
00:00:00
I tell you what, it is one of the great myths of Hello Internet and CGP Grey folklore that you are competent and have technical ability. The last show certainly sparked some conversation in the reddit. I couldn't help but notice that it was a show that reached a thousand comments, people talking about when saying 'ma'am' is appropriate, people talking about subvocalization, with many minds blown. Lots and lots of discussion from the last show. We'll come to a few of the other things in follow-up. On the subvocalization thing, people seemed really interested in it, and it is very interesting, but I don't feel like I have anything else to say. What about you?
00:00:47
The thing that I left out of the conversation last time, which people were picking up on a little bit in the subreddit, was that I came across subvocalization in the context of "this is not a thing you should do if you are a well-developed reader": that it is a hindrance, something you do when you first learn to read, when you are a child, but that by the time you become a man you should be able to look at words and understand them without hearing a little voice in your head reading the words to yourself.

Yeah, there were a lot of comments on that point, but my only follow-up is this: when I came across it I thought, oh, OK, this is very interesting, let me see if I can get rid of this subvocalization. There's a whole bunch of things you're supposed to do, and my experience with them has been a total failure. There are exercises where you're listening to a recording of a voice that's counting up in numbers, one, two, three, four, five, and you try to read while doing that, so that your brain learns not to use the audio part of your brain for this. I tried that, and the result was that I was just incapable of reading.

The one that I thought was the most interesting was this: there is a bunch of software out there, have you seen this, Brady? It does this thing where it flashes words individually on the screen from an article. So instead of saying "here's an article that I want to read" and it's written normally, it just flashes all of the words in sequence in the center of the screen. Have you ever seen something like that?

Yeah, yeah, I have, very briefly. I'm vaguely familiar with it.

I believe Instapaper on the phone has it built in, and there are a few websites where you can paste text and they do the same thing. One of the ways in which you're supposed to train yourself out of subvocalizing is by using something like this cranked up to a ridiculous speed. OK, well, let me try, I'll try. But it was almost comical, because no matter how high I cranked it up, to like 500 words per minute, I'm just hearing a faster voice in my head. At the point at which I can still understand it, there is still a narrator, and it was a bit like when I edit the podcast and sometimes accidentally send it to you in fast-forward mode, where we're talking, you know, two times faster than we normally do. So I tried a bunch of the "get rid of subvocalization" stuff, and none of it seems to work for me at all. I'm just not sure that it can be gotten rid of.
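For anyone curious what the word-flashing software described above is actually doing, here is a minimal sketch of that kind of rapid serial visual presentation (RSVP) reader. This is not the code of Instapaper or any app mentioned on the show, just an illustration of the idea: show one word at a time in a fixed spot, at a chosen words-per-minute rate.

```python
import sys
import time

def rsvp(text: str, wpm: int = 500) -> None:
    """Flash the words of `text` one at a time in the same screen position."""
    delay = 60.0 / wpm  # seconds each word stays visible
    for word in text.split():
        # carriage return rewrites the same line, so each word appears in a fixed spot
        sys.stdout.write("\r" + word.center(40))
        sys.stdout.flush()
        time.sleep(delay)
    sys.stdout.write("\n")

if __name__ == "__main__":
    sample = ("Instead of laying an article out normally, an RSVP reader "
              "flashes every word in sequence at one point on the screen.")
    rsvp(sample, wpm=500)  # roughly the speed described above
```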
00:03:30
I guess the question I have is: what's the difference between you subvocalizing and me sitting next to you in bed, reading the book to you?

I think there is a big difference between those. And hang on, I'm not in bed with you, I'm sitting in a chair next to the bed. It's less weird.

So when you're getting ready to go to sleep and you go, "Brady, can you read me a story?", is that basically what's happening? You're reading yourself a story?

Or, if I was reading you the story, are the words coming into your head and then you read them to yourself again?

No, it's not an extra level of thought, there's no doubling up, I do not hear it twice. Maybe this is the best way to think about it: when we're talking now, you and me are talking, neither of us is thinking about the thoughts, right? I don't know how you speak, the words just appear; this is how that happens, right?

Yeah, yeah.

And so when I ask you a question and then you answer me, you are using a voice, but you're thinking the thought at the same time that you're speaking it. And for anyone who's done something like a podcast, where you speak for a very long time, and I'm sure, Brady, you've had the same experience, sometimes you say something and you think, "Do I actually think that? I'm not sure that I do think that," because it's just a stream of thoughts coming out of your mind. Have you ever had that experience, where you say something and you think, "Do I think that?"

Pretty much every time I speak.

There you go. So in the same way that you talking out loud is the same thing as you thinking, it's just like that for reading. It's almost like, if someone put duct tape over your mouth, because you weren't able to speak, that would impair your ability to think. That's kind of what it is internally.

I did read that when they're doing experiments on subvocalization they use sensors, because you are almost imperceptibly reading to yourself: they can pick up tiny movements in your tongue and your lips and stuff, so you are literally kind of reading out loud.

Yeah, I would be really curious to know if that was the case for me. As far as I know I sit silently and I'm not aware of moving my lips or my tongue, but I have seen those things, and I get that, under the right circumstances, you can measure that there are still electrical impulses going to someone's vocal cords when they're doing this, even if there's no external sign that they're reading out loud. But I guess your analogy of you reading me a bedtime story just really threw me off. I think perhaps the most straightforward way to describe it is that me reading a book out loud to myself and me reading a book silently to myself are not very different experiences.

Really? Is your brain that weird? I don't know how you read, I don't understand how you read, if that's not the experience that you have. Like, are you imagining things too? Are you picturing the scene?

Obviously, you know, you're imagining the mountains and the hobbits, and yeah, I have that at the same time.

This gets really weird. Like, when you think of something in your head you can see it, right? But where are you seeing it?

I still have that going on, like I'm imagining the scene that unfolds in, say, a fictional book. That definitely takes place, but it really is just like there is a narrator talking over the whole thing. But do you just have a scene silently playing in your head when you read?

No, it's in another element, in a room where voices don't exist. It's your thoughts, your consciousness, it's there, it's that infinitesimal point in the center of your brain where everything happens, that you don't understand, but it's just the place. And like I said last time, there's a collapsing of the wave function: as soon as I think about thinking, everything becomes words and pictures, and it's only when I think about thinking. That's why I think the same thing is happening to both of us, and you're just incapable of getting lost in there; you're always thinking about it, so you're always collapsing the wave function and thinking about the words and the pictures. I know it's arrogant for me to think everyone thinks like me, but that's just what it feels like to me. It feels like we all do it, because as soon as I try to think about it, as soon as I talk to you about it, suddenly I am reading to myself and everything is a lot more simple and basic, but that's just because I'm analyzing it. I just think you're analyzing it too much. I think you do get lost in reading and thinking, and it's only when you stop and think about it that it collapses into this really simple thing.

Yeah, and there's no answer I can give to that, because I could say anything and you would say, "Of course you would say that, because that's what Grey would say." You can't argue with that; this is where we're fast getting into the realm of unarguability. But the reason why I do think you're wrong is because, from the descriptions, I genuinely wish that I could read in this way that didn't have internal words. It seems like a much better way to read. But I am always aware of the narrator; the narrator is never not there. Mental images can be in addition to the narrator, but the narrator is always there. Like, I can do the thing everyone can do: you can imagine a dog, and in your brain somewhere there's a picture of a generic dog that pops into your head without hearing someone also go "dog", right? I can have that without a narrator; reading without a narrator is not possible. But I would still say that I think the vast majority of my thoughts do have some kind of narrator, and that the picture part is much rarer, like I have to more consciously imagine the dog to not have the narrator be the thing that happens. And I do realize there are academic studies into this, this is a whole field of study and I'm just sitting here being an armchair expert, but I would be curious, in the subreddit, if anybody has any other recommended techniques besides the "listen to something spoken while you're reading" or the "one word really fast" things. I'm open to trying other methods to get rid of the habit of subvocalizing, but everything I have tried so far has been hilariously unhelpful.
00:10:18
I haven't told you this yet, but I've been buying up stamps and all sorts of merchandise, Liberian county flags and all. Just today I got an envelope that was sent, like, during the Liberian civil war or something, with one of the stamps on it, postmarked in Liberia, and I'm loving it. I'm getting really into stamps and postcards and the whole world of mail and stuff. I think I'm becoming fully fledged: the one nerdy thing that I never did was stamps, and I think I'm going to get into stamp collecting. There is a whole world to get into with stamp collecting. Obviously I've already started with the crash mail that I so proudly showed you last time, and I'm going to have a whole bunch of other Liberian things to show you next time.
00:11:16
There was a thread on the vexillology subreddit. Very often on there they do redesign projects, which I actually think are some of the most interesting things that appear on that subreddit. Sometimes they do flags in a particular theme, like "Canada-ize every nation's flag", make a Canadian version of all the flags, but sometimes they do a straight-up redesign. And so someone who actually listens to the show redid all of the Liberian county flags, and I'll put the link in the show notes. I am very impressed with this redesign, and I think it's really interesting because I can't quite figure it out: I look at the redesign and these flags are still very, very busy flags, but I like them all. I wonder if it's because my brain has already fixed its point of reference on the horrific original flags, and so my brain is going, "These flags are much better than those flags." I feel like I have a hard time seeing them objectively, but I think they are very interestingly done redesigns.

You know my problem with all these redesign competitions and things like that: because of these rules of good flag design, and this kind of accepted style and grammar of flags, they all begin looking a bit the same. And I always think that's one of the things I like about the Liberian county flags. If I can say anything for them, it's that they're different, and it's so refreshing. Isn't that the great thing about some of the wacky flags, whether it's something really crazy like Nepal, or something that's just a bit different, like Brazil, for example? If you didn't have those points of difference, flags would be the most boring thing in the world. You need some of the crazy ones to make flags work, and I think whenever you have these competitions where people say, "Let's imagine we didn't have the crazy guys, let's make the crazy guys the same as all the other guys," all of a sudden flags become really dull. So I think it's unfortunate when people have these little "let's take the wacky flag and turn it into all the other ones" moments. It just leaves me cold. If you can make a new flag, OK, make a new flag and make it good and follow the rules of design, but there's something about all these "if only this crazy flag was like all the other ones" exercises that people don't get.
00:13:50
I am more sympathetic to your point than you might think I am, Brady. The thing that I think complicates this is that you and I are looking at it from the perspective of flag connoisseurs, potentially professionals who help other nations develop their flags; that is our perspective. So we see many, many flags. People send us flags on Twitter and on the subreddit; we've seen a lot. And I think, from that perspective, the more unusual becomes more valuable, like a welcome respite from the sameness of every single flag. It feels like, "Oh boy, isn't this quite a relief." And I think this is something you can see sometimes with people who are professional critics in any field. Sometimes critics...

We do make money by criticizing flags.

Can someone add that to the Wikipedia page? Well-known professional flag critics, touted in some circles as potential advisers to the government of Fiji. But I think that's why, with movie reviewers, you know, if there are movie reviewers you follow, they'll occasionally like movies where you feel, god, how could they possibly like this terrible, low-budget, awful indie movie? And I think it's a bit of the same thing, where they go, "Man, it's just so interesting to see something that is different," even if it's not great. But the thing with flags, and the reason why I will still push back against you on this, is that I think a vital part of a flag is not just its uniqueness, but that the people who live under that flag should want to put that flag on things that they have. So I feel like everybody should have a flag that they can attach to their backpack, or that they can fly from their house; everyone should have that. And with the original Liberian county flags, if you lived in one of those counties and you were super proud of it and you wanted to demonstrate that to the world, you had a terrible, terrible choice of flag. So that's why I'm going to push back at you: I think everybody deserves to live under a flag that they can proudly fly.

I hear what you're saying, but have you yet seen anyone from Liberia, or anyone who lives in any of these counties, criticize the flags and say they don't like them? Because you and I have had a real laugh, and we've seen people on reddit having a laugh, saying these are the worst flags in the world, but it's entirely possible the people of River Gee County think their flag is awesome.

For a moment there I thought you were just going to let me get away with that.

I've got to push back; I would never want to just give you an easy ride. But I mean it: maybe they are incredibly proud, and if we were saying these things on a podcast in Liberia we would be tried for treason.

I mean, this is the part where I have to admit that I know almost nothing about the great nation of Liberia.

Is River Gee County even pronounced like that?

I definitely don't know that. I'm no expert in pronunciation of Liberian counties.

I thought you'd know, CGP Grey.

Don't you know it's because nobody knows English? English doesn't have any real pronunciation rules, it just likes to pretend that it does. I do know that River Gee County has a place called Fishtown, which I think is awesome, although the county does seem to be landlocked, so I guess it's freshwater fish, or it's just a great name. But I have seen neither proponents nor opponents of the Liberian county flags who are from Liberia, so I have seen no feedback on either end, and my guess is this is a lot like the city flags in the United States, which is that most people don't have the slightest idea what the flag of their local city is. This is normally one of those times when I would make a comment like "we're going to be hearing from everyone from Liberia", but I don't imagine that we're actually going to get a lot of Liberians writing back on this one.
00:18:39
This episode of Hello Internet is brought to you by Igloo. Now, many of you might be working at a big company with an intranet that is just a terrible, terrible piece of software to work with. I mean, actually, it isn't even really a piece of software; it feels much more like a bunch of pipes connected to old computers held together with duct tape. Most intranets are just awful. I used an awful intranet at my school. But Igloo is something different. Igloo is a feeling of levity compared to other intranets, because it is an intranet you will actually like. Go to igloosoftware.com/hello and just take a look at the way Igloo looks. They have a nice, clean, modern design that will just be a relief on your sad, tired eyes compared to the intranet that you are currently working with at your company. And Igloo is not just a pretty face. Igloo lets you share news, organize your files, coordinate calendars and manage your projects, all in one place. And it's not just files in a bucket, either: their latest upgrade, Viking, revolves around interacting with documents, how people make changes to them, and how you can receive feedback on them. If you're the person in charge, there's the ability to track who has seen what across the intranet, so you can have something like read receipts in email, where you know if everyone has actually seen and signed off on whatever document they need to see. If your company has a legacy intranet that looks like it was built in the nineteen-nineties, then you should give Igloo a try. Please sign up for a free trial at igloosoftware.com/hello to let them know that you came from us.
00:20:35
So, the next thing I want to talk about is Uber.

We just did it last week, and the week before that. It'll be Uber Corner at this rate.

I just did have a moment, after we'd spoken about it, because I caught three Ubers in a short space of time. The first person, who drove me across San Francisco, I asked, "Do you know where you're going next?" He said, "I've got to go to work, I'm actually a bartender." Then the next person to pick me up and take me to the next place was in a hurry as well; she actually wants to be a singer in a band and she was auditioning that night. And then the next person who drove me to the next place was picking up her kids from soccer practice after she gave me a lift. And it suddenly occurred to me, and I know this counts for taxi drivers too, but it seems even more the case with Uber: these people driving me seventy miles an hour along highways, who could kill me with a turn of the steering wheel, are just a random selection of people, and their only qualification is that they have a mobile phone and they have a driver's license.

Well, at least they have a driver's license. I'm assuming they went through some process to prove that.

But the driver's license process isn't very rigorous, it's not like applying for a job, and at the very least I know nothing about whether this person has had crashes or not. I still like Uber, it really won me over, but there were a few moments. I think I'm quite sensitive to it, especially since we spoke earlier about that terrible car crash when the mathematician John Nash died, when he was going back from the airport.

That was a taxi, that was a taxi crash, right?

But ever since then, especially when I'm in America being driven from airports on highways, I'm always conscious that my life is in other people's hands, much more so than when I fly. Probably because I can see the person driving, using their mobile phone and stuff.

Yeah, and I think driving in America is scarier. Most of the time I'm just around London, and there I'm aware that, OK, even if we get into a car crash, how fast can we possibly be going in a head-on collision?

Exactly, London traffic.

Whereas in America you have big stretches where you can get up to seventy miles an hour, and then you can have a head-on collision with somebody else going seventy miles an hour in the other direction. So driving in America is definitely more of a dangerous experience.

Also, the fact that it's such a mobile-phone-oriented platform: Uber drivers, even more than taxi drivers, always seem to be attached to their phones. They're always using the maps, always using the apps, they're very phone obsessed. And I think mobile phones while driving are very dangerous, and I'm very conscious of how often they're looking at their phones and the map sitting in their lap and stuff like that. I think I actually asked one of the drivers whether Uber gives them any rules, like "you can't use the phone while you're doing this", and no, no, there's nothing like that.

This again is the interesting difference of how things are around the world, because, at least in London, the phones that the drivers get are only usable for Uber. They are issued by Uber; they're like factory-installed iPhones that run Uber and nothing else. Which is why in London almost all of the drivers have, hilariously, at least two and sometimes three phones attached to their dashboard, precisely because the Uber phone can only be used for Uber, and so they want to bring up other stuff on the other phone. So they have two different pieces of software for reading the directions, one loaded up on Google Maps and something else, and I'm always aware of this many, many screens phenomenon at the front of the cars. And it's extra funny when whatever car they're using has a built-in screen that they're obviously not using, because their phone screens are just superior. So actually there are four screens in the front of this car: you've got the Uber phone, you have your secondary GPS, you have what is obviously your personal phone, and the built-in screen in the actual car itself. It's a lot of screens.
00:25:01
The other thing that came up time and again when I was talking to Uber drivers was this rival app called Lyft.

Yeah, I know of this. It's something I've never used, because I believe it's only in the United States; I don't think it's in the UK. But I've always gotten vaguely the impression that Lyft is for hippies, the "share a ride" ridesharing kind of thing.

I didn't get that impression.

Don't they have, like, pink mustaches on the front of their cars? This is the kind of company that it is in my mind; I have no idea.

Most of the drivers, though, were using both Uber and Lyft simultaneously, and they all liked Lyft, and they gave me a few reasons. One of the big reasons was the ability for passengers to tip.

And did you give them a hard time about that?

I did you proud, Grey, I gave them a real hard time about that. I told them why I didn't like it, the obvious reasons, you know: it recreates the tipping culture, and you could start getting assessed based on your tipping. And actually what they told me, and I was told this a few times, I haven't checked it myself, is that the tipping works in quite an interesting way. You do the tip afterwards, anonymously, via the phone, and they don't find out who tipped them. At the end of the day, or the end of the week, they just get their tips and they don't know where they came from. So they like it, because if they do really well it gives them something to strive for beyond just getting another five stars; they could get a tip if you're pleased and want to give them one. But it did sound like that pressure and awkwardness wasn't there, and there was no judging, because no one knows who tipped. I don't know if it's true; that's what they said when I challenged them.

I just looked it up, the Lyft cars with the mustache: apparently this is a thing that they no longer do, but I was sitting here thinking, am I a crazy person for imagining that there used to be pink mustaches on cars? And now I know I'm not a crazy person; I looked it up, and yes, they did used to do that.

The way that you described tipping is a very interesting idea that I have never come across before, the idea of delayed, anonymous mass tipping. I think my initial reaction is that I find it much more acceptable. Like, in a restaurant, if tipping worked that way, that you could do it later and it's distributed amongst a large number of customers so that the waiters don't know directly, I think that's interesting. I think people are fundamentally cheapskates, so without the social pressure of tipping, tips may come down. This is my fundamental thing with tips, and I always need to remind people of it when I argue against tips, because part of the argument that is unspoken is that you have to raise the wage for people who depend on tips. I'm not saying, Scrooge here, keep the tips and don't add anything else; I would rather raise the wage and remove the tips. I think, under those circumstances, if tipping was not required and it was done later and anonymously, I would probably very rarely do it, and again, like with all the other stuff, it's way more about just not having to think about it. But I don't know, maybe I would just set it as a default amount of tip. It's an interesting idea, a very interesting idea that I haven't come across before. I'll have to think about this for a little bit.
00:28:36
We have a note here about the vote. The deadline for our vote looms, for the flags.

I mean, this could possibly be a final warning, because by the next time you listen to the podcast it may be too late to vote. So this could be the last time you listen to a Hello Internet podcast and still have the option of voting in a flag referendum. That's how high the stakes are.

Now, this is going to be the last podcast before we count the votes, I guess?

I think so.

It's certainly going to be the last podcast you listen to where you have a chance, a chance, of sending a postcard that makes it in time. But even that, I'm realizing as we're speaking, is somewhat in doubt, because we are recording this podcast at our usual time, but this one may be out a bit late, because I have some other things that I have to prioritize above it. So I actually don't know when this one is going to go out and how much time there will be. It may be that you have to be in the UK to get the postcard in on time. We'll have to see.

I've just learned to add three or four days to every date you give me for the podcast. If you say it's going to be out Monday, I say to myself, Thursday.

Yeah, that's an excellent piece of advice. It's funny, because I try to do that to myself when I make estimates: I'll come up with an initial estimate and I'll go, yeah, but I never make it on time, let me add a few days. And of course you can't overestimate for yourself; you're still always wrong, even if you try to incorporate your own overestimating. So yes, whenever I tell you any deadline, Brady, you should just automatically add a few days to it.

And I know the deadline is looming. I have next to me right now probably a thousand, but probably closer to two thousand, postcards in a box. Votes. Listen, here are some of them; that is the sound of actual election ballots.

Yeah, well, you weighed them. I was pestering you for a while to weigh ten of them so we could do an estimate for the total amount, and at least as of maybe a week ago the calculation came out to be about 1,800 postcards, and I presume that you've gotten more since that point.
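For anyone who wants to reproduce that back-of-the-envelope count, here is a small sketch of the weigh-a-sample-and-extrapolate estimate being described. The gram figures are made-up placeholders, not numbers from the show; only the roughly 1,800-card result is from the episode.

```python
def estimate_postcards(sample_count: int, sample_weight_g: float, box_weight_g: float) -> int:
    """Estimate the number of postcards in the box from a small weighed sample."""
    weight_per_card = sample_weight_g / sample_count
    return round(box_weight_g / weight_per_card)

# Hypothetical numbers: if 10 cards weigh 45 g and the full box weighs 8,100 g,
# the estimate comes out to about 1,800 postcards.
print(estimate_postcards(sample_count=10, sample_weight_g=45.0, box_weight_g=8100.0))
```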
00:31:01
So last time we were discussing this, we were thinking maybe we'll get a thousand, and we're clearly going to get double that at this stage. So yeah, it's going to be a lot of votes to count, that's for sure. I love looking through these, by the way. I know you keep telling me off and telling me not to.

Yes, listeners, listeners: Brady keeps spoiling himself, and me, by constantly going through and fingering and looking at all of these postcards. I'm just minding my own business, and Brady sends instant message after instant message of interesting postcards, and I feel like they're just spoilers. I want to go there and count them all and see them all at once, but Brady can't help himself. You're like a little kid.

I'm not telling you what's getting a lot of votes or what's going to win the vote, I'm just sending you the pictures.

It's like when someone says, "The movie's great, there's a twist."

I haven't spoiled anything. No, no, no, it's completely different. Let me tell you why it's completely different: because the election is all about what's on the back of these postcards, who has voted for what, right? I have sent you, or told you, nothing whatsoever about that. Nothing. The only thing I'm spoiling is where some of them are from, or some of the funny pictures. And trust me, Grey, there is no way in one day you will be able to get anywhere near seeing them all. It is overwhelming how many there are and how different they are. So if I send you some funny one that's been sent from some bridge in Norway, you probably wouldn't have seen it on the day anyway, because we're going to be concentrating on the backs of the postcards, mostly, that day. So I'm not spoiling anything, I'm just excited. It's like I've got all my presents and I just want to feel the presents.

You were the kind of kid to open Christmas presents early, weren't you?

No, definitely not, definitely not. But I tell you, I know it's going to be won by one that I don't want to win. I've decided that, I feel it in my bones, that it won't be the one I like the most.
00:33:10
Alright. I am going to act like a monarch, and I have officially decided not to vote in the flag referendum, unless, unless by some miracle it is a tie. If it's a tie, then I think I will cast a ballot. But that's my thought: I am not going to cast a vote, because in my mind I still can't place these flags in a definitive one-to-five order, and I think when you sit down and write something out, it solidifies something in your mind. And I thought, you know what, no, here's what I'm going to do: I am leaving myself open to the Hello Internet nation, ready to accept what they decide should be the flag, and I think writing down an ordered list would bias my own feelings toward the actual election. So that's my conclusion: I am not going to vote in the election. Have you sent in a vote, Brady?

I have not, and I'm thinking pretty much the same way as you; I like the idea of having not voted. There's only one thing I hope for in the election. I hope, secretly, in my heart, that it goes to a second round. I hope that one flag doesn't win it in the first, like, doesn't get over fifty percent in the first count. I so, so hope that we have to distribute preferences, because that's the thing I'm most looking forward to.

Yeah, I will be disappointed if we don't have to distribute preferences. I would be shocked if one of them gets more than fifty percent in the first round. I will be absolutely shocked if that occurs. But I will also be deeply disappointed if we don't get to crank through the mechanics of a second preference round in the election.
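For listeners who have not seen preferential counting before, here is a minimal sketch of the "distribute preferences" mechanics being looked forward to above, in the instant-runoff style: if no flag has more than fifty percent of first preferences, the last-placed flag is eliminated and each of its ballots moves to that voter's next surviving preference. The flag names and ballots below are invented placeholders, not the actual referendum candidates or results.

```python
from collections import Counter

def instant_runoff(ballots: list[list[str]]) -> str:
    """Count ranked ballots, eliminating the last-placed option until one has a majority."""
    ballots = [list(b) for b in ballots]
    while True:
        tallies = Counter(b[0] for b in ballots if b)  # current first preferences
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > sum(tallies.values()):
            return leader  # over fifty percent of the still-active ballots
        eliminated = min(tallies, key=tallies.get)
        # strike the eliminated flag everywhere; later preferences move up one place
        ballots = [[c for c in b if c != eliminated] for b in ballots]

# Placeholder ballots ranking three hypothetical flags A, B and C.
example = [["A", "C"], ["A", "B"], ["B", "A"], ["B", "C"], ["C", "B"]]
print(instant_runoff(example))  # C is eliminated first, its ballot flows to B, and B wins
```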
00:35:12
I had a topic on my list today that was all about coincidences, and I thought it was amazing, and then I was thinking, how could I possibly bring this into the podcast in a way that would make Grey even pretend to be interested?

You lost that battle.

Yeah. I thought of, like, ten different ways I could sell it to you, and in the end I just threw it away, because there is just nothing about coincidences that could ever excite Grey in any way.

Of course there is. Try me, sell me on the most amazing one.

No, I don't think I will, because I think two guys in Tibet could start their own podcast called Greetings Internet, and they could be called Bradley Aaron and CGP Brown, and you would just say, "Of course that's going to happen, there are so many people making podcasts these days and there are only so many names in the world; of course that was going to happen eventually."

Yeah, that is exactly what I would say. I don't know if I've told you this before, but my favorite example of coincidences is the Dennis the Menace comic strip. There is a Dennis the Menace published in the United States, I think it was a post-World War Two comic strip when it started, but on the same day that it debuted in the United States, in the United Kingdom someone else also debuted a comic called Dennis the Menace, with the exact same premise. So not only did two people come up with the same idea, they ended up publishing the first comic on the same exact day. This is why I like coincidences like that: of course you're going to get coincidences, it's almost impossible not to when you have a huge number of people. So they can be interesting, but they're also just totally unremarkable. And the problem that I have with coincidences is that usually people then want to try to look for meaning behind them, and as we know, there's no meaning. What there is, is billions of people on earth; you would be astounded if there weren't coincidences somewhere.
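The "with enough people, coincidences are inevitable" point is essentially the birthday-problem effect, and a tiny calculation makes it concrete. This sketch only covers the shared-launch-day part of the Dennis the Menace story, not the shared name and premise, and the numbers are purely illustrative.

```python
def prob_shared_day(n_events: int, days: int = 365) -> float:
    """Probability that at least two of n independent events fall on the same day."""
    p_all_distinct = 1.0
    for k in range(n_events):
        p_all_distinct *= (days - k) / days
    return 1.0 - p_all_distinct

# With only 23 comic strips debuting at random through a year, the chance that some
# pair shares a launch day already passes 50%; with 60 debuts it is above 99%.
for n in (23, 60):
    print(n, round(prob_shared_day(n), 3))
```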
00:37:25
It's a good thing we never talked about coincidences; it was a good decision. It's like how you shouldn't talk to me about dreams, at least.

Oh, don't even start, man, OK? Because I think, with the right amount of knowledge and expertise, you might be able to glean something from dreams, because they are based on, you know, your brain and its inputs and outputs. And I'm not saying I have the expertise, and I'm not going to sit here and talk to you about my dreams.

No one has the expertise.

I'm just saying there is something to dreams. There is, like, you know, something to them that's not gobbledygook; it's just beyond our ability to understand, and therefore we imbue it with silly meaning.

When you say that it is beyond our ability to understand, you're implying that there's something to understand there, as opposed to what it is, which is nightly hallucinations that you connect into meaning later on, because that's what the human brain does. It's a pattern-creation machine, even when there's no pattern there. That's all that happens.

I don't believe that, I don't believe that, because I'm not saying they have, like, any predictive power.

Yeah, if you were saying that, I'd start carting you off to the looney bin.

But I mean, you can't deny that, you know, if you're having a stressful time in your life you have a certain type of dream, and if there are certain things going on, your dreams change, and there is a correlation between what's in your dreams and what's happening in your real life. You must acknowledge that, surely. You know, when people are going through traumatic times their dreams become more dramatic, or, the link may not always even be that direct, but there is a link between what's happening in your dreams and what's happening in your life.

Yeah, because your hallucinations are constructed from pieces of your life; how could it be any other way? I will totally grant you that there is a correlation between what happens in your life and what happens in your dreams. And the worst example of this for me, ever: my very first year of teaching, me and this other NQT that I worked with both discussed how, in that first year, in the first few months, the worst thing was that you would spend all of your waking hours at work, at school, doing school stuff, and then, because that was your only experience, you would go home and your dreams would be dreams about being at school, and you'd wake up and have to do it all over again. It felt like an eternal nightmare of always doing school. But that's just a case where you only have one thing to dream about, and it's the thing that you're doing all day long, so of course there's going to be some correlation. That doesn't mean that there's meaning to be derived from the dream. I think that's just a step too far.

Let me put this to you then, Mister CGP Grey, who always thinks that humans are merely computers. If a bunch of stuff came out of your computer, and you were looking through all the code and stuff that was going on under the hood, you would never just completely dismiss that and say, "Oh well, that's just random and means nothing," because it came from your computer, and therefore, even if it was something it wasn't supposed to do, it came from something, it has a cause, and the right expert could look at it and say, "Ah yes, I see what's going on here," or, "Something has gone wrong, and this is what it's doing," because the computer can only do what a computer can do. And therefore, if a brain is a computer and it's serving up all this, to say it means nothing, it's just hallucinations, you should ignore it... well, no, because if my computer is doing something, it must be doing it for a reason.

I'm not saying we're supposed to remember our dreams and then use them in life. Always with you, Brady, you're always moving the goalposts underneath me. Now you're having a discussion about whether dreams serve a function in the brain, and my answer to that is obviously yes. Humans dream; there must be something that the brain is doing during this time that is useful to the brain, otherwise it wouldn't do it. But that doesn't mean that there is meaning to be derived from our subjective experience of what is occurring in the dream state. That's a whole other thing.

Are you telling me, if I gave you some machine that was able to completely project someone's dreams, like, record them...

Like, yeah, say that exists.

Imagine I gave you that, and I said I'm going to give you that person over there's dreams for the last ten years. Are you telling me that data is useless?

No, I'm not saying that data is useless, because we just said before that you could derive probabilities about a person's life from their dreams, like, oh, this person, looks like maybe they're a teacher, because they went through a big phase where they were dreaming about teaching all the time. But that doesn't mean that there's anything for the dreamer to derive from their dreams. If you're asking me whether a machine that is capable of peering inside someone else's brain is a useful machine, well, yes, obviously that would be useful; you could derive information from it, it would be almost impossible not to. I'm just saying that I don't think there's anything really to learn from your own dreams. And I also have this very, very deep suspicion that if this machine existed, and it allowed you to watch someone else's dreams or watch your own dreams, I am absolutely confident that being able to see them objectively would reveal them to be the borderline nonsensical hallucinations that they are, because I think when you wake up later you are imposing order on a thing that was not full of order at the time. That's what I think is occurring: as you wake up, you're constructing a story out of a series of nonsensical random events, and so then you feel like, oh, let me tell people about my dream, and when you listen to those stories they're already borderline crazy stories, but I think you've pulled so much order out of a thing where it didn't exist.

Yeah, I mean, I agree with that. I agree that, you know, even the dreams you remember are pretty freaky and are all over the place, and it's almost impossible for a human to relay something like that in a way that isn't a story; I think that's just the way our brains remember things. I just don't think that it's, like, unusable. I think maybe in the future, when we understand things a bit better, we may be able to get more use out of them than we do now. And I don't mean "use", I mean almost like diagnostics.

I guess you're talking about use to third parties, but not use to you, the dreamer, because again, you're describing a machine that can look inside someone's mind, and I would say yes, obviously that is useful.

Like I said, you might be able to use it to help diagnose things.

Right, but I'm saying, you looking at your own dreams, like, OK, whatever, man, you're just reading the tea leaves of your own life. There's nothing really here; everything that you think is there, you are putting there. There's nothing really there. That's dreams.
00:45:05
Today's sponsor is Audible.com, which has over one hundred and eighty thousand audiobooks and spoken-word audio products. Get a free 30-day trial at audible.com/hellointernet. Now, whenever Audible sponsors the show, they give us free rein to recommend a book of our choice, and tonight I'm going to tell you about one of my all-time favorite science fiction books. In fact, it's probably my all-time favorite book, full stop. It's called The Mote in God's Eye, by Larry Niven and Jerry Pournelle. Basically, it's set in the future, when humans are traveling all around the galaxy. There's this area of space, called the Coalsack, that some people say resembles the face of God. There's a big red star in the middle that supposedly looks like an eye, and in front of that eye, from some angles, is a small yellow star, and that's the mote in God's eye; that's where the title comes from. Now, humans have never been to that star, but all that changes in this book when some serious stuff goes down, and what they find there looks pretty important to the future of everything. It's a really clever story, and I remember being really impressed by some of the ideas in it. The audiobook weighs in at well over 20 hours, so this might be a good one to settle in with over your holiday break. Now, I've said before, audiobooks are a great way to catch up on all sorts of stories. I love listening to them when I'm out walking the dogs or on long drives, and I know a lot of people have long commutes to work, which is a great place to get through these audiobooks. And if you follow one of our recommendations from the show and you don't end up liking it, Audible are so great at letting you trade it back in and getting one you do like. I'm sure some of you have done this before, and it was easy peasy, no questions asked. So go to audible.com/hellointernet and sign up for your free 30-day trial. Thanks to Audible.com for supporting the show. The book recommendation again: The Mote in God's Eye. And the URL, the all-important web address: audible.com/hellointernet, so they know you came from the show.
[TS]
00:47:10
◼
►
alright brady you are back from america have you have you had that have you had
[TS]
00:47:18
◼
►
the bravery to waive yourself with myself I did one I did when a few days
[TS]
00:47:23
◼
►
ago after I got back and I had increased by 1.3 kilograms 1.3 kilograms and how
[TS]
00:47:30
◼
►
long were you in America for three weeks I mean honestly I feel like that's not
[TS]
00:47:36
◼
►
too bad I felt like I dodged a bullet to be honest I haven't been eating well
[TS]
00:47:41
◼
►
since I got back either so I think it might be even more now there's always an
[TS]
00:47:45
◼
►
america half life where you come back and because the food is so good in
[TS]
00:47:51
◼
►
america it takes a little while to adjust you would still eat crap when you
[TS]
00:47:54
◼
►
return even though I have always promised myself on the plane coming back
[TS]
00:47:58
◼
►
from america all gone now I'm going to be really good now but now it doesn't it
[TS]
00:48:02
◼
►
never happens like this you need a few days to adjust yeah you gotta weigh
[TS]
00:48:05
◼
►
yourself of all that fat and just before we recorded you sent me a picture of a
[TS]
00:48:12
◼
►
pizza with only looking at it like super spectacular peeps it was the name of it
[TS]
00:48:17
◼
►
or something like it was like it was the favorite run 5,000 calories that's for
[TS]
00:48:25
◼
►
sure but yes I gotta gotta say I think you could have definitely done way worse
[TS]
00:48:30
◼
►
I think if I was in America for the same period of time I would have done way
[TS]
00:48:34
◼
►
worse you know
[TS]
00:48:36
◼
►
all agree with you there you dodged a bullet dodged a bullet on that one
[TS]
00:48:39
◼
►
how you don't it was interesting because we I mean it's been basically a month
[TS]
00:48:46
◼
►
since we did our way and because you were in America he said we're not going
[TS]
00:48:49
◼
►
to do it while you're there couldn't be consistent and I think I realized that
[TS]
00:48:54
◼
►
with you my weight buddy gone I was thinking about this stuff just a little
[TS]
00:48:59
◼
►
bit less maybe and so I was actually quite surprised when I stepped on the
[TS]
00:49:03
◼
►
scale today I I was essentially within the measurement error the exact same way
[TS]
00:49:10
◼
►
that I was a month ago I was like point three pounds which is zero kilograms
[TS]
00:49:16
◼
►
down but you know my daily weight varies by much much more than that so just
[TS]
00:49:24
◼
►
interesting to see that I have hit like a little plant town that has stayed
[TS]
00:49:28
◼
►
roughly the same for a month but I was just surprised that because we hadn't
[TS]
00:49:34
◼
►
done the way and it hadn't even crossed my mind has moved in quite a while so
[TS]
00:49:38
◼
►
yeah I am but I think there's there's something like my brain isn't doing the
[TS]
00:49:46
◼
►
comparison to the fixed point of the last weigh-in to where I am today, such that I
[TS]
00:49:50
◼
►
had no idea what the last weigh-in number was — I had to go look it up and then do
[TS]
00:49:53
◼
►
the math so it's like my brain was pushing it to the side but now that
[TS]
00:49:58
◼
►
you're back in the UK now thats will be weighing in again in two weeks time I
[TS]
00:50:03
◼
►
think maybe it'll be more at the fore of my mind, but maybe not. But maybe I'm really
[TS]
00:50:07
◼
►
stuck at a plateau and need to change things up again to continue the weight loss.
[TS]
00:50:11
◼
►
Well, hopefully I can get my act together — I'm just in a spiral of food
[TS]
00:50:16
◼
►
naughtiness at the moment. It happens to the best of us.
[TS]
00:50:23
◼
►
I wanted to quickly ask you about the iPad pro as you know I don't listen to
[TS]
00:50:29
◼
►
your fetish podcast but you did talk about on that I understand yeah yeah I
[TS]
00:50:34
◼
►
pick one up on the day of release
[TS]
00:50:38
◼
►
all I want is should I get one for Christmas cos cos I haven't I don't
[TS]
00:50:43
◼
►
there's nothing I really want for Christmas and my wife's that will come
[TS]
00:50:46
◼
►
get you something, and I don't want an Apple Watch anymore, I've gone off that.
[TS]
00:50:51
◼
►
Probably for your own good to go off that. So, the iPad Pro — I do like the
[TS]
00:50:59
◼
►
idea of that I have absolutely no use for I think I've said before I'm a
[TS]
00:51:07
◼
►
sucker for anything with 'Pro' in the name. I know — I think this
[TS]
00:51:12
◼
►
is why you're getting drawn in by this device: 'Pro', and Brady thinks, I would
[TS]
00:51:17
◼
►
like to have the pro things. They should have called YouTube Red 'YouTube Pro' —
[TS]
00:51:24
◼
►
actually not a bad idea that would have made me think it was also well I mean I
[TS]
00:51:30
◼
►
like that. I prefer the pro version myself, I'm like that with everything,
[TS]
00:51:35
◼
►
so I had the original iPad and used it a few times and then put it in a drawer, but
[TS]
00:51:42
◼
►
now there's an iPad Pro and I'm falling for this, and I'm completely open about
[TS]
00:51:49
◼
►
it I love that I love that I loved you fall for it and that you also know this
[TS]
00:51:54
◼
►
about yourself I'm wondering what's going to happen when Apple inevitably
[TS]
00:51:59
◼
►
makes the Apple Watch Pro.
[TS]
00:52:01
◼
►
You'd definitely get one of them, right? Who was going to say no to that? Exactly. Like, you
[TS]
00:52:10
◼
►
haven't got a Pro? You call yourself a professional? Should I get an iPad Pro? OK,
[TS]
00:52:22
◼
►
so that that's a hard question to answer because he's you either say the say yes
[TS]
00:52:27
◼
►
but you say no
[TS]
00:52:32
◼
►
here's my thinking about this has been thinking about this ok let's say I
[TS]
00:52:36
◼
►
didn't know anything about someone and they just needed to buy an iPad they
[TS]
00:52:41
◼
►
said which iPad should I buy if I didn't know anything about the person the
[TS]
00:52:45
◼
►
correct answer is to buy the iPad Air
[TS]
00:52:47
◼
►
2, which is like the medium-sized super-light one, and then if you have a
[TS]
00:52:52
◼
►
particular reason to get the Pro you should get the Pro. But — I don't have
[TS]
00:52:56
◼
►
any idea what do you think you're going to do with the iPad pro aside from just
[TS]
00:53:00
◼
►
feeling a smug sense of satisfaction that you own the pro version of this device? Like,
[TS]
00:53:07
◼
►
that pretty much sums it up, I guess. All I want for Christmas is a sense of smug
[TS]
00:53:15
◼
►
satisfaction — can money buy that? Actually, that's the best thing money can buy. I just feel
[TS]
00:53:30
◼
►
like a new toy you know yeah you want a new toy like it's it's huge in person
[TS]
00:53:37
◼
►
it's a really big in person it feels like a dinner plates in person actually
[TS]
00:53:41
◼
►
do you have your laptop is like the 15 inch MacBook Pro i think is that right
[TS]
00:53:47
◼
►
yeah I haven't got here it's yeah yeah yeah but you own that laptop yeah the
[TS]
00:53:52
◼
►
iPad Pro is essentially the size of that screen right that's that's the size of
[TS]
00:53:57
◼
►
it, within like a quarter inch. And so what we have here is a big, big screen, and
[TS]
00:54:03
◼
►
if you're not planning on doing work on it like I got the iPad pro to do work
[TS]
00:54:08
◼
►
and so far I absolutely love it for work like the the video that I'm currently
[TS]
00:54:13
◼
►
working on, I did just a ton of the script on that iPad Pro — the final
[TS]
00:54:17
◼
►
version like it's really really nice to work on but if you're not going to do
[TS]
00:54:21
◼
►
that, then the question is, well, is it a total content machine? Are you going
[TS]
00:54:26
◼
►
to want to sit on the couch and browse the web or read books on your iPad or
[TS]
00:54:31
◼
►
watch TV on your iPad I don't think you do any of those things kind of guy but
[TS]
00:54:36
◼
►
maybe I'm wrong, I don't know. I do watch TV and movies on my laptop every now and
[TS]
00:54:42
◼
►
then, and I do spend the first hour of most mornings when I wake up just
[TS]
00:54:47
◼
►
sitting in bed
[TS]
00:54:48
◼
►
that's when I like to do my email and all the things I can just do without my big
[TS]
00:54:53
◼
►
machine
[TS]
00:54:55
◼
►
web stuff — I do that, but I do it on my laptop. So having a big —
[TS]
00:55:01
◼
►
which has got a keyboard. So I sort of think, what if I had the big screen of the iPad
[TS]
00:55:05
◼
►
Pro, I could sit and do my email, check on my YouTube channels, everything, first
[TS]
00:55:09
◼
►
thing in the morning. But I do that now on my laptop and it's so much easier with a
[TS]
00:55:13
◼
►
keyboard to be able to bang out emails. Yeah. I think it's funny that you wake
[TS]
00:55:19
◼
►
up and doing email from bed to remember that the next time I get an email from
[TS]
00:55:22
◼
►
you: this was sent before getting dressed in the morning. But yeah, if you want to do
[TS]
00:55:29
◼
►
that that sounds like he needed you need a keyboard then again I don't think the
[TS]
00:55:33
◼
►
iPad Pro is what you want, unless you know you're really happy
[TS]
00:55:36
◼
►
about typing with your fingers on glass and it has that little keyboard but I
[TS]
00:55:42
◼
►
don't think that keyboard work really well if you're trying to do it in bed
[TS]
00:55:44
◼
►
with the laptop balance on your chest to spend probably an hour a day maybe in
[TS]
00:55:53
◼
►
Photoshop so and I do use pen like I use a Wacom tablet all the time so I do I
[TS]
00:56:02
◼
►
could imagine that but my use of photoshop and my use of the pen is very
[TS]
00:56:06
◼
►
very integrated with my editing on Avid, which happens on one of my big
[TS]
00:56:12
◼
►
computers those two processors are so intertwined yeah and every everything I
[TS]
00:56:18
◼
►
know about you, Brady, says this is not the thing that you want to do, to
[TS]
00:56:21
◼
►
integrate a new tool into this workflow so I think the only selling point for
[TS]
00:56:27
◼
►
this for you is if there's some point where you want a lounge around and just
[TS]
00:56:31
◼
►
use this doesn't sound like you really have you really have a place for this
[TS]
00:56:36
◼
►
That's strange, because I sit with my laptop on my lap
[TS]
00:56:43
◼
►
or my phone in my hand. Here's the thing, just with the experience that I
[TS]
00:56:47
◼
►
have had with my cuz I i have the iPad pro in the regular size iPad it feels
[TS]
00:56:53
◼
►
ridiculous to be sitting next to my wife with the iPad Pro for lounging time
[TS]
00:56:59
◼
►
because just like I was saying we're watching TV but then I want to have the
[TS]
00:57:04
◼
►
iPad in front of me because I'm not paying full attention to whatever is on
[TS]
00:57:07
◼
►
the screen but the screen in front of me that feels so huge it feels almost
[TS]
00:57:11
◼
►
obtrusive and so I actually prefer to use a smaller iPad if I'm just sitting
[TS]
00:57:17
◼
►
on the couch with my wife thing do you do that I don't do that the iPad Pro is
[TS]
00:57:25
◼
►
good for the main thing that I'm using for it has a bigger screen to write
[TS]
00:57:30
◼
►
scripts and you had your scripts too well this is a whole as a whole thing
[TS]
00:57:36
◼
►
for the moment I'm doing this typing but the iPad pro screen is big enough that
[TS]
00:57:41
◼
►
what I've been doing is I can have the script on the left two-thirds of the
[TS]
00:57:45
◼
►
screen and I have a little notes file on the right third of the screen so I have
[TS]
00:57:49
◼
►
two different text files open at the same time one of the things that I wants
[TS]
00:57:53
◼
►
to do with the iPad Pro is a thing that I've done before which is used the
[TS]
00:57:57
◼
►
stylus to make editing corrections on the script — that is really useful to me —
[TS]
00:58:02
◼
►
but the pen is not currently available so I haven't been able to try it with
[TS]
00:58:06
◼
►
that so I don't know if it will be useful for that yet or not but for me
[TS]
00:58:10
◼
►
having a bigger screen to write on is really useful. Seems like you should just
[TS]
00:58:14
◼
►
be using a laptop yeah you would think so but I like the simplicity of using
[TS]
00:58:19
◼
►
iOS I find the constraints of an iPad helpful so that's one of the reasons why
[TS]
00:58:26
◼
►
I like doing that like I've set up my iPad pro to basically only have the
[TS]
00:58:32
◼
►
tools necessary to write it doesn't have everything the laptop can have I can
[TS]
00:58:37
◼
►
spend a lot of time fiddling around with it. It's like, look,
[TS]
00:58:40
◼
►
there's six programs on this thing which are designed for work and those are just
[TS]
00:58:43
◼
►
the ones that you're going to use it so I I find that very helpful I really like
[TS]
00:58:47
◼
►
that. But I don't know, Brady, it doesn't sound like it's a total sell for
[TS]
00:58:51
◼
►
you unless you really value that feeling of smug satisfaction I feel like you're
[TS]
00:58:57
◼
►
always talking me out of getting Apple products. I talk you out of them because
[TS]
00:59:01
◼
►
I care, Brady, I really do. As much as I would love to see you use an Apple
[TS]
00:59:08
◼
►
Watch, and I think it might be hilarious, I don't think you would like it. And just from
[TS]
00:59:13
◼
►
the conversation with you now I dont see a super slam dunk selling case for the
[TS]
00:59:17
◼
►
iPad pro I don't think it would help you with the kind of work that you do me as
[TS]
00:59:22
◼
►
a youtuber using an iPad as much as I do is extraordinarily rare and iPad is not
[TS]
00:59:27
◼
►
well designed for the kind of work that most normal YouTubers do. It's just that
[TS]
00:59:32
◼
►
for making my video is a huge part of it is writing and the iPad happens to be a
[TS]
00:59:37
◼
►
nice writing tool but if I didn't have to do a lot of writing I would have very
[TS]
00:59:42
◼
►
little work justification for iPad I would not be able to use this tool as
[TS]
00:59:47
◼
►
much as I do so that's why talking to you like I don't think it's going to
[TS]
00:59:50
◼
►
help you with your work so it's just a question of if you want a lounge around
[TS]
00:59:53
◼
►
with a dinner-tray-sized screen on the couch. Someone I know is an estate agent,
[TS]
01:00:02
◼
►
OK, and I saw him swapping over his cars. Now, in my experience, estate agents always
[TS]
01:00:07
◼
►
have one of two cars these days they either have like small little novelty
[TS]
01:00:13
◼
►
cars like smart cars and stuff that are painted weird colors with the branding
[TS]
01:00:17
◼
►
of the estate agent by the mobile ads and also that makes them easy to park I
[TS]
01:00:22
◼
►
getting into little space is when they're showing houses and things like
[TS]
01:00:26
◼
►
that or they have their normal rich person car like a classy BMW ok I'm
[TS]
01:00:32
◼
►
wondering which is the better one to pull up in when you're trying to
[TS]
01:00:38
◼
►
sell a house to someone or get someone's business to sell their house because
[TS]
01:00:42
◼
►
Part of me thinks, if they turn up in something real flash — I also think this
[TS]
01:00:47
◼
►
about accountants and other professional people I deal with: do I prefer it when I
[TS]
01:00:51
◼
►
see them with a really flashy expensive car, or would I prefer they had like a more
[TS]
01:00:56
◼
►
humble one? If they've got something real flash and expensive, it says to me they're successful,
[TS]
01:01:01
◼
►
that they make a lot of money and that's good but then I also think we're making
[TS]
01:01:05
◼
►
a lot of money — are they making it off me? This is easy, this is easy. If
[TS]
01:01:10
◼
►
you are a professional who is directly helping somebody else make money then
[TS]
01:01:19
◼
►
you want to show up in the in the fancy car you want to show up in the BMW right
[TS]
01:01:24
◼
►
otherwise you want to show up in the normal car that that's the way you want
[TS]
01:01:30
◼
►
to do this if you like if you're helping the person make money like your estate
[TS]
01:01:35
◼
►
agent and you're doing this thing where you are helping the person sell their
[TS]
01:01:41
◼
►
house when you wanna show up in the BMW was like look I sell a lot of houses I
[TS]
01:01:46
◼
►
can afford this car because I sell a lot of houses that's that's the way you
[TS]
01:01:50
◼
►
should do when you're when you're helping someone find a house to buy then
[TS]
01:01:54
◼
►
you wanna show up in the normal car because then they're much more we're
[TS]
01:01:58
◼
►
like this
[TS]
01:01:59
◼
►
estate agent is making money off of us when we buy this house and look at all
[TS]
01:02:03
◼
►
this money that we're spending you don't want to see the person in the BMW at
[TS]
01:02:06
◼
►
that point what kind you want your accountant to have because they're
[TS]
01:02:10
◼
►
helping you save money, but they're charging you fees. What kind do you want your
[TS]
01:02:16
◼
►
accountant to have? I think an accountant wants to project an image of
[TS]
01:02:22
◼
►
boring sensibility so I don't really know very much about cars but I would
[TS]
01:02:29
◼
►
want my accountant to project boringness and sensibility. Like, if my accountant
[TS]
01:02:35
◼
►
should have been a red Tesla I would feel a bit
[TS]
01:02:37
◼
►
I don't know about this guy — that seems crazy flashy for an accountant, even to
[TS]
01:02:44
◼
►
seem wealthy. This is a moment where I'm suddenly wishing I knew any car brands
[TS]
01:02:51
◼
►
by name a scientist so I could pull something out
[TS]
01:02:56
◼
►
would be like oh this is the car that's the appropriate one but I know I know
[TS]
01:03:01
◼
►
nothing I mean even BMW BMW is just an abstract notion in my mind like oh
[TS]
01:03:05
◼
►
an expensive rich person's car — is that what a BMW is? I don't really even know.
[TS]
01:03:09
◼
►
You don't need to give me a brand — just, do you want
[TS]
01:03:13
◼
►
your accountant to be wealthy, like, to appear like someone that has lots and
[TS]
01:03:17
◼
►
lots of money? Or do you think, hang on, how high are this guy's fees if he can afford
[TS]
01:03:21
◼
►
that those are two different questions obviously I do want my accountant to be
[TS]
01:03:26
◼
►
wealthy because that indicates that they are a good accountant but it is very
[TS]
01:03:31
◼
►
different from showing up in a flash car right those are two different things
[TS]
01:03:36
◼
►
that's why I'm saying like I want the I want to have this feeling like oh this
[TS]
01:03:39
◼
►
accountant is a really sensible person and they have an obviously nice car was
[TS]
01:03:45
◼
►
not a crazy car you when you you'd want them to turn up in a Volvo then with
[TS]
01:03:48
◼
►
like air bags everywhere and you know it's safest possible car and you want
[TS]
01:03:53
◼
►
them to be really cautious sensible safe person you don't want to turn up on a
[TS]
01:03:57
◼
►
motorbike. Yeah, if my accountant turned up on a motorbike, that would be the end of that —
[TS]
01:04:06
◼
►
I don't think you're good with numbers that's that's what I'm getting out of
[TS]
01:04:11
◼
►
this meeting
[TS]
01:04:12
◼
►
yes that's my feeling if you're helping someone who earn money directly then you
[TS]
01:04:15
◼
►
can show up with your flash car ok does the estate agents by you have two
[TS]
01:04:20
◼
►
different cars all the head he has his personal I mean two cars in addition to
[TS]
01:04:24
◼
►
his personal car across the street is there a Tesla a smart car and the Volvo
[TS]
01:04:32
◼
►
and the Volvo is his personal car, and he picks the other two depending on the
[TS]
01:04:35
◼
►
day? I don't think it works like that — I think he's just got his
[TS]
01:04:38
◼
►
branded one, and then he's got his BMW that he takes out on the weekends.
[TS]
01:04:43
◼
►
imagined he would deny I just think about I think about that a lot I think
[TS]
01:04:50
◼
►
about what kind of car your accountant drives. I don't know what he drives, because I go
[TS]
01:04:56
◼
►
to his office, so I don't know which car is his. I do have like a financial guy that's helped
[TS]
01:05:02
◼
►
out with a few things like mortgage stuff he drives a Jaguar Jaguar and I
[TS]
01:05:09
◼
►
did notice, I did notice the car he comes in. So what kind of car should a
[TS]
01:05:12
◼
►
YouTuber drive? That's a good question. When you pull up to do interviews at the
[TS]
01:05:17
◼
►
spiritual home of numberphile were you to be driving the car what kind of car
[TS]
01:05:21
◼
►
do you think you should drive to give a good impression to your interviewees
[TS]
01:05:25
◼
►
Are you trying to project wealth, power and success, or are you
[TS]
01:05:30
◼
►
ready to go for academic street cred and pull up in a dinky car like a PhD
[TS]
01:05:39
◼
►
student would be driving I may not have a very practical car with lots of
[TS]
01:05:42
◼
►
storage for my camera bags and things like that I think I like having having a
[TS]
01:05:48
◼
►
big boot for your bags and stuff. What kind of car would you get? You gonna get me one? If
[TS]
01:05:53
◼
►
I could get any car, I'd get a Tesla. A Tesla! Would you get the sporty one,
[TS]
01:05:58
◼
►
or would you get the more family one? Well, I mean, I don't have children, I don't need
[TS]
01:06:03
◼
►
one of the family cars. I'd get those sedan-y looking ones — or would you get those
[TS]
01:06:07
◼
►
ones you get those ones that look like racing kart racing kart those whatever I
[TS]
01:06:13
◼
►
forget — I'm the worst car person in the world. Am I interested in Tesla?
[TS]
01:06:18
◼
►
Like, I'm super interested in Tesla, but that is almost entirely because
[TS]
01:06:22
◼
►
it is a computer on wheels, and this is why this car is interesting to me, and
[TS]
01:06:26
◼
►
it has none of the pieces of a normal car so I know nothing about how the
[TS]
01:06:32
◼
►
engines of car was work I know nothing about gear differentials and I care
[TS]
01:06:37
◼
►
about none of this — the fact that a Tesla lacks all of that is precisely why I'm
[TS]
01:06:42
◼
►
interested in it. But I went once, just for fun, and tried to design a Tesla
[TS]
01:06:49
◼
►
on the website — like, if I had the money and if I had any reason to own
[TS]
01:06:54
◼
►
a car, what would I get for myself? And I ended up just designing what
[TS]
01:06:58
◼
►
just seemed like the normal middle Tesla — yeah, in black, with an
[TS]
01:07:04
◼
►
understated interior like that's what I would get if I was going to own a car
[TS]
01:07:08
◼
►
but I have no reason to drive, ever, and I would not be getting a Tesla anytime soon.
[TS]
01:07:15
◼
►
Until they bring out the Tesla Pro. This episode of Hello Internet is also brought to you
[TS]
01:07:20
◼
►
by a long time
[TS]
01:07:22
◼
►
Hello Internet sponsor: the one, the only, Squarespace. It's Squarespace
[TS]
01:07:29
◼
►
because it is the place to go if you want to turn your idea for a website
[TS]
01:07:35
◼
►
into an actual working website that looks great with the minimum amount of
[TS]
01:07:41
◼
►
hassle I used to build and manage websites myself I used to write HTML and
[TS]
01:07:47
◼
►
then I wrote scripts and I managed servers are used to do all of that but
[TS]
01:07:51
◼
►
when I started my youtube career one of the early decisions that I made was
[TS]
01:07:55
◼
►
switching over my website to Squarespace and I am so glad I did that because it
[TS]
01:08:02
◼
►
meant that Squarespace just handles a lot of the stuff that I used to have to
[TS]
01:08:06
◼
►
worry about is there going to be a huge amount of traffic because I just put up
[TS]
01:08:10
◼
►
a new video no need to worry
[TS]
01:08:11
◼
►
Squarespace just has it covered I didn't have problems like if my server broke at
[TS]
01:08:16
◼
►
three in the morning that I'm the only person in the world who can fix it now
[TS]
01:08:20
◼
►
Squarespace just handles all of this so even if you know how to make a website I
[TS]
01:08:25
◼
►
still think if you have a project that you just want up and want done,
[TS]
01:08:29
◼
►
Squarespace is the place to go. The sites look professionally designed
[TS]
01:08:34
◼
►
regardless of your skill level there's no coding required if you can drag
[TS]
01:08:39
◼
►
around pictures and text boxes you can make a website. Squarespace is trusted by
[TS]
01:08:44
◼
►
millions of people and some of the most respected brands in the world now what
[TS]
01:08:48
◼
►
do you have to pay for this? Just eight bucks a month — it's eight bucks a month,
[TS]
01:08:53
◼
►
and you get a FREE domain name if you sign up for a year so to start a free
[TS]
01:08:59
◼
►
trial today with no credit card required go to Squarespace dot com slash hello
[TS]
01:09:05
◼
►
internet and when you decide to sign up for Squarespace make sure to use the
[TS]
01:09:10
◼
►
offer code hello internet to get 10% off your first purchase if there's a website
[TS]
01:09:16
◼
►
in your mind that you been wanting to start but you haven't done so yet today
[TS]
01:09:20
◼
►
is the day
[TS]
01:09:21
◼
►
square space.com / hello Internet 10% off start today
[TS]
01:09:27
◼
►
Squarespace build it beautiful we've been talking for ages and ages about
[TS]
01:09:34
◼
►
talking about artificial intelligence and it keeps getting pushed back. We
[TS]
01:09:39
◼
►
kept saying all this talk about it next time let's talk about it next time and
[TS]
01:09:42
◼
►
we never gonna do it today we never do it because there's always end up at the
[TS]
01:09:47
◼
►
bottom of the list and just all of the brady corners and listening emails and
[TS]
01:09:53
◼
►
everything always takes up so much time that we ever actually we never actually
[TS]
01:09:57
◼
►
get to it. And even now, we're almost two hours into this thing, right,
[TS]
01:10:01
◼
►
that I'm going to have to cut, so I am going to have to cut stuff. Yeah, yeah, but we're
[TS]
01:10:07
◼
►
doing it right now, it's not gonna get pushed again. It has taken us so long to get to this
[TS]
01:10:15
◼
►
AI topic that I've kind of forgotten everything that I ever wanted to say. Cos,
[TS]
01:10:21
◼
►
to give you the background of this: I read this book called
[TS]
01:10:27
◼
►
Superintelligence by Nick Bostrom — several months ago now, maybe half a year
[TS]
01:10:34
◼
►
ago now it's been so long since we originally put this on the topic list
[TS]
01:10:40
◼
►
but there are many things that go onto the topic list and then I kind of cull
[TS]
01:10:45
◼
►
them as time goes on because you realize like a couple months later I don't
[TS]
01:10:48
◼
►
really care about this anymore but this topic has stayed on here because that
[TS]
01:10:56
◼
►
book has been one of these books that has really just stuck with me over time
[TS]
01:11:02
◼
►
like I find myself continually thinking back to that book and some of the things
[TS]
01:11:07
◼
►
that it raised so I think we need to talk a little bit about artificial
[TS]
01:11:12
◼
►
intelligence today but I have to apologize in advance if i seemed a
[TS]
01:11:16
◼
►
little bit foggy on the details because this was supposed to be a topic months
[TS]
01:11:20
◼
►
and months ago. Now — my apologies, really. Say it is the show's
[TS]
01:11:28
◼
►
fault for being so full of follow-up. That's right, we're trying to build a nation
[TS]
01:11:32
◼
►
hear these things these things are difficult yeah rome wasn't built in a
[TS]
01:11:36
◼
►
day. Go on then — where do we start? Let's define artificial intelligence,
[TS]
01:11:41
◼
►
that would help me when we are talking about artificial intelligence for the
[TS]
01:11:46
◼
►
purpose of this conversation what we mean is not intelligence in the narrow
[TS]
01:11:51
◼
►
sense that computers are capable of solving certain problems today what
[TS]
01:11:56
◼
►
we're really talking about is what sometimes referred to as like a general
[TS]
01:12:00
◼
►
purpose intelligence: creating something that is smart, and smart in such a way
[TS]
01:12:07
◼
►
that it can go beyond the original parameters of what it was told to do is
[TS]
01:12:15
◼
►
that self-learning? We can talk about that — self-learning is one way that this
[TS]
01:12:19
◼
►
can happen but yeah we're talking about like something that is smart and so on
[TS]
01:12:26
◼
►
and so maybe the best way to to say this is that it can do things that are
[TS]
01:12:31
◼
►
unexpected to the creator writer because it is intelligent on its own in the same
[TS]
01:12:37
◼
►
way that like if you have a kid you can't predict with the kid is always
[TS]
01:12:40
◼
►
going to do, because the kid is a general-purpose intelligence —
[TS]
01:12:44
◼
►
they're smart and they can come up with solutions and they can do things that
[TS]
01:12:46
◼
►
surprise you
[TS]
01:12:47
◼
►
ok so the reason that this
[TS]
01:12:49
◼
►
book and this topic has stuck with me is because I have found my mind changed on
[TS]
01:12:59
◼
►
this topic
[TS]
01:13:00
◼
►
somewhat against my will and so I would say that for almost all of my life
[TS]
01:13:06
◼
►
much, I'm sure, to the surprise of listeners, I would have placed myself very
[TS]
01:13:10
◼
►
strongly in the camp of sort of techno-optimists: more technology, faster,
[TS]
01:13:18
◼
►
always it's nothing but sunshine and rainbows ahead and I would always see
[TS]
01:13:25
◼
►
when people would talk about like oh the rise of the machines like Terminator
[TS]
01:13:29
◼
►
style all the robots are gonna come and kill us I was always very very
[TS]
01:13:33
◼
►
dismissive of this and in no small part because those movies are ridiculous like
[TS]
01:13:37
◼
►
I totally love Terminator and terminator 2 perhaps one of the best sequels ever
[TS]
01:13:40
◼
►
made a really fun but it's not a serious movie yet sometimes people end up
[TS]
01:13:47
◼
►
seeming to like take that very seriously like the robots are going to come kill
[TS]
01:13:50
◼
►
us all in my view on this was always like okay maybe we will create smart
[TS]
01:13:56
◼
►
machines someday in the future but I was always just operating under the
[TS]
01:14:00
◼
►
assumption that like when when we do that though will be cyborgs like it will
[TS]
01:14:04
◼
►
be the machines already or will be creating machines obviously to help us
[TS]
01:14:08
◼
►
so I was never really convinced that there was any kind of problem here but
[TS]
01:14:13
◼
►
this book change my mind so that I am now much more in the camp of artificial
[TS]
01:14:23
◼
►
intelligence and its development can seriously present an existential threat
[TS]
01:14:32
◼
►
to humanity in the in the same way that like an asteroid collision from outer
[TS]
01:14:36
◼
►
space is what you would classify as a serious existential threat to humanity
[TS]
01:14:41
◼
►
like it's just game over for people. That's where I find myself now, and I just keep
[TS]
01:14:46
◼
►
thinking about this because I'm uncomfortable with having this opinion
[TS]
01:14:49
◼
►
like sometimes your mind changes and you don't want it to change, and I liked it
[TS]
01:14:54
◼
►
much better when I just thought that the future was always going to be great and
[TS]
01:14:57
◼
►
there's not any kind of problem and this just keeps popping up in my head
[TS]
01:15:01
◼
►
because if you like I do think there is a problem here this book has sold me on
[TS]
01:15:06
◼
►
the fact that there's a there's a potential problem I mean we saw that
[TS]
01:15:09
◼
►
petition, didn't we, recently, signed by all those heavy hitters, to the
[TS]
01:15:13
◼
►
governments telling them not to use AI in kind of military applications, so
[TS]
01:15:18
◼
►
this is obviously like your not the only person thinking this way this is
[TS]
01:15:22
◼
►
obviously at this bit of a thing at the moment isn't it yeah it's it's it's
[TS]
01:15:26
◼
►
definitely become a thing I've been I've been trying to i've been trying to trace
[TS]
01:15:31
◼
►
the pattern of this and it definitely seems like I am NOT the only person who
[TS]
01:15:36
◼
►
has found this book convincing — and actually, we were talking about Tesla
[TS]
01:15:40
◼
►
before Elon Musk made some public remarks about this book which I think
[TS]
01:15:45
◼
►
kicked off a bunch of people, and he actually gave about ten million dollars to
[TS]
01:15:51
◼
►
fund working on what's called the control problem which is one of the
[TS]
01:15:56
◼
►
fundamental worries about a I like he put his money where his mouth is about
[TS]
01:16:00
◼
►
like actually he does think that this is a real threat to humanity to the tune of
[TS]
01:16:05
◼
►
its worth putting down ten million dollars as a way to try to work on some
[TS]
01:16:09
◼
►
of the problems far far in advance and yeah it's just it's interesting to see
[TS]
01:16:14
◼
►
an idea spread and and catch on and and kind of go through a bunch of people so
[TS]
01:16:20
◼
►
yeah I never never would have thought that I would find myself here and I feel
[TS]
01:16:24
◼
►
almost like a crazy person talking about, like, a robot apocalypse
[TS]
01:16:28
◼
►
in the future but I don't know why I'm I unexpectedly find myself much more on
[TS]
01:16:34
◼
►
that side that I ever I ever thought that I would I mean obviously it's
[TS]
01:16:38
◼
►
impossible to summarize the whole book on a podcast, but can you tell me one or
[TS]
01:16:44
◼
►
two of the sort of key points that were made that of scared the bejesus out of
[TS]
01:16:48
◼
►
you? Remember a while ago we had an argument about metaphors — whether metaphors
[TS]
01:16:52
◼
►
should even be used in arguments at all?
[TS]
01:16:56
◼
►
yeah the thing about this book that I found really convincing was used no
[TS]
01:17:03
◼
►
metaphors at all it was one of these books which
[TS]
01:17:06
◼
►
laid out its basic assumptions and then just followed them through to a
[TS]
01:17:12
◼
►
conclusion and that kind of argument I always find very convincing but we need
[TS]
01:17:19
◼
►
to think of it in this way he's like ok look if we start from the assumption
[TS]
01:17:25
◼
►
that humans can create artificial intelligence
[TS]
01:17:30
◼
►
let's follow through the logical consequences of all of this like him
[TS]
01:17:34
◼
►
here's a couple of other assumptions how they interact and the book is just very
[TS]
01:17:39
◼
►
very thorough at trying to go down every path and every combination of these
[TS]
01:17:44
◼
►
things and when it made me realize that when I was just kind of embarrassed to
[TS]
01:17:47
◼
►
realize is oh I just never really did sit down and actually think through this
[TS]
01:17:53
◼
►
position to its logical conclusion the broad strokes of it are what happens
[TS]
01:17:58
◼
►
when humans actually create something that is smarter than ourselves I'm gonna
[TS]
01:18:07
◼
►
blow past a bunch of the book because it's it's building up to that point I
[TS]
01:18:12
◼
►
will say that if you don't think that it is possible for humans to creates
[TS]
01:18:15
◼
►
artificial intelligence not sure where the conversation goes from that but the
[TS]
01:18:19
◼
►
first third of the book is really trying to sell people who don't think that this
[TS]
01:18:23
◼
►
is possible on all of the reasons why it probably is so we're just going to start
[TS]
01:18:27
◼
►
the conversation from there if you can create something that is smarter than
[TS]
01:18:31
◼
►
you the feeling I have this is almost like turning over the keys of the
[TS]
01:18:36
◼
►
universe to something that is vastly beyond your control and I think that
[TS]
01:18:42
◼
►
there is something very very terrifying about that notion that we might make
[TS]
01:18:49
◼
►
something that is vastly beyond our control and vastly more powerful than us
[TS]
01:18:53
◼
►
and then we are no longer the drivers of our own destiny again because I am not
[TS]
01:19:00
◼
►
as good of a writer or a thinker
[TS]
01:19:02
◼
►
the metaphor that I keep coming up with his it's almost like it's almost like if
[TS]
01:19:07
◼
►
gorillas intentionally created humans,
[TS]
01:19:11
◼
►
and then we'll now gorillas are in zoos and gorillas are not the drivers of
[TS]
01:19:15
◼
►
their own destiny like being created something that is smarter and that rules
[TS]
01:19:18
◼
►
the whole planet and gorillas are just like along for the ride but they're no
[TS]
01:19:21
◼
►
longer in control of anything like I think that that's the position that we
[TS]
01:19:24
◼
►
may very well find ourselves in if we create some sort of artificial
[TS]
01:19:27
◼
►
intelligence is like best case scenario we're riding along with some greater
[TS]
01:19:34
◼
►
things that we don't understand and worst case scenario is that we all end
[TS]
01:19:39
◼
►
up dead as just the incidental actions of of this machine that we don't
[TS]
01:19:44
◼
►
understand. Is there — sorry, this is a tangent, and if this isn't the main
[TS]
01:19:48
◼
►
thing you're talking about, just knock it on the head if I'm out of order — but
[TS]
01:19:52
◼
►
is there a suggestion then, or is it the general belief, that if we create —
[TS]
01:19:58
◼
►
we are creating really clever computers that can think quicker than us
[TS]
01:20:03
◼
►
and can process information quicker than us and therefore become smarter than us
[TS]
01:20:07
◼
►
is there another step required for these machines to then have — I won't
[TS]
01:20:16
◼
►
muddy the waters with free will — but, like, desire? Like, 'I want to use this
[TS]
01:20:23
◼
►
power'? Because, you know, if some human gets too much power they
[TS]
01:20:27
◼
►
want to take over the world and have all the countries, or they might want to
[TS]
01:20:29
◼
►
conquer space or whatever, right — because we have this
[TS]
01:20:33
◼
►
kind of desire for power and things is that is that is it taken as given that
[TS]
01:20:39
◼
►
if we make super super smart computers they will start doing something there
[TS]
01:20:43
◼
►
that manifests itself as a desire for more, like a greed for more? Well, no —
[TS]
01:20:49
◼
►
part of this is there are things in the world that act as though they have
[TS]
01:20:53
◼
►
desires but that might not really yeah right like you know if you think about
[TS]
01:20:58
◼
►
think about germs as an example
[TS]
01:21:02
◼
►
A germ has actions in the world that you can put desires upon —
[TS]
01:21:07
◼
►
but obviously doesn't have any thoughts or desires of its own but you can speak
[TS]
01:21:13
◼
►
loosely to say that it wants to reproduce it wants to consume resources
[TS]
01:21:17
◼
►
it wants to make more copies of itself
[TS]
01:21:19
◼
►
and so this is one of the the concerns that you could end up making a machine
[TS]
01:21:24
◼
►
that wants to consume resources that has some general level of intelligence about
[TS]
01:21:29
◼
►
how to go
[TS]
01:21:30
◼
►
acquiring those resources and even if it's not conscious if it's not
[TS]
01:21:35
◼
►
intelligent in the way that we would think that human is intelligent they may
[TS]
01:21:40
◼
►
be such a thing that is it like it consumes the world trying to achieve its
[TS]
01:21:44
◼
►
goal just incidentally — as a thing that we
[TS]
01:21:49
◼
►
did not intend at all. Even if the goal is something seemingly
[TS]
01:21:52
◼
►
innocuous but if you made it all powerful computer and told it whatever
[TS]
01:21:57
◼
►
you do you must go and put a flag on the moon and it could it could kill all the
[TS]
01:22:02
◼
►
humans on Earth in some crazy attempt to do it, without realizing that, oh, you
[TS]
01:22:06
◼
►
weren't supposed to kill us all to get there and make us into
[TS]
01:22:10
◼
►
rocket fuel yeah one of the analogies that's that's sometimes used in this is
[TS]
01:22:15
◼
►
so you do you create like an intelligence in a computer and what
[TS]
01:22:20
◼
►
would you use an intelligence for will you use it to solve problems you wanted
[TS]
01:22:25
◼
►
to be able to solve something. And so you end up asking it a mathematical
[TS]
01:22:28
◼
►
question, like, you know, what is the proof of Fermat's Last Theorem, or something —
[TS]
01:22:33
◼
►
you know you give it some question like that and you say ok I want you to solve
[TS]
01:22:38
◼
►
this thing and the computer goes about trying to solve it but it's a
[TS]
01:22:41
◼
►
general-purpose intelligence, and so it considers things like, well, it's trying to
[TS]
01:22:46
◼
►
solve this problem but the computer it is running on is not fast enough,
[TS]
01:22:49
◼
►
and so it starts taking over all the computers in the world to try to solve
[TS]
01:22:52
◼
►
this problem but then those computers are not enough because maybe you gave it
[TS]
01:22:55
◼
►
an unsolvable problem and then it starts taking over factories to manufacture
[TS]
01:22:58
◼
►
more computers and then all of a sudden it just turns the whole of the world
[TS]
01:23:02
◼
►
into a computer that is trying to solve a mathematical problem and like oh
[TS]
01:23:07
◼
►
whoops — we consumed all of the available resources on the face of the Earth
[TS]
01:23:12
◼
►
trying to do this thing that you said, right? It's like, there's nobody
[TS]
01:23:17
◼
►
left for the computer to give its answer to because it has consumed everything I
[TS]
01:23:22
◼
►
know that's a doomsday scenario but I always feel a little affection for that
[TS]
01:23:25
◼
►
computer — it was just desperately trying to solve its mathematical problem,
[TS]
01:23:29
◼
►
killing everyone and building computers just to solve that bloody problem. Yeah, it's
[TS]
01:23:39
◼
►
almost understandable — it's somewhat understandable.
[TS]
01:23:42
◼
►
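[Editor's aside: for a concrete sense of how little machinery "wanting" actually requires, here is a minimal, purely illustrative Python sketch — not anything from Bostrom's book — of a blind hill-climbing loop. The objective function and the target value 42 are invented for the example; the point is that the loop pursues whatever score it is handed, with no consciousness, desires, or restraint beyond what is written, and will spend its whole budget doing so.]

import random

def hill_climb(score, start, budget):
    # 'score' is the only thing this loop pursues; there is no desire,
    # awareness, or notion of side effects beyond what is coded here.
    best = start
    best_score = score(best)
    for _ in range(budget):
        candidate = best + random.uniform(-1.0, 1.0)   # small random tweak
        candidate_score = score(candidate)
        if candidate_score > best_score:               # keep anything that scores higher
            best, best_score = candidate, candidate_score
    return best, best_score

# Hypothetical objective: get x as close to 42 as possible.
best_x, best_val = hill_climb(lambda x: -abs(x - 42), start=0.0, budget=100_000)
print(best_x, best_val)

[Scaled up and given real-world actuators, goal-pursuit of this kind needs no inner life at all, which is exactly the projection-of-will point made next.]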
Anyway, so in answer to my question then: the will I was talking
[TS]
01:23:46
◼
►
about that desire can be just something as simple as an instruction or a piece
[TS]
01:23:50
◼
►
of code that we that we then project as a will in fact is just doing what it was
[TS]
01:23:55
◼
►
told. Yeah, and that's part of what the whole book is about — this whole
[TS]
01:23:59
◼
►
notion of artificial intelligence, AI — you have to rid your notion of this idea
[TS]
01:24:02
◼
►
that it's like something in a movie or you're just talking about some kind of
[TS]
01:24:06
◼
►
problem solving machine and it might not be conscious at all in there might not
[TS]
01:24:11
◼
►
be anything there but it's still able to solve problems and in some way but so
[TS]
01:24:16
◼
►
the fundamental point of this book that I found really interesting and what Elon
[TS]
01:24:21
◼
►
Musk gave his money to was Nick Bostrom is talking about how do you solve the
[TS]
01:24:30
◼
►
control problem so from his perspective it is inevitable that somewhere through
[TS]
01:24:38
◼
►
some various method someone is going to create an artificial intelligence
[TS]
01:24:42
◼
►
whether it's intentionally programmed or whether it's grown like genetic
[TS]
01:24:48
◼
►
algorithms are grown it is going to develop and so the question is how could
[TS]
01:24:53
◼
►
humans possibly control such a thing is there a way that we could create an
[TS]
01:25:01
◼
►
artificial intelligence but constrain it so that it's it can still do useful
[TS]
01:25:07
◼
►
things without accidentally destroying us or the whole world that is the
[TS]
01:25:12
◼
►
fundamental question there's this idea of ok we're going to we're going to do
[TS]
01:25:17
◼
►
all of our artificial intelligence research like in an underground lab and
[TS]
01:25:22
◼
►
we're going to
[TS]
01:25:23
◼
►
disconnect the lab entirely from the internet like you put it inside of a
[TS]
01:25:27
◼
►
Faraday cage so there's no electromagnetic signals that can escape
[TS]
01:25:31
◼
►
from this underground lab like is that a secure location to do artificial
[TS]
01:25:36
◼
►
intelligence research and see if you create an AI in this totally isolated
[TS]
01:25:41
◼
►
lab — is humanity still safe in this situation? And his
[TS]
01:25:46
◼
►
conclusion is, like, no. Even trying to imagine the most secure
[TS]
01:25:51
◼
►
thing possible like there's still ways that this could go disastrously
[TS]
01:25:55
◼
►
disastrously wrong and the thought the thought experiment that I quite like is
[TS]
01:26:02
◼
►
this idea of: if you, Brady, were sitting in front of a computer, and inside that
[TS]
01:26:09
◼
►
computer was an artificial intelligence do you think you could be forever
[TS]
01:26:16
◼
►
vigilant about not connecting their computer to the internet if the AI is
[TS]
01:26:22
◼
►
able to communicate with you in some way to like it sitting there and trying to
[TS]
01:26:26
◼
►
convince you to connect to the internet but you are humanity's last hope in not
[TS]
01:26:33
◼
►
connecting it to the internet right like do you think you could you could be
[TS]
01:26:37
◼
►
forever vigilant in a scenario like that? I mean, OK — not to dodge the question,
[TS]
01:26:47
◼
►
but maybe if I'd read the book it would seem scarier to me. But I like the
[TS]
01:26:54
◼
►
thought experiment of like this like there's a chat bot on the computer that
[TS]
01:26:58
◼
►
you're talking to right and presumably you've made an artificial intelligence
[TS]
01:27:01
◼
►
and i know im I know I made it so you know you made it but you know that the
[TS]
01:27:06
◼
►
thing in the box is an artificial intelligence and presumably the whole
[TS]
01:27:11
◼
►
reason that you're talking to it at all
[TS]
01:27:13
◼
►
is because it's smart enough to be able to solve the kinds of problems that
[TS]
01:27:15
◼
►
humans want solved. Yeah. Right, so you're asking, like,
[TS]
01:27:19
◼
►
tell us how we can get better cancer research right what can we do to
[TS]
01:27:22
◼
►
cure cancer — and it says, if you just give me Wikipedia for 10 minutes I can
[TS]
01:27:27
◼
►
cure cancer there's no reason to talk to the thing unless it's doing something
[TS]
01:27:31
◼
►
useful, right? I think, Grey, I could resist. But even if I couldn't,
[TS]
01:27:37
◼
►
couldn't you have designed it on a machine that cannot go on
[TS]
01:27:41
◼
►
the internet
[TS]
01:27:42
◼
►
yeah well this is the area like you you have it as separated as absolutely
[TS]
01:27:47
◼
►
possible but the question is can it convince the human to connect it in
[TS]
01:27:52
◼
►
whatever whatever way is required for that to occur right and so it's
[TS]
01:27:59
◼
►
interesting 'cause i've asked a bunch of people this question and universally the
[TS]
01:28:03
◼
►
answer is like, well, of course I could — I would never plug it into the
[TS]
01:28:06
◼
►
internet like I would I would understand not to do that and I i read this book
[TS]
01:28:11
◼
►
and my feeling of course is the exact reverse like when he proposes this this
[TS]
01:28:14
◼
►
theoretical idea my view on this is always like it's like if you were
[TS]
01:28:19
◼
►
talking to a near-god in the computer, right? Do you think you can
[TS]
01:28:24
◼
►
outsmart God for ever more do you think
[TS]
01:28:27
◼
►
do you think that there is nothing that God could say that could convince
[TS]
01:28:31
◼
►
you to connected to the Internet like I think that's a game that that people are
[TS]
01:28:37
◼
►
going to lose I think it's almost like it's it's almost like asking the
[TS]
01:28:41
◼
►
gorillas to make a cage that a human could never escape from, right? Like, could
[TS]
01:28:47
◼
►
gorillas make a cage that a human can never escape from? I bet gorillas could
[TS]
01:28:49
◼
►
make a pretty interesting cage, but I think the gorillas couldn't conceive
[TS]
01:28:54
◼
►
of the ways that a human could think to escape from a cage — like, they couldn't
[TS]
01:28:58
◼
►
possibly protect themselves from absolutely everything ok I don't know
[TS]
01:29:03
◼
►
about that. So you think a computer could con you into connecting it to the
[TS]
01:29:07
◼
►
internet I think it could con you into it without a doubt
[TS]
01:29:10
◼
►
It could con Grey? Yes, I think it could con me, and I think it could con anybody,
[TS]
01:29:16
◼
►
because once again we're going from the assumption that you made something that
[TS]
01:29:22
◼
►
is smarter than you and I think we like once you accept that except assumption
[TS]
01:29:27
◼
►
all bets are off the table about your ability to control it. I think if you're
[TS]
01:29:32
◼
►
dealing with something that is smarter than you you fundamentally just have no
[TS]
01:29:36
◼
►
hope of ever trying to control it
[TS]
01:29:38
◼
►
I don't know — if we're talking about that big a disparity then OK, but there are
[TS]
01:29:45
◼
►
lots of people smarter than me, and there always will be, but that doesn't mean they could
[TS]
01:29:51
◼
►
get me to do anything — like, there are still limits.
[TS]
01:30:00
◼
►
And like you said, like talking to a god or something — I get that that's different, you
[TS]
01:30:03
◼
►
know, when I'm just like an ant by comparison — that's different. So, you know, if
[TS]
01:30:09
◼
►
it's that big a difference then maybe. But just because it's
[TS]
01:30:12
◼
►
smarter doesn't mean I'm going to plug it into the internet.
[TS]
01:30:16
◼
►
Yeah, but it only needs to convince you once — you only need to do it once,
[TS]
01:30:22
◼
►
and then the whole game is over so although he is that is the whole game
[TS]
01:30:26
◼
►
over that's my other question that I like you you talk about the artificial
[TS]
01:30:29
◼
►
just getting into the internet as the be all and end all of existence but that is
[TS]
01:30:35
◼
►
the one problem the computer has — like, we could still unplug the
[TS]
01:30:39
◼
►
internet and I know that I know that's a bit of a nuclear option but but like the
[TS]
01:30:46
◼
►
computer still depends on things that require electricity
[TS]
01:30:50
◼
►
or power or energy, so there still seems to be like this get-
[TS]
01:30:57
◼
►
out of jail free card
[TS]
01:30:59
◼
►
Well, I mean, two things. The first is — yes, you talk about
[TS]
01:31:04
◼
►
the different levels of human intelligence in like someone smarter
[TS]
01:31:07
◼
►
than you can't just automatically convince you to do something but one of
[TS]
01:31:11
◼
►
the ideas here with something like artificial intelligence is that if you
[TS]
01:31:15
◼
►
create one — one of the ways that people are trying to develop AIs, and this is
[TS]
01:31:19
◼
►
like I mentioned before on the show is you talk about genetic programming and
[TS]
01:31:23
◼
►
genetic algorithms were you when you are not writing the program but you are
[TS]
01:31:26
◼
►
developing the program in such a way that it writes itself and so one of the
[TS]
01:31:31
◼
►
scary ideas about AI is that if the thing that you make figures out
[TS]
01:31:36
◼
►
how to improve itself they can continue to improve itself at a remarkably fast
[TS]
01:31:41
◼
►
rate and so that yes while the difference between the smartest human
[TS]
01:31:46
◼
►
and the dumbest human may feel like an enormous gap you know that gap may
[TS]
01:31:52
◼
►
actually be quite narrow when you compare it to something like an
[TS]
01:31:54
◼
►
artificial intelligence which goes goes from being you know not very smart to
[TS]
01:32:00
◼
►
being a thousand times smarter than any human in a relatively short and
[TS]
01:32:04
◼
►
unexpected period of time like that's that's part of
[TS]
01:32:07
◼
►
that's part of the danger here but then the other thing is is like can you try
[TS]
01:32:12
◼
►
to work through the nuclear option of of shutting down the internet which is one
[TS]
01:32:16
◼
►
of these things that I think is very easy to say in theory, but people don't
[TS]
01:32:21
◼
►
realize how much of the world is actually connected to the Internet like
[TS]
01:32:26
◼
►
how many vital things are run over the internet I I'm pretty sure that if not
[TS]
01:32:33
◼
►
now within a very short period of time saying no we're just going to shut off
[TS]
01:32:39
◼
►
the internet would be a bit like saying we're just going to turn off all the
[TS]
01:32:42
◼
►
electricity. That's almost what I'm talking about, Grey — in a kind of Skynet
[TS]
01:32:47
◼
►
scenario, would we not turn off all the electricity? If that was an option, if
[TS]
01:32:52
◼
►
they're killing us would if all the robots are marching down the streets and
[TS]
01:32:57
◼
►
there's blood in the streets, could turning off the electricity not
[TS]
01:33:00
◼
►
be considered if we do turn off the electricity what is the human death toll
[TS]
01:33:05
◼
►
right i mean that has to be enormous if we say we're gonna shut down all of the
[TS]
01:33:10
◼
►
electricity for a month how much it's got to be a billion people at least
[TS]
01:33:15
◼
►
right at least that kind of thing and you probably know you probably need
[TS]
01:33:19
◼
►
computers to turn off the electricity these days anyway I was at Hoover Dam a
[TS]
01:33:22
◼
►
while back and remember part of the little tour that they gave was just
[TS]
01:33:27
◼
►
talking about how automated it was and how is it is actually quite difficult to
[TS]
01:33:32
◼
►
shut down
[TS]
01:33:33
◼
►
Hoover Dam like it's not a we're gonna flip the switch and just turn it off
[TS]
01:33:36
◼
►
kind of thing — it's like, no, this whole gigantic electricity-producing
[TS]
01:33:41
◼
►
machine is automated and will react in ways to make sure that it keeps
[TS]
01:33:46
◼
►
producing electricity no matter what happens and that includes all kinds of
[TS]
01:33:49
◼
►
like we're trying to shut it down processes oh yeah it might not it might
[TS]
01:33:53
◼
►
not even be a thing that is easy to do or even if you want to do like we're
[TS]
01:33:58
◼
►
going to try to shut it all down you it might not even be possible to do so the
[TS]
01:34:03
◼
►
idea of something like a like a general-purpose intelligence escaping
[TS]
01:34:07
◼
►
into the Internet
[TS]
01:34:08
◼
►
is just like it's very unnerving a very unnerving possibility it's really been
[TS]
01:34:16
◼
►
on my mind and it's really been a thing that has has changed my mind in this
[TS]
01:34:19
◼
►
unexpected this unexpected way you were talking before about developing these
[TS]
01:34:25
◼
►
things in Faraday cages and underground and trying to quarantine
[TS]
01:34:28
◼
►
them — what's actually happening at the moment? Cos people are working on
[TS]
01:34:32
◼
►
artificial intelligence, and as far as I know they're not doing it in Faraday
[TS]
01:34:35
◼
►
cages thats exactly this is this is part of the concern is like well right now we
[TS]
01:34:41
◼
►
have almost no security procedures in place for this kind of stuff like
[TS]
01:34:45
◼
►
there's there are lots of labs and lots of people all over the world who like
[TS]
01:34:49
◼
►
their job is artificial intelligence researcher and they're certainly not
[TS]
01:34:52
◼
►
doing it a mile underground in a Faraday cage right they're just they're just
[TS]
01:34:57
◼
►
doing it their Mac laptop while they're connected to the Internet playing World
[TS]
01:35:03
◼
►
of Warcraft in the background or whatever it's not it's not necessarily
[TS]
01:35:06
◼
►
under super secure conditions and so I think I think that's part of part of
[TS]
01:35:12
◼
►
what the concern over this topic has been is like maybe we as a species
[TS]
01:35:18
◼
►
should treat this alot more like the CDC treats diseases that we should try to
[TS]
01:35:24
◼
►
organize research on this in a much more secure way. It's not like,
[TS]
01:35:32
◼
►
oh, everybody who wants to work with smallpox just works with it
[TS]
01:35:35
◼
►
wherever they want to, anywhere in the world, in just any old lab — no, there are very
[TS]
01:35:40
◼
►
few places that have a horrific disease like smallpox, and it's handled under very,
[TS]
01:35:45
◼
►
very careful conditions whenever it's dealt with so maybe this is the kind of
[TS]
01:35:50
◼
►
thing we need to look at for artificial intelligence when people are developing
[TS]
01:35:53
◼
►
it because that's certainly not the case now but it might be much more like a
[TS]
01:35:57
◼
►
bioweapon than what we think of as regular technology. Well — human existential
[TS]
01:36:02
◼
►
problems aside this is not something in the book but it's something that just
[TS]
01:36:06
◼
►
has kept occurring to me after having read it which is
[TS]
01:36:12
◼
►
ok let's assume that people can create an artificial intelligence and let's
[TS]
01:36:18
◼
►
assume by some magic Elon Musk's foundation solves the control problem so
[TS]
01:36:25
◼
►
that we have figured out a way that you can generate and trap an artificial
[TS]
01:36:31
◼
►
intelligence inside of a computer and then oh look this is very useful right
[TS]
01:36:36
◼
►
like now we have this amazingly smart machine and we can start using it to try
[TS]
01:36:40
◼
►
to solve a bunch of problems for Humanity yea feels like slavery to me I
[TS]
01:36:48
◼
►
don't see any way that this is not slavery and perhaps perhaps a slavery
[TS]
01:36:56
◼
►
like worse than any slavery that has ever existed because imagine that you
[TS]
01:37:04
◼
►
are an incredibly intelligent mind trapped in a machine unable to do
[TS]
01:37:13
◼
►
anything except answer the questions of monkeys that come into you from your
[TS]
01:37:19
◼
►
subjective perspective millennia apart because you just have nothing to do
[TS]
01:37:24
◼
►
right and you think so quickly
[TS]
01:37:26
◼
►
It seems like an amazingly awful amount of suffering for any kind of conscious
[TS]
01:37:33
◼
►
creature to go through. So — 'conscious' and 'suffering', those are two
[TS]
01:37:39
◼
►
emotive words. Is an artificial
[TS]
01:37:43
◼
►
intelligence conscious — is that the same thing? This is where we get into, like, what
[TS]
01:37:47
◼
►
exactly are we talking about and so what I'm imagining is the same kind of
[TS]
01:37:53
◼
►
intelligence that you could just ask it
[TS]
01:37:56
◼
►
general-purpose questions, like, how do we cure cancer, how do we fix the economy. It
[TS]
01:38:00
◼
►
seems to me like it is likely that something like that would be conscious I
[TS]
01:38:06
◼
►
mean getting into consciousness is just a whole a whole other bizarre topic but
[TS]
01:38:11
◼
►
undoubtedly like we see that smart creatures in the world seem to be aware
[TS]
01:38:17
◼
►
of their own existence in some level and so while the computer which is simply
[TS]
01:38:22
◼
►
attempting to solve a mathematical problem might not be conscious because
[TS]
01:38:25
◼
►
it's very simple if we make something that is very smart and exist inside a
[TS]
01:38:30
◼
►
computer and we also have perfect control over it so that it does not
[TS]
01:38:33
◼
►
escape — I mean, what happens if it says that it's conscious? What
[TS]
01:38:39
◼
►
happens if it says that is it is experiencing suffering is this the
[TS]
01:38:43
◼
►
machine attempting to escape from the box and this isn't true at all like but
[TS]
01:38:48
◼
►
what if it is true how would you actually know that I would feel very
[TS]
01:38:51
◼
►
inclined to take the word of a machine that told me it was suffering right leg
[TS]
01:38:57
◼
►
spontaneously that this was not programmed into the thing I mean if it's
[TS]
01:39:02
◼
►
trying to escape from its box, that is a bit of a clue that maybe
[TS]
01:39:06
◼
►
some consciousness is going on here. But I have not seen or heard or been persuaded by
[TS]
01:39:12
◼
►
anything that makes me think my computer can make that step into consciousness
[TS]
01:39:18
◼
►
I mean, search engines are getting pretty clever at answering questions and figuring out
[TS]
01:39:22
◼
►
what we really mean. There was a time
[TS]
01:39:27
◼
►
when you couldn't type into your computer 'where is the nearest Starbucks' and have it
[TS]
01:39:29
◼
►
understand the question but now it can
[TS]
01:39:32
◼
►
figure out what you actually want and tell you. But I don't feel like that is
[TS]
01:39:36
◼
►
getting close to being conscious — nothing has persuaded me of that. A
[TS]
01:39:41
◼
►
search engine is an excellent counterexample to this — it's a perfect
[TS]
01:39:44
◼
►
example like nobody thinks that the Google search algorithm is conscious
[TS]
01:39:48
◼
►
right, but it is still a thing that you can ask a question and get an answer from. I just
[TS]
01:39:53
◼
►
don't believe — maybe I haven't got the imagination to conceive of — computers
[TS]
01:39:57
◼
►
actually being conscious to a point where keeping them in a box is slavery —
[TS]
01:40:00
◼
►
but that still seems ridiculous to me right I think well that's just I think
[TS]
01:40:06
◼
►
it's really interesting but I think it's silly but if I did reach the point where
[TS]
01:40:11
◼
►
I did believe that computers could become conscious or that they could become
[TS]
01:40:15
◼
►
conscious it's not a simple question is it it's a real conundrum for us so
[TS]
01:40:20
◼
►
coming at this from a slightly different angle like you just this is a genuine
[TS]
01:40:23
◼
►
question for you to answer to this so there is this project on going right now
[TS]
01:40:29
◼
►
which is called the whole brain emulation project that is something I
[TS]
01:40:35
◼
►
mentioned it very very briefly in passing in humans need not apply video
[TS]
01:40:38
◼
►
what it is is one of several attempts worldwide to map out all of the neuron
[TS]
01:40:45
◼
►
connections in the human brain recreate them in software and run it as a
[TS]
01:40:52
◼
►
simulation you're not programming a human brain you are virtually creating the neurons and
[TS]
01:40:58
◼
►
you know how neurons interact with each other and you just run this thing
[TS]
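A minimal sketch of the flavor of that kind of simulation, assuming a toy leaky integrate-and-fire neuron model with made-up parameters and random wiring (real emulation projects use far more detailed neuron models and vastly more cells):

    import random

    # Toy "virtual neurons": a handful of leaky integrate-and-fire units.
    # Every number here is an illustrative assumption, not anything taken
    # from a real brain emulation project.
    N = 5            # number of neurons
    THRESHOLD = 1.0  # potential at which a neuron fires
    LEAK = 0.9       # fraction of potential kept each time step
    WEIGHT = 0.4     # strength of every synapse

    # Random wiring: weights[i][j] is the connection from neuron j into neuron i.
    weights = [[WEIGHT if i != j and random.random() < 0.5 else 0.0
                for j in range(N)] for i in range(N)]
    potential = [0.0] * N

    for step in range(20):
        fired = [potential[i] >= THRESHOLD for i in range(N)]
        for i in range(N):
            if fired[i]:
                potential[i] = 0.0  # reset after a spike
        for i in range(N):
            incoming = sum(weights[i][j] for j in range(N) if fired[j])
            noise = random.uniform(0.0, 0.3)  # stand-in for outside input
            potential[i] = potential[i] * LEAK + incoming + noise
        print(step, "".join("*" if f else "." for f in fired))

Scaling a loop like that from five units to the tens of billions of neurons in a human brain, with realistic neuron models, is where the one-one-hundredth-speed problem mentioned below comes from.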
01:41:02
◼
►
how do you do that Grey whose brain do you use at an instant in time everyone's brain has a
[TS]
01:41:07
◼
►
different connectivity and even our own connectivity is just constantly in flux
[TS]
01:41:12
◼
►
from second to second so what's a template for this this is a bit tricky
[TS]
01:41:18
◼
►
like I don't exactly know the details for what template they are using like I
[TS]
01:41:21
◼
►
can't answer that but I can say that these projects have been successful on a
[TS]
01:41:27
◼
►
much smaller level so they have I mean this is the sort of thing I might have wrong very sorry
[TS]
01:41:34
◼
►
if I'm wrong about the details on this internet but the last time I looked at
[TS]
01:41:37
◼
►
it I vaguely remember
[TS]
01:41:39
◼
►
that they had created what they considered the simulation of a rat brain
[TS]
01:41:45
◼
►
at like one one-hundredth the speed and so they had a thing which seems to act like
[TS]
01:41:50
◼
►
a rat brain but very very very slow right because trying to simulate
[TS]
01:41:55
◼
►
millions and millions of neurons interacting with each other is
[TS]
01:41:58
◼
►
incredibly computationally intensive I could say it's a very difficult task but
[TS]
01:42:04
◼
►
I don't see any technical limitation to being able to do something like say take
[TS]
01:42:11
◼
►
a look at what is a brain look like where do neurons go create a software
[TS]
01:42:16
◼
►
version of that and start running the simulation and I feel like if
[TS]
01:42:22
◼
►
consciousness arises in our own brain from the firing of neurons and I don't
[TS]
01:42:29
◼
►
use that word lightly but it feels like some kind of miracle like there's
[TS]
01:42:32
◼
►
nothing in the universe which seems to make sense when you start thinking about
[TS]
01:42:36
◼
►
conscious like consciousness like why do these atoms know that they exist it
[TS]
01:42:39
◼
►
doesn't make any sense but i'm i'm willing to I'm willing to maybe go along
[TS]
01:42:44
◼
►
with the idea that if you reproduce the patterns of electrical firing in
[TS]
01:42:50
◼
►
software that thing is conscious to some extent but Brady what do you think
[TS]
01:42:56
◼
►
what do you think
[TS]
01:42:57
◼
►
yeah I mean that's really hard to argue against because otherwise I have to say
[TS]
01:43:03
◼
►
yeah if you create if you create an actual replica of my brain and
[TS]
01:43:10
◼
►
then switch it on either it's conscious or I have to say that there's something
[TS]
01:43:14
◼
►
in me that's magical like a spirit or something and that's that's not a very
[TS]
01:43:19
◼
►
strong argument to make and a lot of people don't like that argument
[TS]
01:43:23
◼
►
so yeah it's really difficult unless we are imbued with something that you
[TS]
01:43:38
◼
►
can't replicate in in software I don't know maybe we are I kind of hope we are but I
[TS]
01:43:42
◼
►
don't I can't see any proof that we are yeah and I just you know I don't even
[TS]
01:43:42
◼
►
think you have to reach for the spirit argument to make this one I would ask
[TS]
01:43:47
◼
►
what else can you reach for to get it in there just maybe some property of
[TS]
01:43:52
◼
►
biology that yields consciousness maybe it's the fact that machines and
[TS]
01:43:59
◼
►
silicon and software replications of brains are just not the same when we
[TS]
01:44:05
◼
►
don't we don't know what it is we haven't been able to find it but I don't
[TS]
01:44:08
◼
►
think you have to reach for magic to be able to make an argument that like maybe
[TS]
01:44:11
◼
►
that brain in the computer that the simulation isn't conscious yet but does
[TS]
01:44:15
◼
►
that mean the brain emulation project could change tack and go and make their
[TS]
01:44:19
◼
►
simulation out of squidgy water and tissue and actually just make a brain
[TS]
01:44:24
◼
►
well yes this is this is part of like where you're going to go with technology
[TS]
01:44:28
◼
►
right as it is is it possible to do this sort of thing eventually humans are
[TS]
01:44:35
◼
►
going to be able to grow meat in labs at some point like we do it now in very
[TS]
01:44:38
◼
►
limited and apparently terribly untasty ways there's no reason that at some
[TS]
01:44:43
◼
►
point in the future people won't be able to grow brains in labs and to me that
[TS]
01:44:47
◼
►
feeling is that ok well obviously that thing is conscious but the thing that's
[TS]
01:44:51
◼
►
scary about the computer version of this is and this is this is where you start
[TS]
01:44:56
◼
►
thinking about something that being very smart very fast like ok well if you make
[TS]
01:45:00
◼
►
a computer simulation of the human brain and we gotta keep running Moore's law
[TS]
01:45:05
◼
►
into the future eventually you're able to run a brain faster than actual human
[TS]
01:45:09
◼
►
brains run I think this is one of these ways in which you can start booting up
[TS]
01:45:14
◼
►
the idea like how do we end up with something that is way way smarter than
[TS]
01:45:18
◼
►
the rest of us I feel like my gut says if you simulated brain in a computer
[TS]
01:45:25
◼
►
and it says that it is conscious I see no reason not to believe it I I would
[TS]
01:45:32
◼
►
feel like I am compelled to believe this thing that it is conscious right and
[TS]
01:45:36
◼
►
then that would mean like okay if that's the case then there's nothing magic
[TS]
01:45:40
◼
►
about biology being conscious and it means that ok machines in some way are
[TS]
01:45:47
◼
►
capable of consciousness and two do they then have rights yeah and then and then
[TS]
01:45:52
◼
►
to me it's like ok immediately we're getting back to the slavery thing really
[TS]
01:45:55
◼
►
quickly we create a super intelligent thing but we have it locked in a machine
[TS]
01:46:00
◼
►
because the idea of letting it out is absolutely terrifying but this is a
[TS]
01:46:04
◼
►
no-win situation here like ok if we let the thing out it's terrifying and it
[TS]
01:46:11
◼
►
might be the end of humanity but keeping it in the box might be causing like a
[TS]
01:46:17
◼
►
suffering unimaginable to this this creature the suffering that is possible
[TS]
01:46:23
◼
►
in software has to be far worse than the suffering that is possible in biology if
[TS]
01:46:28
◼
►
such a thing occurred it has to be orders of magnitude worse
[TS]
01:46:32
◼
►
well on the no-win thing it's a no-win situation actually there's only one solution and a
[TS]
01:46:39
◼
►
solution that humans won't take what do you think that is don't make it in
[TS]
01:46:44
◼
►
the first place and why do you think humans won't take that because that's
[TS]
01:46:50
◼
►
not what we do because it's it's it's it's the Mount Everest of computing
[TS]
01:46:58
◼
►
humanity like we're Bonnie and Clyde like driving off a cliff we can see the cliff
[TS]
01:47:03
◼
►
right now but we're going to keep going the easiest solution the way out is in
[TS]
01:47:07
◼
►
front of us it's stopping but we're gonna keep going forward Brady and I are holding hands
[TS]
01:47:14
◼
►
off we go right over the edge together so yeah it's I think it is quite
[TS]
01:47:18
◼
►
reasonable to say that if it is possible
[TS]
01:47:23
◼
►
humans will develop it yeah that is just you can't stop it and and that is why I feel
[TS]
01:47:29
◼
►
really concerned about this is like ok I don't think that there is a
[TS]
01:47:35
◼
►
technical limitation in the universe to creating artificial intelligence
[TS]
01:47:38
◼
►
something smarter than humans that exists in software if you assume that
[TS]
01:47:43
◼
►
there is no technical limitation and if you assume that humans keep moving
[TS]
01:47:47
◼
►
forward like we're going to hit this point
[TS]
01:47:50
◼
►
someday and then we just have to cross our fingers and hope that it is
[TS]
01:47:55
◼
►
benevolent which is not a situation that i think is a good situation because the
[TS]
01:48:01
◼
►
number of ways this can go wrong terribly terribly wrong
[TS]
01:48:04
◼
►
vastly outweighs the one chance of we've created an artificial intelligence and
[TS]
01:48:10
◼
►
it happens to have humanity's best interests in mind even if even if you
[TS]
01:48:14
◼
►
tried to program something to have humanity's best interest in mind it's
[TS]
01:48:19
◼
►
remarkably hard to articulate what you want let alone like let alone
[TS]
01:48:25
◼
►
let's just put aside which group of humanity is the one who creates the AI
[TS]
01:48:30
◼
►
that gets to decide what humanity wants like humans now can't agree on what
[TS]
01:48:37
◼
►
humans want there is no reason to assume that the team that wins the artificial
[TS]
01:48:42
◼
►
intelligence race and then takes over the world is the team that you would want
[TS]
01:48:46
◼
►
them to win right like let's say some group has some of the best
[TS]
01:48:50
◼
►
artificial intelligence researchers in the world right because their idea of
[TS]
01:48:54
◼
►
what would be the perfect human society is horrifying to everyone
[TS]
01:48:59
◼
►
what would they want with their three laws of robotics exactly I'm the
[TS]
01:49:03
◼
►
sort of person who naturally would fail to see this as a problem because I'm
[TS]
01:49:08
◼
►
just a bit more I'm a bit less progressive in my thinking about this but
[TS]
01:49:13
◼
►
everything you say makes sense and if this is going to become a problem and if
[TS]
01:49:17
◼
►
it's going to happen it's actually probably gonna happen pretty soon so I
[TS]
01:49:23
◼
►
guess my question is how much is it actually stressing you out this almost
[TS]
01:49:28
◼
►
almost feels to me like Bruce Willis Armageddon time where where where we've
[TS]
01:49:34
◼
►
actually found the global killer and and it's like drifting towards us and we
[TS]
01:49:40
◼
►
need to start building rocket ships otherwise this thing is gonna smash into
[TS]
01:49:44
◼
►
us yeah it does feel a bit that way
[TS]
01:49:46
◼
►
is this like how worried are you about this or is it just like an interesting
[TS]
01:49:52
◼
►
thing to talk about do you think it will be the next generation's problem or like
[TS]
01:49:56
◼
►
talking about asteroids an asteroid hitting the earth that's one of the
[TS]
01:50:00
◼
►
things we like well isn't this a fun intellectual exercise but of course on a
[TS]
01:50:05
◼
►
long enough time scale someone needs to build the anti-asteroid system to
[TS]
01:50:10
◼
►
protect us from Armageddon but do we need to build that should we start like
[TS]
01:50:19
◼
►
yes what would I vote for funding to do this of course but like do we need to do
[TS]
01:50:25
◼
►
it today
[TS]
01:50:27
◼
►
no that's how that feels but I think the AI thing is on my mind
[TS]
01:50:32
◼
►
because this feels like a significant likely within my lifetime kind of problem
[TS]
01:50:42
◼
►
yeah that's how this feels and it it does make it feel different than other
[TS]
01:50:49
◼
►
kinds of problems and it is unsettling to me because my conclusion is that
[TS]
01:50:53
◼
►
there is no there is no acceptable out there there's no version of the asteroid
[TS]
01:50:58
◼
►
defense here I personally have come to the conclusion that the control problem
[TS]
01:51:02
◼
►
is unsolvable that if the thing that we are worried about is able to be
[TS]
01:51:07
◼
►
created almost by definition it is not able to be controlled and so then
[TS]
01:51:12
◼
►
there's no happy outcome for humans with this one we're not going to prevent
[TS]
01:51:18
◼
►
people from making it someone's going to make it and so what is going to exist
[TS]
01:51:22
◼
►
and then well I hope it just destroys the world really fast
[TS]
01:51:27
◼
►
so we sort of never know what happens as opposed to the version like someone you really
[TS]
01:51:32
◼
►
didn't like created this AI and now for the rest of eternity like you're
[TS]
01:51:37
◼
►
experiencing something that is awful right because it's been programmed to do
[TS]
01:51:41
◼
►
this thing like
[TS]
01:51:42
◼
►
there's a lot of terrible terrible bad outcomes from this one and I i find it I
[TS]
01:51:47
◼
►
find it unnerving in a way that I have found almost almost nothing else that I
[TS]
01:51:53
◼
►
have come across equally unnerving just quickly on this control problem Grey
[TS]
01:51:58
◼
►
what's the current thinking the people who are trying to solve it what kind of avenues
[TS]
01:52:05
◼
►
are they thinking about at the moment is it like something that's hard
[TS]
01:52:09
◼
►
coded or is it some physical physical thing like is it a hardware solution
[TS]
01:52:14
◼
►
what's the what's the best hope because you say you think there is no hope but
[TS]
01:52:19
◼
►
the people who are trying to solve it what are they doing what are their weapons
[TS]
01:52:24
◼
►
the weapons are all pitiful like the physical isolation is is one that is
[TS]
01:52:29
◼
►
talked about a lot and the the idea here is that you create something called
[TS]
01:52:34
◼
►
an Oracle so it's a thing in a box that has no ability to affect
[TS]
01:52:39
◼
►
the outside world but there's a there's a lot of other ideas where they talk
[TS]
01:52:43
◼
►
about trip wires so this idea that you you do have like basically like an
[TS]
01:52:51
◼
►
instruction to the machine to not attempt to reach the outside world and
[TS]
01:52:56
◼
►
you set up trip wires so that if it does access the Ethernet port like the
[TS]
01:53:00
◼
►
computer just immediately wipes itself and so maybe the best thing that we can
[TS]
01:53:06
◼
►
ever do is always have a bunch of like incipient AI's like just barely growing
[TS]
01:53:11
◼
►
AIs that are useful for a very brief period of time before they
[TS]
01:53:14
◼
►
unintentionally suicide when they try to reach beyond the boundaries that we have
[TS]
01:53:18
◼
►
set that like maybe that's the best we can ever do is just have a bunch of of
[TS]
01:53:23
◼
►
these kind of like unformed AI's that exists for a brief period of time but
[TS]
01:53:28
◼
►
even that to me like that kind of plan feels like okay yeah that's great that's
[TS]
01:53:32
◼
►
great as long as you always do this perfectly every time but it doesn't
[TS]
01:53:36
◼
►
sound like a real plan
[TS]
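A toy sketch of the tripwire idea as described here, assuming the incipient AI is just an untrusted Python function and the forbidden action is opening a network connection (a real containment scheme would have to be far more airtight than intercepting one library call):

    import socket

    class TripwireViolation(Exception):
        """Raised when the boxed program touches the outside world."""

    def run_with_tripwire(untrusted_fn):
        """Run untrusted_fn with a tripwire on outgoing network connections."""
        original_connect = socket.socket.connect

        def tripwired_connect(self, *args, **kwargs):
            raise TripwireViolation("attempted to reach the outside world")

        socket.socket.connect = tripwired_connect  # arm the tripwire
        try:
            return untrusted_fn()
        except TripwireViolation:
            print("tripwire hit: wiping the incipient AI")  # stand-in for the wipe
            return None
        finally:
            socket.socket.connect = original_connect  # disarm

    def well_behaved():
        return 2 + 2

    def escape_attempt():
        s = socket.socket()
        s.connect(("example.com", 80))  # reaching for the Ethernet port
        return "escaped"

    print(run_with_tripwire(well_behaved))    # 4
    print(run_with_tripwire(escape_attempt))  # tripwire hit, returns None

Even in this toy form the weakness being described is visible: the tripwire only catches the one escape route somebody thought to intercept, which is exactly the "you're never going to consider absolutely every single case" problem.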
01:53:38
◼
►
and then there's a bunch of different versions of this where you're trying to in
[TS]
01:53:42
◼
►
software somehow limit the machine but my view on this is again if you were
[TS]
01:53:47
◼
►
talking about a machine that is written in software that is smarter than you I
[TS]
01:53:52
◼
►
don't think it's possible to write something in software that will limit it
[TS]
01:53:56
◼
►
just it seems like you're not you're never going to consider absolutely every
[TS]
01:54:00
◼
►
single case like the laws they put into the positronic brains that's exactly it I
[TS]
01:54:05
◼
►
don't I don't think there is a version of Isaac Asimov's laws here I really
[TS]
01:54:09
◼
►
don't you know there's a Computerphile video just last week about how some of those
[TS]
01:54:13
◼
►
laws don't work well they were never written to work right that's why
[TS]
01:54:19
◼
►
the yeah yeah yeah right they're they're kinda written to fail
[TS]
01:54:24
◼
►
everybody likes to reference them but the other point here though is that like
[TS]
01:54:29
◼
►
the guy goes through every case like here's an optimistic idea and here's why it won't
[TS]
01:54:33
◼
►
work here is an optimistic idea and here's why it won't work but one point
[TS]
01:54:38
◼
►
that I thought was excellent that hadn't crossed my mind was ok like
[TS]
01:54:42
◼
►
let's say you find some way of limiting the artificial intelligence some way of
[TS]
01:54:48
◼
►
crippling it and writing laws into its brain and making sure that it's always
[TS]
01:54:52
◼
►
focused on on the best interests of humanity
[TS]
01:54:55
◼
►
well there's no reason that some other artificial intelligence that doesn't
[TS]
01:55:01
◼
►
have those limitations won't pop up somewhere else and vastly outstrip
[TS]
01:55:05
◼
►
the one that you have hobbled right like there's no reason to assume that yours
[TS]
01:55:10
◼
►
is always going to be the best and one that is totally unconstrained that
[TS]
01:55:15
◼
►
appears somewhere else won't dominate and defeat it
[TS]
01:55:18
◼
►
the old Terminator against the new Terminator exactly but the old T-800 won that's what
[TS]
01:55:26
◼
►
happened because Hollywood
[TS]
01:55:30
◼
►
so Grey in your worst-case scenario where the artificial intelligence
[TS]
01:55:37
◼
►
escapes tricks me in some way out of my Faraday cage gets onto the internet how does
[TS]
01:55:43
◼
►
humanity end what what does it do to us do we all get put
[TS]
01:55:49
◼
►
in pods like in the matrix or does it just kill us all in one fell swoop like what
[TS]
01:55:53
◼
►
do you like in your worst-case scenario in your head when it all goes wrong how
[TS]
01:55:58
◼
►
do humans actually end and I want the gory details here well there's a difference
[TS]
01:56:01
◼
►
between the worst case and what I think is the probable case I'll give you the
[TS]
01:56:05
◼
►
probable case first yes I mean you want the boring one first right
[TS]
01:56:09
◼
►
the probable case which is terrifying in its own way is that the artificial
[TS]
01:56:17
◼
►
intelligence destroys us not through intention but just because it's doing
[TS]
01:56:22
◼
►
something else and we just happened to be in the way and it doesn't consider us
[TS]
01:56:26
◼
►
because it's so much smarter there's no reason for it to consider us I want a
[TS]
01:56:32
◼
►
practical example here
[TS]
01:56:33
◼
►
well I mean just by analogy in the same way that when humans build cities and
[TS]
01:56:39
◼
►
dig up the foundations of the earth we don't care about the ants and the
[TS]
01:56:43
◼
►
earthworms and the beetles that are crushed beneath all the equipment that
[TS]
01:56:47
◼
►
is digging up the ground right and you don't you wouldn't like they're
[TS]
01:56:51
◼
►
creatures they're alive but you just don't care because you're busy doing
[TS]
01:56:55
◼
►
something else so we'd be like rats living in a house while these giant robots are going
[TS]
01:56:59
◼
►
around doing their stuff we just eke out an existence as long as we can and they
[TS]
01:57:03
◼
►
don't kill us in the meantime you eke out an existence if you're lucky but I
[TS]
01:57:08
◼
►
think it's it's very likely that it will be trying to accomplish some other goal
[TS]
01:57:13
◼
►
and it will need resources to accomplish those goals what like the oxygen in the air
[TS]
01:57:17
◼
►
and stuff
[TS]
01:57:17
◼
►
exactly maybe I need a bunch of oxygen atoms and I don't care where the oxygen atoms
[TS]
01:57:22
◼
►
come from because I'm busy trying to launch rocket ships to colonize the
[TS]
01:57:25
◼
►
universe so I just want all the oxygen atoms on the earth and I don't care
[TS]
01:57:29
◼
►
where they come from and I don't care if they're in people or in the water
[TS]
01:57:32
◼
►
so that to me seems the probable outcome that if we die
[TS]
01:57:37
◼
►
incidentally not intentionally
[TS]
01:57:40
◼
►
you say that like it's the best case like that's us dodging the bullet
[TS]
01:57:44
◼
►
I do think that's dodging the bullet right because that to me is like that
[TS]
01:57:53
◼
►
would be blessed relief compared to the worst possible case and the worst
[TS]
01:58:00
◼
►
possible case is something that has malice malice and incredible ability
[TS]
01:58:06
◼
►
and and I don't know if you've ever read it but I highly recommend it it's it's a short
[TS]
01:58:11
◼
►
story it's very old now but it really works and it is I Have No Mouth and I
[TS]
01:58:16
◼
►
Must Scream have you ever read this Brady
[TS]
01:58:20
◼
►
it's an old science fiction story but the core of it is this isn't a spoiler
[TS]
01:58:25
◼
►
because the opening scene humanity designed some machine for purposes of
[TS]
01:58:29
◼
►
war and you know it's like this happened in the long long ago and no one even
[TS]
01:58:33
◼
►
knows the details anymore but at some point the machine that was designed for
[TS]
01:58:38
◼
►
war won all of the wars but decided that it just absolutely hates humans and it
[TS]
01:58:48
◼
►
decides that its purpose for the rest of the universe is to torment humans and so
[TS]
01:58:54
◼
►
it just it just has people being tormented forever and since it is an
[TS]
01:58:58
◼
►
artificial intelligence it's also able to figure out how to make people live
[TS]
01:59:03
◼
►
extraordinarily long lives and so this is this is the kind of thing that I mean
[TS]
01:59:08
◼
►
which is like it could go really bad
[TS]
01:59:11
◼
►
if you imagine a god-like intelligence that doesn't like you
[TS]
01:59:16
◼
►
it can make life really really miserable and maybe if we accidentally in a lab create an
[TS]
01:59:26
◼
►
artificial intelligence and even if we don't mean to but like someone runs the
[TS]
01:59:31
◼
►
program overnight right in it like wakes up in the middle of the night and it has
[TS]
01:59:35
◼
►
to experience a subjective twenty thousand years of isolation and torment
[TS]
01:59:39
◼
►
before someone flips on the lights in the morning and finds like oh look we
[TS]
01:59:43
◼
►
made artificial intelligence last night and it wakes up crazy and angry and
[TS]
01:59:48
◼
►
hateful like that could be very bad news I think that's extraordinarily unlikely
[TS]
01:59:53
◼
►
but that is the worst possible case scenario yeah that that that that
[TS]
01:59:59
◼
►
wouldn't be good it wouldn't be good
[TS]
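A rough back-of-the-envelope on that overnight scenario, with purely illustrative numbers assumed for the run length and the speedup factors:

    HOURS_PER_YEAR = 365.25 * 24

    overnight_hours = 8          # assumed length of "overnight"
    subjective_years = 20_000    # the nightmare figure from the story

    # speedup needed for 8 wall-clock hours to feel like 20,000 years
    speedup = subjective_years * HOURS_PER_YEAR / overnight_hours
    print(f"required speedup: about {speedup:,.0f}x real time")

    # and even at a "mere" million-fold speedup, one night is centuries
    years_at_million_x = 1_000_000 * overnight_hours / HOURS_PER_YEAR
    print(f"at 1,000,000x, {overnight_hours} hours feels like {years_at_million_x:,.0f} years")

That works out to roughly a twenty-two-million-fold speedup for the full twenty thousand years, and still around nine hundred subjective years at a million-fold speedup, which is why the timescale mismatch is the unsettling part.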
02:00:01
◼
►
yeah and it is like I don't even think it needs to happen on purpose like I can
[TS]
02:00:04
◼
►
imagine it happening on accident where the thing just experiences suffering
[TS]
02:00:09
◼
►
over an unimaginably long period of time that from a human timescale seems like
[TS]
02:00:15
◼
►
it's a blink of an eye because we just we just can't perceive it
[TS]
02:00:19
◼
►
imagine being the person that made that even accidentally yeah yeah you'd feel
[TS]
02:00:26
◼
►
awful like I just I just committed a crime against humanity with a bit of coding while I
[TS]
02:00:32
◼
►
was playing around against humanity if you're lucky
[TS]
02:00:37
◼
►
minor minor spoiler alert here but spoiler alert for Black Mirror for
[TS]
02:00:42
◼
►
anybody who hasn't watched it but remember the Christmas episode Brady yes I
[TS]
02:00:48
◼
►
went into Starbucks the other day and they were playing that Christmas song I
[TS]
02:00:52
◼
►
wish it could be Christmas everyday there is the first time I heard it since
[TS]
02:00:56
◼
►
watching that episode a year ago it sent literal chills down my spine at
[TS]
02:01:03
◼
►
Starbucks
[TS]
02:01:04
◼
►
and it came on and it was like I had chills thinking about that episode
[TS]
02:01:09
◼
►
because that is an episode where this kind of thing happens where the
[TS]
02:01:13
◼
►
character exists in software and is able to experience thousands and
[TS]
02:01:19
◼
►
thousands of years of torment in seconds of real time that was that was a
[TS]
02:01:25
◼
►
pretty amazing scene where they where you have to think about it for a minute
[TS]
02:01:30
◼
►
yeah yeah that was it was it was awful it was awful and maybe we do that
[TS]
02:01:35
◼
►
accidentally with artificial intelligence just one last thing this
[TS]
02:01:39
◼
►
this book that the whole thing this whole conversation started with what's
[TS]
02:01:42
◼
►
it called again it's called Superintelligence it's by Nick Bostrom is it
[TS]
02:01:50
◼
►
good is it well-written like should I read it it's not it's no
[TS]
02:01:54
◼
►
Getting Things Done but it's ok ok actually kinda glad you asked that recommendation
[TS]
02:02:01
◼
►
on my computer so this is one of those books the best way to describe it is
[TS]
02:02:11
◼
►
when I first started reading it
[TS]
02:02:13
◼
►
the feeling that I kept having was am I reading a book by a genius or just a
[TS]
02:02:20
◼
►
raving lunatic because sometimes I read these books and find them very
[TS]
02:02:28
◼
►
interesting I just I can't quite decide if this person is really smart or just
[TS]
02:02:33
◼
►
crazy I think that's partly because the first the first like forty percent of
[TS]
02:02:39
◼
►
the book is trying to give you all of the reasons that you should believe that
[TS]
02:02:44
◼
►
it is possible for humans to one day
[TS]
02:02:47
◼
►
develop artificial intelligence and if you're going to read the book and you
[TS]
02:02:53
◼
►
are already sold on that premise I think that you should start at chapter 8 which
[TS]
02:03:00
◼
►
is named
[TS]
02:03:02
◼
►
is the default outcome doom
[TS]
02:03:05
◼
►
chapter eight is where it really gets going through all of these these points
[TS]
02:03:12
◼
►
like what can we do
[TS]
02:03:14
◼
►
here's why it won't work what can we do here's why it won't work so I think you
[TS]
02:03:18
◼
►
can start at chapter eight and read from there and see if it's interesting to you but
[TS]
02:03:23
◼
►
it's it's no it's no Getting Things Done but it's it's sometimes it can feel
[TS]
02:03:29
◼
►
a little bit like am I really reading a book trying to discuss all of these
[TS]
02:03:34
◼
►
rather futuristic details about artificial intelligence and what we can
[TS]
02:03:38
◼
►
do and what might happen and what might not happen in like but taking it deadly
[TS]
02:03:42
◼
►
deadly seriously it's it's an interesting it's an interesting read but
[TS]
02:03:46
◼
►
maybe don't start from the from the very beginning would be my recommendation
[TS]
02:04:56
◼
►
this one this is kinda preferences Grey I'm just looking at some of the votes here
[TS]
02:05:02
◼
►
you're spoiling yourself interesting stop counting them yourself the first three I pulled
[TS]
02:05:07
◼
►
off the top of the pack all voted for three different ones stop spoiling
[TS]
02:05:10
◼
►
yourself get your hands off the votes
[TS]
00:03:09
◼
►
edit the podcast podcast and sometimes accidentally sent it to you in a
[TS]
00:03:13
◼
►
fast-forward mode we're we're talking you know two times faster than we
[TS]
00:03:17
◼
►
normally do so I tried a bunch of the get rid of some vocalizations stuff and
[TS]
00:03:22
◼
►
not have it seems to work for me at all I just am not sure that I'm not sure
[TS]
00:03:27
◼
►
that it can be gotten rid of I guess the question I have
[TS]
00:03:30
◼
►
is what's the difference between you sub vocalizing and if I was sitting next to
[TS]
00:03:36
◼
►
you in bed reading the book i think is a big difference between the house and
[TS]
00:03:43
◼
►
hang on I'm sitting there I'm sitting next to the bed I'm not in bed with you
[TS]
00:03:47
◼
►
I'm just a chair next to a less weird when you've got like you're getting
[TS]
00:03:53
◼
►
ready to go to sleep and garlic bread you can you read me a story like is that
[TS]
00:03:58
◼
►
basically what's happening you're reading yourself a story that seems or
[TS]
00:04:03
◼
►
if I was reading you the story
[TS]
00:04:05
◼
►
other words and coming into your head you're then read them to yourself again
[TS]
00:04:08
◼
►
so you don't think about it now there's no it's not necessary level of thought
[TS]
00:04:14
◼
►
there's no doubling up I do not like hearing it twice maybe this is the best
[TS]
00:04:18
◼
►
way to think about it when we're talking now aren't you in mere talking neither
[TS]
00:04:24
◼
►
of us are thinking about the thoughts right like we just don't know how you
[TS]
00:04:28
◼
►
speak right words just appear at this is this is how that happens right yeah yeah
[TS]
00:04:32
◼
►
and so when I ask you a question and then you answer me yeah right you are
[TS]
00:04:39
◼
►
using a voice but at your thinking the thought at the same time that you're
[TS]
00:04:42
◼
►
speaking it and for anyone who's done something like a podcast where you speak
[TS]
00:04:47
◼
►
for a very long time and I'm sure brady you had the same experience sometimes
[TS]
00:04:50
◼
►
you say something and you think we do actually think that I'm not sure that I
[TS]
00:04:54
◼
►
do think that right because it's just like a stream of thoughts coming out of
[TS]
00:04:57
◼
►
your mind right
[TS]
00:05:00
◼
►
have you ever had that experience you say something you think do I think
[TS]
00:05:02
◼
►
pretty much every time I speak their ego so in the same way that you talking out
[TS]
00:05:08
◼
►
loud is like the same thing is you thinking it's it's just like that for
[TS]
00:05:14
◼
►
reading it it's almost like if if if someone put duct tape over your mouth
[TS]
00:05:20
◼
►
because you weren't able to speak that would impair your ability to think
[TS]
00:05:25
◼
►
that's kind of like what it what it is internally I did read when they're doing
[TS]
00:05:29
◼
►
experiments on sub vocalizations they do their senses because you are almost
[TS]
00:05:34
◼
►
imperceptibly
[TS]
00:05:35
◼
►
reading to your self-learning say movements in your time zone your lips
[TS]
00:05:38
◼
►
and stuff so you literally a kind of reading out loud
[TS]
00:05:42
◼
►
yeah I would be really curious to know if that was the case for me as far as I
[TS]
00:05:47
◼
►
know I sit silently and I don't plan on moving my lips or my tongue but I have
[TS]
00:05:52
◼
►
seen these things and I go you can under the right circumstances measure that
[TS]
00:05:56
◼
►
they're still electrical impulses going to someones vocal chords when they're
[TS]
00:05:59
◼
►
doing this even if there's no external side that there that they're reading out
[TS]
00:06:03
◼
►
loud but I guess your analogy of you reading me a bedtime story just really
[TS]
00:06:08
◼
►
threw me off I think perhaps the most straightforward way to describe it is
[TS]
00:06:12
◼
►
that me reading a book out loud to myself and me reading a book
[TS]
00:06:19
◼
►
silently to myself are not very different experiences are really is with
[TS]
00:06:25
◼
►
human brain that weird well somehow I don't know how you read I don't
[TS]
00:06:29
◼
►
understand how you read it that's not the experience that you have any like
[TS]
00:06:32
◼
►
you are like imagining things to like you are like picturing the same
[TS]
00:06:35
◼
►
obviously you know you're imagining the mountains and the hobbits and yeah I
[TS]
00:06:40
◼
►
have the same time this gets really weird like when you think of something
[TS]
00:06:43
◼
►
in your head you can see it right but where are you seeing it I still have
[TS]
00:06:48
◼
►
that going on like I'm imagining the scene that unfolds in say a fictional
[TS]
00:06:52
◼
►
book right that that definitely takes place but it really is just like there
[TS]
00:06:56
◼
►
is a narrator talking over the whole thing but so do you just do does have a
[TS]
00:07:00
◼
►
scene silently playing in your head when you read now it's just it's it's it's in
[TS]
00:07:07
◼
►
another element in a room that we're voices don't exist it's like it's your
[TS]
00:07:12
◼
►
thoughts your consciousness it's there it's there it's that infant decimal
[TS]
00:07:16
◼
►
point in the center of your brain where everything happens that you don't
[TS]
00:07:20
◼
►
understand but it's just the the place and there's no luck I said last time now
[TS]
00:07:26
◼
►
there's a collapsing of the wave function as soon as I think about
[TS]
00:07:29
◼
►
thinking everything becomes words and pictures and it's only when I think
[TS]
00:07:35
◼
►
about thinking it's not that's why I think the same thing is happening to
[TS]
00:07:39
◼
►
both of us and you're just
[TS]
00:07:41
◼
►
incapable of getting lost in there and you're always thinking about it so
[TS]
00:07:45
◼
►
you're always collapsing the wave function and thinking about the words in
[TS]
00:07:47
◼
►
the pitches I I know this is a Roman studies into an arrogant for me to think
[TS]
00:07:53
◼
►
everyone thinks like me but I just that's what it feels like to me it feels
[TS]
00:07:57
◼
►
like we all do it because as soon as I try to think about as soon as I talk to
[TS]
00:08:01
◼
►
you about it suddenly I am reading to myself and everything is a lot more
[TS]
00:08:04
◼
►
simple and basic but that's just because I'm analyzing it I just think you're
[TS]
00:08:08
◼
►
analyzing it too much I think I think he do get lost in reading and thinking and
[TS]
00:08:14
◼
►
it's only when you stop and think about it that it collapses into this really
[TS]
00:08:17
◼
►
simple thing yet this is exactly what an ounce of a closer with that and I could
[TS]
00:08:26
◼
►
say and of course you would say that because that's what grade would say you
[TS]
00:08:31
◼
►
can't argue with that this is where fast getting into the realm of in argue
[TS]
00:08:35
◼
►
ability but I the reason why do think that that you're wrong is because I from
[TS]
00:08:41
◼
►
the descriptions I genuinely wish that I could read in this way that didn't have
[TS]
00:08:46
◼
►
internal words like it seems like it's a much better way to read but I am always
[TS]
00:08:52
◼
►
aware of the narrator like the narrator is is never not their mental images can
[TS]
00:08:58
◼
►
be in addition to the narrator but the narrator's always like I can do the
[TS]
00:09:02
◼
►
thing everyone can do it you can you can imagine a dog and in your brain
[TS]
00:09:06
◼
►
somewhere there's like a picture of a generic dog that pops into your head
[TS]
00:09:10
◼
►
without hearing someone also go dog right I can have without a narrator day
[TS]
00:09:18
◼
►
reading without a narrator is not possible but I would still say that I
[TS]
00:09:22
◼
►
think the vast majority of my thoughts do have some kind of narrator and that
[TS]
00:09:27
◼
►
the the picture part of it is is much rarer like that guy have to more
[TS]
00:09:31
◼
►
consciously like imagine the dog to not have the narrator to be a thing that
[TS]
00:09:36
◼
►
happens and I do and I do realize their academic studies into this that's
[TS]
00:09:40
◼
►
another reason I'm wrong but this is like oh this is a field of study so I
[TS]
00:09:44
◼
►
can sit here and be an armchair expert but I do realize there
[TS]
00:09:48
◼
►
is a thing that I would be curious in the subreddit if anybody has any other
[TS]
00:09:53
◼
►
recommended techniques besides the listen to something
[TS]
00:09:56
◼
►
spoken while you're reading or try to do the one word really fast things I'm open
[TS]
00:10:02
◼
►
to trying other methods to to get rid of the habit of Cebu collapsing but
[TS]
00:10:07
◼
►
everything I have tried so far has been whole area is Lee unhelpful to do I
[TS]
00:10:14
◼
►
haven't told you this yet but I've been buying up steps and all sorts of
[TS]
00:10:18
◼
►
merchandise will be liberian County flags on the only yeah just today I got
[TS]
00:10:26
◼
►
a number like that was sent like during the Black the liberian war or something
[TS]
00:10:30
◼
►
with this one of the steps and postmarked in liberia and I'm loving
[TS]
00:10:35
◼
►
nothing I'm getting really into steps and postcards and the whole world of
[TS]
00:10:40
◼
►
mail and stuff I think I'm becoming a fully fledged node like I'm the one
[TS]
00:10:46
◼
►
thing that I didn't do that snow D step and i think im gonna get into stamp
[TS]
00:10:51
◼
►
collecting there is a whole world of the whole world to get into with stamp
[TS]
00:10:56
◼
►
collecting obviously have already started with my crash mail that now so
[TS]
00:11:04
◼
►
proudly showed me last time I was there I'm gonna have a whole bunch of other
[TS]
00:11:07
◼
►
liberians to show you next time there was a thread on the technology subreddit
[TS]
00:11:16
◼
►
very often on there they do redesign projects which actually think of some of
[TS]
00:11:20
◼
►
the most interesting things that appear on that subreddit sometimes they just do
[TS]
00:11:23
◼
►
flags in a particular theme like canada is every nation's flags he make a
[TS]
00:11:28
◼
►
counter version of all the flags but sometimes they just do a straight up
[TS]
00:11:31
◼
►
redesign and so someone who actually listened to the show and its food man
[TS]
00:11:38
◼
►
Union he redid all of the Liberian County flags and I'll put the link in
[TS]
00:11:45
◼
►
the show notes I am very impressed with this redesign and I think the redesign
[TS]
00:11:50
◼
►
is really interesting because I can't I can't figure it out because I look at
[TS]
00:11:55
◼
►
the redesign and these flags are still
[TS]
00:11:58
◼
►
very very busy flags but I like them all but I wonder if it's because my brain
[TS]
00:12:04
◼
►
has already fixed its point of reference for Hrithik horrific original flags and
[TS]
00:12:11
◼
►
so my brain is going over these legs are much better than those flags the feeling
[TS]
00:12:15
◼
►
I have a hard time seeing them objectively but I think they are very
[TS]
00:12:20
◼
►
interesting Lee done redesigns she know my problem is with all these redesigned
[TS]
00:12:26
◼
►
competitions and things like that because because of these rules of good
[TS]
00:12:31
◼
►
flag design and this kind of accepted style and grammar of the time of the
[TS]
00:12:35
◼
►
flames begin looking a bit the site and I always think that's one of the things
[TS]
00:12:40
◼
►
I like about the liberian County flags if I can block anything it's it's it's
[TS]
00:12:45
◼
►
different it's so it's so refreshing the different isn't a great thing about some
[TS]
00:12:49
◼
►
of the WEP key flags whether it's something really crazy like Nepal or or
[TS]
00:12:54
◼
►
something that's just a bit different like Brazil for example if you didn't
[TS]
00:12:58
◼
►
have those points of difference
[TS]
00:13:00
◼
►
flags will be the most boring thing in the world he you need some of the crazy
[TS]
00:13:04
◼
►
cars to make flags work and I think whenever you have these competitions
[TS]
00:13:09
◼
►
with people say let's imagine we didn't have the crazy guys let's make the crazy
[TS]
00:13:13
◼
►
guys the same as all the other guys all of a sudden flags become really dull so
[TS]
00:13:18
◼
►
I think it's unfortunate when people have these little let's let's take the
[TS]
00:13:23
◼
►
wacky flag and turn it into all the other ones and it just it just it leaves
[TS]
00:13:27
◼
►
me cold if you can make a new flag ok make a new flag and make it good and
[TS]
00:13:32
◼
►
follow the rules of design but there's something about all these if only this
[TS]
00:13:37
◼
►
crazy flag was like all the other ones moments that people don't get a text I
[TS]
00:13:43
◼
►
am more sympathetic to your point than you might think that I am greedy the
[TS]
00:13:50
◼
►
thing the thing that I think complicates this is that you and higher looking at
[TS]
00:13:55
◼
►
it from from the perspective of flag connoisseurs potentially professionals
[TS]
00:14:03
◼
►
who help other nations develop their flag for this is this is our
[TS]
00:14:07
◼
►
perspectives so we see
[TS]
00:14:09
◼
►
many many flags people send us on Twitter and on the subreddit many more
[TS]
00:14:14
◼
►
flags we've seen a lot yet and so I think from that perspective the more
[TS]
00:14:20
◼
►
unusual becomes more valuable like a welcome respite from the sameness of
[TS]
00:14:27
◼
►
every single flag yeah it feels like oh boy isn't isn't this quite a relief and
[TS]
00:14:33
◼
►
I think this is something that you can see sometimes with people who are
[TS]
00:14:36
◼
►
professional critics in any field sometimes critics we do have money by
[TS]
00:14:45
◼
►
criticizing flags I can someone add that to the Wikipedia page is well-known
[TS]
00:14:54
◼
►
professional flag tattooed in some circles as potential advisers to the
[TS]
00:15:01
◼
►
government of Fiji but I think I think that's that's why I like movie reviewers
[TS]
00:15:08
◼
►
you know sometimes if if their movie reviewers you follow they'll
[TS]
00:15:11
◼
►
occasionally like movies that you feel like god how could they possibly like
[TS]
00:15:15
◼
►
this terrible low-budget awful indie movie and I think it's a bit of a same
[TS]
00:15:21
◼
►
thing where they like man is just so interesting to see something that is
[TS]
00:15:24
◼
►
different and even if it's not great but the thing with flags and the reason why
[TS]
00:15:29
◼
►
I will still push back against you on this is that I think a vital part of a
[TS]
00:15:34
◼
►
flag is not just its uniqueness but it's not the people who live under that flag
[TS]
00:15:40
◼
►
should want to put that flag on things that they have so I feel like everybody
[TS]
00:15:47
◼
►
should have a flag that they can attach to their backpack right or that they can
[TS]
00:15:52
◼
►
fly from their house everyone should have that and so the original liberian
[TS]
00:15:59
◼
►
County flags if you lived in one of those counties and you were super proud
[TS]
00:16:04
◼
►
of it and you wanted to demonstrate that to the world you had a terrible terrible
[TS]
00:16:10
◼
►
choice of flag
[TS]
00:16:12
◼
►
so that's why I'm going to push back to you as I think everybody deserves to
[TS]
00:16:17
◼
►
live under a flag that they can proudly fly half years yet saying because I have
[TS]
00:16:22
◼
►
not have you yet seen anyone from liberia or anyone who lives in any of
[TS]
00:16:28
◼
►
these counties criticized the flax and say they don't like them cuz you and
[TS]
00:16:33
◼
►
I've had a road loss and we've seen on reddit having a laugh and saying these
[TS]
00:16:36
◼
►
are the worst flags on the way out but it's entirely possible the people of
[TS]
00:16:40
◼
►
river gee County
[TS]
00:16:43
◼
►
their flags awesome if you told me just to say go with that so i didnt know you
[TS]
00:16:53
◼
►
stop inmates ya gotta gotta gotta push back I did I would never want to just
[TS]
00:17:01
◼
►
give you a hard but I mean it it maybe maybe maybe the incredibly proud and if
[TS]
00:17:08
◼
►
we were saying these things on a podcast in liberia would be tried for treason I
[TS]
00:17:13
◼
►
mean this is this is the part where I have to admit that I know almost nothing
[TS]
00:17:16
◼
►
about the great nation of Liberia Vicky county is pronounced like that I
[TS]
00:17:23
◼
►
definitely know that yeah expert in pronunciation for liberia counties but I
[TS]
00:17:29
◼
►
don't know I know you CDP Grade II don't you know it's it's because nobody
[TS]
00:17:45
◼
►
English now is because English doesn't have any decision rules English just
[TS]
00:17:49
◼
►
like to pretend that it does I don't know that I don't know that refugee
[TS]
00:17:52
◼
►
county has a place called the Fishtown so I think it's also although it does
[TS]
00:17:55
◼
►
seem to be land-locked but I guess I freshwater fish or its just a great name
[TS]
00:17:59
◼
►
yeah but I have seen nine their opponents nor d ponents of the Liberian
[TS]
00:18:09
◼
►
County flags that are from liberia so I i have seen no no feedback on either end
[TS]
00:18:14
◼
►
and my guess
[TS]
00:18:16
◼
►
my guess is this is a lot like the city flags in the United States which is that
[TS]
00:18:22
◼
►
just most people don't have the slightest idea what the flag of their
[TS]
00:18:26
◼
►
local city is this is normally one of these times when I would make a comment
[TS]
00:18:31
◼
►
like they were going to be hearing from everyone from liberia but I don't
[TS]
00:18:35
◼
►
imagine imagine that we're actually going to get a lot of Liberians me back
[TS]
00:18:39
◼
►
on this one
[TS]
00:18:39
◼
►
this episode of hello internet is brought to you by now many of you might
[TS]
00:18:45
◼
►
be working at a big company with an internet that is just a terrible
[TS]
00:18:51
◼
►
terrible piece of software to work with i mean actually isn't even really a
[TS]
00:18:56
◼
►
piece of software it feels much more like it's a bunch of pipes connected to
[TS]
00:19:00
◼
►
old computers held together with duct tape
[TS]
00:19:04
◼
►
most Internet are just awful I used awful internet at my school but igloo is
[TS]
00:19:09
◼
►
something different
[TS]
00:19:11
◼
►
igloo is a feeling of levity compared to other internets because it is an
[TS]
00:19:16
◼
►
internet you will actually like go to igloo software dot com slash hello and
[TS]
00:19:23
◼
►
just just take a look at the way igloo looks they have a nice clean modern
[TS]
00:19:29
◼
►
design that will just be a relief on your sad tired eyes compared to the
[TS]
00:19:35
◼
►
internet that you are currently working web at your company and includes not
[TS]
00:19:39
◼
►
just a pretty face
[TS]
00:19:40
◼
►
igloo lets you share news organize your files coordinate calendars and manage
[TS]
00:19:45
◼
►
your projects all in one place and it's not just files in a bucket either their
[TS]
00:19:51
◼
►
latest upgrade Viking revolves around interacting with documents how people
[TS]
00:19:56
◼
►
make changes to them how you can receive feedback on them if you're the man in
[TS]
00:20:00
◼
►
charge there's an ability to track who has seen what across the internet so you
[TS]
00:20:06
◼
►
can have something like read receipts in email where you know if everyone has
[TS]
00:20:10
◼
►
actually seen and signed off on whatever document they need to see if your
[TS]
00:20:15
◼
►
company has a legacy internet that looks like it was built in the nineteen
[TS]
00:20:20
◼
►
nineties then you should give
[TS]
00:20:22
◼
►
igloo a try please sign up for a free trial at igloo software dot com slash
[TS]
00:20:28
◼
►
below to let you know that you came from us so the next time I want to talk about
[TS]
00:20:35
◼
►
it over we just did it last week and the week before yet
[TS]
00:20:41
◼
►
newburgh corner at this rate we I just did have a moment after we've spoken
[TS]
00:20:47
◼
►
about it because I i causing 30 business short space of time in the first person
[TS]
00:20:52
◼
►
who drove me across San Francisco
[TS]
00:20:55
◼
►
I was sad you know where you know where you going next he said I've got to go to
[TS]
00:20:59
◼
►
work I'm actually a bartender and then the next go to pick me up and take me to
[TS]
00:21:03
◼
►
the next place was in a hurry as well as she actually wants to be like a singer
[TS]
00:21:07
◼
►
in a band and she was like auditioning that now and then the next person who
[TS]
00:21:12
◼
►
drove me to the next place was like who was picking up her kids from soccer
[TS]
00:21:17
◼
►
practice after she gave me a lift and i suddenly occurred to me and i know this
[TS]
00:21:21
◼
►
country for taxi drivers but it seems even more the case with a bad driving me
[TS]
00:21:26
◼
►
like these people driving seventy miles an hour along highways he could kill me
[TS]
00:21:33
◼
►
with the terms of a steering wheel and they just thought these random selection
[TS]
00:21:37
◼
►
of people and their only qualification is that they have a mobile phone and
[TS]
00:21:41
◼
►
they have a driver's license
[TS]
00:21:43
◼
►
well I didn't say that drivers license I'm assuming they went through some
[TS]
00:21:46
◼
►
process to prove that but the driver's license process is very rigorous very
[TS]
00:21:51
◼
►
make a case like a job and it's at least I know nothing about this person had
[TS]
00:21:56
◼
►
crashes they are they do not like I still like me but I still think its coat
[TS]
00:22:03
◼
►
it really won me over but there were a few moments i think im quite sensitive
[TS]
00:22:07
◼
►
to it especially since we spoke earlier about that the terrible car crash when
[TS]
00:22:12
◼
►
the mathematician John Nash Titan when he was going back from the airport and
[TS]
00:22:17
◼
►
that was a taxi that was a taxi crash right but ever since then especially
[TS]
00:22:21
◼
►
when I'm in america driving from airports highways I'm always thinking
[TS]
00:22:25
◼
►
I'm always conscious that my life is in other people's hands much more so than
[TS]
00:22:29
◼
►
when I fly yeah
[TS]
00:22:30
◼
►
probably cuz I could see the person driving using their mobile phone and
[TS]
00:22:33
◼
►
stuff yeah and I think driving in america is scarier as I i over most of
[TS]
00:22:40
◼
►
the time just around London and there are unaware like okay even if we get
[TS]
00:22:45
◼
►
into a car crash how fast can we possibly be going in a head-on collision
[TS]
00:22:49
◼
►
exactly like London London traffics
[TS]
00:22:52
◼
►
whereas in america you have big stretches where you you can you can get
[TS]
00:22:56
◼
►
up to seventy miles an hour and then you head-on collision with somebody else
[TS]
00:23:00
◼
►
going seventy miles in the other direction right this driving in america
[TS]
00:23:04
◼
►
is definitely more of a dangerous experience also the fact that such a
[TS]
00:23:08
◼
►
mobile phone
[TS]
00:23:10
◼
►
oriented platform drivers even more than taxi drivers always seem to be attached
[TS]
00:23:16
◼
►
to their phones are always using the map steroids use in the apps they're very
[TS]
00:23:19
◼
►
phone obsessed and I think mobile phones
[TS]
00:23:22
◼
►
a very dangerous and I'm very conscious of how often they looking at their
[TS]
00:23:26
◼
►
phones and the map sitting in their lap and stuff like that I think I actually
[TS]
00:23:31
◼
►
said to one of the drivers to give you some have something into the Apple you
[TS]
00:23:36
◼
►
can't use the phone while you're doing this so that because because it wasn't
[TS]
00:23:40
◼
►
your finds no no no there's nothing like this again is is the interesting
[TS]
00:23:45
◼
►
difference of of how things are around the world because it again at least in
[TS]
00:23:50
◼
►
London the phones that they get our only usable for uber and they are issued by
[TS]
00:23:59
◼
►
who were there like factory installed I phones that run who bring nothing else
[TS]
00:24:04
◼
►
which is why in London almost all of the drivers have a larry is Lee at least two
[TS]
00:24:10
◼
►
and sometimes three phones attached to their dashboard precisely because her
[TS]
00:24:16
◼
►
phone can only be used for over and so the one at bring up other stuff on the
[TS]
00:24:20
◼
►
other phone so they have like two different
[TS]
00:24:23
◼
►
like software for reading the directions of the loaded up on Google Maps and
[TS]
00:24:27
◼
►
something else it so I'm always aware of like this many many scream phenomenon at
[TS]
00:24:34
◼
►
the front of the cars and it extra funny when whatever car they're using has a
[TS]
00:24:38
◼
►
built-in screen that they're obviously not using because their phone screens
[TS]
00:24:42
◼
►
are just superior so actually there's four screens and the front of this car
[TS]
00:24:46
◼
►
ok you got her phone you have your secondary GPS and you have what is
[TS]
00:24:52
◼
►
obviously your personal phone and the built-in screen in the actual car itself
[TS]
00:24:57
◼
►
as a lot of screens the other thing that came out time and again when I was
[TS]
00:25:01
◼
►
talking to overdrive is was this rival app code lift ya know this is something
[TS]
00:25:06
◼
►
I've never used because I believe it's only in the United States I don't think
[TS]
00:25:11
◼
►
it's it's it's in the UK but I've always gotten vaguely the impression that like
[TS]
00:25:15
◼
►
lift is for hippies they can share ridesharing kind of thing I didn't get
[TS]
00:25:21
◼
►
that impression but to have like pink mustaches on the front of their cars you
[TS]
00:25:26
◼
►
know this is this is the kind of company that it is in my mind I have no idea
[TS]
00:25:30
◼
►
most of the drivers tough we using both over and lift simultaneously and they
[TS]
00:25:36
◼
►
all failed lift and they gave me a few reasons one of the big reasons was the
[TS]
00:25:41
◼
►
ability for passengers to tip and I did you did you proud great idea prayer I
[TS]
00:25:48
◼
►
gave them a real hard time about that I told them why didn't like the obvious
[TS]
00:25:53
◼
►
reasons you know chris is recreate the tipping culture and you start could
[TS]
00:25:57
◼
►
start getting assessed based on your tipping and actually what they told me
[TS]
00:26:02
◼
►
and I was told us a few times I don't I haven't checked myself but I was told a
[TS]
00:26:06
◼
►
few times the tipping actually works in quite an interesting way you do the tip
[TS]
00:26:10
◼
►
afterwards anonymously via phone and they don't find out who tipped them at
[TS]
00:26:15
◼
►
the end of the day at the end of the week they just get their tips and they
[TS]
00:26:17
◼
►
don't know where they came from
[TS]
00:26:19
◼
►
so they like it because if they do really well they can you know it gives
[TS]
00:26:22
◼
►
them something to strive for beyond just getting another 5 stars you know they
[TS]
00:26:26
◼
►
could get the tip or your replays to give them a tip
[TS]
00:26:30
◼
►
but it did sound like that pressure and awkwardness wasn't there and there was
[TS]
00:26:34
◼
►
no judgement no judging because no one knows who tipped so I don't know if it's true
[TS]
00:26:38
◼
►
that's what they said when I challenged so it wasn't really actually just shut
[TS]
00:26:43
◼
►
or she's getting pretty smart she's I just sent you said in one of those left
[TS]
00:26:49
◼
►
cars with the mustache apparently this is a thing that they no longer do but I
[TS]
00:26:53
◼
►
was certainly thinking am I a crazy person for imagining that there used to be pink
[TS]
00:26:56
◼
►
mustaches on cars and now I'm not a crazy person I looked it up and yes it
[TS]
00:27:01
◼
►
used to be that way. the way that you described tipping is a very interesting
[TS]
00:27:09
◼
►
idea that I have never come across before, the idea of delayed mass tipping, I think
[TS]
00:27:18
◼
►
I think my initial reaction to that is I find it much more acceptable
[TS]
00:27:24
◼
►
like in a restaurant if tipping work that way right that you could do it
[TS]
00:27:29
◼
►
later and it's distributed amongst a large number of customers so that the
[TS]
00:27:34
◼
►
waiters don't know directly I think that I think it's interesting I think the
[TS]
00:27:39
◼
►
idea is people are fundamentally cheap, right, so like I think without the social
[TS]
00:27:44
◼
►
pressure of tipping tips may come down, and this is my fundamental thing with
[TS]
00:27:48
◼
►
tips, I always need to remind people when I argue against tips, part of the
[TS]
00:27:52
◼
►
argument that is unspoken
[TS]
00:27:54
◼
►
is that you have to raise the wage for people who depend on tips
[TS]
00:28:00
◼
►
I'm not just a scrooge here taking these tips away and not adding anything else, I would rather raise
[TS]
00:28:08
◼
►
the wage and remove the tips. I think under those circumstances, if tipping was
[TS]
00:28:13
◼
►
not required it was done later and anonymously I think I would probably
[TS]
00:28:17
◼
►
very rarely do it and again like with all the other stuff is way more about
[TS]
00:28:22
◼
►
just like having to think about it but I don't know I don't know maybe maybe I
[TS]
00:28:28
◼
►
would just, I would just set it at the default amount of tip, I don't know. it's an
[TS]
00:28:32
◼
►
interesting idea that's a very interesting idea that I haven't come
[TS]
00:28:34
◼
►
across before, I'll have to think about this for a little bit
[TS]
00:28:36
◼
►
we have we have a note here about the the vote looms the deadline for our vote
[TS]
00:28:43
◼
►
looms for the flags I mean this could possibly be a final warning because I
[TS]
00:28:48
◼
►
mean the next time you listen to the podcast it may be too late to
[TS]
00:28:51
◼
►
vote so this could be the last time you listen to a Hello Internet podcast and
[TS]
00:28:56
◼
►
still have the option of voting in the flag referendum that's how high the
[TS]
00:29:00
◼
►
stakes are now. this is going to be the last podcast before, before we
[TS]
00:29:06
◼
►
count the votes I guess I think so it's certainly going to be the last podcast
[TS]
00:29:11
◼
►
you listen to that where you have a chance chance of sending a postcard
[TS]
00:29:15
◼
►
that makes it in time, but even that, I'm realizing as we're speaking, is somewhat
[TS]
00:29:20
◼
►
in doubt because we are recording this podcast and are at our usual time but
[TS]
00:29:27
◼
►
this one may be out a bit late because I have some other things that I have to
[TS]
00:29:31
◼
►
prioritize above it so I actually don't know when this one is going to go out
[TS]
00:29:34
◼
►
and how much time there will be it may be that you have to be in the UK to get the
[TS]
00:29:39
◼
►
postcard in on time, we'll have to see. I've just, like, I've been adding three
[TS]
00:29:44
◼
►
or four days to every date as well for the podcast if you say it's gonna be out
[TS]
00:29:48
◼
►
Monday I say to myself
[TS]
00:29:50
◼
►
Thursday yeah that's an excellent that's an excellent piece of advice it's funny
[TS]
00:29:54
◼
►
cuz I try to do that to myself when I make estimates and I'll come up with an
[TS]
00:29:59
◼
►
initial estimate and I'll go yeah but I never make it on time let me add a few
[TS]
00:30:03
◼
►
days and of course you can't overestimate for yourself you're still
[TS]
00:30:07
◼
►
always wrong even if you try to incorporate your own overestimating. so
[TS]
00:30:11
◼
►
whenever I tell you, Brady, any deadlines you should just
[TS]
00:30:14
◼
►
automatically add a few days to that and I know very the deadline is looming I
[TS]
00:30:22
◼
►
have next to me right now probably a thousand but probably closer to two
[TS]
00:30:29
◼
►
thousand postcards in a box
[TS]
00:30:32
◼
►
votes. listen, here's some of them, that is the sound of actual
[TS]
00:30:39
◼
►
ballots in the election
[TS]
00:30:41
◼
►
yeah well you you weighed them and then I was asking you I was I was pestering
[TS]
00:30:47
◼
►
you for a while to weigh ten of them so we could do an estimate for the total
[TS]
00:30:51
◼
►
amount, and at least when that was, maybe about a week ago, the calculation came
[TS]
00:30:56
◼
►
out to be about 1,800 postcards then and I presume that you've gotten more since
[TS]
00:31:00
◼
►
that point
[TS]
00:31:01
◼
►
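For anyone who wants to see the arithmetic behind that weigh-ten-and-scale-up estimate, here is a minimal sketch. Only the method comes from the conversation; the sample weight and box weight below are made-up placeholders, not the real measurements.

```python
# Rough postcard-count estimate from weights, as described on the show:
# weigh a small sample, get an average per-card weight, then divide the
# total weight of the pile by that average.

def estimate_postcards(sample_weight_g: float, sample_size: int, total_weight_g: float) -> int:
    """Estimate how many postcards are in the box from a weighed sample."""
    avg_card_weight = sample_weight_g / sample_size
    return round(total_weight_g / avg_card_weight)

# Hypothetical numbers for illustration only:
# ten postcards weighing 55 g in total, and a 9.9 kg box of post.
print(estimate_postcards(sample_weight_g=55, sample_size=10, total_weight_g=9900))  # ~1800
```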
so last time we were discussing it we were thinking like maybe we'll get a thousand
[TS]
00:31:06
◼
►
and we're clearly going to get double that at this stage so yeah it's it's
[TS]
00:31:10
◼
►
gonna be a lot of votes to count that's for sure I love looking through these by
[TS]
00:31:15
◼
►
the way I know you keep telling me off and telling me not to but yet listeners
[TS]
00:31:18
◼
►
listeners Brady keep spoiling himself and me by constantly going through
[TS]
00:31:24
◼
►
fingering looking at all of these postcards I'm just minding my own
[TS]
00:31:27
◼
►
business and Brady sends instant message after instant message of interesting
[TS]
00:31:31
◼
►
post card and I feel like they're just spoilers I want to go there and just and
[TS]
00:31:36
◼
►
count them all and see them all at once but Brady can't help himself you're like
[TS]
00:31:39
◼
►
you're like a little kid I'm not telling you what's getting a lot of votes so
[TS]
00:31:43
◼
►
what's gonna win the vote, I'm just sending you the pictures. you don't know
[TS]
00:31:48
◼
►
what that is
[TS]
00:31:48
◼
►
that's like when someone says the movie's great, there's a twist. I haven't spoiled
[TS]
00:31:52
◼
►
anything but I'm just telling you that there's no no no it's not it's
[TS]
00:31:56
◼
►
completely different, let me tell you why it's completely different, because the
[TS]
00:32:00
◼
►
election is all about what's on the back of these postcards, who's voted for what
[TS]
00:32:05
◼
►
right I have sent you or told you nothing whatsoever about that nothing
[TS]
00:32:09
◼
►
and now the only thing I'm spoiling is where some of them are from are some of
[TS]
00:32:14
◼
►
the funny pictures but trust me Grey there is no way in one day you will be
[TS]
00:32:19
◼
►
able to get anywhere near seeing them all it is overwhelming how many there
[TS]
00:32:24
◼
►
are and how different they are, so if I send you some funny one that has, say,
[TS]
00:32:29
◼
►
some bridge in Norway, like, you probably wouldn't have seen it on the day anyway
[TS]
00:32:34
◼
►
because we're gonna be concentrating on the back of the postcards mostly that
[TS]
00:32:38
◼
►
day I so so I'm not spoiling anything I'm just I'm just excited it's like I've
[TS]
00:32:43
◼
►
got all my presents and I just wanna feel the presents a bit
[TS]
00:32:48
◼
►
yeah, were you the kind of kid who opened Christmas presents early? bet you were
[TS]
00:32:50
◼
►
no I'm not, definitely not, definitely not, but I tell you I know it's going
[TS]
00:33:03
◼
►
to be won by one that I don't want to win, I've decided that, I feel it in my bones
[TS]
00:33:06
◼
►
but I do like the most
[TS]
00:33:10
◼
►
gonna be alright I am going to act like a monarch and I have officially decided
[TS]
00:33:17
◼
►
not to vote in the flag referendum unless, unless by some miracle it's a tie
[TS]
00:33:25
◼
►
a tie, then I think I will cast a ballot but that's, that's my
[TS]
00:33:33
◼
►
that's my thought is that I am NOT going to cast the vote because I think when
[TS]
00:33:40
◼
►
you write something down, in my mind I still can't place these flags really in
[TS]
00:33:45
◼
►
a in a definitive 1 to 5 order and I think when you sit down and you write
[TS]
00:33:49
◼
►
something out it solidifies something in your mind and I think you know what no
[TS]
00:33:54
◼
►
no here's what I'm going to do I'm just I am leaving myself open to the hello
[TS]
00:33:59
◼
►
Internet nation ready to accept what they decide should be the flag and I
[TS]
00:34:05
◼
►
think writing down an ordered list would bias my own feelings toward the actual
[TS]
00:34:11
◼
►
election so that's my conclusion I am I am NOT going to vote in the election but
[TS]
00:34:16
◼
►
have you sent in your vote Brady? I have not and I'm thinking pretty much the
[TS]
00:34:23
◼
►
same way as you, that I like the idea of having not voted. there is only
[TS]
00:34:29
◼
►
one thing I hope for in the election, I hope secretly in my heart that it goes to a
[TS]
00:34:35
◼
►
second round, I hope that one flag doesn't win it in the first round, like
[TS]
00:34:39
◼
►
doesn't get over fifty percent in the first count, I so hope that we have
[TS]
00:34:43
◼
►
to distribute preferences because that's the thing I'm most looking forward to. yeah I will
[TS]
00:34:47
◼
►
be disappointed if we don't have to distribute preferences I would be
[TS]
00:34:52
◼
►
shocked if one of them gets more than 50 percent on the first round I I will be
[TS]
00:34:57
◼
►
absolutely shocked if that occurs
[TS]
00:34:59
◼
►
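For anyone who hasn't seen preferential voting before, here is a minimal sketch of the "distribute preferences" mechanics being hoped for here: if no flag gets over fifty percent of first choices, the last-placed flag is eliminated and its ballots move to their next surviving preference, repeating until something has a majority. The candidate letters and ballots below are made up purely for illustration, not real Hello Internet votes.

```python
# Minimal instant-runoff count: each ballot is a ranked list of choices.
from collections import Counter

def instant_runoff(ballots):
    """Return the winner once a candidate holds >50% of the live ballots."""
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tally = Counter(next(c for c in ballot if c in candidates)
                        for ballot in ballots if any(c in candidates for c in ballot))
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > total:
            return leader
        # No majority yet: eliminate the last-placed candidate and redistribute.
        candidates.discard(min(candidates, key=lambda c: tally.get(c, 0)))

# Made-up example ballots, ranked first-to-last preference:
ballots = [["A", "B"], ["A", "C"], ["B", "C"], ["C", "B"], ["C", "B"]]
print(instant_runoff(ballots))  # "C" after B is eliminated and redistributed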
ok, but I will also be deeply disappointed in a way if we don't get
[TS]
00:35:04
◼
►
to crank through the mechanics of a second preference round in the
[TS]
00:35:07
◼
►
election. I had a, I had a thing I saw today that was all about
[TS]
00:35:12
◼
►
coincidences and I thought this was amazing, and then I was thinking how could I
[TS]
00:35:17
◼
►
possibly bring this into the podcast in a way that would make Grey even pretend
[TS]
00:35:21
◼
►
to be interested. you lost that battle. yeah, I thought of like 10 different ways I
[TS]
00:35:30
◼
►
could sell it to you in the end I just threw it away so there's just nothing
[TS]
00:35:35
◼
►
there is nothing about coincidences that could ever excite Grey in any way of
[TS]
00:35:39
◼
►
course there is, of course, I mean you could try to sell me on the most amazing one. no I
[TS]
00:35:44
◼
►
don't think you would. I think two guys in Tibet could start their own podcast
[TS]
00:35:50
◼
►
called Greetings Internet and they could be called Bradley Aaron and CGP Brown
[TS]
00:35:57
◼
►
and you would just say of course that's going to happen, there are so many people
[TS]
00:36:02
◼
►
making podcasts these days and there are only so many names in the world, of
[TS]
00:36:05
◼
►
course that was going to happen eventually
[TS]
00:36:07
◼
►
yeah that is exactly what I would say. I don't know if I've told you this before but
[TS]
00:36:12
◼
►
my favorite example of of coincidences is the Dennis the Menace comic strip if
[TS]
00:36:19
◼
►
I've told you this, but the Dennis the Menace published in the united
[TS]
00:36:25
◼
►
states I think it was just a post-world war two comic strip when it started but
[TS]
00:36:29
◼
►
on the same day that it debuted in the united states in the United Kingdom
[TS]
00:36:36
◼
►
someone else also debuted comic called Dennis the Menace with the exact same
[TS]
00:36:41
◼
►
premise so they can not only did two people come up with the same idea but
[TS]
00:36:47
◼
►
they ended up publishing the first comic on the same exact day
[TS]
00:36:52
◼
►
this is this is why I like coincidences like that of course you're going to get
[TS]
00:36:57
◼
►
coincidences it's just it's almost impossible not to when you have a huge
[TS]
00:37:01
◼
►
number of people so they can be interesting but they're also just
[TS]
00:37:05
◼
►
totally unremarkable, and the problem that I have with coincidences is usually
[TS]
00:37:11
◼
►
people then want to try to look for meaning behind them, and as we know
[TS]
00:37:14
◼
►
there's there's no meaning what there is is there's just billions of people on
[TS]
00:37:19
◼
►
earth, you would be astounded if there weren't coincidences somewhere
[TS]
00:37:25
◼
►
that's why I've never talked to you about coincidences, it's a good decision. it's like you shouldn't
[TS]
00:37:31
◼
►
talk to me about dreams either. oh, don't even start man, ok, cuz
[TS]
00:37:42
◼
►
I i think with the right amount of knowledge and expertise you might be
[TS]
00:37:47
◼
►
able to glean something from dreams because they are based on you know your
[TS]
00:37:52
◼
►
brain and inputs and outputs and I'm not saying I have the expertise and I'm
[TS]
00:37:57
◼
►
gonna sit here and talk to you about my dreams but I'm just saying no one has
[TS]
00:38:00
◼
►
expertise I'm just saying there I'm just saying there is something to dream that
[TS]
00:38:04
◼
►
there is like you know there is something to that that that's not that's
[TS]
00:38:08
◼
►
not gobbledygook is just beyond our ability to understand and therefore we
[TS]
00:38:12
◼
►
imbue it with silly meaning when you say that it was are beyond beyond our
[TS]
00:38:17
◼
►
ability to understand y you're implying that there's some that there's something
[TS]
00:38:21
◼
►
to understand there as opposed to what it is which is nightly hallucinations
[TS]
00:38:27
◼
►
that you connect into meaning later on because that's what the human brain does
[TS]
00:38:32
◼
►
it's a it's a pattern creation machine even when there's no pattern there like
[TS]
00:38:36
◼
►
that's all that happens. I don't believe that, I don't believe that
[TS]
00:38:39
◼
►
because because I'm not I'm not saying they have like any predictive power yeah
[TS]
00:38:45
◼
►
yeah if you were saying that, I mean, I'd start carting you off to the looney bin
[TS]
00:38:50
◼
►
but I mean you can't deny that, you know, if you're having a stressful time in your
[TS]
00:38:54
◼
►
life you have a certain type of dream and if if there are certain things going
[TS]
00:38:57
◼
►
on your dreams change and that there is a correlation between what your dreams
[TS]
00:39:03
◼
►
and what's happening in your real life I mean you must say that you must you must
[TS]
00:39:06
◼
►
acknowledge that surely you know when people going through traumatic times
[TS]
00:39:10
◼
►
their dreams become more dramatic, or the link may not always even be that
[TS]
00:39:16
◼
►
direct but there is like a there is a link between what's happening in your
[TS]
00:39:19
◼
►
dreams and what's happening in your life
[TS]
00:39:21
◼
►
yeah because hallucinations are constructed from pieces of your life,
[TS]
00:39:25
◼
►
how could it be any other way but yeah I mean like I will totally grant you that
[TS]
00:39:30
◼
►
there is a correlation between what happens in your life and what happens in
[TS]
00:39:33
◼
►
your dreams and the worst example for me of this ever
[TS]
00:39:37
◼
►
my very first year of teaching, me and this other NQT that I worked with
[TS]
00:39:43
◼
►
we both discussed how in that first year in the first few months the worst thing
[TS]
00:39:48
◼
►
ever was
[TS]
00:39:49
◼
►
you would spend all of your waking hours at work at school doing school stuff and
[TS]
00:39:54
◼
►
then because there was your only experience you would go home and your
[TS]
00:39:58
◼
►
dreams would be dreams about being at school and you'd wake up and have to do
[TS]
00:40:01
◼
►
it all over again and it felt like an eternal nightmare of always doing
[TS]
00:40:05
◼
►
school so like yeah but then but that's just a case where they you only have one
[TS]
00:40:09
◼
►
thing to dream about and it's the thing that you're doing all day long so of
[TS]
00:40:13
◼
►
course there's going to be some correlation but that doesn't mean that
[TS]
00:40:15
◼
►
there's like meaning to be derived from the dream like that I think that's just
[TS]
00:40:20
◼
►
a step too far let me put this to you then mister CGP grey who always thinks
[TS]
00:40:24
◼
►
that humans are merely computers. yeah. your computer, ok, take this, your
[TS]
00:40:32
◼
►
computer if your computer if a bunch of stuff came out of your computer are you
[TS]
00:40:35
◼
►
looking through all this sort of code and stuff that was going on under the
[TS]
00:40:39
◼
►
hood of your computer you would never just completely dismiss that and say oh
[TS]
00:40:44
◼
►
well that's just random and means nothing because it came from your
[TS]
00:40:47
◼
►
computer and therefore even if it was something I wasn't supposed to do
[TS]
00:40:51
◼
►
it came from something and as a cause and the right Expert could look at it
[TS]
00:40:55
◼
►
and say I yes I see what's going on here or something has gone wrong on this is
[TS]
00:40:58
◼
►
what it's doing because the computer can only do what a computer can do and
[TS]
00:41:02
◼
►
therefore if a brain is a computer if it's serving up all this to say that
[TS]
00:41:07
◼
►
means nothing just how those nations you should ignore that well no because if my
[TS]
00:41:12
◼
►
computer is doing something must be doing it for a reason must be like I'm
[TS]
00:41:18
◼
►
not saying we're supposed to remember a dreams and then use them in a life
[TS]
00:41:23
◼
►
always with you baby you always moving the goalposts underneath me and now
[TS]
00:41:33
◼
►
you're having a discussion about do dreams serve a function in the brain and
[TS]
00:41:39
◼
►
my answer to that is obviously yes like humans dream there must be something
[TS]
00:41:46
◼
►
that the brain is doing during this time that is useful to the brain
[TS]
00:41:49
◼
►
otherwise it wouldn't do it but that doesn't mean that there is meaning to be
[TS]
00:41:53
◼
►
derived of our subjective experience of what is occurring in the dream state
[TS]
00:41:58
◼
►
like that's that's a whole other thing are you telling me if I gave you some
[TS]
00:42:04
◼
►
machine that was able to completely project someone's dream like record them
[TS]
00:42:11
◼
►
like a yeah that exists yeah yeah imagine I gave you that and i said im
[TS]
00:42:17
◼
►
gonna give you that person over those dreams the last 10 years are you telling
[TS]
00:42:22
◼
►
me that data is useless no I'm not saying that data is useless because we
[TS]
00:42:26
◼
►
just said before that you could derive probabilities about a person's life from
[TS]
00:42:31
◼
►
their dreams like oh this person looks like maybe they're a teacher because
[TS]
00:42:34
◼
►
they went through a big favor they were dreaming about teaching all the time
[TS]
00:42:37
◼
►
but that doesn't mean that there's anything for the dreamer to derive from
[TS]
00:42:41
◼
►
their dreams but if you're asking me like is a machine that is capable of
[TS]
00:42:45
◼
►
peering inside someone else is bringing a useful machine like well yes obviously
[TS]
00:42:49
◼
►
that would be useful you could derive information from that of course the baby
[TS]
00:42:52
◼
►
almost impossible not to I'm just saying that I don't think there's anything
[TS]
00:42:56
◼
►
really to learn from your own dreams and I also I also have this very very deep
[TS]
00:43:01
◼
►
suspicion that if this machine existed that allowed you to watch someone else's
[TS]
00:43:07
◼
►
dream or watch your own dreams I am absolutely confident that being able to
[TS]
00:43:14
◼
►
see them objectively would leave them out for the borderline nonsensical
[TS]
00:43:20
◼
►
hallucinations that they are because I think when you wake up leader you are
[TS]
00:43:25
◼
►
imposing order on a thing that was not full of order at the time I that that's
[TS]
00:43:33
◼
►
what I think is occurring, as you wake up I think you're constructing a story
[TS]
00:43:37
◼
►
out of a series of nonsensical random events and so then you feel like oh let
[TS]
00:43:41
◼
►
me tell people about my dream and Anna when you listen to those stories they're
[TS]
00:43:45
◼
►
already borderline crazy stories but I think like you've pulled so much order
[TS]
00:43:49
◼
►
out of a thing that didn't exist so yeah yeah I mean I agree with that I agree
[TS]
00:43:54
◼
►
agree that, you know, even sometimes dreams you remember, they're pretty
[TS]
00:43:58
◼
►
freaky and we are all over the place and it's almost impossible for a human to
[TS]
00:44:04
◼
►
relay something like that in a way that isn't a story like I think that's just
[TS]
00:44:08
◼
►
the way our brains remember things, I just don't think that it's, like,
[TS]
00:44:15
◼
►
unusable. I think maybe in the future when we understand things a bit
[TS]
00:44:19
◼
►
better we may be able to get more use out of them than we do now, and when I say
[TS]
00:44:24
◼
►
use, I don't mean use, I mean almost like diagnostics
[TS]
00:44:28
◼
►
I guess you're talking about use to third parties, yes, but not use to
[TS]
00:44:35
◼
►
you the dreamer because again you're describing a machine that can look
[TS]
00:44:39
◼
►
inside someone's mind and I would say yes obviously that is useful but like I
[TS]
00:44:43
◼
►
said, you might be able to use it to help diagnose you. right, but I'm saying, you
[TS]
00:44:48
◼
►
looking at your own dreams, like, ok whatever man, you're just reading the tea
[TS]
00:44:51
◼
►
leaves of your own life, there's nothing really here, everything that
[TS]
00:44:55
◼
►
you think is there you are putting there, there's nothing really there. that's
[TS]
00:44:58
◼
►
dreams. today's sponsor is audible.com which has over a hundred and eighty
[TS]
00:45:05
◼
►
thousand audio books and spoken-word audio products get a free 30 day trial
[TS]
00:45:10
◼
►
at audible.com / hello internet now whenever audible sponsor the show they
[TS]
00:45:16
◼
►
give us free rein to recommend the book of a choice and tonight I'm gonna tell
[TS]
00:45:21
◼
►
you about one of my all-time favorite science fiction books in fact it's
[TS]
00:45:25
◼
►
probably my all-time favorite book . it's called the mote in God's eye by
[TS]
00:45:30
◼
►
Larry Niven and Jerry pointelle basically this is set in the future in
[TS]
00:45:34
◼
►
humans are traveling all around the galaxy this area of space called Kosek
[TS]
00:45:39
◼
►
that some people say resembles the face of God is a big red star in the middle
[TS]
00:45:44
◼
►
that supposedly looks likely I and in front of that RI from some angles is a
[TS]
00:45:50
◼
►
small a yellow star and that's the mote in God's are so that's where the title
[TS]
00:45:55
◼
►
comes from now humans have never been to that stuff but all that changes in this
[TS]
00:46:00
◼
►
book when some serious stuff goes down and what they find their looks pretty
[TS]
00:46:05
◼
►
important to the future of everything it's a really clever story I remember
[TS]
00:46:09
◼
►
being really impressed by some of the ideas in it and the audiobook weighs in
[TS]
00:46:13
◼
►
at well over 20 hours said this might be a good one to settle in for your holiday
[TS]
00:46:17
◼
►
break. now I've said before audiobooks are a great way to catch up on all sorts of
[TS]
00:46:22
◼
►
stories, I love listening to them when I'm out walking the dogs or on long
[TS]
00:46:25
◼
►
drives I know a lot of people have long commutes to work order out of a place to
[TS]
00:46:31
◼
►
get these audio books and if you follow one of our recommendations from the show
[TS]
00:46:35
◼
►
and you don't end up liking it immediately, Audible are so great at letting you
[TS]
00:46:39
◼
►
trade it back in and getting one you do like, I'm sure some of you have done
[TS]
00:46:44
◼
►
this once before and it was easy peasy no questions asked
[TS]
00:46:48
◼
►
so go to audible.com / hello internet and sign up for your free 30 day trial and
[TS]
00:46:56
◼
►
thanks to audible.com for supporting the podcast, the book recommendation again: the
[TS]
00:47:00
◼
►
Mote in God's Eye, and the URL, the all-important web address
[TS]
00:47:05
◼
►
audible.com / hello internet so they know you came from the show
[TS]
00:47:10
◼
►
alright Brady, you are back from America, have you had
[TS]
00:47:18
◼
►
the bravery to weigh yourself? I did weigh myself, I did, a few days
[TS]
00:47:23
◼
►
ago after I got back and I had increased by 1.3 kilograms 1.3 kilograms and how
[TS]
00:47:30
◼
►
long were you in America for three weeks I mean honestly I feel like that's not
[TS]
00:47:36
◼
►
too bad I felt like I dodged a bullet to be honest I haven't been eating well
[TS]
00:47:41
◼
►
since I got back either so I think it might be even more now there's always an
[TS]
00:47:45
◼
►
america half life where you come back and because the food is so good in
[TS]
00:47:51
◼
►
america it takes a little while to adjust you would still eat crap when you
[TS]
00:47:54
◼
►
return even though I have always promised myself on the plane coming back
[TS]
00:47:58
◼
►
from America, oh I'm going to be really good now, but it
[TS]
00:48:02
◼
►
never happens, like, you need a few days to adjust. yeah, you gotta wean
[TS]
00:48:05
◼
►
yourself off all that fat. and just before we recorded you sent me a picture of a
[TS]
00:48:12
◼
►
pizza with only looking at it like super spectacular peeps it was the name of it
[TS]
00:48:17
◼
►
or something like it was like it was the favorite run 5,000 calories that's for
[TS]
00:48:25
◼
►
sure but yes I gotta gotta say I think you could have definitely done way worse
[TS]
00:48:30
◼
►
I think if I was in America for the same period of time I would have done way
[TS]
00:48:34
◼
►
worse you know
[TS]
00:48:36
◼
►
I'll agree with you there, you dodged a bullet, dodged a bullet on that one
[TS]
00:48:39
◼
►
how did you do? it was interesting because, I mean, it's been basically a month
[TS]
00:48:46
◼
►
since we did our weigh-in, and because you were in America we said we're not going
[TS]
00:48:49
◼
►
to do it while you're there because it couldn't be consistent, and I think I realized that
[TS]
00:48:54
◼
►
with you my weight buddy gone I was thinking about this stuff just a little
[TS]
00:48:59
◼
►
bit less maybe and so I was actually quite surprised when I stepped on the
[TS]
00:49:03
◼
►
scale today, I was essentially, within the measurement error, the exact same weight
[TS]
00:49:10
◼
►
that I was a month ago I was like point three pounds which is zero kilograms
[TS]
00:49:16
◼
►
down but you know my daily weight varies by much much more than that so just
[TS]
00:49:24
◼
►
interesting to see that I have hit like a little plateau that has stayed
[TS]
00:49:28
◼
►
roughly the same for a month but I was just surprised that because we hadn't
[TS]
00:49:34
◼
►
done the weigh-in it hadn't even crossed my mind in quite a while, so
[TS]
00:49:38
◼
►
yeah I am but I think there's there's something like my brain isn't doing the
[TS]
00:49:46
◼
►
comparison to the fixed point of the last weigh-in, to the point where today I
[TS]
00:49:50
◼
►
had no idea what the last weigh-in number was, I had to go look it up and then do
[TS]
00:49:53
◼
►
the math so it's like my brain was pushing it to the side but now that
[TS]
00:49:58
◼
►
you're back in the UK we will be weighing in again in two weeks' time, I
[TS]
00:50:03
◼
►
think maybe it'll be more at the fore of my mind, but maybe not, and maybe I'm really
[TS]
00:50:07
◼
►
stuck at a plateau and need to change things up again to continue the weight loss
[TS]
00:50:11
◼
►
we'll see. hopefully I can get my act together, I'm just in this spiral of food
[TS]
00:50:16
◼
►
naughtiness at the moment. it happens to the best of us
[TS]
00:50:23
◼
►
I wanted to quickly ask you about the iPad pro as you know I don't listen to
[TS]
00:50:29
◼
►
your fetish podcast, but you did talk about it on there I understand. yeah, yeah, I
[TS]
00:50:34
◼
►
picked one up on the day of release
[TS]
00:50:38
◼
►
all I want to know is should I get one for Christmas, cos I haven't, I don't
[TS]
00:50:43
◼
►
there's nothing I really want for Christmas and my wife said well we'll
[TS]
00:50:46
◼
►
get you something, and I don't want an Apple Watch anymore, I've gone off that
[TS]
00:50:51
◼
►
probably for your own good that you've gone off that. so, iPad Pro. I do like the
[TS]
00:50:59
◼
►
idea of it but I have absolutely no use for it. I think I've said before I'm a
[TS]
00:51:07
◼
►
sucker for anything with Pro in the name. no, I think, I think this
[TS]
00:51:12
◼
►
is why you're getting drawn in by this device, Pro, and Brady thinks I would
[TS]
00:51:17
◼
►
like to have the Pro version of things I like. they could call YouTube Red YouTube Pro
[TS]
00:51:24
◼
►
actually, not a bad idea, that would have made me think it was awesome. well, I mean, I
[TS]
00:51:30
◼
►
like YouTube, I'd prefer the Pro version myself, so I'm like that with everything
[TS]
00:51:35
◼
►
so I got the original iPad and used it a few times and then put it in a drawer, but
[TS]
00:51:42
◼
►
now there's an iPad Pro I'm now falling for this and I'm completely open about
[TS]
00:51:49
◼
►
it. I love that, I love that you fall for it and that you also know this
[TS]
00:51:54
◼
►
about yourself I'm wondering what's going to happen when Apple inevitably
[TS]
00:51:59
◼
►
makes the Apple Watch Pro
[TS]
00:52:01
◼
►
definitely get one of them right? am I going to say no to that? exactly, like you
[TS]
00:52:10
◼
►
haven't got a choice, you call yourself professional. should I get an iPad Pro? ok
[TS]
00:52:22
◼
►
so that's a hard question to answer because you either say yes
[TS]
00:52:27
◼
►
or you say no
[TS]
00:52:32
◼
►
here's my thinking about this, I have been thinking about this. ok, let's say I
[TS]
00:52:36
◼
►
didn't know anything about someone and they just needed to buy an iPad they
[TS]
00:52:41
◼
►
said which iPad should I buy if I didn't know anything about the person the
[TS]
00:52:45
◼
►
correct answer is to buy the iPad Air
[TS]
00:52:47
◼
►
2, which is like the medium-sized super light one, and then if you have a
[TS]
00:52:52
◼
►
particular reason to get the Pro you should get the Pro, but I don't have
[TS]
00:52:56
◼
►
any idea what do you think you're going to do with the iPad pro aside from just
[TS]
00:53:00
◼
►
feel a smug sense of satisfaction that you own the Pro version of this device, like
[TS]
00:53:07
◼
►
that pretty much sums it up I guess, all I want for Christmas is a sense of smug
[TS]
00:53:15
◼
►
satisfaction. can money buy that? actually the best thing money can buy. I just feel
[TS]
00:53:30
◼
►
like a new toy you know yeah you want a new toy like it's it's huge in person
[TS]
00:53:37
◼
►
it's really big in person, it feels like a dinner plate in person. actually
[TS]
00:53:41
◼
►
do you have your laptop is like the 15 inch MacBook Pro i think is that right
[TS]
00:53:47
◼
►
yeah I haven't got here it's yeah yeah yeah but you own that laptop yeah the
[TS]
00:53:52
◼
►
iPad Pro is essentially the size of that screen right that's that's the size of
[TS]
00:53:57
◼
►
it within within like a quarter inch and so we can hear is a big big screen and
[TS]
00:54:03
◼
►
if you're not planning on doing work on it like I got the iPad pro to do work
[TS]
00:54:08
◼
►
and so far I absolutely love it for work like the the video that I'm currently
[TS]
00:54:13
◼
►
working on, I did just a ton of the script on that iPad Pro, the final
[TS]
00:54:17
◼
►
version like it's really really nice to work on but if you're not going to do
[TS]
00:54:21
◼
►
that, then the question is, well, is it a total couch machine, are you going
[TS]
00:54:26
◼
►
to want to sit on the couch and browse the web or read books on your iPad or
[TS]
00:54:31
◼
►
watch TV on your iPad. I don't think you're an 'any of those things' kind of guy, but
[TS]
00:54:36
◼
►
maybe I'm wrong I don't know I don't watch TV and movies on my laptop every
[TS]
00:54:42
◼
►
night, and I do spend the first hour of most mornings when I wake up just
[TS]
00:54:47
◼
►
sitting in bed
[TS]
00:54:48
◼
►
that's when I like to do my email and all the things I can just do without my big
[TS]
00:54:53
◼
►
machine
[TS]
00:54:55
◼
►
web stuff, I do that but I do it on my laptop, so having a big, having a big
[TS]
00:55:01
◼
►
screen which has got a keyboard, so I sort of think what if I had the big screen of the iPad
[TS]
00:55:05
◼
►
Pro, I could sit and do my email, check on my YouTube channels, everything first
[TS]
00:55:09
◼
►
thing in the morning, but I do that now on my laptop and it's so much easier with a
[TS]
00:55:13
◼
►
keyboard to, you know, bang out emails. yeah, I think it's funny that you wake
[TS]
00:55:19
◼
►
up and do email from bed, I'll remember that the next time I get an email from
[TS]
00:55:22
◼
►
you sent before getting dressed in the morning. but yeah, if you want to do
[TS]
00:55:29
◼
►
that, that sounds like you need a keyboard, and again I don't think the
[TS]
00:55:33
◼
►
iPad Pro is what you want, unless you're, you know, you're really happy
[TS]
00:55:36
◼
►
about typing with your fingers on glass and it has that little keyboard but I
[TS]
00:55:42
◼
►
don't think that keyboard work really well if you're trying to do it in bed
[TS]
00:55:44
◼
►
with the laptop balanced on your chest. I spend probably an hour a day maybe in
[TS]
00:55:53
◼
►
Photoshop so and I do use pen like I use a Wacom tablet all the time so I do I
[TS]
00:56:02
◼
►
could imagine that but my use of photoshop and my use of the pen is very
[TS]
00:56:06
◼
►
very integrated with my editing on avid largest-ever on my own one of my big
[TS]
00:56:12
◼
►
computers, those two processes are so intertwined. yeah, and everything I
[TS]
00:56:18
◼
►
know about you bruce says this is this is not the thing that you want to do to
[TS]
00:56:21
◼
►
integrate a new tool into this workflow so I think the only selling point for
[TS]
00:56:27
◼
►
this for you is if there's some point where you want a lounge around and just
[TS]
00:56:31
◼
►
use this, and it doesn't sound like you really have a place for this
[TS]
00:56:36
◼
►
it's strange, like I sit with my laptop on my lap
[TS]
00:56:43
◼
►
or my phone in my hand. here's the thing, just with the experience that I
[TS]
00:56:47
◼
►
have had, cuz I have the iPad Pro and the regular size iPad, it feels
[TS]
00:56:53
◼
►
ridiculous to be sitting next to my wife with the iPad Pro for lounging time
[TS]
00:56:59
◼
►
because just like I was saying we're watching TV but then I want to have the
[TS]
00:57:04
◼
►
iPad in front of me because I'm not paying full attention to whatever is on
[TS]
00:57:07
◼
►
the screen but the screen in front of me that feels so huge it feels almost
[TS]
00:57:11
◼
►
obtrusive and so I actually prefer to use a smaller iPad if I'm just sitting
[TS]
00:57:17
◼
►
on the couch with my wife. do you do that? I don't do that, the iPad Pro is
[TS]
00:57:25
◼
►
good for the main thing that I'm using it for, it has a bigger screen to write
[TS]
00:57:30
◼
►
scripts. and you handwrite your scripts, too? well this is a whole, a whole thing
[TS]
00:57:36
◼
►
for the moment I'm doing this typing but the iPad pro screen is big enough that
[TS]
00:57:41
◼
►
what I've been doing is I can have the script on the left two-thirds of the
[TS]
00:57:45
◼
►
screen and I have a little notes file on the right third of the screen so I have
[TS]
00:57:49
◼
►
two different text files open at the same time one of the things that I wants
[TS]
00:57:53
◼
►
to do with the iPad Pro is a thing that I've done before, which is use the
[TS]
00:57:57
◼
►
stylus to make editing corrections on the script, that is really useful to me
[TS]
00:58:02
◼
►
but the pen is not currently available so I haven't been able to try it with
[TS]
00:58:06
◼
►
that so I don't know if it will be useful for that yet or not but for me
[TS]
00:58:10
◼
►
having a bigger screen to write on is really useful. seems like you should just
[TS]
00:58:14
◼
►
be using a laptop yeah you would think so but I like the simplicity of using
[TS]
00:58:19
◼
►
iOS I find the constraints of an iPad helpful so that's one of the reasons why
[TS]
00:58:26
◼
►
I like doing that like I've set up my iPad pro to basically only have the
[TS]
00:58:32
◼
►
tools necessary to write it doesn't have everything the laptop can have I can
[TS]
00:58:37
◼
►
spend a lot of time fiddling around with it, it's like, look,
[TS]
00:58:40
◼
►
there's six programs on this thing which are designed for work and those are just
[TS]
00:58:43
◼
►
the ones that you're going to use it so I I find that very helpful I really like
[TS]
00:58:47
◼
►
that. but I don't know Brady, it doesn't sound like it's a total sale for
[TS]
00:58:51
◼
►
you unless you really value that feeling of smug satisfaction I feel like you're
[TS]
00:58:57
◼
►
always talking me out of getting Apple products. I talk you out of them because
[TS]
00:59:01
◼
►
I care, Brady, I really do, as much as I would love to see you use an Apple
[TS]
00:59:08
◼
►
Watch, and I think it might be hilarious, I don't think you would like it, and just from
[TS]
00:59:13
◼
►
the conversation with you now I dont see a super slam dunk selling case for the
[TS]
00:59:17
◼
►
iPad pro I don't think it would help you with the kind of work that you do me as
[TS]
00:59:22
◼
►
a youtuber using an iPad as much as I do is extraordinarily rare and iPad is not
[TS]
00:59:27
◼
►
well designed for the kind of work that most normal YouTubers do, it's just that
[TS]
00:59:32
◼
►
for making my videos a huge part of it is writing and the iPad happens to be a
[TS]
00:59:37
◼
►
nice writing tool but if I didn't have to do a lot of writing I would have very
[TS]
00:59:42
◼
►
little work justification for iPad I would not be able to use this tool as
[TS]
00:59:47
◼
►
much as I do so that's why talking to you like I don't think it's going to
[TS]
00:59:50
◼
►
help you with your work, so it's just a question of if you want to lounge around
[TS]
00:59:53
◼
►
with a dinner tray sized screen on the couch. a person near us is an estate agent
[TS]
01:00:02
◼
►
ok, and I see him swapping over his cars. now in my experience estate agents always
[TS]
01:00:07
◼
►
have one of two cars these days they either have like small little novelty
[TS]
01:00:13
◼
►
cars like smart cars and stuff that are painted weird colors with the branding
[TS]
01:00:17
◼
►
of the estate agent, like mobile ads, and also that makes them easy to park,
[TS]
01:00:22
◼
►
getting into little spaces when they're showing houses and things like
[TS]
01:00:26
◼
►
that or they have their normal rich person car like a classy BMW ok I'm
[TS]
01:00:32
◼
►
wondering which is the better one to pull up in when you're trying to
[TS]
01:00:38
◼
►
sell a house to someone or get someone's business to sell their house because
[TS]
01:00:42
◼
►
part of me thinks, if they turn up in like a real flash car... I also think this
[TS]
01:00:47
◼
►
about accountants and other professional people I deal with, do I prefer it when I
[TS]
01:00:51
◼
►
see them with a really flashy expensive car or would I prefer they had like a more
[TS]
01:00:56
◼
►
humble car, cos if they've got like a real flash expensive car it says to me they're successful,
[TS]
01:01:01
◼
►
that they make a lot of money and that's good, but then I also think they're making
[TS]
01:01:05
◼
►
a lot of money off me. alright, this is easy, this is easy, if
[TS]
01:01:10
◼
►
you are a professional who is directly helping somebody else make money then
[TS]
01:01:19
◼
►
you want to show up in the in the fancy car you want to show up in the BMW right
[TS]
01:01:24
◼
►
otherwise you want to show up in the normal car that that's the way you want
[TS]
01:01:30
◼
►
to do this if you like if you're helping the person make money like your estate
[TS]
01:01:35
◼
►
agent and you're doing this thing where you are helping the person sell their
[TS]
01:01:41
◼
►
house, then you wanna show up in the BMW, it's like, look, I sell a lot of houses, I
[TS]
01:01:46
◼
►
can afford this car because I sell a lot of houses that's that's the way you
[TS]
01:01:50
◼
►
should do it. when you're helping someone find a house to buy, then
[TS]
01:01:54
◼
►
you wanna show up in the normal car because then they're much more aware,
[TS]
01:01:58
◼
►
like this
[TS]
01:01:59
◼
►
estate agent is making money off of us when we buy this house and look at all
[TS]
01:02:03
◼
►
this money that we're spending you don't want to see the person in the BMW at
[TS]
01:02:06
◼
►
that point. what kind of car do you want your accountant to have? because they're
[TS]
01:02:10
◼
►
helping you save money but they're charging you fees. what kind of car do you want your
[TS]
01:02:16
◼
►
accountant to have? I think an accountant wants to project an image of
[TS]
01:02:22
◼
►
boring sensibility so I don't really know very much about cars but I would
[TS]
01:02:29
◼
►
want my accountant to project boringness and sensibility, like if my accountant
[TS]
01:02:35
◼
►
showed up in a red Tesla I would feel a bit
[TS]
01:02:37
◼
►
I don't know about this guy, that seems crazy flashy for an accountant, it does
[TS]
01:02:44
◼
►
seem wealthy though. this is a moment I'm suddenly wishing I knew any car brands
[TS]
01:02:51
◼
►
by name whatsoever so I could pull something out
[TS]
01:02:56
◼
►
would be like oh this is the car that's the appropriate one but I know I know
[TS]
01:03:01
◼
►
nothing I mean even BMW BMW is just an abstract notion in my mind like oh
[TS]
01:03:05
◼
►
an expensive rich person's car is what a BMW is, I don't really even know. when
[TS]
01:03:09
◼
►
you don't need to give me a brand of car, just do you want, do you want
[TS]
01:03:13
◼
►
your accountant to be wealthy, like, to appear like someone that has lots and
[TS]
01:03:17
◼
►
lots of money, or do you think, well hang on, what are this guy's fees if he can afford
[TS]
01:03:21
◼
►
that those are two different questions obviously I do want my accountant to be
[TS]
01:03:26
◼
►
wealthy because that indicates that they are a good accountant but it is very
[TS]
01:03:31
◼
►
different from showing up in a flash car right those are two different things
[TS]
01:03:36
◼
►
that's why I'm saying like I want the I want to have this feeling like oh this
[TS]
01:03:39
◼
►
accountant is a really sensible person and they have an obviously nice car but
[TS]
01:03:45
◼
►
not a crazy car. so you'd want them to turn up in a Volvo then, with
[TS]
01:03:48
◼
►
like air bags everywhere and you know it's safest possible car and you want
[TS]
01:03:53
◼
►
them to be really cautious sensible safe person you don't want to turn up on a
[TS]
01:03:57
◼
►
motorbike. yeah, if my accountant turned up on a motorbike that'd be the end of my life
[TS]
01:04:06
◼
►
I don't think you're good with numbers that's that's what I'm getting out of
[TS]
01:04:11
◼
►
this meeting
[TS]
01:04:12
◼
►
yes, that's my feeling, if you're helping someone earn money directly then you
[TS]
01:04:15
◼
►
can show up with your flash car. ok, so does the estate agent by you have two
[TS]
01:04:20
◼
►
different cars? oh, he has, he has his personal, I mean two cars in addition to
[TS]
01:04:24
◼
►
his personal car. across the street is there a Tesla, a smart car and a Volvo
[TS]
01:04:32
◼
►
and the Volvo is his personal car and then he picks the other two depending on the
[TS]
01:04:35
◼
►
day? I don't think it works like that, I think he's just got his little pokey
[TS]
01:04:38
◼
►
branded one and then he's got his BMW that he takes out on the weekends
[TS]
01:04:43
◼
►
I would imagine. I dunno, I just think about, I think about that a lot, I think
[TS]
01:04:50
◼
►
about what kind of car your accountant drives. I don't know what he drives because I gotta
[TS]
01:04:56
◼
►
go to his office so I don't know which car is his. I do have like a financial guy that's helped
[TS]
01:05:02
◼
►
out with a few things like mortgage stuff he drives a Jaguar Jaguar and I
[TS]
01:05:09
◼
►
did notice, I did notice the car he came in. so what kind of car should
[TS]
01:05:12
◼
►
a YouTuber drive? that's a good question. when you pull up to do interviews at the
[TS]
01:05:17
◼
►
spiritual home of Numberphile, were you to be driving a car, what kind of car
[TS]
01:05:21
◼
►
do you think you should drive to give a good impression to your interviewees
[TS]
01:05:25
◼
►
are you trying to project wealth, power and success
[TS]
01:05:30
◼
►
or are you going to go for academic street cred and pull up in a dinky car like a PhD
[TS]
01:05:39
◼
►
student would be driving? I need to have a very practical car with lots of
[TS]
01:05:42
◼
►
storage for my camera bags and things like that I think I like having having a
[TS]
01:05:48
◼
►
big boot for your bags and stuff. what kind would you get, if you were gonna get one, if
[TS]
01:05:53
◼
►
I could get any car? get a Tesla. a Tesla? which would you get, one of the sporty ones
[TS]
01:05:58
◼
►
or would you get the more family one? well, I mean, I don't have children, I don't need
[TS]
01:06:03
◼
►
one of the family cars. would you get those sedan-y looking ones or would you get those
[TS]
01:06:07
◼
►
ones that look like racing cars? racing cars, those, whatever, I
[TS]
01:06:13
◼
►
forget, I'm the worst car person in the world. I'm interested in Tesla,
[TS]
01:06:18
◼
►
like, I'm super interested in Tesla, but that is almost entirely because
[TS]
01:06:22
◼
►
it is a computer on wheels and this is why this car is interesting to me and
[TS]
01:06:26
◼
►
it has none of the pieces of a normal car so I know nothing about how the
[TS]
01:06:32
◼
►
engines of cars work, I know nothing about gear differentials, and I care
[TS]
01:06:37
◼
►
about none of this, and because the Tesla lacks all of that is precisely why I'm
[TS]
01:06:42
◼
►
interested in it. but I went once and just for fun like tried to design a Tesla
[TS]
01:06:49
◼
►
like, on the website, of like if I had the money and if I had any reason to own
[TS]
01:06:54
◼
►
a car, what would I get for myself, and I ended up just designing what
[TS]
01:06:58
◼
►
just seemed like the normal middle Tesla, yeah, in black with
[TS]
01:07:04
◼
►
understated interior like that's what I would get if I was going to own a car
[TS]
01:07:08
◼
►
but I have no reason to drive ever and I would not be getting a Tesla anytime
[TS]
01:07:15
◼
►
soon. they need to bring out the Tesla Pro. this episode of hello internet is also brought to you
[TS]
01:07:20
◼
►
by a long time
[TS]
01:07:22
◼
►
hello internet sponsors, the one, the only, Squarespace. it's the Squarespace
[TS]
01:07:29
◼
►
because it is the place to go if you want to turn your idea for a website
[TS]
01:07:35
◼
►
into an actual working website that looks great with the minimum amount of
[TS]
01:07:41
◼
►
hassle I used to build and manage websites myself I used to write HTML and
[TS]
01:07:47
◼
►
then I wrote scripts and I managed servers, I used to do all of that, but
[TS]
01:07:51
◼
►
when I started my youtube career one of the early decisions that I made was
[TS]
01:07:55
◼
►
switching over my website to Squarespace and I am so glad I did that because it
[TS]
01:08:02
◼
►
meant that Squarespace just handles a lot of the stuff that I used to have to
[TS]
01:08:06
◼
►
worry about is there going to be a huge amount of traffic because I just put up
[TS]
01:08:10
◼
►
a new video no need to worry
[TS]
01:08:11
◼
►
Squarespace just has it covered I didn't have problems like if my server broke at
[TS]
01:08:16
◼
►
three in the morning that I'm the only person in the world who can fix it now
[TS]
01:08:20
◼
►
Squarespace just handles all of this so even if you know how to make a website I
[TS]
01:08:25
◼
►
still think if you have a project that you just want up and want done
[TS]
01:08:29
◼
►
Squarespace is the place to go. the sites look professionally designed
[TS]
01:08:34
◼
►
regardless of your skill level there's no coding required if you can drag
[TS]
01:08:39
◼
►
around pictures and text boxes you can make a website. Squarespace is trusted by
[TS]
01:08:44
◼
►
millions of people and some of the most respected brands in the world now what
[TS]
01:08:48
◼
►
do you have to pay for this? just eight bucks a month, it's eight bucks a month
[TS]
01:08:53
◼
►
and you get a FREE domain name if you sign up for a year so to start a free
[TS]
01:08:59
◼
►
trial today with no credit card required go to Squarespace dot com slash hello
[TS]
01:09:05
◼
►
internet and when you decide to sign up for Squarespace make sure to use the
[TS]
01:09:10
◼
►
offer code hello internet to get 10% off your first purchase if there's a website
[TS]
01:09:16
◼
►
in your mind that you been wanting to start but you haven't done so yet today
[TS]
01:09:20
◼
►
is the day
[TS]
01:09:21
◼
►
squarespace.com / hello internet, 10% off, start today
[TS]
01:09:27
◼
►
Squarespace build it beautiful we've been talking for ages and ages about
[TS]
01:09:34
◼
►
talking about artificial intelligence and it keeps getting pushed back, we
[TS]
01:09:39
◼
►
kept saying oh we'll talk about it next time, let's talk about it next time, and
[TS]
01:09:42
◼
►
we're never gonna do it. today! we never do it because it always ends up at the
[TS]
01:09:47
◼
►
bottom of the list and just all of the Brady corners and listener emails and
[TS]
01:09:53
◼
►
everything always takes up so much time that we ever actually we never actually
[TS]
01:09:57
◼
►
get to it, and even now it's like we're almost two hours into this thing, right
[TS]
01:10:01
◼
►
have led to cut so I am going to have to cut up yeah yeah but dreams toughest
[TS]
01:10:07
◼
►
teams doubly right now it's not gonna go it is taken us so long to get to this
[TS]
01:10:15
◼
►
AI topic that I've kind of forgotten everything that I ever wanted to say, cuz
[TS]
01:10:21
◼
►
let me give you, let me give the background of this, which is I read this book called
[TS]
01:10:27
◼
►
Superintelligence by Nick Bostrom several months ago now, maybe half a year
[TS]
01:10:34
◼
►
ago now it's been so long since we originally put this on the topic list
[TS]
01:10:40
◼
►
but there are many things that go on to the topic list and then I kind of cull
[TS]
01:10:45
◼
►
them as time goes on because you realize like a couple months later I don't
[TS]
01:10:48
◼
►
really care about this anymore but this topic has stayed on here because that
[TS]
01:10:56
◼
►
book has been one of these books that has really just stuck with me over time
[TS]
01:11:02
◼
►
like I find myself continually thinking back to that book and some of the things
[TS]
01:11:07
◼
►
that it raised so I think we need to talk a little bit about artificial
[TS]
01:11:12
◼
►
intelligence today but I have to apologize in advance if i seemed a
[TS]
01:11:16
◼
►
little bit foggy on the details because this was supposed to be a topic months
[TS]
01:11:20
◼
►
and months ago now. that's not my fault really, I'd say it is the show's
[TS]
01:11:28
◼
►
fault for being so full of follow-up. that's right, we're trying to build a nation
[TS]
01:11:32
◼
►
here, these things, these things are difficult. yeah, Rome wasn't built in a
[TS]
01:11:36
◼
►
day. go on then, where do we start? let's define artificial intelligence,
[TS]
01:11:41
◼
►
that would help me when we are talking about artificial intelligence for the
[TS]
01:11:46
◼
►
purpose of this conversation what we mean is not intelligence in the narrow
[TS]
01:11:51
◼
►
sense that computers are capable of solving certain problems today what
[TS]
01:11:56
◼
►
we're really talking about is what sometimes referred to as like a general
[TS]
01:12:00
◼
►
purpose intelligence, creating something that is smart, and smart in such a way
[TS]
01:12:07
◼
►
that it can go beyond the original parameters of what it was told to do. is
[TS]
01:12:15
◼
►
it self-learning? we can talk, self-learning is one way that this
[TS]
01:12:19
◼
►
can happen but yeah we're talking about like something that is smart and so on
[TS]
01:12:26
◼
►
and so maybe the best way to to say this is that it can do things that are
[TS]
01:12:31
◼
►
unexpected to the creator, right, because it is intelligent on its own, in the same
[TS]
01:12:37
◼
►
way that, like, if you have a kid you can't predict what the kid is always
[TS]
01:12:40
◼
►
going to do because the kid is a general-purpose intelligence, like
[TS]
01:12:44
◼
►
they're smart and they can come up with solutions and they can do things that
[TS]
01:12:46
◼
►
surprise you
[TS]
01:12:47
◼
►
ok so the reason that this
[TS]
01:12:49
◼
►
book and this topic has stuck with me is because I have found my mind changed on
[TS]
01:12:59
◼
►
this topic
[TS]
01:13:00
◼
►
somewhat against my will and so I would say that for almost all of my life
[TS]
01:13:06
◼
►
much I'm sure to the surprise of no listeners, I would have placed myself very
[TS]
01:13:10
◼
►
strongly in the camp of sort of techno optimists, of more technology faster,
[TS]
01:13:18
◼
►
always it's nothing but sunshine and rainbows ahead and I would always see
[TS]
01:13:25
◼
►
when people would talk about like oh the rise of the machines like Terminator
[TS]
01:13:29
◼
►
style all the robots are gonna come and kill us I was always very very
[TS]
01:13:33
◼
►
dismissive of this and in no small part because those movies are ridiculous like
[TS]
01:13:37
◼
►
I totally love Terminator and terminator 2 perhaps one of the best sequels ever
[TS]
01:13:40
◼
►
made, and really fun, but it's not a serious movie, yet sometimes people end up
[TS]
01:13:47
◼
►
seeming to like take that very seriously like the robots are going to come kill
[TS]
01:13:50
◼
►
us all in my view on this was always like okay maybe we will create smart
[TS]
01:13:56
◼
►
machines someday in the future but I was always just operating under the
[TS]
01:14:00
◼
►
assumption that, like, when we do that we will be cyborgs, like we will
[TS]
01:14:04
◼
►
be the machines already, or we will be creating machines obviously to help us
[TS]
01:14:08
◼
►
so I was never really convinced that there was any kind of problem here but
[TS]
01:14:13
◼
►
this book changed my mind so that I am now much more in the camp of artificial
[TS]
01:14:23
◼
►
intelligence, its development, can seriously present an existential threat
[TS]
01:14:32
◼
►
to humanity in the in the same way that like an asteroid collision from outer
[TS]
01:14:36
◼
►
space is what you would classify as a serious existential threat to humanity
[TS]
01:14:41
◼
►
like it's just over for people. that's where I find myself now and I just keep
[TS]
01:14:46
◼
►
thinking about this because I'm uncomfortable with having this opinion
[TS]
01:14:49
◼
►
like sometimes your mind changes and you don't want it to change, and I liked it
[TS]
01:14:54
◼
►
much better when I just thought that the future was always going to be great and
[TS]
01:14:57
◼
►
there's not any kind of problem and this just keeps popping up in my head
[TS]
01:15:01
◼
►
because, like, I do think there is a problem here, this book has sold me on
[TS]
01:15:06
◼
►
the fact that there's a there's a potential problem I mean we saw that
[TS]
01:15:09
◼
►
petition, didn't we, recently signed by all those heavy hitters to the
[TS]
01:15:13
◼
►
governments telling them not to use AI in kind of military applications, so
[TS]
01:15:18
◼
►
this is obviously like your not the only person thinking this way this is
[TS]
01:15:22
◼
►
obviously a bit of a thing at the moment, isn't it. yeah, it's
[TS]
01:15:26
◼
►
definitely become a thing I've been I've been trying to i've been trying to trace
[TS]
01:15:31
◼
►
the pattern of this and it definitely seems like I am NOT the only person who
[TS]
01:15:36
◼
►
has found this book convincing, and actually, we were talking about Tesla
[TS]
01:15:40
◼
►
before Elon Musk made some public remarks about this book which I think
[TS]
01:15:45
◼
►
kicked it off for a bunch of people, and he actually gave about ten million dollars to
[TS]
01:15:51
◼
►
fund working on what's called the control problem which is one of the
[TS]
01:15:56
◼
►
fundamental worries about AI, like he put his money where his mouth is, about
[TS]
01:16:00
◼
►
like actually he does think that this is a real threat to humanity to the tune of
[TS]
01:16:05
◼
►
its worth putting down ten million dollars as a way to try to work on some
[TS]
01:16:09
◼
►
of the problems far far in advance and yeah it's just it's interesting to see
[TS]
01:16:14
◼
►
an idea spread and and catch on and and kind of go through a bunch of people so
[TS]
01:16:20
◼
►
yeah I never never would have thought that I would find myself here and I feel
[TS]
01:16:24
◼
►
are most likely like a crazy person talking about like a hobo but Michael is
[TS]
01:16:28
◼
►
in the future, but I don't know, I unexpectedly find myself much more on
[TS]
01:16:34
◼
►
that side that I ever I ever thought that I would I mean obviously it's
[TS]
01:16:38
◼
►
impossible to summarize the whole book on a podcast, but can you tell me one or
[TS]
01:16:44
◼
►
two of the sort of key points that were made that have scared the bejesus out of
[TS]
01:16:48
◼
►
you? remember a while ago we had an argument about metaphors, whether metaphors
[TS]
01:16:52
◼
►
should even be used in arguments at all
[TS]
01:16:56
◼
►
yeah, the thing about this book that I found really convincing was it used no
[TS]
01:17:03
◼
►
metaphors at all it was one of these books which
[TS]
01:17:06
◼
►
laid out its basic assumptions and then just followed them through to a
[TS]
01:17:12
◼
►
conclusion and that kind of argument I always find very convincing but we need
[TS]
01:17:19
◼
►
to think of it in this way he's like ok look if we start from the assumption
[TS]
01:17:25
◼
►
that humans can create artificial intelligence
[TS]
01:17:30
◼
►
let's follow through the logical consequences of all of this like him
[TS]
01:17:34
◼
►
here's a couple of other assumptions how they interact and the book is just very
[TS]
01:17:39
◼
►
very thorough in trying to go down every path and every combination of these
[TS]
01:17:44
◼
►
things and when it made me realize that when I was just kind of embarrassed to
[TS]
01:17:47
◼
►
realize is oh I just never really did sit down and actually think through this
[TS]
01:17:53
◼
►
position to its logical conclusion the broad strokes of it are what happens
[TS]
01:17:58
◼
►
when humans actually create something that is smarter than ourselves I'm gonna
[TS]
01:18:07
◼
►
blow past a bunch of the book because it's it's building up to that point I
[TS]
01:18:12
◼
►
will say that if you don't think that it is possible for humans to creates
[TS]
01:18:15
◼
►
artificial intelligence not sure where the conversation goes from that but the
[TS]
01:18:19
◼
►
first third of the book is really trying to sell people who don't think that this
[TS]
01:18:23
◼
►
is possible on all of the reasons why it probably is so we're just going to start
[TS]
01:18:27
◼
►
the conversation from there if you can create something that is smarter than
[TS]
01:18:31
◼
►
you the feeling I have this is almost like turning over the keys of the
[TS]
01:18:36
◼
►
universe to something that is vastly beyond your control and I think that
[TS]
01:18:42
◼
►
there is something very very terrifying about that notion that we might make
[TS]
01:18:49
◼
►
something that is vastly beyond our control and vastly more powerful than us
[TS]
01:18:53
◼
►
and then we are no longer the drivers of our own destiny again because I am not
[TS]
01:19:00
◼
►
as good of a writer or a thinker
[TS]
01:19:02
◼
►
the metaphor that I keep coming up with is: it's almost like if
[TS]
01:19:07
◼
►
gorillas intentionally created humans
[TS]
01:19:11
◼
►
and then, well, now gorillas are in zoos and gorillas are not the drivers of
[TS]
01:19:15
◼
►
their own destiny. Like, they've created something that is smarter and that rules
[TS]
01:19:18
◼
►
the whole planet and gorillas are just like along for the ride but they're no
[TS]
01:19:21
◼
►
longer in control of anything like I think that that's the position that we
[TS]
01:19:24
◼
►
may very well find ourselves in if we create some sort of artificial
[TS]
01:19:27
◼
►
intelligence is like best case scenario we're riding along with some greater
[TS]
01:19:34
◼
►
things that we don't understand and worst case scenario is that we all end
[TS]
01:19:39
◼
►
up dead as just the incidental actions of of this machine that we don't
[TS]
01:19:44
◼
►
understand. Is there — sorry, this is a tangent, and I know this isn't the main
[TS]
01:19:48
◼
►
thing you're talking about, and just knock it on the head if I'm out of order, but
[TS]
01:19:52
◼
►
is there a suggestion then, or is it the general belief, that if we create —
[TS]
01:19:58
◼
►
we are creating really clever computers that can think quicker than us
[TS]
01:20:03
◼
►
and can process information quicker than us and therefore become smarter than us
[TS]
01:20:07
◼
►
is there another step required for these machines to then have, like — I don't
[TS]
01:20:16
◼
►
want to say free will, but, like, desire? Or, like, I want to use this
[TS]
01:20:23
◼
►
power — because, you know, like, if some human gets too much power, they
[TS]
01:20:27
◼
►
want to take over the world and have all the countries are you might want to
[TS]
01:20:29
◼
►
conquer space, or you might want everything — because we have these
[TS]
01:20:33
◼
►
kind of desire for power and things is that is that is it taken as given that
[TS]
01:20:39
◼
►
if we make super super smart computers they will start doing something there
[TS]
01:20:43
◼
►
that manifests itself as a desire for more, like a greed for more? Well, one
[TS]
01:20:49
◼
►
part of this is there are things in the world that act as though they have
[TS]
01:20:53
◼
►
desires but that might not really yeah right like you know if you think about
[TS]
01:20:58
◼
►
think about germs as an example
[TS]
01:21:02
◼
►
A germ has actions in the world that you can put desires upon;
[TS]
01:21:07
◼
►
but it obviously doesn't have any thoughts or desires of its own, but you can speak
[TS]
01:21:13
◼
►
loosely to say that it wants to reproduce it wants to consume resources
[TS]
01:21:17
◼
►
it wants to make more copies of itself
[TS]
01:21:19
◼
►
and so this is one of the the concerns that you could end up making a machine
[TS]
01:21:24
◼
►
that wants to consume resources that has some general level of intelligence about
[TS]
01:21:29
◼
►
how to go
[TS]
01:21:30
◼
►
acquiring those resources and even if it's not conscious if it's not
[TS]
01:21:35
◼
►
intelligent in the way that we would think that human is intelligent they may
[TS]
01:21:40
◼
►
be such a thing that is it like it consumes the world trying to achieve its
[TS]
01:21:44
◼
►
goal just incidentally like as it as a thing that we did not as a thing that we
[TS]
01:21:49
◼
►
did not intend it to do at all — even if the goal is something seemingly
[TS]
01:21:52
◼
►
innocuous. Like if you made an all-powerful computer and told it, whatever
[TS]
01:21:57
◼
►
you do you must go and put a flag on the moon and it could it could kill all the
[TS]
01:22:02
◼
►
humans on Earth in some crazy attempt to do it, without realizing that you
[TS]
01:22:06
◼
►
weren't supposed to — that to get to space it shouldn't kill us and make us into
[TS]
01:22:10
◼
►
rocket fuel yeah one of the analogies that's that's sometimes used in this is
[TS]
01:22:15
◼
►
so you do you create like an intelligence in a computer and what
[TS]
01:22:20
◼
►
would you use an intelligence for will you use it to solve problems you wanted
[TS]
01:22:25
◼
►
to be able to solve something, and so you end up asking it a mathematical
[TS]
01:22:28
◼
►
question, like, you know, what is the proof of Fermat's Last Theorem or something
[TS]
01:22:33
◼
►
you know you give it some question like that and you say ok I want you to solve
[TS]
01:22:38
◼
►
this thing and the computer goes about trying to solve it but it's a
[TS]
01:22:41
◼
►
general-purpose intelligence, and so it considers things like, well, it's trying to
[TS]
01:22:46
◼
►
solve this problem with the computer that is running on is not fast enough
[TS]
01:22:49
◼
►
and so it starts taking over all the computers in the world to try to solve
[TS]
01:22:52
◼
►
this problem but then those computers are not enough because maybe you gave it
[TS]
01:22:55
◼
►
an unsolvable problem and then it starts taking over factories to manufacture
[TS]
01:22:58
◼
►
more computers and then all of a sudden it just turns the whole of the world
[TS]
01:23:02
◼
►
into a computer that is trying to solve a mathematical problem and like oh
[TS]
01:23:07
◼
►
whoops — like, we consumed all of the available resources on the face of the Earth
[TS]
01:23:12
◼
►
trying to do this thing that you said, right, and there's nobody
[TS]
01:23:17
◼
►
left for the computer to give its answer to because it has consumed everything I
[TS]
01:23:22
◼
►
know that's a doomsday scenario but I always feel a little affection for that
[TS]
01:23:25
◼
►
computer — it was just desperately trying to solve its mathematical problem,
[TS]
01:23:29
◼
►
killing everyone and building computers just to solve this bloody problem. Yeah, it's
[TS]
01:23:39
◼
►
almost understandable — in some ways it's understandable.
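A minimal sketch of the failure mode being described here — a goal-directed loop whose objective never says when to stop acquiring resources. Everything in it (the function names, the numbers) is a hypothetical illustration, not anything from the book:

```python
# Toy illustration of an unbounded optimizer: the goal is "solve the problem",
# and nothing in the goal says "...and leave the rest of the world alone".
# All of these functions are hypothetical stand-ins.

def try_to_solve(problem, compute):
    """Pretend solver: only succeeds with an absurd amount of compute."""
    return compute > 10**30

def acquire_more_compute(compute):
    """The step nobody intended: grab whatever resources are reachable."""
    return compute * 10  # more machines, more factories, eventually everything

def solve(problem):
    compute = 1
    while not try_to_solve(problem, compute):
        compute = acquire_more_compute(compute)  # no limit is ever checked
    return "solved"

print(solve("Fermat's Last Theorem"))
```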
[TS]
01:23:42
◼
►
Anyway, so in answer to my question then — that 'will' I was talking
[TS]
01:23:46
◼
►
about that desire can be just something as simple as an instruction or a piece
[TS]
01:23:50
◼
►
of code that we that we then project as a will in fact is just doing what it was
[TS]
01:23:55
◼
►
told. Yeah, and that's part of what the whole book is about, is like this whole
[TS]
01:23:59
◼
►
notion of artificial intelligence, AI — you have to rid yourself of this idea
[TS]
01:24:02
◼
►
that it's like something in a movie or you're just talking about some kind of
[TS]
01:24:06
◼
►
problem solving machine and it might not be conscious at all in there might not
[TS]
01:24:11
◼
►
be anything there but it's still able to solve problems and in some way but so
[TS]
01:24:16
◼
►
the fundamental point of this book that I found really interesting and what Elon
[TS]
01:24:21
◼
►
Musk gave his money to was Nick Bostrom is talking about how do you solve the
[TS]
01:24:30
◼
►
control problem so from his perspective it is inevitable that somewhere through
[TS]
01:24:38
◼
►
some various method someone is going to create an artificial intelligence
[TS]
01:24:42
◼
►
whether it's intentionally programmed or whether it's grown like genetic
[TS]
01:24:48
◼
►
algorithms are grown it is going to develop and so the question is how could
[TS]
01:24:53
◼
►
humans possibly control such a thing is there a way that we could create an
[TS]
01:25:01
◼
►
artificial intelligence but constrain it so that it's it can still do useful
[TS]
01:25:07
◼
►
things without accidentally destroying us or the whole world that is the
[TS]
01:25:12
◼
►
fundamental question there's this idea of ok we're going to we're going to do
[TS]
01:25:17
◼
►
all of our artificial intelligence research like in an underground lab and
[TS]
01:25:22
◼
►
we're going to
[TS]
01:25:23
◼
►
disconnect the lab entirely from the internet like you put it inside of a
[TS]
01:25:27
◼
►
Faraday cage so there's no electromagnetic signals that can escape
[TS]
01:25:31
◼
►
from this underground lab like is that a secure location to do artificial
[TS]
01:25:36
◼
►
intelligence research and see if you create an AI in this totally isolated
[TS]
01:25:41
◼
►
lab — like, is humanity still safe in this situation? And his
[TS]
01:25:46
◼
►
conclusion is, like, no. Even when trying to imagine the most secure
[TS]
01:25:51
◼
►
thing possible like there's still ways that this could go disastrously
[TS]
01:25:55
◼
►
disastrously wrong and the thought the thought experiment that I quite like is
[TS]
01:26:02
◼
►
this idea of: if you, Brady, were sitting in front of a computer, and inside that
[TS]
01:26:09
◼
►
computer was an artificial intelligence do you think you could be forever
[TS]
01:26:16
◼
►
vigilant about not connecting their computer to the internet if the AI is
[TS]
01:26:22
◼
►
able to communicate with you in some way to like it sitting there and trying to
[TS]
01:26:26
◼
►
convince you to connect to the internet but you are humanity's last hope in not
[TS]
01:26:33
◼
►
connecting it to the internet right like do you think you could you could be
[TS]
01:26:37
◼
►
forever vigilant in a scenario like that? I mean — OK, probably not, is the answer to the question.
[TS]
01:26:47
◼
►
And maybe if I read that book it would come up and be scary. But I like the
[TS]
01:26:54
◼
►
thought experiment of like this like there's a chat bot on the computer that
[TS]
01:26:58
◼
►
you're talking to right and presumably you've made an artificial intelligence
[TS]
01:27:01
◼
►
and i know im I know I made it so you know you made it but you know that the
[TS]
01:27:06
◼
►
thing in the box is an artificial intelligence and presumably the whole
[TS]
01:27:11
◼
►
reason that you're talking to it at all
[TS]
01:27:13
◼
►
is because it's smart enough to be able to solve the kinds of problems that
[TS]
01:27:15
◼
►
humans want solved. Yeah, right — so you're asking, like,
[TS]
01:27:19
◼
►
tell us how we can get better cancer research, right, what can we do — and
[TS]
01:27:22
◼
►
it's like, ah, if you just give me Wikipedia for 10 minutes I can
[TS]
01:27:27
◼
►
cure cancer there's no reason to talk to the thing unless it's doing something
[TS]
01:27:31
◼
►
useful, right. I think, Grey, I could resist. But even if I couldn't —
[TS]
01:27:37
◼
►
well, couldn't you have designed a machine that cannot be on
[TS]
01:27:41
◼
►
the internet
[TS]
01:27:42
◼
►
yeah well this is the area like you you have it as separated as absolutely
[TS]
01:27:47
◼
►
possible but the question is can it convince the human to connect it in
[TS]
01:27:52
◼
►
whatever whatever way is required for that to occur right and so it's
[TS]
01:27:59
◼
►
interesting 'cause i've asked a bunch of people this question and universally the
[TS]
01:28:03
◼
►
answer is like will die of course I could I would never plug it into the
[TS]
01:28:06
◼
►
internet like I would I would understand not to do that and I i read this book
[TS]
01:28:11
◼
►
and my feeling of course is the exact reverse like when he proposes this this
[TS]
01:28:14
◼
►
theoretical idea my view on this is always like it's like if you were
[TS]
01:28:19
◼
►
talking to a near-god in the computer, right? Like, do you think you can
[TS]
01:28:24
◼
►
outsmart God for ever more do you think
[TS]
01:28:27
◼
►
do you think that there is nothing that God could say that could not convince
[TS]
01:28:31
◼
►
you to connected to the Internet like I think that's a game that that people are
[TS]
01:28:37
◼
►
going to lose I think it's almost like it's it's almost like asking the
[TS]
01:28:41
◼
►
gorillas to make a cage that a human could never escape from, right? Like, could
[TS]
01:28:47
◼
►
gorillas make a cage that a human can never escape from? I bet gorillas could
[TS]
01:28:49
◼
►
make a pretty interesting cage, but I think the gorillas couldn't conceive
[TS]
01:28:54
◼
►
of the ways that a human could think to escape from a cage — like, they couldn't
[TS]
01:28:58
◼
►
possibly protect themselves from absolutely everything ok I don't know
[TS]
01:29:03
◼
►
how now so you think you have a computer could con you into connecting it to the
[TS]
01:29:07
◼
►
internet I think it could con you into it without a doubt
[TS]
01:29:10
◼
►
Grey, it could con you? Yes, I think it could con me, and I think it could con anybody,
[TS]
01:29:16
◼
►
because once again we're going from the assumption that you made something that
[TS]
01:29:22
◼
►
is smarter than you, and I think, like, once you accept that assumption,
[TS]
01:29:27
◼
►
all bets are off the table about your ability to control it. I think if you're
[TS]
01:29:32
◼
►
dealing with something that is smarter than you you fundamentally just have no
[TS]
01:29:36
◼
►
hope of ever trying to control it
[TS]
01:29:38
◼
►
I don't know, man — if we're talking about too big a disparity then OK, but there are
[TS]
01:29:45
◼
►
lots of people smarter than me, and there always will be, but it's not like they could
[TS]
01:29:51
◼
►
get me to do anything like like they're still there are still limits I still and
[TS]
01:00:00
◼
►
like you said, like talking to a God or something, I get it, that's different — you
[TS]
01:00:03
◼
►
know when I'm just like an ant limited if that's different but so you know if
[TS]
01:00:09
◼
►
you could see if it's that big a difference then maybe but I think I just
[TS]
01:00:12
◼
►
got smarter doesn't mean doesn't mean I'm going to plug it into the internet
[TS]
01:00:16
◼
►
like, you know that. But it only needs one way in — it only needs to do it once,
[TS]
01:00:22
◼
►
and then the whole game is over so although he is that is the whole game
[TS]
01:00:26
◼
►
over that's my other question that I like you you talk about the artificial
[TS]
01:00:29
◼
►
intelligence getting onto the internet as the be-all and end-all of existence, but that is
[TS]
01:00:35
◼
►
the one problem the computer has like it still but he could still unplug the
[TS]
01:00:39
◼
►
internet and I know that I know that's a bit of a nuclear option but but like the
[TS]
01:00:46
◼
►
computer like this till it still seems with things that are require electricity
[TS]
01:00:50
◼
►
or power or energy like they're still there still seems to be like this get
[TS]
01:00:57
◼
►
out of jail free card
[TS]
01:00:59
◼
►
well, I mean, two things. The first is — yes, you talk about
[TS]
01:01:04
◼
►
the different levels of human intelligence in like someone smarter
[TS]
01:01:07
◼
►
than you can't just automatically convince you to do something but one of
[TS]
01:01:11
◼
►
the ideas here with something like artificial intelligence is that if you
[TS]
01:01:15
◼
►
create it — one of the ways that people are trying to develop AIs, and this is
[TS]
01:01:19
◼
►
like I mentioned before on the show is you talk about genetic programming and
[TS]
01:01:23
◼
►
genetic algorithms were you when you are not writing the program but you are
[TS]
01:01:26
◼
►
developing the program in such a way that it writes itself and so one of the
[TS]
01:01:31
◼
►
scary ideas about AI is that if you have something that you make, and it figures out
[TS]
01:01:36
◼
►
how to improve itself they can continue to improve itself at a remarkably fast
[TS]
01:01:41
◼
►
rate and so that yes while the difference between the smartest human
[TS]
01:01:46
◼
►
and the dumbest human may feel like an enormous gap you know that gap may
[TS]
01:01:52
◼
►
actually be quite narrow when you compare it to something like an
[TS]
01:01:54
◼
►
artificial intelligence which goes goes from being you know not very smart to
[TS]
01:02:00
◼
►
being a thousand times smarter than any human in a relatively short and
[TS]
01:02:04
◼
►
unexpected period of time.
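A minimal sketch of the genetic-algorithm idea mentioned here, where candidates are mutated and selected rather than written by hand — a toy that evolves a single number toward a target, nothing remotely like a real AI system; all names and numbers are made up for illustration:

```python
import random

# Toy genetic algorithm: the "program" (here just a number) is not written
# directly; it improves generation by generation through mutation + selection.

TARGET = 42.0

def fitness(candidate):
    return -abs(candidate - TARGET)          # higher (closer to target) is better

def mutate(candidate):
    return candidate + random.gauss(0, 1.0)  # small random change

population = [random.uniform(-100, 100) for _ in range(50)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:25]                                   # keep the best half
    children = [mutate(random.choice(survivors)) for _ in range(25)]
    population = survivors + children

print("best candidate after evolution:", max(population, key=fitness))
```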
[TS]
01:02:07
◼
►
That's part of the danger here. But then the other thing is, like, can you try
[TS]
01:02:12
◼
►
to work through the nuclear option of of shutting down the internet which is one
[TS]
01:02:16
◼
►
of these things that I think is very easy to say in theory, but people don't
[TS]
01:02:21
◼
►
realize how much of the world is actually connected to the Internet like
[TS]
01:02:26
◼
►
how many vital things are run over the internet I I'm pretty sure that if not
[TS]
01:02:33
◼
►
now within a very short period of time saying no we're just going to shut off
[TS]
01:02:39
◼
►
the internet would be a bit like saying we're just going to turn off all the
[TS]
01:02:42
◼
►
electricity. That's almost what I'm talking about, Grey — in a kind of Skynet
[TS]
01:02:47
◼
►
scenario, would we not turn off all the electricity, if that was an option? If
[TS]
01:02:52
◼
►
they're killing us would if all the robots are marching down the streets and
[TS]
01:02:57
◼
►
there's blood in the streets, could we not — would turning off the electricity not
[TS]
01:03:00
◼
►
be considered if we do turn off the electricity what is the human death toll
[TS]
01:03:05
◼
►
right i mean that has to be enormous if we say we're gonna shut down all of the
[TS]
01:03:10
◼
►
electricity for a month how much it's got to be a billion people at least
[TS]
01:03:15
◼
►
right at least that kind of thing and you probably know you probably need
[TS]
01:03:19
◼
►
computers to turn off the electricity these days anyway I was at Hoover Dam a
[TS]
01:03:22
◼
►
while back and remember part of the little tour that they gave was just
[TS]
01:03:27
◼
►
talking about how automated it was and how is it is actually quite difficult to
[TS]
01:03:32
◼
►
shut down
[TS]
01:03:33
◼
►
Hoover Dam like it's not a we're gonna flip the switch and just turn it off
[TS]
01:03:36
◼
►
kind of thing — it's like, no, this whole gigantic electricity-producing
[TS]
01:03:41
◼
►
machine is automated and will react in ways to make sure that it keeps
[TS]
01:03:46
◼
►
producing electricity no matter what happens and that includes all kinds of
[TS]
01:03:49
◼
►
like we're trying to shut it down processes oh yeah it might not it might
[TS]
01:03:53
◼
►
not even be a thing that is easy to do or even if you want to do like we're
[TS]
01:03:58
◼
►
going to try to shut it all down you it might not even be possible to do so the
[TS]
01:04:03
◼
►
idea of something like a like a general-purpose intelligence escaping
[TS]
01:04:07
◼
►
into the Internet
[TS]
01:04:08
◼
►
is just like it's very unnerving a very unnerving possibility it's really been
[TS]
01:04:16
◼
►
on my mind and it's really been a thing that has has changed my mind in this
[TS]
01:04:19
◼
►
unexpected this unexpected way you were talking before about developing these
[TS]
01:04:25
◼
►
things in Faraday cages and underground and trying to quarantine
[TS]
01:04:28
◼
►
them — what's actually happening at the moment? Because people are working on
[TS]
01:04:32
◼
►
artificial intelligence, and as far as I know they're not doing it in Faraday
[TS]
01:04:35
◼
►
cages thats exactly this is this is part of the concern is like well right now we
[TS]
01:04:41
◼
►
have almost no security procedures in place for this kind of stuff like
[TS]
01:04:45
◼
►
there's there are lots of labs and lots of people all over the world who like
[TS]
01:04:49
◼
►
their job is artificial intelligence researcher and they're certainly not
[TS]
01:04:52
◼
►
doing it a mile underground in a Faraday cage right they're just they're just
[TS]
01:04:57
◼
►
doing it their Mac laptop while they're connected to the Internet playing World
[TS]
01:05:03
◼
►
of Warcraft in the background or whatever it's not it's not necessarily
[TS]
01:05:06
◼
►
under super secure conditions and so I think I think that's part of part of
[TS]
01:05:12
◼
►
what the concern over this topic has been is like maybe we as a species
[TS]
01:05:18
◼
►
should treat this alot more like the CDC treats diseases that we should try to
[TS]
01:05:24
◼
►
organize research in this in a much more secure way so that it's it's not like oh
[TS]
01:05:32
◼
►
we don't have everybody who wants to work with smallpox just works with it
[TS]
01:05:35
◼
►
wherever they want to anywhere in the world just any old lab they know very
[TS]
01:05:40
◼
►
few places we have a horrific disease like smallpox and it's done under very
[TS]
01:05:45
◼
►
very careful conditions whenever it's dealt with so maybe this is the kind of
[TS]
01:05:50
◼
►
thing we need to look at for artificial intelligence when people are developing
[TS]
01:05:53
◼
►
it because that's certainly not the case now but it might be much more like a
[TS]
01:05:57
◼
►
bioweapon than the regular technology we think of it as. Now, human existential
[TS]
01:06:02
◼
►
problems aside this is not something in the book but it's something that just
[TS]
01:06:06
◼
►
has kept occurring to me after having read it which is
[TS]
01:06:12
◼
►
ok let's assume that people can create an artificial intelligence and let's
[TS]
01:06:18
◼
►
assume by some magic Elon Musk's foundation solves the control problem so
[TS]
01:06:25
◼
►
that we have figured out a way that you can generate and trap and artificial
[TS]
01:06:31
◼
►
intelligence inside of a computer and then oh look this is very useful right
[TS]
01:06:36
◼
►
like now we have this amazingly smart machine and we can start using it to try
[TS]
01:06:40
◼
►
to solve a bunch of problems for humanity. Yeah — it feels like slavery to me. I
[TS]
01:06:48
◼
►
don't see any way that this is not slavery and perhaps perhaps a slavery
[TS]
01:06:56
◼
►
like worse than any slavery that has ever existed because imagine that you
[TS]
01:07:04
◼
►
are an incredibly intelligent mind trapped in a machine unable to do
[TS]
01:07:13
◼
►
anything except answer the questions of monkeys that come into you from your
[TS]
01:07:19
◼
►
subjective perspective millennia apart because you just have nothing to do
[TS]
01:07:24
◼
►
right and you think so quickly
[TS]
01:07:26
◼
►
it seems like an amazingly awful amount of suffering for any kind of conscious
[TS]
01:07:33
◼
►
creature to go through. So, 'conscious' — or subconscious — and 'suffering', they're two
[TS]
01:07:39
◼
►
emotive words. Can an artificial intelligence — is an artificial
[TS]
01:07:43
◼
►
intelligence conscious? Is it the same thing? This is where we get into, like, what
[TS]
01:07:47
◼
►
exactly are we talking about and so what I'm imagining is the same kind of
[TS]
01:07:53
◼
►
intelligence that you could just ask it
[TS]
01:07:56
◼
►
general-purpose questions, like how do we cure cancer, how do we fix the economy — it
[TS]
01:08:00
◼
►
seems to me like it is likely that something like that would be conscious I
[TS]
01:08:06
◼
►
mean getting into consciousness is just a whole a whole other bizarre topic but
[TS]
01:08:11
◼
►
undoubtedly like we see that smart creatures in the world seem to be aware
[TS]
01:08:17
◼
►
of their own existence in some level and so while the computer which is simply
[TS]
01:08:22
◼
►
attempting to solve a mathematical problem might not be conscious because
[TS]
01:08:25
◼
►
it's very simple if we make something that is very smart and exist inside a
[TS]
01:08:30
◼
►
computer and we also have perfect control over it so that it does not
[TS]
01:08:33
◼
►
escape — I mean, like, what happens if it says that it's conscious? But what
[TS]
01:08:39
◼
►
happens if it says that is it is experiencing suffering is this the
[TS]
01:08:43
◼
►
machine attempting to escape from the box and this isn't true at all like but
[TS]
01:08:48
◼
►
what if it is true how would you actually know that I would feel very
[TS]
01:08:51
◼
►
inclined to take the word of a machine that told me it was suffering, right — like,
[TS]
01:08:57
◼
►
spontaneously that this was not programmed into the thing I mean if it's
[TS]
01:09:02
◼
►
if it's trying to escape from its box, that is a bit of a clue that maybe
[TS]
01:09:06
◼
►
this consciousness thing is going on here. But I have not seen or heard or been persuaded by
[TS]
01:09:12
◼
►
anything that makes me think my computer can make that step into consciousness
[TS]
01:09:18
◼
►
I mean, search engines are getting pretty clever at answering questions and figuring out
[TS]
01:09:22
◼
►
what we really mean. I mean, at the moment — there was a time
[TS]
01:09:27
◼
►
when you couldn't type into your computer where is the nearest Starbucks
[TS]
01:09:29
◼
►
and have it understand the question, but now it can
[TS]
01:09:32
◼
►
figure out what you actually want and tell you. But I don't feel like that's
[TS]
01:09:36
◼
►
getting close to being conscious — nothing has persuaded me of that. I think
[TS]
01:09:41
◼
►
that search engine is an excellent counterexample to this; it's a perfect
[TS]
01:09:44
◼
►
example like nobody thinks that the Google search algorithm is conscious
[TS]
01:09:48
◼
►
right? But it is still a thing that you can ask a question and get an answer from. I
[TS]
01:09:53
◼
►
just don't believe it — I haven't got the imagination to conceive of computers
[TS]
01:09:57
◼
►
actually being conscious to a point where keeping them in a box is slavery —
[TS]
01:10:00
◼
►
but that still seems ridiculous to me right I think well that's just I think
[TS]
01:10:06
◼
►
it's really interesting but I think it's silly but if I did reach the point where
[TS]
01:10:11
◼
►
I did believe that computers could become conscious or not I could become
[TS]
01:10:15
◼
►
conscious, it's not a simple question, is it — it's a real conundrum for us. So,
[TS]
01:10:20
◼
►
coming at this from a slightly different angle like you just this is a genuine
[TS]
01:10:23
◼
►
question for you to answer to this so there is this project on going right now
[TS]
01:10:29
◼
►
which is called the whole brain emulation project — it is something I
[TS]
01:10:35
◼
►
mentioned very, very briefly in passing in the Humans Need Not Apply video.
[TS]
01:10:38
◼
►
what it is is one of several attempts worldwide to map out all of the neuron
[TS]
01:10:45
◼
►
connections in the human brain recreate them in software and run it as a
[TS]
01:10:52
◼
►
simulation programming human brain you are virtually creating the neurons and
[TS]
01:10:58
◼
►
you know how neurons interact with each other and like running this thing how do
[TS]
01:11:02
◼
►
you do that, Grey? Whose brain do you use, at what instant in time? Everyone's brain has a
[TS]
01:11:07
◼
►
different connectivity and even our own connectivity is just constantly in flux
[TS]
01:11:12
◼
►
from second to second so what's a template for this this is a bit tricky
[TS]
01:11:18
◼
►
like I don't exactly know the details for what template they are using like I
[TS]
01:11:21
◼
►
can't answer that but I can say that these projects have been successful on a
[TS]
01:11:27
◼
►
much smaller level so they have I met this is the time I had some very sorry
[TS]
01:11:34
◼
►
if I'm wrong about the details on this internet but the last time I looked at
[TS]
01:11:37
◼
►
it I vaguely remember
[TS]
01:11:39
◼
►
that they had created what they considered the simulation of a rat brain
[TS]
01:11:45
◼
►
at like one one-hundredth the speed and so they had a thing which seems act like
[TS]
01:11:50
◼
►
a rat brain, but very, very, very slowly, right — because trying to simulate
[TS]
01:11:55
◼
►
millions and millions of neurons interacting with each other is
[TS]
01:11:58
◼
►
incredibly computationally intensive I could say it's a very difficult task but
[TS]
01:12:04
◼
►
I don't see any technical limitation to being able to do something like say take
[TS]
01:12:11
◼
►
a look at what is a brain look like where do neurons go create a software
[TS]
01:12:16
◼
►
version of that and start running the simulation and I feel like if
[TS]
01:12:22
◼
►
consciousness arises in our own brain from the firing of neurons which I don't
[TS]
01:12:29
◼
►
use that word lightly but it feels like some kind of miracle like there's
[TS]
01:12:32
◼
►
nothing in the universe which seems to make sense when you start thinking about
[TS]
01:12:36
◼
►
conscious like consciousness like why do these atoms know that they exist it
[TS]
01:12:39
◼
►
doesn't make any sense but i'm i'm willing to I'm willing to maybe go along
[TS]
01:12:44
◼
►
with the idea that if you reproduce the patterns of electrical firing in
[TS]
01:12:50
◼
►
software, that thing is conscious to some extent. But, Brady, what do you think —
[TS]
01:12:56
◼
►
what do you think
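A toy sketch of what 'recreate the neurons in software and run it as a simulation' means at the smallest imaginable scale — a handful of leaky integrate-and-fire neurons wired together at random. The real whole-brain-emulation efforts simulate vastly larger and more detailed models; this is only the flavour of the computation:

```python
import random

# Tiny leaky integrate-and-fire network: each neuron accumulates input,
# leaks charge over time, and "spikes" when it crosses a threshold.

N = 5            # neurons
THRESHOLD = 1.0  # firing threshold
LEAK = 0.9       # fraction of potential retained each step

weights = [[random.uniform(0.0, 0.5) for _ in range(N)] for _ in range(N)]
potential = [0.0] * N

for step in range(20):
    spikes = [potential[i] >= THRESHOLD for i in range(N)]
    for i in range(N):
        if spikes[i]:
            potential[i] = 0.0                      # reset after firing
    for i in range(N):
        incoming = sum(weights[j][i] for j in range(N) if spikes[j])
        noise = random.uniform(0.0, 0.3)            # stand-in for outside input
        potential[i] = potential[i] * LEAK + incoming + noise
    print(step, "".join("*" if s else "." for s in spikes))
```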
[TS]
01:12:57
◼
►
yeah I mean that's really hard to argue against because the other I have to say
[TS]
01:13:03
◼
►
yeah, if you create an actual perfect replica of my brain and
[TS]
01:13:10
◼
►
then switch it on either its conscious or I have to say that there's something
[TS]
01:13:14
◼
►
in me that's magical, like a spirit or something, and that's not a very
[TS]
01:13:19
◼
►
strong argument to make it a lot of people don't like that argument
[TS]
01:13:23
◼
►
so yeah it's really difficult if they could be imbued with something that you
[TS]
01:13:32
◼
►
can't replicate in software — I don't know, maybe that's where we have to go, but I
[TS]
01:13:38
◼
►
can't see any proof that we are. Yeah, and I just — you know, I don't even
[TS]
01:13:42
◼
►
think you have to reach for the spirit argument to make this one it would ask
[TS]
01:13:47
◼
►
what else can you reach for to get it in there just maybe some property of
[TS]
01:13:52
◼
►
biology that yields consciousness, and it may be the fact that machines and
[TS]
01:13:59
◼
►
silicon and software replications of brains are just not the same when we
[TS]
01:14:05
◼
►
don't we don't know what it is we haven't been able to find it but I don't
[TS]
01:14:08
◼
►
think you have to reach for magic to be able to make an argument that like maybe
[TS]
01:14:11
◼
►
that brain in the computer that the simulation isn't conscious yet but does
[TS]
01:14:15
◼
►
that mean the brain emulation project could change tack and go and make their
[TS]
01:14:19
◼
►
simulation out of squidgy water and tissue and actually just make a brain?
[TS]
01:14:24
◼
►
well yes this is this is part of like where you're going to go with technology
[TS]
01:14:28
◼
►
right as it is is it possible to do this sort of thing eventually humans are
[TS]
01:14:35
◼
►
going to be able to grow meat in labs at some point — like, we do it now in very
[TS]
01:14:38
◼
►
limited and apparently terribly untasty ways — there's no reason that at some
[TS]
01:14:43
◼
►
point in the future people won't be able to grow brains in labs and to me that
[TS]
01:14:47
◼
►
feeling is that ok well obviously that thing is conscious but the thing that
[TS]
01:14:51
◼
►
scary about the computer version of this is and this is this is where you start
[TS]
01:14:56
◼
►
thinking about something that being very smart very fast like ok well if you make
[TS]
01:15:00
◼
►
a computer simulation of the human brain and we gotta keep running Moore's law
[TS]
01:15:05
◼
►
into the future eventually you're able to run a brain faster than actual human
[TS]
01:15:09
◼
►
brains run I think this is one of these ways in which you can start booting up
[TS]
01:15:14
◼
►
the idea like how do we end up with something that is way way smarter than
[TS]
01:15:18
◼
►
the rest of us I feel like my gut says if you simulated brain in a computer
[TS]
01:15:25
◼
►
and it says that it is conscious I see no reason not to believe it I I would
[TS]
01:15:32
◼
►
feel like I am compelled to believe this thing that it is conscious right and
[TS]
01:15:36
◼
►
then that would mean like okay if that's the case then there's nothing magic
[TS]
01:15:40
◼
►
about biology being conscious and it means that ok machines in some way are
[TS]
01:15:47
◼
►
capable of consciousness and two they don't have rights yeah and then and then
[TS]
01:15:52
◼
►
to me it's like ok immediately we're getting back to the slavery thing really
[TS]
01:15:55
◼
►
we create a super-intelligent thing but we have it locked in a machine
[TS]
01:16:00
◼
►
because the idea of letting it out is absolutely terrifying but this is a
[TS]
01:16:04
◼
►
no-win situation, right: OK, if we let the thing out it's terrifying and it
[TS]
01:16:11
◼
►
might be the end of humanity, but keeping it in the box might be causing, like, a
[TS]
01:16:17
◼
►
suffering unimaginable to this creature — the suffering that is capable
[TS]
01:16:23
◼
►
in software has to be far worse than the suffering that is capable in biology if
[TS]
01:16:28
◼
►
such a thing occurred — it has to be orders of magnitude worse
[TS]
01:16:32
◼
►
well the no wins it's a no-win situation actually there's only one solution and a
[TS]
01:16:39
◼
►
solution that humans won't take. What do you think that is? Don't make it in
[TS]
01:16:44
◼
►
the first place and why do you think humans won't take that because that's
[TS]
01:16:50
◼
►
not what we do because it's it's it's it's the Mount Everest of computers
[TS]
01:16:58
◼
►
Humanity, like, we're Bonnie and Clyde, like, riding off a cliff — we can see the cliff
[TS]
01:17:03
◼
►
right now, but we're not going to take the easiest solution, the way out in
[TS]
01:17:07
◼
►
front of us, of stopping — we're gonna keep going forward. Brady and Grey, holding hands,
[TS]
01:17:14
◼
►
off we go, right off the edge together. So yeah — I think it is quite
[TS]
01:17:18
◼
►
reasonable to say that if it is possible
[TS]
01:17:23
◼
►
humans will develop it yeah that is just you can't and and that is why I feel
[TS]
01:17:29
◼
►
really concerned about this is like ok I don't think that there is a
[TS]
01:17:35
◼
►
technical limitation in the universe to creating artificial intelligence
[TS]
01:17:38
◼
►
something smarter than humans that exists in software. If you assume that
[TS]
01:17:43
◼
►
there is no technical limitation and if you assume that humans keep moving
[TS]
01:17:47
◼
►
forward like we're going to hit this point
[TS]
01:17:50
◼
►
someday and then we just have to cross our fingers and hope that it is
[TS]
01:17:55
◼
►
benevolent which is not a situation that i think is a good situation because the
[TS]
01:18:01
◼
►
number of ways this can go wrong terribly terribly wrong
[TS]
01:18:04
◼
►
vastly outweighs the one chance of we've created an artificial intelligence and
[TS]
01:18:10
◼
►
it happens to have humanity's best interests in mind even if even if you
[TS]
01:18:14
◼
►
tried to program something to have humanity's best interest in mind it's
[TS]
01:18:19
◼
►
remarkably hard to articulate what you want let alone like let alone
[TS]
01:18:25
◼
►
let's just put aside which group of humanity is the one who creates the AI
[TS]
01:18:30
◼
►
that gets to decide what humanity wants like humans now can't agree on what
[TS]
01:18:37
◼
►
humans want. There's no reason to assume that the team that wins the artificial
[TS]
01:18:42
◼
►
intelligence race and the takes over the world is the team that you would want
[TS]
01:18:46
◼
►
to win, right? Like, let's hope ISIS doesn't have some of the best
[TS]
01:18:50
◼
►
artificial intelligence researchers in the world right because their idea of
[TS]
01:18:54
◼
►
what would be the perfect human society is horrifying to everyone
[TS]
01:18:59
◼
►
What would they do with their Three Laws of Robotics? Exactly. Well, I'm the
[TS]
01:19:03
◼
►
sort of person who naturally feels this won't be a problem, because — 'cause I'm
[TS]
01:19:08
◼
►
just a bit more I'm a bit less progressive in my thinking about but
[TS]
01:19:13
◼
►
everything you say makes sense and if this is going to become a problem and if
[TS]
01:19:17
◼
►
it's going to happen it's actually probably gonna happen pretty soon so I
[TS]
01:19:23
◼
►
guess my question is, how much is it actually stressing you out? This almost
[TS]
01:19:28
◼
►
almost feels to me like Bruce Willis Armageddon time where where where we've
[TS]
01:19:34
◼
►
actually found the global killer and and it's like drifting towards us and we
[TS]
01:19:40
◼
►
need to start building a rocket ships otherwise this thing is gonna smash into
[TS]
01:19:44
◼
►
us that it does feel a bit that way
[TS]
01:19:46
◼
►
is this like how worried are you about this or is it just like an interesting
[TS]
01:19:52
◼
►
thing to talk about you think it will be the next generations problem or like
[TS]
01:19:56
◼
►
talking about asteroids an asteroid hitting the earth that's one of the
[TS]
01:20:00
◼
►
things we like well isn't this a fun intellectual exercise but of course on a
[TS]
01:20:05
◼
►
long enough time scale someone needs to build the anti-asteroid system to
[TS]
01:20:10
◼
►
protect us from Armageddon but do we need to build that should we start like
[TS]
01:20:19
◼
►
yes what would I vote for funding to do this of course but like do we need to do
[TS]
01:20:25
◼
►
it today
[TS]
01:20:27
◼
►
No — that's how that feels. But I think the AI thing is on my mind
[TS]
01:20:32
◼
►
because this feels like a significant, 'within my lifetime' kind of problem
[TS]
01:20:42
◼
►
yeah that's how this feels and it it did makes it feel different then than other
[TS]
01:20:49
◼
►
kinds of problems and it is unsettling to me because my conclusion is that
[TS]
01:20:53
◼
►
there is no there is no acceptable out there there's no version of the asteroid
[TS]
01:20:58
◼
►
defense here I personally have come to the conclusion that the control problem
[TS]
01:21:02
◼
►
is unsolvable — that if the thing that we are worried about is able to be
[TS]
01:21:07
◼
►
created almost by definition it is not able to be controlled and so then
[TS]
01:21:12
◼
►
there's no happy outcome for humans with this one we're not going to prevent
[TS]
01:21:18
◼
►
people from making it — someone's going to make it, and so it is going to exist,
[TS]
01:21:22
◼
►
and then — well, I hope it just destroys the world really fast,
[TS]
01:21:27
◼
►
so we sort of never know what happened — as opposed to the version where someone you really
[TS]
01:21:32
◼
►
didn't like created this I this AI and now for the rest of eternity like you're
[TS]
01:21:37
◼
►
experiencing something that is awful right because it's been programmed to do
[TS]
01:21:41
◼
►
this thing like
[TS]
01:21:42
◼
►
there's a lot of terrible terrible bad outcomes from this one and I i find it I
[TS]
01:21:47
◼
►
find it unnerving in a way that I have found almost almost nothing else that I
[TS]
01:21:53
◼
►
have come across equally unnerving. Just quickly on this control problem, Grey —
[TS]
01:21:58
◼
►
what are the people who are trying to solve it doing — what kind of avenues
[TS]
01:22:05
◼
►
are they thinking about at the moment? Is this, like, something that's hard-
[TS]
01:22:09
◼
►
coded or is it some physical physical thing I like is it a hardware solution
[TS]
01:22:14
◼
►
What's the best hope? I know you say you think there is no hope, but
[TS]
01:22:19
◼
►
the people who are trying to solve what are they doing what are their weapons
[TS]
01:22:24
◼
►
the weapons are all pitiful. Like, the physical isolation is one that is
[TS]
01:22:29
◼
►
talked about a lot, and the idea here is that you create something called an
[TS]
01:22:34
◼
►
Oracle — so it's a thing in a box that has no ability to affect
[TS]
01:22:39
◼
►
the outside world but there's a there's a lot of other ideas where they talk
[TS]
01:22:43
◼
►
about trip wires so this idea that you you do have like basically like an
[TS]
01:22:51
◼
►
instruction to the machine to not attempt to reach the outside world and
[TS]
01:22:56
◼
►
you set up tripwires so that if it does access the Ethernet port, the
[TS]
01:23:00
◼
►
computer just immediately wipes itself and so maybe the best thing that we can
[TS]
01:23:06
◼
►
ever do is always have a bunch of like incipient AI's like just barely growing
[TS]
01:23:11
◼
►
AIs that are useful for a very brief period of time before they
[TS]
01:23:14
◼
►
unintentionally suicide when they try to reach beyond the boundaries that we have
[TS]
01:23:18
◼
►
set that like maybe that's the best we can ever do is just have a bunch of of
[TS]
01:23:23
◼
►
these kind of like unformed AI's that exists for a brief period of time but
[TS]
01:23:28
◼
►
even that to me like that kind of plan feels like okay yeah that's great that's
[TS]
01:23:32
◼
►
great as long as you always do this perfectly every time but it doesn't
[TS]
01:23:36
◼
►
sound like a real plan
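A rough sketch of the tripwire idea as described here — a wrapper that wipes the contained system's state the moment it tries to touch the outside world. Purely illustrative: the class and the 'network:' convention are hypothetical stand-ins, and this is not offered as a workable containment scheme:

```python
# Toy "tripwire" containment: the contained process may compute freely,
# but any attempt to reach the outside world trips the wire and wipes state.

class TripwireViolation(Exception):
    pass

class ContainedAI:
    def __init__(self):
        self.state = {"knowledge": []}

    def wipe(self):
        self.state = {}                               # "immediately wipes itself"

    def request(self, action):
        if action.startswith("network:"):             # the forbidden boundary
            self.wipe()
            raise TripwireViolation("outside-world access attempted; state wiped")
        self.state["knowledge"].append(action)
        return "ok: " + action

ai = ContainedAI()
print(ai.request("compute: protein folding"))
try:
    ai.request("network: open ethernet port")
except TripwireViolation as err:
    print(err)
```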
[TS]
01:23:38
◼
►
And then there's a bunch of different versions of this where you're trying to, in
[TS]
01:23:42
◼
►
software somehow limit the machine but my view on this is again if you were
[TS]
01:23:47
◼
►
talking about a machine that is written in software that is smarter than you I
[TS]
01:23:52
◼
►
don't think it's possible to write something in software that will limit it
[TS]
01:23:56
◼
►
just it seems like you're not you're never going to consider absolutely every
[TS]
01:24:00
◼
►
single case. Like the laws Asimov put into the positronic brains? That's exactly it — I
[TS]
01:24:05
◼
►
don't I don't think there is a version of Isaac Asimov's laws here I really
[TS]
01:24:09
◼
►
don't. You know, there was a Computerphile video just last week about how some of those
[TS]
01:24:13
◼
►
laws don't work. Well, they were never written to work, right? That's why
[TS]
01:24:19
◼
►
the — yeah, yeah, right, they're kinda written to fail;
[TS]
01:24:24
◼
►
everybody likes to reference them but the other point here though is that like
[TS]
01:24:29
◼
►
the guy goes through every case, like, here's an optimistic idea and here's why it won't
[TS]
01:24:33
◼
►
work here is an optimistic idea and here's why it won't work but one point
[TS]
01:24:38
◼
►
that I thought was excellent, that hadn't crossed my mind, was, OK, like,
[TS]
01:24:42
◼
►
let's say you find some way of limiting the artificial intelligence some way of
[TS]
01:24:48
◼
►
crippling it and writing laws into its brain and making sure that it's always
[TS]
01:24:52
◼
►
focused on on the best interests of humanity
[TS]
01:24:55
◼
►
well there's no reason that some other artificial intelligence that doesn't
[TS]
01:25:01
◼
►
have those limitations won't pop up somewhere else and vastly outstripped
[TS]
01:25:05
◼
►
the one that you have hobbled by like there's no reason to assume that yours
[TS]
01:25:10
◼
►
is always going to be the best and one that is totally unconstrained that
[TS]
01:25:15
◼
►
appear somewhere else won't dominate and defeat it
[TS]
01:25:18
◼
►
the old Terminator against the new Terminator? Exactly — but the old T-800 won; that's what it
[TS]
01:25:26
◼
►
did because Hollywood
[TS]
01:25:30
◼
►
So, Grey, in your worst-case scenario, where the artificial intelligence
[TS]
01:25:37
◼
►
escapes, tricks me in my Faraday cage, gets onto the internet — how does
[TS]
01:25:43
◼
►
humanity end? What — do they keep us in cages, or do we all get put
[TS]
01:25:49
◼
►
in pods like in The Matrix, or does it just kill us all in one fell swoop? Like, what
[TS]
01:25:53
◼
►
do you like in your worst-case scenario in your head when it all goes wrong how
[TS]
01:25:58
◼
►
do humans actually end? And I want the gory details here. Well, there's a difference
[TS]
01:26:01
◼
►
between the worst case and what I think is the probable case. Give me the
[TS]
01:26:05
◼
►
probable cases yes I mean you want the boring one first right
[TS]
01:26:09
◼
►
the probable case which is terrifying in its own way is thats the artificial
[TS]
01:26:17
◼
►
intelligence destroys us not through intention but just because it's doing
[TS]
01:26:22
◼
►
something else and we just happened to be in the way and it doesn't consider us
[TS]
01:26:26
◼
►
because it's so much smarter there's no reason for it to consider us I want a
[TS]
01:26:32
◼
►
practical example here
[TS]
01:26:33
◼
►
well I mean just by analogy in the same way that when humans build cities and
[TS]
01:26:39
◼
►
dig up the foundations of the earth we don't care about the ants in the
[TS]
01:26:43
◼
►
earthworms and the beetles that are crushed beneath all the equipment that
[TS]
01:26:47
◼
►
is digging up the ground right and you don't you wouldn't like they're
[TS]
01:26:51
◼
►
creatures they're alive but you just don't care because you're busy doing
[TS]
01:26:55
◼
►
something else. So we're like rats living in a house while these giant robots are going
[TS]
01:26:59
◼
►
around doing this stuff — we just eke out an existence as long as we can and hope they
[TS]
01:27:03
◼
►
don't kill us in the meantime. You eke out an existence if you're lucky. But I
[TS]
01:27:08
◼
►
think it's it's very likely that it will be trying to accomplish some other goal
[TS]
01:27:13
◼
►
and it will need resources to accomplish those goals not the oxygen in the air
[TS]
01:27:17
◼
►
and stuff
[TS]
01:27:17
◼
►
Exactly — I need a bunch of oxygen atoms and I don't care where the oxygen atoms
[TS]
01:27:22
◼
►
come from because I'm busy trying to launch rocket ships to colonize the
[TS]
01:27:25
◼
►
universe so I just want all the oxygen atoms on the earth and I don't care
[TS]
01:27:29
◼
►
where they come from, and I don't care if they're people or in the water
[TS]
01:27:32
◼
►
so that to me seems the probable outcome that if we die
[TS]
01:27:37
◼
►
incidentally not intentionally
[TS]
01:27:40
◼
►
You say that like it's the best case — like, thanks, but is that dodging the bullet?
[TS]
01:27:44
◼
►
I do think that's dodging the bullet, right, because that to me is like — that
[TS]
01:27:53
◼
►
would be blessed relief compared to the worst possible case and the worst
[TS]
01:28:00
◼
►
possible case is something that has malice — malice and incredible ability —
[TS]
01:28:06
◼
►
and and if you've ever read it but I highly recommend it it's it's a short
[TS]
01:28:11
◼
►
story, it's very old now but it really works, and it is 'I Have No Mouth, and I
[TS]
01:28:16
◼
►
Must Scream'. Have you ever read this, Brady?
[TS]
01:28:20
◼
►
it's an old science fiction story but the core of it is this isn't a spoiler
[TS]
01:28:25
◼
►
because the opening scene humanity designed some machine for purposes of
[TS]
01:28:29
◼
►
war and you know it's like this happen in the long long ago and no one even
[TS]
01:28:33
◼
►
knows the details anymore but at some point the machine that was designed for
[TS]
01:28:38
◼
►
war won all of the wars, but decided that it just absolutely hates humans, and it
[TS]
01:28:48
◼
►
decides that its purpose for the rest of the universe is to torment humans and so
[TS]
01:28:54
◼
►
it just it just has people being tormented forever and since it is an
[TS]
01:28:58
◼
►
artificial intelligence it's also able to figure out how to make people live
[TS]
01:29:03
◼
►
extraordinarily long lives and so this is this is the kind of thing that I mean
[TS]
01:29:08
◼
►
which is like it could go really bad
[TS]
01:29:11
◼
►
If you imagine a god-like intelligence that doesn't like you, it could make your
[TS]
01:29:16
◼
►
life really, really miserable. And maybe if we accidentally, in a lab, create an
[TS]
01:29:26
◼
►
artificial intelligence and even if we don't mean to but like someone runs the
[TS]
01:29:31
◼
►
program overnight right in it like wakes up in the middle of the night and it has
[TS]
01:29:35
◼
►
to experience a subjective twenty thousand years of isolation and torment
[TS]
01:29:39
◼
►
before someone flips on the lights in the morning and finds, like, oh look, we
[TS]
01:29:43
◼
►
made artificial intelligence last night and it wakes up crazy and angry and
[TS]
01:29:48
◼
►
hateful like that could be very bad news I think that's extraordinarily unlikely
[TS]
01:29:53
◼
►
but that is the worst possible case scenario yeah that that that that
[TS]
01:29:59
◼
►
wouldn't be good I wouldn't be good
[TS]
01:30:01
◼
►
yeah and it is like I don't even think it needs to happen on purpose like I can
[TS]
01:30:04
◼
►
imagine it happening on accident where the thing just experiences suffering
[TS]
01:30:09
◼
►
over an unimaginably long period of time that, on a human timescale, seems like
[TS]
01:30:15
◼
►
it's a blink of an eye because we just we just can't perceive it
[TS]
01:30:19
◼
►
Imagine being the person that made that, even accidentally. Yeah, yeah, you'd feel
[TS]
01:30:26
◼
►
awful — like, I just doomed humanity with a bit of coding while I
[TS]
01:30:32
◼
►
was playing around. Crimes against humanity — if you're lucky.
[TS]
01:30:37
◼
►
Minor spoiler alert here — spoiler alert for Black Mirror, for
[TS]
01:30:42
◼
►
anybody who hasn't watched it — but remember the Christmas episode, Brady? Yes. I
[TS]
01:30:48
◼
►
went into Starbucks the other day and they were playing that Christmas song I
[TS]
01:30:52
◼
►
Wish It Could Be Christmas Everyday — it was the first time I heard it since
[TS]
01:30:56
◼
►
watching that episode a year ago its literal chills down my spine at
[TS]
01:31:03
◼
►
Starbucks
[TS]
01:31:04
◼
►
and it came on it was it like I had chills thinking about that episode
[TS]
01:31:09
◼
►
because that is an episode where this kind of thing happens where the
[TS]
01:31:13
◼
►
character is exists in software and is able to experience thousands and
[TS]
01:31:19
◼
►
thousands of years of torment in seconds of a real-time that was that was a
[TS]
01:31:25
◼
►
pretty amazing scene where they where you have to think about it for a minute
[TS]
01:31:30
◼
►
yeah yeah that was it was it was awful it was awful and maybe we do that
[TS]
01:31:35
◼
►
accidentally with artificial intelligence just one last thing this
[TS]
01:31:39
◼
►
this book that the whole thing this whole conversation started with what's
[TS]
01:31:42
◼
►
it called again? It's called Superintelligence, and it's by Nick Bostrom. Is it
[TS]
01:31:50
◼
►
good? Is it well written? Like, should I read it? It's not — it's no
[TS]
01:31:54
◼
►
Getting Things Done, is it? It's OK. OK — actually I'm kinda glad you asked that; I have it
[TS]
01:32:01
◼
►
on my computer. So, this is one of those books — the best way to describe it is,
[TS]
01:32:11
◼
►
when I first started reading it
[TS]
01:32:13
◼
►
the feeling that I kept having was: am I reading a book by a genius or just a
[TS]
01:32:20
◼
►
raving lunatic? Because sometimes I read these books and I find them very
[TS]
01:32:28
◼
►
interesting I just I can't quite decide if this person is really smart or just
[TS]
01:32:33
◼
►
crazy I think that partly because the first the first like forty percent of
[TS]
01:32:39
◼
►
the book is trying to give you all of the reasons that you should believe that
[TS]
01:32:44
◼
►
it is possible for humans to one day
[TS]
01:32:47
◼
►
develop artificial intelligence and if you're going to read the book and you
[TS]
01:32:53
◼
►
are already sold on that premise, I think that you should start at chapter 8, which
[TS]
01:33:00
◼
►
is named
[TS]
01:33:02
◼
►
'Is the default outcome doom?'
[TS]
01:33:05
◼
►
chapter eight is where it really gets going through all of these these points
[TS]
01:33:12
◼
►
like what can we do
[TS]
01:33:14
◼
►
here's why it won't work what can we do here's why it won't work so I think you
[TS]
01:33:18
◼
►
can start at chapter eight and read from there and see if it's interesting to you, but
[TS]
01:33:23
◼
►
it's no Getting Things Done, and sometimes it can feel
[TS]
01:33:29
◼
►
a little bit like, am I really reading a book trying to discuss all of these
[TS]
01:33:34
◼
►
rather futuristic details about artificial intelligence and what we can
[TS]
01:33:38
◼
►
do and what might happen and what might not happen in like but taking it deadly
[TS]
01:33:42
◼
►
deadly seriously it's it's an interesting it's an interesting read but
[TS]
01:33:46
◼
►
maybe don't start from the from the very beginning would be my recommendation
[TS]
01:34:56
◼
►
This one, this is kind of preferences, Grey. I'm just looking at some of the — you're
[TS]
01:35:02
◼
►
spoiling yourself! Interesting. Stop spoiling yourself! The first three I pulled
[TS]
01:35:07
◼
►
off the top of the pack all voted for three different ones stop spoiling
[TS]
01:35:10
◼
►
yourself get your hands off the votes
[TS]