134: AI Art Will Make Marionettes Of Us All Before It Destroys The World
- Well, we have a lot more terrifying things to discuss about AI. A lot has happened since the last time we spoke, Myke.
- I don't presume that I'm the center of the universe, but there is an almost Truman Show-like aspect sometimes to life, and this is one of those things where I feel like, you know, I was seeing enough about AI art around the place, like, you know, there was a reason that we spoke about it, because it was coming into our worldview more and more. But in the time between these last two episodes, I feel like it has taken over the world. I can't move for people talking about it. It's coming up on podcasts that have got nothing to do with this stuff, like not even tech-related shows are talking about it. Like, Adina sent me a link to one of her favorite podcasts, a podcast called Comic Lab, which is about cartooning and making comics. They did a whole episode talking about it. And it makes sense for them a little bit, because they are comic artists, right? They are artists. But it's like, why now?
- Yeah, I think this is definitely the biggest Truman Show-like effect for anything that we have ever spoken about. And I think it's just that we happened to hit it at the exact right moment before a bunch of things exploded.
- Yeah, realistically, I know why this happened, but it's still such a strange feeling. This happened because we spoke about it because it was starting to become a thing that we couldn't avoid, right? So that's why it's just like a snowball effect.
- Oh, I know, I know. And listeners may remember that I had a conference that I said I was going to go to in September. Well, guess what one of the major topics everyone was discussing at the conference was: a bunch of AI stuff. My whole life just exploded into this in the past month. And I think also, with the last episode of the show, there was a little bit of confusion over our positions, because there's just so much that this touches on, right? We just kind of talked about it off the cuff, and, you know, I really think this absolutely touches on almost everything in the world. So, to try to be clear: with AI art, and the language models in particular, in the short term I think there's a lot of interesting discussions to be had about how it affects technology, and how it affects the economy, and how it affects ethics. But in the long term, we're seeing the beginning of the end of the human race.
- Is this the clarifying you were looking to do?
- Yeah, just so people can understand where I'm coming from. There's lots of interesting discussions to be had where you can take many different positions, but ultimately I think this all points in one direction, which is not very good.
But yes, in the short term, since the last time we spoke, so much has happened. I think one of the first things that you sent to me was that DALL-E is now open to the public. And the thing that I think really helped kick it off was, because last episode we were talking about how, oh, you need these massive computing clusters to do this kind of stuff: it was like ten days after that episode went up that the very first desktop versions of these things came out. And the one I've been playing with is called Diffusion Bee, which allows you, just on your Mac, as long as it's one of the new M processors so that it's fast enough, to start typing in whatever and generating AI art. And so it's like, oh, okay, this went from supercomputing-clusters-only down to, oh, it can happen on your desktop, which is, I think, the first time that lots of people actually started to play with it firsthand. And I think that's kind of what kicked it off for lots of people: oh, this is really real, like I can play with it and I can see what it does. The first publicly-used-by-us piece of AI art went up, which was running our Cortex logo through some AI art stuff.
- How did you do that?
- That was done with... oh god, what was it called... it was done with Dream, I think, is one of the projects, but you can upload a piece of art and have it iterate on that piece of art.
- Oh, okay. That makes sense then, because I wondered, like, how did you get it to do the Cortex logo?
- This is also one of these things where even Diffusion Bee, which is the thing that you can play with on your laptop, started out as: you can only put in words, and then it will generate images. But even that one now, you can give it an image and then ask for modifications on that image. The Dream program just lets you put in a piece of art, and then it iterates on it in some way.
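A minimal sketch of what that image-to-image mode looks like in code, using the open-source diffusers library. This is an illustration only: the checkpoint name and parameter values are assumptions, and it is not necessarily what Diffusion Bee or the Dream app run internally.

```python
# Image-to-image sketch with Hugging Face's diffusers library.
# The model checkpoint and settings here are illustrative assumptions.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint choice
    torch_dtype=torch.float16,
).to("cuda")  # use "mps" on Apple Silicon Macs

init_image = Image.open("cortex_logo.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="a brain made of circuitry, detailed illustration",
    image=init_image,
    strength=0.6,        # 0.0 returns the input unchanged; 1.0 mostly ignores it
    guidance_scale=7.5,  # how strongly to follow the text prompt
).images[0]
result.save("cortex_logo_variation.png")
```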
Part of the intrinsically horrifying thing about all of this art is that a bunch of examples had to be rejected, because the AI knew that it was a brain and so ended up making it look very much like, "Oh, it's a horrifying mass of meat that's also a brain."

- My favorite.
- Which brings us to AI art: why you so terrifying?
So, last time we spoke, another one of these things was, "Oh, obviously this is coming. It's just a matter of time. It's just a question of how long does it take computationally to do these sorts of things." And it's like, "Oh, is that going to happen next year?" No, it happened two weeks after the show went up.
And I have something for you to take a look at, which was the first version of this. This was made a little bit manually, with Stable Diffusion, but the prompt is "a journey through a boy's life."
Again, this is not true video, but it's... I don't know how to describe it. It's a...

- It's like... I'm watching it now, it's like a flip book.

- Yeah, it's a bit like a flip book. So just take a look at it and tell me what you think about this.
- I mean, it's weird. Like, it's not much of a video to me, really. I can see you're just cycling through a bunch of images. It's got weird... it just got weirdly religious there for a minute. What is happening here? Like, what is going on here? Did an AI generate these and put them together? Or are they generating them from each other? Like, what is going on? How is this being made?
- Yeah, so I read a little bit of what the author said, and the impression is: he started with a general image that's called "a journey through a boy's life," and then each of the next frames is based on the frame before it, given that same piece of information.
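That frame-chaining idea is simple to express in code. Here's a hedged sketch of the general technique, again using the diffusers image-to-image pipeline; the prompt is the one quoted in the episode, but the model, strength, and frame count are guesses, since the author's exact pipeline wasn't published.

```python
# Sketch of the flip-book technique: each new frame is the previous frame
# run back through image-to-image with the same text prompt throughout.
# Model choice, strength, and frame count are illustrative assumptions.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "a journey through a boy's life"
frame = Image.open("frame_000.png").convert("RGB").resize((512, 512))

for i in range(1, 120):
    # Moderate strength keeps each frame recognizably close to the last,
    # so the sequence drifts gradually instead of jumping scene to scene.
    frame = pipe(prompt=prompt, image=frame, strength=0.45).images[0]
    frame.save(f"frame_{i:03d}.png")
```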
It's going in some really weird places. We just went to war for a while, and now there's some kind of eldritch horror occurring. Okay, I mean, it's really weird. I watch something like that, knowing where it's coming from, and it doesn't make me feel great, in a way, because of the weird places the AI takes itself. And there was an uncomfortability in that. I do not know why we spent a significant portion of that in various wartime, and no one can really answer that question, I assume. There's no way to know why the AI went that route and stuck with it for quite some time.
- So, as a description for the listeners: I think it's sort of horrifying, and it's very dirty-looking for some reason. But it is a sequence of images that start with a child, and the child slowly grows up, and you just see a bunch of things, like a kid at the desk doing his homework, and it sort of transitions. The kid just gets increasingly older and older, and then, yes, it's young men at war for a while. And I presume this is one of these side effects of what's in the training database: it has some concept of, "Oh, based on all of the images I've seen, what has a boy been doing in his 20s?", and so it generates a ton of war imagery. I also suspect that's why, at that moment, it also gets weirdly religious, like a lot of crosses appear, because I presume that that's coming from graveyard stuff. And it goes all the way through to ending with, you know, a dead body laying on a table with sort of muddy blood coming down from it.
This was the first thing that I saw where I thought, oh, this is a video. And I would also legitimately say, oh, this is a piece of art. You could display this in an art museum and it wouldn't be out of place. And what it made me think of was, I don't know if you've ever seen the animations from Pink Floyd's The Wall? That's what this made me think of.

- Oh, it has that kind of feeling to it.

- And it has a really horrifying animation style. And I went back and wanted to rewatch some of the animation sections of that movie, and boy, it was an interesting experience, because in comparison, suddenly The Wall seemed remarkably undetailed. I just felt like I could only see, oh, it's so simple. And The Wall is perhaps one of the most complicated examples of hand-drawn animation there is.

- Oh God, compared to this.

- Oh, yeah, they just can't possibly put the detail into every single frame the way it exists in this one thing. And so I just had a real feeling of, wow, what a jump. It's sort of addressing the same idea, but in 100,000 times more detail.
So anyway, that'll be in the show notes for people to take a look at. And then, I don't know, another seven days after that, Meta and a few other companies announced true text-to-video projects, and I have some links in the show notes that you can click on. This one is called Make-A-Video, and if you take a look at some of these links, I think this stuff looks more like what Stable Diffusion does.
- Okay, this looks like: take all of the generators and make them do animation, right? It has a similar look; the quality of some of the imagery has the telltale signs that this was made by an AI art generator.
- Yeah, I think the key characteristic of a lot of this stuff is that it's still quite dreamlike, because a lot of the details aren't there. There's a lot of areas where it's kind of fuzzy. But I think that this, again, is the thing where a lot of the detractors of this kind of thing said, "Oh, you'll never be able to do video." And you go, "Well, yeah, let's just wait. Let's just wait and see how long this takes." And here we go: here's the first versions of video, where you can just type the words "a dog wearing a superhero outfit with a red cape flying through the sky" and it makes that. It makes a little video of that thing. And there's another video project, which I'll try to find for the show notes for viewers later, which was a multi-scene description. So you can say things like "a woman is swimming in the ocean, she dives under the water and picks up a starfish on the ground, and then returns to the shore..." And it's able to keep that concept straight the entire time and construct the whole scene, instead of just an image that's moving.
So the speed on this, in the space of what, five weeks, has been absolutely breathtaking.

- This all seems inevitable. But I will say I am surprised, like you, at the speed: from the last episode to now, how much has happened.
- The thing that I just want to draw people's attention to, that gets overlooked in the AI art discussion, is: people are talking about the art because it's visual. You can see it; I think it's the most understandable thing that's occurring. But to me, the much bigger deal is the less visible language model stuff. And so it is the same idea, but for words, where an AI is responding to some text prompt with an answer of some kind. And we talked in Moretex about AI explaining jokes, like how an AI can explain a joke; listeners can go check that out. But I have an example that I didn't mention, that a listener pointed out from that paper, that I had missed, which I think is much more impressive. So again, you can do this same thing; AI "doesn't understand" words. And so I'm going to give an input, a thing that the user told the computer, and then the output, how the computer answered.
So here is the input, and this is very strange, it's almost like a puzzle, just so you can understand what I'm saying here. Input: "Michael is at that really famous museum in France looking at its most famous painting. However, the artist who made this painting just makes Michael think of his favorite cartoon character from childhood. What was the country of origin of the thing that the cartoon character usually holds in his hand?" So, do you think you have an answer to that question?
Okay, here is what the AI said: "The most famous painting in the Louvre is the Mona Lisa. The artist who made the Mona Lisa is Leonardo da Vinci. Leonardo da Vinci is also the name of the main character in the cartoon Teenage Mutant Ninja Turtles. Leonardo da Vinci is from Italy. The thing that Leonardo usually holds in his hand is a katana. The country of origin of the katana is Japan. The answer is Japan."
- I mean, part of my issue there is, I forgot it was Leonardo da Vinci who made the Mona Lisa, and I would not have remembered that Leonardo, the turtle from Ninja Turtles, held a katana, because I thought that he had, like, a kendo stick or something. That is horrifying, right? Because this is actually a good example: my human brain forgets things.

- Mm-hmm.

- The AI brain doesn't. It just has all the information, or can get it, I assume, I don't know where from. But the difference between pitching me against the machine there is I just couldn't remember a couple of key pieces of information, which, now you tell me, I did know.
- Yeah. And even when I first read this input, it's just phrased in such a strange and vague way.

- Yeah, I do wonder why they did it that way. That is odd to me.
- I think the whole purpose of why it's framed in this really strange way is to give the minimal amount of information that you can regarding what the actual answer is. They don't even say "the most famous museum in France"; it's "a really famous museum in France," right? And also, they're trying to get an answer, but they're asking a bunch of questions that require recalling the previous answer, right? It's a multi-stage thing to think through. I think the particular one that's really killer here is "the artist who made this painting just makes Michael think of his favorite cartoon character from his childhood," and for the AI to make the connection: "Oh, it's Leonardo da Vinci; Leonardo is one of the Teenage Mutant Ninja Turtles."
- Because there's no concept of how old the person is, right?

- Yeah, there's no concept of how old the person is. Simply being able to draw that out of "what painting in France would make someone think of a childhood cartoon?" is a crazy abstract thing to think of, but the machine got it. So I think this input/output is an example of: again, the AI art stuff is interesting, but the text stuff, to me, is where a lot of the much more concerning stuff comes from. And there's a little bit of a less visceral demo here, but I actually found this the most striking thing I have found in language models, though it's a little bit hard to describe. There's an article called "Using GPT-3 to Pathfind in Random Graphs." GPT-3 is one of these language models; it's the most advanced one that people have access to. Although, at the conference I was at, I got to see some of the not-publicly-released stuff, which was much more terrifying, but we'll just leave that alone for now; GPT-3 is the much older one. And this is the most impressive thing that I've seen. It's a little bit hard to describe on a podcast.
But there's this problem in mathematics called the traveling salesman problem, which is, say: oh, you're a salesman and you have to get to ten cities in the United States; what's the shortest possible path to travel between those ten cities, in any order? And it's one of those things where it seems like it should be easy, but it turns out it's just incredibly difficult to actually solve in a reasonable way. Just a hugely computationally intense problem.
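As a quick illustration of why this gets computationally nasty, here's a small, self-contained brute-force sketch. The city coordinates are made up; the point is that the number of routes to check grows factorially with the number of cities.

```python
# Brute-force traveling salesman: try every ordering of the cities.
# With n cities there are n! orderings to check, which is why this
# approach collapses long before you reach even ~20 cities.
from itertools import permutations
from math import dist, factorial

cities = {
    "A": (0, 0), "B": (3, 4), "C": (7, 1),
    "D": (2, 8), "E": (6, 6), "F": (9, 3),
}

def route_length(route):
    # Total distance visiting the cities in this order (no return leg).
    return sum(dist(cities[a], cities[b]) for a, b in zip(route, route[1:]))

names = list(cities)
best = min(permutations(names), key=route_length)
print(best, round(route_length(best), 2))
print("routes checked:", factorial(len(names)))  # 720 here; 10 cities is 3,628,800
```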
So someone basically got the idea of: hey, why don't I see if GPT-3 can solve the traveling salesman problem? I think you need to go read the paper to see the details, but let me just describe it in a general way. The person gave an input to GPT-3 that says something like this: there are 17 locations, and we need to find the shortest path between these locations. Location 1 is connected to locations 7 and 13. Location 13 is connected to locations 9 and 2. So they just wrote out a bunch of "here's all of the connections." You're currently at location 3; find the optimal path to location 8. And GPT-3 just did it.
Basically, half of the time it was able to find an optimal or near-optimal path, just given a bunch of random locations and the connections between them. And that may not sound like much, but what needs to be understood here, what is absolutely mind-blowing, is that GPT has just been given a ton of text documents from which to derive the world.
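To make the setup concrete, here's a hedged sketch of what such an experiment might look like: generate a random graph, phrase it as a text prompt in the style described above (a paraphrase, not the article's exact wording), and check any proposed answer against a classical breadth-first search. The `ask_model` call at the end is a hypothetical stand-in for whatever language-model API you'd use.

```python
# Sketch: random graph -> text prompt -> compare a model's answer to BFS.
# The prompt wording paraphrases the episode's description; ask_model()
# is a hypothetical placeholder, not a real API.
import random
from collections import deque

def random_graph(n=17, extra_edges=10):
    nodes = list(range(1, n + 1))
    random.shuffle(nodes)
    edges = set()
    for a, b in zip(nodes, nodes[1:]):  # a random spanning path keeps it connected
        edges.add((min(a, b), max(a, b)))
    while len(edges) < n - 1 + extra_edges:
        a, b = random.sample(nodes, 2)
        edges.add((min(a, b), max(a, b)))
    return edges

def to_prompt(n, edges, start, goal):
    lines = [f"There are {n} locations. We need to find the shortest path between locations."]
    lines += [f"Location {a} is connected to location {b}." for a, b in sorted(edges)]
    lines.append(f"You are currently at location {start}. Find the optimal path to location {goal}.")
    return "\n".join(lines)

def shortest_path(n, edges, start, goal):
    # Breadth-first search gives the ground truth to grade the model against.
    adj = {i: [] for i in range(1, n + 1)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

edges = random_graph()
print(to_prompt(17, edges, start=3, goal=8))
print("BFS ground truth:", shortest_path(17, edges, start=3, goal=8))
# model_answer = ask_model(to_prompt(17, edges, 3, 8))  # hypothetical LLM call
```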
And inside of it, somehow, it has the concept of locations and connections, and what it means to try to find the path to a different location. Somehow it's thinking of what the optimal solution is. And again, it's not programmed for this. This is an incredibly difficult math problem, but it has an idea of what's going on. And so I've just seen more and more people really trying to push the edges of these language models to say: what do you understand? And the answer keeps coming out to be much more than you might have imagined: there's a better understanding of something in the world that has just been derived from dumping tons and tons of text files into this big database and building a neural network on top of it.
It is genuinely, genuinely terrifying. And I'll just say, I don't think I can talk in much detail about what I have seen, but I saw a couple of demos of the next generation of this stuff in person, and it was extremely alarming what you could ask it to do and have it give reasonable answers to. And I think, just like the AI art stuff, it has clearly crossed into the realm of creative in a way that I think people just wouldn't have expected.
- Like writing fiction?
- Exactly, yes. That's exactly the kind of thing. Some public, similar projects that are easier to talk about: there are a few cases of auto-generated text adventures. Exactly the thing that we do for the bonus episodes at Cortex, where we work through a text adventure that someone has created: there are now projects that do that, where you can just play an infinite text adventure. It keeps spitting out, "Oh, you're in this room, and here are your options of what you can do," and if you select an option, it'll just go forever, in a coherent way. It's like, oh my!
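A minimal sketch of how such an infinite text adventure can be wired up: keep a running transcript, ask a language model to continue it, and append the player's choice each turn. The `complete()` function below is a hypothetical placeholder for a real language-model API; everything else is ordinary glue code.

```python
# Sketch of an LLM-driven text adventure loop. complete() is a hypothetical
# placeholder; swap in whatever language-model completion API you have.
def complete(prompt: str) -> str:
    raise NotImplementedError("replace with a real language-model call")

SETUP = (
    "You are the narrator of a text adventure. After each player action, "
    "describe the new scene in a short paragraph, then list numbered options.\n\n"
)

def play():
    transcript = SETUP + "Scene 1: You wake in a dim room with two doors.\n"
    while True:
        scene = complete(transcript)  # the model writes the scene plus options
        print(scene)
        choice = input("> ")
        if choice.lower() in {"quit", "q"}:
            break
        # The growing transcript is what keeps the story coherent turn to turn.
        transcript += scene + f"\nPlayer: {choice}\n"

if __name__ == "__main__":
    play()
```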
So yeah, I think the AI art stuff is flashy; the language model stuff is what people are going to be quite surprised at, in terms of how much it might take over soon. And that includes a lot of things that people would not expect, like computer code. There are already some public examples of this, where you express in human language the computer code that you want created, and it is able to create that code for you. So we're rapidly encroaching on computers-programming-themselves territory.
Myke: Definitely want the computer to think for itself. That's what I'm looking for.

Grey: I mean, look, we can get to the doom stuff later. There's so much other stuff to talk about.

Myke: Wait a second, what do you mean, later? Are we not already in it? I feel like we're in it. What are you talking about?

Grey: No, no, Myke, I don't think this is the doom stuff at all. This is just the timeline.

Myke: Oh, okay.

Grey: My follow-up at this point is, "Hey, what's happened since we last spoke?" And a lot has happened since we last spoke. And this is why, at the beginning, I kind of divided it: there's stuff in the near term, and there's stuff in the long term. And we're still just in the near-term conversation, Myke. We're not even remotely close to the long-term conversation.
- Hey, this isn't an ad, but we're putting it in between the ad sounds, because Myke and I have been talking forever about AI, and we forgot that we need to tell you about the Subtle Tee and the Subtle Sweater going on sale sometime before the 90-minute mark of the show. So we're just breaking into our own conversation now, from the future.
- This is us from the future, coming back to tell you that we talk about AI for an hour and a half, accidentally, in this episode.

- And we realized, as purveyors of fine merchandise, that is a terrible way to structure the show. If you want to let people know that the incredibly popular and beloved Subtle Tee and Subtle Sweater are back for their one-time-a-year sale, leaving that until 90 minutes into the episode is doing a bad job of selling our products.
- So if you go to cortexmerch.com right now, you will find, until November 8th, 2022, a limited-time sale of the Subtle Tee and Subtle Sweater. We are bringing back all of our beloved colors: the original blue, black, green, and red. And we are adding a new color this year: gray.

- The best color.
- Now, I am really into this. So we did some general merch at Relay, and we added a bunch of colors for some stuff we were doing, including just a light gray sweater option for one of the shirts that we did. And as soon as I got it, I was like, oh God, I need this in a Subtle Tee. Because that light gray color, it's just like the traditional sweater color, right? So we now have that available, and it looks so good with the light blue stitching. So that's available in tees and in sweaters, with all of the other colors: red, green, blue, black, and gray. You can get any of them. This year, gray: I am going to be replacing my original blue sweatshirts, I've decided.

- Ooh, okay.

- 'Cause I mean, I've had those for like four years now or something, so I'm gonna get some new ones.

- You're probably right. From where I'm recording right now, I can look into my closet and, this is not an exaggeration, one third of my closet is Subtle Sweaters and Subtle Tees.
- It's the same for me. I'm wearing a green tee; the green's the sleeper hit. The green is so good. I'm buying a bunch more green.

- Green looks great.

- This happens to me every year. We get a new color, I buy one of each, and I'm like, god damn it, I wish I'd bought more of them. And then I also have to wait a year to get more of them. But that green, so good.
- For me, actually, surprisingly, it was the red. I bought some of the red just like, oh, it's good to have a complete set for me. I wear it a surprising amount, and I would never have picked that as a color for myself. But yeah, the Subtle Sweaters and the Subtle Tees, they're seriously so comfortable. We get just a ton of positive feedback from people who really like them, which is part of why we realized we've got to break into the show and remind you: "Hey, they're back. If you want them, you need to get them now. People love them. It's just this one-time sale. So go and order them right now."
- I cannot impress upon you enough that you do this. We will not have another episode come out to remind you to do this before the sale is over. It's a three-week sale from when the episode goes out. November 8th is when it's done, and then it's gone for a year. We only do this once a year. So if you want them, you need to go and get them at cortexmerch.com.
Cannot impress upon you enough, as well, how good these things look. A couple of days ago I was wearing the red one, and I was walking towards a glass door, and I was like, "Damn, that looks good." It's just good: our brand on a shirt like that, or on a sweatshirt like that, it just looks so professional. I'm so happy we did this as a thing. They are so great.
- I also feel like I'm trying to do the impossible thing. It happens every year that, when the sale is over, we get contacted by people who are going, "Oh, I want the shirts, how can I buy them?" And I'm like: I'm trying to talk to you right now, the person who's going to be sad in five weeks when they can't get them. You need to do this now. We're still going to be talking about AI later, but know that if you ever want these shirts, which you definitely do, if you want to wear the most comfortable sweater you have ever worn, go to cortexmerch.com right now and get yourself some fantastic clothing to wear. You will not regret it. Everybody loves these things, and they're sad when they're not on sale. So: cortexmerch.com, pause the podcast right now.
- Or let me give a secondary thing. If you're like, "But Grey, Myke, I'm driving!", here's what I'll tell you you can do. Imagine in your mind, now, something you see at the end of your commute. It might be a billboard, it might be a sign in your parking space, it might be your garage door, whatever. Imagine that thing right now, and tie, in your brain, the image of that thing with cortexmerch.com. Think about it right now. Say it in your mind a few times. So when you arrive at your destination, you have set yourself a reminder.
- Look, Myke's trying to pull some fancy memory palace stuff over here. You know, my method is much more direct. I just say, "Hey Siri, remind me in three hours to go to cortexmerch.com."

- Hey Google, remind me in three hours...

- Hey Google, remind me in three hours to go to cortexmerch.com. We're breaking into your life now. What are you gonna do? Cortexmerch.com.

- Okay, and now back to us in the past, talking about AI.
- Are we gonna have that long-term conversation?

- Yeah, yeah, we will, we will. But I think, just left over from last time, there's still a bunch of stuff in the near term that might be worth revisiting. One of the other things that came up last time is: how do these models work? How do they even begin to start creating anything, whether it's poetry or videos or computer code? Like, how does this work?
And fundamentally, these AI systems are made by just hoovering up a ton of information in the relevant domain and feeding it into the system for the system to be trained upon. And I do think one of the most concerning short-term questions about that is: what does it mean to use the public work that people have done, whatever that is? Say you've written a book and you've published it, or you've gone on to Stack Overflow and you've helped answer hundreds of people's questions about computer code, or you've been on DeviantArt for years and you've made images. That stuff has been sucked into a computer somewhere so that it can then produce imitations, or produce new work, based on what you have done. I think that's just a really difficult question.
- Human inspiration. So, one of the things that a lot of people brought up is: how is this any different to being inspired by someone's work and creating your own work? I feel like it is quite different, but what do you think?

- I think it's different, but it is hard to articulate why in a coherent way.

- I feel like I have something that I think is pretty core to me, but I also know a lot of people don't agree with me.

- What is that?
- The difference to me is that the skill required in acting on the imitation is the thing that I actually think is valuable.

- What do you mean by that sentence?
- Let's imagine... we'll go with painting, right? It's just a simple thing that we can all understand: how somebody paints a picture. If you look at a painting... we'll talk about the previously famous Parisian art piece, the Mona Lisa, right? And you wanna make your own Mona Lisa. To be able to take the inspiration from that piece and do it yourself, it is an imitation of previous work, but you had to do it: you had to practice, and get the skill, and build up your own level of skill to perform that work. Now, yes, all you have done is imitate it, but if you get even 50% close, you've made something that's interesting. You've now learned the skills so that you can maybe go out and produce your own work; you've built the actual skill, the practice. You've built the skill. That is what I think is the thing that concerns me most about this type of work: I worry that the skills will get lost.
And I accept that some people do not value that the same as me, but that's where I come from with this. I think that there is an inherent humanity in these mostly inconsequential skills that we hold as important as humans: practices, traditions, all of this kind of stuff. I hold those kinds of things dear, and my concern with a lot of this stuff is that we may lose this part of our humanity, across more and more people, if the creation of art is so simple.
Like, for example, I saw a comment on our YouTube video this morning when it went up, and this commenter had said that one of the things that they love about the idea of AI art is that there are movies that people want to exist that don't currently exist, and maybe in the future they would be able to type a prompt into an AI art generator and it would create that movie for them to watch. Honestly, I can't think of anything worse than that. Why? Because there's no art in there, there's no passion in there, there's no drive from a creator, from everyone involved in the creation of a movie, to come together and work towards something good. Now, you may sit and think to yourself, "Myke, I don't agree with what you're saying right now," and that is perfectly fine, but I just want people to understand, from my perspective: the creation of art is as important, if not more important, than the piece that is there at the end. And I think if all we end up with is just a bunch of pieces at the end, we will lose so much of the humanity in this work.
The ideas that somebody might have that spark off something, that create a different shot in a movie: that is what I actually hold to be so important to who we are as the human race, rather than just "here is media to consume." So, I don't know. I don't know if I'm expressing myself clearly, but I just want people to understand that the thing that I care about is the creation of the art.
And it's not even just about jobs. I just worry that we will lose this part of who we are. One of the things that makes us different from every other species on this planet is this kind of thing that we do, sometimes for pleasure... mostly for pleasure. There is business in it, but people like to make things because they just like to make them. That's something I find to be so beautiful, and I don't get the same sense of pleasure out of typing six complex sentences into a text field to then look at an image. Maybe I'm old-fashioned.
- If I'm trying to summarize your position (because I often have a hard time when people use language like "it's a fundamental part of our humanity"; I'm never quite sure what that means), I guess I'm trying to summarize it as: you think it is just a fundamental good that humans produce art, and that part of that process is the skills that have to be learned in order to make that art.
- I think that there is an importance in it. I can't tell you why, but it just feels... I don't know, there's a lot of emotion in it. And I just want people to understand that I'm not sitting here like, "Oh no, my job's gonna go away," because, realistically, it's not going to in my lifetime. I feel pretty confident about that. We'll get to why in a little bit; I have an example that I think will clear up why I'm not concerned about my own job.
I'm concerned more about creativity as an idea, something that I care greatly about: about people being creative, even if it's just for fun. And I'm just not sure that I like this idea that creativity will ultimately all become the same thing, which is: how good can you be at writing a prompt? That doesn't feel creative to me. The act, the process, doesn't feel like it exists in the same way anymore. I don't know.

- I agree. I find it strange, this argument that the new artistry will be in creating the prompts. I don't know, there's something very odd to me about that argument.
It kind of reminds me of a while ago, when AI systems started becoming the best chess players in the world. There was this, to me, always absolutely bizarre idea, I forget what they called it, but it was this concept of: oh, the best chess player will be a hybrid chess player, a person who is being advised by the computer. And there was a period in time where that was true, where a tag team of a human and a computer could beat the best computer, and they could beat the best human. That always seemed to me like a strange artifact: this won't exist forever, this just happens to be the situation right now, but I see no reason why the computers won't just ultimately outclass the human, and the human will just be like a monkey, right, adding absolutely nothing to this incredibly complicated game that's taking place. And it's not the best comparison, but I feel that there is something the same in this concept people have that the artistry will be in the prompts. If the computers are getting so good at text, at the same time that computers are getting so good at interpreting text to create art, why will they not just meet?
- Yeah, maybe that's what it is. It's like: why do you think you will be the best at coming up with the sequence of words that generates the most interesting art? I just don't... Because similarly, it's the same argument of, "Oh, the computer will make better art than any human ever can," right? That's kind of the thinking, or "just as good," or "good enough." Then why do you think computers won't be just as good as you are at creating prompts?
- Yeah, maybe that's it; you've kind of sharpened it up there. I find that argument strange. And even if, let's say, for whatever reason, that turned out not to be true, that it's built into the laws of the universe that humans are just great at writing prompts in ways that machines will never be (which I think is strange), let's just say it was true: at least in my experience of playing around with Diffusion Bee, I agree with you. I don't think there was really anything creative about what I was doing.
- It's problem solving. It's creative problem solving, but that's a completely different thing to what I care about, which is the process of practicing and getting good at a thing. I think that that's really important, and I'm gonna say it again: I think it's part of the human experience, I do. And it doesn't need to be that everybody becomes a painter, but we all have these things in our lives that we practice and get better at, right? Otherwise it's like, why don't I just give my game controller to a robot and then just watch what it does?
- That's where we're going. It's like WALL-E, right? This is how we get to WALL-E. This is how WALL-E happens: we're all just sitting in these chairs. Right now is the beginning of the path to WALL-E.

- Well, WALL-E if we're lucky, but yes.
- You know, I just feel like we have to do things, whatever it is, some kind of thing that we enjoy the process of. And I feel like, if all we're doing is saying that this stuff is just gonna replace filmmaking, I don't wanna watch those movies. I really don't. I like to be sold on something; I like to believe that a human was involved in the endeavors that I'm consuming. But again, maybe I'm old-fashioned, and I'm fine to accept that. I'm just trying to get across why these things are important to me, why I am so passionate about it.
- Yeah. I sort of have a minor point and a major point on that topic. I think last time I made some offhanded remark of, like, very soon, in major productions, we're going to see stuff that's AI-generated and not know it. And I didn't realize at the time, but, oh, that was already true.

- Didn't know about this.
- Yeah. I had watched the Obi-Wan Kenobi show on Disney+, and I didn't really think about it. But when I had watched Rogue One, I was very aware of listening to James Earl Jones do Darth Vader's lines: he's still Darth Vader, but he's getting too old for this. You can just hear it in someone's voice that they're older; that's just what happens as your vocal cords physically change. And when I watched Obi-Wan Kenobi, I never thought about it, and I also didn't think about the fact that I didn't think about it. And I realized, since we recorded that episode: oh, I didn't think about it because all of his lines were done with AI. There's a program called Respeecher that will take a voice actor's lines and redo them in the voice of someone else. And what's really quite remarkable about it is that it isn't just what you sort of think of with voice modulators on the phone, like, I can increase or decrease my pitch, but it still sounds like me. Everybody has vocal tics and things, and Respeecher will put in the vocal tics of the other person. It's not just making your words sound in their voice; it's making them sound like them. Just as an interesting thing to note, though: I did go back on YouTube and watch some of the original line deliveries of James Earl Jones in the original Star Wars, and I was like, "Ooh, he is better here!" He has funny little things that he does with a bunch of words that make those line deliveries really great, and there isn't as much of that in the Obi-Wan Kenobi show.
- I will say, when I watched Obi-Wan, there was something about Darth Vader's performance that I didn't like, and I assumed it was because James Earl Jones was getting older. But it turned out that wasn't the case. There was something that felt missing to me, and I just put it down to... I remember saying to Adina at the time, "I can't believe they're still getting this guy to do this. Why don't they just get, like, a voice actor?"

- Right, leave James Earl Jones alone.

- Yeah, well, it was just kind of like, this isn't gonna last forever with him, so they need a path. And they created one, and it did a fine job, but it still felt like it was missing something to me. I can't put my finger on what it was; it just didn't feel right. I don't know if this was part of it, whatever.
But what I will say is, this particular implementation of AI I'm fine with, because there are practical reasons for it. If we want Darth Vader to sound the same, which I think ideally we do, right? You could just get someone else to do it, but I would prefer it if he still sounded like James Earl Jones, 'cause it's an existing character. And James Earl Jones signed this away as an individual, while still alive.

- Right, which is key.
- He said: you can do this, you can take my voice and do whatever you want. Plus, working voice actors continue to get jobs being Darth Vader, and then they change the voice. For me, there are enough pieces of this puzzle where I'm like, I am fine with this, because people are involved in it. If Disney were just like, "We've decided we don't want to hire him anymore, we've created a thing, and we're just gonna type some text in and the AI's gonna spit out the Darth Vader lines," I would be like, "Mm, I don't like that, that doesn't feel good to me." But the way they have done this whole thing, going around it, I'm kind of fine with.
In the same way that I do find it kind of funny, really, that they continue to have Mark Hamill on set during the Mandalorian stuff to be Luke Skywalker, but then they completely digitally replace him with the younger version of him.

- Well, they do multiple things. They have him do a thing, then they have a look-alike actor do a thing, and then they put a digital recreation of his face on the young actor's face.

- Really, there is no point in him being here, right? We just like to have him around? I don't know, right? There is something funny to me about that, but at the same time, at least it is a respect of the person they are digitally recreating. Everyone is in on this, and that's fine, you know?
- I suspect that they're also using Respeecher to make Mark Hamill's voice sound like young Mark Hamill, because, again, he does not sound like he did when he was young.

- But he also doesn't sound like himself now, either, right? So they're doing something.

- Yeah, something is happening there. But this is also one of these key things: it's interesting to realize I had already watched an entire show where a major part of it was AI-created, and I didn't even notice; it didn't even cross my mind. And it also touches on what you're saying here, some of the key differences between what's happening in different parts of this world.
A lot of the AI art developments on the internet are just, like, a crazy Wild West, where people are just grabbing whatever they can. And that makes me extremely uncomfortable. Again, I will just say it in general ways: I saw a demo of a Siri-like voice assistant that was significantly better than things that currently exist. It was very interesting, but it was also using the voice of a famous actress, just in their demo mode. And I got very uncomfortable about that, 'cause it's like... she's a person who doesn't know that you've made a machine that can make her say anything.

- That's what we want!
Yeah, and it's like, "No, no, no!" And so a lot of the AI art stuff just feels like that,
00:46:37
◼
►
like it's just grabbed everything that exists. And the thing that I keep thinking of is,
00:46:44
◼
►
like it makes a marionette of everyone who's ever put out any kind of art in the public.
00:46:52
◼
►
And that's like a horror, right? That's just a completely horrifying concept that,
00:47:00
◼
►
if you have a social media timeline where you've posted stuff, guess what?
00:47:05
◼
►
There's enough information where someone can make a marionette of you, and it can do anything.
00:47:10
◼
►
And right now, like, this can happen without someone's permission.
00:47:13
◼
►
And yeah, it's totally why, like, the Darth Vader stuff I have no problem with,
00:47:17
◼
►
because presumably James Earl Jones was, like, fantastic.
00:47:20
◼
►
- I saw an article with that. That was, like, I read about that.
00:47:22
◼
►
It was the case that he approved this, signed the rights, like, he's good with it.
00:47:26
◼
►
There's a big distinction between giving permission for something like this to be done
00:47:32
◼
►
and just the power of AI to make a marionette of anyone.
00:47:39
◼
►
And there's something really horrifying about that.
00:47:42
◼
►
Now, I think it becomes... and other people brought this up as a kind of question: there's a boundary that's crossed between doing it to a living person, which is super bad (I just think that's really awful and immoral), and doing it to someone who's dead. It's less bad when the person's dead, but it can still be bad if it's recent. I don't quite know what "recent" means in this context, but clearly, when we get to a point where we're talking about modifying Leonardo da Vinci's art, I don't think there's anything bad there about making a marionette of Leonardo da Vinci, making a machine that can spit out a ton of Leonardo da Vinci-style paintings.
- I don't think it's possible to ascribe some kind of rule onto this. You just feel it when you feel it; I can't tell you what the right time period is, and it's also different in every case, and, you know, it goes on and on forever.
Like the Fast and Furious movie, right, where Paul Walker had just died, and it was like months later when the movie came out, but they used some digital recreation of his face to be able to give him, like, a tribute and send-off. And I think that there is a little... it's awkward. I think that movie came out too soon, really, but the whole family kind of seemed to agree, and there's something about... like, that was the thing.

- Same as Princess Leia, right?

- Yeah, that's what I was thinking. Princess Leia's send-off was in the next one after Carrie Fisher had died. Whereas still, I feel like we're on a fine line here, because it's weird. But it was recreated from stuff that she did participate in shooting, and at least there's, like, a tribute. But then if Princess Leia had just continued being a character after that, it'd be like: no, you've pushed it too far now. Because now that's the making of the marionette, right? She's dead now, but you're continuing to use her. That would be weird.
That would be weird.
00:49:46
◼
►
I don't think it's possible to put rules on the death of a person. I just know that I
00:49:52
◼
►
don't find it particularly comfortable for someone to profit off, either financially
00:49:59
◼
►
or publicity-wise, somebody who is dead. Like taking their actual work, and again, divorcing
00:50:07
◼
►
the work required in the copying, right? And just straight up, like, "I took this thing
00:50:14
◼
►
and I made this thing out of it.
00:50:16
◼
►
And I didn't really have to do anything
00:50:19
◼
►
other than put this image into the AI.
00:50:23
◼
►
There's uncomfortableness for me in that.
00:50:25
◼
►
- Yeah, I just wanted to highlight that, like with many discussions, you can run into areas where there's some kind of spectrum.

- Yeah. Someone taking the works of Shakespeare and putting them into an AI and making more Shakespeare, I don't care about that. But I can't tell you why I don't care about it, either. I just don't.
- Yeah. I do want to also add that this dances around a concept that I've been thinking about for years and have a hard time articulating, but for now I will just call it the importance of saying no in art. I think a lot of artistic projects, or the lives of creative people, are defined as much by what they did as by what they didn't do.

- A thousand nos for every yes.

- Yeah, that is a way to put it. But I think, having spoken to creative people, a lot of them have some internal set of things that they don't do, which is not obvious in the work that they create, but it lends a kind of character to their work.
I've also just recently gone through this, where I killed what was probably going to be a 20- or 30-minute video that was all storyboarded and written out. And part of the reason I killed it is because I realized this was a project to say no to. Yes, I could put it out, and I think people would like it, but it violated a couple of my own internal "I don't want to do this sort of thing in this way" rules, even if no one in the audience would have noticed. And man, that was a hard decision, but I think that that's what matters. And so, even while I think there's nothing morally wrong about making new Shakespeare plays or new Leonardo da Vinci paintings with AI, I do think that there's something about this concept of making an artist do anything that you want that demeans the limits that that person put on their own things.
- And I'm also like not sure why you even want it.
00:52:41
◼
►
- Well, I think this is also just a side effect
00:52:43
◼
►
of how easy it can be and will be to create this stuff.
00:52:46
◼
►
Where just like out of curiosity, you type some prompts in.
00:52:50
◼
►
I guess-- - Curiosity is different
00:52:52
◼
►
though, right? - But I mean, it's like
00:52:53
◼
►
when this gets good enough that it can just like
00:52:55
◼
►
make a movie where some things happen.
00:52:57
◼
►
I don't know, this idea of the importance of saying no
00:53:00
◼
►
is also connected to a thing that I see with a lot of,
00:53:02
◼
►
particularly TV shows and long running creative projects,
00:53:07
◼
►
where I think of it as: you can see the audience influence the creators, and that never works.
00:53:14
◼
►
And there's lots of shows where it's like, "Oh, this is the moment. I can see now that the
00:53:19
◼
►
creators are aware of the audience." And with creative projects, the audience will ask for
00:53:27
◼
►
things that they want that they don't realize in the long run actually destroys or makes worse the
00:53:34
◼
►
thing that they want. And, like... I think that can sound super snobbish,
00:53:39
◼
►
but I think it's true. And I've seen it in enough projects, where it's like: the audience wants
00:53:42
◼
►
something and the creator then goes, Oh, I'll do that thing and the audience responds and
00:53:48
◼
►
you get into this little positive feedback loop of doing the things that the audience
00:53:51
◼
►
wants and none of those decisions mattered individually, but cumulatively they can make
00:53:57
◼
►
a thing much worse in a way that's hard to pin down. And I just think there's something
00:54:02
◼
►
in AI art where even when there's no problem, I think people can like destroy the things
00:54:10
◼
►
that they love because there's no one to say no. There's no creator who says, "These
00:54:16
◼
►
are the limits of my thing." And again, I keep thinking of the Miyazaki movies as a
00:54:22
◼
►
particular example of this. Like if you can make a Miyazaki movie about anything you want,
00:54:28
◼
►
It kind of destroys what those movies are, unless you're moved by the argument that it's important
00:54:34
◼
►
for humans to do these sorts of things.
00:54:37
◼
►
But I can easily imagine a situation where even if the AI is able to make amazing art,
00:54:44
◼
►
it's actually kind of worse for everyone involved, even though it's a thing that the audience
00:54:50
◼
►
has asked for or that people go like, Oh man, I wish I could continue the series of movies
00:54:55
◼
►
forever and I can do it by typing into the machine and it will make the movie for me.
00:54:58
◼
►
It's like you'll ruin the thing you love by doing that, by getting what you want all the
00:55:03
◼
►
time without a creative mind to say like, "No, this is the limits of this project" or "No,
00:55:10
◼
►
this is when the story stopped."
00:55:11
◼
►
Yeah, I mostly agree with that statement. Like, we've spoken about this a lot, right?
00:55:17
◼
►
Like that idea of the point where the artist becomes aware of the audience. We speak about
00:55:24
◼
►
this a lot in regards to, like, personal TV show recommendations. And, you know,
00:55:29
◼
►
I know you are very sensitive to this; it's something that I'm familiar with.
00:55:33
◼
►
You point this out and I'm like, oh, okay, this is an interesting idea for why the show
00:55:37
◼
►
may have gotten bad in that season; it's not really something I would have thought
00:55:40
◼
►
of. And I will say that I mostly agree with your thinking here, but not completely. I think
00:55:48
◼
►
my kind of interpretation of this idea that you have is, like, just to add the word
00:55:55
◼
►
"mostly" to it.
00:55:58
◼
►
Like... and I'm not saying that you say this, but the way that it sounds makes
00:56:01
◼
►
it sound like a definitive, like that listening to the audience or seeing what the audience
00:56:07
◼
►
want can never make something better, but I don't think it's that way, and I'm not sure
00:56:12
◼
►
you think it's that way, but I just want to state that, right?
00:56:17
◼
►
There is definitely feedback which helps make something better, and I do this all
00:56:23
◼
►
the time. But sometimes people will ask me to talk about a thing,
00:56:28
◼
►
and I just know it won't be interesting, because I'm not the person to do that,
00:56:35
◼
►
right? And that tends to be a lot of it for me: you're telling me you
00:56:39
◼
►
want a thing; I know you won't want it, because I know it's not gonna be
00:56:43
◼
►
interesting, because either I don't find it interesting, or I don't have the
00:56:49
◼
►
knowledge, or whatever it might be, you know? Like, there is that part of the
00:56:53
◼
►
artist... oh god, I just called myself an artist. There is that part of the
00:56:58
◼
►
creative person, right, where they are aware of what they're good at and what
00:57:03
◼
►
they're not, and what they think will be ultimately enjoyable for this type of
00:57:08
◼
►
stuff that they create. Yeah. And then they go out and make it, obviously. Like,
00:57:12
◼
►
this is a different kind of thing.
00:57:14
◼
►
I also think it's the bounds of the project.
00:57:18
◼
►
And I found, I just recently stumbled upon to me what is just the most perfect
00:57:24
◼
►
example of the artist saying no, which is also a little bit heartbreaking for
00:57:30
◼
►
reasons that will be obvious when I mention who it is: J.R.R. Tolkien.
00:57:34
◼
►
I just discovered, I can't believe I never knew this.
00:57:37
◼
►
He started a book that was set after the Lord of the Rings.
00:57:42
◼
►
And so he wrote the first couple of chapters of this story.
00:57:48
◼
►
And it's quite interesting. But
00:57:51
◼
►
basically,
00:57:56
◼
►
he wrote in one of his letters,
00:57:57
◼
►
you can get these books of all of his private letters
00:57:59
◼
►
where he discusses a bunch of his thoughts.
00:58:01
◼
►
He wrote about why he didn't finish this.
00:58:04
◼
►
And the reason he didn't finish it
00:58:06
◼
►
is because, oh, in his timeline after The Lord of the Rings,
00:58:10
◼
►
there's not really much magic left in the world.
00:58:14
◼
►
He had an idea of who the story would be about,
00:58:16
◼
►
but basically all the magic is gone,
00:58:18
◼
►
and it's just the world of men.
00:58:20
◼
►
And he said, "Oh, I could have written
00:58:22
◼
►
"a totally fine adventure book,
00:58:26
◼
►
"but that's not what the world
00:58:28
◼
►
"I was trying to create was about."
00:58:31
◼
►
He was trying to create this mythology.
00:58:34
◼
►
And writing another story about what happens
00:58:38
◼
►
after the mythology, he was like,
00:58:39
◼
►
"Wait, why am I doing this project?
00:58:41
◼
►
This isn't really what I want to do.
00:58:43
◼
►
Sure, we can continue what happens
00:58:44
◼
►
with a bunch of these characters, but I'm gonna say no."
00:58:47
◼
►
And so he just like stopped writing it and said,
00:58:49
◼
►
"No, I'm not gonna write anymore
00:58:51
◼
►
the stories of what happens after.
00:58:52
◼
►
I'll just fill in some of the details
00:58:54
◼
►
of what happened before."
00:58:55
◼
►
And I think that's an amazing example.
00:58:57
◼
►
Reading his letter about it,
00:58:58
◼
►
he's very self-aware.
00:59:01
◼
►
Sure, the audience would love this.
00:59:03
◼
►
Everybody's asking for like a sequel to Lord of the Rings because it was so great,
00:59:07
◼
►
but it's also so great because he knew where to stop.
00:59:10
◼
►
I just think that's an interesting example and it would just make me very sad if people
00:59:15
◼
►
were typing into AI generator machines like, "Give me Tolkien's next book!"
00:59:20
◼
►
Like, "Oh no, please don't.
00:59:22
◼
►
Please don't do that.
00:59:24
◼
►
This episode of Cortex is brought to you by Squarespace, the all-in-one platform for building
00:59:29
◼
►
your brand and growing your business online.
00:59:31
◼
►
You can stand out with a beautiful website, engage with your audience and sell your products,
00:59:36
◼
►
services, the content that you create, whatever it is, Squarespace has got you covered.
00:59:41
◼
►
No matter what type of website you're looking to make, whether it's something for a personal
00:59:45
◼
►
event, maybe you have a business site that you want to create, it doesn't matter, Squarespace
00:59:49
◼
►
can help you.
00:59:50
◼
►
They've got you covered.
00:59:51
◼
►
And it starts with one of their beautiful templates.
00:59:54
◼
►
They're best in class, they're award winning.
00:59:57
◼
►
It is as easy as browsing the type of site or category of business you want to make a
01:00:01
◼
►
website for, and that will give you the perfect starting place for you to then customise with just
01:00:05
◼
►
a few clicks and really make it feel like your own.
01:00:09
◼
►
And then once your website expands and maybe you want to add that online store, you can
01:00:13
◼
►
sell whatever you want, physical, digital goods, it doesn't matter.
01:00:17
◼
►
They have all of the tools that you need to start selling online.
01:00:20
◼
►
And then as your website grows and your visitors start pouring in, you can use insights to
01:00:25
◼
►
grow your business.
01:00:26
◼
►
If you want to know where your site visitors are coming from, where your sales are generating
01:00:30
◼
►
from, which channels are most effective for you, you can analyse all of this in Squarespace.
01:00:35
◼
►
Then when you have the data, you can improve your website and build a marketing strategy
01:00:39
◼
►
based on your top keywords or the most popular products and content you have available.
01:00:43
◼
►
I love Squarespace. I have been using Squarespace for nearly 15 years now for various projects
01:00:50
◼
►
and I wouldn't go anywhere else. When I want to put something online, they're the first
01:00:54
◼
►
place I go. I know how to use it, it's so easy, everything looks great on every type
01:00:58
◼
►
of device, and they have awesome apps for you to manage everything. It's the full package,
01:01:03
◼
►
I love it and I'm confident you're going to love it too. But you don't have to just take
01:01:07
◼
►
my word for it. Go to squarespace.com/cortex and you can sign up for a free trial, no credit
01:01:12
◼
►
card required. You can go in and build your whole website. Then when you want to launch
01:01:16
◼
►
it to the world, use the offer code "CORTEX" and you will get 10% off your first purchase
01:01:20
◼
►
of a website or domain. That's squarespace.com/cortex and then when you sign up, use the offer code
01:01:26
◼
►
Cortex to get that 10% off your first purchase and show your support for the show.
01:01:30
◼
►
Our thanks to Squarespace for their continued support of Cortex and Relay FM.
01:01:34
◼
►
Hey, do you want to talk about this podcast that you found here, Myke?
01:01:38
◼
►
So I referenced why I'm not worried about my job specifically. This has been going around
01:01:44
◼
►
a lot. It is an AI Steve Jobs on an AI Joe Rogan episode. And it is a full podcast created
01:01:54
◼
►
by an AI. I've not really bothered to look into this because I actually think it might
01:01:59
◼
►
be part of a publicity stunt. But nevertheless, it is a thing that exists, it's 20 minutes
01:02:05
◼
►
long and you can listen to it and I've like skipped around in it and you know what? It
01:02:09
◼
►
really sounds like Steve Jobs. Like there are points where you can hear that it's not
01:02:13
◼
►
real, right? But ultimately my takeaway from this is: who wants this? Like, genuinely,
01:02:19
◼
►
like who wants this? Who wants to hear what Steve Jobs might have said to a fake Joe Rogan?
01:02:25
◼
►
For me, I don't feel concerned about my job because you can take a version of me and a version of you
01:02:33
◼
►
and we can have them make podcasts forever, but I feel like if that's what you want, I don't know
01:02:43
◼
►
how much you could enjoy the content because it's not real. It's not real conversations.
01:02:48
◼
►
That's what I do for a living, is real conversations between real people about
01:02:52
◼
►
things that they care about and my assumption is the majority of people that listen to my shows
01:02:58
◼
►
want to hear that, rather than "let's imagine what two AIs might be talking about" instead. For me, that's
01:03:06
◼
►
just so broken from what I imagine people want the content for. And look, if you are that person, you
01:03:14
◼
►
don't need to tell me; of course I know you exist, right? Like, I'm sure that there are people that
01:03:18
◼
►
would like to just have us on in the background so they could go to sleep and it's just like we're
01:03:22
◼
►
We're just going to have an infinite amount of episodes forever.
01:03:25
◼
►
But realistically, I don't imagine that that's going to be a thing that people
01:03:32
◼
►
would genuinely care about enough in the way that people might care
01:03:35
◼
►
about the content that I make.
01:03:36
◼
►
So just coming across this one episode, it honestly made me, even though it
01:03:42
◼
►
showed me it can be done, it made me feel more secure in my own profession.
01:03:45
◼
►
But honestly, in the last episode, I wasn't worrying about my own profession.
01:03:50
◼
►
It was more about the idea of creativity in people.
01:03:53
◼
►
That's what I care more about.
01:03:54
◼
►
- You said this to me and I took a listen.
01:03:56
◼
►
I did the same thing.
01:03:57
◼
►
I didn't listen to the whole thing all the way through.
01:03:59
◼
►
I sort of skipped around.
01:04:01
◼
►
I have a slightly different take on this sort of stuff,
01:04:03
◼
►
which is it doesn't matter how good Steve Jobs
01:04:07
◼
►
on Joe Rogan is to listen to.
01:04:10
◼
►
It's more of just a demonstration of proof of concept.
01:04:14
◼
►
This thing is possible.
01:04:15
◼
►
And once you have a demonstration of proof of concept,
01:04:18
◼
►
these things get better, they don't get worse. So this was just like the first real proof of concept
01:04:24
◼
►
demonstration of two people having a podcast conversation. I do have to say, listening
01:04:29
◼
►
to it, I think the Steve Jobs was less good, but like holy moly did it nail Joe Rogan's
01:04:36
◼
►
way of talking.
01:04:37
◼
►
That makes sense though, right? The amount of source material for Joe Rogan is almost
01:04:41
◼
►
infinite really.
01:04:42
◼
►
Yeah, I just thought like it's actually interesting to listen to because you can almost
01:04:46
◼
►
hear the fact that there's what, 100 million hours of Joe Rogan talking and, you know,
01:04:52
◼
►
the database for Steve Jobs is so much smaller. And you can hear that in the two voices. It's
01:04:57
◼
►
like, especially because I think Rogan has a funny circular way of talking sometimes
01:05:02
◼
►
that it's like, I would never know that that wasn't him. If you told me it was Rogan,
01:05:07
◼
►
but it was only just an AI Jobs, I would have believed it. But this to me is, again, a good
01:05:13
◼
►
example of this thing where making a marionette of someone is just bad. It's like, "Hey
01:05:19
◼
►
guys, whatever you think of Rogan, he's still a person and it's real bad to make
01:05:23
◼
►
like a fake show where he's talking. And it's less bad for Steve Jobs because he
01:05:31
◼
►
is dead, but it's still close enough that it makes me very uncomfortable." I just
01:05:36
◼
►
think we're going to see an increasingly large number of these sorts of things where
01:05:40
◼
►
people can make whatever they want and it's only going to become increasingly easy over
01:05:45
◼
►
time and I don't know how this project was made in particular but yeah it's the first
01:05:50
◼
►
example of oh it's an AI podcast with two people where we have enough information to
01:05:57
◼
►
recreate them in some sense.
01:05:59
◼
►
Yeah, it's a publicity stunt for the company. The company that made this has an AI text-
01:06:05
◼
►
to-voice generation system that they are trying to sell.
01:06:08
◼
►
I mean that's not surprising.
01:06:10
◼
►
But it's also interesting just how fast so much of this is being commercialized.
01:06:15
◼
►
I stumbled across a – I thought, "Oh God, how brutal is this?"
01:06:21
◼
►
But it's a company that makes AI people who will quite convincingly read scripts where
01:06:27
◼
►
it looks like it's a talking head segment.
01:06:29
◼
►
And the whole idea is, "Oh, you can have all of your corporate training material delivered
01:06:33
◼
►
in this way where there's like an AI person who will talk through whatever it is you need
01:06:37
◼
►
to onboard your new employees. There's something about that to me which is like, it's very
01:06:41
◼
►
convincing but it also feels like what a horrible dystopian nightmare. It's like you as an
01:06:46
◼
►
employee signing up with the company, and it's like, "Oh hey, we didn't even take the time to
01:06:52
◼
►
film a person going through our own training materials. We just gave it to an AI and they
01:06:57
◼
►
made a fake person who you get to listen to explain your job to you."
01:07:01
◼
►
Enjoy your training.
01:07:02
◼
►
I know, it's just so awful.
01:07:04
◼
►
We value you.
01:07:06
◼
►
I just want to read from the podcast AI, their kind of description of the show.
01:07:10
◼
►
"Whether you're a machine learning enthusiast, just want to hear your favorite topics covered
01:07:16
◼
►
in a new way, or even just want to listen to voices from the past brought back to life,
01:07:20
◼
►
this is the podcast for you."
01:07:22
◼
►
- Right. - Alright.
01:07:25
◼
►
- Do you want to hear our ghoulish marionettes say what we made them say?
01:07:30
◼
►
- It's just like, if you are a fan of Steve Jobs, which I think is who they are pitching this to,
01:07:35
◼
►
right? Why do you want to hear him talk about things he never spoke about? Like what do you
01:07:39
◼
►
value from that other than just like hearing the voice? Like if you just want to hear his voice
01:07:44
◼
►
just go to YouTube and watch, like, commencement speeches or interviews or whatever. Like, I don't
01:07:49
◼
►
understand why you want to hear him talk about things he never spoke about. It's not his opinion.
01:07:56
◼
►
There is no opinion of him in this. They've built all of this from things he said but you're just
01:08:03
◼
►
taking words he used and just putting them together in a new way. It's not actually
01:08:08
◼
►
his opinion. And I just find it so strange.
01:08:15
◼
►
Since the last episode, I have much more strongly onboarded the concept of trying to only read
01:08:23
◼
►
and consume and listen to and watch media that you know has been produced by a human.
01:08:29
◼
►
And how did that do for you with Obu?
01:08:31
◼
►
You know what I mean?
01:08:33
◼
►
- Yeah, it sneaks in there, but at least it's like,
01:08:36
◼
►
oh, I know humans made the show.
01:08:39
◼
►
- Yeah, at least you got that part, right?
01:08:42
◼
►
- Maybe not all humans, all of it all the time,
01:08:43
◼
►
but like enough.
01:08:44
◼
►
- Yeah, and so again, this idea that went from like,
01:08:48
◼
►
that's crazy to, oh, I guess this is wise advice to live by
01:08:51
◼
►
has accelerated quite quickly in my life.
01:08:54
◼
►
And yeah, it's kind of a weird,
01:08:57
◼
►
oh, make sure you know it's a person
01:08:59
◼
►
who wrote or made this thing.
01:09:01
◼
►
So yeah, I am trying to onboard that as a concept
01:09:05
◼
►
because I think not onboarding that as a concept
01:09:07
◼
►
is part of what is going to lead to AI doom for us all.
01:09:12
◼
►
- Oh God, we still want to talk about AI.
01:09:14
◼
►
All right, yep, okay.
01:09:16
◼
►
This episode of Cortex is brought to you by Fitbod.
01:09:19
◼
►
Between balancing your work life, your family life,
01:09:22
◼
►
and everything else going on in your life,
01:09:24
◼
►
it can be hard to make fitness a priority.
01:09:26
◼
►
Personal fitness shouldn't be about competing with people.
01:09:30
◼
►
What you need is a program that is tailored for you, working with you and not against
01:09:34
◼
►
you, and that's why you need Fitbod.
01:09:37
◼
►
You don't want to have to look to other people and do what they're doing.
01:09:40
◼
►
You don't want to have to be told exactly what to do.
01:09:42
◼
►
You don't want to have to do these boring things every single day.
01:09:45
◼
►
You don't want that rigid structure.
01:09:47
◼
►
You want something that is flexible, something that adjusts, that is dynamic, that is your
01:09:51
◼
►
own fitness plan that you can have access to from anywhere. In a beautiful app of wonderful
01:09:58
◼
►
HD video tutorials shot from multiple angles so every exercise you want to learn is super
01:10:04
◼
►
simple. This is all why you need Fitbod. They use data to create and adjust that plan for
01:10:10
◼
►
you. It's crafted to be unique for you exactly. Their app is super easy to learn. It integrates
01:10:18
◼
►
with your smartwatch, your Apple watch, your Wear OS watch, it integrates with apps like
01:10:22
◼
►
Strava, Fitbit and Apple Health. Everything's super awesome. I love Fitbod. It really set
01:10:27
◼
►
me on a different path to my fitness journey when I started using it. What I really liked
01:10:32
◼
►
was that there was a variety there. Like that variety was super important to me. Previously
01:10:37
◼
►
I'd used apps and videos and stuff, and it's just the same thing every day on a phone. That's
01:10:40
◼
►
really boring. But I was learning new things, while at the same time it's also mixing up
01:10:46
◼
►
with those variety of exercises so you're not overworking or underworking anything.
01:10:50
◼
►
It's really clever. Personalized training of this quality can be expensive. Fitbod is just
01:10:55
◼
►
$12.99 a month or $79.99 a year, but you can get 25% off your membership by going and signing up today
01:11:02
◼
►
at fitbod.me/cortex. So go now and get your customized fitness plan at fitbod.me/cortex.
01:11:09
◼
►
That is 25% off that you will get by signing up at FITBOD.ME/CORTEX.
01:11:16
◼
►
Our thanks to Fitbod for their support of this show and Relay FM.
01:11:20
◼
►
Yes, I do want to talk about Doom for us all, but partly because it's clearing up a little
01:11:26
◼
►
bit of confusion from last time as well.
01:11:29
◼
►
So one of the things we discussed last time was stable diffusion making images like,
01:11:36
◼
►
here is the filming of NASA landing on the moon
01:11:40
◼
►
and creating fake stuff.
01:11:42
◼
►
And I think a totally fair criticism of that
01:11:44
◼
►
that I saw from people is,
01:11:46
◼
►
hey, Photoshop exists, you know,
01:11:50
◼
►
like we can make those images now,
01:11:52
◼
►
we don't need an AI art system to generate them for us.
01:11:56
◼
►
If you would be convinced by a good Photoshop,
01:11:59
◼
►
what's the difference from being convinced
01:12:01
◼
►
by an AI art system about something
01:12:03
◼
►
that isn't real in the world?
01:12:06
◼
►
So what I want to portray here is I think Doom comes in like three phases.
01:12:11
◼
►
Phase one, I think will be just pollution of the public information space.
01:12:18
◼
►
I think it's sort of a general confusion.
01:12:20
◼
►
And what makes the difference between AI art and something like Photoshop is the scale
01:12:29
◼
►
and the cost. So right now, if someone wants to try and put out a bunch of misinformation
01:12:37
◼
►
or create evidence to back a conspiracy theory, there still needs to be effort that goes into
01:12:43
◼
►
creating that thing. Like you were saying, Myke, you need to learn these skills as part of your art
01:12:48
◼
►
for how to create a piece of misinformation to put out in the world. And I think what I kind of
01:12:56
◼
►
expect, if I'm projecting a bunch of this stuff forward, is that the scale of the potentially
01:13:03
◼
►
misleading images and misleading text that are AI generated will be just vastly beyond what
01:13:10
◼
►
we can imagine now. And I think in the modern world, like a lot of people have been driven
01:13:16
◼
►
kind of crazy just from the selection of what information they're presented. Like, you know,
01:13:26
◼
►
people on social media can kind of like drive themselves crazy by just going down rabbit
01:13:30
◼
►
holes and being continually presented with information that agrees with them. And that's
01:13:34
◼
►
just by like, selecting the things that they're seeing that actual human beings have created.
01:13:40
◼
►
But what I'm kind of thinking might happen here is that when you're able to generate
01:13:45
◼
►
a huge amount of content. Just as right now, companies intentionally A/B test what they
01:13:53
◼
►
show you for engagement. Not even on purpose, but AI generated content will effectively
01:14:01
◼
►
be unintentionally A/B tested for convincingness: how convincing is this sequence of words
01:14:09
◼
►
in whatever idea it's trying to spread. And I really do view a lot of this stuff as the
01:14:17
◼
►
kind of concept of memes, of ideas. They evolve, and they spread, and they mutate, and their
01:14:24
◼
►
spreading in the world doesn't have anything to do with how true they are, it has to do
01:14:29
◼
►
with how convincing they are.
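To make that selection dynamic concrete, here is a deliberately toy sketch in Python. Everything in it is invented for illustration: each "idea" is reduced to a single convincingness score, variants are generated, and the most engaging ones survive. Note that nothing in the loop ever measures truth.

```python
import random

def mutate(score):
    """Stand-in for an AI rewriting an idea: returns a slightly tweaked variant."""
    return score + random.gauss(0, 0.1)

# Each "idea" is just a convincingness score; truth appears nowhere in the model.
population = [random.random() for _ in range(100)]

for generation in range(50):
    population.sort(reverse=True)
    survivors = population[:20]                      # selection purely by engagement
    population = [mutate(s) for s in survivors for _ in range(5)]

print(f"mean convincingness after selection: {sum(population) / len(population):.2f}")
```

Run long enough, the surviving variants drift toward whatever spreads best, which is the point being made here.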
01:14:31
◼
►
Imagine hooking up Twitter's algorithm to a text AI.
01:14:36
◼
►
Yeah, I think that's going to happen.
01:14:37
◼
►
Every time you pull to refresh, it just gives you a bunch of other nonsense.
01:14:43
◼
►
That is truly horrifying to me as a thought.
01:14:47
◼
►
I think we're going to see that.
01:14:48
◼
►
I think we're going to see that or we're going to see something very much like it.
01:14:51
◼
►
I mean, look, the AI video stuff that we saw before.
01:14:55
◼
►
Do you think TikTok won't start doing that the moment that it becomes engaging to people?
01:15:01
◼
►
That's a really good point.
01:15:02
◼
►
That's actually the most likely of all of them, I think, to occur.
01:15:06
◼
►
to just automatically generate 20 second, 30 second videos every time a person refreshes
01:15:12
◼
►
and just keep doing it for whatever keeps that person engaged.
01:15:16
◼
►
I think we're going to see that.
01:15:18
◼
►
Even if it's not the companies directly doing it themselves, you'll have entities on Twitter
01:15:23
◼
►
where it's like, "Oh, it's a bot, but it's, you know, it's acting like it's a person and
01:15:27
◼
►
it's just existing in the world and it's doing this kind of unintentional A/B testing for
01:15:32
◼
►
convincing this."
01:15:33
◼
►
and the ones that are more convincing for whatever reason will just spread better.
01:15:37
◼
►
So I really think that is a kind of... I think a lot of the criticism is people want
01:15:44
◼
►
to know, like: Grey, you love technology, why aren't you behind this stuff? I used to
01:15:50
◼
►
be a real technological optimist. But I've changed my mind on a bunch of that stuff.
01:15:55
◼
►
And this is one area in particular, where, and I want to be clear here, I don't think
01:16:01
◼
►
it's malicious.
01:16:02
◼
►
I don't think it's necessarily that someone's out there trying to do bad, but I think the
01:16:07
◼
►
ability to create hundreds of millions of memes in the way of like just a concept that
01:16:15
◼
►
can spread at the drop of a hat is just bad for the public information space.
01:16:22
◼
►
And the world has barely survived social media in some ways, like that has made stuff so
01:16:29
◼
►
bad with how people think about the world. And this is that just taken to the next level
01:16:34
◼
►
by a huge order of magnitude. So I suspect that'll be one of the first ways that this
01:16:41
◼
►
becomes obviously bad over time. Like once it progresses out of the stage of just being
01:16:49
◼
►
an interesting toy. My hope is that for whatever reason, we're in the part of the technological
01:16:55
◼
►
development where it looks like it's an exponential graph, but it's rapidly going
01:17:00
◼
►
to level off at an s-curve, and we discover "oh, there's parts of this that were way
01:17:05
◼
►
harder than we thought." That's where I hope this is going, but…
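For what it's worth, that hope has a standard mathematical form; this is a general observation, not anything from the conversation itself. A logistic (S-) curve is indistinguishable from an exponential in its early phase:

$$
f(t) = \frac{L}{1 + e^{-k(t - t_0)}}, \qquad t \ll t_0 \;\Rightarrow\; e^{-k(t - t_0)} \gg 1 \;\Rightarrow\; f(t) \approx L\, e^{k(t - t_0)}
$$

So early data alone cannot tell the two apart; the leveling-off toward the ceiling $L$ only becomes visible near the inflection point $t_0$.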
01:17:09
◼
►
You show me the signs of that one on the next episode, then.
01:17:12
◼
►
Well, I'm in this position where I feel
01:17:15
◼
►
a little bit like "oh, I'm like a crazy person talking into a microphone about this
01:17:19
◼
►
stuff. But I have spoken to some of the top people in the world in this area and they're
01:17:28
◼
►
very concerned is the way that I would put it. And talking to people, I was trying to
01:17:34
◼
►
tease out this concept of where do we think we are on this curve? Is this the start of
01:17:39
◼
►
the exponential or are there obvious problems ahead? And the answer was pretty universally,
01:17:45
◼
►
Oh, as far as we can tell, the exponential has barely begun. Because part of
01:17:51
◼
►
the reason it's making such fast progress is because the work being done right now
01:17:57
◼
►
is still in the realm of: oh hey, you get into work, and what's the first thing
01:18:01
◼
►
you think of that could make this better? And you try it, and it makes it better. That's
01:18:05
◼
►
just like an indication that we're at the start of an exponential curve. So yeah, I
01:18:09
◼
►
think we're gonna have like a bunch of confusion about this stuff. I think that transitions
01:18:14
◼
►
into genuine economic problems as particularly language models get better and better and better
01:18:23
◼
►
at doing all the kinds of work that humans do, which is largely knowledge work.
01:18:30
◼
►
You know, and it's funny, like I made "Humans Need Not Apply" eight years ago now, I think?
01:18:37
◼
►
And it's been on the back of my mind about, you know, revisiting that at some point.
01:18:41
◼
►
And I was kind of thinking like, oh, I don't know how relevant this video still is. But having
01:18:48
◼
►
rewatched it, it's like: oh no, I put it back to be featured on my channel, under one of
01:18:52
◼
►
the most watched videos, because it's like, no, no, all this AI stuff and all of the language
01:18:56
◼
►
stuff makes this way more concerning. I think the lesson learned there has simply been that
01:19:02
◼
►
physical automation is slower to progress for a bunch of reasons. But all of the knowledge
01:19:08
◼
►
worker stuff is coming along very very fast. I just know that there are companies that
01:19:13
◼
►
are very explicitly targeting low-level knowledge work and then will be progressing further
01:19:19
◼
►
and further up the chain as fast as they can with better and better language models to
01:19:23
◼
►
do all sorts of things that people can do. I understand that lots of people just fundamentally
01:19:28
◼
►
disagree with me on this point of economics that jobs can't be replaced because humans
01:19:34
◼
►
have infinite needs and always want more things. I understand that argument, I just don't
01:19:41
◼
►
agree with it, and I think AI just breaks some of the fundamental assumptions that are
01:19:45
◼
►
built into that model of as we get better machines of all kinds, we just increase the
01:19:51
◼
►
quality of life and increase our desires. Like, I just don't think that that's universally
01:19:55
◼
►
true. So yeah, I think we're going to end up with some really major problems in the
01:20:01
◼
►
the economy, particularly in the knowledge worker part of that economy.
01:20:06
◼
►
And look, we don't need to talk about it today because we've talked about it enough,
01:20:10
◼
►
but I think once you start encroaching on AI systems that are good enough to replace
01:20:15
◼
►
most human work, you really start encroaching on the kinds of things that can lead to the
01:20:21
◼
►
extinction of the human race.
01:20:24
◼
►
Maybe that's a little too heavy for today, but that's kind of my having thought it
01:20:28
◼
►
through of like, what are the three phases of where does this go?
01:20:31
◼
►
– confusion of the public information space, destruction of the economy, extinction of
01:20:36
◼
►
the species. It's one, two, three.
01:20:40
◼
►
How you feeling, Myke?
01:20:45
◼
►
No, I didn't want to do another hour and a half on this, to be honest. But I guess
01:20:50
◼
►
this is who we are now.
01:20:52
◼
►
Yeah. Can I just ask you though, like, how crazy does that sound to you?
01:20:55
◼
►
No, it doesn't. And that's why I don't like it. I don't have optimism about this
01:20:59
◼
►
area of technology. I do not think that this is a thing that will produce much good.
01:21:05
◼
►
We can leave it there for now then. On with the show. It's only been an hour and a half.
01:21:17
◼
►
Just some quick follow up from the last thing we talked about on the previous episode. Okay,
01:21:21
◼
►
on the lighter side of things, what is a podcast, Myke?
01:21:25
◼
►
I don't know why. Why are we doing this one again, too? What is this?
01:21:30
◼
►
We just relive the horrors of the previous episode?
01:21:32
◼
►
I feel so badly for you, Myke. Last episode, you stumbled into a terrible mistake,
01:21:39
◼
►
which was attempting to describe... but what is a podcast?
01:21:43
◼
►
Well, the real issue was I put two topics together that seemed related, but they were unrelated.
01:21:50
◼
►
Right, yeah. This is always the dangers of speaking
01:21:53
◼
►
extemporaneously and you're just sort of like in the middle of a conversation and you say some
01:21:57
◼
►
things. And so, anyway, we ended up talking for a while about, like, but what is a podcast? What must
01:22:03
◼
►
it be in order for it to be the platonic ideal of a podcast? And this caught people's attention,
01:22:09
◼
►
and you can click the link in the show notes. A Cortexan sent in what I absolutely adored:
01:22:16
◼
►
I saw this podcast alignment chart, and I read it and was like, I think I agree with absolutely
01:22:22
◼
►
everything on this chart. This nails what I wanted, I think, to kind of get across,
01:22:27
◼
►
but maybe didn't do a great job of doing, I don't know.
01:22:29
◼
►
So in the great tradition of memes, there's a meme where people make the alignment chart for
01:22:36
◼
►
various things, and I think this started with what is still the fantastic example of the sandwich
01:22:42
◼
►
alignment chart. So you have two axes in the chart for the sandwich, which are ingredients
01:22:49
◼
►
and structure. So it ranges from ingredient purist to ingredient rebel, and then you have
01:22:56
◼
►
structure purist. A sandwich must have the classic sandwich shape, two pieces of bread
01:23:01
◼
►
with toppings in between. And then like structure rebel, any kind of food enveloped in any way is a
01:23:07
◼
►
sandwich. Oh, like a hot dog? Yeah, so this is like, is a hot dog a sandwich? And hot dog falls
01:23:13
◼
►
on the sandwich alignment chart of ingredient neutral and structure neutral. Wait, this is a
01:23:18
◼
►
- There's a very intriguing way that you've introduced this.
01:23:21
◼
►
Isn't this just like the chaotic, evil, chaotic, good thing?
01:23:25
◼
►
- Yes, that's where this comes from.
01:23:26
◼
►
That's the origin.
01:23:27
◼
►
- Oh, right. - It is like.
01:23:28
◼
►
- I misunderstood you and thought you were saying
01:23:29
◼
►
that the sandwich one was the origin.
01:23:31
◼
►
I was like, I don't think you're right there.
01:23:33
◼
►
- No, no, no, no, no, no, no, no, no, no, no, no.
01:23:37
◼
►
I always forget 'cause I never played D&D.
01:23:39
◼
►
- Chaotic neutral, chaotic good, that's the thing.
01:23:41
◼
►
- No, but the two axes, it's evil and good
01:23:43
◼
►
and then lawful and chaotic.
01:23:45
◼
►
Is that right?
01:23:46
◼
►
Are those the two axes on the traditional one?
01:23:48
◼
►
The sandwich alignment chart is the meme, right?
01:23:51
◼
►
Which is like, oh, here is the one from which other things were born.
01:23:54
◼
►
And so what I love on the internet is people will get into an argument about like, what
01:24:00
◼
►
Is a hot dog a sandwich?
01:24:01
◼
►
Is a pop tart a sandwich?
01:24:03
◼
►
Is a chicken wrap a sandwich?
01:24:04
◼
►
And someone will come up with like, here's the alignment chart to try to describe where
01:24:09
◼
►
all of these things fit in.
01:24:10
◼
►
And so someone did this for podcasts, the podcast alignment charts.
01:24:15
◼
►
The two axes are distribution method and media type.
01:24:20
◼
►
So do you wanna run through some of these, Myke?
01:24:22
◼
►
Where do you fit on the podcast alignment chart?
01:24:25
◼
►
- Oh man, I think I was...
01:24:28
◼
►
- I feel like you were making an argument
01:24:30
◼
►
for traditional distribution method,
01:24:32
◼
►
traditional media type,
01:24:33
◼
►
which is an audio RSS feed is a podcast.
01:24:37
◼
►
- No, because I don't, I think I am a modernist.
01:24:41
◼
►
- Distribution modernist, media type traditionalist,
01:24:44
◼
►
- Any audio that's subscribable is a podcast?
01:24:47
◼
►
- No, I think I'm modern modern.
01:24:49
◼
►
- Any audio or video that's subscribable is a podcast?
01:24:52
◼
►
Oh, so my YouTube channel is a podcast,
01:24:54
◼
►
that's what you're saying?
01:24:55
◼
►
- No, you see that's, yeah, no,
01:24:56
◼
►
that I'm modern traditionalist.
01:24:59
◼
►
'Cause that's the thing of like,
01:25:00
◼
►
but it gets into an issue for me, which is that I watch videos
01:25:04
◼
►
that I consider podcasts.
01:25:06
◼
►
It's the... I would phrase it as any audio
01:25:10
◼
►
or video that's subscribable can be a podcast,
01:25:14
◼
►
That would be my personal definition, I think.
01:25:17
◼
►
Would be closer to how I feel.
01:25:20
◼
►
But I think I'm more traditionalist modernist,
01:25:22
◼
►
which is any audio that's subscribable is a podcast.
01:25:25
◼
►
No, but then it's got audio books in here.
01:25:27
◼
►
This is very complicated.
01:25:30
◼
►
I would say any audio or video
01:25:32
◼
►
that somebody wants to call a podcast
01:25:34
◼
►
that you can subscribe to can be a podcast.
01:25:36
◼
►
I think that's how I actually personally sit now.
01:25:39
◼
►
I think that works for me.
01:25:42
◼
►
I really love how these charts go kind of crazy, right?
01:25:45
◼
►
I think the problem with the podcast one is a two-dimensional surface is not enough to
01:25:50
◼
►
express the entirety of what it wants to be.
01:25:53
◼
►
You need a third axis, which I think is something like consumption intention.
01:26:00
◼
►
So for me, like, I would be like a traditionalist here, right?
01:26:03
◼
►
That the consumption intention is audio only.
01:26:08
◼
►
That to me is a really key characteristic of what is a podcast.
01:26:13
◼
►
It has to be intended as an audio first experience, which isn't quite captured on this chart.
01:26:20
◼
►
But then it can be a video.
01:26:22
◼
►
It can be a Spotify exclusive.
01:26:24
◼
►
Yeah, see, I don't know.
01:26:28
◼
►
I am very sympathetic to the position you were expressing last time, which is there's,
01:26:33
◼
►
or at least I thought it was the position you're expressing last time, that there's
01:26:37
◼
►
Something about it needs to be generally accessible. I think for me, the example in my mind that
01:26:45
◼
►
really sticks out is Audible does these things that they call podcasts, which are like little
01:26:50
◼
►
shows that you can subscribe to in the Audible app.
01:26:52
◼
►
I don't know why they call them podcasts. I don't get that one at all.
01:26:56
◼
►
I've listened to some of them, like they're fine, I haven't found one that's amazing
01:26:59
◼
►
or anything, but they're also, I don't know, there's some part of my brain which
01:27:03
◼
►
is just like, this is not a podcast at all.
01:27:05
◼
►
It's just original audio content.
01:27:06
◼
►
Yeah, it's an audio-
01:27:07
◼
►
They call them Audible Originals, yeah, and they call them podcasts, and I just wonder if maybe
01:27:12
◼
►
they should just stick to one of those descriptors. But yeah, anyway, like, those are the ones
01:27:18
◼
►
that really clang in my brain. Like, what's the key thing there? And I think the
01:27:22
◼
►
fact that they're only available on Audible really makes them not a podcast to me. Okay,
01:27:27
◼
►
well then: Joe Rogan. Is that a podcast?
01:27:31
◼
►
- Ah, damn you, Myke.
01:27:37
◼
►
'Cause I would say yes, it is,
01:27:38
◼
►
even though it's a Spotify exclusive.
01:27:40
◼
►
- Curse you.
01:27:44
◼
►
That is the perfect counter example to my position.
01:27:49
◼
►
- So that's why I feel like for me,
01:27:51
◼
►
there is the third axis, which is like intent.
01:27:54
◼
►
And that's why, if we go on this third axis of intent,
01:27:58
◼
►
I would sit in the modernist
01:27:59
◼
►
where it could be like any audio or video that's
01:28:01
◼
►
subscribable can be a podcast,
01:28:04
◼
►
but it's on the creator's intent.
01:28:06
◼
►
- Yeah, but see, look, just to make things hard,
01:28:08
◼
►
it has to be phrased in order to fit the meme as,
01:28:11
◼
►
is a podcast.
01:28:13
◼
►
You can't put in these weasel words of can, right?
01:28:15
◼
►
The purpose of the alignment chart
01:28:17
◼
►
is to definitively answer what is a podcast.
01:28:20
◼
►
- Well, I'll tell you now, it doesn't.
01:28:22
◼
►
- I really love looking at this chart,
01:28:24
◼
►
that traditionalist traditionalist is
01:28:26
◼
►
an audio RSS feed is a podcast.
01:28:29
◼
►
And then you start moving in these reasonable directions of any audio that's subscribable
01:28:33
◼
►
is a podcast.
01:28:35
◼
►
And then as you get into the radicalist ones, it just gets very funny, right?
01:28:39
◼
►
Any audio available on the internet is a podcast.
01:28:43
◼
►
Anything that you can subscribe to is a podcast.
01:28:46
◼
►
And my personal favorite, the radicalist radicalist is that anything on the internet is a podcast,
01:28:53
◼
►
which made me laugh so hard when I first saw it.
01:28:57
◼
►
Because that got me.
01:28:59
◼
►
Because you know what?
01:29:02
◼
►
That fits with some people that I've come across in my life, you know what I mean?
01:29:06
◼
►
I also enjoyed the Cortexans giggling over this one, where people were like, "This
01:29:11
◼
►
comment on this Reddit thread is a podcast, right?
01:29:13
◼
►
Because I'm a radicalist?
01:29:15
◼
►
Radicalist!"
01:29:16
◼
►
I thought this was fantastic, and thanks to XD1936 for posting it. I got a really good
01:29:22
◼
►
laugh out of it, and I think, yeah, it was just a fun way to try to encapsulate the conversation
01:29:27
◼
►
from last time.
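For reference, the cells of the chart that come up in this conversation can be sketched as data. This is a minimal sketch in Python; the wording is quoted from the conversation rather than from the original image, and the axis placements of the cells not explicitly walked through above are guesses.

```python
# (distribution method, media type) -> definition of "podcast"
# Cells quoted in the conversation; the last two axis placements are assumptions.
PODCAST_ALIGNMENT = {
    ("traditionalist", "traditionalist"): "an audio RSS feed is a podcast",
    ("modernist", "traditionalist"): "any audio that's subscribable is a podcast",
    ("modernist", "modernist"): "any audio or video that's subscribable is a podcast",
    ("radicalist", "traditionalist"): "any audio available on the internet is a podcast",
    ("modernist", "radicalist"): "anything that you can subscribe to is a podcast",
    ("radicalist", "radicalist"): "anything on the internet is a podcast",
}

print(PODCAST_ALIGNMENT[("radicalist", "radicalist")])
```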
01:29:30
◼
►
This episode of Cortex is brought to you by Wealthfront.
01:29:32
◼
►
Is your bank keeping money that could be yours?
01:29:35
◼
►
If you're earning less than Wealthfront's 2.55% APY, they might be.
01:29:41
◼
►
Federal interest rates have been going up this year which means banks have had the opportunity
01:29:44
◼
►
to earn more on your savings.
01:29:46
◼
►
Where's all the extra money going?
01:29:48
◼
►
Well according to the FDIC, the average US bank has only raised their rates to 0.17%
01:29:55
◼
►
this year, while Wealthfront is now offering their clients a rate that's about 15 times
01:29:59
◼
►
higher with the Wealthfront Cash account.
01:30:02
◼
►
Of course you have a choice.
01:30:03
◼
►
You could let your bank keep the extra money they're making, or you could move your savings
01:30:08
◼
►
to a high yield account like Wealthfront and earn 2.55% APY.
01:30:13
◼
►
I think that having your savings in order is really just a great thing, it's one less
01:30:17
◼
►
thing to have to think about, so check it out. It's super easy to sign up, plus you
01:30:21
◼
►
get unlimited transfers that are completely fee free and up to $1 million in FDIC insurance
01:30:27
◼
►
through partner banks. There are no account fees, no minimum balance, and if you sign
01:30:31
◼
►
up at wealthfront.com/cortex you get a free $50 bonus with an initial deposit of $500. Don't
01:30:37
◼
►
let your bank keep the interest you could be earning. Join nearly half a million people
01:30:42
◼
►
who already use Wealthfront to earn 15 times more than the average US bank. That's wealthfront.com/cortex
01:30:49
◼
►
to get started and earn your free $50 bonus. That's wealthfront.com/cortex. This has been
01:30:55
◼
►
a paid endorsement for Wealthfront. Our thanks to Wealthfront for the support of this show
01:30:59
◼
►
and Relay FM.
01:31:01
◼
►
We are hurtling towards the end of the year.
01:31:02
◼
►
Uh, yes, alarmingly fast.
01:31:05
◼
►
And this is the final regular episode of Cortex this year.
01:31:09
◼
►
Don't remind me, Myke.
01:31:10
◼
►
So if you have more to say about AI, oh god, great, where's AI art gonna be in January?
01:31:17
◼
►
By January we're gonna be at step two of my three steps to the apocalypse.
01:31:21
◼
►
Excellent, can't wait for it.
01:31:23
◼
►
So maybe that's why like we had to spend the best part of two hours doing follow-up, because
01:31:28
◼
►
we're not gonna do it now.
01:31:30
◼
►
Yeah I guess so, let's say that, that was the plan, we didn't just end up talking about
01:31:34
◼
►
it for forever.
01:31:35
◼
►
There is a possibility we might do something in Moretex, getmoretex.com, I can't say that
01:31:40
◼
►
for sure because I don't know what we're going to do over the next two months except
01:31:43
◼
►
for what I know we're going to do, which is State of the Apps in November and Yearly Themes in December.
01:31:49
◼
►
So on that note for yearly themes I wanted to take this as a time to suggest a method
01:31:56
◼
►
to people for preparing for their yearly theme. If you don't know what a yearly theme is I
01:32:02
◼
►
will put a video in the show notes where Grey explains what yearly themes are but we're
01:32:06
◼
►
going to get to it again obviously in December so I don't want to talk about it now. But
01:32:11
◼
►
I am personally at the point of my year now where I'm preparing my yearly theme. The
01:32:17
◼
►
way I do this is I have it in my mind to start noting down things that I'm happy about
01:32:25
◼
►
in my work life and personal life and things that I find frustrating in those as well and
01:32:31
◼
►
things I would like to change. So I keep this as an Apple note, it's in my yearly themes
01:32:36
◼
►
Apple note, but I have it just playing in my mind that as I'm bumping into things that
01:32:42
◼
►
are good or bad that I would like to continue or stop, I write them all down. I just write
01:32:47
◼
►
them down. I don't need to draw any comparisons to them yet, I just start writing them down.
01:32:52
◼
►
So I end up with a list of things, maybe 20 things or so, over the space of a few weeks
01:32:58
◼
►
or a month that are in my head. Some of them are not even things I bump into, but because
01:33:02
◼
►
I'm in this kind of reviewing mode, things just pop up in my head randomly. I'm like
01:33:07
◼
►
"Oh, that's a thing, I'll write that down." Then as we get towards November or so, I start
01:33:13
◼
►
to review this list of things and note the similarities where I can tie some similarities
01:33:18
◼
►
between things. Be like "Oh, that's actually related to this" or "That is similar to this."
01:33:24
◼
►
I then use these links as the basis to build my theme.
01:33:27
◼
►
So what do I think I could do next year to address some of these common issues that I
01:33:34
◼
►
have or these common things that I would like to do more of?
01:33:38
◼
►
So if you struggle with creating a yearly theme, this is my top tip.
01:33:43
◼
►
It's how I do mine every year.
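A minimal sketch of that habit as code, purely illustrative (the file name, tags, and example notes are all invented): append dated observations the moment they occur, then group them later to look for the recurring threads.

```python
from collections import defaultdict
from datetime import date

NOTES_FILE = "yearly-theme-notes.txt"   # hypothetical; an Apple note in Myke's case

def jot(observation, tag):
    """Append one observation the moment you bump into it."""
    with open(NOTES_FILE, "a") as f:
        f.write(f"{date.today()}\t{tag}\t{observation}\n")

def review():
    """Later, group the notes by tag to spot recurring threads."""
    groups = defaultdict(list)
    with open(NOTES_FILE) as f:
        for line in f:
            _day, tag, observation = line.rstrip("\n").split("\t")
            groups[tag].append(observation)
    return groups

jot("want to spend more time on product design", "work")
jot("happy with how I've looked after my health", "health")
for tag, items in review().items():
    print(tag, items)
```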
01:33:44
◼
►
I don't know if this is asking for spoilers for the theme episode, which I don't want,
01:33:49
◼
►
but I just feel like I would like some concrete examples of the kinds of things that you write
01:33:52
◼
►
down, that you end up thinking about. And they don't have to be
01:33:56
◼
►
examples for this year. I will give you some examples for this year but I'm not
01:34:00
◼
►
gonna give you so many that you can tell what my theme was born from. Good. I want to spend more
01:34:04
◼
►
time on product creation and design. That's one. I want to listen to more
01:34:09
◼
►
music. I'm happy with how I've looked after my health and want to do more of
01:34:13
◼
►
that. I want to be smarter about my scheduling and the days that I'm in the
01:34:17
◼
►
office. These are the kinds of things I write down.
01:34:20
◼
►
Okay, so presumably then the way you have this happen is there's some frustration
01:34:27
◼
►
about the scheduling and then I guess you notice that and add it to this file?
01:34:30
◼
►
Is that the system? Mm-hmm. Okay. Like all these are at the moment
01:34:35
◼
►
just an outline, like, bullet points of things, right? And I write a
01:34:39
◼
►
bunch of notes down, I write some context down, and so then I start reading through
01:34:43
◼
►
these and I'm like, "Oh hang on a minute, this can relate to this and this.
01:34:47
◼
►
Or like, "Hey, I've written a bunch of times here that I'm unhappy about this kind of thing
01:34:52
◼
►
in like a bunch of different ways. So can I address that thing?"
01:34:58
◼
►
- Hmm, okay, actually, interesting. I think I should frame this thought a bit differently,
01:35:03
◼
►
because I have a notes file where when it occurs to me things that I may want to discuss
01:35:08
◼
►
on the theme episode, I make a note of them. I guess this is just a different version of
01:35:13
◼
►
the same idea, of, like, things about my own personal theme there.
01:35:17
◼
►
But no, I like that: have a place to collect what you notice about your life,
01:35:25
◼
►
which also then just encourages you to notice more.
01:35:28
◼
►
That might be a way to pitch it to people who,
01:35:32
◼
►
as the end of the year comes hurtling towards us,
01:35:35
◼
►
if they've never tried a theme before,
01:35:37
◼
►
might want to think about that over the next two and a half months.
01:35:41
◼
►
They're going to start a theme at the beginning of the year.
01:35:43
◼
►
It's like just have a note somewhere to put down things that you notice about your life as a way to get started
01:35:49
◼
►
And then you have something to look over and try to synthesize later. It's not about any particular moment
01:35:56
◼
►
Okay. Yeah, that's interesting
01:35:57
◼
►
I think that this makes something that's more likely to stick, because it's actually related to things that you've experienced.
01:36:04
◼
►
Mmm, and pushes it further away from the New Year's resolution idea because New Year's resolutions are typically created from whole cloth
01:36:12
◼
►
like, I just have this aspirational idea of myself, that I would like to be this different kind of person, and I'll
01:36:17
◼
►
Just go and live that life
01:36:18
◼
►
Whereas what I'm suggesting you do is look at how you live right now.
01:36:22
◼
►
What frustrates you about things that are happening right now? What would you like to be better in your actual life right now?
01:36:29
◼
►
Mmm and try and think about what that might be
01:36:33
◼
►
It's related to something that's actually happening, rather than something where you look forward and just imagine yourself as a different person.
01:36:39
◼
►
Yeah, that's a good distinction because, again, also the New Year's resolutions have
01:36:44
◼
►
the feeling to me of homework assignments that you forgot about until the morning of
01:36:48
◼
►
and it's like, "Oh no, I need to write an essay about what I'm going to do this
01:36:50
◼
►
year." Whereas this makes me think of—it's actually kind of popping into my head the
01:36:55
◼
►
way some researchers have tried to study how happy people are by messaging them at random
01:37:01
◼
►
points throughout the weeks or the years of just like, "Hey, right now, how do you feel?"
01:37:06
◼
►
And this is closer to that.
01:37:07
◼
►
It's like a continual process of, hey, notice in your life
01:37:10
◼
►
how you're feeling and just make a little note of it.
01:37:12
◼
►
And then later, you can have some stuff to review.
01:37:14
◼
►
So I like that.
01:37:15
◼
►
I think that's good.
01:37:16
◼
►
I think that's a good suggestion for people looking
01:37:18
◼
►
for a place to start, instead of just trying
01:37:23
◼
►
to create a theme out of whole cloth
01:37:25
◼
►
after they've listened to the theme episode.
01:37:29
◼
►
There's something I've got to tell you about before we wrap up.
01:37:33
◼
►
I didn't know about this.
01:37:35
◼
►
It somehow seems to have slipped by everyone. In iOS 16 there is now a Dvorak software keyboard
01:37:44
◼
►
on the phone. Oh yeah? Let me grab my phone here. It's in the settings app right? You
01:37:52
◼
►
have to go to Settings, and then I think it's General, Keyboard, and then whatever you have set
01:38:00
◼
►
as like English UK or English US or whatever. If you tap that you get the option of QWERTY,
01:38:06
◼
►
AZERTY, QWERTZ and Dvorak.
01:38:07
◼
►
Oh, I see, so it's not a new keyboard, it's under English. I would never have found that.
01:38:12
◼
►
You could probably add it as a new keyboard?
01:38:14
◼
►
No, it's not; that's what I was just trying to do.
01:38:16
◼
►
Oh god I hate it.
01:38:18
◼
►
Oh interesting.
01:38:19
◼
►
Wow that's horrifying.
01:38:20
◼
►
Do you still use Dvorak?
01:38:24
◼
►
Of course I use Dvorak, Myke.
01:38:26
◼
►
So your keyboard on your desk is set in the Dvorak layout?
01:38:30
◼
►
I feel like we haven't spoken about that
01:38:31
◼
►
in a really long time.
01:38:32
◼
►
I've kind of forgotten.
01:38:34
◼
►
- Yes, well, my keyboard, if you look at it,
01:38:37
◼
►
the current one I'm using is, it has a QWERTY layout,
01:38:40
◼
►
but the keys are mapped to the Dvorak settings.
01:38:43
◼
►
- You are a monster.
01:38:45
◼
►
- No, no, no, I actually, I've decided that's best.
01:38:48
◼
►
That's the better way to go.
01:38:49
◼
►
I've stopped getting keyboards.
01:38:51
◼
►
Look, I think there's only two ways to go.
01:38:53
◼
►
QWERTY visual layout or blank, right?
01:38:58
◼
►
Like nothing on any of the keys.
01:39:01
◼
►
- Why can't you use the Dvorak visual layout?
01:39:03
◼
►
I think I could build you a keyboard one day
01:39:08
◼
►
that would be laid out this way.
01:39:10
◼
►
- Yeah, okay, I guess here's,
01:39:12
◼
►
I was just trying to articulate why.
01:39:15
◼
►
After years and years of doing all sorts of different things,
01:39:17
◼
►
why have I settled on this?
01:39:19
◼
►
The answer is that keyboard shortcuts are funny
01:39:24
◼
►
in a lot of applications.
01:39:27
◼
►
So there's something in the system level
01:39:31
◼
►
where a keyboard shortcut will either trigger based
01:39:34
◼
►
on the letter that the key represents
01:39:37
◼
►
or it will trigger based on the location that the key is.
01:39:41
◼
►
You can see this explicitly in some things like Final Cut
01:39:45
◼
►
where you can reprogram the key based on letter
01:39:49
◼
►
or location. So I guess the way I use it is... I'm struggling
01:39:55
◼
►
to think of a specific example that's not Final Cut, but I know it comes up where
01:39:59
◼
►
it's useful to be able to see what does the keyboard look like for everyone in the whole
01:40:06
◼
►
wide world, sometimes for keyboard shortcuts like that does come up where I am glad that
01:40:11
◼
►
the layout is a QWERTY layout. And since I touch type, I don't look at the keyboard,
01:40:16
◼
►
It doesn't matter when I'm actually writing what the keys look like at all, which is why
01:40:20
◼
►
I have gone with a completely blank keyboard sometimes, which I do think is cool, but can
01:40:25
◼
►
be annoying for keyboard shortcuts.
01:40:27
◼
►
I have a question for you.
01:40:29
◼
►
Well, because one, I'll come back to the keyboard shortcuts thing because I know how to fix
01:40:33
◼
►
it. It depends on what you use.
01:40:34
◼
►
Anyway, when you are using your Dvorak layout, what is the copy command?
01:40:40
◼
►
Okay, so for me, it's Command-I, is what it would look like on a QWERTY keyboard. That's
01:40:47
◼
►
copy. Oh, okay. Yes, I know what you're saying. But you are actually still hitting Command-
01:40:54
◼
►
C, though, is what I'm asking. That's more work, but it's in a different place. Okay, it's
01:40:58
◼
►
in a different place. So one of the things that exists now, which I don't know if it's
01:41:03
◼
►
actually a good idea, for any other lunatics who might want to switch their keyboard layouts:
01:41:09
◼
►
one of the things that computers have gotten better at over the years is that there are now explicit
01:41:13
◼
►
layouts. On the Mac, it's called something like "Dvorak preserve shortcuts."
01:41:20
◼
►
So it's like you type in Dvorak, but it totally ignores the keyboard layout for all keyboard shortcuts.
01:41:26
◼
►
It would be like Command-J, looking at the Dvorak layout, right, in that scenario where
01:41:33
◼
►
it's preserving the location of the key.
01:41:36
◼
►
Yeah, it's funny. It's like... look, I is C. What's the complication here?
01:41:42
◼
►
I'm looking at a Dvorak layout on Wikipedia and like, you know, J is in the location that
01:41:48
◼
►
C is on a QWERTY keyboard.
01:41:51
◼
►
And so you would hit, I guess, command J, because what it's saying is, the way I've
01:41:56
◼
►
understood what you've explained, is like, it's keeping the physical location
01:42:00
◼
►
of the key no matter what the key actually says it is.
01:42:03
◼
►
Yeah, okay. So I just looked up what the actual thing is called in the system settings.
01:42:08
◼
►
So on the Mac, you can set it as something called Dvorak - QWERTY ⌘. So I presume
01:42:14
◼
►
that what it's doing there is whenever you hit command, it ignores the Dvorak layout
01:42:17
◼
►
and reverts to the QWERTY layout.
01:42:19
◼
►
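To make that concrete, here is a minimal sketch in C of how a "Dvorak - QWERTY ⌘" style input source could behave. The lookup tables and function names are illustrative assumptions, not Apple's actual implementation, and only the bottom letter row is modeled:

```c
/* Minimal sketch (not Apple's implementation) of the "Dvorak - QWERTY
 * command" idea: a physical key position resolves through the Dvorak
 * table while typing, but through the QWERTY table while the Command
 * key is held, so shortcuts stay in their familiar QWERTY positions.
 * Only the bottom letter row is modeled, for brevity. */
#include <stdbool.h>
#include <stdio.h>

/* Characters produced by the ten bottom-row key positions. */
static const char QWERTY_BOTTOM[10] = {'z','x','c','v','b','n','m',',','.','/'};
static const char DVORAK_BOTTOM[10] = {';','q','j','k','x','b','m','w','v','z'};

/* Resolve the character for physical position pos (0 = QWERTY's Z key). */
static char resolve(int pos, bool command_held) {
    return command_held ? QWERTY_BOTTOM[pos] : DVORAK_BOTTOM[pos];
}

int main(void) {
    int pos = 2; /* the key labeled C on a QWERTY board, J in Dvorak */
    printf("typing alone: %c\n", resolve(pos, false));            /* j */
    printf("with Command held: %c (copy)\n", resolve(pos, true)); /* c */
    return 0;
}
```

Pressing the physical key at the QWERTY C position (where Dvorak prints J) types j on its own, but registers as command C while Command is held, which is exactly the behavior described above.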
That seems horribly complicated. Yeah, all of this to say, one day, because I want to
01:42:24
◼
►
build your keyboard one day, just for fun, I would choose a keyboard where I can change
01:42:30
◼
►
the programming of the keyboard. The boards that I build these days tend to all support
01:42:36
◼
►
a piece of software called VIA, which overrides the layout on the keyboard itself. So it's not relying
01:42:44
◼
►
on the Mac. So if you hit any shortcut in any app it's going to register as the key
01:42:53
◼
►
that is on the Dvorak layout. You wouldn't need to try and do some kind of key
01:42:59
◼
►
binding in software. Does that make sense? Yes. Right? So like one of the
01:43:04
◼
►
things you're saying is like for example if you hit some kind of keyboard
01:43:08
◼
►
shortcut in Final Cut, it's going to assume that's QWERTY even if you've set it
01:43:12
◼
►
to Dvorak, because there's like a piece of software in between the two of them
01:43:15
◼
►
that's trying to communicate it. But I'm saying the hardware of the keyboard
01:43:20
◼
►
would communicate directly to the computer, so I am convinced
01:43:27
◼
►
it would always register correctly.
01:43:29
◼
►
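A hedged sketch of what Myke is describing: with a QMK/VIA-compatible board, the Dvorak arrangement can be baked into the firmware itself, so the computer stays on the plain US layout and every app sees the Dvorak letter directly. QMK keymaps are written in C; the LAYOUT_ortho_1x10 macro and the single-row keymap below are hypothetical simplifications for illustration:

```c
/* One-layer QMK keymap fragment that sends Dvorak's top letter row
 * (' , . p y f g c r l) from the physical positions where QWERTY
 * has q w e r t y u i o p. With the OS left on US QWERTY, typing
 * and shortcuts both resolve to the letter the firmware sends, so
 * no OS-level remapping layer sits between the app and the key. */
#include QMK_KEYBOARD_H

const uint16_t PROGMEM keymaps[][MATRIX_ROWS][MATRIX_COLS] = {
    [0] = LAYOUT_ortho_1x10( /* hypothetical one-row board */
        KC_QUOT, KC_COMM, KC_DOT, KC_P, KC_Y,
        KC_F,    KC_G,    KC_C,   KC_R, KC_L
    ),
};
```

On a board like this, command plus the key at the QWERTY I position really does send command C, because the keyboard, not the Mac, decided that key is C.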
- Okay, I feel like I'm not 100% understanding,
01:43:32
◼
►
but there's also so many layers here.
01:43:35
◼
►
One of the other problems is,
01:43:36
◼
►
I don't know if any of the cortexes
01:43:37
◼
►
are able to help me out with this,
01:43:39
◼
►
if anyone's already done this.
01:43:40
◼
►
Here's the maximum level of crazy, right?
01:43:43
◼
►
I use the Dvorak layout,
01:43:45
◼
►
but I've learned all the keyboard commands,
01:43:48
◼
►
all the basic system commands,
01:43:50
◼
►
with the Dvorak layout,
01:43:51
◼
►
so I'm not pressing the regular buttons
01:43:53
◼
►
that everybody knows.
01:43:55
◼
►
However, in Final Cut, the way that the programmers laid out all the keyboard shortcuts that do
01:44:03
◼
►
things, it makes sense physically where they are on the keyboard.
01:44:08
◼
►
Whereas like command-copy and command-paste, it doesn't really matter if those two
01:44:11
◼
►
are next to each other.
01:44:13
◼
►
But there's tons of stuff in Final Cut, which is like trim from the start of the clip,
01:44:17
◼
►
trim from the end of the clip.
01:44:19
◼
►
And you want to have those keys like next to each other on opposite sides.
01:44:22
◼
►
There's a lot of like physicality of this.
01:44:24
◼
►
So in Final Cut, I've been trying to slowly build up my own custom mapping, which is to
01:44:31
◼
►
make it so that when I am using the Dvorak layout, it's still acting as though it's a
01:44:37
◼
►
QWERTY layout just for the Final Cut shortcuts.
01:44:41
◼
►
So every time I try to learn a new Final Cut shortcut, I try to go in and, like, change it.
01:44:47
◼
►
But I've wondered like, has someone just done this?
01:44:49
◼
►
Is there a QWERTY-for-Dvorak remapping of all the Final Cut shortcuts that someone has just done?
01:44:55
◼
►
Because I could tell like I'm getting into a situation of, "Oh, this is a little bit
01:44:59
◼
►
inconsistent." So anyway, I've just wondered if like a person has done that work for me.
01:45:02
◼
►
And there's also some weird system stuff that I've done with a few of the shortcuts.
01:45:07
◼
►
Anyway, there's a bunch of these weird little problems. And I also have this
01:45:10
◼
►
minor annoyance. Listen, before I say this annoyance, I understand. I understand why
01:45:15
◼
►
Apple did it. You don't have to leave me comments for why they did it. I
01:45:18
◼
►
understand. It's still annoying. But it used to be that Apple, when you had
01:45:23
◼
►
different keyboard layouts, it had a little country flag to represent each of
01:45:28
◼
►
the layouts. And so the US keyboard layout had a little US flag. And the
01:45:33
◼
►
Dvorak one was just DV. And they made the change so that they don't use country
01:45:41
◼
►
flags for languages. Now again, I get it, but it actually causes a huge annoyance because in my menu
01:45:48
◼
►
bar I want to see which layout the keyboard is using. Is it using Dvorak or is it using US?
01:45:54
◼
►
Because sometimes I do switch between those two layouts and now I can't visually see instantly
01:46:00
◼
►
that the keyboard is on US layout versus Dvorak layout and it's maddening because it's just a
01:46:06
◼
►
little box, and the box either says DV or it says US. And like, no, it was so much better when it was
01:46:13
◼
►
a black DV for Dvorak, and then there was a little American flag when it was the US layout.
01:46:18
◼
►
So it's super frustrating. But anyway, all of that aside, I'm looking at this Dvorak layout on my
01:46:23
◼
►
phone, and I hate everything about it. I don't think Dvorak was made for phones.
01:46:29
◼
►
All of the advantages of typing Dvorak with two hands, I don't feel like they translate
01:46:36
◼
►
for two thumbs on a phone.
01:46:38
◼
►
I think maybe the QWERTY layout is actually superior for the phone.
01:46:41
◼
►
Yeah, all of the vowels are next to each other on the Dvorak one.
01:46:45
◼
►
I'm just trying to type some words here.
01:46:47
◼
►
By the way, I have two applications for you that can restore those flags.
01:46:53
◼
►
Yeah, one is called Keyboard Switcheroo.
01:46:56
◼
►
The other is called Colorful Input Menu Flags.
01:46:59
◼
►
They're both in the Mac App Store.
01:47:03
◼
►
- Two enterprising developers who created applications
01:47:06
◼
►
to bring that back.
01:47:08
◼
►
I'd like to thank Glenn Fleishman at Macworld
01:47:11
◼
►
for writing the article too.
01:47:12
◼
►
'Cause I know I'd seen these,
01:47:14
◼
►
and so I just did a quick Google while you were upset.
01:47:17
◼
►
And actually on keyboard switcheroo,
01:47:20
◼
►
they have in their app screenshots,
01:47:23
◼
►
US, French, German, Dvorak.
01:47:25
◼
►
So you get a little US flag and
01:47:28
◼
►
a little DV for Dvorak.
01:47:29
◼
►
Oh good, I'm glad to know I'm not the only person who was annoyed by this.
01:47:32
◼
►
So I'll definitely check those out.
01:47:35
◼
►
You can always rely on Mac developers to solve the little annoyances.
01:47:38
◼
►
Yeah you can.
01:47:39
◼
►
What I was just thinking here is, with iOS, I think QWERTY on the phone does work.
01:47:46
◼
►
I feel like the weirdness of the way QWERTY is laid out, to sort of slow you down, although
01:47:52
◼
►
I think that that's overplayed as a story of why it exists.
01:47:55
◼
►
I think that actually works as an advantage for typing with your thumbs on a tiny phone
01:47:59
◼
►
screen, but I am glad that this exists for the iPad.
01:48:03
◼
►
I might actually switch it on the iPad, because that's the place where it's been annoying.
01:48:06
◼
►
Big keyboard.
01:48:07
◼
►
And a larger keyboard there is more like what you're used to for Dvorak, right?
01:48:12
◼
►
Bigger keyboards like a computer keyboard.
01:48:15
◼
►
Can you get MacBooks in Dvorak?
01:48:17
◼
►
I don't think so.
01:48:18
◼
►
I don't think they sell that.
01:48:21
◼
►
I'm pretty sure you have to just pick between the American layout and the English layout. When
01:48:26
◼
►
I'm buying keyboards, that is the thing I'm always very careful about.
01:48:30
◼
►
Like, please give me the US layout one.
01:48:32
◼
►
I do not want the UK layout one.
01:48:34
◼
►
There's no little "huh" there.
01:48:35
◼
►
Like the American enter is clearly superior.
01:48:38
◼
►
I build all of my keyboards in ANSI.
01:48:40
◼
►
My laptop is in the British layout, but I think in the future I might try and remember
01:48:46
◼
►
to just order them in US layout because that's what I'm much more used to now.
01:48:49
◼
►
Wait a second.
01:48:50
◼
►
- You build your keyboards in the US layout, the ANSI layout.
01:48:55
◼
►
- But you get the MacBook keyboards with the English layout?
01:48:58
◼
►
- I don't think about it when I'm buying them.
01:49:01
◼
►
I think I will now in the future, right?
01:49:03
◼
►
'Cause I'm getting more annoyed because I'm switching
01:49:06
◼
►
between something I'm not used to.
01:49:08
◼
►
And the keyboards that I build,
01:49:10
◼
►
most of them are just more available
01:49:13
◼
►
and easier to build in the ANSI layout.
01:49:14
◼
►
So I'm just used to it now, the US English layout.
01:49:18
◼
►
But I never think about it.
01:49:19
◼
►
And obviously the default on the British Mac store
01:49:23
◼
►
is the British layout, so I don't think about it
01:49:25
◼
►
and I just buy what's the default,
01:49:28
◼
►
but I think I wanna start changing it.
01:49:30
◼
►
It doesn't look like they do Dvorak,
01:49:31
◼
►
it's not in their options.
01:49:32
◼
►
- I don't think I'll be using this on my phone.
01:49:35
◼
►
I think it's worse on the phone,
01:49:36
◼
►
but I will definitely try it out on the iPad
01:49:38
◼
►
when the iPad comes out,
01:49:39
◼
►
'cause that's where it's more of a problem,
01:49:41
◼
►
but I mostly don't really think about it very much
01:49:44
◼
►
'cause it's just been so long,
01:49:46
◼
►
like this has been my entire life.
01:49:48
◼
►
The last time it caused me problems was when I worked at a school and had to switch between
01:49:53
◼
►
the QWERTY and Dvorak layouts a bunch when using different computers, but even then
01:49:57
◼
►
it was not the worst.
01:49:58
◼
►
Like, I just learned to touch type while not looking and then sort of type while looking
01:50:03
◼
►
at the keyboard as two different modes that my brain could switch between.
01:50:06
◼
►
But I wonder why on earth they added Dvorak to the—
01:50:11
◼
►
—to iOS 16 now.
01:50:12
◼
►
It's a very funny, like—
01:50:13
◼
►
It's strange.
01:50:14
◼
►
Why did it take 16 attempts?
01:50:15
◼
►
You know what I mean?
01:50:17
◼
►
why 16 versions in? Is Dvorak having some resurgence that we're not aware of? Like why?
01:50:23
◼
►
It is odd, right? Yeah, I'm not even sure if I was suggesting to someone now who had RSI problems if
01:50:31
◼
►
Dvorak is the way to go. I always forget, I think it's Colemak or something? There's another one
01:50:37
◼
►
which is definitely worth investigating if you're learning now versus Dvorak. And it's also a
01:50:42
◼
►
a feature that, for whatever reason, took all the way until now for people to notice
01:50:47
◼
►
it, it seems like. There's been a bunch of articles written because it was kind of discovered,
01:50:51
◼
►
but that means it went through the whole beta process without people seeming to know about
01:50:56
◼
►
it. I don't know.
01:50:57
◼
►
The thing that's also confusing about it is if they did add... why didn't they add something
01:51:03
◼
►
like Colemak? I mean, look, I don't know a lot about what it takes to add additional keyboard
01:51:08
◼
►
layouts to iOS, but if you're going to add something like Dvorak, how hard is it to also add the
01:51:14
◼
►
other popular one of these if you're gonna do it?
01:51:17
◼
►
Well isn't Dvorak like the popular one though?
01:51:20
◼
►
I don't know if that's true or not.
01:51:22
◼
►
I feel like it must be, at least in the requests that they get. Or some high-up executive
01:51:27
◼
►
like Tim Cook is a Dvorak guy and, like, he'd had enough, you know?
01:51:31
◼
►
Yeah I mean with Dvorak I just always feel like there's dozens of us.
01:51:38
◼
►
- You've seen the Godfather, right?
01:51:40
◼
►
- No. - What?
01:51:42
◼
►
- I've not seen the Godfather.
01:51:44
◼
►
I've just never really been that interested.
01:51:46
◼
►
- Okay, well, in the Godfather,
01:51:49
◼
►
this is pointless for you now,
01:51:50
◼
►
but in the Godfather, there is a moment,
01:51:53
◼
►
I think Godfather 2, I think it's Godfather 2,
01:51:56
◼
►
which I assume you also have not seen,
01:51:58
◼
►
because why would you have, right?
01:51:59
◼
►
- Yeah, that's a correct, yeah, I've seen the third one,
01:52:01
◼
►
but not the first or the second one.
01:52:03
◼
►
- You're joking, right?
01:52:04
◼
►
- I am joking, yes. - Good, good, good, good.
01:52:06
◼
►
There is a moment where a bunch of things happen
01:52:08
◼
►
and the line is, "We're settling all family business."
01:52:13
◼
►
There's a line about settling all family business.
01:52:15
◼
►
And I feel like that is today's episode.
01:52:17
◼
►
We have just settled a bunch of business, right?
01:52:20
◼
►
A lot of follow-up has been dealt with, right?
01:52:23
◼
►
We've spoken about Dvorak, right?
01:52:25
◼
►
Like we're just settling the business
01:52:27
◼
►
before the year ends for us.
01:52:28
◼
►
Like very good episodes, now we go to the specials.
01:52:31
◼
►
- Yeah, that's true.
01:52:32
◼
►
- There are two more pieces of family business
01:52:34
◼
►
that I would like to settle before we finish today.
01:52:37
◼
►
One is to thank every Cortexan who donated
01:52:40
◼
►
to our St. Jude campaign.
01:52:41
◼
►
Overall, throughout the month of September,
01:52:44
◼
►
we raised $706,397.10.
01:52:49
◼
►
All of us are absolutely astounded.
01:52:53
◼
►
This is the most money we've ever raised.
01:52:55
◼
►
- It's unbelievable.
01:52:57
◼
►
It's an incomprehensible amount of money.
01:53:00
◼
►
So we have now not only passed $2 million raised
01:53:05
◼
►
in the last four years,
01:53:06
◼
►
we have now hit $2.2 million raised in the last four years.
01:53:11
◼
►
- It's incredible.
01:53:15
◼
►
It's just incredible to think that the relay listenership
01:53:18
◼
►
has raised that much money for St. Jude.
01:53:22
◼
►
So yeah, thank you to everyone who donated to the campaign.
01:53:27
◼
►
That's completely mind blowing.
01:53:29
◼
►
You know, it seemed for a while that it was like,
01:53:31
◼
►
okay, we're gonna raise a lot of money,
01:53:33
◼
►
we're gonna meet our goal,
01:53:34
◼
►
but we're probably not going to exceed it.
01:53:36
◼
►
Because there are definite economic challenges
01:53:39
◼
►
right now around the world,
01:53:41
◼
►
and we were thinking, okay, that's gonna be that.
01:53:43
◼
►
And then it just, the last couple of days even,
01:53:47
◼
►
like the last couple of weeks, it just exploded
01:53:49
◼
►
'cause people were just getting their final totals in.
01:53:51
◼
►
We had to, we were gonna end it on September 30th,
01:53:54
◼
►
but we extended it to October 3rd,
01:53:56
◼
►
because the money was just piling in in the last day.
01:54:00
◼
►
It was like, all right,
01:54:01
◼
►
we're gonna leave this open a couple of days.
01:54:02
◼
►
I think like in the last day or two,
01:54:04
◼
►
we raised like an extra $110,000 or something.
01:54:07
◼
►
So it was like, probably should keep this open.
01:54:10
◼
►
- Yeah, that's worth leaving open an extra day.
01:54:13
◼
►
- But overall, this year's campaign was fantastic.
01:54:16
◼
►
We learned a lot and for me,
01:54:18
◼
►
it was just so incredibly rewarding and fun
01:54:22
◼
►
to be able to be back in Memphis for the Podcastathon,
01:54:24
◼
►
which was a great success.
01:54:26
◼
►
The whole video is on YouTube.
01:54:27
◼
►
I'll put a link in the show notes
01:54:28
◼
►
if people want to watch it.
01:54:30
◼
►
A very kind commenter took the time to timestamp the whole thing.
01:54:33
◼
►
So if you want to jump around to different segments
01:54:35
◼
►
or whatever, that's all in there
01:54:37
◼
►
in one of the comments in the YouTube video.
01:54:39
◼
►
But yeah, it was truly fantastic
01:54:42
◼
►
and we achieved something that I just did not think
01:54:45
◼
►
was going to be possible.
01:54:46
◼
►
And once again, the Relay FM community
01:54:48
◼
►
has gone and surprised us.
01:54:51
◼
►
Cortexmerch.com.
01:54:52
◼
►
This is the second thing, final thing.
01:54:54
◼
►
And what's the final piece?
01:54:56
◼
►
- It's Subtle Ts and Subtle Sweaters.
01:54:58
◼
►
We're reminding you, maybe you have now finished your commute
01:55:00
◼
►
and you're walking into the office or whatever,
01:55:02
◼
►
and you didn't do what I asked you to do earlier,
01:55:04
◼
►
so now you can get your phone out
01:55:06
◼
►
and you can go to cortexmerch.com
01:55:08
◼
►
and buy yourself a Subtle T or a Subtle Sweater or both
01:55:10
◼
►
or whatever you want.
01:55:12
◼
►
- Unlike the St. Jude fundraiser, this will not be extended.
01:55:16
◼
►
So for realsies, cortexmerch.com.