515: The Elder Programmer Look
00:00:00
◼
►
Hey, have we ever recorded this show before noon?
00:00:04
◼
►
I would be surprised if John has ever recorded a podcast before noon.
00:00:08
◼
►
No, I've done a bunch of morning ones for usually like people in the UK and stuff.
00:00:12
◼
►
Have you even shaved yet?
00:00:13
◼
►
What are you talking about? I don't think you understand my shaving schedule.
00:00:18
◼
►
Yeah, maybe. You've completed the morning shave, but not the evening shave.
00:00:22
◼
►
Right, right.
00:00:23
◼
►
No, that's not... I shave like once a week-ish.
00:00:27
◼
►
I thought I figured you know your thick Italian heritage is just like just constant.
00:00:32
◼
►
It is, oh it is.
00:00:34
◼
►
It's just I don't feel the need to shave it off now that I'm not going into an office
00:00:38
◼
►
all the time.
00:00:39
◼
►
Even when I was going to an office honestly it was pretty hit or miss in the later years.
00:00:44
◼
►
You slowly become like the elder programmer look.
00:00:49
◼
►
Yeah, because one or two days it's like stubble, and then people start asking you whether you're growing a beard, and then you shave, and then it just repeats.
00:00:57
◼
►
- By the way, if anybody asks,
00:01:00
◼
►
today is not December 26th, it's December 24th.
00:01:03
◼
►
And I figure if Roman emperors were able to dictate whatever they felt like doing with the calendar, I feel like I can do it too.
00:01:12
◼
►
So today is really actually December 24th in our family,
00:01:15
◼
►
and we have our own custom calendar, only for this year.
00:01:18
◼
►
Because this has been,
00:01:22
◼
►
oh, this has been an experience.
00:01:25
◼
►
So we drove halfway upstate on the day before Christmas Eve,
00:01:32
◼
►
on Friday, this past Friday.
00:01:34
◼
►
The idea was to drive the rest of the way upstate
00:01:36
◼
►
to our family's place on Saturday, Christmas Eve day.
00:01:41
◼
►
And we woke up on Saturday morning,
00:01:43
◼
►
and our kid had a massive fever.
00:01:46
◼
►
And we were like, oh, no.
00:01:48
◼
►
So here we are scheduling to go up to visit
00:01:53
◼
►
the whole family, including his grandparents,
00:01:56
◼
►
who were, you know, we wanna be careful
00:01:58
◼
►
with what we bring to them.
00:01:59
◼
►
And so we were like, oh no, we gotta,
00:02:02
◼
►
we can't go up like this.
00:02:03
◼
►
We can't bring this very fevery kid.
00:02:07
◼
►
First of all, we can't make him sit in a car
00:02:09
◼
►
for three hours.
00:02:11
◼
►
That's not super great when you're feeling sick.
00:02:13
◼
►
Second of all, we don't wanna get them sick.
00:02:15
◼
►
So we're like, oh God, what do we do?
00:02:16
◼
►
It's Christmas Eve.
00:02:18
◼
►
And so we worked out this plan where we were basically
00:02:21
◼
►
gonna wait until the fever broke and talk to a doctor
00:02:24
◼
►
and see when it's safe and everything.
00:02:26
◼
►
And then so as we're working this out,
00:02:27
◼
►
we get reports from all of his,
00:02:30
◼
►
from some of his school classmates' parents.
00:02:33
◼
►
His whole class basically has flu A.
00:02:36
◼
►
It's the whole class.
00:02:37
◼
►
- Cool. - It's the real flu.
00:02:39
◼
►
Many of them got tested.
00:02:40
◼
►
It's influenza A, real flu,
00:02:43
◼
►
and now he's showing flu symptoms and we're like,
00:02:46
◼
►
And it's Christmas Eve, may I remind you.
00:02:48
◼
►
It's not like anything's easily accessible or open.
00:02:51
◼
►
So we do like a telehealth thing,
00:02:53
◼
►
which thank God that even exists.
00:02:55
◼
►
I mean, that's an amazing thing.
00:02:57
◼
►
I'm so happy that that's,
00:02:59
◼
►
like I think one of the,
00:03:01
◼
►
if there is such a thing as a silver lining
00:03:03
◼
►
for the COVID era,
00:03:04
◼
►
it's that I think we have way more access
00:03:07
◼
►
to virtual services than we used to.
00:03:08
◼
►
And even though telehealth existed before COVID,
00:03:11
◼
►
it was a much smaller and less commonly available thing
00:03:13
◼
►
for most insurance plans and stuff.
00:03:15
◼
►
So now it's super easy.
00:03:17
◼
►
So we did telehealth,
00:03:18
◼
►
and then we had to get a prescription.
00:03:21
◼
►
And by this time,
00:03:22
◼
►
the time that we had done the telehealth and done all this,
00:03:25
◼
►
it was 6 p.m. on Christmas Eve.
00:03:28
◼
►
- So try getting a prescription filled
00:03:30
◼
►
at 6 p.m. on Christmas Eve.
00:03:33
◼
►
So we eventually found like a 24-hour pharmacy
00:03:36
◼
►
that was open Christmas Eve, had it sent there.
00:03:39
◼
►
It was a whole thing, but we got it.
00:03:42
◼
►
Anyway, so we've had it,
00:03:44
◼
►
And we're like, all right, the good thing about Christmas
00:03:47
◼
►
is that unlike, for example, Halloween,
00:03:50
◼
►
if you are sick on Halloween, you just miss everything
00:03:53
◼
►
because Halloween is a holiday that depends
00:03:55
◼
►
on the rest of the surrounding community
00:03:57
◼
►
celebrating it at the same time you are celebrating it.
00:04:00
◼
►
Otherwise, you can't really go trick or treating.
00:04:02
◼
►
If you just show up on strangers' doors
00:04:03
◼
►
and knock on them on November 3rd,
00:04:05
◼
►
you're gonna have a bad time, I think.
00:04:08
◼
►
But Christmas, it really just depends
00:04:11
◼
►
on when your family wants to celebrate it.
00:04:14
◼
►
It's basically within your own household for the most part.
00:04:16
◼
►
And so you can kind of celebrate it whenever you want.
00:04:19
◼
►
So we just delayed it.
00:04:20
◼
►
So yesterday we sat inside and played video games all day.
00:04:23
◼
►
Big fan of Project Highrise right now. It's like a modern SimTower.
00:04:27
◼
►
It's fantastic.
00:04:28
◼
►
- Oh, I loved SimTower.
00:04:29
◼
►
- Yeah, yeah.
00:04:30
◼
►
But yeah, and so we just played video games all day.
00:04:33
◼
►
They tried to teach me how to play Don't Starve Together, and I starved.
00:04:36
◼
►
Well, no, I died almost instantly.
00:04:40
◼
►
I think I lasted maybe 90 seconds in the game
00:04:43
◼
►
before dying, so that might not be a strength of mine, but certainly SimTower-like games
00:04:49
◼
►
are, I'm enjoying. So we just did all that yesterday, and this afternoon we are driving
00:04:53
◼
►
upstate and today is fake Christmas Eve and tomorrow is fake Christmas and we're gonna
00:04:58
◼
►
celebrate it, dammit. Anyway, so I'm very thankful for modern medicine and for our flexible family and their scheduling. And yeah, it's been a time. Oh, and the 24-hour pharmacy, I'm also thankful for that.
00:05:12
◼
►
- Did you give Adam COVID tests?
00:05:15
◼
►
- Yes, and no.
00:05:17
◼
►
- And had he gotten his flu shot yet?
00:05:21
◼
►
- No, this is the thing.
00:05:22
◼
►
So he had COVID on Halloween.
00:05:25
◼
►
So that's why I'm such an expert
00:05:27
◼
►
in what it's like to miss Halloween.
00:05:29
◼
►
That was awful.
00:05:30
◼
►
And because we thought we might be exposed to it,
00:05:33
◼
►
we didn't get any boosters or anything yet,
00:05:35
◼
►
'cause we're like, all right,
00:05:36
◼
►
now let's wait two months, like they say,
00:05:37
◼
►
and then we'll get all of our boosters.
00:05:39
◼
►
and that puts us about now.
00:05:43
◼
►
- Yeah, I always remember when they would start advertising the flu shots, and they'd be advertising them before Halloween, and I'd be like, "That seems a little early for flu shots," but I eventually gave in and said, "All right, well, I'll just get it as soon as it's available." So I'll let this be a lesson to the listener.
00:05:58
◼
►
Even though sometimes it seems like they're giving out
00:05:59
◼
►
the flu shots way too early
00:06:00
◼
►
and it's not even close to flu season yet,
00:06:02
◼
►
it's a good idea to get them when you can
00:06:03
◼
►
'cause it's easy to forget.
00:06:05
◼
►
- Yeah, yeah, last year we got them all regularly on schedule
00:06:08
◼
►
and we had a totally fine, smooth winter.
00:06:10
◼
►
I mean, you know, and I know it's not 100% or whatever, but at this point, I'll take any improvement in chances. Even if it's not 100%, I'll take it.
00:06:20
◼
►
- All right, let's do some follow-up,
00:06:23
◼
►
and it starts with Brad Crittenden, who writes,
00:06:24
◼
►
"I got a butterfly keyboard class action lawsuit
00:06:26
◼
►
"settlement notice.
00:06:28
◼
►
"It claims that they will pay 300 to $395.
00:06:31
◼
►
"I have no recollection of this flying by.
00:06:33
◼
►
"I probably filled it out and just don't remember,
00:06:35
◼
►
"but I have zero recollection of this."
00:06:37
◼
►
So here's the thing, I'd never filled it out
00:06:39
◼
►
and I got one, or rather I got the notice
00:06:41
◼
►
saying I'm entitled to one or whatever,
00:06:42
◼
►
but I'm not gonna submit it because,
00:06:45
◼
►
see I never, as far as I can recall,
00:06:48
◼
►
I think I only got it repaired once,
00:06:51
◼
►
or even, did I ever get it repaired or did I just not?
00:06:54
◼
►
See I can't even remember
00:06:55
◼
►
if I actually ever got one repaired.
00:06:57
◼
►
So that's why I'm like, you know,
00:06:58
◼
►
it's 'cause the class action is basically like
00:07:01
◼
►
if you got a butterfly keyboard laptop repaired by Apple,
00:07:05
◼
►
depending on how many times you got it repaired,
00:07:07
◼
►
you have a certain payout of 100 bucks
00:07:11
◼
►
or 200 or 300 bucks apparently,
00:07:12
◼
►
but it's not a huge amount of money.
00:07:15
◼
►
'Cause I mean, look, the real problem
00:07:17
◼
►
with the butterfly keyboards,
00:07:18
◼
►
well, there were two real problems.
00:07:19
◼
►
Number one, people getting them repaired out of warranty
00:07:21
◼
►
had to pay quite a lot of money, and more than that.
00:07:24
◼
►
Remember, the price was, I believe,
00:07:25
◼
►
like over $500 to get one repair out of warranty.
00:07:29
◼
►
So there's that issue, which this, I guess,
00:07:32
◼
►
partially addresses but doesn't fully address.
00:07:35
◼
►
The other issue is any replacement you got
00:07:38
◼
►
would have the same problem eventually.
00:07:40
◼
►
And so really the whole laptop
00:07:43
◼
►
was incredibly devalued over time.
00:07:46
◼
►
That was the big problem.
00:07:48
◼
►
And so, you know, this is, I'm glad Apple, you know,
00:07:52
◼
►
got sued over the butterfly keyboards
00:07:53
◼
►
'cause frankly they should have.
00:07:55
◼
►
One issuance of that keyboard was bad enough
00:08:00
◼
►
and then to keep issuing it year after year
00:08:02
◼
►
after year after year claiming they'd fixed it
00:08:04
◼
►
was beyond negligent and borderline fraudulent.
00:08:08
◼
►
So they should have been held liable for that
00:08:11
◼
►
and it seems like they've been held
00:08:12
◼
►
a little bit liable for that.
00:08:14
◼
►
But ultimately, if you were an owner
00:08:16
◼
►
of a butterfly keyboard laptop,
00:08:18
◼
►
this kind of helps a little bit,
00:08:20
◼
►
but it doesn't really address the actual value
00:08:23
◼
►
that you lost by having that laptop
00:08:25
◼
►
and having to eventually sell or trade it in
00:08:27
◼
►
or get it repaired out of warranty.
00:08:29
◼
►
I was excited to get my money for this
00:08:31
◼
►
and then I realized, oh wait,
00:08:32
◼
►
my butterfly keyboard that broke wasn't mine,
00:08:34
◼
►
it was on my work's laptop.
00:08:35
◼
►
It was annoying though.
00:08:37
◼
►
- You gotta get your work laptop taken away
00:08:38
◼
►
and you gotta have a replacement
00:08:40
◼
►
and all your stuff is on it.
00:08:40
◼
►
It's even more annoying with all the work
00:08:42
◼
►
'cause they own the machine
00:08:43
◼
►
so it's not like you can just copy stuff off of it
00:08:45
◼
►
'cause they don't let you attach external drives
00:08:47
◼
►
and blah, blah, blah.
00:08:47
◼
►
So anyway, I hope my old workplace gets the money
00:08:51
◼
►
they have coming to them
00:08:52
◼
►
for all the butterfly keyboards that broke.
00:08:55
◼
►
- Yeah, I didn't end up getting any of these repaired.
00:08:57
◼
►
In fact, I'm trying to remember.
00:08:59
◼
►
So we still have, you know, what was my Adorable, now Erin's Adorable, in the house, and every so often, like once every few months, she'll be like, "Oh, my such-and-such key is sticking."
00:09:08
◼
►
Yep, sure is.
00:09:09
◼
►
I bet you're right.
00:09:11
◼
►
But anyways, but that never got a repair.
00:09:14
◼
►
And then I think the one or two other MacBook Pros that I had during the butterfly era were
00:09:19
◼
►
either work machines or have already been sold off.
00:09:22
◼
►
So yeah, I'd never had a repair, and that probably explains why I never got this notice.
00:09:26
◼
►
But I'm glad to hear that this is a thing,
00:09:28
◼
►
and I agree with you Marco that,
00:09:30
◼
►
you know, it's one thing to release a keyboard
00:09:32
◼
►
that's flawed and maybe you didn't know it,
00:09:34
◼
►
maybe you thought, oh it'll work itself out,
00:09:36
◼
►
and okay fine.
00:09:38
◼
►
The thing that bothered me about this whole fiasco,
00:09:40
◼
►
well one of the things that bothered me
00:09:41
◼
►
about this whole fiasco was the fact that,
00:09:43
◼
►
like you said, they just kept saying,
00:09:44
◼
►
no, no, no, it's fine, no, it's fine,
00:09:47
◼
►
no it's fine, no, it's fine.
00:09:49
◼
►
- This time we fixed it.
00:09:50
◼
►
- Oh yeah, oh we added a gasket or whatever it was,
00:09:52
◼
►
I don't remember the details,
00:09:53
◼
►
we added a gasket, that'll fix it.
00:09:55
◼
►
Because you're right, they were also saying,
00:09:57
◼
►
"Nothing's wrong, keep changing it.
00:09:58
◼
►
"Nothing's wrong, oh wait, we changed it again.
00:10:00
◼
►
"Nothing's wrong, what are you talking about?
00:10:02
◼
►
"There's no flaw, we fixed the flaw.
00:10:03
◼
►
"What are you talking about?
00:10:04
◼
►
"There's no flaw, oh, we fixed it again."
00:10:05
◼
►
Like, yeah, it was a rough time.
00:10:08
◼
►
- Huge Apple energy.
00:10:09
◼
►
- Not to mention the fact that it sucked
00:10:10
◼
►
even when it was working, but we'll move beyond that for now.
00:10:14
◼
►
- I mean, what are you gonna do?
00:10:14
◼
►
But no, I'm glad to see that this is finding,
00:10:19
◼
►
as late as it is, I'm glad to see it's finding
00:10:20
◼
►
some sort of resolution for regular people.
00:10:23
◼
►
- Two related things to this.
00:10:24
◼
►
I don't know, we couldn't find this feedback for the show, but we did get some feedback from somebody through some channel, maybe the live streams, that said they had a Mac with a butterfly keyboard, and it went bad, and they brought it to the Apple Store. And they were informed that the Apple repair program for the butterfly keyboards, you know, whatever it's called, the extended repair program that said, hey, if you have a problem we'll repair or replace it, that program has ended. So this person had a butterfly keyboard, brought it in, and they're no longer eligible to get it repaired for free or for cheap, because Apple, we talked about this
00:10:54
◼
►
when Apple rolled out the program, it's like, okay, at a certain point, Apple says, yeah,
00:10:58
◼
►
we have this repair program because we realized we put out a defective keyboard, but at a
00:11:01
◼
►
certain point, we just want to stop paying for that, so we're going to.
00:11:03
◼
►
And I think we're past that point now, which is pretty crappy, so if you do want it repaired,
00:11:07
◼
►
you're on the hook for whatever the, you know, $500 repair bill is, and you really probably just want to throw that laptop into the sea at that point.
00:11:13
◼
►
And the second thing is that, just from some brief Googling, I believe I still have a butterfly
00:11:19
◼
►
keyboard Mac that's in semi-active use. The original Retina MacBook Air had it, right?
00:11:25
◼
►
The Intel one? Yeah, the first one, the 2018 model. But the
00:11:29
◼
►
2019 did not. Yeah, maybe I got the 2019 one, I'll have
00:11:33
◼
►
to look it up. There was an Intel MacBook Air Retina that
00:11:37
◼
►
didn't have the butterfly keyboard. Yeah, I'll double check, but I was thinking,
00:11:42
◼
►
"Oh, maybe if that one ever goes bad, I'll get paid," but apparently I won't, again,
00:11:45
◼
►
according to that feedback that I can't find. So that's pretty crappy. I understand the
00:11:49
◼
►
repair programs can't last forever, but it's kind of like, what's the point of the
00:11:53
◼
►
repair program if you're saying, "We're sorry for putting out a defective laptop, but we
00:11:56
◼
►
have a timer on the lifetime of your computer."
00:11:59
◼
►
So, I mean, it seems like the keyboard is even more likely to fail after the three or
00:12:04
◼
►
four or five or however many years after that time period is up, but they're just saying,
00:12:07
◼
►
"Yeah, we can't pay for that forever," because we basically know that there's probably some
00:12:11
◼
►
graph of over time, what are the odds that a keyboard will fail, and at a certain point,
00:12:15
◼
►
it becomes like ruinous, not ruinous,
00:12:16
◼
►
but very expensive for Apple to repair
00:12:19
◼
►
because the percentage of them that are gonna go bad
00:12:21
◼
►
after 10 years is approaching like 25% or something.
00:12:25
◼
►
- Well, but it does make sense to have a time limit on it.
00:12:28
◼
►
I just think this time limit was probably too short.
00:12:31
◼
►
'Cause at some point they have to stop manufacturing
00:12:33
◼
►
and keeping the parts and the service equipment
00:12:36
◼
►
and stuff like that.
00:12:36
◼
►
So it does make sense to have an expiration date,
00:12:39
◼
►
but that expiration date should be however many years
00:12:43
◼
►
their products are expected to be in use
00:12:45
◼
►
that number of years after they stop being for sale.
00:12:49
◼
►
And so maybe it's like seven years, you know,
00:12:51
◼
►
whatever it is, like as long as they can expect
00:12:53
◼
►
a laptop to reasonably still be in use by most people,
00:12:57
◼
►
it should have covered that duration, and it didn't.
00:13:00
◼
►
- They don't have to keep making
00:13:01
◼
►
the butterfly keyboard parts,
00:13:02
◼
►
like that's not the only solution available to them.
00:13:05
◼
►
They can do what they have done in the past,
00:13:06
◼
►
which is that you bring something in to get repaired,
00:13:08
◼
►
and they're like, oh, we don't make parts for that anymore,
00:13:11
◼
►
and we don't make that product anymore,
00:13:13
◼
►
but we recognize that you've got a broken thing
00:13:15
◼
►
and it's our fault, so here is the new version.
00:13:17
◼
►
We've heard lots and lots of stories of Apple doing that.
00:13:19
◼
►
Like they basically give you a new laptop
00:13:20
◼
►
or a new iPod back in the day.
00:13:22
◼
►
That's not the one that you had
00:13:23
◼
►
that is better than the one you had,
00:13:24
◼
►
but it's the only option available to them
00:13:26
◼
►
because they don't make the old one anymore.
00:13:27
◼
►
They could always do that,
00:13:28
◼
►
but of course that would cost money.
00:13:30
◼
►
- And that would cost way more money.
00:13:32
◼
►
'Cause like-- - Yeah.
00:13:33
◼
►
Well, you shouldn't have shipped a defective keyboard for years.
00:13:35
◼
►
- I know, but like, but you know,
00:13:36
◼
►
this was not a small number of affected products,
00:13:39
◼
►
even though they kept saying,
00:13:40
◼
►
"It's only a tiny percentage,
00:13:42
◼
►
but we all know it's a much bigger percentage.
00:13:44
◼
►
- A tiny percentage after six months,
00:13:46
◼
►
is it a tiny percentage after six years?
00:13:47
◼
►
- Right, yeah, that's the thing.
00:13:49
◼
►
This would have been a massive number
00:13:52
◼
►
if they actually did that.
00:13:52
◼
►
And I've heard of that happening,
00:13:55
◼
►
and one time it even happened for me
00:13:57
◼
►
back in the white plastic MacBook days.
00:14:00
◼
►
But it's not super common.
00:14:03
◼
►
I feel like you have to have a fairly exceptional case,
00:14:05
◼
►
and I think it has to go through some levels of approval
00:14:08
◼
►
before they just give you a new thing
00:14:11
◼
►
to replace your old broken thing. So it's not that common of a thing.
00:14:16
◼
►
We are brought to you this week by Nebula, a streaming service created and owned by some
00:14:21
◼
►
of the Internet's most thoughtful creators, offering full length and ad-free videos earlier
00:14:26
◼
►
than anywhere else as well as Nebula exclusives. Personally, I love Nebula because here's what
00:14:32
◼
►
happened. I'm a YouTube video addict like many of us and I started noticing that many
00:14:39
◼
►
of my favorite creators who would have all this cool, like, usually it's like educational
00:14:43
◼
►
explainer kind of content. People like Wendover Productions and Real Engineering and Practical
00:14:48
◼
►
Engineering and Georgia Dow. I was seeing all these people whose videos I was watching,
00:14:54
◼
►
and they would keep saying, "Hey, by the way, we started the streaming service. We're over
00:14:57
◼
►
here on Nebula and all of our stuff there is ad-free and, you know, you get some bonus
00:15:01
◼
►
content or some exclusive features." And I was like, "Okay, I gotta check this out."
00:15:06
◼
►
And it is so great.
00:15:08
◼
►
They have so many creators.
00:15:09
◼
►
The ones I already knew about, plus so many more that I've discovered that I also like
00:15:13
◼
►
because it's kind of in a similar vein or at similar quality levels to what I was hoping for.
00:15:19
◼
►
And everything there is ad-free and it's full length.
00:15:22
◼
►
It's like getting the director's cut or getting bonus features.
00:15:26
◼
►
Plus they have these exclusives that are even bigger and higher production value or more
00:15:32
◼
►
in depth than you could often go in a place like YouTube.
00:15:34
◼
►
And so I am super happy with Nebula.
00:15:37
◼
►
They have fully produced originals,
00:15:39
◼
►
things like Anita Sarkeesian's That Time When or Impact,
00:15:42
◼
►
Rene Ritchie's retrospective on the launch of the iPhone.
00:15:45
◼
►
They have Nebula classes featuring all kinds
00:15:46
◼
►
of online courses led by creators,
00:15:48
◼
►
including introduction to programming called What is Code
00:15:51
◼
►
from creator and NYU professor Daniel Shiffman,
00:15:53
◼
►
and just so much more bonus and exclusive content.
00:15:56
◼
►
So see for yourself at nebula.tv/atp.
00:16:00
◼
►
Now normally this is five bucks a month or 50 bucks a year,
00:16:03
◼
►
but they're having a sale for us, 40 bucks for a year.
00:16:05
◼
►
So 40 bucks for a year by going to nebula.tv/atp.
00:16:10
◼
►
Thank you so much to Nebula for sponsoring our show.
00:16:14
◼
►
- Let's move on and talk about Stable Diffusion, baby.
00:16:20
◼
►
And apparently artists can now opt out of the next version of Stable Diffusion. This is an article in MIT Technology Review.
00:16:28
◼
►
Artists will have the chance to opt out of the next version of Stable Diffusion.
00:16:30
◼
►
The company behind it, Stability AI, has announced it will work with Spawning, an organization founded by artist couple Mat Dryhurst and Holly Herndon,
00:16:37
◼
►
who have built a website called Have I Been Trained.
00:16:40
◼
►
That's in the spirit of Have I Been Pwned, isn't it?
00:16:42
◼
►
That's so good.
00:16:42
◼
►
- Oh, all right. - That allows artists
00:16:43
◼
►
to search for their works in the dataset
00:16:47
◼
►
that was used to train Stable Diffusion.
00:16:49
◼
►
Artists will be able to select which works
00:16:51
◼
►
they want to exclude from the training data.
00:16:53
◼
►
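(To make the mechanics concrete: below is a minimal sketch of what applying an opt-out list like Have I Been Trained's could look like on the training-data side, assuming a CSV manifest of image URLs and SHA-256 content hashes plus a plain-text opt-out list. The file layout and field names are assumptions for illustration; Spawning and Stability have not published how they actually implement it.)

```python
# Hypothetical sketch: drop opted-out works from a training manifest before a
# training run. The manifest/opt-out file formats here are assumptions, not
# how Spawning or Stability AI actually do it.
import csv
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Hex SHA-256 of a file's bytes, used to build the manifest's hash column."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def load_opt_outs(path: Path) -> set[str]:
    """Opt-out list: one URL or content hash per line."""
    return {line.strip() for line in path.read_text().splitlines() if line.strip()}

def filter_manifest(manifest_csv: Path, opt_outs: set[str], out_csv: Path) -> int:
    """Copy manifest rows (with 'url' and 'sha256' columns) that are not opted
    out into a new manifest; return how many rows were kept."""
    kept = 0
    with manifest_csv.open(newline="") as src, out_csv.open("w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["url"] in opt_outs or row.get("sha256", "") in opt_outs:
                continue  # excluded from the next training run
            writer.writerow(row)
            kept += 1
    return kept
```

(Anything matched by URL or content hash simply never makes it into the next run; the argument in the episode is about everything that was never opted out in the first place.)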
- I mean, you don't even need to read the summary
00:16:54
◼
►
or the article, just the headline,
00:16:56
◼
►
which I've copied and pasted directly into the notes,
00:16:57
◼
►
just the headline is enough for you to,
00:16:59
◼
►
it's kind of like the Ian Betteridge law,
00:17:00
◼
►
if you're a user, just say no.
00:17:02
◼
►
Artists can now opt, no, you're not getting it.
00:17:05
◼
►
The idea is not, oh, we'll steal all your stuff
00:17:07
◼
►
and build a commercial product off of it and make money,
00:17:10
◼
►
but you can always track us down,
00:17:12
◼
►
find out we're doing that and tell us to stop.
00:17:14
◼
►
Everything's fine now, right?
00:17:16
◼
►
No, it's not.
00:17:17
◼
►
We'll siphon money from your bank account slowly over time,
00:17:20
◼
►
but if you catch us doing that and tell us to stop,
00:17:24
◼
►
Everything's fine now, right?
00:17:25
◼
►
No problem anymore?
00:17:26
◼
►
No, this is not the solution.
00:17:28
◼
►
You can't take people's stuff, use it,
00:17:30
◼
►
and say, "But we'll stop if you ask us to."
00:17:33
◼
►
It should not be an opt-out.
00:17:34
◼
►
Would you like to opt out
00:17:35
◼
►
of having your intellectual property stolen?
00:17:37
◼
►
Please offer that plan to Disney.
00:17:39
◼
►
Hey, Disney, we'll be using your intellectual property,
00:17:41
◼
►
but if you ask us to stop, we will.
00:17:45
◼
►
- It's hilarious that someone thought this was a good idea.
00:17:48
◼
►
- Well, I can, I mean, look,
00:17:51
◼
►
I know this is probably unpopular,
00:17:52
◼
►
so maybe I shouldn't even bother,
00:17:54
◼
►
but I think the issue of copyright and rights
00:17:59
◼
►
in terms of whether AI can be trained
00:18:02
◼
►
on your copyrighted material is a weird blurry situation.
00:18:06
◼
►
I don't think it's clear cut.
00:18:08
◼
►
'Cause if you think about what training is
00:18:11
◼
►
and how it might relate to real world analogs
00:18:13
◼
►
to which we have precedent,
00:18:14
◼
►
I am able to view a copyrighted work
00:18:19
◼
►
if it's being shown to me in a legal way.
00:18:21
◼
►
I'm able to view it and remember it and learn from it.
00:18:25
◼
►
Artists everywhere are able to look at other artists' work
00:18:29
◼
►
be inspired by them and generate new things in their spirit.
00:18:33
◼
►
And so it is kind of a weird blurry line
00:18:36
◼
►
in the sense that the AI models
00:18:38
◼
►
that are being trained in this data,
00:18:39
◼
►
they're not making illegal copies
00:18:41
◼
►
and distributing illegal copies of the original works.
00:18:45
◼
►
They are, in a way, viewing them.
00:18:47
◼
►
- But they're not people, so they're not.
00:18:49
◼
►
- Right, and so that's,
00:18:50
◼
►
and I know there was recently a court thing somewhere
00:18:52
◼
►
that said that they basically failed,
00:18:54
◼
►
that AI stuff, the output of AI models
00:18:57
◼
►
does not qualify for copyright
00:18:58
◼
►
'cause it was not created by a human.
00:19:00
◼
►
And I think that's also gonna be worked out over time
00:19:02
◼
►
in the court systems, I think that's also probably--
00:19:05
◼
►
- I think that's actually, that ruling's actually
00:19:07
◼
►
more questionable than, in my opinion,
00:19:09
◼
►
than the whether they're allowed to take your stuff.
00:19:11
◼
►
Because the not created by a human thing is fuzzy in the,
00:19:14
◼
►
okay, so the program isn't human,
00:19:17
◼
►
but a human runs the program and tweaks the output
00:19:19
◼
►
and so on and so forth, so you have to say,
00:19:22
◼
►
this isn't copyrightable because it wasn't created
00:19:23
◼
►
by a human, it's like, well, the program didn't create it
00:19:25
◼
►
on its own, it's just sitting there waiting for a human to run it.
00:19:27
◼
►
It's like saying, "This is not copyrightable because it was created by Photoshop and Photoshop's
00:19:31
◼
►
not a human."
00:19:32
◼
►
Well, someone ran Photoshop and used it and clicked around and did stuff, right?
00:19:36
◼
►
How many tweaks to the output do you have to do for it to suddenly become a copyrightable work?
00:19:40
◼
►
I think that is actually more fuzzy.
00:19:41
◼
►
What is not fuzzy, in my opinion, and again, who knows what the law will be, but in my
00:19:45
◼
►
personal opinion, is not fuzzy at all that you cannot take—that you can't take works
00:19:51
◼
►
that you don't have the rights to and use them to train an AI model.
00:19:54
◼
►
and I talked about this on Upgrade a while back,
00:19:56
◼
►
like it just, in my opinion, there's no,
00:20:00
◼
►
like I see where people think it might be fuzzy,
00:20:02
◼
►
but the difference is that it's not a person doing,
00:20:04
◼
►
you're feeding it into a program,
00:20:05
◼
►
and the value of that program is nothing
00:20:08
◼
►
without being fed input.
00:20:10
◼
►
You have to feed input into the program.
00:20:12
◼
►
If I just give you that program and say,
00:20:14
◼
►
well, it's a program that I wrote,
00:20:15
◼
►
so anything that comes out of it is mine,
00:20:16
◼
►
sure, okay, go ahead, but it's like,
00:20:18
◼
►
okay, well, my program doesn't work unless I feed it data.
00:20:20
◼
►
Okay, well, now what data,
00:20:21
◼
►
where are you getting that data from?
00:20:23
◼
►
I'll just feed it any data I feel like.
00:20:25
◼
►
The point of the acquisition of data,
00:20:27
◼
►
you are taking that data and using it.
00:20:29
◼
►
It's like taking a font and using it to make a sign.
00:20:32
◼
►
Like fonts, this is more complicated
00:20:34
◼
►
'cause you can't copyright a font.
00:20:35
◼
►
But anyway, if you wanna use the official version
00:20:37
◼
►
of a font or whatever, you have to pay for it.
00:20:39
◼
►
You can use a clone of the font or whatever,
00:20:41
◼
►
which gets things complicated.
00:20:42
◼
►
But anyway, I don't think you can take people's
00:20:45
◼
►
copyrighted material and use it to train an AI model.
00:20:47
◼
►
I just don't, without some kind of compensation for them,
00:20:50
◼
►
because you are directly deriving value from their work.
00:20:55
◼
►
And I don't think the analogy to people makes any sense
00:20:57
◼
►
because the program's not a person.
00:20:58
◼
►
Yes, a person can look at art and make derivative art,
00:21:00
◼
►
right, but it's not a person.
00:21:02
◼
►
It's a person using a program, right?
00:21:04
◼
►
It's like, well, if I use a program to copy this,
00:21:07
◼
►
I didn't copy your art and resell it.
00:21:09
◼
►
I used the cp program, and the cp program copied your art, but then once I had it from the cp program, then I sold it, but it's fine because the cp program did just what a person does, looks at the art, right? No, it's just a more complicated version of cp.
00:21:22
◼
►
Now, will lawyers agree?
00:21:25
◼
►
I think it really depends on, especially in our country,
00:21:27
◼
►
how rich the corporation is on each side of the case
00:21:30
◼
►
when it comes to, like if Disney arrives and says,
00:21:32
◼
►
"Hey, we see you've trained your AI model
00:21:34
◼
►
"on all of our intellectual property,"
00:21:35
◼
►
probably Disney's gonna win, based on the precedent.
00:21:39
◼
►
And in this case, I think Disney would be right
00:21:41
◼
►
that you can't train your model on stuff made by Disney.
00:21:45
◼
►
But as I said on Upgrade, Disney can train an AI model
00:21:48
◼
►
on stuff made by Disney.
00:21:49
◼
►
Like I'm not saying AI models are useless.
00:21:51
◼
►
Disney can train all the models they want
00:21:53
◼
►
on all their own intellectual property
00:21:55
◼
►
and use it to generate stuff.
00:21:57
◼
►
And I think the stuff that Disney generates
00:21:59
◼
►
from Disney's own intellectual property
00:22:01
◼
►
should be copyrightable because humans are running that
00:22:04
◼
►
and doing the prompts and tweaking the output and whatever.
00:22:06
◼
►
And I think that should be copyrightable.
00:22:08
◼
►
So, so far, the one legal case that we've heard of
00:22:12
◼
►
has gone the opposite way as I expected.
00:22:14
◼
►
But this is a, you know, I think you're right, Marco,
00:22:16
◼
►
that lots of people have different opinions on this.
00:22:18
◼
►
I just feel like mine is very strongly against using other people's intellectual property.
00:22:23
◼
►
And I think the problem with this Stable Diffusion thing is it's kind of admitting that, hey,
00:22:29
◼
►
we probably shouldn't be using your stuff because they will immediately stop using it
00:22:32
◼
►
if you ask them.
00:22:33
◼
►
So based on that silly policy, they're saying, oh, we understand that you have the right
00:22:38
◼
►
to make us stop.
00:22:39
◼
►
Or maybe they're just trying to say we're being nice and we'll stop if you want.
00:22:42
◼
►
But given my opinion that it is totally wrong for them to be taking that stuff in the first
00:22:46
◼
►
I think this is a dumb policy.
00:22:48
◼
►
- I think this is one of those areas where
00:22:51
◼
►
we're gonna look back at whatever our opinions are now,
00:22:54
◼
►
we're gonna look back in five or 10 years
00:22:55
◼
►
and be like, man, we were way off.
00:22:57
◼
►
And I don't know which direction we're gonna go in,
00:22:59
◼
►
but I think this is gonna be a rapidly evolving policy
00:23:04
◼
►
and what's acceptable and what's legal
00:23:08
◼
►
are both also going to rapidly evolve.
00:23:10
◼
►
I agree with you on the output being copyrightable.
00:23:15
◼
►
again, it's like, if you use Photoshop to flood fill an area with a pattern, does that
00:23:19
◼
►
not qualify for copyright because the program made the pattern and you didn't? So that
00:23:24
◼
►
obviously, I think using it, I think the output is copyrightable, but I think the input is
00:23:29
◼
►
really questionable and it's not clear cut. I think one area where it could kind of fall
00:23:37
◼
►
legally into some kind of settlement is that when you watch a movie and you get the little
00:23:43
◼
►
FBI warning at the beginning, there are certain rights that you have and certain rights that
00:23:47
◼
►
you don't. You are not, for instance, allowed to buy a copy of the movie on Blu-ray or whatever
00:23:53
◼
►
and then show it to an audience who has paid to be there. You aren't allowed to use it
00:23:57
◼
►
commercially. You aren't allowed to do public performances. And the same thing applies to
00:24:03
◼
►
most music and things like that. And so you could argue, if you are a copyright holder,
00:24:08
◼
►
You could maybe argue, well, the use of your model viewing or listening to my content is
00:24:15
◼
►
not licensed because it's not personal, private home use or whatever.
00:24:20
◼
►
So they could maybe get something in there, but I think overall the concept, I still currently
00:24:25
◼
►
at least, I still fall on the side of the AI is just watching and learning from things
00:24:31
◼
►
in a similar, well, maybe not too similar, but it's in an analogous way to how humans
00:24:37
◼
►
might see something and be inspired by it over time.
00:24:40
◼
►
But what do you have to say about the idea that the program doesn't have any rights?
00:24:43
◼
►
Because in our law system, computer programs don't have rights right now, right? And I
00:24:48
◼
►
think this specific computer program definitely shouldn't have rights, because it's not a
00:24:51
◼
►
sentient being. So yes, it is doing something similar to what humans do, broad strokes,
00:24:56
◼
►
but it doesn't have any rights, because it is not a sentient being, and all of our laws
00:25:00
◼
►
apply to sentient beings, right? No, for that, I mean, I guess I agree with that,
00:25:04
◼
►
but I don't see this as whether the AI has rights itself.
00:25:09
◼
►
I see this as like--
00:25:11
◼
►
- You're trying to give it creative control.
00:25:12
◼
►
It's saying it created this thing,
00:25:14
◼
►
therefore now the program has rights to it.
00:25:15
◼
►
I mean, if you wanted to say that, I would say great.
00:25:17
◼
►
Now the program can do whatever it wants with it.
00:25:19
◼
►
You're like, well, the program doesn't wanna do anything
00:25:22
◼
►
with it, the human that ran the program, though,
00:25:24
◼
►
wants to sell it as an in-app purchase.
00:25:26
◼
►
So it's like, okay, if you actually want to assign rights,
00:25:29
◼
►
assign them to the program, and then you'd be like,
00:25:30
◼
►
well, that's dumb, the program can't do anything.
00:25:32
◼
►
- Exactly, that's why it doesn't have any rights.
00:25:35
◼
►
- Well, but I think it's more complicated than that.
00:25:38
◼
►
Like when you dive into the technical part of it,
00:25:40
◼
►
like okay, well what about when Google indexes
00:25:43
◼
►
fact data off of pages, and then you do a fact query
00:25:46
◼
►
and it shows one of those Google knowledge things.
00:25:48
◼
►
That's a questionable area of copyright as well, right?
00:25:51
◼
►
But I think that's very, very similar to this.
00:25:53
◼
►
- It is, but see, here's the thing.
00:25:55
◼
►
You're saying how this turns out,
00:25:56
◼
►
it can go a couple different ways.
00:25:58
◼
►
It really depends on, in the case of the Google thing,
00:26:02
◼
►
when that case comes-- I don't remember what the resolution of the case was--
00:26:05
◼
►
I think Google won that one.
00:26:06
◼
►
But when the cases come out, there are two strong forces
00:26:09
◼
►
One is rich corporations want something.
00:26:12
◼
►
In this case, Google is the rich corporation.
00:26:14
◼
►
It wanted to be able to take summaries of web pages
00:26:17
◼
►
and show them as its own output, which is arguably,
00:26:20
◼
►
you're right, very similar to what these models do.
00:26:22
◼
►
But I'm pretty sure--
00:26:23
◼
►
and we'll have all of this for a moment--
00:26:25
◼
►
I'm pretty sure Google won that case.
00:26:27
◼
►
And did they win it because they were right,
00:26:28
◼
►
or did they win it because they're Google,
00:26:30
◼
►
and they're rich, and they have a lot of lawyers?
00:26:32
◼
►
see also Disney and copyright.
00:26:34
◼
►
- Right, exactly, right.
00:26:35
◼
►
So I think, but in using the analogy to what Google does,
00:26:38
◼
►
I kind of agree that Google, it's a little bit fuzzier,
00:26:42
◼
►
but because Google's whole thing is that
00:26:44
◼
►
every one of their search results
00:26:45
◼
►
is arguably a summary or whatever,
00:26:47
◼
►
but Google is trying to be a venue for you to go somewhere,
00:26:51
◼
►
and by putting the summary things,
00:26:52
◼
►
they're making you not go there.
00:26:53
◼
►
When a program is summarizing
00:26:58
◼
►
and providing like an index or whatever,
00:27:00
◼
►
If you follow that road too far down, you're like,
00:27:02
◼
►
oh, you can't have search engines anymore
00:27:03
◼
►
because you're not allowed to reproduce
00:27:05
◼
►
any portion of my work, even a summary of it
00:27:07
◼
►
if it's programmatically generated anywhere.
00:27:08
◼
►
And that gets absurd.
00:27:09
◼
►
Obviously, we think we should have search engines,
00:27:11
◼
►
so let's not make a dumb law like that.
00:27:12
◼
►
- You can't even have browser caches
00:27:14
◼
►
if you follow that law too closely.
00:27:16
◼
►
- Right, but then there, but you know,
00:27:17
◼
►
this is the kind of technical nuance
00:27:19
◼
►
that laws aren't great with, okay?
00:27:20
◼
►
But what about when they take my webpage
00:27:21
◼
►
and they don't even provide a link to it
00:27:23
◼
►
and they just summarize the content
00:27:24
◼
►
and re-smash it through and say, oh, here's the answer
00:27:27
◼
►
and it's provided by Google
00:27:28
◼
►
and they don't give credit to my answer
00:27:29
◼
►
I'm the one who wrote that like, you know, "Grapes are poison to dogs" on my webpage,
00:27:33
◼
►
and they just took my paragraph and summarized it or massaged it and put it on there, right?
00:27:38
◼
►
And I, you know, again, I think Google won that case because they're rich, not because
00:27:41
◼
►
they're right.
00:27:42
◼
►
I would have decided that differently, but it's a nuanced ruling in that case, very nuanced,
00:27:45
◼
►
because I don't want to outlaw search engines or browser caches.
00:27:47
◼
►
In the case of the, you know, Disney finds someone's been training their AI models on,
00:27:52
◼
►
you know, Marvel superheroes, which absolutely has already happened, I'm sure, right?
00:27:57
◼
►
And Disney gets angry and sues.
00:27:59
◼
►
Disney's probably going to win that because they have all the politicians in their pocket
00:28:02
◼
►
or whatever the paraphrase of the line from The Godfather is.
00:28:07
◼
►
But on the other side of that, unlike the Google case, there is another side to that
00:28:11
◼
►
that is potentially just as powerful, which is people like free stuff.
00:28:16
◼
►
And combined with the fact that it's very difficult to – it's not immediately obvious
00:28:21
◼
►
what has been used to train an AI model.
00:28:23
◼
►
And then mixed into that, these sort of academic things, which is like, well, they're not
00:28:26
◼
►
selling anything that's just part of academic research and they should have broader purview
00:28:32
◼
►
to do things for non-commercial purposes or whatever, right?
00:28:34
◼
►
But the fact that it's just going to be like, "Well, Disney can sue, but they can try to
00:28:39
◼
►
make laws in the US, but the world is bigger than the US and anything that gets on the
00:28:43
◼
►
internet is everywhere, so good luck trying to stop people from making pictures of pregnant... yeah, pictures of pregnant Sonic. Sega would probably not like those to exist, but people are going to make them, and it's a derivative work.
00:28:54
◼
►
That's probably a bad example.
00:28:55
◼
►
But anyway, these AI models are going to create images, and they're going to go everywhere
00:29:01
◼
►
if people want them.
00:29:02
◼
►
If it is a useful thing to have, it's going to be so hard to stop them.
00:29:05
◼
►
So the countervailing force here is at a certain point the case is coming, they're going to
00:29:10
◼
►
say, "Well, we think this should probably be illegal, but it's just so common and people
00:29:13
◼
►
really like it."
00:29:14
◼
►
And I know courts don't put an opinion that says, "Well, everybody does it, so it's fine."
00:29:18
◼
►
But practically speaking, you can end up with decisions that kind of lean in that direction.
00:29:23
◼
►
I'm sure there's some legal precedent of like, you know, like people who squat on a piece
00:29:27
◼
►
of land and at a certain point they own it.
00:29:29
◼
►
That doesn't make any sense.
00:29:30
◼
►
You're like, "That doesn't seem fair."
00:29:31
◼
►
But it's like, "But we have laws like that because it's even more absurd to do the opposite
00:29:34
◼
►
in some cases."
00:29:35
◼
►
Well, we've been living on this land for 100 years, but you know, you're going to kick
00:29:38
◼
►
us off because, well, again, in this country there's terrible analogies for that as well.
00:29:42
◼
►
Anyway, I'm not doing well with analogies.
00:29:44
◼
►
It's early in the morning.
00:29:45
◼
►
What can I tell you?
00:29:46
◼
►
So it depends on when these cases come out.
00:29:48
◼
►
I feel like there actually is something that may be equivalent to Disney trying to say,
00:29:54
◼
►
"Hey," again legally, "Hey, we've decided that this is okay."
00:29:58
◼
►
Regardless of what the law ends up being, I think it is not right to take people's copyrighted
00:30:02
◼
►
work, train a model on it, and sell the results of that model's output.
00:30:06
◼
►
If that model ever becomes a sentient being, which is plausible in the future, then we'll
00:30:10
◼
►
have a different discussion.
00:30:11
◼
►
That discussion will be what kind of rights do sentient computer programs have.
00:30:14
◼
►
And that is a very different discussion,
00:30:16
◼
►
but I want to be clear,
00:30:17
◼
►
none of these programs are sentient,
00:30:19
◼
►
no one is arguing that they are.
00:30:21
◼
►
So we can set that aside for a sci-fi future,
00:30:24
◼
►
but that's not what we're dealing with now.
00:30:26
◼
►
- Yeah, see, I see where you're coming from,
00:30:28
◼
►
I don't agree.
00:30:29
◼
►
I still think that--
00:30:30
◼
►
- Would you agree if you were an artist
00:30:32
◼
►
and your work was being taken
00:30:33
◼
►
and then the results of it being sold?
00:30:35
◼
►
- We are artists.
00:30:36
◼
►
So what would you think about somebody downloading all,
00:30:39
◼
►
all episodes of ATP we've ever made,
00:30:40
◼
►
all the podcasts we've ever made,
00:30:42
◼
►
and training a model on that?
00:30:44
◼
►
- Yeah, I don't think that's something,
00:30:45
◼
►
and selling the results,
00:30:46
◼
►
I don't think that's something you should do.
00:30:47
◼
►
What if they took the source code for Overcast
00:30:50
◼
►
and fed it into a program that tweaked it
00:30:52
◼
►
and then started selling new copies of Overcast?
00:30:55
◼
►
- Well, I don't release the source code,
00:30:56
◼
►
so that's a little bit different, but you know--
00:30:57
◼
►
- Yeah, it can reverse compile it,
00:30:59
◼
►
'cause you released the binary
00:31:00
◼
►
and they can just generate the source code from that.
00:31:01
◼
►
You've all seen that, and the program doesn't care
00:31:03
◼
►
what the variable names are all, A, B, C, D, and stuff,
00:31:05
◼
►
because the program doesn't care about that,
00:31:06
◼
►
but the program, you know, decompiles it, refactors it, changes it a little bit,
00:31:11
◼
►
and sells a new thing.
00:31:12
◼
►
I said, "Derivative work."
00:31:12
◼
►
just looked at it just the same way a person would.
00:31:14
◼
►
- Well see, and that I think, that's why
00:31:17
◼
►
I'm fairly hesitant to restrict the training side of things
00:31:20
◼
►
because in my opinion, I would rather use the systems
00:31:24
◼
►
we already have with copyright and fair use
00:31:27
◼
►
and trademark also, which is a little bit relevant
00:31:30
◼
►
in some of these things, but anyway,
00:31:31
◼
►
I'd rather use the systems we already have
00:31:34
◼
►
around copyright and fair use to manage
00:31:37
◼
►
when things go wrong and when things are misused,
00:31:40
◼
►
rather than say this new kind of technology
00:31:43
◼
►
that's really exciting and has a lot of good valid uses,
00:31:45
◼
►
we're gonna make it way harder
00:31:47
◼
►
and way less practical to create.
00:31:49
◼
►
Because if somebody uses an AI model
00:31:51
◼
►
that's been trained on whatever they could find,
00:31:53
◼
►
if they use that to generate something
00:31:55
◼
►
that is too close to an existing copyrighted work
00:31:58
◼
►
or that is clearly a derivative work
00:32:00
◼
►
of a protected character or franchise or whatever,
00:32:02
◼
►
then we already have infrastructure in place
00:32:05
◼
►
to issue takedown notices, to sue for something being
00:32:09
◼
►
or not being fair use, we already have all those systems.
00:32:12
◼
►
And I think we are better off relying
00:32:14
◼
►
on that human judgment side rather than
00:32:17
◼
►
kneecapping this technology in its early days
00:32:20
◼
►
in ways that are extremely impractical.
00:32:22
◼
►
I mean, if you think about, look at a similar problem
00:32:25
◼
►
that we faced when making web 2.0 and everything else.
00:32:29
◼
►
We had the rise of user-generated content sites,
00:32:33
◼
►
places like YouTube and Flickr and Tumblr
00:32:36
◼
►
and all these places.
00:32:37
◼
►
And you could say, legally speaking,
00:32:40
◼
►
you could say it should be impossible
00:32:44
◼
►
for anyone to ever upload a copy of my work
00:32:47
◼
►
to these services.
00:32:48
◼
►
In practice, that's really impractical and limiting.
00:32:51
◼
►
And so, kind of what you said a minute ago, John,
00:32:54
◼
►
so really the practice became,
00:32:56
◼
►
all right, well we're gonna have the structure in place
00:32:57
◼
►
to manage when there are violations,
00:32:59
◼
►
but it's not gonna be perfect,
00:33:01
◼
►
and it's not gonna be the intellectual property ideal
00:33:05
◼
►
of having it be impossible to break the law,
00:33:07
◼
►
but we're gonna just make a process in place
00:33:10
◼
►
so that we can deal with when people do things
00:33:12
◼
►
that are over the line.
00:33:13
◼
►
- But their process is they have a program
00:33:15
◼
►
called Content ID that will immediately detect
00:33:17
◼
►
if you have music playing in the background
00:33:19
◼
►
that's copyrighted, even if it's like
00:33:21
◼
►
the next house over, right?
00:33:22
◼
►
It is so incredibly fast and automated
00:33:25
◼
►
and defaults to rejecting.
00:33:26
◼
►
I mean, the YouTube video I wanted to see about Fusion
00:33:31
◼
►
that I think I might have mentioned on a past show
00:33:33
◼
►
was from Real Engineering, I think.
00:33:36
◼
►
It was rejected by copyright ID from YouTube or whatever,
00:33:40
◼
►
because the music used in the background was copyrighted.
00:33:44
◼
►
And the music used in the background
00:33:45
◼
►
was from a set of licensable music
00:33:47
◼
►
that the channel had licensed.
00:33:49
◼
►
So the copyright ID thing is so automated and so strict
00:33:53
◼
►
and so defaulting to rejection that it says,
00:33:55
◼
►
oh, this is copyrighted.
00:33:55
◼
►
It's like, yeah, and I paid for it.
00:33:57
◼
►
I paid for it to use on my channel.
00:33:59
◼
►
But it's like, no, program says no.
00:34:00
◼
►
So if it becomes possible to programmatically detect if your AI training model was trained
00:34:06
◼
►
with any copyrighted data, no one will ever be able to upload any AI trained stuff to
00:34:11
◼
►
YouTube, because YouTube will immediately detect it and reject it, even if you've paid
00:34:14
◼
►
for the rights, because that's the side they err on because big corporations don't like
00:34:20
◼
►
It's more like, you know, okay, so that's YouTube, but then what about the wider internet,
00:34:23
◼
►
like that it'll be everywhere.
00:34:24
◼
►
And again, I think AI models are still useful and can be useful, particularly for companies
00:34:30
◼
►
that want to train it on their own intellectual property.
00:34:32
◼
►
Like it is useful to train things on stuff that you own.
00:34:34
◼
►
And even an individual might not own enough to train a model,
00:34:37
◼
►
but there are licensable image sets that you can train on.
00:34:40
◼
►
That's also legally valid, right?
00:34:42
◼
►
I don't think, to your point about kneecapping this sort of thing, I don't think we need any new laws.
00:34:46
◼
►
My interpretation of the existing laws is,
00:34:48
◼
►
this is already illegal, but the existing laws
00:34:50
◼
►
also provide lots of ways that it's legal.
00:34:53
◼
►
You know, Getty Images can sell image training sets.
00:34:56
◼
►
There are free public domain image training sets.
00:34:59
◼
►
if you would like to license an image training set from Disney, maybe they'll sell you one
00:35:03
◼
►
and then you can train your model on that.
00:35:05
◼
►
And then we have to have all the automated systems saying, "Okay, I can tell this was
00:35:08
◼
►
trained on X, Y, and Z images, and X, Y, and Z images are from these licensable sets,"
00:35:13
◼
►
and then you have to show that you licensed it, just the same way the Real Engineering guy had
00:35:16
◼
►
to say, "No, no, I licensed this music thing from this company.
00:35:20
◼
►
I'm allowed to use it in my video.
00:35:21
◼
►
This is how the system is supposed to work, YouTube.
00:35:24
◼
►
Stop rejecting my videos."
00:35:26
◼
►
and annoying, but if anything, if there is a programmatic way to enforce existing laws on fair use and intellectual property,
00:35:34
◼
►
it will be worse for creativity and innovation than humans enforcing it, because computers are really, really good at doing stuff like detecting if there's copyrighted music.
00:35:43
◼
►
And I presume eventually computers will be pretty good at figuring out if any copyrighted works from some set are part of a training set.
00:35:51
◼
►
because that's, you know, Have I Been Trained or whatever, and maybe they
00:35:54
◼
►
just have a database, I don't know how it works or whatever, but once it's programmatically
00:35:57
◼
►
possible to answer this question, it swings farther in the direction of Disney stopping
00:36:02
◼
►
you from ever training any image set on Star Wars.
00:36:05
◼
►
Well, I mean, and again, I think using Content ID on YouTube as an example is, that's kind
00:36:10
◼
►
of an extreme case, even though it is a highly relevant one, just because Content ID is so
00:36:13
◼
►
overly aggressive, but you know, that's not, and if you think about also, like, some other
00:36:17
◼
►
angle of this, you know, if Disney starts using this technology to like, you know, scan
00:36:22
◼
►
everything that they can find on YouTube and be like, "Oh wait, that's Elsa. Oh, stop,
00:36:26
◼
►
wait." And like, they can use the AI to detect derivative works even. Disney could start
00:36:31
◼
►
scanning like DeviantArt and Flickr and stuff for like fan-drawn versions of their characters.
00:36:38
◼
►
Well, but under existing laws, they have to show it was taken from, like, one of their actual-- Well, a drawing's not a picture of Elsa, right?
00:36:45
◼
►
Although if you do want to put Elsa on a t-shirt and sell it,
00:36:48
◼
►
you can't say, well, I drew the Elsa myself, so it's fine.
00:36:50
◼
►
Actually, I don't know where the law is on that,
00:36:52
◼
►
depending on how derivative the work is.
00:36:54
◼
►
But anyway, what you would expect for the training thing
00:36:56
◼
►
and how it would be trained is not just, hey,
00:36:58
◼
►
you drew a picture of an apple, and there's
00:36:59
◼
►
a picture of an apple in the training set.
00:37:00
◼
►
It's like, is it literally your picture of the apple?
00:37:03
◼
►
And that's where it gets tricky.
00:37:05
◼
►
Again, I don't know how this Have I Been Trained thing works. Maybe it is byte for byte-- they know the checksums of every image that was part of the data set, and you can upload one of your images, and if the checksum matches, then it's the same.
00:37:17
◼
►
Is it literally byte for byte and if you resize your picture
00:37:20
◼
►
or change one pixel, it doesn't match?
00:37:22
◼
►
Or is it fuzzier than that?
00:37:23
◼
►
This is kind of similar to the CSAM system
00:37:26
◼
►
of perceptual hashing or whatever.
00:37:28
◼
►
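(A toy illustration of the distinction being discussed: an exact checksum breaks if you resize the picture or change one pixel, while a simple perceptual hash, here a tiny average hash, tolerates small edits. This assumes Pillow for image loading and is only a stand-in for whatever Have I Been Trained or a CSAM-style system actually uses.)

```python
# Sketch contrasting exact vs. perceptual matching of an image against a
# training-set index. The aHash below is a toy; real systems use stronger
# perceptual fingerprints.
import hashlib
from PIL import Image  # assumes Pillow is installed

def exact_hash(path: str) -> str:
    """Byte-for-byte fingerprint: changes if even one byte of the file changes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def average_hash(path: str, size: int = 8) -> int:
    """Tiny perceptual hash: grayscale, shrink to 8x8, threshold on the mean.
    Survives resizing and small edits, unlike the exact hash."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def probably_in_training_set(path: str, exact_index: set[str],
                             perceptual_index: set[int], max_distance: int = 5) -> bool:
    """Exact match first; otherwise a fuzzy match within a few differing bits."""
    if exact_hash(path) in exact_index:
        return True
    h = average_hash(path)
    return any(hamming(h, known) <= max_distance for known in perceptual_index)
```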
Because that gets into what you're saying.
00:37:30
◼
►
It's like, OK, well, it's not from a Disney picture of Elsa.
00:37:33
◼
►
It's a picture that I drew of Elsa.
00:37:35
◼
►
And that's different.
00:37:37
◼
►
And again, with Disney being overly aggressive, saying,
00:37:39
◼
►
oh, great, we have a way to tell if anyone ever trained a model on any picture of Elsa.
00:37:44
◼
►
I'm not sure where that falls under existing laws.
00:37:46
◼
►
But is that even possible?
00:37:48
◼
►
Is it possible for them to say,
00:37:50
◼
►
we can tell that you trained your model on these images?
00:37:53
◼
►
- I don't know if you can reverse it,
00:37:54
◼
►
but I assume the people who train the models
00:37:57
◼
►
know literally every single image that's in the image set.
00:37:59
◼
►
It's a long list, but they probably,
00:38:01
◼
►
they have the data somewhere,
00:38:02
◼
►
and they could build a database based on that data
00:38:05
◼
►
with either real content hashes or perceptual hashes. That's how I assume Have I Been Trained is working.
00:38:10
◼
►
Like they know, if you train it on a data set
00:38:13
◼
►
and then you delete the data set
00:38:14
◼
►
and then you can be like, oh well Shrug,
00:38:16
◼
►
I don't even know what it was trained on.
00:38:17
◼
►
I think it's difficult to like,
00:38:19
◼
►
difficult or impossible to reverse it to say,
00:38:21
◼
►
oh, I can tell by looking at the output
00:38:22
◼
►
what it was trained on.
00:38:23
◼
►
I don't think that's possible.
00:38:24
◼
►
But hopefully you keep around.
00:38:27
◼
►
This is where you might need new laws.
00:38:28
◼
►
Like, hey, if you wanna use an AI model,
00:38:30
◼
►
you have to be able to prove something
00:38:32
◼
►
about your training set.
00:38:33
◼
►
And if you have an AI model and you say,
00:38:34
◼
►
oh, we deleted our training set.
00:38:35
◼
►
Oh, well, the law would say,
00:38:37
◼
►
yeah, you can't use any of that.
00:38:38
◼
►
because part of whatever our laws,
00:38:40
◼
►
new laws that come out will say,
00:38:42
◼
►
"Okay, here are the situations
00:38:44
◼
►
"under which the output of an AI model is copyrightable,
00:38:48
◼
►
"can be used successfully."
00:38:50
◼
►
You probably have to be able to prove something
00:38:53
◼
►
about the training set,
00:38:53
◼
►
and ignorance would not be a defense in the eyes of the law.
00:38:56
◼
►
They're like, "We don't even know
00:38:57
◼
►
"what the training set was, so everything's fine, right?"
00:38:59
◼
►
Yeah, probably not.
00:39:00
◼
►
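(A rough sketch of what "being able to prove something about your training set" might look like in practice: a provenance manifest written at training time and queryable later. The JSON-lines format and field names are assumptions for illustration, not any real regulation or toolchain.)

```python
# Toy provenance manifest: record each training item's hash, source, and
# license at training time, so the answer to an audit isn't "we deleted the
# training set." Field names and the JSON-lines file are assumptions.
import hashlib
import json
from pathlib import Path
from typing import Optional

def record_item(manifest: Path, item: Path, source_url: str, license_name: str) -> None:
    """Append one training item's provenance record to the manifest."""
    digest = hashlib.sha256(item.read_bytes()).hexdigest()
    entry = {"sha256": digest, "source": source_url, "license": license_name}
    with manifest.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def audit(manifest: Path, work: Path) -> Optional[dict]:
    """Was this exact work in the training set, and if so, under what terms?"""
    digest = hashlib.sha256(work.read_bytes()).hexdigest()
    with manifest.open() as f:
        for line in f:
            entry = json.loads(line)
            if entry["sha256"] == digest:
                return entry
    return None
```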
- I don't know, I'm still on the side of
00:39:02
◼
►
this is addressing it at the wrong stage of the pipeline.
00:39:04
◼
►
I mean, 'cause it's gonna be more complicated.
00:39:07
◼
►
many new models that come out are not based on nothing
00:39:10
◼
►
and trained from scratch, they're based on initial models
00:39:13
◼
►
that were pre-trained by somebody else.
00:39:16
◼
►
- That's part of what's in favor of the chaos
00:39:18
◼
►
of the internet, it's like now we can't even tell,
00:39:19
◼
►
what are you gonna do now?
00:39:20
◼
►
- Right, you know, I think there's gonna be,
00:39:23
◼
►
I think it's gonna be quickly very difficult
00:39:25
◼
►
to ever tell how something was trained, in its entirety.
00:39:29
◼
►
And I think, again, I think this is as unenforceable
00:39:33
◼
►
as the rule that it should be impossible to ever upload
00:39:35
◼
►
copyrighted material to your platform.
00:39:36
◼
►
Like that should, it's just impossible.
00:39:39
◼
►
- But I mean, but that's more of like a practicality.
00:39:42
◼
►
The legal thing is like, oh, you can't actually,
00:39:44
◼
►
you don't actually own this copyrighted material.
00:39:46
◼
►
The law is clear, it's just the practicalities
00:39:48
◼
►
of the enforcement of the law are not clear.
00:39:49
◼
►
And here I think the law is not clear yet.
00:39:52
◼
►
- Right, yeah, because I still think it's definitely
00:39:55
◼
►
a very open question about like whether training
00:39:58
◼
►
an AI model on a copyrighted work is itself
00:40:02
◼
►
a violation of that copyright or not.
00:40:04
◼
►
Again, 'cause I still don't think it is.
00:40:06
◼
►
Well, we'll see what the courts say,
00:40:08
◼
►
but I'm pretty strongly on one side of this.
00:40:10
◼
►
But, and again, this doesn't mean I'm against AI models.
00:40:12
◼
►
I think they're super useful,
00:40:13
◼
►
and I think there's lots of places where they can be used.
00:40:16
◼
►
It's just that I think there should be some discipline
00:40:19
◼
►
about the training, that's all.
00:40:21
◼
►
- We are brought to you this week by Blaze.
00:40:25
◼
►
Have you heard of blaze.tech?
00:40:27
◼
►
Blaze.tech lets you build software and web apps
00:40:30
◼
►
without writing code.
00:40:31
◼
►
So really, that's it.
00:40:33
◼
►
If you've been waiting around for an engineer
00:40:34
◼
►
to help you build something,
00:40:36
◼
►
You don't need to wait anymore.
00:40:37
◼
►
Just use Blaze.tech and you can build software
00:40:40
◼
►
without any coding knowledge at all.
00:40:43
◼
►
Anyone with an entrepreneurial spirit
00:40:45
◼
►
can build software with no code with Blaze.tech.
00:40:47
◼
►
So if you want, you can also ask Blaze.tech's
00:40:50
◼
►
world-class team to fully set up and launch your app.
00:40:52
◼
►
That way you can focus on your business
00:40:54
◼
►
and Blaze.tech can take care of your app
00:40:56
◼
►
and it's super easy to use.
00:40:58
◼
►
They have a drag and drop builder,
00:41:00
◼
►
integration with every popular API you can think of
00:41:02
◼
►
so you can create and launch apps in minutes.
00:41:05
◼
►
People have used Blaze to build things like healthcare apps, inventory management systems,
00:41:10
◼
►
customer portals, HR and recruiting apps, automated workflows, and tons of other software,
00:41:16
◼
►
all without coding or needing an engineer to help them.
00:41:19
◼
►
They are super secure also, so healthcare, legal, and finance organizations love them.
00:41:24
◼
►
So check out Blaze.tech.
00:41:26
◼
►
Both startups and Fortune 500s are using Blaze.tech as their secret weapon to quickly create and
00:41:32
◼
►
launch new software.
00:41:33
◼
►
Sign up for Blaze.tech at blaze.tech/ATP for 10% off.
00:41:38
◼
►
Once again, blaze.tech/ATP for 10% off.
00:41:42
◼
►
Pricing starts at just 25 bucks per user per month.
00:41:46
◼
►
Thank you so much to Blaze.tech for sponsoring our show.
00:41:50
◼
►
- Ted Shop writes that we would probably get a kick
00:41:57
◼
►
out of a new Twitter to Mastodon web app,
00:42:00
◼
►
and this is mastodon-flock.vercel.app.
00:42:04
◼
►
I didn't try it past the first screen or two,
00:42:07
◼
►
but as a former Windows user, oh, the nostalgia.
00:42:13
◼
►
This made me laugh quite a bit.
00:42:14
◼
►
And it looks like a Windows 95 installation wizard.
00:42:17
◼
►
Big NSIS, like Nullsoft Scriptable Install System, energy.
00:42:21
◼
►
I remember I used to play with that,
00:42:23
◼
►
probably around the time that Marco and I--
00:42:24
◼
►
- No, NSIS was what replaced this.
00:42:26
◼
►
That was much better.
00:42:27
◼
►
Wasn't NSIS not full screen?
00:42:29
◼
►
- That's true, I think you're right.
00:42:30
◼
►
This is that full screen blue gradient that was used a lot around that time.
00:42:35
◼
►
And now this is, you could tell that they installed the driver for their GPU before they ran this wizard
00:42:40
◼
►
because the background gradient is not super dithered in 16 color mode.
00:42:45
◼
►
I thought this was really interesting because, well, two levels.
00:42:48
◼
►
One, it is just like Movetodon in that it gives you follow links.
00:42:52
◼
►
So ignore the whole Windows 95 look or whatever.
00:42:54
◼
►
It is another tool that is like Movetodon.
00:42:56
◼
►
So if Movetodon ever goes away and you want to try something similar,
00:42:59
◼
►
this does the same thing, it authenticates with all your things, then it gives you follow
00:43:03
◼
►
buttons for all the people who have the info.
00:43:06
◼
►
I don't know if it does a better job of scraping or whatever.
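(For anyone curious what the mechanics of a tool like this or Movetodon look like on the Mastodon side, here is a hypothetical Swift sketch: given a handle scraped from someone's Twitter profile, look the account up on your home server and follow it. The lookup and follow endpoints shown are, to the best of my knowledge, standard Mastodon REST API calls, but the token, server, and the complete lack of error handling are simplified placeholders, and the Twitter-scraping half is not shown at all.)

```swift
// Hypothetical sketch of the Mastodon half of a "follow everyone you
// followed on Twitter" tool. Requires an OAuth token with write:follows.
import Foundation

struct MastodonAccount: Decodable {
    let id: String
    let acct: String
}

struct MastodonClient {
    let server: URL            // e.g. URL(string: "https://mastodon.social")!
    let accessToken: String    // placeholder; obtained via the usual OAuth flow

    private func request(path: String, query: [URLQueryItem] = [], method: String = "GET") -> URLRequest {
        var components = URLComponents(url: server.appendingPathComponent(path),
                                       resolvingAgainstBaseURL: false)!
        if !query.isEmpty { components.queryItems = query }
        var req = URLRequest(url: components.url!)
        req.httpMethod = method
        req.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
        return req
    }

    // Resolve a handle like "user@example.com" to an account on your server.
    func lookup(handle: String) async throws -> MastodonAccount {
        let req = request(path: "/api/v1/accounts/lookup",
                          query: [URLQueryItem(name: "acct", value: handle)])
        let (data, _) = try await URLSession.shared.data(for: req)
        return try JSONDecoder().decode(MastodonAccount.self, from: data)
    }

    // The "follow button": POST /api/v1/accounts/:id/follow.
    func follow(accountID: String) async throws {
        let req = request(path: "/api/v1/accounts/\(accountID)/follow", method: "POST")
        _ = try await URLSession.shared.data(for: req)
    }
}
```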
00:43:08
◼
►
But the second thing is, part of the gag of it looking like Windows 95, it's not as far
00:43:15
◼
►
as I can tell, maybe you can tell me if you're familiar with Windows 95, it's not a pixel
00:43:20
◼
►
for pixel reproduction, it's almost like it's vector.
00:43:23
◼
►
They have the Windows 95 look, but they've done it with vector pixels, because it looks
00:43:28
◼
►
way bigger, it's scaled up, but it is pixel accurate if you imagine the pixels being these
00:43:34
◼
►
smooth vector-y things.
00:43:36
◼
►
Well, if you zoom in on the next and cancel buttons, on the corners, you see that they
00:43:41
◼
►
have like a beveled edge.
00:43:44
◼
►
Now in actual Windows 95, that would have been probably a one pixel thick line.
00:43:48
◼
►
And so here they have like a beveled edge in the corners, so obviously it's like a CSS
00:43:51
◼
►
border trick or something.
00:43:52
◼
►
Yeah, and I'm on the final page, and I'm looking at the headers of the table, and the headers
00:43:58
◼
►
have like a 3D gradient on them and you can see that there is like a you know a
00:44:02
◼
►
dark line on the bottom and lighter line on the left and top and that line is
00:44:06
◼
►
made of a chunky like would have been like a Windows 95 size pixel but then
00:44:11
◼
►
the bevel is made with pixels that are 1/8 of that size that would not exist in
00:44:15
◼
►
Windows 95, right, because they're on an angle. So I think it's very clever, but
00:44:19
◼
►
it's a little bit of an uncanny valley thing. I'm like, this looks like Windows
00:44:22
◼
►
95, but Windows 95 never looked like this down to the pixel. Anyway, I thought
00:44:26
◼
►
it was a little extra fun sauce on top of a useful tool.
00:44:31
◼
►
And you can use it just in case Movetodon
00:44:33
◼
►
goes away.
00:44:34
◼
►
But it's one of the show notes if you want to try another one
00:44:35
◼
►
or get a little bit of weird uncanny valley
00:44:37
◼
►
Windows 95 nostalgia.
00:44:39
◼
►
- And while we're on the subject,
00:44:39
◼
►
Mastodon continues to be great, by the way.
00:44:42
◼
►
It just continues to be awesome.
00:44:44
◼
►
And it's way better than Twitter right now,
00:44:47
◼
►
in my view and for my needs and for the people I follow,
00:44:50
◼
►
it's way better than Twitter.
00:44:52
◼
►
So I'm happy there.
00:44:54
◼
►
- Briefly going back to this web app,
00:44:55
◼
►
Did you notice that if you click the X on the installer,
00:44:58
◼
►
it actually drops you down to Windows desktop
00:44:59
◼
►
and it has a privacy policy link.
00:45:02
◼
►
- Oh my God. - An about link
00:45:03
◼
►
that opens Windows Explorer,
00:45:06
◼
►
that opens Internet Explorer.
00:45:07
◼
►
And then if you click on the home page,
00:45:08
◼
►
or no, not the home page, what was it?
00:45:10
◼
►
It was one of these icons.
00:45:11
◼
►
Oh, I guess it was the bookmarks icon at the top.
00:45:14
◼
►
Then you go to what appears to be
00:45:16
◼
►
the author's like old 1999 website,
00:45:20
◼
►
which is pretty delightful.
00:45:22
◼
►
Big, like, GeoCities energy on this one.
00:45:24
◼
►
There's a lot more here than I thought.
00:45:25
◼
►
This is really great.
00:45:26
◼
►
Oh, that's so cool.
00:45:28
◼
►
Yeah, isn't that neat?
00:45:30
◼
►
The other way I would have imagined
00:45:31
◼
►
doing this that would have been fun
00:45:33
◼
►
is all the ones that run older operating systems in Web
00:45:37
◼
►
Assembly in your browser.
00:45:39
◼
►
There's one for all the versions of Mac OS.
00:45:41
◼
►
It's like macos9.app, system7.app, macos8.app.
00:45:44
◼
►
Try those URLs.
00:45:46
◼
►
And those are basically running a virtual machine in JavaScript
00:45:48
◼
►
that's running the real version of those operating systems.
00:45:52
◼
►
If you had done that and then wrote
00:45:54
◼
►
Movetodon in Windows 95 and ran it inside the Windows 95
00:45:58
◼
►
that's running in your browser,
00:45:59
◼
►
then you wouldn't have to fake any of this stuff.
00:46:01
◼
►
You could literally just make everything exactly
00:46:05
◼
►
in Windows 95, although that might be more difficult
00:46:07
◼
►
trying to authenticate with modern SSL stuff
00:46:10
◼
►
from Windows 95.
00:46:12
◼
►
- Yeah, I was gonna say, can you imagine trying to make
00:46:14
◼
►
just any network request that would work
00:46:16
◼
►
in the modern era from that?
00:46:18
◼
►
- I think some, I saw someone who did a 68K Motorola 68K
00:46:22
◼
►
classic Mac OS Mastodon client, and I think somehow they compiled a version of OpenSSL
00:46:28
◼
►
or something. They basically found a way to make a TLS connection from classic Mac OS
00:46:33
◼
►
just to support their idea of doing this funny classic Mac OS client from Mastodon. That's
00:46:39
◼
►
That's pretty good stuff. Alright, and then let's see what else we have. Oh, that's right.
00:46:45
◼
►
I wanted to make note, this news is actually a little bit old. We spoke about rewind.ai
00:46:50
◼
►
I don't know, a couple months ago, this is the lifestreaming thing that John apparently
00:46:54
◼
►
has always wanted.
00:46:56
◼
►
It is available to anyone at this point, I believe.
00:46:59
◼
►
It is 30 days free trial, but then it's $20 a month, which I don't mean to begrudge.
00:47:05
◼
►
They've written software, they deserve to be compensated for it, but I don't know.
00:47:09
◼
►
I don't love the idea of this 30 days free, and then you have to come back and remember to cancel.
00:47:15
◼
►
So I have not personally tried it, especially since this is not an itch I feel like I need to scratch.
00:47:18
◼
►
But for those that are interested,
00:47:21
◼
►
it's available if you wanted to give it a shot.
00:47:23
◼
►
- That ain't begrudging anything.
00:47:24
◼
►
It's just the question of, you know,
00:47:25
◼
►
do you think the price is worth the value for you?
00:47:28
◼
►
And I was into this thing.
00:47:30
◼
►
I'm like, oh, I'm gonna try it out of technical curiosity.
00:47:32
◼
►
And I actually got in early and they sent the email
00:47:34
◼
►
and said, hey, it's available for you to try.
00:47:35
◼
►
I'm like, great, I'll try it.
00:47:36
◼
►
And then I saw it was one of those free trial
00:47:37
◼
►
and then we'll start charging you.
00:47:39
◼
►
And I knew myself well enough to know,
00:47:41
◼
►
I'm probably gonna forget and then I'm gonna get charged
00:47:43
◼
►
and then I'm gonna feel bad.
00:47:44
◼
►
In fact, I did that with the Lensa app, speaking of AI
00:47:47
◼
►
making money off other people's stuff.
00:47:49
◼
►
The Lensa thing, like you signed up for a free trial
00:47:52
◼
►
of the $50 a month subscription.
00:47:54
◼
►
- Oh my goodness.
00:47:56
◼
►
- And then, or it was $50 a month or $50 a year, I forget.
00:48:00
◼
►
But anyway, you sign up for the free trial,
00:48:01
◼
►
and then once you sign up for the free trial,
00:48:03
◼
►
it gives you a discount on the in-app purchases.
00:48:05
◼
►
And so I made like several $6 in-app purchases
00:48:09
◼
►
to play with it, right?
00:48:10
◼
►
But then I'm like, oh hey, I just gotta remember
00:48:11
◼
►
to cancel this before the $50 thing.
00:48:13
◼
►
And sure enough, I forgot,
00:48:15
◼
►
'cause I didn't make a reminder for myself.
00:48:16
◼
►
And I got a refund for it because I asked for a refund the second it kicked in and Apple
00:48:20
◼
►
was nice enough to give me a refund.
00:48:22
◼
►
But anyway, that's why I haven't tried Rewind.
00:48:24
◼
►
I am curious about it.
00:48:25
◼
►
I'm like, "I don't want to have to sign up for it and set myself a reminder, remember
00:48:30
◼
►
to cancel it."
00:48:31
◼
►
It's just like, for me, that price and that situation was enough of a barrier to not want
00:48:38
◼
►
to deal with it.
00:48:39
◼
►
And honestly, since we've talked about it, I'm like, "Oh, I'll try for technical curiosity."
00:48:42
◼
►
I'm still kind of skeeved out by it.
00:48:44
◼
►
I'm like, "Well, I'll try it on my laptop.
00:48:45
◼
►
It's more isolated."
00:48:46
◼
►
But not that I think they're doing anything nasty.
00:48:48
◼
►
It's just, I don't know.
00:48:51
◼
►
I probably will try it eventually as well.
00:48:53
◼
►
In terms of $20 a month, if I got tons of value from it, I would pay $20 a month if
00:48:57
◼
►
I felt like I was getting $20 a month worth of value out of it.
00:48:59
◼
►
That's the question of software.
00:49:01
◼
►
How do you price it such that enough people will think that price is acceptable?
00:49:05
◼
►
Casey was also scared away by that price.
00:49:08
◼
►
But I think none of us are in the situation where we think we would possibly...
00:49:12
◼
►
The only reason we'd be looking at it is a little bit of curiosity, right?
00:49:14
◼
►
And so, maybe we'll try it.
00:49:17
◼
►
I can see this, like I said, this company eventually being spun off into like an enterprise
00:49:21
◼
►
software because I think lots of companies might like something like this so they can
00:49:25
◼
►
spy on their employees.
00:49:26
◼
►
This is a rich and burgeoning market that already has existing players that are probably
00:49:30
◼
►
worse than Rewind.ai because the technology is newer and better.
00:49:33
◼
►
But so far, their sales pitch does not seem to be to corporations, it seems to be to individual users.
00:49:41
◼
►
If you're an individual user that's already tried it,
00:49:43
◼
►
write in and tell us what you think.
00:49:45
◼
►
John, real-time follow-up, you cannot run this on your Mac Pro
00:49:48
◼
►
because it is not sufficiently powerful.
00:49:50
◼
►
I have ARM Macs in the house.
00:49:52
◼
►
OK, well, you said you would run it on your laptop.
00:49:54
◼
►
I thought you were trying to imply that you would have
00:49:56
◼
►
otherwise run it on the Mac Pro.
00:49:57
◼
►
And that is--
00:49:58
◼
►
The Mac Studio is another choice.
00:50:00
◼
►
You know, I have a couple of ARM-- yeah, some.
00:50:02
◼
►
Well, because your Mac Pro is just too old and busted,
00:50:04
◼
►
doesn't have Apple Silicon, which
00:50:05
◼
►
is required for rewind.ai.
00:50:07
◼
►
It's got an immunity to the spyware.
00:50:12
◼
►
- All right, John, tell me about your iPad mystery.
00:50:17
◼
►
- My son got a new iPad for Christmas.
00:50:18
◼
►
He was long overdue for one.
00:50:19
◼
►
- Which one?
00:50:20
◼
►
- He was using, well, that's part of the thing.
00:50:21
◼
►
He was using a really old one.
00:50:23
◼
►
He'd been using it for years.
00:50:24
◼
►
I actually, he wanted to draw with the Apple Pencil,
00:50:27
◼
►
so I had an old, like the original 9.7-inch iPad Pro,
00:50:31
◼
►
like the very first one that,
00:50:32
◼
►
I think it was the first one that had pencil support.
00:50:33
◼
►
- Well, the 12.9 was first; that was the first small one.
00:50:36
◼
►
Yeah, that's right.
00:50:38
◼
►
So anyway, I said, here, you can use this,
00:50:40
◼
►
and use my old, rolly Apple Pencil,
00:50:41
◼
►
and laugh at how it gets charged and everything.
00:50:44
◼
►
And I'm like, well, you should just use that as your new iPad,
00:50:46
◼
►
because it's better than--
00:50:48
◼
►
his iPad was so old, it was older than the 9.7-inch iPad Pro.
00:50:52
◼
►
But eventually, he said, no, there's
00:50:54
◼
►
something haunted on your iPad, which I kind of agreed with him
00:50:57
◼
►
that 9.7 started getting weird in its later years.
00:50:59
◼
►
I don't know if it's a hardware thing,
00:51:00
◼
►
or it just doesn't agree with the modern OSes.
00:51:02
◼
►
But he kept using super-ancient iPads.
00:51:04
◼
►
So he was due for a new one and I got him a new one.
00:51:07
◼
►
And I bought it a while ago.
00:51:08
◼
►
And it was just sitting somewhere and then wrapped it and gave it to him.
00:51:11
◼
►
And then I was setting it up for him, because that's what dads do.
00:51:14
◼
►
I was trying to give it a name.
00:51:17
◼
►
And I went to General, About, and wrote in the name of the thing.
00:51:21
◼
►
I remembered to change the name so it's easy to find
00:51:24
◼
►
in the device list.
00:51:25
◼
►
And I started saying, "Oh, wait a second.
00:51:28
◼
►
Which iPad?"
00:51:29
◼
►
Because I like to give it the name of the thing.
00:51:33
◼
►
My wife says like, you know,
00:51:34
◼
►
Tina's Apple Watch Series 8.
00:51:36
◼
►
So I know it's the Series 8 one.
00:51:37
◼
►
Like, people ask that as an Ask ATP question.
00:51:39
◼
►
How do you name your devices?
00:51:40
◼
►
I name it John's iPad 14 Pro.
00:51:43
◼
►
Like, 'cause otherwise, or iPhone 14 Pro.
00:51:45
◼
►
'Cause otherwise I'll never friggin' remember
00:51:47
◼
►
what the names are and I can't just reuse the same name.
00:51:49
◼
►
I give them the name that is named with the device.
00:51:51
◼
►
- Yeah, it's nice too, like when you down the road
00:51:53
◼
►
when you upgrade that device,
00:51:55
◼
►
it's nice to be able to like keep the backups straight
00:51:57
◼
►
and like do the migration correctly and everything.
00:51:59
◼
►
It makes it very nice.
00:52:00
◼
►
- Yeah, and I do rename them to be like,
00:52:02
◼
►
okay now it's Alex's iPhone 14 Pro, right?
00:52:05
◼
►
But it's still iPhone 14 Pro,
00:52:06
◼
►
so I know which device we're talking about.
00:52:07
◼
►
And luckily in the house we tend not to buy
00:52:11
◼
►
multiple devices of the same type,
00:52:13
◼
►
just because people end up getting things
00:52:14
◼
►
at different years, but if we had like two iPhone 12 Pros,
00:52:16
◼
►
that might be a little bit confusing.
00:52:18
◼
►
Maybe I would put the color in then.
00:52:20
◼
►
Anyway, I wanted to give this thing a name of what it was.
00:52:24
◼
►
So I named it Alex's M2 iPad Pro.
00:52:27
◼
►
'Cause it was like, I got him the fancy one,
00:52:30
◼
►
He deserves it.
00:52:31
◼
►
He's been using an ancient iPad.
00:52:33
◼
►
I think he was using an original iPad Air maybe.
00:52:35
◼
►
It was super old.
00:52:37
◼
►
No face ID, very slow, very old.
00:52:39
◼
►
And he uses it all the time.
00:52:41
◼
►
It's a very important machine for him.
00:52:43
◼
►
So I got him a big fancy one.
00:52:45
◼
►
But then I went to General About and it said, "iPad Pro 4th Generation."
00:52:51
◼
►
I'm like, "Is that the M2?
00:52:54
◼
►
Or did I change my mind and say the M2 is too expensive and get him a cheaper one?"
00:52:58
◼
►
So I do what I normally do and type in a Google query for whatever, and I usually go to the
00:53:03
◼
►
Wikipedia page because it's usually straightforward and I know the sidebar is going to have the
00:53:07
◼
►
name of the SOC in it, and there's some basic information.
00:53:11
◼
►
It's kind of like why I don't go to IMDB anymore, I just go to Wikipedia, because I want to
00:53:14
◼
►
just know what's the cast of this thing.
00:53:16
◼
►
I can find it on the Wikipedia page and IMDB is just a cluster and I can't find anything
00:53:20
◼
►
on that friggin' site anymore and I hate it.
00:53:23
◼
►
So I go to Wikipedia, I go to the iPad Pro 4th generation Wikipedia page, and I scroll
00:53:28
◼
►
down, and the thing says system on a chip: A12Z Bionic. Did I get him something with
00:53:34
◼
►
an A12Z? Because I was originally looking for a refurb M1 iPad Pro and
00:53:38
◼
►
could not find one. Apple didn't have them. You could find them used at other
00:53:41
◼
►
places, but I wanted to get an official Apple refurb, and Apple was not
00:53:44
◼
►
selling refurb M1 iPad Pros for whatever reason; they just didn't have them. All
00:53:49
◼
►
right. And so I thought I had decided to get an M2. So I'm holding the device
00:53:53
◼
►
in my hand, and I'm like, how annoying is it that I can't tell if this thing has an
00:53:56
◼
►
M2 in it? Just like, because Apple's thing just says fourth generation, and when I
00:54:00
◼
►
google fourth generation it says A12Z. So I downloaded Geekbench. I'm like look,
00:54:04
◼
►
I know what the Geekbench numbers should be for an M2 and it's gonna be very
00:54:09
◼
►
different from the A12Z. First of all Geekbench immediately says in the
00:54:11
◼
►
sidebar this has an M2. So I'm like, I got the right one. But there was like 20
00:54:16
◼
►
minutes where I was like, did I buy the wrong thing? I was looking at the
00:54:19
◼
►
orders, like maybe I bought the right thing, because the amount looked like
00:54:22
◼
►
an M2 amount, it looked like an M2 iPad Pro amount,
00:54:25
◼
►
but like maybe they charged me for the M2
00:54:27
◼
►
and shipped me a different one.
00:54:28
◼
►
So this is just a PSA to let everybody know.
00:54:31
◼
►
On the Wikipedia pages for iPad Pro,
00:54:33
◼
►
they have a list of generations
00:54:34
◼
►
that go up to sixth generation.
00:54:36
◼
►
The iPad Pro sixth generation on the Wikipedia page
00:54:41
◼
►
is what Apple calls the iPad Pro 11 inch fourth generation.
00:54:45
◼
►
So do not be confused,
00:54:46
◼
►
and I think Wikipedia should really change this.
00:54:48
◼
►
- 'Cause it's the fourth generation of the 11 inch.
00:54:51
◼
►
- Exactly right.
00:54:52
◼
►
And they're called like the main articles,
00:54:55
◼
►
iPad Pro parentheses fourth generation,
00:54:57
◼
►
which is exactly what you will see in general
00:54:59
◼
►
about on your iPad, but it is not right.
00:55:02
◼
►
So anyway, when in doubt, get Geekbench.
00:55:04
◼
►
And by the way, I actually ran Geekbench
00:55:05
◼
►
to make sure the number that came out was the M2 number.
00:55:07
◼
►
So he has an M2 iPad Pro.
00:55:09
◼
►
Oh, and the other thing is when I was doubting him,
00:55:11
◼
►
I'm like, oh, this is an easy way to tell
00:55:12
◼
►
'cause I got him an Apple Pencil with it
00:55:13
◼
►
'cause he does a lot of drawing on his iPad.
00:55:15
◼
►
I tried to do the hover thing.
00:55:17
◼
►
I'm like, well, I have Procreate, let me launch Procreate.
00:55:19
◼
►
I know Procreate has the hover
00:55:20
◼
►
and nothing would work on the hover.
00:55:22
◼
►
And so I looked at a YouTube video
00:55:23
◼
►
and I saw people hovering over the icons on the home screen
00:55:27
◼
►
and they would make, you know, with the pencil
00:55:28
◼
►
and they would make the icons bigger
00:55:29
◼
►
and that didn't work either.
00:55:30
◼
►
And I'm like, is there something I have to enable for hover
00:55:33
◼
►
and is there some preference of hover in Procreate
00:55:35
◼
►
but why doesn't it work on the home screen?
00:55:37
◼
►
This is why I was like, I think I got the wrong iPad.
00:55:39
◼
►
It doesn't hover, it can't possibly be an M2.
00:55:41
◼
►
Did I get the wrong pencil?
00:55:42
◼
►
I googled for like, is there a second,
00:55:45
◼
►
second generation Apple pencil that does the hover?
00:55:47
◼
►
I'm like, no, it's all the ones with the flat side.
00:55:50
◼
►
he's got the one with the flat side,
00:55:51
◼
►
it magnetically connects, this should be doing hover.
00:55:53
◼
►
The answer to that was, you gotta update to 16.2.
00:55:57
◼
►
- It should, oh, it shipped with the wrong OS.
00:55:59
◼
►
So it was a confusing morning where I was like,
00:56:01
◼
►
what the hell did I buy, and don't put it past me,
00:56:04
◼
►
I've bought the wrong thing before.
00:56:06
◼
►
When I ordered his MacBook Air, I ordered the wrong color
00:56:08
◼
►
and I had to cancel it and did a big thing,
00:56:09
◼
►
or at least I caught that before it shipped,
00:56:11
◼
►
but lots of confusion on Christmas morning,
00:56:13
◼
►
but he does indeed have an M2 iPad Pro,
00:56:16
◼
►
which he very richly deserves, with an Apple Pencil.
00:56:18
◼
►
Once updated to 16.2, Hover started working.
00:56:21
◼
►
- All right, so that brings up something
00:56:22
◼
►
I've been wondering about for a long time,
00:56:24
◼
►
and I keep forgetting to bring it up.
00:56:26
◼
►
So I got my first and only set of AirPods Pro
00:56:28
◼
►
a year ago for Christmas.
00:56:29
◼
►
I love them, I do want the new ones,
00:56:31
◼
►
but I'm too cheap because the ones I've got
00:56:32
◼
►
work really well, the battery's fine,
00:56:34
◼
►
no rattling, et cetera, et cetera.
00:56:35
◼
►
The thing that's driving me batty
00:56:37
◼
►
is if you go into Find My, and you go into the Devices tab,
00:56:40
◼
►
it shows my AirPods Pro, and it shows them
00:56:44
◼
►
as outside of the case, which I'm looking at them.
00:56:46
◼
►
I'm holding them in my hand right now.
00:56:48
◼
►
I can assure you they are not outside the case.
00:56:49
◼
►
And it shows them last having been seen
00:56:52
◼
►
on September 27th, 2022.
00:56:55
◼
►
And I vaguely remember, because again,
00:56:58
◼
►
my memory is garbage, that when, I think it was when iOS 16
00:57:01
◼
►
first came out, or, I don't know,
00:57:03
◼
►
or maybe it was a late beta, I forget exactly what it was.
00:57:05
◼
►
But my AirPods Pro could do the homing beacon,
00:57:10
◼
►
U1 or whatever it is, where you wave your phone around
00:57:14
◼
►
and it shows you where the AirPods are, et cetera, et cetera.
00:57:16
◼
►
And I think that that only worked for a minute,
00:57:19
◼
►
and then they pulled that for the original AirPods Pro,
00:57:22
◼
►
and now it only works on the AirPods Pro 2.
00:57:24
◼
►
But the stupid Find My app is stuck on September 27th,
00:57:28
◼
►
and it's like it never got a clue that this feature,
00:57:31
◼
►
the homing beacon feature, whatever it's called,
00:57:33
◼
►
doesn't work.
00:57:34
◼
►
And so according to this, my left AirPod,
00:57:37
◼
►
or my left AirPod, that's a great movie,
00:57:39
◼
►
my left AirPod Pro was last seen September 27th, 2022,
00:57:43
◼
►
at 2.38 p.m.
00:57:45
◼
►
And it is now, well, according to Marco, it's December 24th,
00:57:47
◼
►
but in most of the world, it's December 26th now.
00:57:50
◼
►
What do I do to fix this?
00:57:51
◼
►
Like, I don't wanna unpair the things,
00:57:52
◼
►
which I guess I could.
00:57:54
◼
►
I really do not wanna sign out of iCloud,
00:57:56
◼
►
'cause that's a nightmare.
00:57:58
◼
►
- Is there a thing I can do to fix this?
00:57:59
◼
►
I'm asking you two, and I'm asking the audience.
00:58:01
◼
►
Like, what do I do about that?
00:58:04
◼
►
- Hmm, I don't know any answer to that.
00:58:06
◼
►
- Right? (laughs)
00:58:07
◼
►
- I would unpair, like, 'cause unpairing
00:58:08
◼
►
is not that big a deal,
00:58:09
◼
►
and it's definitely the first thing you try.
00:58:11
◼
►
Like, restart it, do it over, unpair.
00:58:13
◼
►
Yeah, that's the right move.
00:58:15
◼
►
I'll take that as a homework assignment for myself.
00:58:18
◼
►
People in the chat are saying, oh, I'm
00:58:19
◼
►
seeing some of this as well.
00:58:20
◼
►
So I'm sorry if I've just made you aware of this.
00:58:22
◼
►
But I don't know.
00:58:23
◼
►
It was really cool for the 10 minutes
00:58:25
◼
►
that it worked with the little homing beacon thing.
00:58:27
◼
►
And I really liked that.
00:58:28
◼
►
But then Apple took it away because they're meanies.
00:58:30
◼
►
And then people are saying, when you unpair it,
00:58:32
◼
►
then remove it from Find My when it's unpaired.
00:58:35
◼
►
And then start fresh.
00:58:37
◼
►
Fair enough.
00:58:38
◼
►
I'll take that as homework assignment.
00:58:39
◼
►
And do what Marco said.
00:58:40
◼
►
Then you have to reset all your preferences on all
00:58:42
◼
►
your devices to--
00:58:43
◼
►
Well, if you want auto pairing,
00:58:44
◼
►
you have to... well, the defaults are in your favor there.
00:58:46
◼
►
- Yeah, yeah.
00:58:47
◼
►
All right, iOS 16.2 came out a little while ago,
00:58:49
◼
►
but it has a feature that we wanted to call
00:58:51
◼
►
to everyone's attention.
00:58:52
◼
►
And it was called to our attention by Brandon Butch,
00:58:54
◼
►
who writes, "You can now stop your wallpaper
00:58:56
◼
►
and/or notifications from appearing on the iPhones
00:58:58
◼
►
always on display."
00:59:00
◼
►
This was as of Beta 3, which obviously now it's true
00:59:03
◼
►
for the shipping version.
00:59:05
◼
►
But yeah, there's a switch, or rather,
00:59:06
◼
►
there are several switches, in Always-On Display.
00:59:07
◼
►
There's just on/off.
00:59:09
◼
►
Then there's show wallpaper and show notifications.
00:59:12
◼
►
So you can turn the wallpaper on or off
00:59:15
◼
►
and the notifications on or off individually.
00:59:17
◼
►
- Yeah, so remember when the 14,
00:59:20
◼
►
first of all, right before the 14 Pro came out
00:59:23
◼
►
and we were speculating about the rumors
00:59:24
◼
►
about the always-on display,
00:59:26
◼
►
I had said, wouldn't it be cool if they basically
00:59:29
◼
►
had this whole new aesthetic on the display
00:59:32
◼
►
where most of the screen is black
00:59:34
◼
►
and you have maybe just monochrome icons or widgets
00:59:39
◼
►
and the clock and have those things glow
00:59:41
◼
►
and have the rest of the screen be black,
00:59:42
◼
►
which people quickly pointed out that that's basically
00:59:44
◼
►
how Android phones with always on displays have always been.
00:59:47
◼
►
And that this was not a new idea in the Android world.
00:59:50
◼
►
So thank you for that.
00:59:52
◼
►
They all probably used Opera as well.
00:59:53
◼
►
So when the 14 came out though,
00:59:57
◼
►
the only option you really had through iOS was
01:00:01
◼
►
just take whatever your home screen is,
01:00:03
◼
►
just make a dim version.
01:00:04
◼
►
That's it, like that's what it did.
01:00:05
◼
►
And there was no customization.
01:00:07
◼
►
And this led to I think a couple of shortcomings.
01:00:11
◼
►
one of the biggest ones being obviously
01:00:12
◼
►
it wasn't very power conservative.
01:00:14
◼
►
It wasn't as power conservative as it could be
01:00:17
◼
►
if most of the screen was black
01:00:18
◼
►
and only some pixels were being lit
01:00:20
◼
►
if you had any kind of colored wallpaper.
01:00:22
◼
►
But also it became harder to tell
01:00:25
◼
►
whether your phone was awake or asleep.
01:00:28
◼
►
And that was a huge, I think stumbling point
01:00:31
◼
►
for people when they first got the 14 Pro
01:00:34
◼
►
'cause iPhone users were never accustomed to having
01:00:37
◼
►
always on anything.
01:00:38
◼
►
And so you'd see your phone kind of like out of the corner
01:00:41
◼
►
of your eye and you kind of freak out, oh, it's awake.
01:00:43
◼
►
Like you'd think there was notification on it or something
01:00:44
◼
►
or you think it was ringing or whatever
01:00:46
◼
►
because you were conditioned to believe that a phone
01:00:49
◼
►
with the screen on is awake and trying to tell you something.
01:00:52
◼
►
And so what they added with 16.2 is an option
01:00:56
◼
►
to turn off the wallpaper when it's in, you know,
01:01:00
◼
►
sleeping always on mode.
01:01:02
◼
►
And you could also optionally turn off notifications
01:01:05
◼
►
which I believe Do Not Disturb always did
01:01:08
◼
►
on its lock screen, but it was kind of custom on other ones.
01:01:11
◼
►
Anyway, the big thing for me was the wallpaper toggle.
01:01:14
◼
►
So if you turn off Show Wallpaper
01:01:17
◼
►
under the Always On Display settings,
01:01:18
◼
►
you now have in sleep mode a black background
01:01:22
◼
►
that then only shows the clock and your widgets
01:01:26
◼
►
and whatever notifications you might have there.
01:01:29
◼
►
And you can turn off the notifications
01:01:30
◼
►
if you want there as well.
01:01:31
◼
►
So I've been using this since the 16.2 betas,
01:01:34
◼
►
which were, it's been a couple of months now, I think,
01:01:38
◼
►
and it has radically improved my interaction
01:01:41
◼
►
with the 14 Pro's always on screen.
01:01:43
◼
►
I can strongly recommend if you are a little bit put off
01:01:47
◼
►
by like, you know, it not looking too different
01:01:49
◼
►
or if you keep still thinking that your phone's awake
01:01:51
◼
►
when it's not, try this option.
01:01:54
◼
►
Turn off show wallpaper.
01:01:55
◼
►
And it is, I think it's massively better.
01:01:59
◼
►
I think it looks cooler and it functions better
01:02:01
◼
►
in the sense that it allows me to know
01:02:04
◼
►
my phone is not awake right now.
01:02:06
◼
►
- I definitely believe you,
01:02:07
◼
►
but I did turn off the always-on display ages ago,
01:02:10
◼
►
mostly because, for what you said,
01:02:12
◼
►
it seems like my phone was always on.
01:02:14
◼
►
But now that I've gone back to the old way,
01:02:17
◼
►
I'm not anxious to sacrifice battery life.
01:02:21
◼
►
Because what I did learn after using it for a while
01:02:24
◼
►
with the always-on display on
01:02:25
◼
►
is that I didn't derive a lot of utility from it.
01:02:28
◼
►
There was the downside, which is like,
01:02:29
◼
►
"Oh, I think my phone is on."
01:02:30
◼
►
But setting that aside, setting aside the downside,
01:02:32
◼
►
Did I find occasions where it was useful for me to have the screen always on?
01:02:36
◼
►
And I think the answer to that is no.
01:02:38
◼
►
So even though this is, I agree, a much better way of doing it, at least for my purposes,
01:02:42
◼
►
so I'd be able to tell that that's the always on display because the background is black,
01:02:47
◼
►
I don't think I get any value from it the way I use my phone, so I'm not going to sacrifice
01:02:53
◼
►
the battery life for this.
01:02:55
◼
►
But I agree this is a great feature and it might even be useful to be the default, but
01:02:59
◼
►
that could just be old habits, right?
01:03:02
◼
►
Maybe new users who their first iPhone has
01:03:04
◼
►
the always on screen will just be used to it
01:03:05
◼
►
and not develop the habits that we all have
01:03:07
◼
►
to expect a phone with our wallpaper being displayed
01:03:10
◼
►
is on or trying to tell us something.
01:03:12
◼
►
And I think related to wallpapers,
01:03:13
◼
►
didn't 16.2 also add the thing where they separated
01:03:17
◼
►
the screen that appears under your icons on springboard
01:03:21
◼
►
from the lock screen?
01:03:23
◼
►
Like it used to be whenever you made a custom one,
01:03:24
◼
►
it made you make both of them as a match set, as a pair.
01:03:29
◼
►
and I think they split that out, is that correct?
01:03:31
◼
►
- I thought that was the case.
01:03:33
◼
►
I haven't tried it either, but I thought that was the case,
01:03:35
◼
►
and it drove me nuts.
01:03:36
◼
►
It drove me nuts in the original version 16.
01:03:39
◼
►
- And they should be split out,
01:03:40
◼
►
because I, especially when I was first messing with it,
01:03:43
◼
►
I wanted to make lots of different fun lock screens.
01:03:46
◼
►
I did not want to make lots of different thing
01:03:48
◼
►
that goes behind my icons,
01:03:50
◼
►
'cause the thing behind my icons is 100% black
01:03:52
◼
►
and always has been, and that doesn't change for me,
01:03:55
◼
►
and so every time I had to,
01:03:57
◼
►
wanted to make a new lock screen, it's like,
01:03:58
◼
►
"Oh, and don't forget, you also need to make a..."
01:04:00
◼
►
And I go, "Ugh, just separate these,
01:04:01
◼
►
"it's two different things."
01:04:03
◼
►
So if you wanna make a million different,
01:04:05
◼
►
I guess they would call, what would you call it?
01:04:06
◼
►
You would call it the, well, they call wallpaper
01:04:09
◼
►
the image that's behind stuff,
01:04:10
◼
►
but I think there should be a different name
01:04:12
◼
►
for the image that appears behind your icons on springboard,
01:04:15
◼
►
and then the image that appears on the lock screen.
01:04:17
◼
►
And they call that image wallpaper in this setting,
01:04:22
◼
►
because the option is, on the always on display,
01:04:25
◼
►
do you want to show your wallpaper?
01:04:27
◼
►
they mean do you want to show the image that shows behind the icons on your, on Springboard.
01:04:32
◼
►
Anyway, confusing nomenclature but I'm glad they're working this feature out to be more
01:04:37
◼
►
Yeah, because one issue I still have with the Always On Display is that when it's in
01:04:42
◼
►
sleep mode, whatever is on screen is not tappable.
01:04:46
◼
►
It looks like it's tappable, but it's not.
01:04:48
◼
►
The first tap that you do will put it into awake screen mode and then you have to tap
01:04:52
◼
►
again or swipe or whatever you're doing to interact with something on that screen.
01:04:57
◼
►
And so having, like part of the reason why I like this option of turning off show wallpaper
01:05:02
◼
►
is that it makes it more clear to me, not only is the phone not awake, but it is not
01:05:05
◼
►
interactive right now.
01:05:07
◼
►
Like if I want this thing to be interactive, I have to tap it so that it is no longer mostly
01:05:12
◼
►
And then I know, oh it's lighting up with my, I still like the weather background, it's
01:05:16
◼
►
lighting up with the rain that's outside, great, now I can interact with the notification
01:05:20
◼
►
or whatever.
01:05:21
◼
►
And that's, I hope, I really, really hope,
01:05:25
◼
►
in future hardware iterations,
01:05:26
◼
►
I hope they are at some point able
01:05:28
◼
►
to make the sleep state more interactive.
01:05:31
◼
►
I know there's a lot of challenges with that
01:05:33
◼
►
and a lot of kind of practical concerns of like,
01:05:36
◼
►
well, what if it's in your pocket
01:05:37
◼
►
or what if you're handling it, you know, whatever it is.
01:05:39
◼
►
And I think there are certain risks
01:05:41
◼
►
and maybe it's not possible to do that well,
01:05:44
◼
►
but I would love for them to at least try to do that well
01:05:46
◼
►
because, or at least make it more obvious
01:05:49
◼
►
that something's not interactive in the design.
01:05:52
◼
►
And then ideally also, hopefully, make it wake up faster.
01:05:56
◼
►
That would also help a lot.
01:05:58
◼
►
Like if it can go from sleep state to awake state
01:06:01
◼
►
in half the time it does now,
01:06:04
◼
►
I'm sure some of that is a choice
01:06:06
◼
►
in terms of animation speed,
01:06:07
◼
►
but also I think some of that is giving the OS time
01:06:10
◼
►
to wake up and using the animation
01:06:12
◼
►
as kind of cover for that time.
01:06:14
◼
►
So whatever they can do to make it more interactive
01:06:17
◼
►
when you want to interact with it,
01:06:18
◼
►
that would feel very good.
01:06:20
◼
►
And right now, it still feels a little bit clunky
01:06:22
◼
►
in that way.
01:06:23
◼
►
- They could do kind of,
01:06:24
◼
►
the only safe way I think they should try doing this
01:06:26
◼
►
is kind of like they do for continuity camera,
01:06:28
◼
►
which is the thing where you can take your phone
01:06:30
◼
►
and use it as like the quote unquote webcam for your laptop.
01:06:34
◼
►
And continuity camera works by detecting
01:06:36
◼
►
that you have taken your camera
01:06:38
◼
►
and put it in a particular orientation
01:06:40
◼
►
that is more or less stationary
01:06:41
◼
►
when you put it on those little clips or stands
01:06:43
◼
►
to be behind your display, right?
01:06:45
◼
►
So the feature you want,
01:06:47
◼
►
It would be extremely, it would have lots of accidental input if it was just like that
01:06:51
◼
►
all the time.
01:06:52
◼
►
Oh, the lock screen, but it's interactive because people, like I said, people would
01:06:54
◼
►
put it in their pocket, it would activate.
01:06:56
◼
►
My parents are constantly accidentally calling me because, just because the delay between
01:07:00
◼
►
when they lock their phone and when they put it into their pocket is too long and they
01:07:06
◼
►
end up like touching something on their phone when they put it into their pocket and they
01:07:08
◼
►
butt dial me, right?
01:07:10
◼
►
But it could detect, for example, when it is laying on its back on a table for a certain
01:07:15
◼
►
period of time and say, now I've detected that I'm on a table,
01:07:19
◼
►
and I am going to actually register your taps, whatever
01:07:23
◼
►
But as soon as you pick it up and start moving,
01:07:25
◼
►
it's like, nope, nope, I'm locked.
01:07:26
◼
►
No input means anything to me, right?
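(As a rough illustration of the heuristic John is describing, here is a hypothetical Swift sketch using Core Motion to decide when a device has been lying face-up and motionless long enough to treat taps as intentional. The FlatOnTableDetector type, the 0.95 gravity threshold, and the two-second hold time are all made-up values for the sketch; iOS exposes nothing like this for gating lock-screen input.)

```swift
// Hypothetical sketch of the "flat on a table and not moving" gate.
import CoreMotion
import Foundation

final class FlatOnTableDetector {
    private let motion = CMMotionManager()
    private var stillSince: Date?
    private var isAccepting = false

    /// Calls `onChange(true)` after the device has been face-up and nearly
    /// motionless for `holdTime` seconds, and `onChange(false)` as soon as
    /// it is picked up or moved.
    func start(holdTime: TimeInterval = 2.0, onChange: @escaping (Bool) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.1
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let data = data else { return }

            // Face-up and flat: gravity points almost entirely out the back (-Z).
            let flat = data.gravity.z < -0.95
            // Still: user-induced acceleration is tiny.
            let a = data.userAcceleration
            let still = (a.x * a.x + a.y * a.y + a.z * a.z) < 0.0004

            let shouldAccept: Bool
            if flat && still {
                if self.stillSince == nil { self.stillSince = Date() }
                shouldAccept = Date().timeIntervalSince(self.stillSince!) >= holdTime
            } else {
                self.stillSince = nil          // picked up or moving: reset
                shouldAccept = false
            }
            if shouldAccept != self.isAccepting {
                self.isAccepting = shouldAccept
                onChange(shouldAccept)         // only report transitions
            }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```

(The interesting design choice is the asymmetry: it takes a couple of seconds of stillness before taps count, but any motion at all, like being picked up or slid into a pocket, turns input off immediately.)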
01:07:28
◼
►
So that you-- it's tricky to do, right?
01:07:30
◼
►
But I think if you just make it active all the time,
01:07:32
◼
►
people are going to dial things like crazy.
01:07:34
◼
►
Like it's just-- it's impossible to have.
01:07:37
◼
►
Because again, when you lock your phone
01:07:39
◼
►
and you put it into any of your pockets,
01:07:42
◼
►
the expectation is there's nothing
01:07:43
◼
►
can happen in that pocket that's going to do stuff on my phone, right? Even if I accidentally
01:07:48
◼
►
hit the power button, which is a physical button for now anyway, it's still not going to unlock
01:07:53
◼
►
my phone because Face ID will fail, Touch ID will fail, it's inside my pocket, right? Whereas if that
01:07:57
◼
►
tap would actually activate the notification, as we know from stupid iOS, once you look at
01:08:01
◼
►
a notification it's gone forever and you can never see it again and I hate that so much. I wish there
01:08:06
◼
►
was like, here are the last 100 notifications that you dismissed, just show me them, keep them around.
01:08:11
◼
►
So anyway, tricky to add feature, but I get where you're coming from, and I do think the
01:08:16
◼
►
16.2 things are all improvements.
01:08:19
◼
►
I actually left my Always On Display in all the default settings.
01:08:25
◼
►
I wouldn't say it's been an earth-shattering change for me.
01:08:28
◼
►
I will say, kind of tangentially related, that battery life, when I first got the new
01:08:32
◼
►
phone and it was chugging on photos and all that and all the machine learning stuff, like
01:08:36
◼
►
the battery life was garbage as expected.
01:08:38
◼
►
Then it got really really good for a while and then I don't know if it was 16.2 or something
01:08:43
◼
►
But sometime in the last month or two, both Erin's and my phones,
01:08:46
◼
►
our battery life has gone through the crapper recently, and I haven't dug much to figure out what's going on there.
01:08:52
◼
►
There was nothing obvious when I went, you know spelunking in the battery like history and all that in settings
01:08:58
◼
►
But it struck me as a little odd that both of us were having similar problems.
01:09:02
◼
►
So, I don't know if that's just a Liss family thing or if that's broader or whatever,
01:09:06
◼
►
but it's been a little bit of a bummer, and so I might be turning off the always-on display if it continues
01:09:12
◼
►
to be a problem. But all told, I do like it.
01:09:14
◼
►
I have it rotating between pictures of the kids and pictures of Erin
01:09:17
◼
►
And I like that every time I pick up my phone or well every time it gets locked
01:09:22
◼
►
There's a new picture there and then the next time I lock the phone
01:09:25
◼
►
There's a different picture, and I can see it dimly on the display while it's just sitting there, and I dig that. I think
01:09:30
◼
►
that's fun and cute.
01:09:31
◼
►
But I totally understand if it's not for everyone and by the way even without the always-on display like part of my problem of being
01:09:37
◼
►
distracted by it is that I would think someone is calling, because I see their face on my phone.
01:09:43
◼
►
But I also have a rotation of my family right so every time I pick up my phone
01:09:46
◼
►
It's the picture of someone in my family, but if my phone starts ringing
01:09:50
◼
►
And it's like a spam call from, you know, a random spam number about my car warranty or whatever.
01:09:56
◼
►
But my phone screen will light up and I'll see a picture of my son's face on it
01:10:01
◼
►
I'm like, oh I'm getting a call from my son. But yeah, I mean like that was the problem with the always on display
01:10:06
◼
►
I always thought I was being contacted by somebody. Even without the always-on display,
01:10:09
◼
►
that's still just a side effect of putting people's faces as, you know, your
01:10:14
◼
►
wallpaper or lock screen or whatever, because you have to actually then say, ignore the face,
01:10:18
◼
►
look up at
01:10:20
◼
►
you know, the little caller ID thing that shows when you're getting a call, to see what number it's coming from.
01:10:25
◼
►
And if my son does call, I think it still shows his face, but it's a different picture,
01:10:29
◼
►
but it's still him.
01:10:30
◼
►
So anyway, possible confusion.
01:10:32
◼
►
I mean, I guess I could just do landscapes or, you know, this is the first, whatever
01:10:35
◼
►
it is, the iPhone 14 Pro is the first phone and iOS 16 that I did not have a picture of
01:10:42
◼
►
one of my past dogs as my lock screen.
01:10:44
◼
►
That has been my lock screen since the iPod touch in 2007, right?
01:10:48
◼
►
I remember this.
01:10:49
◼
►
It was Huckleberry or whatever, right?
01:10:52
◼
►
I decided, because I wanted to try the feature, I decided to try the rotating thing.
01:10:55
◼
►
I had a bunch of nice pictures of my family that I made very tall and put room above their
01:11:00
◼
►
heads for the clock so the clock wasn't over their face.
01:11:02
◼
►
And I do like that.
01:11:03
◼
►
I do like seeing those pictures rotate, but it continues to be a little bit confusing
01:11:06
◼
►
when I get a call.
01:11:08
◼
►
So a week or two ago, I think something like that, it was sometime in December, we got
01:11:13
◼
►
word through Bloomberg, and I presume it was Gurman, that the Apple Car project has "scaled
01:11:18
◼
►
back" and is delayed. And it won't feature full self-driving capabilities. Imagine that.
01:11:23
◼
►
But yeah, this was a news report and some highlights that were reported, re-reported on
01:11:29
◼
►
MacRumors. Apple Inc. has scaled back ambitious self-driving plans for its future electric vehicle
01:11:33
◼
►
and postponed the car's target launch date about a year to 2026. The car project has been in limbo
01:11:39
◼
►
for the past several months as Apple executives grappled with the reality that its vision for a
01:11:42
◼
►
fully autonomous vehicle without a steering wheel or pedals isn't feasible with current technology.
01:11:47
◼
►
You don't say.
01:11:48
◼
►
Imagine that.
01:11:49
◼
►
The car will have an Apple-designed custom processor
01:11:51
◼
►
to power AI functionality.
01:11:52
◼
►
The chip is equivalent to four of the highest-end Mac chips
01:11:55
◼
►
and is nearly production-ready.
01:11:57
◼
►
That actually kind of brings us back to our discussion
01:11:59
◼
►
about the Mac Pro last week,
01:12:00
◼
►
but that's neither here nor there.
01:12:01
◼
►
- That's a nip, though.
01:12:02
◼
►
- Well, I mean, I would think.
01:12:04
◼
►
Apple will use the cloud for some AI processing,
01:12:06
◼
►
and the company is considering a remote command center
01:12:09
◼
►
that could assist drivers and control the cars
01:12:10
◼
►
from afar during emergencies.
01:12:12
◼
►
That sounds safe.
01:12:14
◼
►
I don't know if we really need to go
01:12:16
◼
►
go into the ins and outs of the particulars of this,
01:12:18
◼
►
but it is interesting that this poop show
01:12:21
◼
►
is still a poop show from what it sounds.
01:12:25
◼
►
- Why are they still doing any of this?
01:12:27
◼
►
That's my, you know, this project has gone through
01:12:30
◼
►
so many phases and ideas and certainly people
01:12:35
◼
►
and certainly money, oh my God.
01:12:38
◼
►
Like how long do they have to keep fumbling around
01:12:42
◼
►
with this massive money waster before they're like,
01:12:45
◼
►
you know what, this isn't for us.
01:12:46
◼
►
Let's just crap can it and move on.
01:12:49
◼
►
I still don't see why they even want to be in this business,
01:12:54
◼
►
let alone why they keep plowing forward
01:12:58
◼
►
without seemingly being able to get near any kind of outcome.
01:13:01
◼
►
- The story's interesting because it's kind of like,
01:13:06
◼
►
the best line of the story is where it says that,
01:13:10
◼
►
grappling with their vision, that blah, blah, blah,
01:13:12
◼
►
isn't feasible with current technologies.
01:13:14
◼
►
What it's saying is they don't have it.
01:13:16
◼
►
They would like to have a product that does X,
01:13:18
◼
►
but they don't.
01:13:19
◼
►
It's like, I would like something,
01:13:20
◼
►
I would like something that lets me levitate.
01:13:23
◼
►
That would be a cool product, but we don't have that.
01:13:27
◼
►
So how much money are you gonna spend on a project?
01:13:29
◼
►
So is this R&D?
01:13:31
◼
►
Like that's kind of the problem with self-driving.
01:13:33
◼
►
It's difficult to do the R&D to figure out
01:13:36
◼
►
if you can make a product without spending a lot of money.
01:13:39
◼
►
And I get that.
01:13:40
◼
►
It's not as simple as like,
01:13:40
◼
►
oh, we'll have these people tooling around
01:13:43
◼
►
with a multi-touch, right?
01:13:44
◼
►
And that'll be a small team of a handful of people
01:13:49
◼
►
and they'll tool away at it for years.
01:13:50
◼
►
And eventually, if we get to the point where,
01:13:52
◼
►
no, we could probably make a product out of this.
01:13:53
◼
►
Maybe we'll make a tablet, maybe we'll make a phone.
01:13:55
◼
►
Then you form the iPhone team and then the iPad team, right?
01:13:59
◼
►
And that's when you staff up
01:14:00
◼
►
and that's when you spend the money.
01:14:01
◼
►
Although the amount of money that was used
01:14:03
◼
►
to develop the original iPhone is comically low
01:14:05
◼
►
and every company should feel bad
01:14:07
◼
►
when they read those histories of like,
01:14:08
◼
►
so how much money did Apple spend?
01:14:10
◼
►
Even if we count like everything we can think of,
01:14:11
◼
►
How much do they spend to make the iPhone?
01:14:14
◼
►
The small number I think is they spend $150 million,
01:14:16
◼
►
which is like, we spent $150 million
01:14:18
◼
►
to change the color of the logo, right?
01:14:20
◼
►
That's what big companies normally do.
01:14:22
◼
►
So they spent $150 million to make the iPhone,
01:14:24
◼
►
how much money is it made?
01:14:26
◼
►
Anyway, there's more to it if you add to that, right?
01:14:28
◼
►
But for self-driving cars, it's like, okay,
01:14:31
◼
►
I can understand why they're saying,
01:14:32
◼
►
it would be cool if we had a self-driving car.
01:14:34
◼
►
I agree, that would be cool, right?
01:14:37
◼
►
Can we make one?
01:14:38
◼
►
Let's do some R&D to find out.
01:14:40
◼
►
But it has always seemed to me
01:14:41
◼
►
that this project at Apple was created
01:14:43
◼
►
as if they had already done that.
01:14:44
◼
►
It's like, oh, we know how to make a self-driving car,
01:14:46
◼
►
let's just make a project to make one.
01:14:47
◼
►
Like that it's already in the phase of like,
01:14:49
◼
►
we're designing a product and we're staffing a team
01:14:51
◼
►
to build that product and we're gonna contract manufacturers
01:14:54
◼
►
to manufacture that product.
01:14:55
◼
►
It's like, whoa, whoa, whoa,
01:14:56
◼
►
what product are you making?
01:14:57
◼
►
Our self-driving car.
01:14:59
◼
►
Do you have a self-driving car?
01:15:00
◼
►
He's like, no, but we'll probably be able
01:15:02
◼
►
to figure it out, right?
01:15:03
◼
►
Nope. (laughs)
01:15:05
◼
►
Like you gotta be able to make the thing first
01:15:08
◼
►
before you make a product team.
01:15:10
◼
►
So again, no one knows the details of this,
01:15:12
◼
►
but to the extent that anything with Project Titan
01:15:15
◼
►
has been in like let's make a product mode,
01:15:18
◼
►
that has been a bad idea if the premise of the product
01:15:21
◼
►
is a car without a steering wheel or pedals.
01:15:23
◼
►
You cannot make a car, let's go with that saying,
01:15:25
◼
►
you cannot make a car without a steering wheel or pedals
01:15:27
◼
►
until it can drive itself.
01:15:29
◼
►
And if it can't drive itself,
01:15:30
◼
►
maybe get out of the product phase
01:15:32
◼
►
and back into the R&D phase.
01:15:34
◼
►
Don't go to the product phase until you think
01:15:36
◼
►
you have something that works like the multi-touch.
01:15:38
◼
►
We've got a multi-touch screen that we think works.
01:15:41
◼
►
Maybe it's too big, maybe it's too clunky,
01:15:43
◼
►
maybe it takes a lot of power,
01:15:44
◼
►
maybe it's connected to like a power Mac or something,
01:15:46
◼
►
but it works.
01:15:48
◼
►
And the question is, can we make it smaller, cheaper,
01:15:51
◼
►
blah, blah, blah.
01:15:52
◼
►
They don't have anything that works.
01:15:54
◼
►
Nobody does.
01:15:55
◼
►
Nobody has this, right?
01:15:57
◼
►
It would be cool if you could make it, but make it first.
01:16:00
◼
►
I think this is just a hilarious story that the premise is,
01:16:03
◼
►
Apple has decided they can't make a product they can't make.
01:16:05
◼
►
Like, yeah, you gotta be able to.
01:16:07
◼
►
And the other thing they can decide is,
01:16:08
◼
►
we're not gonna make a car without pedals or steering wheel.
01:16:10
◼
►
Make one with pedals and a steering wheel
01:16:12
◼
►
that humans can drive.
01:16:13
◼
►
Lots of people do that.
01:16:14
◼
►
You can have lots of self-driving assistive functions
01:16:16
◼
►
that I think are probably a bad idea in many cases,
01:16:18
◼
►
but that's a thing that you can make.
01:16:20
◼
►
But the idea that they keep trying to make a car
01:16:22
◼
►
without a steering wheel, you can't do that.
01:16:25
◼
►
And so I'm sad for them and I'm sad for the story.
01:16:28
◼
►
I do like the idea that their AI powered chip
01:16:30
◼
►
or whatever is, that sounds so much like the Mac chip
01:16:33
◼
►
that they canceled, equivalent to four
01:16:34
◼
►
of their high-end processors.
01:16:36
◼
►
You know, I mean, you can't afford to put it in a $10,000,
01:16:40
◼
►
you know, Mac Pro, because $10,000 is too much for a Mac Pro.
01:16:42
◼
►
How much would this car cost?
01:16:44
◼
►
Again, if it actually existed and worked.
01:16:46
◼
►
My advice to Apple is if you wanna make a car,
01:16:48
◼
►
add a steering wheel and pedals.
01:16:50
◼
►
- Yeah, I just, I still go back and like,
01:16:53
◼
►
you know, to me, like the question you asked,
01:16:54
◼
►
like all right, can we make this thing work,
01:16:56
◼
►
and then, you know, then move on from there.
01:16:58
◼
►
I would step back again and say,
01:17:00
◼
►
is this even a business that we need to be in
01:17:02
◼
►
or want to be in?
01:17:04
◼
►
And what would it mean if we entered this business
01:17:08
◼
►
and actually stayed in it and maybe even succeeded in it?
01:17:11
◼
►
What would that actually mean for the rest of our business?
01:17:15
◼
►
Can you imagine Apple selling cars?
01:17:18
◼
►
How would that even physically work?
01:17:20
◼
►
Can you imagine Apple servicing cars?
01:17:22
◼
►
Can you imagine Apple going through all the regulations?
01:17:25
◼
►
Like, I can see that it's possible for them to do all that,
01:17:30
◼
►
But why would they want to?
01:17:32
◼
►
And what they would achieve in that
01:17:35
◼
►
would be they'd be a car manufacturer.
01:17:39
◼
►
Look at all the car manufacturers.
01:17:40
◼
►
How happy are they?
01:17:41
◼
►
How good of a business does that seem to be?
01:17:44
◼
►
It doesn't seem like it's worth them entering this business
01:17:48
◼
►
even if it was fairly easy to get into, and it's super not.
01:17:53
◼
►
And so I just don't see why they would even want
01:17:58
◼
►
to be in this business.
01:17:59
◼
►
That even sets aside the question of,
01:18:02
◼
►
could Apple be good at designing cars,
01:18:06
◼
►
which I think is a huge question mark, honestly.
01:18:10
◼
►
Apple's very good at designing a lot of things.
01:18:13
◼
►
I don't know that I would want an Apple-designed car,
01:18:17
◼
►
but again, I think that's even,
01:18:20
◼
►
that's too far down the line of thinking of like,
01:18:22
◼
►
why are they even doing this?
01:18:23
◼
►
Why do they wanna be in this business?
01:18:25
◼
►
That, to me, makes no sense.
01:18:27
◼
►
I mean, it is different in many ways that we've discussed, but I think it is actually
01:18:31
◼
►
a better fit with their traditional strengths than some other things.
01:18:37
◼
►
Because it is manufacturing a product that they sell for a profit.
01:18:42
◼
►
It is, you know, regulation stuff.
01:18:44
◼
►
Again, Apple's not used to automotive regulation, but they are used to dealing with the FCC
01:18:48
◼
►
and the various radio regulations and heat and thermal.
01:18:51
◼
►
Like there are different laws in different countries and being in compliance with them
01:18:54
◼
►
with electronic devices is a thing, right?
01:18:56
◼
►
It is a scaled up version of what they do.
01:18:59
◼
►
They're good at manufacturing, they don't have their own manufacturing plants, they outsource
01:19:02
◼
►
that, and they're good at helping those manufacturers do a better job of manufacturing.
01:19:06
◼
►
Apple is good at actually "making things," even if they're doing it through third parties.
01:19:09
◼
►
Like, I can squint and say, this looks like, again, a question of whether they'd be able
01:19:14
◼
►
to make a good car, but they make hardware products that are software powered and they
01:19:18
◼
►
manufacture them and comply with a bunch of laws.
01:19:20
◼
►
So it is conceivable that they could get good at this after many years because it is like
01:19:25
◼
►
what they do.
01:19:26
◼
►
This reminds me of a great quote that Gruber had back on December 21st. I put this in contrast
01:19:31
◼
►
to something that Apple already does that I think is not a good fit for their traditional
01:19:36
◼
►
strengths and model.
01:19:37
◼
►
This is a quote from, what was it, Gruber guested on a podcast and
01:19:43
◼
►
they talked about various Apple related things.
01:19:46
◼
►
And he put this little summary at the end of his link to the podcast he was a guest on.
01:19:52
◼
►
This is quoting Gruber.
01:19:53
◼
►
The App Store's financial success is the worst thing that's happened to Apple this century.
01:19:57
◼
►
It's a distraction at best and a profound corruption at worst.
01:20:01
◼
►
Services revenue and the App Store do not fit with Apple's traditional strengths of
01:20:06
◼
►
making a really good product and selling it to people for a profit because it's really
01:20:09
◼
►
good and they like it.
01:20:10
◼
►
It is a much more complicated model where you're selling access to them, you're rent-seeking,
01:20:16
◼
►
you're controlling a platform and then charging other people money to get access to the people,
01:20:19
◼
►
potentially collecting information about them and advertising to them.
01:20:23
◼
►
That model does not fit with the Apple that we love in a way that the car does.
01:20:27
◼
►
Because if Apple made a good car product, I believe that they could figure out how to
01:20:32
◼
►
sell it, how to service it.
01:20:33
◼
►
In the same way they figure out how to sell their hardware products, their phones and
01:20:37
◼
►
everything, they figure out how to service them, they figure out how to comply with laws,
01:20:40
◼
►
they figure out how to get them manufactured, they figure out how to sell them all over
01:20:43
◼
►
the world with different regulations.
01:20:45
◼
►
Obviously a car is way different than a phone.
01:20:47
◼
►
It is a much bigger scale thing. There are so many other complications, they're working with
01:20:51
◼
►
a bunch of other players, but that fits better. The incentives are aligned better for the
01:20:56
◼
►
Apple that I like, the Apple that makes its money by making a good product. Not the Apple
01:21:01
◼
►
that is a distraction at best or a corruption at worst, paraphrasing Gruber, that is distracted
01:21:07
◼
►
by the idea of like, "Yeah, but have we considered rent-seeking? Because that's where the real
01:21:11
◼
►
money is." That does not align with the incentive. I want them to be incentivized to make good
01:21:17
◼
►
The incentives that I like are: hey, if we make a bad keyboard,
01:21:19
◼
►
we have to pay people in a class action lawsuit.
01:21:21
◼
►
And if we make a good keyboard, people buy our stuff.
01:21:23
◼
►
If we put ports back on our laptops,
01:21:24
◼
►
people buy more of them.
01:21:25
◼
►
If we make a really fast, low-power computer,
01:21:27
◼
►
people buy it, right?
01:21:28
◼
►
If we make an amazing touchscreen phone,
01:21:29
◼
►
people buy it because it's a cool phone.
01:21:31
◼
►
That model aligns better with my interest as a customer.
01:21:34
◼
►
Make cool technology.
01:21:36
◼
►
Make a profit when I buy it, because it costs you
01:21:39
◼
►
less to manufacture it than it does for me to buy it from you.
01:21:42
◼
►
That's your profit.
01:21:43
◼
►
That's a really good business.
01:21:44
◼
►
It's a model I understand and that I'm more comfortable with
01:21:47
◼
►
than the App Store model, which is sell tons of devices that
01:21:51
◼
►
are awesome and then charge people 30% of all the money
01:21:54
◼
►
they make on them because you deserve it
01:21:56
◼
►
because you made the platform.
01:21:58
◼
►
The thing that-- just to go back a little bit to what Marco's
01:22:00
◼
►
saying, I think the thing that makes me very worried--
01:22:04
◼
►
maybe a little dramatic-- but worried about an Apple
01:22:06
◼
►
car is that I feel like the same problems that everyone
01:22:10
◼
►
snarks on Tesla about, like, oh, the windshield wipers don't
01:22:13
◼
►
work because it never rains in the Bay Area, which I know
01:22:15
◼
►
is not really true.
01:22:16
◼
►
but just go with it here.
01:22:18
◼
►
- They don't work that well.
01:22:19
◼
►
- Well, fair.
01:22:20
◼
►
And then the door handles would freeze constantly
01:22:23
◼
►
because it legitimately really rarely does freeze
01:22:25
◼
►
in the Bay Area.
01:22:26
◼
►
This very myopic view of the world,
01:22:28
◼
►
which I feel like is a very common problem
01:22:32
◼
►
for engineers in the Bay Area,
01:22:34
◼
►
I wonder if that's going to be an affliction
01:22:37
◼
►
for an Apple car as well.
01:22:38
◼
►
And where that would manifest itself is,
01:22:40
◼
►
oh, surely a $150,000 car is something anyone would want.
01:22:44
◼
►
We charge like 200% of what an average, you know, run-of-the-mill Android phone is for a new iPhone.
01:22:51
◼
►
People pay 2x for a new iPhone. Surely they'll pay 2 to 3x for an entire car, right?
01:22:57
◼
►
And no, people will not. Like maybe it's just me,
01:23:01
◼
►
but I don't have $150,000 just laying around to buy an Apple car, to lease an Apple car at $4,000 or whatever.
01:23:07
◼
►
I don't know. I don't lease cars, but whatever that lease payment would be.
01:23:09
◼
►
I don't have that just laying around.
01:23:11
◼
►
Like, it's wonderful if they think that them
01:23:13
◼
►
and their cushy worlds with their $3 million,
01:23:15
◼
►
2,000 or 200 square foot houses and their, you know,
01:23:19
◼
►
R8s and their Lamborghini Uruses and all that,
01:23:23
◼
►
like yeah, $150,000 car may be great in the Bay Area,
01:23:27
◼
►
but that ain't gonna work in the real world, my friends.
01:23:28
◼
►
- Doesn't it fit with Apple's model?
01:23:30
◼
►
Their hardware is always more expensive
01:23:32
◼
►
than everybody else's.
01:23:32
◼
►
And I know it's a different matter of scale.
01:23:34
◼
►
It's like, oh, 20% higher is an extra 200 bucks
01:23:36
◼
►
versus 20% higher is an extra 20,000 bucks.
01:23:40
◼
►
But still, I think that it is not impossible
01:23:43
◼
►
to be the car manufacturer that sells mostly cars
01:23:45
◼
►
that are too expensive.
01:23:46
◼
►
Look at Porsche.
01:23:47
◼
►
- Oh, I agree, I agree.
01:23:48
◼
►
But I just, I don't know.
01:23:49
◼
►
I think the thing, and maybe it was Gruber,
01:23:51
◼
►
maybe it was you, John.
01:23:52
◼
►
Somebody had once said, like,
01:23:54
◼
►
the great thing about Apple and about Coke is that if,
01:23:57
◼
►
okay, so if you want the best Coke in the world,
01:23:59
◼
►
the best Coca-Cola in the world.
01:24:01
◼
►
- And he was paraphrasing somebody else.
01:24:03
◼
►
- Yeah, wasn't this like Andy Warhol or somebody?
01:24:04
◼
►
- Yeah, yeah, yeah.
01:24:05
◼
►
- Who knows? - I think it was Andy Warhol.
01:24:06
◼
►
If you want the best Coca-Cola in the world, I don't care if you make a
01:24:09
◼
►
thousand dollars a year or a hundred million dollars a year, you're still
01:24:12
◼
►
getting the same Coca-Cola. If you want the best iPhone in the world, you know,
01:24:15
◼
►
whether you're rich and famous or you're just a regular
01:24:19
◼
►
schmo, you're getting the same iPhone. And I don't know, I feel like Apple wants to
01:24:23
◼
►
be in that space where they are a premium product, full stop, but they're
01:24:27
◼
►
not a 10x premium product. I mean, look at how well that worked out for the Apple
01:24:31
◼
►
Watch Edition. Like, yeah, some were sold, some to some of the hosts on this
01:24:35
◼
►
episode, but nevertheless... - Not the gold one! - Fair, fair. But you get what I'm
01:24:41
◼
►
driving at. They want to be a premium product, but I don't think that a
01:24:46
◼
►
10x premium product has ever really fit that well for them. See also the Mac Pro. And
01:24:51
◼
►
beyond that, we're talking, this is what you just said, Jon, we're talking about
01:24:54
◼
►
10x, you know, 2, 3, 4, 5, 10x of what's an average car these days?
01:24:59
◼
►
Anywhere between $20,000 and $40,000 for like an okay car? Like how much does a
01:25:03
◼
►
Civic cost, John? Like, that's a decent, very run-of-the-mill car. And a Civic is like
01:25:08
◼
►
fifteen, twenty thousand dollars, isn't it? Easily.
01:25:11
◼
►
What planet are you on? You cannot buy a Civic for fifteen thousand dollars. You haven't
01:25:15
◼
►
been able to do that for decades. Okay, fine. So it's been a while since I've
01:25:18
◼
►
bought a car. Maybe the average selling price of a car is
01:25:20
◼
►
like forty grand or something. Oh, there you go. So you think that they're
01:25:23
◼
►
just going to slide in here and charge 3x, 4x, 5x? Like, yeah, again, maybe that works
01:25:29
◼
►
in the Bay Area, where you folks are all getting paid absurd amounts of money and spending
01:25:33
◼
►
it all on housing. But that doesn't work in the real world. And I'm very concerned that
01:25:38
◼
►
they're going to strut in thinking, "Oh, we can totally charge 3X for our car because
01:25:43
◼
►
it's awesome and it's Apple."
01:25:44
◼
►
Well, that was one of the rumors, that one of the revisions of Project Titan, Apple's
01:25:49
◼
►
car project was scrapped because they did a product design and they said, "Here's
01:25:53
◼
►
the car we want to make," and then they priced it out and said, "No, it's too much."
01:25:57
◼
►
So they've already stopped one of their iterations due to cost because they realize—I mean,
01:26:02
◼
►
Again, I think it is perfectly valid to be a car manufacturer that makes expensive cars.
01:26:05
◼
►
You're just going to sell fewer of them.
01:26:06
◼
►
The question is, where do you want to be?
01:26:08
◼
►
Do you want to sell as many cars as Ferrari?
01:26:10
◼
►
They actually sell a lot of cars for a car company that sells cars that start at like
01:26:14
◼
►
200 grand or whatever, but they don't sell as many as Porsche, and then Porsche does
01:26:18
◼
►
not sell as many as Volkswagen and on down the line.
01:26:21
◼
►
So you have to kind of decide how many you want to sell.
01:26:23
◼
►
iPhones, they sell a heck of a lot of those.
01:26:25
◼
►
They sell them to not half the world, but a pretty stable portion of the potential smartphone
01:26:30
◼
►
selling market is iPhones.
01:26:31
◼
►
I don't know what it is worldwide, is it like 30% or something?
01:26:35
◼
►
If you wanted to sell 30% of the cars in the world, you can't price it at 100 grand.
01:26:39
◼
►
But there is a question of, is our first one really expensive and then the price comes
01:26:44
◼
►
down, especially with electric cars, which is what everyone assumes they're building.
01:26:48
◼
►
A lot of the price has to do with the battery; the cost driver for your car is the battery.
01:26:54
◼
►
Those electric motors don't actually cost that much and building a car costs as much
01:26:57
◼
►
as it's ever cost in terms of making a frame and suspension and tires and brakes or whatever,
01:27:01
◼
►
but the battery is the big cost thing. If the batteries cost nothing, electric cars
01:27:05
◼
►
would be incredibly cheap, but if the batteries were expensive as they were two decades ago,
01:27:09
◼
►
they're less feasible. And again, battery technology is something Apple is familiar
01:27:14
◼
►
with and has relationships with and stuff like that, so I think it is plausible that
01:27:18
◼
►
Apple could sell a car at a reasonable price range, but I also think if they ever come
01:27:24
◼
►
out with a car, it's definitely going to be the Apple of cars, which is going to be too
01:27:27
◼
►
expensive for most people, especially the first one. Think of the shock of the iPhone. Remember
01:27:31
◼
►
when the iPhone came out and everyone was, like, making fun of it because
01:27:33
◼
►
it cost so much money? Because it wasn't, like, carrier-subsidized or whatever. And today,
01:27:38
◼
►
the iPhones we all have are very expensive devices. If you had told us back when we were
01:27:42
◼
►
using flip phones that someday you're going to every year, or in my case every other year,
01:27:47
◼
►
buy a phone for $1200, you'd be like, "$1200 for a phone? I'm not going to carry that
01:27:51
◼
►
around and drop it on the pavement?" That's ridiculous. But people derive enough value
01:27:55
◼
►
from their smartphones that they're willing to pay hundreds, sometimes up into a thousand
01:27:58
◼
►
dollars for their phone sometimes every year.
01:28:01
◼
►
So that, of all things, I get where you're coming from, that it just doesn't seem like
01:28:07
◼
►
something that they're going to be able to sell a lot of, and it also doesn't seem like
01:28:10
◼
►
they want to be the Ferrari or the Porsche of phones, but they don't have a product to
01:28:15
◼
►
sell at all right now, so it's not really an issue.
01:28:16
◼
►
And then as for the Andy Warhol thing of like, no matter who you are, a Coke is a Coke, when
01:28:21
◼
►
Gruber uses that analogy and said no matter who you are, a phone is a phone, I think it's
01:28:24
◼
►
a little bit two different things. The idea with Coke is, Coke is not an expensive product
01:28:30
◼
►
to make. Anyone can buy it because it's very inexpensive and it's also not expensive to
01:28:34
◼
►
manufacture. The things that are in it are not rare, right? Whatever the secret formula
01:28:38
◼
►
is, it's not made with platinum flakes or something, right? So everyone can get it because
01:28:45
◼
►
it is accessible to everyone and it is the same. And the whole point is, because it's
01:28:48
◼
►
basically the cultural hegemony of Coca-Cola, because you're all raised drinking Coke and
01:28:53
◼
►
you want something that tastes like what you were raised on, you have brand loyalty to
01:28:56
◼
►
Coke, because if you try RC Cola it doesn't taste like what you think a Coke should taste like.
01:29:00
◼
►
Not because Coke is fancier and more expensive than RC, but just because it's what you're
01:29:03
◼
►
used to due to the dominance of the Coca Cola brand.
01:29:06
◼
►
Whereas the iPhone, no one can get a better iPhone, no matter how rich you are, because
01:29:11
◼
►
making a phone is really, really hard.
01:29:14
◼
►
Unlike Coca-Cola, which is protected by whatever the secret formula of it is, the iPhone is
01:29:18
◼
►
protected by the fact that making a smartphone with a complete ecosystem around it is so hard.
01:29:22
◼
►
There's a huge barrier to entry.
01:29:24
◼
►
And so yeah, no matter how rich you are,
01:29:26
◼
►
you can't get a better iPhone.
01:29:27
◼
►
Because even if given, you know, $3 trillion,
01:29:31
◼
►
try making a better iPhone than an iPhone
01:29:34
◼
►
with $3 trillion.
01:29:36
◼
►
It's a harder thing to make.
01:29:37
◼
►
Whereas if I gave you $3 trillion
01:29:39
◼
►
and said try making your own Coke,
01:29:40
◼
►
you'd just buy Coca-Cola.
01:29:42
◼
►
And you got your own Coke now, right?
01:29:44
◼
►
Like it's different in that like,
01:29:45
◼
►
one is like everyone gets the same Coke,
01:29:48
◼
►
which seems weird because it's so easy to make.
01:29:49
◼
►
And the other one, everyone gets the same,
01:29:51
◼
►
no one can make a better iPhone,
01:29:52
◼
►
no matter how rich they are, because it's so tremendously hard to make a smartphone.
01:29:56
◼
►
So I'm not sure that analogy works out.
01:29:58
◼
►
What it is trying to say is that it's nice that, to your point, Casey, that the very
01:30:04
◼
►
best X in the world is accessible to a large number of people.
01:30:08
◼
►
And although the iPhone is very expensive, it is accessible to a lot of people, especially
01:30:13
◼
►
if you're buying a used iPhone.
01:30:14
◼
►
I think that's what Gruber was celebrating, the idea that iPhones are so important to
01:30:18
◼
►
our life and if you want the very best one you don't have to be a millionaire to get
01:30:22
◼
►
it. You just have to shop for refurbs on Amazon or whatever. Whereas the very best car in
01:30:28
◼
►
the world is not accessible to most people because even if you're shopping for a refurb
01:30:33
◼
►
whatever you consider the best car it's going to cost a lot more than a Civic.
01:30:37
◼
►
It certainly will. Do you think that they would do something, I forget the term for
01:30:41
◼
►
it but like Volvo and I'm sure other manufacturers are doing this, have like a subscription service
01:30:45
◼
►
And the way this worked when it first came out, you had to go there.
01:30:49
◼
►
I'm asking because they love their services revenue.
01:30:52
◼
►
I agree with you.
01:30:53
◼
►
I think the Apple car would have a service revenue component.
01:30:56
◼
►
That's like the thing with the data center where they can assist drivers.
01:30:58
◼
►
I bet you pay for that service.
01:30:59
◼
►
Oh my gosh, I hope that's not true.
01:31:02
◼
►
But I hear you and you're probably right.
01:31:03
◼
►
Gruber's saying that it's a distraction at best and a profound corruption at worst.
01:31:07
◼
►
Whereas it is a profound corruption to every time they make a cool tech product to figure
01:31:10
◼
►
out, now, how can we make a service out of this?
01:31:14
◼
►
How can we make it a recurring payment?
01:31:15
◼
►
How can we make it so that you don't just buy the Apple car, you buy a subscription
01:31:19
◼
►
to the Apple car and if you want your heated seats to get, like again, the auto industry
01:31:22
◼
►
is way out ahead of Apple here, but I have to imagine that within Project Titan, this is being
01:31:27
◼
►
discussed, because the old Apple that would create a technology product without service revenue
01:31:33
◼
►
attached to it seems dead.
01:31:34
◼
►
Like the headset.
01:31:35
◼
►
Headset's going to be cool tech if and when they ship it.
01:31:38
◼
►
And you're like, well, that's not going to have a service associated with it.
01:31:40
◼
►
Of course, it will have an app store.
01:31:41
◼
►
That is the easy go-to of like if we sell anything that is a platform, it has an associated
01:31:45
◼
►
app store where we try to make service revenue, which may or may not work, but that's the go-to.
01:31:49
◼
►
So if Apple comes out with a car, there will be service revenue associated with it because
01:31:54
◼
►
that's what modern Apple does.
01:31:55
◼
►
Yeah, I just don't know if they're going to do like a Care by Volvo, as I was just looking
01:31:59
◼
►
it up, is what I believe I'm thinking of, where you get a car and, I believe, insurance
01:32:04
◼
►
as well and maintenance and all that.
01:32:06
◼
►
You just pay a monthly fee and I think you might even be able to switch which Volvo you
01:32:10
◼
►
have from time to time and so on and so forth. I don't know, I feel like it wouldn't surprise
01:32:15
◼
►
me if that's how they get in the door is that you're not buying a $300,000 Apple car. You're
01:32:20
◼
►
instead paying a hilariously expensive lease or subscription or what have you to have access
01:32:25
◼
►
to Apple's Apple car that they own.
01:32:27
◼
►
Well, that's the Apple car upgrade program.
01:32:29
◼
►
Right, exactly.
01:32:30
◼
►
You get a new Apple car every year. And again, I don't, I don't, that is more of a, that's
01:32:34
◼
►
less of the service revenue thing I've been complaining about. I'm just like, hey, it's kind of like
01:32:37
◼
►
a rolling lease where you get a new car at a certain interval, but you can also buy it
01:32:42
◼
►
It's the service revenue I'm talking about.
01:32:43
◼
►
Your car is useless to you unless you pay X dollars a month for the service that, I
01:32:46
◼
►
don't know, whatever they would charge for.
01:32:48
◼
►
We're not going to go back to the heated seats or whatever from BMW, but this is obviously
01:32:53
◼
►
not just like, "Oh, Apple's being mean.
01:32:54
◼
►
They shouldn't do this."
01:32:55
◼
►
This is the way the whole industry and the whole world is going.
01:32:57
◼
►
And to some extent it makes sense, but it feels worse when the people who are charging
01:33:04
◼
►
are not charging because the service is useful, but charging because they've got you in a
01:33:10
◼
►
vise, like you can't get out of it.
01:33:12
◼
►
You have no other choice, right?
01:33:14
◼
►
It doesn't seem like you would pay for this otherwise, but because Apple has complete
01:33:18
◼
►
control over the platform, they get to extract 30% of all the transactions.
01:33:21
◼
►
That feels worse than "let me pay."
01:33:24
◼
►
Even just something as simple as "let me pay for Sirius XM Radio" or whatever.
01:33:26
◼
►
I want to pay for ongoing access to this thing.
01:33:28
◼
►
If you think it's worthwhile, pay for it.
01:33:30
◼
►
If you don't, don't.
01:33:31
◼
►
but it's not like all of its competitors
01:33:34
◼
►
are forced to also pay Sirius
01:33:35
◼
►
to be able to put audio into your car.
01:33:37
◼
►
Like that's what starts feeling bad.
01:33:39
◼
►
- We are brought to you this week by Memberful.
01:33:42
◼
►
Diversify your revenue stream
01:33:44
◼
►
and build sustainable recurring revenue with membership.
01:33:48
◼
►
Memberful has everything you need
01:33:49
◼
►
to run a membership program for your audience.
01:33:52
◼
►
Features like custom branding support,
01:33:54
◼
►
gift subscriptions, Apple Pay, free trials,
01:33:57
◼
►
private podcasts, and so much more.
01:34:00
◼
►
And Memberful integrates with the tools you already use.
01:34:03
◼
►
Tools like WordPress, MailChimp, Discord, and so many more.
01:34:08
◼
►
You can create members-only podcasts using your existing podcast hosting.
01:34:11
◼
►
You can start earning more revenue with your current publishing workflow without changing
01:34:14
◼
►
any of that.
01:34:15
◼
►
And all of this is so that you can focus on what you do best while earning revenue quickly.
01:34:20
◼
►
You always have full control and ownership of your audience, your brand, and your membership.
01:34:25
◼
►
Payments even go directly to your own Stripe account.
01:34:27
◼
►
So, Memberful really is aimed at aligning their incentives with yours and really aimed
01:34:32
◼
►
at the needs of creators.
01:34:34
◼
►
This isn't something where people just start from having no audience and then Memberful
01:34:39
◼
►
takes over the whole thing.
01:34:41
◼
►
This is for people who you have an audience and you want to retain control of it and you
01:34:45
◼
►
have your own opinions on the branding and everything else.
01:34:47
◼
►
This is a tool for you.
01:34:48
◼
►
They have great support to help you get started if you need it and they're always willing
01:34:52
◼
►
to help you jump in and maybe optimize for conversions or whatever else you might need.
01:34:56
◼
►
And you can check it out today for free with no credit card required at memberful.com/ATP.
01:35:03
◼
►
You can see for yourself how it works for you.
01:35:05
◼
►
This is used by some of the biggest creators on the web.
01:35:08
◼
►
And you know, the great thing about Memberful is that you can launch this whole new revenue
01:35:11
◼
►
stream without rebuilding your entire tech stack.
01:35:14
◼
►
So again, see for yourself at memberful.com/ATP to start your free trial today.
01:35:21
◼
►
Thank you so much to Memberful for sponsoring our show.
01:35:25
◼
►
(upbeat music)
01:35:28
◼
►
- All right, let's do a little bit of Ask ATP,
01:35:30
◼
►
and let's start with Peter Petrovic, who writes,
01:35:32
◼
►
"In this day and age of social media all around us,
01:35:35
◼
►
"and with the Fediverse gaining traction,
01:35:37
◼
►
"is it still worth one's time to have a personal website
01:35:39
◼
►
"or a blog residing on one's personal domain?"
01:35:42
◼
►
Yeah, I think so, because I'm a nerd,
01:35:43
◼
►
and I want that control.
01:35:45
◼
►
I mean, nobody can ever take away
01:35:49
◼
►
caseyliss.com from me.
01:35:52
◼
►
Well, that's not a challenge.
01:35:52
◼
►
Please don't take that as a challenge.
01:35:53
◼
►
But you know what I'm saying.
01:35:54
◼
►
it's mine, that's where I live, that's my space.
01:35:58
◼
►
My space, space, you know what I'm saying.
01:36:00
◼
►
And so anyway, I control that,
01:36:03
◼
►
and my email address is at caseyliss.com.
01:36:06
◼
►
So when I moved from Gmail to Fastmail,
01:36:09
◼
►
then nobody knew it.
01:36:12
◼
►
Well, I mean, people knew it
01:36:12
◼
►
because I talked about it on the show,
01:36:13
◼
►
but like any of the people I knew--
01:36:14
◼
►
- I was gonna say, everyone knew it.
01:36:17
◼
►
- Hush you, nobody asks.
01:36:19
◼
►
It's caseyliss.com/fastmail.
01:36:20
◼
►
Anyway, but again, you take the point,
01:36:24
◼
►
you get the point I'm trying to make here is that because it was under my control, everyone
01:36:28
◼
►
else that I was emailing with was completely ignorant to the fact that these emails were
01:36:32
◼
►
actually getting routed to Fastmail instead of Gmail. And similarly, you know, my website
01:36:36
◼
►
will live there and maybe it will continue to run on my broken blog engine, maybe it'll
01:36:40
◼
►
run on something else, but it will always be there. And even though my hot takes and
01:36:45
◼
►
quips and whatnot I used to put on Twitter and now I'm putting on Mastodon, it's still
01:36:50
◼
►
to me important to have a like canonical place to live, if you will, a place that is mine
01:36:57
◼
►
and that is kind of my home base on the internet. And I think it is absolutely worth it. Even
01:37:02
◼
►
if all you have is like a single serving site, you know, an about page that says, "Hey, I'm
01:37:07
◼
►
Casey. I do these, you know, I have these projects. You can find me at these places."
01:37:10
◼
►
I still think that's important. And let's not lose sight of the fact that, you know,
01:37:15
◼
►
we were all on Mastodon servers years ago, most of which have crumbled, except John who
01:37:19
◼
►
apparently was on every Mastodon server, but for like Marco and me, you know, or at least
01:37:23
◼
►
certainly me, and I thought Marco, you were in the same boat. We signed up on servers
01:37:26
◼
►
that have since disappeared and that could happen. Now granted, you know, the three of
01:37:30
◼
►
us seem to be using Mastodon.social now, which seems to be the most stable, but you never
01:37:35
◼
►
know, you never know what'll happen. And so I absolutely stand by and think it is important
01:37:40
◼
►
and it's not that expensive. You know, you can use past sponsor Hover to get a domain.
01:37:45
◼
►
You can use past sponsor Squarespace
01:37:48
◼
►
in order to put a website there.
01:37:49
◼
►
In fact, I think you can even get your domain
01:37:51
◼
►
through Squarespace if you so choose.
01:37:53
◼
►
- Linode, yep, that's another past sponsor as well.
01:37:56
◼
►
- Pretty sure those two are also current sponsors,
01:37:58
◼
►
by the way, just not this episode.
01:38:00
◼
►
- I'm sorry, that's what I meant.
01:38:01
◼
►
Sorry, just not this episode is what I meant,
01:38:03
◼
►
but thank you for the correction.
01:38:04
◼
►
But yeah, I mean, we bring these services up
01:38:06
◼
►
in part because genuinely we either use
01:38:09
◼
►
or genuinely recommend them because they really are great.
01:38:12
◼
►
And so this is something that I absolutely think
01:38:14
◼
►
you should have a little corner of the internet
01:38:17
◼
►
that is under your control.
01:38:19
◼
►
And I stand by that 100%.
01:38:21
◼
►
Let's start with Marco, where do you land on this?
01:38:23
◼
►
And I think I have a pretty strong feeling about it.
01:38:26
◼
►
- So I think we have to separate out like,
01:38:28
◼
►
is it worth having your own domain from,
01:38:31
◼
►
what Peter was asking here was,
01:38:33
◼
►
is it worth having a personal website or blog
01:38:35
◼
►
on a personal domain?
01:38:36
◼
►
- That's fair, that's fair.
01:38:37
◼
►
- So whether you want to have a website or a blog,
01:38:40
◼
►
that's up to you.
01:38:41
◼
►
I do think it's worth having a domain
01:38:44
◼
►
because as Casey was saying, on the long term,
01:38:49
◼
►
on an infinite time scale, services come and go.
01:38:52
◼
►
And servers come and go, and companies come and go,
01:38:55
◼
►
and what you want to do on the website,
01:38:57
◼
►
what your identity is, what services and functions
01:39:01
◼
►
it needs to offer or provide, all of that changes over time.
01:39:06
◼
►
And so it is nice to have something
01:39:08
◼
►
that is your own domain, so that way,
01:39:10
◼
►
when you have your email address at your own domain,
01:39:13
◼
►
instead of at a certain email provider.
01:39:15
◼
►
That pretty much guarantees that as long as no one
01:39:20
◼
►
big email provider that does not support custom domains
01:39:23
◼
►
ever takes over and makes it impossible
01:39:26
◼
►
to use a custom domain, which seems unlikely with email,
01:39:28
◼
►
that pretty much guarantees that you can be
01:39:30
◼
►
service provider portable or agnostic.
01:39:33
◼
►
And so, for instance, if you use Gmail with your own domain,
01:39:37
◼
►
and then Gmail starts to suck,
01:39:38
◼
►
or they start charging money for that
01:39:40
◼
►
and you don't wanna pay or whatever happens there,
01:39:42
◼
►
If you don't wanna be on Gmail anymore,
01:39:44
◼
►
you can just move what is hosting your email,
01:39:47
◼
►
but it's, to the world, it's still your domain.
01:39:51
◼
►
You change over the MX records and that's it.
01:39:53
◼
►
So you can change your hosting provider
01:39:55
◼
►
and not affect your ability to be reached or found
01:39:58
◼
►
or whatever popularity you've built up or whatever.
01:40:01
◼
►
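To make the MX-record point concrete, here is a minimal sketch of what "your address stays the same, only the records change" looks like in practice. It assumes the third-party dnspython package (pip install dnspython) and uses example.com as a placeholder domain; it is an illustration of the idea, not a tool anyone on the show uses.

import dns.resolver

def mail_hosts(domain: str) -> list[str]:
    # Return the MX targets for a domain, lowest preference (highest priority) first.
    answers = dns.resolver.resolve(domain, "MX")
    records = sorted(answers, key=lambda r: r.preference)
    return [str(r.exchange).rstrip(".") for r in records]

# Switching from Gmail to Fastmail only changes what this prints;
# you@example.com keeps working for everyone who emails you.
print(mail_hosts("example.com"))

Moving providers is then just repointing those MX records at the new host, which is exactly the portability being described here.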
If Mastodon continues to be a thing, and I hope it does,
01:40:06
◼
►
then certainly there might become some value
01:40:08
◼
►
in having your own domain for that,
01:40:09
◼
►
in the sense that, again, that's a username
01:40:11
◼
►
that you can easily port between different backend options.
01:40:16
◼
►
That being said, it's a little bit less important there
01:40:17
◼
►
'cause they have this whole redirect mechanism in place.
01:40:20
◼
►
But basically for many services,
01:40:23
◼
►
the redirect procedure isn't so simple or effective
01:40:26
◼
►
or permanent, and so it is nice to be able
01:40:30
◼
►
to have your own domain that things are pointed to
01:40:32
◼
►
and that you can host whatever you want there over time.
01:40:35
◼
►
Now whether you should have a blog,
01:40:38
◼
►
that's a different question, and that's up to you.
01:40:41
◼
►
You know, personally, I have had a blog for a long time.
01:40:45
◼
►
I don't really use it much anymore,
01:40:47
◼
►
but I like the idea that I can always go back to it.
01:40:49
◼
►
Many people, in the collapse of Twitter,
01:40:53
◼
►
many people are going back to their own personal blogs
01:40:55
◼
►
and reviving them and writing again,
01:40:57
◼
►
and I think that's great.
01:40:58
◼
►
And so, you know, whether you're gonna like, you know,
01:41:03
◼
►
become the next Engadget, posting, you know,
01:41:05
◼
►
30 posts a day or whatever, that's, you know,
01:41:08
◼
►
you don't have to do that to make it worth
01:41:09
◼
►
having a personal blog.
01:41:11
◼
►
It could be like John's where you post once a year,
01:41:13
◼
►
or it could be like mine,
01:41:14
◼
►
and when you post whenever you wanna promote something
01:41:16
◼
►
or call out a particularly bad bug to Apple
01:41:18
◼
►
and hope they fix it.
01:41:18
◼
►
Like, whatever it is, you can post once every five years
01:41:21
◼
►
and through the magic of RSS readers,
01:41:23
◼
►
it doesn't really matter.
01:41:24
◼
►
You're not gonna lose your audience.
01:41:25
◼
►
They'll just see a post once every five years
01:41:27
◼
►
and that'll be it.
01:41:28
◼
►
So whether you have a blog, that's up to you,
01:41:30
◼
►
but I think that can be very flexible.
01:41:31
◼
►
But I do think whether you should have your own domain
01:41:35
◼
►
and some way to put stuff on that domain,
01:41:37
◼
►
whether that's your, you know,
01:41:38
◼
►
whether you're just pointing it to even like
01:41:40
◼
►
static page on GitHub that will host it for free,
01:41:42
◼
►
or whether you're putting it to like an S3 bucket,
01:41:44
◼
►
or having a full-blown host like Squarespace,
01:41:46
◼
►
or running a full-blown server like Linode,
01:41:48
◼
►
whatever it is, having a domain,
01:41:51
◼
►
if you are nerdy enough to even know
01:41:52
◼
►
what any of the stuff I'm talking about means,
01:41:54
◼
►
is probably valuable to you, and you should probably do it.
01:41:57
◼
►
- Yeah, this is kind of, this topic comes up a lot,
01:42:00
◼
►
and I think it's kind of a, not a happy accident,
01:42:03
◼
►
but we were lucky enough that the standards of the internet
01:42:07
◼
►
were made by people who wanted to do something good for the world, or do it in
01:42:16
◼
►
a kind of open way.
01:42:18
◼
►
The attitudes of the people who made the internet were such that the things we got out of it
01:42:24
◼
►
— DNS, TCP/IP, the fact that it was part of a government program that became open,
01:42:29
◼
►
all those different systems — that they sort of snowballed and gained critical mass
01:42:35
◼
►
before private or public corporations could come in and extract all the value means that
01:42:41
◼
►
we actually have a competitive market for the things we're talking about.
01:42:46
◼
►
The reason we say it's so important is not because, to Marco's point, not because we
01:42:50
◼
►
think you should have a blog or be a blogger, but because DNS and owning your own domain
01:42:55
◼
►
name is a way for you to control your identity and your data on the internet.
01:43:01
◼
►
The internet is really important and having control of your identity is important.
01:43:04
◼
►
The reason you can control it is because DNS is not owned and controlled by Google or Microsoft
01:43:09
◼
►
or Apple, right?
01:43:10
◼
►
TCP/IP is not owned and controlled by any corporation.
01:43:13
◼
►
It wasn't embraced and extended by Microsoft, so they own all the networking.
01:43:16
◼
►
We're not doing all this on MSN or whatever.
01:43:18
◼
►
Those open standards that sort of got their foot in the door before corporations would
01:43:22
◼
►
come and ruin everything exist and you should take advantage of them.
01:43:26
◼
►
And all the technical ins and outs of why it's portable and everything are important,
01:43:30
◼
►
but the underlying technology is what makes that possible.
01:43:33
◼
►
That's why there is a competitive market for where you register your domain name.
01:43:37
◼
►
You can use it at one of our sponsors, you can use it at different companies.
01:43:39
◼
►
There's companies that compete to try to be the place where you register your domain.
01:43:43
◼
►
No one controls all domain names.
01:43:45
◼
►
You can register at different places.
01:43:47
◼
►
What about hosting?
01:43:48
◼
►
Where do I host my website?
01:43:49
◼
►
There is a competitive market for hosting your website.
01:43:52
◼
►
Once you control the domain, the hosting companies know you can put that domain anywhere.
01:43:56
◼
►
You can run a server out of your closet, you can put it on Squarespace, you can put it
01:43:59
◼
►
on Linode, you can put it on like there's a million companies that do that because nobody
01:44:02
◼
►
owns and controls, oh this is the one company that controls, like, AWS does not control
01:44:06
◼
►
web hosting, although sometimes it seems that way. They don't. You can host anything anywhere.
01:44:11
◼
►
Same thing with email. It happens to be an open protocol that, yes, there are dominant
01:44:14
◼
►
players in the market, but you can host your email at different places. It's not as easy
01:44:18
◼
►
to be receiving email, which is what we talked about in past shows, because of all the spam
01:44:21
◼
►
rules and all the crap like that. Some of that early sort of open stuff didn't go quite
01:44:25
◼
►
so well. Maybe email didn't work out that well for the world. But that's why we're always
01:44:30
◼
►
pressing on that you should have your thing, not because we think you should be a blogger
01:44:34
◼
►
or not because we think you should use one of our sponsors and have a website, but because
01:44:37
◼
►
we want individuals to own and control their identity on the web.
01:44:41
◼
►
That's why we push back on the idea of some username at twitter.com.
01:44:45
◼
►
You don't own and control that.
01:44:46
◼
►
Twitter does, right?
01:44:48
◼
►
Same thing with Mastodon, although Mastodon does let you use the WebFinger protocol and
01:44:51
◼
►
various other things to basically be your username at domain that you control on Mastodon,
01:44:57
◼
►
but I haven't gone into the WebFinger thing to try it.
01:44:59
◼
►
It seems like it might not be as well supported in various clients and places as I expect,
01:45:05
◼
►
but that is the ideal.
01:45:06
◼
►
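For the curious, the WebFinger piece mentioned here is a small HTTP lookup (RFC 7033): a Mastodon handle like user@domain resolves through https://domain/.well-known/webfinger, so a domain you control can serve that document and point at whichever server currently hosts your account. A minimal, standard-library-only sketch, with a placeholder handle:

import json
import urllib.parse
import urllib.request

def webfinger(handle: str) -> dict:
    # Resolve an "acct:user@domain" handle via the domain's well-known endpoint.
    user, domain = handle.lstrip("@").split("@", 1)
    query = urllib.parse.urlencode({"resource": f"acct:{user}@{domain}"})
    with urllib.request.urlopen(f"https://{domain}/.well-known/webfinger?{query}") as resp:
        return json.load(resp)

for link in webfinger("someone@example.social").get("links", []):  # placeholder handle
    print(link.get("rel"), link.get("href"))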
That's what we're shooting for.
01:45:08
◼
►
Open protocols, open standards, a competitive market to provide the services, and that's
01:45:13
◼
►
where this falls down.
01:45:14
◼
►
It's like we're a nerd show and we know that if you're listening to this, maybe you know
01:45:17
◼
►
how to make your own website.
01:45:18
◼
►
We all wish it was easier, and that's why in a competitive market, you have companies
01:45:22
◼
►
like Squarespace that are saying, "We're going to try to make it as easy as possible for
01:45:25
◼
►
you to do this."
01:45:26
◼
►
But it's still more complicated than you would like, and the competition is,
01:45:29
◼
►
"Oh, look how easy it is to sign up for Twitter.
01:45:31
◼
►
You don't have to pick a server, you just go to twitter.com
01:45:33
◼
►
and you create an account and it's so easy."
01:45:35
◼
►
Private companies can make it easier
01:45:37
◼
►
because there is less competition.
01:45:38
◼
►
There's one place to go.
01:45:40
◼
►
Mastodon, you have to pick a server
01:45:41
◼
►
and even then you're sort of under the thumb of that server
01:45:43
◼
►
even with the redirect rules and everything, right?
01:45:45
◼
►
Domain names and web hosting are the most open
01:45:49
◼
►
and that means there is the most choice
01:45:51
◼
►
and that means there is kind of a barrier to entry
01:45:53
◼
►
because like, "Where do I go?
01:45:54
◼
►
What do I want to use?"
01:45:55
◼
►
You know, it's a little bit trickier to use, but that is the beauty of it. Like,
01:45:59
◼
►
claim a portion of the internet for yourself because as websites come and go, as MySpace, you know, rises and falls,
01:46:07
◼
►
as Facebook comes in and out of favor, as Twitter comes and goes, and Mastodon comes and goes, if you own your own domain name,
01:46:13
◼
►
your place on the internet will always exist at static URLs that you control.
01:46:19
◼
►
That's why we always send you to ATP.fm/store because that is the URL that we control. And yes,
01:46:24
◼
►
it leads you to Cotton Bureau with various URLs,
01:46:28
◼
►
but at various times it has led you to other places.
01:46:30
◼
►
We want to send you to the place that we control,
01:46:32
◼
►
so that if you hear something that says ATP.fm store,
01:46:37
◼
►
five years in the past or five years in the future,
01:46:40
◼
►
that page will still exist.
01:46:41
◼
►
And we know that because we control it.
01:46:43
◼
►
And whether we're hosted on Squarespace or Linode
01:46:45
◼
►
or Marco's water closet, wherever we're hosted,
01:46:50
◼
►
we control that URL.
01:46:51
◼
►
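As an illustration of the "URL you control" idea, here is a tiny sketch of a redirector you could park behind your own domain: /store always exists at your address, and only the target changes when the backend moves. Standard library only; the target URL and port are placeholders, not how ATP.fm actually works.

from http.server import BaseHTTPRequestHandler, HTTPServer

STORE_TARGET = "https://shop.example.net/atp"  # swap this whenever the store moves

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.rstrip("/") == "/store":
            self.send_response(302)              # temporary: the destination may change
            self.send_header("Location", STORE_TARGET)
            self.end_headers()
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), Redirector).serve_forever()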
That's-- this is the ideal.
01:46:53
◼
►
The web happened to get out the door before corporations could totally destroy it.
01:46:57
◼
►
And yes, it is tricky or whatever, but this is how we wish everything was.
01:47:01
◼
►
We wish that our names on social were our names at a domain that we control, but that
01:47:05
◼
►
is unfortunately, it is not reasonable to expect as much as nerds would like every person
01:47:11
◼
►
in the world to have their own domain name.
01:47:13
◼
►
The namespace contention alone would be horrendous, right?
01:47:16
◼
►
So we recognize this is not, you know, it's a little bit of a fantasy to say that every
01:47:21
◼
►
Every person in the world is going to have their own domain name, although with IPv6
01:47:24
◼
►
they can have their own IP address, but that's a different story.
01:47:27
◼
►
But for the people listening to this show, you're not every person in the world, unfortunately
01:47:31
◼
►
You are a very, very, very tiny subset.
01:47:33
◼
►
And for the people listening to this show, I definitely recommend getting your own domain
01:47:37
◼
►
and using it and making a bunch of static URLs that never change and putting stuff on it.
01:47:43
◼
►
And if you post it once every five years, that's fine.
01:47:44
◼
►
If the only thing on it is a link to your resume, that's fine.
01:47:46
◼
►
Whatever you want to do.
01:47:47
◼
►
The point is that you control it.
01:47:48
◼
►
Once you have that, no one can take it away from you except probably the government, and
01:47:52
◼
►
hopefully we'll stop that from happening.
01:47:54
◼
►
Alright, Jeremy Nash writes, "I've recently learned about Bitrot and I am going down a
01:47:58
◼
►
rabbit hole convinced all my data will be corrupted one day.
01:48:01
◼
►
Do you guys worry about this on your Synologies?
01:48:03
◼
►
Is ZFS or TrueNAS the only answer?
01:48:05
◼
►
And related, if your Synologies died today, would you immediately replace them or would
01:48:08
◼
►
you look at a do-it-yourself solution like Unraid or TrueNAS?"
01:48:11
◼
►
I'll start here.
01:48:13
◼
►
I am actually looking at, I feel like we just talked about this, maybe we didn't, but I
01:48:16
◼
►
am looking to replace my Synology probably in the first half of this coming year, not
01:48:20
◼
►
because there's anything wrong with it, but because it's 10 years old and I think it's
01:48:23
◼
►
starting to feel its age and it's time to maybe get something new.
01:48:28
◼
►
But I freaking love this thing.
01:48:29
◼
►
I adore my Synology.
01:48:31
◼
►
I totally understand that if you're patient enough and enjoy fiddling enough, Unraid
01:48:37
◼
►
or TrueNAS may be interesting to you, but it is not to me.
01:48:40
◼
►
With regard to bit rot, it's something I try not to think about because I don't want to
01:48:45
◼
►
think about it. But one of the things that I want to do on a new Synology, which I don't
01:48:48
◼
►
think my current one supports, is I want to move to some file system, I believe, Butter,
01:48:54
◼
►
Btrfs, whatever it is. I'll talk to Jon about it when the time comes. But I would like to
01:48:57
◼
►
move to some file system that prevents this. Maybe it's, maybe Butter isn't the one I'm
01:49:01
◼
►
thinking of, but one of the ones that at least does a passable amount of effort to try to
01:49:06
◼
►
prevent bit rot. But yeah, it's a fact of life unless you're actively avoiding it. And
01:49:12
◼
►
I'm not actively avoiding it yet, but I hope to be soon.
01:49:15
◼
►
Let's start with Jon this time.
01:49:16
◼
►
- Yeah, so congratulations for learning about bit rot.
01:49:20
◼
►
Yes, all your bits will be corrupted someday
01:49:23
◼
►
on an infinite timeline.
01:49:25
◼
►
And the defense against that are these systems that,
01:49:28
◼
►
well, there are two defenses against this.
01:49:29
◼
►
One is you want to detect when this happens.
01:49:32
◼
►
Because if you don't detect when this happens,
01:49:33
◼
►
all you do is you'll propagate your corrupted data
01:49:35
◼
►
to all your backups, right?
01:49:36
◼
►
So there's lots of different file systems
01:49:37
◼
►
and storage systems that do this.
01:49:40
◼
►
There's two parts to that.
01:49:41
◼
►
One is detecting whether it happened, in which case you have to have some kind of like checksum
01:49:45
◼
►
to say, "Hey, we wrote bits like this.
01:49:47
◼
►
Are they still like that?"
01:49:49
◼
►
You need to be able to answer that question.
01:49:51
◼
►
Second is, when you get the answer, it says, "No, actually, when we wrote these, they look
01:49:54
◼
►
like this, but now they look like that."
01:49:58
◼
►
They might just know, "This is not what we wrote.
01:50:00
◼
►
Ten years ago, we wrote some data here, and now the checksum doesn't match.
01:50:02
◼
►
This is bad."
01:50:04
◼
►
What do you do about that?
01:50:05
◼
►
Well, what you want is to be notified, at the very least, for your device to say, "Hey,
01:50:11
◼
►
this file is corrupt." That doesn't help you if you don't have a way to fix that corruption.
01:50:16
◼
►
One way you can fix it, if you're lucky, is "okay, well I have a backup of that file,
01:50:21
◼
►
and I got notified promptly that the file was corrupt, so that means that my backup
01:50:25
◼
►
isn't corrupt, because it just turned corrupt, and I backed up, you know, I have like 30
01:50:29
◼
►
days worth of backups, so I can pull that file from 30 days ago and restore it, and
01:50:34
◼
►
now it's not corrupt anymore." That relies on timely notification and timely action and
01:50:38
◼
►
frequent backups. The other way you can do that is the file system itself can store enough
01:50:43
◼
►
redundant information such that when it finds something is corrupt, it can just fix it itself
01:50:48
◼
►
if it has a good copy of that data somewhere. But of course that eats your storage space,
01:50:52
◼
►
because now you're not just storing all your data once, you're storing it 1.2 times, 1.5
01:50:56
◼
►
times depending on how many errors you want to be able to recover from. If you want to
01:51:00
◼
►
store your data twice, three times, five times like AWS does in S3 or whatever, the more
01:51:04
◼
►
copies of your data you store, the more likely you will be able to repair it when the computer
01:51:10
◼
►
detects that it's bad.
01:51:12
◼
►
So again, two parts.
01:51:13
◼
►
Detect when it's bad, be able to fix it.
01:51:15
◼
►
And those are two different things.
01:51:17
◼
►
Just because you use a file system that has checksums, you're like, "I'm protected!"
01:51:20
◼
►
All it's going to do is tell you when you're screwed immediately.
01:51:22
◼
►
"Oh, I found your files corrupt and you don't have any other copies of it.
01:51:25
◼
►
Haha, that doesn't help you."
01:51:27
◼
►
You have to have a good copy of the data somewhere.
01:51:31
◼
►
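A minimal sketch of the "detect" half being described here: record a checksum for every file once, then re-verify later and report anything whose bits no longer match. Detection is all this gives you; fixing what it finds still requires a good copy somewhere, via redundancy or versioned backups. The paths are placeholders.

import hashlib
import json
import pathlib

def sha256(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: str) -> dict[str, str]:
    # Record a checksum for every regular file under root.
    return {str(p): sha256(p) for p in pathlib.Path(root).rglob("*") if p.is_file()}

def verify(manifest: dict[str, str]) -> list[str]:
    # Return files that are missing or whose checksum no longer matches.
    return [path for path, digest in manifest.items()
            if not pathlib.Path(path).is_file() or sha256(pathlib.Path(path)) != digest]

manifest = build_manifest("/path/to/photos")                 # placeholder path
json.dump(manifest, open("manifest.json", "w"))              # keep this alongside your backups
# ...months later...
print(verify(json.load(open("manifest.json"))))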
So ZFS and BTRFS and a bunch of other things can let you sort of choose how much storage
01:51:36
◼
►
space do you want me to burn saving redundant data such that you can repair small errors,
01:51:42
◼
►
bigger errors or whatever.
01:51:43
◼
►
Really if you want to be the best protected you should use a file system that does that,
01:51:47
◼
►
store some amount of redundant data and also have backups with not just one backup, not
01:51:52
◼
►
like you overwrite the backup with the new one every time; you want to have multiple versions.
01:51:57
◼
►
Like even Backblaze has this now where you can save different versions for 30 days or
01:52:00
◼
►
or whatever, so you have seven different versions
01:52:02
◼
►
of this file, not just one.
01:52:03
◼
►
And then pay attention when it tells you
01:52:05
◼
►
that something is corrupt, right?
01:52:07
◼
►
This is a high bar, kind of like hosting,
01:52:09
◼
►
oh, I don't wanna have to think about all this stuff,
01:52:11
◼
►
whatever, but if you're looking into TrueNAS and stuff,
01:52:14
◼
►
it seems like you're on board to dive into this,
01:52:16
◼
►
so that's what you need.
01:52:17
◼
►
Detect the errors and be able to fix them.
01:52:19
◼
►
As for me, if my Synology died today,
01:52:21
◼
►
I would immediately replace it with a Synology.
01:52:23
◼
►
I love it, it's been one of the best technology products
01:52:25
◼
►
I've had in my entire life.
01:52:28
◼
►
I actually, I don't wanna replace it
01:52:30
◼
►
'cause it just works and it's fine,
01:52:33
◼
►
but I also frequently go to the Synology site,
01:52:35
◼
►
like, if I had to replace it,
01:52:36
◼
►
- Yep, yep, yep. - What kind of cool
01:52:37
◼
►
Synology would I get?
01:52:38
◼
►
'Cause I love it, it's like, it's my favorite thing.
01:52:40
◼
►
It's in the basement, it's out of sight, it's out of mind,
01:52:42
◼
►
it does everything that I want it to do.
01:52:44
◼
►
It is almost 10 years old, it is old and creaky,
01:52:47
◼
►
it will soon be very, exactly 10 years old
01:52:50
◼
►
'cause we got them in 2013, right?
01:52:52
◼
►
- Yeah, in April, I believe.
01:52:53
◼
►
- Yeah, so in April, this will be the thing
01:52:54
◼
►
that'll be 10 years old, I hope it doesn't take that.
01:52:56
◼
►
I don't wanna replace it 'cause it would be expensive
01:52:58
◼
►
to replace, but I kind of look forward to replacing it
01:53:00
◼
►
'cause I wanna shop for a new Synology.
01:53:02
◼
►
So that's an endorsement of that brand.
01:53:04
◼
►
- All right, Edwin Guggenbichler writes,
01:53:07
◼
►
"Do you use native mouse acceleration?"
01:53:10
◼
►
And then somebody, I don't know if that was Edwin
01:53:12
◼
►
or somebody else, put in a link to SteerMouse.
01:53:15
◼
►
I don't even know what this is about, so John, fill in.
01:53:17
◼
►
- All right, so Casey, you probably remember
01:53:19
◼
►
when you switched from Windows to Mac,
01:53:22
◼
►
the way the mouse accelerates is different
01:53:24
◼
►
between those two systems.
01:53:25
◼
►
- Oh God, it was so long ago.
01:53:26
◼
►
I'm sure you're right, but I have zero recollection of it.
01:53:29
◼
►
Well, you'd certainly notice if you went back.
01:53:32
◼
►
- That's fair, that's fair.
01:53:33
◼
►
And if you happen to use PCs and Macs together,
01:53:37
◼
►
on a regular basis, if you happen to use both,
01:53:40
◼
►
you will probably notice, especially if you use mice
01:53:42
◼
►
on both, it's a little bit harder to notice
01:53:43
◼
►
with trackpad on the Mac or whatever,
01:53:45
◼
►
but if you use mice on both, you will definitely notice,
01:53:48
◼
►
oh, it doesn't move right on one or the other.
01:53:52
◼
►
So basically, yeah, the mouse acceleration
01:53:54
◼
►
works totally differently on Macs than it does on Windows.
01:53:58
◼
►
So to answer Edwin's question,
01:53:59
◼
►
do I use native Mac acceleration?
01:54:01
◼
►
Yes, because I'm mostly on Mac
01:54:06
◼
►
and with the exception of playing my weird Sim Tower game,
01:54:09
◼
►
I'm mostly on Mac most of the time
01:54:12
◼
►
and games are the only thing I use Windows for
01:54:15
◼
►
and games I feel like you kind of get used to whatever,
01:54:18
◼
►
however the mouse behaves in the game anyway
01:54:21
◼
►
and it's always different from how it behaves on a desktop,
01:54:23
◼
►
so it kind of doesn't matter.
01:54:24
◼
►
So is there a setting for this?
01:54:26
◼
►
I still feel ignorant.
01:54:28
◼
►
So the reason why the SteerMouse link is here,
01:54:29
◼
►
I assume John put it there,
01:54:31
◼
►
is that SteerMouse is a utility
01:54:33
◼
►
that I believe allows you to change
01:54:35
◼
►
the mouse acceleration curve on Macs
01:54:38
◼
►
to possibly better match what Windows PCs do.
01:54:41
◼
►
- That's not what it's,
01:54:43
◼
►
it's not supposed to match Windows PCs.
01:54:44
◼
►
So yeah, when we talk about acceleration curve,
01:54:46
◼
►
it's basically saying, when you move the mouse,
01:54:48
◼
►
there's multiple pieces of information coming in.
01:54:51
◼
►
There's how far you moved it,
01:54:52
◼
►
which direction and over what period of time, right?
01:54:55
◼
►
And those are all factors in how the cursor moves.
01:54:58
◼
►
The new position of the cursor is a function of all of that.
01:55:00
◼
►
It's not just how far you move the mouse, and it's not just how fast you move it, and
01:55:03
◼
►
it's not just the direction, it's all those things combined.
01:55:06
◼
►
And the equation that takes those inputs and translates them to the new position of the
01:55:12
◼
►
cursor is called like, they call it the mouse acceleration curve, right?
01:55:15
◼
►
And there's lots of different kind of, you know, if you could graph like, here's the
01:55:18
◼
►
three inputs, here's the output, you can graph that, and it's probably more than three inputs.
01:55:22
◼
►
But anyway, there's lots of different ways you can do that.
01:55:26
◼
►
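As a toy illustration of what an acceleration curve is (not Apple's or Windows' actual math, and the constants below are made up): the cursor delta depends on the speed of the movement, not just its distance, so a fast flick travels further on screen than a slow, precise nudge of the same physical length.

def cursor_delta(counts_moved: float, dt_seconds: float,
                 base_gain: float = 1.0, exponent: float = 1.4) -> float:
    # Map a raw mouse movement to an on-screen cursor movement.
    speed = abs(counts_moved) / max(dt_seconds, 1e-6)    # counts per second
    accel = (speed / 100.0) ** (exponent - 1.0)          # faster movement gets amplified
    return counts_moved * base_gain * max(accel, 1.0)    # never drop below 1:1 tracking

print(cursor_delta(50, 0.02))   # quick flick: roughly 181 counts of cursor travel
print(cursor_delta(50, 0.50))   # slow, deliberate move: 50 counts, 1:1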
I'm very sensitive to mouse acceleration, especially since I was basically born on the
01:55:30
◼
►
Mac and used to the Mac's mouse acceleration, which was just phenomenally better than mouse
01:55:36
◼
►
acceleration on other platforms from day one.
01:55:38
◼
►
If you use a mouse on the Apple IIGS and use it on the Mac, it did not feel the same.
01:55:42
◼
►
Like even from the same company, it was not the same at all.
01:55:45
◼
►
I was so sensitive to it, I remember when I got my SE30, there was some kind of bug.
01:55:51
◼
►
I don't know if it was a bug in the OS, a bug in Mac Paint.
01:55:54
◼
►
I think it was probably a bug in the OS that would cause the mouse cursor to jump in a
01:56:00
◼
►
little L-shaped right angle thing, where instead of going from one point to another point that's
01:56:04
◼
►
on an angle from it, it would go up and then over to the right.
01:56:08
◼
►
Like up three pixels and to the right three pixels, instead of going like the hypotenuse,
01:56:14
◼
►
And that would manifest if you tried to draw with the mouse in Mac Paint.
01:56:17
◼
►
You would see these little stair steps, and not just stair steps from pixels, but stair
01:56:20
◼
►
steps like three pixels high, three pixels right.
01:56:23
◼
►
And it's like, no, I wanted you to do five pixels
01:56:25
◼
►
on an angle on the hypotenuse.
01:56:27
◼
►
I actually brought it to the,
01:56:28
◼
►
I was such a complaining kid,
01:56:30
◼
►
brought it to the authorized Apple dealer.
01:56:32
◼
►
- You? - Some things never change.
01:56:34
◼
►
- Because I got a new computer,
01:56:35
◼
►
this is my new fancy computer,
01:56:37
◼
►
and the mouse doesn't work right.
01:56:38
◼
►
Like I can't draw on Mac Paint.
01:56:39
◼
►
I brought it to, we brought it to the authorized Apple dealer
01:56:42
◼
►
'cause there was no Apple stores in those days,
01:56:43
◼
►
and said, hey, we just got this new computer
01:56:45
◼
►
and the mouse don't work right.
01:56:46
◼
►
And then I showed them and they're like,
01:56:48
◼
►
what are you talking about, kid? Go away.
01:56:49
◼
►
Like they just, I mean, there's nothing they could do about it.
01:56:51
◼
►
I don't know if it ever got fixed or if it was just like,
01:56:53
◼
►
I got a new computer after that.
01:56:54
◼
►
But anyway, I'm very sensitive to the mouse curve.
01:56:57
◼
►
The reason I use SteerMouse, which is what I put link in here, is--
01:57:00
◼
►
and this is sometimes the case for people--
01:57:03
◼
►
when you go to the system settings or whatever
01:57:07
◼
►
and see the little slider for mouse acceleration--
01:57:09
◼
►
what is it called now?
01:57:11
◼
►
Pointer speed or something?
01:57:13
◼
►
Yeah, mouse.
01:57:14
◼
►
Tracking speed.
01:57:16
◼
►
Yeah, it's a slider.
01:57:17
◼
►
And it goes from slow in the left end to fast at the right.
01:57:19
◼
►
Those are literally the labels.
01:57:20
◼
►
There are no labels on the in-between tick marks.
01:57:21
◼
►
The left edge is labeled slow, and the right edge
01:57:23
◼
►
is labeled fast.
01:57:24
◼
►
And that's basically saying, do you
01:57:26
◼
►
want the mouse cursor to move a lot when
01:57:28
◼
►
you move the mouse a little?
01:57:29
◼
►
Or do you want it to move a little when
01:57:31
◼
►
you move the mouse a little, right?
01:57:33
◼
►
And there's a setting for this under the covers.
01:57:36
◼
►
I think it's just a floating point value or something.
01:57:38
◼
►
Just a single floating point value, which obviously,
01:57:40
◼
►
given what I said about the inputs,
01:57:42
◼
►
it's more complicated than that.
01:57:43
◼
►
But the OS gives you this one setting, slow and fast.
01:57:47
◼
►
If, like me, you have a very large monitor,
01:57:49
◼
►
even the fastest setting might feel too slow.
01:57:52
◼
►
Like it might feel like it takes,
01:57:54
◼
►
you have to like move the mouse, then pick it up
01:57:56
◼
►
and move it back to the middle and move it again
01:57:58
◼
►
just to get from one side of the other thing to the other.
01:57:59
◼
►
Or you have to move the mouse unreasonably fast
01:58:01
◼
►
and it doesn't feel accurate to do it or whatever.
01:58:04
◼
►
So if you're on the highest setting
01:58:06
◼
►
and it still doesn't feel fast enough,
01:58:08
◼
►
one thing you can do is just, like, defaults write
01:58:10
◼
►
whatever that thing is,
01:58:11
◼
►
and set the value of that thing to a higher value
01:58:15
◼
►
than the slider lets you go.
01:58:16
◼
►
I don't know what the values are,
01:58:17
◼
►
but let's pretend the left side of the thing is like zero
01:58:19
◼
►
and the right side is 1.0.
01:58:21
◼
►
You can from the command line set it to 1.5, 2.0, 2.5,
01:58:25
◼
►
like basically set values that are not
01:58:27
◼
►
settable with the GUI.
01:58:28
◼
►
So that's one way to influence the mouse tracking.
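For anyone who wants to try this, here is a minimal sketch of the command-line approach being described. It assumes the commonly reported global default key com.apple.mouse.scaling; the exact key name, the value range, and whether you need to log out and back in are not specified on the show, so treat those details as assumptions.

```python
#!/usr/bin/env python3
"""Hedged sketch: push the hidden tracking-speed value past the GUI slider.

Assumption: the System Settings slider is backed by a single global default,
commonly reported as `com.apple.mouse.scaling`. Values larger than the
slider's maximum are accepted; a log out/log in may be needed to apply.
"""
import subprocess

KEY = "com.apple.mouse.scaling"  # assumed key name for the tracking-speed default


def read_scaling() -> str:
    # `defaults read -g <key>` prints the current value of a global default.
    try:
        out = subprocess.run(
            ["defaults", "read", "-g", KEY],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()
    except subprocess.CalledProcessError:
        return "(not set)"


def write_scaling(value: float) -> None:
    # `defaults write -g <key> -float <value>` sets it, even beyond the slider's range.
    subprocess.run(["defaults", "write", "-g", KEY, "-float", str(value)], check=True)


if __name__ == "__main__":
    print("current:", read_scaling())
    write_scaling(5.0)  # pick something larger than the slider's maximum
    print("new:", read_scaling())
```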
01:58:33
◼
►
On my big monitors, I have found that no matter
01:58:35
◼
►
what value I put into the one value that Apple lets you pick,
01:58:38
◼
►
the mouse doesn't feel right to me.
01:58:40
◼
►
It feels slow, but also inaccurate.
01:58:43
◼
►
So for a while, I've used a third-party utility,
01:58:45
◼
►
called SteerMouse, that lets you more precisely tweak
01:58:50
◼
►
the acceleration curves for individual input devices.
01:58:54
◼
►
This is another factor that I didn't mention:
01:58:56
◼
►
The actual mouse influences this as well.
01:58:59
◼
►
If I'm using my Microsoft mouse connected through USB
01:59:02
◼
►
versus using an Apple mouse connected through USB,
01:59:05
◼
►
they behave wildly differently.
01:59:06
◼
►
This may seem strange to you, but they absolutely do.
01:59:09
◼
►
Different mice will behave differently
01:59:11
◼
►
with the same tracking setting.
01:59:13
◼
►
So to make this Microsoft mouse not drive me insane,
01:59:17
◼
►
I needed to be able to tweak the acceleration curves
01:59:21
◼
►
in a more accurate way.
01:59:24
◼
►
And so SteerMouse gives you two numbers
01:59:26
◼
►
that let you type in the values.
01:59:28
◼
►
I don't know what these numbers mean.
01:59:29
◼
►
They call them acceleration and sensitivity.
01:59:31
◼
►
Who knows what they're doing under the covers?
01:59:32
◼
►
But the point is I can fiddle with these sliders
01:59:35
◼
►
and type in exact values and save my settings
01:59:37
◼
►
and associate them with this specific input device.
01:59:39
◼
►
So if I plug in an Apple mouse,
01:59:41
◼
►
it will feel Apple mouse normal,
01:59:43
◼
►
and I can make this Microsoft mouse feel the way I want.
01:59:45
◼
►
So do I use native mouse acceleration?
01:59:47
◼
►
No, I don't, not with my Microsoft mouse.
01:59:50
◼
►
If I had an Apple mouse,
01:59:51
◼
►
I think I probably would use the native one
01:59:52
◼
►
because the Apple mice,
01:59:54
◼
►
when you crank up the built-in setting,
01:59:57
◼
►
still feel right to me.
01:59:59
◼
►
But when I went on the big mouse journey
02:00:01
◼
►
buying all these different mice,
02:00:02
◼
►
I was amazed at how different they feel.
02:00:04
◼
►
Logitech mouse, a Logitech gaming mouse
02:00:06
◼
►
feels different from a Logitech regular mouse,
02:00:07
◼
►
feels different from a Microsoft mouse,
02:00:09
◼
►
feels different from an Apple mouse.
02:00:10
◼
►
That's why it's great to have third-party utilities
02:00:12
◼
►
that let you tweak this.
02:00:13
◼
►
And again, associate that tweak specifically with the device.
02:00:16
◼
►
So once you get every device set up the way you want,
02:00:19
◼
►
you can even have different curves
02:00:20
◼
►
for Bluetooth versus USB connection of the same mouse.
02:00:22
◼
►
Then it just remembers them,
02:00:23
◼
►
and when you plug in the device, it feels normal.
02:00:26
◼
►
One of the fun things SteerMouse has
02:00:28
◼
►
is like a social networking aspect
02:00:30
◼
►
where you can see other people's popular saved settings
02:00:34
◼
►
for your mouse.
02:00:35
◼
►
- Are there achievements that like,
02:00:37
◼
►
oh, you've moused 100 miles?
02:00:39
◼
►
- There should be, but no, I think it's just like,
02:00:42
◼
►
saved settings for your mouse,
02:00:43
◼
►
'cause you feel like, I don't know
02:00:44
◼
►
what to set these numbers to.
02:00:45
◼
►
Every time I move these sliders around, it still feels weird.
02:00:47
◼
►
Can I find a setting that lots of other people use
02:00:50
◼
►
for this mouse, and you can just use
02:00:52
◼
►
one of the saved settings.
02:00:54
◼
►
- Thanks to our sponsors this week,
02:00:55
◼
►
Memberful, Nebula, and Blaze.
02:00:58
◼
►
And thanks to our members who support us directly.
02:01:00
◼
►
You can join at atp.fm/join,
02:01:03
◼
►
and we will talk to you next week.
02:01:05
◼
►
Happy New Year everyone.
02:01:06
◼
►
(upbeat music)
02:01:09
◼
►
♪ Now the show is over ♪
02:01:12
◼
►
♪ They didn't even mean to begin ♪
02:01:14
◼
►
♪ 'Cause it was accidental ♪
02:01:16
◼
►
♪ Accidental ♪
02:01:17
◼
►
♪ Oh it was accidental ♪
02:01:19
◼
►
♪ Accidental ♪
02:01:20
◼
►
♪ John didn't do any research ♪
02:01:22
◼
►
♪ Marco and Casey wouldn't let him ♪
02:01:25
◼
►
♪ 'Cause it was accidental ♪
02:01:26
◼
►
♪ Accidental ♪
02:01:28
◼
►
♪ It was accidental ♪
02:01:29
◼
►
♪ Accidental ♪
02:01:30
◼
►
♪ And you can find the show notes ♪
02:01:32
◼
►
♪ And if you're into Twitter, you can follow them ♪
02:01:40
◼
►
♪ @C-A-S-E-Y-L-I-S-S ♪
02:01:44
◼
►
♪ So that's Casey Liss, M-A-R-C-O-A-R-M ♪
02:01:49
◼
►
♪ N-T, Marco Arment, S-I-R-A-C ♪
02:01:54
◼
►
♪ U-S-A, Syracusa, it's accidental ♪
02:01:58
◼
►
♪ They didn't mean to, accidental, accidental, tech podcast so long ♪
02:02:10
◼
►
Did anybody watch any Christmas movies?
02:02:12
◼
►
Even Marco when he thought his Christmas was going to be a normal time?
02:02:15
◼
►
Did any of you watch a movie that you always watched during Christmas?
02:02:18
◼
►
Oh yes, absolutely.
02:02:19
◼
►
You have to watch several movies.
02:02:21
◼
►
There's a litany of movies that you are compelled to watch, Jon.
02:02:24
◼
►
You have to watch...
02:02:25
◼
►
Every year though?
02:02:26
◼
►
- Yeah. - Like as a tradition?
02:02:27
◼
►
How much room is there for multiple?
02:02:29
◼
►
I feel like maybe you can have one or two movies,
02:02:31
◼
►
but you can't have like 17 movies
02:02:32
◼
►
you have to watch on Christmas.
02:02:33
◼
►
- Oh, you absolutely can.
02:02:35
◼
►
Christmas for most of us isn't just the 25th, man.
02:02:37
◼
►
You got plenty of time.
02:02:38
◼
►
I think you were intending to mean on the day of,
02:02:41
◼
►
and that's a different conversation, but--
02:02:42
◼
►
- No, no, not on the day of.
02:02:43
◼
►
Just sort of like I always have to watch these
02:02:45
◼
►
in the Christmas season.
02:02:47
◼
►
What is your list of terrible '90s movies
02:02:49
◼
►
that you like to watch? - Well, since we haven't
02:02:50
◼
►
actually gotten to like the rest of the family
02:02:52
◼
►
where we would normally watch like Christmas Vacation,
02:02:56
◼
►
We've only watched kind of second-tier movies. And in fact, we've actually watched, no joke,
02:03:02
◼
►
only bad sequels to second-tier movies so far. So the Christmas movies we've seen so far this year are
02:03:07
◼
►
Die Hard 2 and Home Alone 2.
02:03:10
◼
►
Die Hard 2 is not that bad. It's not as good as the first, but it's not that bad.
02:03:14
◼
►
It's not as bad as Home Alone 2, I'll give you that, but neither are good, I would say.
02:03:19
◼
►
No, I mean, for us, you know, the Christmas season in terms of movies starts really anytime after Thanksgiving,
02:03:26
◼
►
at the very latest on December 1st.
02:03:28
◼
►
And for us, we have to watch Elf.
02:03:32
◼
►
I've always enjoyed Elf,
02:03:33
◼
►
and it is becoming possibly my favorite Christmas movie.
02:03:37
◼
►
Aaron is a big fan of Love Actually.
02:03:39
◼
►
I know that's very polarizing.
02:03:40
◼
►
I never had seen or even heard of Love Actually
02:03:43
◼
►
before we had met.
02:03:44
◼
►
And I actually have come to really, really like Love Actually.
02:03:47
◼
►
This is where John rolls his eyes and tells me
02:03:48
◼
►
the litany of ways in which it's wrong or bad or what have you.
02:03:51
◼
►
- I was rolling my eyes at Elf, but go on.
02:03:52
◼
►
- Yeah, me too. - Oh, fair enough.
02:03:54
◼
►
Oh, how can you not like Elf?
02:03:55
◼
►
- Oh God, you monsters.
02:03:56
◼
►
- I don't dislike it, but it is not,
02:03:58
◼
►
like, here's the thing with classic Christmas things,
02:04:01
◼
►
it's whatever you get used to at a certain point,
02:04:03
◼
►
it has really no connection to quality.
02:04:04
◼
►
We all just have to admit that.
02:04:05
◼
►
Like, it's like, the things, the Christmas movies
02:04:08
◼
►
that we like are unrelated to the quality of those things.
02:04:12
◼
►
- Well, with that in mind, for me,
02:04:14
◼
►
the thing that imprinted on me when I was a wee lad,
02:04:17
◼
►
you know, many, many moons ago, was Claymation Christmas.
02:04:20
◼
►
- Which almost nobody has heard of,
02:04:21
◼
►
but Marco is the prime age,
02:04:23
◼
►
it's because we're basically the same age,
02:04:25
◼
►
the prime age to have heard of Claymation Christmas. It is not the Christmas season
02:04:29
◼
►
if you have not watched Claymation Christmas at least once, preferably thrice to ten times.
02:04:34
◼
►
So that is definitely on the list. I'm trying to think of what else. I just watched for
02:04:38
◼
►
the very first time, I watched Christmas Vacation. I'd never seen it. Which is funny because
02:04:43
◼
►
my parents' license plate growing up was Griswold. But I'd never seen it and I went into it expecting
02:04:49
◼
►
it to be garbage, kind of like A Christmas Story. And I actually thought it was pretty
02:04:54
◼
►
good. It, you know, there are some, some jokes that didn't age well, but for the most part,
02:04:58
◼
►
yeah, when you come to something that's 30 years old, 40 years old, whatever it is today,
02:05:02
◼
►
usually it's bad. It's real bad. And I actually enjoyed Christmas Vacation. It was pretty good.
02:05:09
◼
►
Yeah. I'm actually kind of surprised that, like, as a newcomer to it in 2022,
02:05:14
◼
►
I'm surprised that you would have liked it honestly, because yeah, you're right. Like most
02:05:18
◼
►
like '80s and '90s movies really do not hold up well to modern eyes, especially
02:05:25
◼
►
comedies, especially Chevy Chase movies. Like, oh God, there's so much... like, you
02:05:31
◼
►
know, odds are poor that anything you see from that time period is going
02:05:35
◼
►
to hold up today, and even that you'd even be able to get through it, let alone
02:05:39
◼
►
think it's funny and not be horribly offended by it. But yeah, it actually,
02:05:43
◼
►
as that goes, you know, again, the bar is low, but as that goes, it's good.
02:05:51
◼
►
It's all relative. So, John, what are your John
02:05:55
◼
►
Siracusa-approved movies? And is there, are there any? Is there at least one?
02:05:59
◼
►
I don't, I used to watch things like every year just because they were on,
02:06:04
◼
►
like the Rudolph thing, because that's how old I am, right? Right, but that was just on TV and
02:06:07
◼
►
I'd watch it every year, but as I got older, and surprise, surprise to everyone, like, the,
02:06:12
◼
►
When things were of bad quality, but they were still like,
02:06:15
◼
►
oh, you watch these every year,
02:06:17
◼
►
the bad quality eventually won out.
02:06:19
◼
►
Then I stopped watching things that I don't like.
02:06:21
◼
►
It's like, yes, it is traditional, I watched it all the time,
02:06:23
◼
►
but it's not like the Rudolph thing,
02:06:25
◼
►
it's for kids, I'm not interested in it.
02:06:27
◼
►
I don't feel compelled to watch the Rankin/Bass Rudolph thing
02:06:30
◼
►
every year because it's not something that I enjoy,
02:06:33
◼
►
so I haven't seen that in ages.
02:06:36
◼
►
It's true of most things.
02:06:37
◼
►
I remember watching "Christmas Vacation"
02:06:38
◼
►
when it first came out and enjoying it when I was,
02:06:40
◼
►
however old I was when that happened,
02:06:42
◼
►
teenager or something, but I don't think it would hold up on repeat viewing, and
02:06:46
◼
►
so I don't watch it, even though it's like, oh, it's traditional, I've seen it all
02:06:49
◼
►
the time, like it's part of my Christmas memories. It is, but it's not a movie that
02:06:52
◼
►
I enjoy, so I don't watch it. The one thing that I would probably be willing
02:06:56
◼
►
to watch again is A Christmas Story, which I think is actually good, but I
02:07:03
◼
►
know people are bored by it. I think it is a well-executed version of
02:07:08
◼
►
what it is, but it is a nostalgia trip for people of a certain age. It is
02:07:11
◼
►
nostalgic for a time that is before my time, but it is my parents' time, so I
02:07:16
◼
►
can relate to it, kind of the same way I can relate to Goodfellas. I wasn't
02:07:20
◼
►
a mobster in the 70s, but some of my relatives probably were.
02:07:23
◼
►
So when I look at Goodfellas I see people who I remember from my childhood
02:07:30
◼
►
like my uncle was like that, my cousin was like that, like I can connect to it
02:07:34
◼
►
in that way. So in the same way, A Christmas Story connects with me
02:07:37
◼
►
because it is a fantasy nostalgia version of a world I never experienced and probably never really existed
02:07:44
◼
►
but that I feel like I have a connection to.
02:07:46
◼
►
And I think it's actually kind of funny and a fun movie.
02:07:48
◼
►
But I understand it's kind of like the same way that I'm never going to like Elf the way that 90s kids like it.
02:07:52
◼
►
I thought Elf was fine, but it doesn't have that connection to me.
02:07:55
◼
►
So if I had, if I was forced at gunpoint, it's, "You must watch a movie every Christmas from now on as a Christmas tradition."
02:08:02
◼
►
A) I wouldn't like it because that's not how I roll.
02:08:04
◼
►
B) If I had to pick, A Christmas Story.
02:08:06
◼
►
Oh, I mean, obviously whatever makes you happy makes you happy, but for me, I came to Christmas
02:08:10
◼
►
Story just like five-ish years ago, and I did not care for it at all.
02:08:14
◼
►
There's no Claymation Christmas, right, Casey?
02:08:16
◼
►
No, it is not a Claymation Christmas!
02:08:18
◼
►
It's so much better!
02:08:20
◼
►
Frankly, that'd be my one!
02:08:21
◼
►
If I had to pick one, it'd be Claymation Christmas.
02:08:23
◼
►
Oh, amen, brother.
02:08:24
◼
►
I am right there with you.
02:08:25
◼
►
Oh my goodness, no, you are viewing it through rose-colored glasses.
02:08:30
◼
►
Also, honorable mention for the Garfield Christmas as well.
02:08:33
◼
►
I don't think I've ever seen that one.
02:08:35
◼
►
- Claymation Christmas, I absolutely will concede
02:08:39
◼
►
that it is probably garbage if you look at it
02:08:42
◼
►
with any sort of reasonable point of view,
02:08:45
◼
►
but as someone who is of the correct age, like Marco,
02:08:48
◼
►
who is of the correct age for Claymation Christmas,
02:08:50
◼
►
if I had to choose just one, it's that one,
02:08:52
◼
►
but if I got two, I think Elf would be my second.
02:08:55
◼
►
I really freaking love Elf.
02:08:56
◼
►
That being said, Aaron and I watched Die Hard last night,
02:08:58
◼
►
because that's what you do.
02:08:59
◼
►
- Yeah, I mean, Die Hard is more of a recent
02:09:01
◼
►
internet phenomenon, people like getting it,
02:09:02
◼
►
what was it, like a decade ago,
02:09:03
◼
►
like, oh, Die Hard's a Christmas movie, right?
02:09:05
◼
►
I love Die Hard and I'll watch it anytime.
02:09:07
◼
►
It's a fun movie, but it's not something
02:09:08
◼
►
I ever watched at Christmas.
02:09:09
◼
►
I never went in on that fad.
02:09:11
◼
►
I guess the one Christmas tradition thing that I do have
02:09:13
◼
►
that I actually do do every Christmas season
02:09:16
◼
►
is not a movie, but music.
02:09:18
◼
►
When I was a kid, we had a record, Casey,
02:09:21
◼
►
the Muppets and John Denver Christmas album.
02:09:26
◼
►
- We were a big John Denver family.
02:09:27
◼
►
We had John Denver records, and I loved the Muppet Show
02:09:30
◼
►
when I was a kid, watched every single night
02:09:32
◼
►
or whatever time it was on, 7:30 or whatever.
02:09:34
◼
►
I love the Muppet Show, I love the Muppets,
02:09:35
◼
►
I love the Muppet movie, I'm of that age.
02:09:38
◼
►
And because we had that Christmas record
02:09:40
◼
►
of the Muppets singing Christmas carols with John Denver,
02:09:43
◼
►
we heard it all the time during my childhood.
02:09:45
◼
►
And for whatever reason, unlike, well, here,
02:09:48
◼
►
unlike the movies, which are crap movies
02:09:49
◼
►
and I stopped watching, if this is crap music, I can't tell.
02:09:53
◼
►
Because I still enjoy listening to the Muppets
02:09:56
◼
►
and John Denver sing Christmas carols.
02:09:58
◼
►
So when we put on Christmas music,
02:09:59
◼
►
when we're decorating the tree,
02:10:00
◼
►
and it's my random playlist of holiday music,
02:10:03
◼
►
A huge swath of that is John Denver and the Muppets.
02:10:06
◼
►
I love that album.
02:10:07
◼
►
It reminds me of my childhood.
02:10:09
◼
►
And I think they do a good job.
02:10:10
◼
►
I mean, they're singing Christmas carols.
02:10:11
◼
►
It's like, you know, it's not,
02:10:13
◼
►
it's "Rear Out the Red News," "Reindeer," "Silent Night,"
02:10:15
◼
►
like it's Christmas songs,
02:10:17
◼
►
sung by the Muppets and John Denver.
02:10:19
◼
►
And I still give that a thumbs up.
02:10:21
◼
►
I think some of the songs are funny and good for kids.
02:10:23
◼
►
And, you know, especially if you're a kid
02:10:24
◼
►
and you like Muppets,
02:10:25
◼
►
but some of them are just good renditions of a song.
02:10:27
◼
►
I guess it helps if you like John Denver, which I do.
02:10:30
◼
►
So yeah, that's the one thing I actually still do every year
02:10:33
◼
►
not a single Christmas has gone by where I have not listened to John Denver and the Muppets
02:10:36
◼
►
sing Christmas carols at some point.
02:10:38
◼
►
This is from a TV special apparently? I've never heard of this, I'm digging on Wikipedia
02:10:41
◼
►
now, but apparently it's a soundtrack album from a special?
02:10:44
◼
►
I think it was on TV, but I have the record and now I have it in MP3s.
02:10:50
◼
►
Also on the list, it will surprise nobody that I'm a bit of a pack rat about certain
02:10:53
◼
►
things, and I really enjoy the Pentatonix Christmas specials through the years and the
02:10:58
◼
►
Disney Christmas specials through the years, and I'll put those on as like background noise
02:11:01
◼
►
and whatnot. All of these of course are living in Plex as they are wont to do. But no, I
02:11:07
◼
►
freaking love Claymation Christmas and Elf, again I think I'm of the right age, that Elf
02:11:11
◼
►
just absolutely clicked for me. Even though it came out when Marco and I were like almost
02:11:15
◼
►
graduated from college if I remember right, I think it was like 2003 or thereabouts.
02:11:18
◼
►
Yeah, we were a bit old for it. I think if I liked Will Ferrell I would like it, but
02:11:22
◼
►
I don't so I don't.
02:11:24
◼
►
Oh, see, I love Will Ferrell so I'm all in. But no, I don't have any real problems with
02:11:31
◼
►
any of these, not that it really matters, you do you,
02:11:33
◼
►
but I don't know, Christmas Story just never clicked for me.
02:11:35
◼
►
I've never seen It's a Wonderful Life,
02:11:37
◼
►
so I don't know if that's good or bad or whatever.
02:11:40
◼
►
Obviously, The Grinch, I'm okay with the Jim Carrey version,
02:11:43
◼
►
the original version I love.
02:11:45
◼
►
One of you, I think John had mentioned Rudolph,
02:11:48
◼
►
and all the Rankin/Bass stuff, and Frosty,
02:11:50
◼
►
that always happens, but no,
02:11:52
◼
►
Elf and Claymation are my two winners.
02:11:55
◼
►
- Now, speaking of owning your place on the web
02:11:57
◼
►
and having permanent links to live on,
02:11:58
◼
►
this is a difficult problem,
02:12:00
◼
►
I just put a link in the show notes.
02:12:02
◼
►
I said, I want a link in the show notes to the John Denver
02:12:04
◼
►
and the Muppets album that I was thinking of.
02:12:06
◼
►
Someone in the chat room put an Apple Music link,
02:12:07
◼
►
but then you're like, oh, but what if people
02:12:09
◼
►
don't use Apple Music?
02:12:09
◼
►
Should I have a Spotify link?
02:12:11
◼
►
Maybe it's on YouTube, blah, blah, blah.
02:12:12
◼
►
So there's these services that have grown up
02:12:15
◼
►
over the past few years that try to provide
02:12:19
◼
►
a canonical link to a song and say,
02:12:21
◼
►
hey, if that song is available in Apple Music,
02:12:23
◼
►
here's how you get to it.
02:12:23
◼
►
If it's a song on Spotify, here's how you get to it there,
02:12:25
◼
►
and so on and so forth.
02:12:26
◼
►
But of course, those services themselves
02:12:28
◼
►
are just another thing that might disappear.
02:12:30
◼
►
So album.link is the album version of song.link.
02:12:35
◼
►
- Oh, see, I use Songwhip, but it's the same idea.
02:12:38
◼
►
- Right, but that's the problem.
02:12:39
◼
►
None of those things are any more permanent
02:12:41
◼
►
than the individual services.
02:12:42
◼
►
So I'm gonna put an album.link thing in the show notes,
02:12:44
◼
►
but five years from now, when album.link is a spam site
02:12:47
◼
►
that tries to give you a virus, I feel bad,
02:12:49
◼
►
but there just isn't a canonical place for this.
02:12:52
◼
►
You can't rely on the record label to have a canonical link
02:12:54
◼
►
'cause they change the URLs every day
02:12:55
◼
►
and they probably don't have it up anymore.
02:12:56
◼
►
So we're gonna put an album.link here,
02:12:58
◼
►
This just shows the problem of, like, if the people who own this media really cared about it,
02:13:03
◼
►
they would provide canonical links for all the media in their catalog that then linked out to its availability on services.
02:13:10
◼
►
But that's a lot of work, and they just seem like they're not going to ever do that. So we're stuck with these things.
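For anyone who wants to generate one of these cross-service links themselves, here is a minimal sketch. It assumes the public API that, as of this writing, backs song.link/album.link (often called the Odesli API); the endpoint and response field names below are assumptions drawn from that API's public documentation, not anything confirmed on the show, and, as noted above, nothing guarantees the service will still resolve years from now.

```python
#!/usr/bin/env python3
"""Hedged sketch: turn one streaming-service URL into a cross-service page.

Assumption: the song.link/album.link aggregator exposes an API at
https://api.song.link/v1-alpha.1/links that returns a canonical page URL
plus per-platform equivalents. Field names (`pageUrl`, `linksByPlatform`,
`url`) are assumptions based on that API's public docs.
"""
import json
import urllib.parse
import urllib.request


def cross_service_links(source_url: str) -> dict:
    # Hand the API one service's URL (e.g. an Apple Music album link) and get
    # back the aggregator page plus whatever per-platform links it knows about.
    api = "https://api.song.link/v1-alpha.1/links?url=" + urllib.parse.quote(source_url, safe="")
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    return {
        "page": data.get("pageUrl"),  # the album.link / song.link page itself
        "platforms": {
            name: entry.get("url")
            for name, entry in data.get("linksByPlatform", {}).items()
        },
    }


if __name__ == "__main__":
    # Hypothetical example URL; substitute the real Apple Music link from the show notes.
    example = "https://music.apple.com/us/album/example-album-id"
    links = cross_service_links(example)
    print(links["page"])
    for platform, url in sorted(links["platforms"].items()):
        print(f"{platform}: {url}")
```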