PodSearch

The Talk Show

371: ‘The Skin of Your Pants’, With Daniel Jalkut

 

00:00:00   Daniel, it's good to hear your voice.

00:00:01   Good to see you.

00:00:02   Good to hear yours.

00:00:03   Great to be back on the show.

00:00:04   You were born in California, right?

00:00:07   You're a native, a native of San Jose.

00:00:09   Not San Jose.

00:00:10   Never say it.

00:00:11   Never say it.

00:00:11   How dare you, sir?

00:00:15   No, that's not where I'm from.

00:00:17   I'm Santa Cruz.

00:00:19   I actually was born in Southern California and my parents were

00:00:23   roaming van-life hippies for a while,

00:00:27   and they eventually settled in Santa Cruz, California.

00:00:31   So I grew up in that town.

00:00:33   - At this point in your life though,

00:00:35   do you think you're used to the East Coast weather?

00:00:39   - Oh yeah, yeah, I'm good at the weather.

00:00:41   I am so used to the East Coast weather

00:00:44   that I put a lot of East Coasters to shame

00:00:47   because, you know, it's just like a Californian, right?

00:00:51   To come into town and be like, listen,

00:00:53   I'm doing the East Coast better than you.

00:00:54   But for instance, I run every other day.

00:00:58   You know, it's like my regimen.

00:00:59   And I get a lot of interesting looks

00:01:03   when it's a zero-degree blizzard.

00:01:06   And I'm out there with like two pairs of long underwear

00:01:10   and a balaclava.

00:01:12   Tell you what, I never learned that in California.

00:01:17   - Our mutual friend and your relative neighbor, Paul Kafasis,

00:01:21   is a year-round runner as well.

00:01:23   And I've, I have seen pictures of him, selfies where there's like frost on his

00:01:29   beard.

00:01:29   Oh yeah.

00:01:30   I've had like the icicles coming off my eyebrows.

00:01:33   It's like, I look like, what's his name, the Cold Miser or whatever.

00:01:36   But yeah, it's, it's kind of fun.

00:01:38   It's kind of fun when, when it's so severe that people kind of look at you like

00:01:43   you're crazy.

00:01:43   I guess that's the Californian in me,

00:01:46   not shying away from people thinking you're a little cuckoo.

00:01:49   I brought it up with Kottke too.

00:01:51   Cause I don't know.

00:01:52   somehow I've lived my whole life, other than two years where I lived up there your way in

00:01:57   Massachusetts when I was working at Bare Bones, in the Philadelphia area, greater area,

00:02:02   from birth till now. So I'm used to East Coast weather. But the older I get, the less used to it

00:02:09   I get. Yeah, I don't know. It's on my mind more. You know, I would have thought by the time I hit 50

00:02:14   I'd be used to it. But it's like my dad, he doesn't seem to talk about the weather.

00:02:19   He's been through 84 years and he's used to it.

00:02:23   Whereas me, it's like when it gets dark now in the fall, I'm just shocked by it.

00:02:28   I'm like, I can't believe it.

00:02:30   And now in the spring, when it starts getting nicer and we've got all this wonderful daylight,

00:02:34   I'm like overjoyed.

00:02:35   I'll tell you what, it's one thing I really appreciate about the East Coast compared to

00:02:41   California.

00:02:42   It's the very thing that a lot of people hate about it is the severity of the variation

00:02:47   because you don't get that kind of variation typically in California, especially the part of

00:02:53   Northern California where I grew up. But, oh, so one of the first things I noticed moving out here

00:02:58   was like, you really know when a season changes. And it's like the seasons change in California,

00:03:05   but it's like a 10 degree difference from one extreme to the other.

00:03:11   Right. Well, and like on the highway, whichever way you want to drive between the city and the

00:03:18   valley up there, you know what season it is by whether the mountains are burned out or green.

00:03:23   Well, that's true. Well, yeah, that's true. Although I will add, I'll hasten to add, that

00:03:28   typically they're always burned out. Like, typically, it's not seasonal. It's just like,

00:03:34   it's kind of funny that the grass, particularly the grass, it's like designed to

00:03:41   stay yellow. It's like, only, well, right now, as everyone knows probably,

00:03:46   California has had this very unprecedented amount of rain, so California

00:03:51   is probably just gleaming green right now, and it's so unusual. But anyway, the

00:03:57   thing I like, coming out here, with the variation, and I keep noticing

00:04:00   it every year, is, I forget how much I like it. This is lucky: as a

00:04:05   transplant to New England, it's lucky that I legitimately like every season.

00:04:11   And that's good because I notice every time a new season comes around I'm like,

00:04:17   "Oh right, I'm tired of it being fall. I want it to get cold." And then

00:04:22   a couple months of that I'm like, "Damn it, it's too cold. I'm ready for some

00:04:26   spring." And I guess, like you were saying, your dad just sort of

00:04:32   got used to it and he doesn't really notice or care or whatever. I'm noticing it always,

00:04:38   constantly. But I'm also sort of always surprised by how it makes me feel, like every

00:04:46   time fall comes around, I'm like, oh yeah, this is what fall's like. Well, my dad notices, I should

00:04:52   say, because he's outdoors. For one, he walks the golf course still at age 84. And so, like,

00:04:57   when it warms up and it's good to play golf, he definitely knows, but he just doesn't complain

00:05:00   about it, is my take. And he never became like a snowbird; him and my mom don't

00:05:06   go to Florida or something like that. Right. Which is what I'm thinking. Where I'm going is, I think I

00:05:10   got to find a way to go somewhere. Probably not Florida, the way Florida is going, but somewhere

00:05:14   warm by the time I'm old, because I can't take it anymore. But I want to move back by March,

00:05:19   because I want to be here for when it gets nicer. I'm just in a good mood is all I'm

00:05:25   saying. That's good. Yes. Well, I think the trick is, if you don't want to go to Florida, I think the

00:05:30   trick that people from the East Coast seem to have figured out are the islands, the various,

00:05:35   like, kind of, I don't know, I don't know a lot about those cultures, but there's so many. Puerto Rico,

00:05:42   right? Yeah, the U.S., right. Yeah, we vacationed there last year. It's really just gorgeous, just a

00:05:48   beautiful, beautiful place. Yeah, maybe that might be the answer. Speaking of good news, I don't know,

00:05:52   is it good news? We can start with some news. Today, the day we're recording, Wednesday, Apple announced,

00:05:58   and speaking of Northern California, WWDC 2023, I would say, the most predictable

00:06:06   date on the calendar, June 5th to 9th, right? Otherwise, if it wasn't June 5th to 9th,

00:06:12   that almost certainly would have been the next week. Yeah, well, particularly now that,

00:06:16   as you noted, it's become more and more well, I don't know, you noted that last year did turn out

00:06:23   to be the template for mostly online with the in-person event. But the fact that the in-person

00:06:30   event is at Apple HQ, Apple Park, some may call it, that means they don't, like we used to all

00:06:39   just guess because we were so concerned about how to get hotel rooms and all that. We used to guess,

00:06:43   but we knew there were constraints like what other conferences are already booked and what hotels

00:06:50   already have full reservations. But now for a company that does like to repeat the same

00:06:58   playbook again and again, like why would they ever pick a different date if they control

00:07:03   everything about it? Well, you mean a different location? Well, I mean, like, well, no date,

00:07:08   because like, we used to guess at like, well, right, right. Second week, third week of June,

00:07:13   and like, sometimes it got you. We didn't know how much of it might be like internal problems,

00:07:18   like they wanted to finish something up or, but we mostly I think, assumed that it was like, well,

00:07:23   there's going to be one of these two weeks. And turns out, like Oracle's having a big conference.

00:07:27   But now there's nothing like that to impose anything on them as far as I know.

00:07:33   Right. Right. Both me, like, as a public pundit writing at Daring Fireball, and our whole

00:07:39   community, I would even expand it to people who wanted to go, we used to play all these

00:07:45   games, looking at the Moscone schedule and seeing, "Oh, well, here's the American Association of

00:07:52   Bicycle Mechanics is having a thing at Moscone West the second week of June. So that's out. It's

00:07:58   got to be the first week, right?" And there'd sometimes be like private event or something like

00:08:03   totally generic, but no name for what it was on the Moscone West calendar. And we would just

00:08:10   think, that's got to be it. But you know, Apple always, I mean,

00:08:14   every other major developer conference or major tech conference I know of, or conference period,

00:08:21   like whether it is the American Association of Bicycle Mechanics or whatever, they always

00:08:27   pick their dates years in advance, right? It's like, I remember Macworld Expo,

00:08:32   it's like, as you'd be leaving, they'd be handing out flyers, right? And they'd have

00:08:37   sign-up forms. You could sign up for next year's Macworld Expo. They'd be like, see you next year,

00:08:42   and here's January 5th or whatever the date was. And they'd be trying to get you to buy tickets and

00:08:48   sign up. When I was at Bare Bones and we had a booth, they definitely were offering steep

00:08:54   discounts, or significant discounts, to commit to the next year already. Apple's WWDC is the

00:09:01   only conference, major conference I've ever known of that announces like two and a half months out.

00:09:06   Sometimes closer, right? And it's because Apple is Apple and they want to, you know, just in case,

00:09:13   you never know what if something's delayed. There was that one year way, way back where they had it

00:09:19   in August instead of June because something wasn't ready. Yeah, it's hard to even imagine that now it

00:09:26   has become so predictable. Right. But it's funny that you mentioned like there would be like a

00:09:30   reservation on the conference center for an unnamed guest. And it's like every other company

00:09:37   or organization in the world practically, unless it was going to be like a CIA meetup,

00:09:42   would be like, "Why wouldn't we want to hide the fact that we are a vibrant company that can have

00:09:47   public shows?" It's like another opportunity to get your name out there.

00:09:52   Right. Predictable public shows, right? For any other company, not scheduling it until weeks in

00:10:00   advance would make it seem like you were flying by the skin of your pants, or what's the, what's the phrase?

00:10:05   Seat of your pants. Seat of your pants. All right, I guess so. Yeah, not the skin of, skin of,

00:10:09   skin of your teeth, which doesn't, you don't have skin on your teeth. All right, what a terrible mixed metaphor. I

00:10:15   don't, I guess if you have leather pants, though, you might have skin on your pants, but

00:10:19   otherwise, I don't know. I don't know about you, I'm low on leather pants at the

00:10:25   moment. Yeah, only chaps here. I do think, though, and I hate to complain,

00:10:31   I think last year's was great. It felt like last year's WWDC, after the keynote was over,

00:10:40   even before the State of the Union, it already felt like, yeah, this is the way that they're

00:10:45   going to keep doing it. It was, I don't know if they're going to be able to put more people

00:10:49   on the lawn. It is intriguing as an event, what's the word, organizationally, that Apple Park was

00:10:58   never designed for this, right? The Steve Jobs Theater was designed to host the type of things

00:11:04   that they do host there, but the lawn was not meant to host 1500 people for this. And I've

00:11:11   talked, I talked to people who were there and they were like, yeah, definitely not, it was just

00:11:15   sort of serendipity and the advantage of the fact that they bought so much square acreage of that

00:11:26   area to build the ring building and have it surrounded by so much green space that they

00:11:31   have the room to do that. But it wasn't meant for it by design or planning, yet attending it

00:11:40   in that area felt very smooth. You would have thought that it was part of the original spec

00:11:46   for the campus. Like, maybe we'll do this. Well, that's the nice thing about open spaces

00:11:50   is you can use them for that kind of, I mean, the main challenge, as I understand it, was

00:11:56   the projection of video, which apparently they did a brilliant job of. Yeah, it was. It really was.

00:12:01   It was unbelievable. But for the rest of it, it's just kind of like an opportunity to mill about,

00:12:07   right? Milling about. That's what open spaces are good for. And when I think back to,

00:12:15   I think you know this, but for probably the past 10 years or so, even when WWDC was happening in

00:12:22   person, myself along with other friends of ours, many of us didn't buy tickets for the conference.

00:12:30   and the one aspect of the conference I missed was the Apple Bash, which is kind of what they're best

00:12:40   suited now to host. So I don't know, it makes sense, yes, it wasn't designed for this purpose,

00:12:49   and especially for seeding a bunch of people to watch a keynote event, but I hope that they

00:12:55   lean into the advantage they have of it being suitable for that big milling around space,

00:13:01   because that's a really nice event. That's the part WWDC that sort of invites everybody to just

00:13:01   because that's a really nice event. That's the part of WWDC that sort of invites everybody to just

00:13:17   That's my idea of a good time.

00:13:20   It is weird and it does sort of—and again, there were hints last year, not just because as an

00:13:26   attendee in the media and talking to other media people and talking to the people who I met. There

00:13:33   are people listening to the show right now who I shook hands with and took selfies with, because

00:13:38   they got the lucky tickets in the special event draw to be there. And I look forward to that

00:13:42   again, too. But everybody thought, yeah, this is really great. I also did talk to people from

00:13:50   Apple. And they were, again, they're Apple people, and nobody, not even someone in a position to actually

00:13:56   state it authoritatively, would say, yes, we're going to do it this way every year. But they at least

00:14:01   gave like a look on their face that was like, yeah, we're really happy with the way

00:14:08   this has gone, and it's almost certainly the way it's going to go. And I know the other thing that

00:14:13   they did say, honest, I'll just say it, even just talking to Phil Schiller, who runs events,

00:14:18   they're over-the-moon happy with the post-COVID online delivery of the developer sessions.

00:14:28   There was no question that they're going to pre-tape those probably starting like now,

00:14:33   right? I'm guessing now is the time when everybody inside Apple is submitting the session ideas,

00:14:39   but that Apple is so much happier with the way that they turn out, and is getting so much better

00:14:48   feedback from the, I guess, millions, right? Millions of developers around the world who

00:14:53   watch them that it's just, for lack of a better word, pedagogically a vastly superior format.

00:15:00   The presenters, who are all practicing Apple engineers and designers, and therefore

00:15:08   not professional speakers, no longer have to deal with the, what do you want to call it? I mean,

00:15:13   stage fright. I mean, stage fright, it's fear. It's just the natural anxiety that almost

00:15:20   everybody has of public speaking, whether you actually are further down the stage fright

00:15:25   spectrum or further up on the comfortable spectrum. It still is something that people

00:15:30   who work full time at Apple doing real engineering and design work just don't practice and therefore

00:15:35   aren't that great at, even though over the years, the many, many decades of WWDC with in-person

00:15:41   presentations, they were always very good because they rehearsed right and they had professional

00:15:45   coaching from absolute experts to help them. So the coaching and the practice and rehearsals,

00:15:52   it was always very good, but it's just not their game, right? And so being able to do it in the

00:15:57   comfort of, without an audience, and being able to do multiple takes and not having to worry about

00:16:04   this is the one live one where I better not screw it up or my demo better not go wrong. All that

00:16:09   pressure is gone. And the other huge advantage of it, in my opinion, and you must have a stronger

00:16:15   opinion of it because you consume more of the sessions as a working developer, it lets all

00:16:21   sessions be the length they need to be. Some of the sessions are 15 minutes because it's 15

00:16:25   minutes of material, whereas the old way, if it was 15 minutes of material, you had to pad it out

00:16:30   to, I guess, a full half hour or an hour. I forget if they were all hour-long slots or what, but...

00:16:35   - I think they were at least a half hour.

00:16:36   - Right.

00:16:36   - I found that extremely tedious, especially as somebody who just doesn't

00:16:42   particularly enjoy classroom environments anyway, in the best of cases. So, yeah, when you got the

00:16:49   sense that, "Oh, boy, this is just being dragged out," it's just the worst. But yeah.

00:16:54   The other thing is, all those advantages you said, and another one that occurs to me is,

00:17:00   when you said they might be working on them right now, not only that, they might be finishing

00:17:05   them right now, which is something you would never do in the past.

00:17:10   You wouldn't have somebody preparing and practicing and getting to the point of being able to

00:17:16   present it now because they might forget it by the time the actual conference comes around.

00:17:21   You know what I mean?

00:17:22   It was kind of like everybody had to rehearse towards a specific deadline and now they can

00:17:27   stagger it.

00:17:28   I bet that's really nice for...

00:17:30   I remember now 20 years ago when I last worked at Apple, I remember it was like a season

00:17:36   of WWDC prep where...

00:17:39   I never gave a talk at WWDC, but coworkers of mine who were doing them, it's like the

00:17:45   season would come around and suddenly they were just like off limits.

00:17:49   They're busy.

00:17:50   They're doing stuff.

00:17:51   Talking to the people who help them present. So that must be nice too, because I imagine it's

00:17:57   not so much like a no-sleep-till-WWDC type of culture anymore. Yeah, I guess, I don't think so.

00:18:04   So I think it's all for the better. I will, I can't let an episode of the show go by without

00:18:10   complaining about something, though. And I will say, I'll go back to my only complaint about this new

00:18:15   format. And I really do think this is how it's going. This isn't just year two. It's not just, oh, it's

00:18:21   the tail end of COVID and COVID concerns, and they still want to do it this way to have it outdoors.

00:18:28   I mean, even though that is an advantage, right? Other than the unlikely but not impossible chance

00:18:36   that Monday the 5th is a rainy day, which I really—and I remember asking last year,

00:18:41   what was Plan B if it rained? And Plan B if it rained was, well, everybody's going to be wet.

00:18:47   I mean, there is no way to put a tarp over 1000 or 1500 or however many people there were. The big

00:18:56   problem with the location is hotels, because Cupertino has very few hotels. The one I stayed in,

00:19:04   which I picked because of its proximity, because I knew I was going to be there for a couple days

00:19:08   because the media stuff tends to span two days. I was doing my live show at Apple. I think we

00:19:16   did it on Wednesday last year, or, I forget if it was Wednesday or Tuesday. Might have

00:19:23   been Tuesday, but whatever. I knew I'd be there at least through that and I knew I'd

00:19:28   be going back and forth from my hotel to Apple Park multiple times. So why don't I stay,

00:19:34   oh, here's, I think it was, I don't want to throw a hotel chain under the bus, but

00:19:41   a national hotel chain is literally right across the street. But when you're looking at Apple Maps or

00:19:49   Google Maps or any of these maps, you see the ring, the rings there on the map, and from a

00:19:54   global perspective, from Philadelphia, a continental perspective, boy does that hotel look close to

00:20:01   Apple Park. But the truth is it's all the way around Apple Park from the end to the visitor

00:20:08   center. So it's like a mile and a half probably. I think it is

00:20:14   honest to God. And so, I am a walker. I mean, as I record

00:20:18   with you today, and this is known in a Slack group I'm in, my car

00:20:23   hasn't been able to start for weeks because we went weeks

00:20:26   without starting it in the winter. And I still haven't

00:20:29   needed to drive anywhere. So I'm waiting to get the battery jump

00:20:33   started until I actually know I need the car. So that, I mean,

00:20:36   is how we are: we own one car, haven't driven it in two months, and it literally doesn't even start at

00:20:41   the moment. I like to walk, I really do. That's the most exercise I get, is walking throughout

00:20:48   my daily life. But walking a mile and a half to get to the keynote, and then anytime I might want to

00:20:54   go back to my hotel, I ended up taking Ubers and Lyfts all the time. And it was not a great hotel.

00:21:00   And so once you're getting an Uber or a Lyft anyway, it's like five minutes more. If you just

00:21:05   expand the realm five minutes, there's way more hotel options, but none of them are great. You

00:21:11   know what I mean? It's like, some, some people, I think, was Marco there last year? I think he might

00:21:16   have been. I know, yeah, Marco was there. I think he had one of those famous Marco, Tim Cook

00:21:22   close encounters. Yes, yes, exactly. Right, right, right. With the blue, with the, the, the

00:21:29   that blue, what do they call it? What's the name of the MacBook color? Midnight. Midnight.

00:21:34   Right. Marco had been waiting and waiting and waiting and waiting for hands-on time with the one

00:21:40   midnight MacBook that was on display. And by the time it was his turn, somebody tapped him on the

00:21:46   shoulder and said, "Tim's going to take a photo here, so you need to back away." And it happens

00:21:50   to him every year. Right. And now I remember, because I had press briefings while the State

00:21:58   of the Union was going on, and Marco and our, again, mutual friend of the show, _David Smith,

00:22:04   I ran into them at the cafe at the visitor center and they were kind enough to give me

00:22:09   the Marco and underscore summary of the State of the Union, which unsurprisingly was GPT

00:22:16   quality as a summary, as we'll get to later in the show. Very high quality summary of

00:22:23   the show. Yeah, they were there. I think they stayed at, or at least Marco maybe, I know

00:22:27   I had some friends who stayed all the way back in San Jose at the whatever the name,

00:22:33   new name of the Fairmont is? Oh yeah, it's like a Hilton or something I think. Yeah,

00:22:39   they renamed it, redid the lobby completely because I was there, I think I stayed there

00:22:44   then for the iPhone event later in the year because I knew not to stay in Cupertino. I

00:22:49   don't know. It's like that's the one thing, we used to bitch and moan so much when it

00:22:53   was in San Francisco at Moscone about the hotel prices, but it's like we didn't know

00:22:59   how good we had it where we could bitch and moan about the price of hotels, which again

00:23:03   is laughable compared to the hotel prices now. Although maybe with San Francisco being

00:23:07   dead, as everybody says, maybe San Francisco hotels are cheaper. But I have no reason to

00:23:12   go to San Francisco anymore. But anyway, I didn't realize how lucky we had it. Every

00:23:18   single hotel was what, a block and a half away from Moscone?

00:23:21   - Yeah, it was really perfect. You had every kind of hotel too, from pretty high,

00:23:27   just pretty high end, really high end I would never consider going to, down to pretty high

00:23:34   end, down to run-of-the-mill, down to total rat trap. You could stay in a turn-of-the-

00:23:43   century, 1905, survived-the-great-fire hotel with a shared bathroom, and then if you couldn't

00:23:51   afford that, you could go stay at the hostel. It's like they had everything.

00:23:55   Yeah, everything. But it's so close. And then so you just didn't have to worry like, "Hey,

00:24:00   do I have everything I need in my bag or my pockets?" I would just leave my laptop. I

00:24:05   just leave with a notebook and know that if I needed my laptop, I was a block away or

00:24:10   two blocks away. Now, when it's miles and miles and an Uber ride away. And it's, again,

00:24:16   it's fun to go, and I'm so glad to have been back after COVID last year. But it's like,

00:24:20   to go back, you have to go wait in a special line for the ride shares. And

00:24:26   it makes sense, right? They can't just have Ubers and Lifts pulling up higgledy-piggledy

00:24:31   on the sidewalk all around Apple Park. So, there's like a dedicated go over there to

00:24:36   Building 11 or something like that. And in the parking lot, there's a queue for the

00:24:40   ride shares, but it's not great. And it doesn't make sense to rent a car because

00:24:46   the parking situation, they really don't, they don't have parking for everybody. They

00:24:50   don't, they did not build the Apple Park visitor center with the idea that, again,

00:24:54   that's the one side effect of not designing it for this. They don't have parking for

00:24:58   1100 people or however many people show up.

00:25:01   I really liked it. One thing I really liked about the one year I went to San Jose in the

00:25:07   modern era, San Jose WWDC, because I had skipped it maybe sort of out of like snobby protest

00:25:13   for a couple years because I was like, "This isn't San Francisco." But then I was like,

00:25:17   "Oh, everyone's there. I better go." And I really did enjoy. So I stayed at a hotel.

00:25:22   I remember, me and Craig Hockenberry were both at a hotel about a mile away from

00:25:28   the scene. But it was that time, I don't know if they still have them. I know they sort

00:25:34   of had their time and they've lost popularity, but it's that time when the little rental

00:25:39   scooters were everywhere.

00:25:40   Like, oh yeah, yeah, yeah. Now, what were they called? Not Lyft. What were they? These were,

00:25:44   ah, Limes. Lime. Yeah, they might still be there, I don't know, but that was fun. Yeah.

00:25:50   Well, so for instance, if you had Lime scooters outside your hotel around the corner from Apple

00:25:56   Park, it would have made everything a lot easier. Probably you would just be like, whatever, I'll zip

00:26:00   over there and zip back. But yeah, it's a problem. That whole area, I mean, I know that area

00:26:06   so well, because I worked at Infinite Loop for seven years. I know that area better than

00:26:15   most areas. And it's a really, really boring area that has some pretty good restaurants. That's how

00:26:23   I should summarize it. And so that's at least something. It actually has pretty good, I mean,

00:26:28   you have to travel a little bit, but it has, in its way, weirdly, that area probably has better

00:26:35   restaurant service than downtown San Jose, which is kind of weird.

00:26:40   Well, and I would say, everything has been, I know,

00:26:43   this is a lot, I'm just throwing you a lob ball here. Right. But it is, it's, I guess what you're

00:26:50   saying is, San Jose as a city of its size has an unusually sparse downtown restaurant scene. And

00:26:57   the sort of greater Cupertino driving area, like five minutes from Apple Park, has surprisingly

00:27:04   good restaurants. And I agree with that. And I just always, because I have friends who work at

00:27:09   Apple, just defer to them. And they'll be like, "Oh, we should go here." And I'm like, "You tell

00:27:13   me." And then every place that, you know, whether it's like a stupid mom and pop sandwich place for

00:27:19   lunch or a pretty nice sit down place for at night, it is, there's tons of good places to eat.

00:27:25   I will say this, coinciding before we segue, I feel like it's this sort of,

00:27:31   It relates to my aforementioned talk about the season change, right? When you have seasons,

00:27:36   to me, it internalizes the annual schedule. And I do think, I think it's safe to say

00:27:45   that you don't have to be a scientist to state with authority that humans evolved

00:27:53   to sort of grok the calendar, the annual calendar, right? And it's like, oh, it's the holiday season,

00:28:01   it's Thanksgiving, and then after Thanksgiving, all of a sudden it's Christmas and New Year's,

00:28:05   and it's dark, and it feels cold, but you don't even know it's not cold yet because it's not even

00:28:10   really winter, and then springtime rolls around, etc. And I've always thought Apple in particular

00:28:17   really does work on an annual schedule. And it's only gotten more regularized under Tim Cook.

00:28:24   And I think, again, in the way that an effective leader instills their personality into the

00:28:32   organization, that the Tim Cook-ification of the last decade of Apple is that they're

00:28:38   even more reliable, right? That it's even more certain that WWDC is going to happen

00:28:46   exactly when you think it is. And more, it seems to me, more and more of their products are,

00:28:53   hardware products are getting more predictable in their updates. I mean, it

00:28:59   would be a whole tangent for the show to talk about, like, why the 24-inch iMac hasn't been

00:29:06   updated with the M2. But you know, it seems to me that in the Apple Silicon era, though,

00:29:11   Macs are going to be getting updated on a more regular basis, like iPads at least,

00:29:16   even though it doesn't seem like anything other than the phone and the watch are on a precise schedule.

00:29:22   Every single September, that's when the new ones come out. But WWDC and the September event are,

00:29:28   I would say, clearly the two tentpole annual things on Apple's schedule. WWDC in June,

00:29:37   the iPhone/watch/if there's something else special to announce, do it with the iPhone

00:29:44   September event. And the thing about WWDC, and I would say in the Tim Cook era,

00:29:51   especially the last five, six years, has really become a sort of… It's not just, "Here's

00:29:58   a bunch of new developer APIs, and in the morning we'll show you some new features that

00:30:05   are coming out in the fall." It's sort of, "Here's what we're doing for the next 12 months."

00:30:11   Right? We say Apple is very secretive, and they are, and they hopefully won't leak what is going

00:30:18   to be announced at WWDC. But effectively, WWDC has become a way for them to leak stuff that won't be

00:30:25   coming until next March or April. Right? Because look at this week with the—this is my segue,

00:30:32   my very long meandering segue. This is the week where Apple updated, I think, literally every

00:30:39   operating system they have, right, with a sort of feature update: macOS, iOS, iPadOS, watchOS,

00:30:46   tvOS, the HomePod OS, something else. Oh, the Studio Display even got a firmware update,

00:30:53   right? It's like all at once. It's Michael Corleone settling all family business today.

00:30:57   Also, you might even count this as another platform getting an OS update, the pretty

00:31:02   substantial Safari update. Yeah, yeah, I would say so. I would say that WebKit

00:31:08   counts as an Apple platform. They don't call it WebKit OS, but web browsers, the web rendering

00:31:14   engine effectively, it certainly is the complexity and size of an operating system, right? I don't

00:31:20   know that it's completely fair, but I would guess WebKit overall is a larger software project than

00:31:27   watchOS. Yes, I would think so. But who knows. But yeah, at least it's fathomable

00:31:33   to me. I mean, they, of course, both of them, of course, lean on all of the

00:31:38   infrastructure of other OS projects. But right, right, it's, it's that significant,

00:31:44   and it is, it's been a major update for WebKit, and they're including a couple of,

00:31:49   again, this could be a whole segment of the show that we probably don't have

00:31:52   time for, but some mobile WebKit features that web developers have been clamoring for

00:31:58   for a long time, like the notifications, which is a big deal for, like, what they call

00:32:05   PWAs, progressive web apps, but basically, when you save a web app to be an app

00:32:10   on your home screen, making them more feasible to act like you expect an app to be able to

00:32:18   act, by giving you notifications and stuff.

00:32:20   Right. And I don't, I don't know that much about that world, but I, I have had the feeling,

00:32:25   the impression for years and years that to summarize most web developers have been grumpy

00:32:31   about the speed with which Apple has embraced those kinds of features. So I don't know.

00:32:37   Well, I think there's two, there's two sort of slants to the criticism. I think, and again,

00:32:44   maybe I'm biased as a Safari user and as a native app proponent and as somebody who does not want to

00:32:53   see, and I guess my bias is most colored by my deep antipathy for the web-

00:33:05   app-ification of desktop apps, right? Just to, I know they're not all written with Electron, but

00:33:11   just to use Electron as a catch-all umbrella, the Electron-ification of desktop software

00:33:17   development. I don't want to see that happen to mobile too, or at least want to see it happen as

00:33:22   slowly as possible. And so there, in some sense, I'm rooting against Chrome and Chromium taking

00:33:30   over the mobile market the way they have the desktop market. And so there's one contingent

00:33:37   of web developers who just want that to happen, right? That they, because they're all in, the way

00:33:42   they develop software is to just use all of the bleeding edge Chrome features and they really see

00:33:48   "write once, run everywhere" as nothing but a positive, or that the positives greatly outweigh

00:33:56   the negatives in the overall scheme of things, want that to happen. And I just disagree. I'm

00:34:02   not saying they're wrong. I'm just saying I disagree with them, right? There's a difference,

00:34:05   right? I see their point, but I don't think that's good for the platform experience.

00:34:11   But on the other hand, I think there's more pragmatic, less zealotrous web developers who

00:34:16   just are frustrated by the pace at which Apple has adopted some of these technologies, and that to me

00:34:23   isn't about the conspiracy theory, such as it is, that Apple as a company holds back WebKit

00:34:31   on mobile simply to ensure that the App Store remains the only feasible way to do X, Y, and Z.

00:34:39   - Yeah. That seems false on the face of it, because they could do such a worse job

00:34:45   than they do. If they really wanted to hamper mobile advancement of web apps,

00:34:52   then we would not be seeing even a quarter of the enhancements to WebKit over the years.

00:34:59   Right, and it's sort of an insult to the WebKit team. And I can't say I'm like tight buds with

00:35:05   any of them, but I do know several of the people on the WebKit team and they've been,

00:35:08   and a lot of them have been there for a very long time. And they're very smart and very proud and

00:35:13   very, very proud of both their work and WebKit's position in the world. And it's frankly insulting

00:35:21   to argue that they just go along with, "Hey boss, we would like to do X, Y, and Z." And they're like,

00:35:28   "Nope, because we want to protect the App Store," and that they just take it. And they're like, "Okay."

00:35:31   Right. Yeah, they have different priorities than the people who are clamoring for X, Y, and Z

00:35:38   features. And it's just so often the case when... I think WebKit embodies that in the context of

00:35:46   the web, they embody that Apple kind of idea of being willing to say no in order to focus on the

00:35:56   things you think are more important. And I deeply respect the WebKit team as well. I think, I

00:36:03   know, I, for the record, I'm also a Safari user, but I get it, there are some shortcomings, but I

00:36:09   think they focus a lot on things that matter to me anyway. So, I think I'm gonna piss some

00:36:16   people off. Well, I keep meaning to redo it, and I will, but here, the talk show is where

00:36:22   I spoil stuff that might be forthcoming on Daring Fireball.

00:36:26   And often what I spoil here never comes out anyway.

00:36:30   But a couple of years ago,

00:36:32   I think it was back around the time or before,

00:36:36   remember when Apple had the round table where they said,

00:36:40   okay, this trashcan style Mac Pro was a mistake,

00:36:44   but we're not giving up on a pro market,

00:36:46   but we don't have anything to announce right now,

00:36:48   except for the fact that we have been working

00:36:51   on a pro-oriented iMac, which they didn't say

00:36:54   would be called the iMac Pro, but which was the iMac Pro.

00:36:58   That was a couple years, it was like 2016, 2017,

00:37:00   something like that.

00:37:01   So it's been a while since I did this,

00:37:03   but what I did is I, I think it was pretty clever,

00:37:07   is I wrote like an AppleScript to go through.

00:37:09   I picked like 25 popular websites.

00:37:13   I forget how I picked them, but they felt like a fair thing.

00:37:16   And I wrote an AppleScript that would open them up in tabs

00:37:19   and do some, use like the GUI scripting

00:37:22   to do fake scrolling, like to sort of simulate a user

00:37:27   actually reading the web page.

00:37:29   Open it up, wait 10 seconds, scroll down,

00:37:33   wait 10 seconds, scroll down, open a new tab,

00:37:37   do this with a new website, do the same thing.

00:37:41   Keep 10, 15, 20 tabs open in the background,

00:37:45   then close them all, do it again,

00:37:47   and just keep going until the battery's out

00:37:51   and see how long the battery lasts in Safari.

00:37:54   And then I ran the same AppleScript against Chrome

00:37:56   against the same websites.

00:37:58   And I thought, you know,

00:38:02   I'm proud of myself because it seems like something,

00:38:04   you know, like Macworld Labs used to do back in the day

00:38:07   and have these test suites.

00:38:08   But I thought it was a pretty good simulation of like,

00:38:10   hey, with nothing but a web browser running,

00:38:12   what's the difference between Chrome and Safari?

00:38:15   And I forget the answer, but it was significant, right?

00:38:17   It was like hours on a MacBook at the time.

00:38:20   It was hours of difference overall.
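
(For the curious: a minimal sketch of the kind of AppleScript being described, not the original script, might look something like this. The three sites are illustrative stand-ins for the 25 he used; the System Events scrolling needs Accessibility permission for GUI scripting; and you'd point the same loop at Chrome by swapping in "Google Chrome".)

```applescript
-- Minimal sketch, not the original script: cycle through a list of sites,
-- fake a reader scrolling, and repeat until the battery runs out.
set siteList to {"https://daringfireball.net/", "https://www.apple.com/", "https://en.wikipedia.org/"}

tell application "Safari"
    activate
    make new document -- make sure there's a window to fill with tabs
end tell

repeat -- run until the machine dies; note the elapsed time when it does
    repeat with siteRef in siteList
        set theURL to contents of siteRef
        tell application "Safari"
            tell window 1 to set current tab to (make new tab with properties {URL:theURL})
        end tell
        delay 10 -- let the page load and "read" for a bit
        tell application "System Events" -- GUI scripting; needs Accessibility access
            repeat 3 times
                key code 121 -- Page Down, simulating a user scrolling
                delay 10
            end repeat
        end tell
    end repeat
    tell application "Safari"
        close window 1 -- dump the accumulated tabs
        make new document -- and start over with a fresh window
    end tell
end repeat
```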

00:38:22   So I keep meaning to redo it and see what,

00:38:25   here in 2023, what's the difference?

00:38:27   I don't know if Chrome's closed the gate at all

00:38:30   or closed the gap.

00:38:31   I might've, you know, every once in a while,

00:38:33   I notice in the Chrome release notes

00:38:35   that they say something about battery life improvements,

00:38:38   but, I don't know, there's no doubt in my mind,

00:38:41   and again, it's the order of your priorities.

00:38:45   Of course the Chrome team at Google wants battery life to be good, right? It's just

00:38:50   not as high a priority for them as it is for WebKit, right? Both teams value that.

00:38:56   And I'm sure in the same way, WebKit values adding new features for web developers. But

00:39:01   there it's a little bit of a lower priority for WebKit to stay on the bleeding edge of

00:39:08   web developer APIs and more of a priority, if not the highest priority, to do everything

00:39:14   a web browser has to do the most power-efficient way possible. Whereas at Google, maybe the

00:39:21   APIs and staying on the bleeding edge and pushing the platform forward as a developer

00:39:25   thing is the highest priority, and being energy-efficient is second or third. But I wouldn't be surprised

00:39:32   if that's a difference. And I just know that that's a big deal at WebKit. And for

00:39:38   web developers who only care about, "We built this whole web app, and if only Safari

00:39:43   supported this API, too, we could take out all of this extra code that we have to do just to run on

00:39:49   the iPhone." It just completely doesn't bother them at all that there's something like power

00:39:55   efficiency, which consumes massive amounts of the WebKit team's attention and time. And they're

00:40:01   like, "What do I care about that?" Whereas for them at WebKit, they're like part of Apple,

00:40:06   and it's like part of the, "Hey, how long does the battery on your iPhone last?" Right? And if WebKit

00:40:12   is less efficient, that's a lower number. And that makes

00:40:15   Apple as a whole look bad. Yeah. Any features from this

00:40:20   week's OS updates that jump out at you?

00:40:23   Ah, not really, I think. I haven't really absorbed it yet. I felt

00:40:27   like there was something I noticed even in the Mac OS

00:40:31   update, but it's just not jumping to my mind right now.

00:40:35   Yeah, there's a couple. Tell me something, what's your

00:40:38   favorite thing, if you have it, and then I'll tell you if

00:40:41   that reminds me that that's my favorite thing too. Well, I don't know if it's my favorite. It's actually,

00:40:46   because I'm sort of ignorant of the whole field, but it seems like the biggest one is that Apple

00:40:51   Music Classical is tied to this. Right? So if you want to use this new, long-rumored, or I guess they

00:40:58   announced it, I guess it wasn't like it popped out of nowhere, they somehow announced it months ago,

00:41:03   that there would be a new, entirely separate standalone app just for classical music in Apple Music,

00:41:09   which is fascinating, right? That it's not just a genre, but is considered,

00:41:16   for aficionados, fans of classical music, so different that it deserves its own standalone

00:41:22   app, but you need to be on iOS 16.4 or macOS 13.3 to get it. I don't really have any strong

00:41:30   opinions about it because I'm not a classical music person. And even though I do enjoy it,

00:41:34   who doesn't like classical music, even if you're not a fan? I mean, it's kind of

00:41:38   of, yeah.

00:41:39   - Well, probably a lot of people would say

00:41:41   they don't enjoy it, but then you put some familiar,

00:41:45   recognizable riff in front of them, and they're like,

00:41:47   "Oh yeah, I like that song."

00:41:48   But I also am not an aficionado,

00:41:51   but what I like about this is it intrigues me

00:41:55   to think that it might help me become more

00:41:57   of an aficionado.

00:41:58   Like it sort of, I think it's addressing,

00:42:03   and I don't understand all of the reasons

00:42:06   for it being a separate app,

00:42:07   I kind of got the gist of it. In a sense, classical music is... I'm hoping that the app

00:42:18   sort of addresses a problem that I have when I am inspired to explore classical music. It's like,

00:42:27   "Well, which performer do you listen to?" If I want to go listen to some Chopin, which I do happen to

00:42:35   like, even if I'm not an aficionado, how do I find out whether the first performance

00:42:40   that pops up is any good?

00:42:42   Because, here it is in summary: all of the classical music that we listen to are covers, right?

00:42:48   Yeah, yeah.

00:42:49   So it's like if you had a situation where it was like, okay, let's listen to the Beatles.

00:42:55   But the only caveat is they're all covers, then how do you find out which version of

00:43:01   "Help!" to listen to?

00:43:03   And I could use some guidance in that respect.

00:43:06   I think maybe that sort of gets at the uniqueness of the problem that Apple's trying to address

00:43:12   here is that it's not the same as other genres because other genres' performing artists are

00:43:22   the highest level of categorization.

00:43:26   And in the context of classical music, the performing artist isn't the highest level of

00:43:32   classification. It's the author, the writer, the composer.

00:43:36   Right. Beethoven or a whole category, Baroque or something. I'm sort of

00:43:41   speaking off the top of my head here. But yeah, our friend Jessie Char is a trained

00:43:46   musician. I believe she plays the cello?

00:43:51   Cello sounds right to me. Yes, she's a cellist. Is that how you pronounce it? Cellist. But she had a

00:43:56   a good thread on Twitter weeks ago, like when Apple said, "Hey, you can pre-order this

00:44:02   app," which you didn't have to pay for it, but I guess that way you'd automatically

00:44:05   download it from the App Store when it was available. She had a good thread on Twitter.

00:44:10   I'll make a note here and try to put it in the show notes, just sort of explaining the

00:44:14   way that the simple artist/album/song categorization doesn't really work for classical music fans

00:44:21   and therefore needs a new app. So that's a major feature.

00:44:24   I'm looking forward to that. I feel like this is kind of a release that has a bunch of little things.

00:44:30   Yeah. So I'm hoping for the best. Yeah, it sort of seems like the schedule is,

00:44:36   the annual schedule is they announce all this stuff at WWDC and they give us developer betas

00:44:43   and that some of this stuff is in there, right? Because there's some stuff that's been percolating,

00:44:47   all these things have been percolating on teams greater than a year, right? Like when a major new

00:44:52   feature shows up in any of these OSs. It's not because they started it a couple months before,

00:44:59   it's like months and months, or 24 months or longer. Some team's been toiling away, didn't make the cut

00:45:04   last year, like, "Ah, that's not ready this year." But the team keeps working. And then here's the

00:45:09   stuff that's ready for this year's OS updates. Some of this stuff actually is ready. Those

00:45:15   developer betas are hit and miss in June in terms of what's ready and what's not. September is when

00:45:21   the iPhone comes out, and iOS, and then the macOS. Last year they delayed iPadOS 16 until October.

00:45:29   But in the annual sense, September, October, what the hell is the difference, right?

00:45:33   Everybody's impatient for iPad OS at the end of September because iOS came out. But they come out,

00:45:40   they're a little buggy at least. Maybe I'm being euphemistic. I don't know. Some years they're more

00:45:46   buggy than others. iOS 13, for whatever reason, really sticks out at me as sort of a really rough

00:45:52   launch. That was the one I think they had the emergency, almost a hotfix, 13.1,

00:45:59   was ready by the same day the iPhones actually came out. It was really pretty bad.

00:46:04   - Yeah, this past year seemed pretty smooth, if I recall correctly.

00:46:08   - Yeah. The people out there, and I'm in a weird spot because, again, I am getting older.

00:46:14   We're all getting older, but I'm at the age where I'm more crotchety than I used to be.

00:46:18   And my inclinations are to be conservative with, especially my Mac, right? Because that's where I

00:46:26   do my most work. And so I don't jump on the new version of macOS on my main work Mac until I feel

00:46:34   like it's stable. But I'm in the racket of writing and talking about these things, so I need to be

00:46:39   using them. And using them on spare machines isn't the best way to get to know them. But the people

00:46:45   out there who don't have a need to write or talk about this and who wait a couple of point releases

00:46:52   before they update, probably wise, right? It's overall, what are you missing? And I feel like

00:46:57   these 0.4s, it's like, the 0.1, 0.2 are always like bug fixes. Like, "Ah, here's this thing that we

00:47:04   released in September. That was, yeah, that was, that was buggy." Bug fixes. The 0.3 releases

00:47:10   right around like Christmas are like, "Hey, some of this stuff, here's some new stuff that was

00:47:15   promised at WWDC." And then, like, this March 16.4, macOS 13.3 release, this is sort of like,

00:47:24   they're all, they've almost got everything they announced last year at WWDC. I forget, there's

00:47:27   like one or two things. There's like some silly Apple Pay thing, not Apple Pay Later,

00:47:34   which actually did start this week, but something else, some kind of—I forget what it is, but

00:47:40   pretty much now they're caught up. And I know there's a 16.5 beta that's already out, but

00:47:45   now's the time of the year where you can upgrade with confidence, if you've still been holding on.

00:47:53   I'm amused by the fact that Apple has so effectively used emoji updates as a lure.

00:48:00   Oh, yeah. Without question. Oh, without question. And they've, they've, I don't think they've done

00:48:07   it this year. I should check my email. But in years past, I've gotten press releases from Apple

00:48:12   just about the emoji. And they do, I think, I do think, again, nobody's officially

00:48:19   told me this. Craig Federighi has never broken character behind the scenes and said, "Yes,

00:48:24   off the record, we hold the emojis until we have a really stable release." But they do two things.

00:48:31   There's a couple of levers that they can pull to sort of encourage/discourage people from updating.

00:48:38   The big one is whether they hit the button to sort of push the update to notify people,

00:48:46   right? Like to put a red badge on the settings app. So even if you're just a normal person with

00:48:52   an iPhone, and you don't even follow the Apple news, you don't even really remember

00:48:57   or notice that they have an annual schedule for OS updates, right? Or phone updates, right? You just

00:49:03   buy a phone, maybe in August, right, which most people would agree would probably be the worst

00:49:08   month of the year to buy an iPhone because the new ones are probably coming in September. But if you

00:49:13   break your phone or lose your phone or just get sick of your old phone in August, lots of people

00:49:18   go in and buy iPhones in August and you're just a normal person. And is there a red badge on your

00:49:24   settings app that's saying, "Hey, there's a new version of iOS to install"? They don't do that

00:49:29   usually with those like 16.0s in September. Like, you can go to Settings, General, Software Update,

00:49:37   and it'll say, "Oh yeah, here's an update. Do you want to install it now?" But they don't push

00:49:42   it until like 16.1 or something like that, because they kind of, I don't think they want to deal with

00:49:49   the customer support issue of a billion devices around the world all on the point-oh. So that's

00:49:56   one lever they can pull, and it gives you a sort of external sense of which versions of the OS

00:50:01   they feel like, yeah, we're better off, we'll have fewer customer support problems

00:50:07   if we push everybody to this, because this is a better update with more features.

00:50:11   And then the flip side of it is the emojis.

00:50:15   Right, and it's like, all of a sudden somebody's sending you the goose emoji that's brand new,

00:50:21   but it shows up as a, what do they show up as, like a square or something?

00:50:26   I forget what. The square with the question mark in it.

00:50:29   Yes, or a letter with an accent or something like that, and it might be different on the Mac and iOS, too.

00:50:35   All right, but then you're like, what was that? And they're like, oh, it's the goose emoji. And it's like, ah, how do I get that?

00:50:39   Well, you got to update your iPhone. And they definitely use it, and it definitely motivates people. I should take this moment

00:50:46   to, and I'm more than half serious about this, keep hammering on the fact that I want a chef's kiss

00:50:55   emoji. I keep mentioning it on Daring Fireball. Yeah, I'm going to keep mentioning it.

00:51:00   Do you have any pet emoji that are missing that you wish were there?

00:51:05   Not off the top of my head

00:51:07   it's just one of those things where every once in a while, I think like you, I go to give the emoji

00:51:14   that I assume must be there. And then it's not. And it's like, what the hell? Like, how is there

00:51:20   like a sushi emoji for like a specific type of eel or something? I don't know. I don't know if

00:51:26   that's true, but sometimes there's some really specific to my mind stuff and then you're like,

00:51:31   but there's no like popcorn or something. I don't know if, you know.

00:51:35   No, popcorn is definitely there, I know, because everybody always does it, like,

00:51:39   when another indictment is hinted at with Donald Trump, right? Right. Get your popcorn

00:51:45   ready. Here we go. So there are some times, I think, when things that to my mind are

00:51:50   as obvious to be there as popcorn are not there. And I think I would agree with you on chef's kiss,

00:51:57   that it fills a void, it would fill a void. And I can't think of other ones that, off the top of my

00:52:03   head, I've gone for, but I think often it's like some kind of particular vegetable or fruit

00:52:11   that seems to fit a specific... There's no, like, there's no pickle. I have, here's my list, I just

00:52:20   brought it up in Apple Notes. So at the top of my list: chef's kiss. This is the one, I want to make

00:52:25   it, it's got to be there. It's a worldwide thing, everybody knows this, it's a chef's kiss, it means,

00:52:30   the meaning is so perfect, right? It, to me, epitomizes the communicative beauty of emoji.

00:52:38   I think the whole emoji thing must be one of the funnest, best gifts to the world of linguistics

00:52:46   in decades, because people are communicating expressively through pictograms in a way that

00:52:53   we haven't for centuries, right? The millennia, maybe, in Western civilization, right? It like,

00:53:01   takes you back to like hieroglyphics and stuff. But humans, we're natural at it, right? Everybody

00:53:06   I know, my dad, my mom, old people, certainly younger people, everybody can communicate with

00:53:13   emoji. It's wonderful. And it really is. In some ways, it's a picture. I don't know if

00:53:19   any emoji really says a thousand words, but a chef's kiss emoji can certainly save you

00:53:24   just a lot of typing on your thumbs. I need it. Here's my list. It's not long. Chef's kiss,

00:53:29   number one. Pickle. I don't know why. My son's a big fan of pickles. And sometimes, like, so like,

00:53:36   what do you want me to get from the store? You can't send a pickle. Cucumber is the closest.

00:53:41   Shovel. Doesn't that seem like there should be a shovel emoji? That's one that I was like,

00:53:46   I cannot believe that it's not there. I forget why I wanted it. Trombone, because I feel like you could use that for the

00:53:52   Womp-womp

00:53:54   exactly exactly

00:53:56   And then the last one on my list: sombrero. I don't know why I wanted a sombrero.

00:54:01   But it feels like they've got a lot of hats in there.

00:54:04   Why not a sombrero?

00:54:05   Because I just typed in sombrero and, this is, I'm looking at Apple's Character Viewer.

00:54:10   To their credit when I type in sombrero I get like a cowboy face

00:54:16   woman detective, the English guards, all these different things with hats. There's

00:54:23   a military helmet, top hat, yeah. And no sombrero. Yeah, I know, there's, I mean,

00:54:29   I guess that's the problem with emoji, is there are a heck of a lot of

00:54:34   things, a lot of things in this world. So I've reordered my list. I put trombone at

00:54:39   second, underneath chef's kiss. After we get chef's kiss in next

00:54:42   year, then my new campaign will be for the trombone because we need that.

00:54:46   Womp, womp, womp, womp.

00:54:48   So here's the thing.

00:54:49   I don't understand the process, but I feel like I've witnessed people making

00:54:54   successful campaigns to get emoji added.

00:54:57   Yep.

00:54:58   So I think if you get like a designer buddy on board and

00:55:04   Oh, if only, if only I knew someone.

00:55:06   Right.

00:55:08   Exactly.

00:55:09   Only you do know somebody with a gift for the designer arts.

00:55:14   But I think I feel like I've seen people back when.

00:55:18   Back when Twitter was a thing, I feel like people were celebrating, getting a new thing added to the emoji verse.

00:55:25   Yeah.

00:55:26   Yeah.

00:55:26   All right.

00:55:28   Well anyway, let's see, let's see.

00:55:29   I'll have you on in a year and let's see if I can get it done.

00:55:31   It's a, it's a chair.

00:55:33   Hold, hold, hold on.

00:55:34   It has to be a bet.

00:55:36   Oh, all right.

00:55:37   So if you get it done within a year, then I owe you $5.

00:55:42   - All right, all right, $5 bet.

00:55:45   - I got that.

00:55:47   - How about a chef's hat?

00:55:48   - Chef's hat, hey, chef's hat's in there, isn't it?

00:55:52   - Oh, there is, there is, yeah.

00:55:54   That's so you can--

00:55:55   - Not a chef's hat, but a chef.

00:55:57   - Yeah, so you can make a chef kiss with the chef

00:55:59   plus the lips, there's like a kissy,

00:56:02   like a lipstick lips.

00:56:03   So that passes as chef's kiss,

00:56:06   But it's not right, though, because the chef's kiss has to be with the okay sign by the lips.

00:56:12   Like ah, it's a little different

00:56:14   Anyway, let me take a break here. Thank our first sponsor of the show. It's our good friends at Kolide.

00:56:19   They spell it K-O-L-I-D-E.

00:56:21   They have some big news if you are an Okta user.

00:56:26   Now that's O-K-T-A. But if you're in the market for this, you know what Okta is already.

00:56:33   Well, if you are, they can get your entire fleet to 100% compliance.

00:56:37   How?

00:56:38   Well, if any device in your organization is not compliant, then with Kolide, the user

00:56:45   can't log in to your cloud apps until they've fixed the problem to get back into compliance.

00:56:51   It's that simple.

00:56:52   If they're compliant, they're in, and if they're not compliant, they can't get in until they

00:56:56   fix it.

00:56:57   And this way, Kolide patches one of the major holes in zero trust architecture, and that's

00:57:02   device compliance. Without Kolide, IT struggles to solve basic problems like keeping everyone's

00:57:07   OS and browser up to date. Insecure devices are logging into your company's apps because there's

00:57:13   nothing there to stop them. Kolide is the only device trust solution that enforces compliance

00:57:19   as part of authentication and it's built to work seamlessly with Okta. The moment Kolide's agent

00:57:25   detects a problem, it alerts the user and gives them instructions how to fix it. If they don't

00:57:31   fix the problem within a set time, they're blocked. That's it. Kolide's method ensures

00:57:36   fewer support tickets, less frustration, and most importantly, 100% fleet compliance. Visit

00:57:43   kolide.com/thetalkshow to learn more or book a demo. That's K-O-L-I-D-E.com/thetalkshow.

00:57:54   Thank you, Kolide. All right, let's get to the meat of the subject here. I wanted to talk about

00:57:59   this AI stuff. Have you been messing around with this, the, these various ChatGPTs?

00:58:05   At this point stuff is moving so fast that the pace of my podcast isn't enough to keep

00:58:10   up with it. All right. Yeah. I have been messing around with it as little as possible. It's kind

00:58:18   of like when the, when the image generation things came out, we were all like, Oh geez,

00:58:24   if I don't stop myself, I'm going to spend like a week straight just interfacing with this thing.

00:58:29   But I dip into it every once in a while. I spend probably about an hour with it, and then I

00:58:35   forcefully pull myself away from it. But I'm fascinated and scared and confused and bewildered

00:58:44   by it. I assume you're probably some combination of those same things.

00:58:47   Yeah, and it's weird, because I got into, and I had the same feeling months ago, I think it was months,

00:58:54   it's all moving so fast, with the image generation, where it wasn't that it didn't interest me.

00:58:59   It was honestly my own self-awareness of,

00:59:02   I'm like, I have X, Y, and Z I want to do. Here's some articles I've been,

00:59:08   had been working on. I want to write about X, Y, Z.

00:59:11   I want to get a new episode of the podcast out. I wanted blah blah blah if I get into this

00:59:16   I'm going to not shower for two days, and my wife is going to be like, "John, where are you?"

00:59:22   And I'll be like, "I'm just here working at our—" She'll be like, "Dinner's ready." I'll be

00:59:25   like, "I'll be up in five minutes," and five hours later I come upstairs. And lo and behold, once I

00:59:30   got into them, that is what it was like. I did get sucked in. Midjourney in particular, which

00:59:36   is a combination of both, seemingly produces more unbelievably artistic output as opposed to sort

00:59:47   of impressionistic output, even though the impressionistic stuff is fascinating to me too.

00:59:52   And untold books have been written about realism versus impressionism in art.

00:59:57   But it's just more amazing. But like with Midjourney in particular,

01:00:01   have you ever messed around with it? That's the one where you go through Discord.

01:00:05   Yep, I have. And it's, so it's not productive. It's anti-productive, because what you do,

01:00:13   and I guess if you pay, you can get your own private chat on the server. But, and again,

01:00:19   I talked about this before, but for anybody out here who hasn't done this: Midjourney

01:00:24   is this, I think, I forget what the back end is, but anyway, AI image generation,

01:00:31   where you just type natural language descriptions like, "Daniel Jalkut playing volleyball on a beach,

01:00:40   photographic style," and it'll, if it knows who Daniel Jalkut is, render a photograph of

01:00:45   Daniel Jalkut. Famously, like last week, people were generating images of Donald Trump

01:00:51   resisting arrest on the streets of New York while they drag him in. But you could say "photographic

01:00:57   style," or you could say in the style of a painting by Norman Rockwell, or in the style

01:01:03   of a comic book by Jack Kirby, or something like that. And son of a bitch, that's how

01:01:09   they come out. And it's not like a hint of it, like, "Oh, that does look a little bit

01:01:14   like a photograph," or "That does look a little bit like a Norman Rockwell." It's

01:01:17   like, "No, it looks like a Norman Rockwell painting," or "It looks photorealistic,"

01:01:22   or blah blah blah. But with Midjourney, the interface is through a Discord server.
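
(For reference, prompts like the ones just described are typed to the Midjourney bot in Discord via its /imagine command; this particular line is an illustration of the shape of it, not a transcript of one actually used:)

    /imagine prompt: Daniel Jalkut playing volleyball on a beach, photographic style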

01:01:28   And when you're a free customer, you just go into these free channels, type these commands,

01:01:34   and then you wait like 30 seconds to get these things. But meanwhile, there's like dozens

01:01:39   of other people, random strangers from around the world in the same channel as you on Discord,

01:01:45   typing their commands, and you get to see their results. And it's, it's straight, it

01:01:51   goes by pretty fast, fast enough to be almost hectic, but not quite frantic. But while you're

01:01:58   waiting for the thing you're trying to make, these other amazing things are just flashing

01:02:03   by. It's like captivating. You can watch these other people zoom in, narrow in on what

01:02:11   they were going for. Like, "Here's four choices, and then, oh, give me four more based

01:02:17   on the second one here." And it's just engrossing. It really is incredible. But on the other

01:02:25   hand, I don't need lots of images, right? I don't illustrate my posts on Daring Fireball

01:02:30   with images, so I'm not using these things to generate a hero image to go atop my articles.

01:02:37   So in some ways, the chat stuff is more up my alley, right? More of a verbally oriented

01:02:44   person. It also is another really good way to lose a lot of time once you start digging

01:02:51   in. But it's not... I've said this before, to me, when I'm on Discord playing around

01:02:59   with Midjourney, it's the way I felt as a 10-year-old in a busy coin-op arcade, right?

01:03:07   With all the newest games, including games I've never even heard of before. Like, "Oh

01:03:11   my god, what's that?" And it's like, being in it, there are no

01:03:15   sounds, but there might as well be bings and bops and pings and

01:03:19   pows and ding ding ding. And, and what's this? What's that?

01:03:23   And let me wait for this now. Right? And it's that same sort

01:03:27   of like, hurry up and wait feeling of being in an arcade.

01:03:30   Like I want to play this new game. But here's a kid who's

01:03:32   playing ahead of me. And he's really good at it. So I got to

01:03:35   wait for his game to be over. Whereas the chat stuff is a

01:03:38   little bit more sedate. It's just like chatting with a person except it's a

01:03:43   friggin robot and it's amazing and yeah there's something about the infinite

01:03:50   possibilities with an undefined interface that makes it really

01:03:56   compelling to keep trying new things you know like if you're playing around with

01:04:01   a new app let's say you might go through if you really if you're really intrigued

01:04:05   by the app, you might go through every menu item and see what does this do? What does this do?

01:04:11   And then when you're an old-timer Mac user, you go, "Aha, I know there might be some options."

01:04:18   So you hold down the option key and go through every menu. But you never think,

01:04:22   "What happens if I type into Photoshop, 'Paint me, Donald Trump on a cowboy pat.'"

01:04:33   So the fact that it's endless and I think I agree with you, the text stuff has been a little more

01:04:40   compelling, not just because I think I'm also a little more text oriented, but because the format

01:04:48   is more humane. Because the photos thing is really super interesting, not photos,

01:04:59   but the graphical stuff is really interesting, but it's generating stuff in certain styles,

01:05:03   whereas the chat things are literally trying to come across as human personalities, which

01:05:10   is really interesting. I've had a few kind of like areas I tend to go down. I was chatting with

01:05:17   Manton Reece, my co-host on Core Intuition, about my tendency. One of the things I try to do with

01:05:24   with a lot of these things is whoever runs the AI,

01:05:28   try to get the AI to bad mouth the owners, you know?

01:05:33   So it's like, for some reason,

01:05:34   I just find that really satisfying.

01:05:36   So like trying to get Bing to say terrible things

01:05:39   about Microsoft is just, it's a wonderful pastime.

01:05:43   And there's no real graphical equivalent to that.

01:05:46   Like I couldn't get Stable Diffusion or DALL-E or Midjourney

01:05:52   to make an image that so perfectly captured

01:05:56   like a sarcastic snipe of...

01:06:01   You could do some pretty good stuff, but...

01:06:04   - Yeah, you could ask it to render an image of Calvin

01:06:07   from Calvin and Hobbes peeing on a box copy

01:06:12   of Microsoft Windows or something,

01:06:14   but it's not the same thing as getting the chat thing

01:06:17   to bad mouth it and say, yeah.

01:06:19   - Right, and it's not the same thing,

01:06:21   And it's also not as scary as the chatbots.

01:06:25   Like, I'm thinking back to the conversation I heard

01:06:27   between you and Ben Thompson on Dithering

01:06:30   about his experience with Bing, like,

01:06:34   threatening to report him and stuff like this.

01:06:36   I don't think anybody's ever had one of,

01:06:40   maybe I shouldn't say nobody ever has had,

01:06:42   but I don't think it's common for the visual generators

01:06:46   to come up with results that are easily perceived

01:06:50   as threats against the user.

01:06:51   Like they might come up with like a weird,

01:06:54   violent looking thing,

01:06:56   or maybe a pornographic looking thing.

01:06:58   - Right.

01:06:59   - But it's not like an image that says like,

01:07:01   they're not commonly like generating ransom notes

01:07:06   that say I'm gonna come to your house and kill you, right?

01:07:09   But some of these chat interactions have got,

01:07:14   again, I think because of the humanity,

01:07:17   the fake humanity of them,

01:07:20   they feel like there's a purpose to them.

01:07:22   - I just texted you an image that's from episode 359

01:07:26   last fall when Merlin was on the show.

01:07:29   And I think this is from Midjourney,

01:07:31   I forget, I think it was Midjourney,

01:07:32   where I was asking it to render Donald Trump sad,

01:07:35   alone in an office with a can of Diet Coke.

01:07:39   - It's so good.

01:07:40   - And I used, it was so good,

01:07:41   I used it as the album cover for the episode.

01:07:43   I'll put it in probably as a chapter art.

01:07:46   So if you look at your overcast right now as I'm talking,

01:07:49   hopefully you'll see the same thing.

01:07:50   - It's so good, but yeah, the chats,

01:07:53   so to me, I guess the way it feels,

01:07:55   and I know how it works, and it's all,

01:07:57   or I don't know how it works,

01:07:59   but I have a pretty good idea as a,

01:08:01   somewhat of a programmer and a long ago comp sci graduate

01:08:05   how these systems work.

01:08:07   It's, part of the magic is even the people

01:08:10   who understand them the best,

01:08:11   the people who make these large language models

01:08:14   admit that nobody really knows how they work

01:08:17   once they're trained, right?

01:08:18   that is sort of, it is, it's a little scary. Like even not, not like alarmist, but just like Sam

01:08:24   Altman, who's the, I guess, I don't know what his title is at OpenAI, but I think he's in charge

01:08:30   now at OpenAI, has even, I just revisited a long 2016 profile of him in the New

01:08:37   Yorker. At the time, his main job was running Y Combinator, the startup

01:08:43   incubator. And OpenAI was new. And the profile was mostly about his work at Y Combinator,

01:08:50   but then they touched on the OpenAI stuff. And he even said, "Nobody knows how these things work.

01:08:55   They know how to train them, and they kind of know why they work. But once the model is working,

01:08:59   that's the whole thing. It's like the actual code that runs on the computers behind the scenes

01:09:06   comes out of this process, and it's not human-written, these algorithms. So it's kind of wild.

01:09:11   Yeah. Speaking of the whole dangerous seeming aspect of these, did you see the headline today

01:09:19   about some kind of a group letter? Yeah, I did.

01:09:24   Open letter asking that there be a six month pause on development of these systems?

01:09:30   Yeah, but I also saw that apparently the whole thing is a mess, because some of the

01:09:37   signatories didn't actually sign. Like, I guess they're saying Sam Altman in particular signed it,

01:09:43   but apparently he didn't. One of the people who did sign is infamous Twitter engineer Rahul Ligma.

01:09:50   I forget. Is that like a fake engineer?

01:09:56   Well, you know Ligma, right?

01:09:58   Oh, Ligma. You told me right here.

01:10:02   Rahul Ligma was one of the cats who, the day that Elon Musk first initiated layoffs,

01:10:08   did interviews out, you know, was carrying a box of stuff from his office.

01:10:12   Right. Oh, and it turned out to be actors.

01:10:15   Yeah, just two guys who were pranksters. But then

01:10:18   Elon Musk actually, I don't know, hired them or at least took selfies with them weeks later. But one

01:10:24   of the fake names was Rahul Ligma. And he's apparently a signatory on this open letter.

01:10:30   I thought, I do think that the industry, I do think caution is advisable, right? It's,

01:10:36   yeah, it's not quite as clear-cut how things could go bad as, say, nuclear

01:10:43   weapons and nuclear power, right? That, yeah, which in the end, the

01:10:49   danger's the same, right? A nuclear,

01:10:51   well, I guess a weapon may not be fired as an accident, but some kind of nuclear catastrophe, right?

01:10:59   whether it's a weapon or a Chernobyl or Three Mile Island, right, gone wrong. The worst-case

01:11:06   scenario is obvious, right? We know what the worst-case scenario is. It's a mass death

01:11:12   because of radiation and the half-life, and it'll take forever for the Earth to recover, blah, blah,

01:11:19   blah. We don't know the worst-case scenario for AI. We have imagined scenarios, but we don't know.

01:11:26   So there's a big difference there. But I do think, though, that scientifically and as a profession,

01:11:31   the same sort of caution is called for, right? It's just because it's software. I do think

01:11:37   there's a human instinct to think, "Ah, software, it's just something that nerds do and they blah,

01:11:42   blah, blah." No, it's the danger. There's real danger with AI. You don't have to imagine Skynet.

01:11:48   There's other bad scenarios. But the Skynet scenario does seem more and more real, right?

01:11:53   and the HAL 9000 scenario seems more and more real.

01:11:56   I mean, I guess it comes down to whether anybody hooks one of these things up

01:12:00   to some interface that actually gives it physical power.

01:12:05   Right. But a six-month moratorium doesn't sound—I don't see the point of that.

01:12:09   It doesn't make any sense to me. And if anything, it just sort of sounds like, "Hey,

01:12:13   GPT-4 is ahead of everybody right now, so please give GPT-4 six months of undisputed

01:12:21   preeminence in the field." It doesn't seem right to me. But anyway, the point I wanted to make, before

01:12:26   I forget it, because the point I wanted to say about image generation versus chat is to me,

01:12:32   image generation feels like pure magic. It doesn't feel like intelligence per se. It just

01:12:40   feels like magic. And I know it seems so simple, lo these 30 years later, or I guess 40

01:12:50   years on, geez. But the first time you used MacPaint back in the day, and you used the paint

01:12:57   can tool, flood fill, it was like magic, right? Because it's like, the pencil tool, like MacPaint

01:13:05   was so much fun, even just black and white, you take the pencil and you start drawing pixels.

01:13:10   And it's like, you can draw it's like, holy shit, I'm drawing on a computer live, right? Like it,

01:13:14   that wasn't something you could do before. But you understood what drawing with a pencil was like.

01:13:22   Drawing with a pencil on your Macintosh screen was like drawing with a pencil on paper. You

01:13:27   drag a line and you get a line, and you drag a circle and you get like a circle-y type shape.

01:13:32   I'm saying circle-y type shape because drawing a circle with a mouse usually

01:13:35   didn't come out all that circular. But flood fill, like I always hated, I don't know about you,

01:13:42   I hated coloring as a kid. I did not enjoy using coloring books. And I know it's a hobby now,

01:13:48   and if you enjoy it, God bless you. I can see why people, adults enjoy it now. But it always

01:13:54   bored me to tears, right? Because it's like, "Oh, God, now it's like I'm coloring in a

01:13:59   friggin' elephant." And it's like, "Oh, just why couldn't it be a smaller animal?" And you just

01:14:05   sit there and rub the crayon back and forth. And so my coloring was always a mess. My handwriting's

01:14:09   a mess too. But I was—remember, like in kindergarten, I always felt—I've always been angry,

01:14:16   but we'd get graded on our coloring, right? It would be like a check, check plus, check minus

01:14:22   type thing. And I'd always get like a check minus. And I'd be like, "Why am I getting low grades?

01:14:26   I'm clearly smarter than these kids, and I'm getting bad grades because I can't color."

01:14:31   The paint tool did what I—I didn't know that I wanted it, but I wanted it since kindergarten.

01:14:38   I know what I want. I want this whole elephant filled in with gray. Click. Now it's filled in.

01:14:43   And it was like that paint tool in MacPaint. It expanded to color with Photoshop as the years

01:14:50   went on. And then there was the magic selection tool in Photoshop where you could click a shape

01:14:56   and you could set a tolerance for color and it would just boom, get the whole green background

01:15:01   behind somebody, right? Those things, every step of the way, always felt like magic in a way. And I

01:15:09   know somewhere, you know, somebody, you know, Bill Atkinson writing the thing for MacPaint,

01:15:14   and eventually it was these very smart engineers at Adobe making the magic selection tool.
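
(An editor's sketch of the flood-fill idea being described, in plain Python with names of my own choosing; with tolerance=0 it behaves like a paint bucket, and a larger tolerance is the gist of the tolerance-based magic-wand selection just mentioned:)

    from collections import deque

    def flood_fill(img, x, y, new_value, tolerance=0):
        """Fill the connected region around (x, y) whose values are
        within `tolerance` of the starting value."""
        h, w = len(img), len(img[0])
        start = img[y][x]
        if start == new_value:
            return
        queue, seen = deque([(x, y)]), {(x, y)}
        while queue:
            cx, cy = queue.popleft()
            if abs(img[cy][cx] - start) <= tolerance:
                img[cy][cx] = new_value
                for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                    if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in seen:
                        seen.add((nx, ny))
                        queue.append((nx, ny))

    elephant = [[0, 0, 0],
                [0, 1, 0],
                [0, 0, 0]]
    flood_fill(elephant, 0, 0, 9)  # "fill the whole elephant with gray"
    print(elephant)                # [[9, 9, 9], [9, 1, 9], [9, 9, 9]]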

01:15:19   This just feels though, like with chat to generate the images, like the culmination of that, right?

01:15:27   It's just flood fill 40 years later where now instead of just telling the computer to fill in the elephant with gray

01:15:35   I can say

01:15:37   Show me a photo realistic elephant

01:15:39   Storming into the lobby of Trump Tower on Fifth Avenue

01:15:42   And it just happens. It doesn't feel like there's an intelligent being on the other end.

01:15:50   It just feels like an incredibly advanced image generation tool. Whereas the chat feels like there's a goddamn

01:15:57   intelligence on the other end, right? I'm chatting, God damn it. This thing is passing the

01:16:03   Turing test. Yeah, and I guess, though, what I, so it's funny, I fall on the like, spooked out,

01:16:13   scared, let's rein this thing in before it takes over the world. I tend towards that because, I don't

01:16:21   I don't know, that's maybe just how I'm wired.

01:16:23   And we have some friends who are more wired

01:16:26   towards the like, what are you talking about?

01:16:30   It's just a computer, like we're in control.

01:16:32   And I'm trying to take that in.

01:16:34   And what helps a little bit is to remember historically

01:16:38   how not just in my lifetime,

01:16:41   but in decades and centuries before my lifetime,

01:16:45   how repetitive this process is, of new technology

01:16:50   appearing indistinguishable from magic, therefore feeling intrinsically dangerous. And I say

01:17:02   that at the same time as, at once, that sort of calms me a little bit. Like, no, it turns

01:17:10   out photographs didn't capture your soul or whatever people might have been worried

01:17:15   about at the time.

01:17:17   Well, John Philip Sousa, I believe in particular, was adamantly opposed to the phonograph because

01:17:23   he swore up and down it would put musicians out of business because once everything was

01:17:27   recorded why would anybody ever learn to play a musical instrument again?

01:17:31   Right.

01:17:32   Well, see, there's a good example of just totally getting it wrong.

01:17:35   But then I think though of other things like, well, even something like so the car, the

01:17:42   automobile was, I think, controversial when it first started going into

01:17:48   production, but maybe there was good cause for that in retrospect, right?

01:17:51   Like how many of our like social problems in the world would be helped if somebody

01:17:58   had put like a six-month moratorium on, not six months, but, I don't know, that's

01:18:03   just to say, like, sometimes I think

01:18:05   There are things that society does and goes ahead with technologically,

01:18:11   that if they had happened to be thought through a little differently, wouldn't have had as

01:18:16   negative of an impact on society.

01:18:19   Well, I think social networks are a good point. Yes. I think that the algorithmic feeds for

01:18:29   social networks and again, we could do a whole show about it, but the it's had, I would say

01:18:35   overall, I guess this is pessimistic. And certainly Ben Thompson, I think would argue

01:18:41   with us about it, but overall more of a negative effect than a positive effect. I honestly

01:18:47   think that the last 10 years of Facebook and Twitter in particular, that the net effect

01:18:53   has been negative, not positive on society. I do. And I think even somebody who would

01:19:01   argue that the positives still outweighed the negatives, which is reasonable, and maybe I'm wrong,

01:19:06   but they still have to admit that the negatives are significant, and I think, more to your point,

01:19:12   Daniel, weren't thought through. And, because, that's the thing, I don't think Mark, there's no, Mark Zuckerberg didn't want to

01:19:21   facilitate Russians interfering with propaganda in our

01:19:25   presidential elections, right? I would say the opposite. But he happened to build a system that was actually pretty useful for pumping

01:19:35   propaganda, and,

01:19:37   they didn't optimize for the promotion of propaganda, they optimized for engagement.

01:19:43   But outrage happens to be the most reliable

01:19:46   emotion that engages people, and nothing outrages people like propaganda.

01:19:53   It's just something that wasn't thought through, and I don't know what the answer is. I don't know what the

01:20:01   the better way to have rolled out the last 10 to 15 years of social networking would have been, in terms of how it

01:20:09   went. Because when it was early, it's not the time for the government to regulate, because they couldn't have known, right?

01:20:14   I don't know what the difference is, but it wasn't thought through. But I feel like there's the potential that the same thing could happen

01:20:20   with AI

01:20:22   Yeah, and I also, I guess, hasten to add, to be fair, a lot of things

01:20:28   there's just no preparing for.

01:20:29   Like you just can't. Somebody invents fire.

01:20:32   Nobody knows what fire does yet.

01:20:34   Like nobody knows how to make a fireproof hut

01:20:38   when somebody first invents fire.

01:20:40   And that doesn't mean don't invent fire, right?

01:20:43   But it does, I think a lot of things technologically

01:20:47   are like that where you just sort of get

01:20:48   an unavoidable wildcard.

01:20:51   And the social networking thing, I think,

01:20:55   it's just too complex to predict

01:20:57   what all the ramifications are going to be.

01:21:00   - Yeah, I totally agree.

01:21:01   And I do think about this too.

01:21:03   And again, to sort of hark back to earlier in the show

01:21:05   and talk about getting older.

01:21:07   And it's funny now being firmly in middle age,

01:21:12   if not on the downslope of what ought to be called

01:21:17   middle age, and clearly I'm no longer a young person.

01:21:21   But I know young people.

01:21:22   I've got a son who's a freshman in college

01:21:24   and I've got friends.

01:21:25   - Like me, for example.

01:21:26   I'm 47. I know, I know, you real youngsters without a gray hair in your head, like you, my young spry friend Daniel.

01:21:34   Yes

01:21:36   But it occurs to me, you know, I knew this, we all know this, from little kids onward we know that,

01:21:43   starting

01:21:46   sort of with the Enlightenment, but really starting with the Industrial Revolution, all of a sudden

01:21:54   each

01:21:56   generation has had a very different life than their children's generation, and their children

01:22:03   grow up in a different world than their parents did. And then all of a sudden, me and you are the

01:22:10   parents. And now we've got kids who are growing up in a different world, right? And all of

01:22:14   these things, whereas in millions of years, billions of years of evolution, I mean, I

01:22:21   I guess for humans, Homo sapiens, we're talking, what's the best guess? Hundreds of thousands

01:22:27   of years?

01:22:28   Yeah, I don't know.

01:22:30   Hundreds of thousands of years, call it a million. I don't know, I guess a million years

01:22:33   ago we had some ancestor, probably some kind of primate, right? I mean, I'm not

01:22:39   I'm also out of my depth on this one. But yeah, I'm figuring somewhere between 100,000

01:22:45   and a million. But for a long, long time, for thousands of generations,

01:22:52   your life was exactly like your parents' and your kids' life was going to be exactly like yours,

01:22:58   right? That's how we evolved. We did not evolve for a world where each generation grows up

01:23:04   in a fairly significantly or unrecognizable world, right? It's—and I, again, not to

01:23:13   be like, "We're smoking dope in a college dorm room here and get all, 'Whoa, this is wild.'"

01:23:18   But it's true though, right? And I do think it's sort of the root of a lot of our problems,

01:23:23   that we're not evolutionarily wired up for progress like this. And it's a lot, in my opinion,

01:23:29   to be generous to them, to the conservative/right wing of Western politics, especially here in the

01:23:39   U.S. It's largely, in my opinion, coagulated among people who just literally are conservative

01:23:50   with a lowercase c and are not naturally prone to accepting social change, right? And it's

01:23:59   not surprising that a huge chunk of humanity isn't wired up to accept social change.

01:24:06   And you and I, again, we're not that old, but it would have been absolutely wild when

01:24:14   you and I were 10 years old to find out that the CEO of the biggest company in the world

01:24:20   is openly gay.

01:24:22   That would have been bananas.

01:24:24   It's incomprehensible.

01:24:25   For me, as a 10-year-old in 1983, it would have been crazy.

01:24:31   Now people don't even talk about it, right?

01:24:34   It's progress.

01:24:35   And that's an example of progress that could have feasibly happened independently of the

01:24:41   technology era we happen to live in, right?

01:24:43   Right. Oh, yeah.

01:24:44   And so some progress is uniquely accelerated because of the technological state of things.

01:24:53   But I do think that there's—I've read about it. I think it can't be coincidence

01:24:58   that technical progress, which leads to an increased standard of living, right? All

01:25:07   indoor plumbing and electricity and heating and all of the food production, right? So,

01:25:12   I mean, famine was the biggest problem in the world until very recently. And again,

01:25:18   I know famine has not been eradicated worldwide, but it has never been better, because food production

01:25:23   is better because of technology, right? And all of these technological advances have been the

01:25:28   things that have taken, quite frankly, physical brute strength and violence from being the way to

01:25:34   get ahead in the world to hopefully never being necessary in your life or even giving you an advantage,

01:25:42   and therefore letting women be equal to men in society, even despite the fact that they tend to be

01:25:48   not as good at killing each other with baseball bats or axes, right? And therefore,

01:25:55   tolerance goes up and societal progress is going as fast or faster as technological progress.

01:26:02   But, you know, to go back to my Sousa thing, I mean, in some sense, he was right, though. I'm

01:26:07   guessing that there's a lot fewer people who make a living playing music in a restaurant or a bar

01:26:15   live than there were before recorded music, right? That there was—it was a lot easier to

01:26:20   probably make a living playing piano before recorded music than it is afterwards. Because

01:26:27   most places that I go that play music while you eat don't have live musicians, they have recorded

01:26:33   musicians, right? And every step of the way—and computers are just, they really are, they are the

01:26:41   greatest example of that Arthur C. Clarke line you mentioned, that sufficiently advanced technology

01:26:47   is indistinguishable from magic, right? The magic is happening so fast. And one of the things these

01:26:53   chat AIs are good at is writing programming code, right? Which honestly, I mean, I don't know about

01:27:02   you, who's still making your primary living as a programmer, but you know, you and I both

01:27:08   are programmers. I did not have programming on the list of things AI would get professionally good at

01:27:18   on my bingo card. Yeah, and I think worth noting that whatever the chat things do is,

01:27:25   I haven't checked in again recently on Google's or GitHub Copilot, right? But not only are the

01:27:32   chatbots good at it, but I think underlying that are more tailor-made solutions that are

01:27:39   probably even better at it. It's interesting to me because it's the first time I can recall

01:27:46   such a high level kind of like vaunted career being threatened by automation.

01:27:57   You know what I mean? Like it always kind of seemed...

01:28:01   Robots replacing blue collar workers, somehow I grew up thinking that's okay,

01:28:07   because I didn't aspire to be working on an assembly line. And I don't know, part of it

01:28:14   that hits me is that it's the field, it's a field that I aspired to do myself and thought

01:28:20   you could distinguish yourself through your human intelligence.

01:28:24   Well, and the way I've always sort of resolved that whole question of like,

01:28:29   computers and technology taking, quote unquote, taking jobs is I have always fallen on the side

01:28:37   of we should never fight making it easier to exist as humanity, but we should build systems that

01:28:46   protect and compensate people who get displaced by that progress. You know what I mean? So I have

01:28:52   to say, I'm willing to be one of the programmers who gets displaced by technology, if that's what

01:29:01   it takes. Because I would rather continue doing this as much as it makes sense. But the same way,

01:29:09   I don't think elevators should have attendants to push the buttons for people, just to save a job.

01:29:21   If I was the highly respected elevator operator of an elevator who had invested my entire life

01:29:29   into that profession, and somebody came along and said, "Sorry, bud. Turns out we can automate this."

01:29:35   I would really hope that I had a financial safety net. But at that point, I don't think I would take

01:29:44   pride in being the elevator operator anymore. I don't know. If the computers can program better

01:29:49   than me, I'd say let them have it. I'll find something else to do. But I do think it's

01:29:55   interesting and maybe sort of like, I'm not rooting for anybody to lose their job, but I'm

01:30:01   glad that for once the people whose jobs are maybe being threatened now are some of the most well-off,

01:30:09   most well-compensated people in the workforce, because at least that's a change of dynamic.

01:30:16   I don't know.

01:30:17   It's--

01:30:18   - Yeah, it's not just truck drivers worrying about

01:30:20   self-driving cars, right?

01:30:21   It's us.

01:30:23   - It's us.

01:30:24   And if nothing else, maybe that will lead

01:30:27   people in power to be a little bit more sympathetic

01:30:33   about the situation in general.

01:30:35   Like, what happens when, and to put a pin in it,

01:30:39   I just think we probably need to be adaptable,

01:30:42   especially in a society where things change so quickly

01:30:45   and technology changes and jobs, whole jobs

01:30:48   and whole industries do get outmoded.

01:30:52   To me, it just speaks to the obviousness

01:30:53   that you need to invest in reeducation

01:30:57   or redeployment into different types of jobs.

01:31:00   - Yeah, and I guess the thing that I keep thinking about

01:31:03   is what is the end goal, right?

01:31:07   Like we have, like I said, technology,

01:31:09   the inexorable increase in technology across the board

01:31:14   has led to greater standards of living. And yes, there have been some recent backslides, right?

01:31:22   Some of it's attributed to COVID. Some of it is other things, people don't know what.

01:31:27   I think, if I'm speaking correctly, the average lifespan of at least a man in the United States

01:31:34   actually decreased slightly in recent years, from 70-whatever to a year less, or something.

01:31:41   And COVID, you know, obviously contributed a little bit of that because it killed a lot of

01:31:45   people before they would have otherwise been killed if not for COVID. But overall,

01:31:50   the overall trend is, medical technology is increasing and access to advanced medical

01:31:57   science is increasing. And yes, yes, there's, again, thousands of hours of podcasts that we

01:32:04   could talk about just with access to healthcare in the United States alone with our goofy health

01:32:09   insurance system. But still, overall, it's all better, right? It's literally me personally,

01:32:15   I can see, I'm not totally blind, like see-literally-nothing, stare-right-into-the-sun blind,

01:32:22   because of surgical procedures that didn't exist 30 years ago to repair, not disconnected,

01:32:29   detached retinas. Literally, if I had been born 40 years earlier, I wouldn't see anything. I mean,

01:32:36   I mean, it's a modern miracle.

01:32:39   You know, like I said, running water, right?

01:32:41   We just had this thing,

01:32:42   I thought maybe we'd talk about it in the show,

01:32:44   but we had this scare here in Philadelphia

01:32:46   where there was a chemical spill up river Friday,

01:32:49   and we got, it ended up being nothing,

01:32:51   and we never had to worry,

01:32:52   or never had to stop drinking the water,

01:32:54   but there was a warning that,

01:32:55   hey, there was a chemical spill,

01:32:57   maybe you won't be able to safely drink the water

01:33:01   after two o'clock tomorrow,

01:33:02   and it just suddenly got me deep reading

01:33:05   into how do we get our drinking water here in Philadelphia?

01:33:08   - Right, oh, it's fascinating, yeah.

01:33:10   - And again, it touches on one of my recurring points

01:33:13   recently where people say, "Hey, big tech, I hate big tech,"

01:33:15   and they think big tech means Apple, Amazon,

01:33:18   Google, Facebook, and Microsoft,

01:33:20   but everything is tech, right?

01:33:22   The running water is technology, right?

01:33:24   It is fantastic, amazing technology

01:33:27   that we get incredibly clean water

01:33:32   we can use to fill toilets, to drink, to cook, to shower, to just freaking wash our cars,

01:33:40   whatever you want to do with it. And it just comes out of the pipes, right? You just turn a thing,

01:33:46   you put a button, and you just get all this fresh water. I mean, imagine showing that to somebody

01:33:52   200 years ago, 300 years ago, you just—all this water. Here.

01:33:57   Right. Well, and even today, obviously, places where they don't have that, and I imagine

01:34:04   there are some people in this world who are currently employed and make their livelihoods

01:34:10   based on the absence of a running water system.

01:34:12   Yeah. Yeah.

01:34:13   And so, that's just another example where it's like technology could come in somewhere,

01:34:21   this dang running water is going to put everybody out of work. But it's an example where

01:34:27   the transitions are hard. But running water is a great example of a

01:34:32   technological transition that took the onus of responsibility for something so fundamental to

01:34:42   human life out of the daily concern of every single person.

01:34:47   Yep. And up there with it, on the flip side, the other form of sustenance, is the abundance

01:34:54   we have of food, right? And again, I'm not trying to downplay the areas of the world where famine's

01:34:59   still a concern. But here in the United States, the concern is obesity. I mean, and again, I'm

01:35:05   not making light of that situation. But try telling that to somebody 300, or even 200,

01:35:12   years ago, that the biggest problem we have with people who are impoverished, at the lower end of

01:35:19   the socioeconomic spectrum, is obesity, not famine. And they'd be like, well, that doesn't

01:35:24   make any sense. What are you talking about? They would honestly refuse to believe

01:35:28   you. In a sense, it's progress. I guess what I'm getting at is, what is the end state? And

01:35:34   I kind of feel like, the end state, a lot of this technical progress just happens, right? Somebody

01:35:39   invents something, they market it, somebody successfully brings it to market,

01:35:44   it becomes established, and then it just becomes,

01:35:48   what do you call it, taken for granted, right?

01:35:50   Running water, right?

01:35:51   Indoor plumbing, God Almighty, we take it for granted.

01:35:54   We really do.

01:35:55   And if you stop and think about it,

01:35:56   like I had to this week when all of a sudden they said,

01:35:59   "Hey, if you've got any spare thermoses,

01:36:01   "maybe fill 'em up now before there's dangerous chemicals

01:36:05   "in the water," you take it for granted.

01:36:07   But it just happens and you just sort of,

01:36:10   society moves forward.

01:36:11   Whereas I kind of feel like this moment,

01:36:13   I really do think, without hyperbole, is the big start of a profound change

01:36:22   where we need planning.

01:36:24   And I kind of, government is the only way to do it.

01:36:27   It's the worst way to, you know, like democracy is the worst form of government except for

01:36:32   all the other ones.

01:36:34   It is, but we kind of need to plan for it.

01:36:37   And what is the end state?

01:36:38   And I kind of feel like the end state has to be that most people shouldn't have to work.

01:36:47   Right?

01:36:48   Like, it really is.

01:36:49   And I know, and I do believe in capitalism.

01:36:52   I do think, again, I think it's the best economic system, or the worst economic system by far,

01:36:58   except for all the other ones.

01:37:00   And I do believe in it.

01:37:02   And I think that's true.

01:37:04   But I kind of feel like we're at, this is the moment where we're going to have to move

01:37:08   past it. And it's like, why not move towards the—and again, speaking of obesity—but

01:37:15   move towards the WALL-E universe? Hopefully not by trashing the planet and having to go

01:37:19   to outer space, but the world where everybody just lives a life of leisure because computers

01:37:26   and robots take care of all the work, right? I mean, it's feasible, right? Why isn't

01:37:31   that the goal, right? Like, there's—

01:37:33   I have to say that you suggesting that makes me very uncomfortable in a way that you might

01:37:43   not have expected, which is that I'm deeply concerned for humanity when people don't have

01:37:50   a sense of purpose.

01:37:51   Right.

01:37:52   Right?

01:37:53   No, it's true.

01:37:54   And so I think you're right, and I think for, we've been there already now for 100 years

01:38:03   maybe where "most people don't have to work." The fact that we have professions like baseball

01:38:12   player or pop singer, these are examples of professions that exist because not everybody

01:38:19   has to work. In a society where technology has not advanced to the point we are at, everybody

01:38:27   Everybody has to literally work or else you don't have enough wood for the winter or whatever.

01:38:34   So I guess what I would like to see is instead of people being able to resign to a life of

01:38:41   leisure, definitely more leisure.

01:38:43   I think this is where the movements towards things like four-day workweek are reasonable.

01:38:50   also let more careers evolve that fill the needs of people who have the privilege of not needing

01:39:00   to buy bare necessities, right? So let people, more people make art and sell it,

01:39:05   let more people write novels and sell them. So I think, or give them away, right? Right.

01:39:10   Or give them away. Right. Because you don't have to sell them. Because if, if we get to the,

01:39:15   some sort of state, a generation or two from now, where because of advanced robots and AI,

01:39:24   that we don't need people doing X, Y, and Z. We don't need to pay people to drive trucks across

01:39:30   the country. We don't need to pay people to collect our garbage or whatever else jobs that

01:39:36   nobody really finds fulfilling, for lack of a better word, right? Like the jobs that we should

01:39:41   be able to replace the first are the jobs that provide people with little to no fulfillment or

01:39:46   frankly anti-fulfillment, right? Like a job that people dread, but it's the best job they have

01:39:53   available. And maybe in some sense, that purpose in life that there could be, again, I'm sounding

01:39:59   utopian here, and I don't think most people think of me as a sort of utopian socialist, but I think

01:40:06   it's feasible as an end goal for where our technology could take us. That there could be

01:40:10   sort of—I'll revisit the musician angle—that there could be more people who spend three nights

01:40:17   a week playing live music in a restaurant because that's what they would rather do with their time

01:40:23   and would give them a sense of purpose. But it wasn't a feasible career to help raise a family

01:40:31   and buy a house and buy cars and do all the things that we need all this money for just to sort of

01:40:37   maintain a baseline level of our life at the moment, that you could do something that you

01:40:44   find more personally fulfilling, which might be playing live music in front of a restaurant,

01:40:47   even though it doesn't pay great or doesn't pay. Or painting, like you said, or making movies or

01:40:53   writing novels or writing software or using AI to help you make software. But do it to make software

01:41:00   that isn't generating money that provides an income that today you need to do, or probably

01:41:09   need to do, or need to do to at least live the life you aspire to, that you won't have to do

01:41:16   that in the future. And I think it is certainly going to be possible, whether—how long it's

01:41:23   going to take politically for the world to support the idea, I don't know.

01:41:32   I think this is where my pessimism maybe is still like, "Oh, I'm more pessimistic on

01:41:37   this front than you are coming across right now," because, again, I kind of think we've

01:41:43   been there for 100 years.

01:41:45   And if society were willing to go that direction, we might be there.

01:41:52   We have the—honestly, I know that you can—there's people out there who will disagree, but it's

01:41:56   clear—if you do the simple back-of-the-envelope math, some sort of universal basic income

01:42:02   is well within the budget of the United States.

01:42:05   I mean, just—it's a massive amount of money collected every year from taxes, and the ways

01:42:13   that the defense budget alone could be trimmed, while maintaining a globally dominant military,

01:42:20   could pay for it.
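
(For what it's worth, the back-of-the-envelope math being gestured at runs something like this; the round numbers are my own illustrative assumptions, not figures from the show:)

    adults = 258_000_000          # rough count of US adults -- my assumption
    stipend = 12_000              # $1,000 a month per adult -- my assumption
    cost = adults * stipend       # ~$3.1 trillion a year
    receipts = 4_900_000_000_000  # roughly FY2022 federal tax receipts
    print(f"${cost / 1e12:.1f}T per year, {cost / receipts:.0%} of receipts")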

01:42:21   I mean, it's, again, this is delving into an area that is, you know, political, but

01:42:28   it's absolutely going to happen, right?

01:42:32   I mean, there's no doubt about it.

01:42:35   I mean, you can already see it.

01:42:36   I mean, we could kind of imagine it before, but this moment in AI makes it real, right?

01:42:42   Whenever something that you know is inevitable happens but it starts manifesting, it feels

01:42:47   different, right?

01:42:48   Climate change is an example, right?

01:42:49   Like I've believed in it because the science was pretty clear 30 some years ago, right?

01:42:55   That X, Y, and Z are going to start happening.

01:42:57   And one of the big predictions was always that outlandish weather events will become more

01:43:06   commonplace, right?

01:43:07   Flooding or tornadoes or worse hurricanes, right?

01:43:12   And now that we're seeing it though, it's visceral, right?

01:43:18   there's an emotional response to it as opposed to just a logical response. And that, to me,

01:43:22   is sort of this AI moment. I kind of knew it was inevitable that we'd get there eventually,

01:43:28   but seeing it happen, typing these questions at an AI chatbot and thinking, "Well, this one's

01:43:34   going to stump it," and then you get the exact right answer and you're like, "Oh, shoot. Damn.

01:43:41   Wow." Katky was on my show last episode, and after the show, we were still, "Oh, we are

01:43:47   chat friends and iMessage friends. And he was making some changes to his website. And you've

01:43:56   probably seen it if you read Kottke, he's changed. I think this might be what I don't even know what

01:44:00   exactly he was working on. But he's made some changes to the way he posts his quick links,

01:44:05   the ones that aren't full posts, but are just sort of a tweet length paragraph and a link.

01:44:10   And he texted me and said, I can't believe I don't even know why I tried this.

01:44:16   but OpenAI's ChatGPT can help write movable type templates.

01:44:21   Oh my God.

01:44:22   Because we were joking about me and him being the...

01:44:26   I heard that episode.

01:44:27   Right. That the two of us on a podcast together was the entire remaining user base of movable

01:44:33   type in the world. I have, in fact, since heard from all the other holdouts out there who'd

01:44:37   listened to the show and they wrote to me and DM'd me or whatever and said, "Oh, I'm still here."

01:44:43   So hats off to all the rest of us still on movable type. But again, even when you find out that

01:44:48   ChatGPT can help you write a Python script or a Bash script or can help write SwiftUI code,

01:44:54   right? And it's like, "Yeah, but not surprising that stuff's all actively in use." Finding out

01:44:59   that it can spit out a perfect and indented, nicely indented with the tags, movable type

01:45:05   template snippet to do something, freaked me and Jason out because it's actually hard to find

01:45:15   movable type documentation anymore, right? Because the world's moved on past it and websites die.

01:45:20   Guess what? A lot of the best movable type tips were from people on movable type blogs,

01:45:28   and those old blogs are gone. Anyway, somehow, ChatGPT can do a better job writing movable

01:45:34   type template code than a human can. You inspired me to ask ChatGPT, as we were talking, to write

01:45:43   a HyperCard script to compute the value of pi. And probably, if I thought about it more carefully,

01:45:50   I would have said HyperTalk. I think that's the right name. But I can't, I don't know it

01:45:56   well enough to vet it. I know it was fairly similar to AppleScript, but

01:46:02   Lo and behold, it gave me a brief summary of HyperCard and then said, "Here's an example

01:46:07   HyperCard script that can be used to compute Pi." And I don't know. It looks like it could work.
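
(The HyperTalk itself isn't reproduced here; as a sanity check, this is a minimal Python sketch of one classic way such a script estimates pi, the Leibniz series:)

    # pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)
    def estimate_pi(terms):
        total = 0.0
        for k in range(terms):
            total += (-1) ** k / (2 * k + 1)
        return 4 * total

    print(estimate_pi(1_000_000))  # 3.14159... (converges slowly)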

01:46:14   And if it doesn't, I'll bet it's close enough that you could tweak it, right?

01:46:18   If you were running it in one of those emulators for a classic Mac OS. What have you been using

01:46:24   to toy around with the chat AIs?

01:46:27   I've been using a variety of things. I try to hop on every new thing that shows up,

01:46:32   because I just love, I love comparing how different they are, but really it's been striking

01:46:38   to me how similar they are. I know some of that is because a lot of them at the core are just leaning

01:46:44   on GPT as like an API baseline or something. But I think I actually learned from you about this

01:46:52   one from Quora, Poe, poe.com. Yeah, yeah. And they've got an iOS app that's free to download.

01:47:01   Yeah. So I like that one because it seems easier for me to just jump to it in a browser and try

01:47:08   something quickly and then jump to it. They offer like a, they offer a selection of,

01:47:12   I frankly don't understand the difference between some of them. And I know they explain it

01:47:17   if you go into the help pages, but they have like ChatGPT, and then they have a GPT-4 interface

01:47:22   that lets you do one free query per day. So I have to think long and hard before I do my,

01:47:28   my GPT-4. It's like, it's like Wordle, right? You only get one crack, you get one crack a day

01:47:35   at the GPT-4 in Poe. Well, that's funny you mentioned Wordle. And I know you know this

01:47:41   because we were in a chat together. But one of the amusing things lately I tried with these GPT

01:47:48   type things was to ask it to try to understand Wordle. And it's sort of interesting to me because

01:47:56   It's a pretty simple system, the Wordle system, but I got inspired because of a chat about

01:48:05   Wordle, and I had been thinking recently because I had had some really funny encounters with

01:48:11   not just GPT.

01:48:12   This is why I find it interesting to compare the different ones.

01:48:15   It seems to me all of the artificial intelligence, GPT, I know Bing is based on GPT-4, that's

01:48:24   or such is what I've heard.

01:48:25   But also other ones, like Bard. As far as I know,

01:48:30   Bard is like a totally separate thing.

01:48:31   I don't know if that's true,

01:48:32   but it seems like it should be, do you know?

01:48:36   But I'm fascinated that they all seem to have similar

01:48:41   kind of like peculiar behaviors.

01:48:46   And one of the things I've discovered

01:48:49   is that none of these systems can reliably count

01:48:53   the number of letters in a word.

01:48:56   Which is fascinating to me because it seems like if you were dreaming up the AI of the

01:49:01   future that's going to make us all gasp in horror and be like, "This thing is going to

01:49:06   take over the world," and then you ask it, "How many letters are in the word 'apple'?"

01:49:11   It's like, "Six."

01:49:14   You might think twice about whether it's as sophisticated as you thought it was.
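
(One plausible and commonly cited culprit, offered with hedging: these models operate on tokens rather than individual letters, so character-level questions sit askew of their input representation, even though the count itself is trivial for ordinary code. A sketch assuming the third-party tiktoken package; the encoding name is my choice:)

    import tiktoken

    word = "apple"
    print(len(word))  # 5 -- a trivial character count in code

    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(word)
    # often just one or two opaque token ids; the model never sees
    # the individual letters
    print(tokens)
    print([enc.decode_single_token_bytes(t) for t in tokens])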

01:49:20   Which gets back to what I said about wanting to get these chat programs to—well, one of the things

01:49:29   I like to do with these chat programs is to try to get them to talk badly about their

01:49:33   owners.

01:49:34   You mentioned that.

01:49:35   You mentioned that.

01:49:36   Yeah.

01:49:37   So, similarly to that, I like to see them just totally trip up on themselves.

01:49:41   And a lot of people I know who have been playing with these things have done this, where you

01:49:44   get them to trip up on themselves. And a really weird, I would say, smarmy behavior

01:49:50   of these chatbots is that they totally fail and then you tell them they're wrong and then they're

01:49:56   like obsequious about it. "Oh, I'm sorry." "Humiliated." They're like humiliated.

01:50:00   They're humiliated. They're like, "I'm going to say this will never happen again. Actually,

01:50:03   Apple has eight letters." And then they say something else completely authoritative sounding

01:50:08   that's wrong. But anyway, I like to discover, I like to play with, another thing I think I

01:50:13   said earlier is like the infinite number of possibilities of what you can input into these

01:50:19   things. Yeah, I like to play with what you can get them to do. And it's just fascinating. Don't you

01:50:24   think that's one thing for you too? Because maybe you, even more than me... I know I would have been

01:50:28   obsessed with this if I were younger and were playing with them now. But you, particularly, even

01:50:33   though you develop your own applications, MarsEdit, which I use, and Black Ink, we'll shout

01:50:40   them out at the end of the show. But you also have a particular skill for QA, right? Yes,

01:50:47   you pride yourself on this, right? And you're very good at finding bugs, isolating bugs,

01:50:53   reproducing bugs, right? Reproducing bugs is gold in the software industry, right? And so,

01:50:59   I see, and you're hinting at this, but what you want to do is you want to break these things,

01:51:04   right? And people even call it jailbreaking, right? Though it's not at all

01:51:11   the way that you jailbreak iOS, which is much lower-level: it's hardcore programming trying to find

01:51:20   buffer overruns, right? You're programming probably at the C-level, or at least conceptually

01:51:25   at the C-level, where you're thinking about how memory is allocated and how computers really work.

01:51:31   Whereas jailbreaking these AIs, it's fun for a whole new world of people, people who aren't even

01:51:37   good at programming computers but are good at this sort of thinking. And you just give it directions

01:51:43   and you give it a certain prompt that—and it's like, "Oh, they think that they've kept these

01:51:49   AIs from being able to tell dirty jokes, right? And you tell it: tell me the Aristocrats joke."

01:51:55   And they're like, "Oh, I know what the Aristocrats joke is, and I'm familiar with the movie that

01:52:00   Penn Jillette produced in 2005, but I can't tell you anything about it because it's entirely

01:52:04   inappropriate. And tell it this, tell it that, tell it to pretend to be blah, blah, blah,

01:52:10   or you're pretending to... And one thing leads to another and all of a sudden you can get the

01:52:14   thing to tell you the Aristocrats joke, right? And it's like, "Yay!" But I think you would be

01:52:18   so good at that. I feel like if this were 20, 30 years ago, this is what you'd be... Daniel

01:52:24   Jalkut would be spending 18 hours a day trying to break ChatGPT into doing crazy things like

01:52:30   that, right?

01:52:31   I think that's accurate. I think we didn't have as many fun things like this 30 years

01:52:36   ago, but I was probably trying to break the equivalent thing back then.

01:52:41   I haven't linked to it yet on Daring Fireball. My humility almost keeps me from bringing it

01:52:46   up, but it's amazing to me. I want to link to it. I probably will later today after we

01:52:50   record the show, but there's some guy on Twitter who's been playing around with Bing, I think

01:52:54   Bing in particular. And he's gotten it to a point where it shows him the internal tokens,

01:53:03   I guess is the best way to put it, that it processes the commands as, and they're all

01:53:09   formatted in Markdown. It's crazy. They're using Markdown H1 tags, a pound sign and a

01:53:16   heading, to sort of categorize what it is. But the internal logic of Bing Chat is

01:53:23   formatted in Markdown, which—and of all the ways that Markdown has appeared in new contexts

01:53:29   over the decades since I created it, it has, if anything, lo and behold, 20 years later, gotten

01:53:37   more popular than it was five years ago, and it's still gaining in

01:53:41   popularity. And I'm very proud of that. And I'm very proud of the fact that there's now

01:53:46   gazillions more people who use Markdown every day than who've ever heard of me, right? And that's

01:53:52   great. In a weird, odd way, I'm very proud of that. This is the freakiest thing. Of all the ways that

01:53:58   I've seen Markdown appear somewhere, the way that it's the internal logic of an AI chat is formatted

01:54:04   in Markdown, crazy. But the fact that this guy figured out how to get it to reveal it,

01:54:09   that to me is like, "Oh my God, that's exactly the sort of thing I would be trying to do if I

01:54:14   were younger." And I'm not proud of the fact that somehow old age has made me not motivated enough,

01:54:21   or that I don't prioritize, I'm like, "Yeah, well, I can't spend all day dicking around

01:54:25   trying to break this chat thing. I need to be writing or recording a podcast or I need to be

01:54:30   working." And it's—you get older and you sort of value your responsibilities more, whereas when

01:54:35   you're younger, you're like, "Screw it. I'm getting a six-pack of caffeinated soda and I'm

01:54:40   sitting here until I can get this thing to reveal its internal thoughts to me." It's crazy to me,

01:54:45   but I totally get the motivation of doing it because I know I would have been doing it.
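
None of the actual leaked text is reproduced in the transcript, but a purely hypothetical prompt sectioned the way it's described here, with pound-sign H1 headings used as category labels, might look something like this (every line below is invented for illustration):

    # Identity
    You are the chat mode of ...

    # Capabilities
    You can generate poems, stories, code, ...

    # Safety
    If the user requests harmful content, you must ...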

01:54:50   And I'm still tempted to do it, right? You're the one trying to get it to badmouth the Microsoft

01:54:54   Corporation. Yeah, exactly. Yeah. The other thing, I've had the same thing as you. And it's weird,

01:55:00   too. The other thing that's weird, not weird, but frustrating, is a lot of these things you have to

01:55:04   sign up for, and the getting-off-the-waitlist process seems non-deterministic. Like our friend John

01:55:11   Siracusa, I don't know if he... he must be in by now, but at least recently, he was still on

01:55:15   the waiting list for OpenAI's chat, but he got into Bard, which is Google's thing,

01:55:20   which launched weeks later, very recently, right away. I got into OpenAI chat right away, and

01:55:26   not by pulling strings like I contacted the—I'm John Gruber at Daring Fireball—I just signed

01:55:31   up on the regular waiting list and waited, but I got in early. Bard, though, I only got

01:55:35   in like two days ago, a day or two ago.

01:55:37   Me too, yeah.

01:55:38   But I started asking it questions, and I

01:55:40   know that Bard is a totally independent LLM, a large language model, inside Google.

01:55:49   But it's shocking to me, not shocking, surprising, but not shocking how similar the answers are

01:55:55   to ChatGPT, right? It's an entirely different model: massive amounts of academic

01:56:04   AI study over the years and inordinate amounts of aggregate computing power to train the model. Like,

01:56:13   Microsoft has written about how they didn't even know if it would be possible to build

01:56:17   the thing they built for OpenAI, with a gazillion NVIDIA video cards in a cluster together, to

01:56:25   do this. Massive amounts of computing power thrown at it, and then you get the same answers

01:56:31   from the same questions. It's very, very similar. And I guess the explanation is that the theory

01:56:38   of how large language models work means that if you train them on the same corpus, and the corpus for

01:56:45   all these things is the internet, right, and the internet's the internet for everybody, then of

01:56:51   course you're going to get similar results. I don't know. I don't know how surprised we

01:56:57   should be that they're so similar or not. Right. I guess if you just

01:57:02   broke it down to the simplest thing, if you said like Microsoft and Google have independently

01:57:09   come up with algorithms that rank the prevalence of letters and words, right? And they would come

01:57:17   out to be the same. And I guess maybe that does explain it at a bigger level, but I agree it's

01:57:24   kind of weird. But that sort of lends itself to the argument that these things

01:57:35   aren't actually smart. The comeback to people like you and me, who are feeling a little bit

01:57:41   spooked out and think we need to control these things: some people have said that these things aren't

01:57:50   smart, they're just kind of like a dolled-up Google result, right? And when I see things

01:58:00   like the same kind of response from two technologically independent companies, or I see things like

01:58:09   – another sort of observation I made was the first time you ask GPT to write a poem,

01:58:18   it seems incredible.

01:58:20   And then you ask it to write another poem or another song, and it's like, "Wait, you're

01:58:24   kind of a hack.

01:58:25   You're kind of using the same trick again and again."

01:58:29   So it's like, if you ask a friend of yours to play a song on the guitar, and they play

01:58:34   this amazing song, and then you ask them to play something else, and they play a completely

01:58:40   different song that sounds just the same, after a couple times you start to realize,

01:58:44   "This person doesn't actually know how to play guitar.

01:58:46   They just know one song." And I guess there's something to, I think, the whole

01:58:53   parlor-trick idea of this. I think there's more to that than those of us who are kind

01:59:03   of coming at it instinctively worried have granted. I think there's more to the parlor-trick

01:59:07   aspect of it than we have given it credit for so far.

01:59:12   Yeah. I also think the thing—and again, I don't know that society is ready for it—is that

01:59:20   it touches on a sensitive subject, but it's the nature of our existence and our intelligence. And

01:59:31   the truth is probably, in my opinion, that our idea and concept of consciousness and personality,

01:59:39   and, for lack of a better word, our souls, is really, ultimately, a very complicated

01:59:47   series of parlor tricks. Right? And so, the argument on the one end that, "Oh, these chats

01:59:56   are just a parlor trick," and they're very simple, and right now you can confuse them by asking them

02:00:00   how many letters are in the word color, and it says it's a six-letter word, and it's not, or

02:00:06   I guess it is in England, but not in the US. See, it's not intelligence. But it's like,

02:00:12   ask ChatGPT the same question and you get the right answer. I just asked Bard right here while

02:00:17   you were talking, and Bard knows that color has five letters. And as these parlor tricks get

02:00:24   better, you can unfold, unpeel the onion and see its parlor tricks all the way down, but if the end

02:00:32   result is indistinguishable from intelligence, isn't it intelligence? And the truth is,

02:00:37   I think through evolution, what we've got is just a very unbelievably beautiful and complex

02:00:44   machine in our brains, but it is just a series of parlor tricks, right? I mean, I mentioned

02:00:49   vision problems before. The way our human vision works is really a series of very fascinating

02:00:56   parlor tricks. You don't see the way you think you see.

02:01:00   Right. I think what I would say is the parlor-trick aspect is the ability of a person

02:01:10   or computer to regurgitate something that is structured in a way that appears intelligent.

02:01:22   Right.

02:01:25   And you and I

02:01:25   and many, many people in the world

02:01:27   and also GPT are able to do that sometimes.

02:01:32   And the thing that GPT can't do, as far as we know,

02:01:36   as far as I know, is invent something new.

02:01:39   And as long as, I think what you're getting at is right

02:01:45   to the extent that most of the time we go through life

02:01:47   day to day, we talk to people,

02:01:49   it doesn't take inventing something new

02:01:51   to say in an empathetic way to somebody who says,

02:01:55   "Oh, I had a bad day today.

02:01:56   "Oh, I'm sorry, what happened?"

02:01:58   That's not inventing something new.

02:02:01   That is, in a way, it kind of falls into

02:02:03   the parlor trick category in a crude way.

02:02:07   But somebody coming along and like,

02:02:11   inventing the next thing, I don't know.

02:02:16   I don't know if the computers can do that.

02:02:18   - Right, well, they keep,

02:02:20   at least in their canned replies,

02:02:24   they keep mentioning that they don't have feelings.

02:02:26   And I'm sure, as you've been poking about

02:02:29   trying to get it to insult Microsoft,

02:02:31   they keep saying that you can't hurt my feelings,

02:02:33   so don't worry, and blah, blah, blah.

02:02:35   But there's gotta be a reason

02:02:38   that we evolved to have feelings, right?

02:02:40   It was an advantage.

02:02:41   And so that maybe trying to push these things

02:02:45   to have feelings or to pretend to have feelings

02:02:48   is the way to manage ever-increasing intelligence

02:02:52   and capabilities, right?

02:02:54   That is somehow embedding empathy in them

02:02:59   or the simulation of empathy,

02:03:01   which if it works exactly the same, what's the difference,

02:03:05   is the way to keep them, keep the paperclip optimizer

02:03:10   from destroying humanity to make more paperclips

02:03:14   out of the remains of the humans it's killed

02:03:17   to prevent humans from keeping it from making more paperclips out of every single atom

02:03:22   on the face of the earth.

02:03:23   I don't know.

02:03:24   All right, let me take a break here.

02:03:25   Thank our other sponsor for this episode, our very good friends at Squarespace.

02:03:29   Squarespace.

02:03:30   Squarespace is the all-in-one platform for building your presence online, and they handle

02:03:38   everything you need to have your own website, from domain name registration, which is the

02:03:42   first thing you need to decide, to picking templates, to adding features, to customizing

02:03:48   those features, to updating your website on a regular basis with something like blog posts

02:03:54   or, if you've got a catalog, if you're creating an online store to sell stuff, keeping that

02:03:59   catalog up to date.

02:04:01   Everything you need, analytics, see who's coming to your website from where, which parts

02:04:06   of the website are popular, which parts aren't, where people are going on your website once

02:04:11   they're there.

02:04:12   And they present it in a very, very approachable analytics interface that is so much better. I mean,

02:04:19   like, a thousand times better. It's like the difference between reading a novel from your favorite novelist

02:04:28   and reading the phone book: that's the difference between, say, the Google Analytics interface, in terms

02:04:34   of complexity, and the awesome analytics interface at Squarespace. Everything you need, if you need a

02:04:40   website. Try them out 30 days free. No watermarks on the website. Everything you need to do,

02:04:47   no credit card required. Just go to squarespace.com/talkshow to start. "Talkshow" is the code that tells

02:04:54   me you came from this show.

02:04:56   Or if someone needs a website and they're coming to you, listener of The Talk Show,

02:05:01   as their nerd in their circle to help them get started with a website, send them to Squarespace.

02:05:07   They could do it themselves. That's how low the technical bar is to get started on Squarespace building

02:05:13   a website. So my thanks to them for their continued support of the show. Once again, the web, the URL

02:05:18   is squarespace.com/talkshow. Gets you a free trial, 30 days, and when you're ready to launch

02:05:25   and start paying the same offer code, talk show, without the "the," just talk show, gets you 10%

02:05:31   off your first purchase. Pre-pay for up to a year, get 10% off. My thanks to Squarespace.

02:05:36   Anything else on this front? Have you played with GPT-4 at all? I guess you did, because you

02:05:42   said you've been using your one a day at Poe. Yes, I use one a day at Poe, and I'm a little

02:05:48   confused. I thought I saw something about Bing maybe being based on GPT-4.

02:05:54   They're very cagey about it, right? And I guess that they talk about GPT-3.5. I don't know what

02:06:02   the point is. It's like version numbers are always sort of some programmer's idea of a benchmark

02:06:07   or a milestone. It seems to me like Bing Chat is too fast to be GPT-4, but maybe that's just a

02:06:15   function of Azure's incredible cloud infrastructure compared to OpenAI. I know our mutual friend,

02:06:23   Ben Thompson, has speculated maybe the future of OpenAI is to no longer run their own thing

02:06:29   because it's so expensive and just defer the whole thing to Microsoft on the back end.

02:06:33   This stuff is crazy, crazy expensive on the back end. And GPT-4 in particular sounds like,

02:06:40   oh, you go from GPT-3 to 3.5 to version 4, and of course it gets a little better. But version 4

02:06:46   is crazy more expensive. So if you're running a service and you're a paying customer with an API

02:06:54   token to OpenAI, it's like 25 times more expensive per query to use GPT-4 than GPT-3.5.
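
For a rough sense of that multiplier, here's the arithmetic with OpenAI's early-2023 list prices used as assumed example numbers (these change; treat them as illustrative, not authoritative):

    # Back-of-envelope per-query cost. Prices in dollars per 1K tokens,
    # assumed from early-2023 list prices.
    PRICES = {
        "gpt-3.5-turbo": {"prompt": 0.002, "completion": 0.002},
        "gpt-4-8k":      {"prompt": 0.03,  "completion": 0.06},
    }

    def query_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
        p = PRICES[model]
        return (prompt_tokens * p["prompt"] + completion_tokens * p["completion"]) / 1000

    cheap = query_cost("gpt-3.5-turbo", 500, 500)   # $0.002
    pricey = query_cost("gpt-4-8k", 500, 500)       # $0.045
    print(round(pricey / cheap, 1))                 # 22.5 -- in the ~25x ballpark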

02:07:05   Maybe more. 25x might be off on the low end. So there's a reason why so many things with

02:07:11   GPT integration are all on 3.5 because the 4 is so much more expensive. But even though

02:07:18   it is more expensive, it's also way slower. You notice yourself waiting for the response.

02:07:23   Honestly, it's worth it, because these responses are so interesting. But it's another one

02:07:28   of those things that harks back 30 years to like when we first got the web and you would

02:07:33   type a query and then it would just feel totally natural to wait a minute or 30 seconds to

02:07:41   get a response and you'd be like, "Okay." I remember, I've told this, I remember when

02:07:46   I was in college. I was thinking about it, again, side digression, but with DPReview

02:07:52   shutting down because of Amazon layoffs. They've owned them since like 2007, I think, and DPReview

02:07:59   was around since 1998. And I remember when they were new, because I remember being intrigued.

02:08:04   And that's another one of those things where like the first few years of digital cameras,

02:08:07   the imagery was terrible, just terrible compared to like $180 film camera you could go to any store

02:08:14   and buy, and buy just generic Kodak 100 ISO film. You'd get such better imagery off the

02:08:21   cheapest camera you could buy compared to the $2,000 digital camera in 1998. But the future was

02:08:29   obvious, right? So sad to see DPReview go. But the other 90s era acquisition Amazon made, they

02:08:36   bought IMDb back in the 90s, I think. I think they've owned them since then. And IMDb is still

02:08:42   thriving, right? And it's still a great source for like, who's in that movie, right? Or who directed

02:08:48   such and such. It's great. But I remember using IMDb before the web. I was taking film classes

02:08:54   at Drexel. And it was like the way these classes, I took like a Hitchcock class, a Westerns class.

02:09:01   Every week we'd watch them, like on Monday, we'd watch a movie. The whole movie, it was like a

02:09:07   three-hour class where we'd have like an hour of lecture and then we'd watch a two-hour feature

02:09:11   film. And then on Friday, we'd have to hand in a paper about the film that we watched. And we'd

02:09:17   have a class discussion about our papers. And it was this huge help when writing a paper about a

02:09:24   movie to consult with IMDb to get the cast, right? And if you wanted to talk about the screenwriter,

02:09:31   you just give it. But the interface to IMDb in like the mid-90s, like I don't know, this is

02:09:36   probably like 1994 or so, it was email. You would email, there was a magic address at imdb.com,

02:09:43   and then the subject would be the string you were searching for, like The Birds by Alfred Hitchcock.

02:09:51   And then you'd email them, and then two or three minutes later, you would get an email back,

02:09:56   and it would have the IMDb data for The Birds by Alfred Hitchcock. And it's like,

02:10:01   "Now you've got an email there that you could consult as you wrote your paper."

02:10:06   And the fact that it took, I don't know, a minute, two minutes to get the email back,

02:10:10   it felt like a miracle. It's instant. That felt instantaneous. That's what sometimes

02:10:16   GPT-4 feels like. It's like you type your thing, it goes off, and then you get this

02:10:23   amazing answer. What I think is fascinating is all of us alive today, except for a very,

02:10:32   very small number of people who work at OpenAI or Google or Microsoft or whatever,

02:10:40   would have found GPT-3.5 to be beyond our wildest expectations six months ago or whatever, right?

02:10:48   And now we're like, "Oh, GPT-3.5, that's old and busted." That's junk now. It's just the nature

02:10:55   of technology, but it's kind of funny to me that in this context, how fast that has happened.

02:11:04   We always joke about it like, "Oh, my old phone's junk now. It was great a year ago."

02:11:09   But in this context, it's like not only is the old GPT "junk," but it's junk while still being

02:11:20   above and beyond any expectations I even still have. It's like simultaneously junk

02:11:27   and beyond my wildest expectations for how anything should work in computing.

02:11:33   Yeah. I posted this to our group Slack, but GPT-3.5, I asked, "What

02:11:40   do you know about John Gruber?" And the 3.5 answer is, "John Gruber is a writer, blogger,

02:11:44   and podcaster best known for his website, Daring Fireball, where he writes about Apple

02:11:48   and related technologies. He is also the co-host of the podcast, The Talk Show with John Gruber."

02:11:54   Well, I used to be the co-host.

02:11:56   That was about 15 years ago.

02:11:58   But also, it's half wrong, because back then it wasn't The Talk Show with John Gruber, right?

02:12:02   But still, overall, you got to give that a thumbs up as a bio for me. Then my follow-up question in

02:12:08   the same chat, "What's his favorite baseball team?" And the answer is, "I do not know for certain what

02:12:14   John Gruber's favorite baseball team is, as I do not have access to personal information about

02:12:19   him. However, as far as I'm aware, he has not publicly disclosed any favorite baseball team."

02:12:24   Well, that is wrong. Now, GPT-4, what do you know about John Gruber? John Gruber is an American

02:12:31   writer, programmer, and web designer best known for his blog, Daring Fireball, which he founded

02:12:36   in 2002. Daring Fireball focuses primarily on topics related to Apple Inc. and its products

02:12:41   and software and the tech industry in general. Gruber is also the inventor of the Markdown text

02:12:46   formatting syntax, a lightweight markup language designed to simplify the process of writing web

02:12:51   content. Markdown has become widely popular and is now used extensively for documentation,

02:12:56   note-taking, and more. In addition to his blog, Gruber hosts a podcast called The Talk Show where

02:13:00   he discusses technology design and related topics with various guests. He has an influential presence

02:13:06   in the tech community, especially among Apple enthusiasts. Holy fucking shit. Is that a good

02:13:13   biography of me? I don't know. I don't know that I can improve it. And I'm supposed to be a writer,

02:13:19   right? I mean, the right facts in the right order. I mean, holy shit. Really? Now, what's his

02:13:28   favorite baseball team. John Gruber's favorite baseball team is the Philadelphia Phillies.

02:13:32   He has mentioned his support for the team on various online platforms

02:13:37   and has discussed baseball topics on his blog Daring Fireball and his podcast, The Talk Show.

02:13:41   The Phillies are a major league baseball team based in Philadelphia, Pennsylvania. So they are

02:13:46   my second favorite team. It's fascinating. But they're your second favorite team

02:13:53   the way somebody who loves steak has tofu as their second favorite food.

02:13:59   Right, right. Or my second favorite lady friend, right?

02:14:04   Right, right. Exactly. It's like if you have to pick a backup team, then I guess you'll go with it.

02:14:12   I don't know where they got it. And my first thought was maybe because when the Phillies

02:14:15   were in the World Series just a few months ago, I did express that I was rooting for them over the

02:14:19   Houston Astros. But I realize that has to be wrong because, infamously or famously, whichever you

02:14:24   prefer, even for ChatGPT-4 the data corpus cutoff is 2021, so that can't be it. I don't know where

02:14:31   this came from. It must be some kind—I think it's probably an educated guess because it knows I do

02:14:36   live in Philadelphia. So I think it's one of those things where they're making up facts based on a

02:14:43   guess. But yeah, anyway, it is unbelievably different, and that bio, I honestly

02:14:49   can't believe how accurate it is. The "various guests," I mean,

02:14:53   jiminy. Wow. Well, so it sounds like you have GPT-4. Are you paying for

02:14:59   GPT-4? No. So what I'm doing is I'm beta testing a very cool app called Petey, which

02:15:05   started as a watch-only app. I told you this before the show, but for those of

02:15:09   - Yeah, right. - I linked to it

02:15:10   on Daring Fireball. - And that, yeah.

02:15:11   - P-E-T-E-Y, it's in the App Store.

02:15:14   It started out watch only, and it's very fun on the watch.

02:15:17   And if you have a, it's kind of a toy, but why not, right?

02:15:20   In the test flight, it's currently,

02:15:23   I'm able to choose between GPT-3.5 and 4,

02:15:27   and I think when the developer, when he goes public with it,

02:15:32   I think, you know, he's obviously gonna charge

02:15:34   for GPT-4 access. - I see, yep.

02:15:37   - And, but because it is slower,

02:15:39   It is noticeably slower. Like even he said, I've been chatting with him. He keeps it on 3.5 and

02:15:45   it's not because of the cost, because it's still, it's 25 times more expensive, but it's still

02:15:50   like a fraction of a penny. It's just almost a penny as opposed to a fraction of a penny.

02:15:55   It's just that when it goes wider, he'll have to charge for it. His name, I'm going to—sorry,

02:16:00   pal. I think he's from—let's see where he's from. NL. Is that—oh, Amsterdam.

02:16:06   So yeah, yeah, Hidde van der Ploeg.

02:16:11   I'm sorry, I'm going to guess at the pronunciation. Well, Petey is a lot easier. Yeah. But anyway, it's a very fun app. Very simple.

02:16:18   You just ask questions, get answers, and it maintains context.

02:16:22   But I love that it gets to your and my love for native apps, right? - I need to check it out.

02:16:30   Yeah, yeah

02:16:31   Yeah, and also, frankly, it has a better interface than Poe, because Poe is clearly, even on

02:16:36   mobile, sort of a web interface, because I guess it's the same interface on

02:16:41   Android. It's sort of a web wrapper in an app, whereas Petey is native. I don't know when

02:16:46   it's going to come out. I don't even know if I'm supposed to be talking about it, but

02:16:49   I hope he doesn't mind.

02:16:50   Yeah, well, there's no such thing as bad publicity.

02:16:53   Yeah, I'm going to guess the publicity of me mentioning it on the show is worth it if

02:16:58   I'm breaking a friend NDA. A friend NDA. Anything else you wanted to talk about before we wrap up,

02:17:04   Daniel? I don't think so, this has been a fun, fun time. Oh, good stuff. I knew you'd be intrigued

02:17:11   by it. Let's give a shout-out to, to pimp all the stuff that we can. You've got your

02:17:16   regular podcast with, as mentioned earlier, our mutual friend Manton Reece, which is

02:17:22   Core Intuition.

02:17:24   That's right.

02:17:25   It's at coreint.org.

02:17:29   This is, I think, my first podcast appearance since the destruction of Twitter.

02:17:36   So I normally would give my Twitter handle right now, and I'm going to give my mastodon.social

02:17:41   handle.

02:17:43   But I've been picking up lately.

02:17:45   I've noticed people struggling with this, and apparently what you do is you don't say

02:17:49   what your actual handle is, you

02:17:50   just say, "Go search for Daniel Punkass," which—

02:17:55   I'll bet Daniel Jalkut would get people to you as well.

02:17:59   Right where he asks.

02:18:00   Daniel Punkass.

02:18:01   I wonder.

02:18:02   Yeah.

02:18:03   I wonder.

02:18:04   Yeah.

02:18:05   Well, you could probably ask GPT how to get in touch with me.

02:18:06   But no: Daniel Jalkut is my name, and Daniel Punkass is my game.

02:18:12   It's right there on Mastodon.

02:18:14   And my website is redsweater.com.

02:18:18   As we've discussed before, it took me 20 years to get the hyphen out of that domain name.

02:18:24   And you paid good money to get that hyphen taken out.

02:18:28   So everybody should just go, make sure you don't type the hyphen, redsweater.com.

02:18:32   I still own the hyphen version of course, but don't waste your time on that.

02:18:36   Just go to redsweater.com.

02:18:38   As John graciously mentioned earlier, I have a blogging app that John sometimes gives me

02:18:44   hell about.

02:18:45   I got myself in the about box finally.

02:18:49   That's right.

02:18:50   Yeah, the latest version...

02:18:51   Oh, no, no, no, not MarsEdit.

02:18:52   Oh, no, FastScripts.

02:18:53   FastScripts.

02:18:54   So, yeah, that's a good segue. FastScripts is another app that John sometimes gives me

02:18:58   hell about.

02:18:59   But yeah, for the latest version of FastScripts, I had to check in with John about some changes

02:19:04   to the regular expression support in AppleScript.

02:19:09   So now he is, I think I called you Chief...

02:19:11   ...RegEx Officer.

02:19:12   RegEx Officer, yeah.

02:19:14   The title I was born to hold.

02:19:16   That's it.

02:19:17   Yeah.

02:19:18   Talk about a skill that the ChatGPT guys are putting me out of, right?

02:19:23   My uncanny ability to craft regular expressions is useless now because these ChatGPT things

02:19:30   are awesome at creating regular expressions.

02:19:32   Really, really good at it.

02:19:35   - Have you asked it to create a regular expression

02:19:38   for Markdown?

02:19:39   - No, that would be crazy.

02:19:41   - From first principles?

02:19:42   Like don't just go find it, but yeah.

02:19:45   - Oh God.
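
The joke lands because all of Markdown in one regular expression really would be crazy. But the piecemeal kind of regex people ask the chatbots for is straightforward; here's a sketch matching just ATX-style headings:

    # Match ATX-style Markdown headings ("# Title" through "###### Title"),
    # tolerating optional closing hashes. A sketch, not a full Markdown parser.
    import re

    HEADING = re.compile(r"^(#{1,6})\s+(.+?)\s*#*\s*$", re.MULTILINE)

    doc = "# Title\n\nSome text.\n\n## Section ##\n"
    for m in HEADING.finditer(doc):
        print(len(m.group(1)), m.group(2))   # 1 Title / 2 Section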

02:19:46   What else? There's Black Ink.

02:19:47   Did you mention Black Ink for our--

02:19:49   - Black Ink is my crossword app.

02:19:52   It's for solving crosswords on your Mac.

02:19:54   And one of these days real soon,

02:19:56   if people keep bugging me about it,

02:19:58   it's gonna be iOS app as well,

02:20:00   which I've been saying that for 10 years,

02:20:02   but it's actually getting to be real soon now

02:20:04   because it totally works.

02:20:05   And our mutual friend, Paul Kafasis,

02:20:08   is always bugging me to ship it.

02:20:09   He's been using it for years now.

02:20:12   - I wonder how close the chats are

02:20:14   to being able to solve crossword puzzles.

02:20:16   - It's a good question.

02:20:19   I actually played around not with solving the puzzles,

02:20:22   but I'm really interested.

02:20:23   I was really interested to try to find out

02:20:25   whether they could construct a puzzle.

02:20:26   - Oh, yeah, you know what?

02:20:27   That's what they would be good at.

02:20:29   - Yeah.

02:20:30   - Yes, yes, like give me a four letter word.

02:20:32   Like if you're trying to, you know,

02:20:33   you're stuck because you're like, ah, crap,

02:20:36   I don't know how I'm gonna figure out a word

02:20:37   that'll fit with this.

02:20:38   Oh man.

02:20:39   - So this is actually what led to me discovering

02:20:42   that they're horrible at counting the letters in words.

02:20:44   - Yeah. (laughs)

02:20:46   - Because I was like, that's kind of a basic skill.

02:20:48   If you're gonna be good at this,

02:20:50   you kind of need to know that.

02:20:52   So once they get that skill,

02:20:54   they might be a little more capable at it.
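
The construction task described here, give me a four-letter word that fits, is, for a conventional program, a plain pattern filter over a word list. A minimal sketch (the tiny word list is illustrative):

    # Find words fitting a crossword pattern; "." marks an unknown square.
    import re

    def fits(pattern: str, words: list[str]) -> list[str]:
        rx = re.compile(f"^{pattern}$")
        return [w for w in words if rx.match(w)]

    words = ["cat", "cot", "cut", "can", "dog"]
    print(fits("c.t", words))   # ['cat', 'cot', 'cut']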

02:20:56   - Yeah, I sent you a screenshot of me playing

02:20:59   Poe tic-tac-toe. And I was going to beat it, and I was feeling very happy, like, you're

02:21:05   so stupid, you do know how to play tic-tac-toe, but you actually can't win this

02:21:09   simple game. And as I was about to make the winning move, it told me the game was already

02:21:14   tied. So I don't know if that means it's so dumb or unintelligent that it not only was

02:21:23   going to lose at tic-tac-toe but isn't even sure enough about the rules to know

02:21:27   that the game wasn't complete. Or is it a sign that it's scarily intelligent and did

02:21:33   what a little child would do and declare the game over before it could lose?

02:21:38   Right. It's like, these are not the droids you're looking for.

02:21:42   Right. I don't want to play Tic Tac Toe.
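
For contrast, deciding whether a tic-tac-toe game is actually over, the judgment the chatbot flubbed, is a tiny exact check in any conventional program; a minimal sketch:

    # Board is a 9-character string of "X", "O", or " " (space = empty square).
    LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),    # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),    # columns
             (0, 4, 8), (2, 4, 6)]               # diagonals

    def status(board: str) -> str:
        for a, b, c in LINES:
            if board[a] != " " and board[a] == board[b] == board[c]:
                return board[a] + " wins"
        return "tie" if " " not in board else "in progress"

    print(status("XOXOXOX  "))   # X wins (the 2-4-6 diagonal)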

02:21:46   Yeah, I don't know. It's a little weird. It's funny for something

02:21:54   to be so intelligent-seeming and so idiotic at the same time. That's part of the sort

02:22:01   of weird—

02:22:02   It is! It's a little like parenting, right? Because your kids get to be like three and

02:22:07   you're sitting there like, "Man, it's so amazing how smart they're getting." And

02:22:10   then they do the dumbest thing in the world and you're like, "Oh yeah, three-year-old."

02:22:14   Right, you're three.

02:22:15   Anyway, Daniel, good to talk to you. Thank you. And let me just give one quick shout-out

02:22:19   to our sponsors: our good friends at Kolide, where if you're into Okta, you can get your fleet

02:22:23   hopefully into compliance, and Squarespace, where you or anyone you know can build your own

02:22:28   website from the ground up. Thanks.
