PodSearch

The Talk Show

414: ‘Annoying Friendliness’, With Joanna Stern

 

00:00:00   I was doing a very important project in the garage.

00:00:02   I feel like I should be giving this exclusive to The Verge cast where I have spent many hours,

00:00:08   and in fact, I FaceTimed Nilay from the garage.

00:00:11   But fans of mine and The Verge cast and my interview with Craig Federighi will know that I've been on a months-long journey to get my garage working with Siri.

00:00:25   This is, no, you've got to break it here. Come on. You only could, you know, Verge cast, take it your way.

00:00:29   And I want to break some news. I know, let me break this news here.

00:00:33   Joanna Stern has finally gotten her garage to open by saying, and you know what, I'm going to say it here, so everyone who's listening, if you have this set up, it will do it on your phone.

00:00:44   Hey Siri, open the garage.

00:00:46   What was the trick? Can you, yeah? Nope, now you're opening your garage.

00:00:54   No, it needs my Face ID, so it's good. It didn't do it.

00:00:56   Oh, good. That's why on the podcast, if I can think of it, I call all of the various devices "Hey Dingus."

00:01:03   Well, you know, we should talk about that.

00:01:07   I watched your, I want to talk about a couple of your recent videos and columns, but I watched your most recent one where you went on the weekend, or was it a weekend? I don't know.

00:01:16   But you went on an overnight trip with a bunch of voice assistants and, son of a gun, the HomePod in my office set a timer for six minutes or whatever, whatever you were asking.

00:01:28   I've heard from quite a few people that I've done that, but I wish there was a way, like, for the systems to know this is not a natural voice in the environment, you should not.

00:01:39   Supposedly, I mean, I've never looked into it because I don't even know what they do. I don't even know. I try to remember to say, "Hey Dingus," myself, and that's as much as I can be bothered.

00:01:49   But my understanding, and I've never, this might be something you'd like to look into, is like on TV commercials, on sports a lot, Apple will have a commercial and they'll say, "Just ask Siri something, something."

00:02:03   And they'll demo it, and it doesn't set off devices, and there's some kind of trick with the modulation. Humans can hear it, but the devices don't, you know, like a frequency thing for that recording. I don't know. I can't be bothered.

00:02:18   Right, right. I mean, and there was something recently, some headline about that, that they were working on something deeper around that. But anyway, I can't get into the details.

00:02:30   Okay, okay. Don't spoil the details.

00:02:33   No, I could. No, it's more that everyone will stop listening to the podcast if I go through the details. It is such a technical, ridiculous, really niche, niche issue.

00:02:45   But let's just say, top line, and I should write this, and maybe John, you'll let me write it for Daring Fireball, because I'm pretty sure no other place, maybe I could do a Reddit thread. I should just do a Reddit thread about how I did this all.

00:02:58   Bottom line is, Chamberlain, which owns all the garage door makers, LiftMaster, LiftGate, whatever all of them are, has become really anti anything else, other than its MyQ app, which is what you use to open the garage through their digital platforms.

00:03:20   And they just don't like anything else. And so you have to buy a third party adapter, a third party accessory that is basically a hacked together small module that works with all of this stuff.

00:03:33   And I bought this Meross accessory, which Nilay suggested, and a few other people in this world have suggested as we've been going over this in the last couple of weeks. Thank you to all the people on Threads who shared their setups.

00:03:48   Anyway, I had to get an accessory, but then because I have a LiftMaster version that is really anti working with anything, it's just completely closed down. I had to get another accessory, so I have an accessory for my accessory.

00:04:03   This is all to say, it finally works. And now I have to do some cleanup and some rewiring in the garage, but I'm very proud of myself today.

00:04:12   That sounds like an accomplishment. Yeah, I couldn't remember the name. I know LiftMaster.

00:04:18   You do have a garage, right?

00:04:19   No, we do have a garage. We actually do. Even though we live in the city, we actually have a two car garage. We didn't want a two car garage. We only have one car, but we have a two car garage.

00:04:30   We do have a LiftMaster, but we don't use it enough. I've never even tried to hook it up to a home kit. If I drove all the time, it would drive me nuts that it's such a hassle.

00:04:42   I know that it is one of those weird things. When we talk about antitrust lately, everybody, whether it's tech nerds like us or even people in the outside media world looking at the law, from a political perspective, they're looking at the big tech companies.

00:04:59   And for good reason. They are so huge financially and in terms of people's daily lives that of course they're going to get the scrutiny. But it's some of these little niche industries where it's absurd.

00:05:13   And I don't think you can buy a modern garage door opener that isn't from one of the companies from Chamberlain.

00:05:21   You can't. I've looked deep into it and none of them support HomeKit or Google Assistant or even Alexa. They've got this integration with the MyQ app and Amazon. But it's just completely un-user friendly. They think their app is the ultimate and it is not.

00:05:37   It's like if you found out that General Motors also owns Ford, Honda, Toyota, Hyundai, and it's like, "What? How is that possible?" And then none of them want to support CarPlay or Android Auto or something like that.

00:05:53   You know, and the whole thing, you've covered this too. You had the Ford CEO on stage at a Wall Street Journal event.

00:05:59   Oh, I talk to Jim Farley like every day. He's my best friend.

00:06:03   But it really is like the GM tech strategy that they're taking of, "Hey, we're going to move away from the CarPlay type stuff and we're going to endeavor to make our own home-built console entertainment system so good that you won't even miss it."

00:06:21   Which is the technique that Tesla and Rivian have taken. And in some sense, it is competition at work where if CarPlay really is important to you, you can go to a car dealer and say, "Hey, does this car I'm looking at support CarPlay?"

00:06:39   And if that's a deal breaker for you, you could tell the salesperson and the salesperson will say, "You know, I lost a sale because this car doesn't support CarPlay." It's competition at work because you can just go across the street and buy a car from another company that supports it.

00:06:53   But when it's one company like this Chamberlain that owns all the garage door openers and they've decided they're going to do their own. Remember with Apple Pay when there was CurrentC?

00:07:05   Oh, God, CurrentC.

00:07:07   Where you'd have the convenience of scanning a QR code on a piece of paper at the checkout every time you wanted to pay. And it's like, yeah, competition actually works, ideally. You know, I know it doesn't always work and there's always exceptions, but when one company owns everything, it really does break down.

00:07:24   Although I think 9to5Mac was reporting it and somewhere maybe it's been confirmed that Home Depot, which has been one of the last holdouts on mobile payment and Apple Pay because I believe, I need to go back to what I was reporting on like 10 years ago, which was that CurrentC thing, but I believe they were going to be on that trend or with that protocol.

00:07:47   But Home Depot has apparently decided to go Apple Pay and that is a game changer in my life.

00:07:55   And I happen to know that because I write about it that Home Depot, I don't know if they're related or what the level of separation is, but in Canada, Home Depot has supported Apple Pay for a long time.

00:08:08   There's some kind of North American divide between Canada and America. And the last time I wrote about it at Daring Fireball, I got a couple of emails from people saying, I don't know what you're talking about. I go to Home Depot all the time and Apple Pay just works.

00:08:21   And I write back and I say, I'll bet you live in Canada. And they're like, how did you know? And I'm like, because I watch you. Yeah, I've been watching you go to Home Depot in Canada. That's how I know.

00:08:30   I mean, I go to Home Depot a lot, as you can tell, I'm very interested in my garage. And so I'm very often there, but my kids love it. And it is annoying that it does not take Apple Pay.

00:08:41   Yeah, and at this point, I've thought about it and written about it enough that I remember, but I can't tell you how many times I'd get to the checkout, like in the garden center or something like that, where it's like you're practically in the garden center, you can see your car.

00:08:55   It's right there. You're already outside. And you're just like, let me pay. And it doesn't work. And I don't know.

00:09:00   Yeah, put your stuff down. And I have had people that the employees of Home Depot say you should write to the company. And it's like me writing to the company is not going to help.

00:09:08   No, but you could. This is where, especially you with your perch, you're like, you know what, I don't even want to get into what I could do.

00:09:17   Yes, yes, exactly. Exactly. I could write to the company. I could in fact. Anyway, thank you for giving me this time and space to talk about Siri and my garage.

00:09:27   Yeah. You know what's one of the other companies that's like that, that owns more brands than you think? And I always forget how to pronounce their name. It's the glasses company that owns Ray-Ban.

00:09:37   Yeah. And you think about them like Ray Ban is especially in our beat. And I know you were just wearing them on your trip, like with the Ray Ban. I was just wearing them. I'm holding them up right now.

00:09:48   I was just how do you think I was capturing opening my garage? Of course, but they own you. And I actually am a fan of Ray Ban sunglasses and glasses frames, but they own like a zillion brands of glasses.

00:10:02   It's crazy. Like when you go in to buy a new pair of eyeglasses, what percentage chance you have. I think especially sunglasses, they own all these brands that you think of as rivals, and they're all owned by Luxottica.

00:10:16   They're not rivals. They're all part of the same company.

00:10:19   And I think you go into like LensCrafters. Yeah. 90% of those are those brands.

00:10:26   Yeah, because they do kind of rotten anti-competitive things because of their market share where it's like, yeah, sure. Why don't you give us how about 95% of your shelf space if you want any of our glasses?

00:10:38   And it's like, oh, that's that seems like it ought to be illegal. And they're like, shh.

00:10:44   Sorry, they own LensCrafters.

00:10:46   Oh, that's it. Yeah, that's it. Yeah. Which is it doesn't seem like it should be right.

00:10:53   Fact check, they also own Sunglass Hut.

00:10:56   Yes, exactly. See what I mean? So when you go to Sunglass Hut, you don't even have the option of buying glasses from a brand that isn't one of those.

00:11:03   Wait, sorry, they also own Pearle Vision.

00:11:05   Yep. Which they didn't always, right? That was like an acquisition at some point as they built up this arsenal of both the brands that make the high-end glasses and the stores where you buy glasses, which present themselves as being sort of,

00:11:21   oh, you just go to Pearle Vision to get your eyes checked and you get a prescription and you choose from all these brands.

00:11:28   And instead, it's all sort of like when you go in the Apple Store and everything is from Apple, except for a couple of Belkin chargers, you're not surprised.

00:11:36   No, but anyway, everyone should go read Luxottica's Wikipedia page. It's fascinating.

00:11:42   All right. Well, before we get into it here, I got a new gimmick on the show.

00:11:47   You hear that?

00:11:49   That sounds like a real one.

00:11:50   That is. It is a real bell. That's my new money bell that I am going to ding when I do sponsor reads. That's the money bell.

00:11:59   I'm going to thank our good friends at WorkOS.

00:12:02   If you are building a B2B S-A-A-S (SaaS, I think people pronounce it) app, at some point your customers are going to start asking for enterprise features like SAML authentication, SCIM provisioning, role-based access control, audit trails, etc.

00:12:22   Now, I don't know what any of that stuff means because I'm not building SaaS apps, but if you, dear listener, are, you know what all those things are and you know you need them.

00:12:32   WorkOS provides easy to use and flexible APIs that help you ship enterprise features on day one without slowing down your core product development.

00:12:40   You get to spend your engineering and design time on the features that matter, that separate your company, whatever it is you want to do, not these sort of baseline SaaS features like those acronyms I gave you.

00:12:51   Those acronyms I just stumbled over.

00:12:54   WorkOS is used by some of the hottest startups in the world today such as Perplexity, Vercel, Plaid, and Webflow.

00:13:01   WorkOS also provides a generous free tier of up to 1 million monthly active users for its user management solution.

00:13:10   Comes standard with rich features like bot protection, MFA, roles and permissions, and more.

00:13:16   You don't pay a cent until you hit over a million active customers per month.

00:13:21   Truly, I think that's by any definition generous.

00:13:24   If you are currently looking to build SSO for your first enterprise customer, you should consider using WorkOS.

00:13:30   Integrate in minutes and start shipping enterprise features today.

00:13:34   Go to WorkOS.com.

00:13:37   WorkOS, that's how it's spelled exactly how you think, .com.

00:13:42   You want to start with your weekend adventure or you want to start with the Federighi interview you did, like, last month?

00:13:48   I guess they kind of intersect in some places.

00:13:51   Yeah, they do. I think this is actually a very thematic episode of the show to be honest.

00:13:55   And it fits with the, you know, as we head towards the end of 2024.

00:13:59   It is the year of AI and I think next year is going to continue that.

00:14:03   I don't really think it's a fad.

00:14:05   It is, there's some real there there, unlike let's say, I don't know, not crypto in general, but what were those NFTs?

00:14:14   NFTs. Or I think we could even lump Metaverse in there.

00:14:18   Yeah, Metaverse. Yeah, everybody's dropped it. Even Meta, the company that named itself after it, doesn't talk about it anymore.

00:14:25   We can say that that was just a nice hype trend of, I don't know.

00:14:31   Let's just start with the most recent one. You, I loved it. It is such a perfect Joanna Stern video.

00:14:39   You took one overnight. Was it one day, 24 hours?

00:14:42   It was one overnight, yeah. Yeah, we went on a...

00:14:45   Describe the premise.

00:14:47   Well, people have been wondering like why I did this. So let me start with the premise of, and he's featured in the video,

00:14:53   but Mustafa Suleyman, who's now the CEO of Microsoft AI and oversees Copilot.

00:15:00   They were kind of the last on the train to release voice bots or voice capabilities to add to their chat bot.

00:15:08   And so Copilot came out with that, I believe in early October. And when I was interviewing him, he was pretty clear.

00:15:14   And he's very clear in the video in the column that I wrote this past week, which is, this is a new generation of relationships.

00:15:20   And I kept saying like a relationship with a computer, but he kind of kept wanting to go further than that to say that this is a new type of companionship and friendship.

00:15:28   And hearing the word friendship with these just kind of kept making me think of like, what would I do to test friends?

00:15:35   What would I do with friends? And so, Girls Trip, it made total sense.

00:15:39   I was going to take my robot friends, my bot friends, the voices of Meta AI, which they also came out with within the last month or two,

00:15:47   a slew of voices, celebrity voices that you can use in Instagram, WhatsApp, Messenger, et cetera, with Meta AI.

00:15:55   So I had that, I had Copilot, I had ChatGPT with advanced voice mode.

00:16:00   And then I had, what was my last one? Google Gemini, yeah.

00:16:05   Gemini Live, which came out at the end of August and just this week came to iOS with the Gemini app.

00:16:10   And so I strapped them all to a tripod thing, four phones, decided it needed a head because obviously it just didn't look right.

00:16:20   Head, wig, put it in the car, drove up, brought a wonderful producer, David Hall, and my reporting assistant, Cordelia James.

00:16:29   And we just spent 24 hours in this cabin, mostly me just talking to these things.

00:16:35   We recorded so much footage, we had 10 hours of footage at one point of me talking to these bots, because that's what they are.

00:16:43   They're voices, you're supposed to have conversations with them.

00:16:46   They have very arbitrary limits about when you can kind of stop.

00:16:50   Like I did hit the limit with ChatGPT.

00:16:52   ChatGPT will say, sorry, you've talked to this thing enough.

00:16:55   You can't keep using advanced voice mode.

00:16:59   But that happens like, I would say six hours in.

00:17:02   Oh my God, that's a lot.

00:17:03   Yeah.

00:17:04   But don't worry, we had another account and so we kept going.

00:17:06   So that was the premise.

00:17:08   And I just, I sort of thought, okay, what are some things I can put it through the test?

00:17:12   And as you can see in the video, we built a fire together.

00:17:14   David, my producer, really thought it would be hilarious if I learned how to chop wood.

00:17:18   I was like, maybe just the hatchet on the branch is fine.

00:17:22   And, you know, tried to have some serious conversations with these bots and that didn't go as well.

00:17:28   But it was all in all a very, very fun, eye opening experience.

00:17:33   My favorite part, and you seemed genuinely, I don't know how many, if it was the first take or what, but you seemed genuinely blown away was when they started talking to each other about movies, I think.

00:17:47   No, Gillian Flynn novels.

00:17:49   Yeah, that's it, novels.

00:17:51   Which was, I think I was so, first of all, they started talking to each other because they all have mute buttons.

00:17:55   So I knew like, okay, I've got to mute this one to test this one and keep muting.

00:17:59   And that was the beauty of having the tripod set up.

00:18:02   But I had left them all unmuted.

00:18:04   And I don't know how they started getting on this topic of books.

00:18:08   But again, the fact was I tried to like program them to be, not program, but I wanted this to be a girls trip.

00:18:14   I set them to all have female voices.

00:18:17   I was definitely stereotyping there, right?

00:18:20   And then they all start talking about, I mean, men read these books too, but like, you know, this is a pretty popular genre among women.

00:18:29   And they all start talking about Gillian Flynn and Gone Girl and this genre of books and I'm just like dying and I try to like talk and get my words in and they won't let me in.

00:18:38   And it was, I mean, I guess it was kind of a human experience because I often feel that way.

00:18:43   People don't let me talk.

00:18:45   It reminds me of like when I first got into computers when I was really young.

00:18:51   And I mean, everybody did this in my generation.

00:18:54   Like every time I went to Kmart, I'd go to the Commodore 64 display unit and type 10 PRINT "KMART SUCKS", 20 GOTO 10, RUN.

00:19:03   And then it runs.

00:19:05   The infinite loop is, I mean, Apple named their first campus infinite loop.

00:19:09   It's just a fascinating concept that to a mind that's drawn to computers.

00:19:17   It's just kind of beautiful.

00:19:19   But hooking up two machines to do something where they will never stop doing it to each other.

00:19:25   It's satisfying in a weird way.

00:19:28   I don't know.

00:19:29   I think partly it's just like a neat technical trick.

00:19:31   And part of it is reassuring, even going back 40 years to 1980s Commodore 64s and infinite loops.

00:19:41   It's a way of asserting our humanity versus the machines.

00:19:46   Like a human is never going to get tricked into an infinite loop.

00:19:50   Right?

00:19:51   You can trick people for a long time and it's kind of funny in a practical joke kind of way.

00:19:56   But eventually they're just going to pass out or something.

00:19:58   Whereas if you get computers talking to each other, they'll never stop.

00:20:02   They'll never stop.

00:20:04   And we saw that.

00:20:05   I mean, it's beautiful, but it's also amazing to just watch where they'll go.

00:20:10   Because these large language models, they're basically programmed to never not have an answer.

00:20:17   Right?

00:20:18   They're not even programmed.

00:20:19   They just never not have an answer.

00:20:21   They will always come up with something.

00:20:24   And that's part of it. It's like the chattiest of friends, where you just need to give me a one-line answer, but you gave me four paragraphs.

00:20:31   And in that case, they just will keep, keep going.

00:20:35   There was another instance where one of them started talking about, I don't know, it must have been maybe something in the memory of Gemini or something based on me, but it wanted to talk about writing an essay about EV charging.

00:20:45   And they just all kind of kept going. Well, not all, because one of them, Copilot specifically, would get confused by this all.

00:20:53   But three of them would keep going, keep, keep, keep going.

00:20:57   You mentioned this, what was the phrase?

00:21:00   I think I wrote it down.

00:21:01   Annoying friendliness.

00:21:03   And in two words, that might be my single biggest frustration with all of them. And part of it's me, you know me.

00:21:14   I mean, my personality, I'm a little, yeah, come on.

00:21:18   I don't need the overly friendly stuff.

00:21:19   But I think anybody gets tired of it because it's clearly phony.

00:21:24   Everybody knows that these are computers.

00:21:26   And to me, this sort of annoying friendliness that is clearly programmed into these things is, I find it distasteful.

00:21:39   It's like if every food offered, like if you go to a resort or on a cruise ship or something where you don't have options, and everything is dessert.

00:21:50   It's like at first, the kid in you, there's a part of you that is, oh, it's a friendly computer.

00:21:56   That's amazing. That's kind of funny.

00:21:58   And as a kid, you might think, ah, like a week of nothing but candy and, and cookies and ice cream would be great.

00:22:05   You get physically sick after a while.

00:22:07   And it's like mentally, it's like this fake chipperness.

00:22:10   It just is so overly sweet.

00:22:13   It's like the mental equivalent of my teeth.

00:22:16   I feel like I'm getting cavities.

00:22:19   And I just want, I wish that, and they don't have, this is the thing that really gets me is they don't have a way to turn it off.

00:22:25   They don't have a dial.

00:22:26   They should have a friendliness dial and it should just be like, you should be able to set that.

00:22:31   And I think, like I said, the annoying friendliness comment in the, in the context of that these companies have to build trust with us, with these bots, right?

00:22:41   We've got to feel comfortable asking it about cooking, but also a major life change or whatever that spectrum looks like.

00:22:50   You've got to feel like I can confide in this thing and I'm going to trust that it says something that's smart and you need to feel comfortable to do that.

00:22:57   But like some people are just asking for cooking instructions or every time I like open it up, it doesn't need to greet me like I'm the queen of England.

00:23:08   I don't know.

00:23:09   Yeah. And it's like, sometimes I really do just want a button on an onscreen alert.

00:23:15   I just want the button that just says, okay, if there's no other option, it's I just want the one that just says, okay.

00:23:20   Cause it's okay.

00:23:21   Okay.

00:23:22   Or cancel.

00:23:23   And it's like, yeah, I just want to hit okay.

00:23:24   I don't want like a button that says, Hey, good to, good to see you again, John, do it.

00:23:30   And it's no, no, just give me okay.

00:23:32   I would really love to turn the personality off on these things and just get flat answers.

00:23:37   And I, there's a part of me that thinks like it's that when we first started talking to these things at all, like when Siri came out in 2010 or 2011, whatever the year was, it was more like that.

00:23:50   And it's, I, we all know what these companies are thinking.

00:23:54   They are thinking this is a disruption to society that are, you can have a conversation with your computer now, and we want to make this as, I don't know, as seamless or not seamless, but as acceptable as possible or as non-threatening as possible.

00:24:12   Right.

00:24:13   And I find the fake friendliness in a weird way is actually a little more disturbing.

00:24:19   Right?

00:24:20   And maybe, and that's where I finally got with some of this too, which is, given what had happened with the boy who had died by suicide, and there's information about how he had been chatting with Character.AI leading up to that, and all these other types of bots that are being built for loneliness and companionship.

00:24:41   I think that's where this is just a disguise, right? The friendliness and the personality is a disguise for code and just computer.

00:24:54   Yeah. And sometimes I really wish I could turn it down and just get the code. Actually, I kind of would, me personally, if there were a dial for, like, sarcastic. And I know if anybody's trying to do it, God, I know it's

00:25:10   the xAI one that Elon's making, and I'm glad someone's trying something other than pure saccharine friendliness, but it's like, I do not share Elon Musk's sense of humor at all.

00:25:23   And what they think is biting sarcasm to me is, is also not funny at all. It is not my, my vector of sarcasm, but at least they're trying something a little different.

00:25:36   But I would really just

00:25:37   No, and that's like the sarcasm thing. Obviously, I'm dry and sarcastic. And there's this moment in the video too, where, I can't believe we captured it, because it, like, flubbed, right? ChatGPT flubbed and said, "How the fire making you feel."

00:25:53   And I was like, you know, and I thought that was hilarious. Like if it was a human, you'd laugh at somebody, like, the way they butchered a word or something.

00:26:02   And I say back to it, fire make me feel good. And I'm cracking up. I can't like really bite my tongue. I'm cracking up and you can hear my producer cracking up. And it like doesn't get it right.

00:26:15   But it gets the laughter. Like, it picks up laughter, so it knows that there's human laughter. And so somewhere in there, it just went, haha.

00:26:26   It just worse. It's just worse. And I do find it funny now that I've been doing this for so long and just not just professionally, but just going back to being 10 years old and the way computers were then.

00:26:41   And it's like, I see all of this progress and more than any other industry, the computer industry, it's why I continue to love following it and doing what I do is that it's still moving so fast and things are changing so fast.

00:27:00   And you get all of these examples over the decades of things that had previously only been imagined in science fiction. And now they're real, right? I mean, it comes up all the time, but the things that our iPhones and all modern smartphones can do in our pocket are just flabbergastingly amazing science fiction.

00:27:22   I mean, video calls like what you and I are doing right now to have this conversation used to be science fiction, and we could be doing it on our phones over the air. It's all amazing.

00:27:33   But it's always so interesting to me. Like the thing I'll, the thing I always think of first is how once it becomes real, at first it's amazing.

00:27:43   And then it settles in and everybody, nerds and non nerds alike, it just becomes a part of life, right? I mean, and things that happened a century ago are like that.

00:27:53   Running water is a marvel of technology. I often say like, if we could time travel and bring Ben Franklin to the modern day, and you want to take him to the airport and show him airplanes, you want to show him the internet and Google search and ChatGPT.

00:28:07   I think just getting him to stop raving about being able to take a pee in the middle of the night indoors and hit a button and it just goes away is, you got to give him a couple of days to absorb the toilet and running water because it's just amazing.

00:28:25   Yeah.

00:28:26   It settles in, we just take it for granted. But I think that another way of looking back is how the science fiction writers who imagined it got it wrong.

00:28:33   And it's like one thing is everybody's imagined talking robots and talking AI, whether they are humanoid robots like C-3PO who walk around or like HAL 9000 from 2001, which is a lot closer to the modern AI.

00:28:51   The one thing, you know, the way that HAL isn't a robot who moves around, but is everywhere on the ship, and it's the same HAL you're talking to in the place where the pod bay doors are as in the living room area or whatever.

00:29:10   Nobody imagined these fake friendly attitudes. Nobody. HAL is what I want. I want a little bit of personality, but mostly just sort of flatten just the facts.

00:29:22   I'm going to be honest with you. I don't even, I've never watched most of these science fiction movies. Every time I try to I fall asleep.

00:29:30   But I think so much of what you just hit on is so right. I think, especially the parts where we are amazed by it, and then we just take it for granted.

00:29:40   And actually, that is something I don't have the exact quote from here. But when I interviewed Craig Federighi a few months ago, or weeks ago, I don't know what day it is, about Apple intelligence.

00:29:51   And I really was trying to go after, why is Siri like this? Where are the Siri improvements? I was really hitting him on Siri. He really did back up and really wanted me to.

00:30:01   And I think he even was thinking about how far Siri has come and we got used to it. Right? Right? We started like to want more and more.

00:30:11   I want to pull up the quote because he obviously said it so eloquently. But I think, yes, and now even so with these bots, like you hit a wall, right?

00:30:21   We hit a wall with Siri and Alexa and I say that in the column. We hit a wall. We now know we can say, what do you call it? Dingus?

00:30:30   Dingus. Yeah. You can say, "Hey Dingus, set the blah." "Dingus, do this," right? We've all learned the very strict formulas of how to do those things. We've had to talk the talk their way.

00:30:43   We hit a wall with that. So now we have these bots. We are now hitting another wall with that. I don't know. I think they can change the friendliness and they can. There's probably a lot that can be done there.

00:30:55   But then you hit this wall of like, it sounds really believable, but it's just still a computer. There's more you can do. And it will walk you through these things in a more personable, friendly way.

00:31:09   And it is smarter. But you also will hit this wall.

00:31:13   Yeah. And, I don't know, the fake personality, the fake friendliness, I just never would have guessed it. But now that it's here, I see it as: of course they're doing it this way, because they think it's non-threatening.

00:31:27   And they realize that people are going to feel threatened by the weirdness of this. And then I think it also sets in that everybody else is doing it this way, so we should too. Nobody wants to stick out and put out a much more robotic personality type, which I think, after the novelty wears off, people would settle in with and enjoy better, right?

00:31:54   Yeah. And here's what he said: "But as humans, our expectations for what it means to use our voice to communicate and ask for things is almost unbounded."

00:32:04   Which is right. It's like, we start to ask and we start to feel more conversational, right? So we've gone from knowing the formulaic. Now we're in this moment with large language models where we're able to talk more like ourselves.

00:32:18   I mean, that was kind of also the amazing thing of being away with these things for 24 hours. You don't have to prompt it in any special way; you're just like, oh, let's make a yoga routine for me. Let's do it now. Right? You just talk normally.

00:32:31   And so I guess if the endpoint is HAL, or whatever that dream robot is that's our partner, or maybe friend, we're going to keep hitting walls.

00:32:43   It's like, there's our internal monologue in our own brains, but that's everybody's own. And the next level of communication is speech. It's among the oldest parts of our evolution. Literacy is a separate thing, and there are human beings who are illiterate, who cannot read or write.

00:33:02   But unless you have some kind of disability, everybody can speak. And it's kind of amazing. It really is one of those things where you're like, wow, that is kind of amazing, that even people who are really, really not very intelligent at all just naturally, on their own, learn to speak as babies.

00:33:19   And what do you do when you speak? You communicate your thoughts, and your thoughts, for whatever your level of intelligence is, for you, they're unbounded. And you just speak. And so, speaking to a computer: it's a really interesting observation by Federighi, because there really is no limit.

00:33:39   And you and I are, in fact, professional writers, or we try to be, but I feel you and I are at the higher end of aptitude for expressing ourselves through writing. For most human beings, though, whoever you are, wherever you live in the world, however old you are, communicating by speech is the most natural thing in the world.

00:34:02   And so interfacing with computers that way is very, very different than pushing buttons or typing commands. It's like a removal of abstraction. It is a level of abstraction, but we're evolutionarily hooked up not to think of it as being abstract.

00:34:18   You mentioned, you know, I think it was a 14-year-old in Australia who recently was found to have gotten obsessed with a chatbot character constructed from Game of Thrones. And as in all cases of teenage suicide, it's a tragedy.

00:34:34   And there are way too many people who, in the midst of depression, turn to suicide. It's always been true, it still is true, and it was true when there weren't even chatbots.

00:34:48   And now that there are chatbots, somebody who's in the midst of an episode like this, who might have and probably would have been just as depressed if not more so without any access to chatbots, got obsessed with a chatbot, and now that's the headline.

00:35:05   It's the same thing with self-driving cars. How many people die in human-driven car accidents per day in the US? It is one of those things where, once technology solves it, people are going to look back at 2024, at our driving culture and the number of deaths.

00:35:31   And at all of the gruesome injuries that people thankfully don't die from and recover from, as barbaric. It is really, really strange historically, and it's one of those things that we've just accepted. You know, me and you and most of the people listening to this, we were all born into car culture in North America; we just accept it as the way the world works.

00:35:58   But it's really weird and gruesome, and self-driving cars are a way out of that. But the problem is there are still going to be some accidents, especially getting from the point where all cars were human-driven (well, we're already at a point where some cars are self-driving) to a future where all cars are either self-driving robotically or have systems in place to prevent collisions.

00:36:27   And we're seeing that, where you can be not paying attention, and if you get within a certain proximity of the car in front of you, the car brakes itself. Fantastic, fantastic life-saving features.

00:36:39   And even to a lesser degree, at lower speeds where nobody was going to get hurt: thank God I didn't rear-end that guy because I wasn't paying attention, and I avoided just the minor irritation of a fender bender.

00:36:52   But the problem is, it doesn't even matter which brand it is, but of course, if it's Tesla, it brings in all this other political baggage and personality baggage of Elon Musk.

00:37:04   But one accident that kills somebody with a self-driving car is the Man Bites Dog headline, right? It's, "Oh, there it is!"

00:37:13   But meanwhile, there are 10,000 other people who die, I don't know the exact number, you know, but it's numbers like that every month in human-driven cars. And I think there's that sort of effect with these chatbots.

00:37:23   100%. And look, there's also a big difference between the chatbots. I try to make this point in the piece and the video and the column, but there's a big difference between the smaller startup Character.AI, which didn't seem to have many protections at all in the app for providing more information if somebody's using words like suicide or killing themselves, versus what I saw here.

00:37:47   I tried to test that with all of these, and they're all providing hotlines and saying, "Talk to a real human." So there's certainly also a lot that can be done on the product side.

00:37:58   I don't want to say, like, the answer's always in product. It's not. And look, you've got a lot of forces, and you could talk about lots of forces that play into all of this horribleness.

00:38:09   But I agree with you. I don't think we should hold that up as the answer. And by the way, it sounded to me like in that case he was also typing. It was a text-based chat. It wasn't voice, though Character.AI and a few of these do have the voice-based stuff.

00:38:27   But look, that makes it more realistic and human. I mean, there's no doubt that I had formed an image, especially of Meta AI. I gave the Kristen Bell voice to it, because I just love Kristen Bell, and it's like, "I'm going to be hanging out with Kristen Bell all day."

00:38:44   And I have an image of that in my head as not just a chatbot, right? Like there is something of a, I want to say, a species. There's something more there than just, "Oh, I was texting with a computer."

00:38:59   Yeah, it tickles a lower, lizard part of your brain. I guess lizards don't really talk to each other, but you know what I mean: an older, very old evolutionary part of our brains, from when we first started to communicate with our voices, where when you hear the voice of a loved one, it sets off endorphins in your brain.

00:39:20   If you just happen to run into your kid accidentally, you were out and about and your kid's school group happens to be where you are and you hear your kid's voice, it's "Whoa!"

00:39:33   And it's like, you know, your brain kind of lights up in a funny way.

00:39:38   You know, just when you hear like a friend who you haven't seen in a while, they just happen to be in the same store as you and you hear their voice and it's like, "Whoa!" When you see somebody's face, you recognize it and you have a reaction that's not voluntary, it's involuntary and you don't get that.

00:39:55   I mean, a weirdo like me can get it from like typefaces or something. But even there, even for someone who's obsessed with something like that, "Ooh, this book has a real--oh, I love this font. It makes me a little more likely to buy it."

00:40:09   It's still not the same as when I see the face of a loved one or something like that. We're hooked up for that and computers have never had that before.

00:40:17   Yeah.

00:40:18   And now they do.

00:40:19   Kind of.

00:40:20   Kind of, right. And that's the other thing that science fiction didn't really imagine. It's like science fiction always gives us the finished product, right? And again, even if you haven't watched 2001, it's the red eye and C-3PO.

00:40:37   But then nobody ever imagined, "Hey, what was it like the first year they had this technology where you could talk to them?" And it's so different than I think anybody would have imagined it to be.

00:40:52   Because if you think about it superficially from the perspective of a science fiction writer 20, 30, 40 or more years ago, you start thinking like, "Well, maybe because it's the early days, it can only understand three or four things or something."

00:41:07   Because superficially, that seems like a good way to get it going. But no, they'll answer anything. They'll answer questions about anything. And at times they're incredibly amazing and knowledgeable and give you these helpful answers, and then at other times they're more stupid than any person you've ever even fathomed.

00:41:29   You know, like when Google turned on the AI in their search and it was giving really funny answers to "How many rocks should I eat on a daily basis?", right?

00:41:39   Right.

00:41:40   Which is such—

00:41:41   Or how to glue the cheese on pizza.

00:41:43   Yes!

00:41:44   Yeah, and they're just engineered to be confident.

00:41:49   Right, and it takes a human being at the moment to come up with a question like, "How many rocks should I eat a day?" to trick the LLM-based AI into giving a funny answer because it's a nonsensical question in a way, right?

00:42:09   Right.

00:42:10   You probably haven't seen it, given what you said about science fiction movies, the movie Blade Runner from 1982, I believe. Spoiler here. But there's the whole premise of the movie is Harrison Ford plays a Blade Runner who is a police detective whose job it is to hunt down replicants.

00:42:30   Replicants are robots who look like humans, and so they have skin and completely visually look and talk like humans, but they're fake underneath, and the ones who get out and are illegally out and about, you have to identify them.

00:42:45   And there's a thing called the Voigt-Kampff test, where it's a series of questions that the police can ask; you give 20 questions and a replicant is going to get tripped up by some of them.

00:42:57   And it's a really good scene in the movie. For everybody who's watched it, it's one of the most memorable parts of the movie. But the type of questions they ask are much more thoughtful and cool and interesting, and they're not "How many rocks should you eat a day?"

00:43:13   Right? Like it turns out at our current moment, the way to figure out if you're chatting with an AI is to ask a nonsense question.

00:43:21   Yep.

00:43:22   Not a deep philosophical question.

00:43:23   It's so true.

00:43:24   It's true. Well, I'm making a list of all the movies I need to watch.

00:43:29   All right. Blade Runner might put you to sleep. I love it, but it looks good.

00:43:34   A lot of times I've started a lot of these movies and then I just fall asleep.

00:43:39   Here, let me hit the money bell again and do another sponsor. Thank you to our good friends, this time it's Squarespace. Squarespace is the all-in-one platform for building your own website online.

00:43:54   Everything you need to do to own your own website, Squarespace can do for you: domain name registration, picking templates for what it looks like, modifying what it looks like through their Fluid Engine, which is the name of their next-generation engine for modifying the way the thing looks.

00:44:12   And the Fluid Engine works great whether you are on a desktop computer or on your phone, in any modern web browser; you just build your Squarespace website on the Squarespace website itself.

00:44:26   And what you see is what the people who visit your site will get. It is WYSIWYG for making a website and it works absolutely terrific.

00:44:35   They have all sorts of other features. I mean, just about everything you can imagine. Squarespace Payments is the easiest way to manage payments in one place with Squarespace.

00:44:46   Onboarding is fast and simple. You get started with a few clicks and you can start receiving payments right away and you can give your customers any way to pay imaginable.

00:44:55   ACH Direct Debit in the US, Apple Pay support, Afterpay in the US and Canada, ClearPay in the UK, all sorts of stuff like that.

00:45:05   Credit cards, of course, easily accepted. They have all sorts of other features. Invoices, so if you're setting up not a website for just personal presence, but actually running a business and you need to send invoices and stuff like that.

00:45:21   They've got an invoicing feature built into the platform that you can use. Analytics, so you can see the stats, how many people are coming to your site, where are they going to on your site and where are they coming from, all sorts of stuff like that.

00:45:34   And it's such a great analytics interface, where instead of looking at something like a confusing airplane dashboard, it's a beautiful, beautiful presentation that just makes it clear.

00:45:45   Just information-graphics-wise, it's absolutely top-notch. Where do you go to find out more? Go to squarespace.com/talkshow.

00:45:55   squarespace.com/talkshow and by using that URL, they'll know you came from this show and you, the listener who's signing up, will save 10% off your first purchase of a website or domain and you get 30 days free just by going there to start.

00:46:10   30 days to start, build the whole website, whole month, you can build it out, use it, turn it on, go live, make sure you like it and only after the 30 days are up do you need to pay.

00:46:20   Just go to squarespace.com/talkshow.

00:46:25   You did not include Siri in your girls' trip.

00:46:29   I didn't.

00:46:31   And the reason why was so funny. You said, "Siri, can you introduce yourself?" to get, I guess, B-roll footage of Siri introducing itself, and Siri's answer was just to confirm: do you want to turn this device off?

00:46:48   Yeah, it was. I mean, I was looking for a moment to just have, like, a sound of Siri giving a basic answer, but Siri knew exactly how to write itself into the script.

00:46:58   I'm sorry there, because I said Siri and now I've got devices going off.

00:47:01   Oh yeah, now we write Dingus and Dingus.

00:47:03   Beeping and bopping. But I noticed it back at WWDC, when Apple unveiled the whole Apple Intelligence array of features, and they said that with this new interface, which came out with iOS 18.1 last month, the new Siri gets an all-new visual interface.

00:47:24   Let's just speak about the phone, where when you invoke it, you get a whole new visual interface where the border of the screen lights up in a Siri rainbow of colors.

00:47:37   And, in typical Apple fashion, they made it seem like it's nothing but great, which is that you can continue a conversation.

00:47:49   So you could say, hey, Dingus, who did the Baltimore Ravens play last week? And Siri will give an answer.

00:47:59   And then while it's still up, you could say, "And who did they play the week before?"

00:48:05   And Siri in theory should know we're still talking about the Baltimore Ravens or whatever it was, and then give you an answer from the week before.

00:48:13   But then once you're done and that Siri interface goes off, then you're starting over each time.

00:48:21   And so the continuing context in a session is definitely better than the way Siri used to be.

00:48:29   But the fact that it forgets everything by design each time you start it up is very different from these other chatbots.

00:48:40   And I think that alone would have ruled it out from inclusion in your girls' AI weekend, right?

00:48:48   Well, I think it's also that, like, I was just testing something using Apple Intelligence too, just to see, because I'd been testing 18.2 and wanted to see about the ChatGPT integration.

00:49:00   And whether, when you do start talking to Siri, it's going to really start to draw on ChatGPT. And it really doesn't.

00:49:07   No, I could, I've been running 18.2 pretty much since the first beta too.

00:49:12   And it doesn't.

00:49:13   Yeah, I mean, yeah, I mean, and that's supposed to start happening.

00:49:19   I'm finding in my testing of 18.2 that I don't mind that this ChatGPT integration is there, but I'm finding that if it weren't there, my opinion of the overall thing would hardly be different.

00:49:39   It really... and I don't know if that's because, leading up to WWDC, there was a lot of reporting. Mark Gurman, of course, the king of rumor reporting in the Apple world, had a bunch of stories leading up to WWDC suggesting that the deal with OpenAI didn't really get finalized until the last minute, or close to the last minute.

00:50:03   And Apple said at WWDC that there will be, or could be, future partners like Google, so that in the future you could switch from ChatGPT as your answers-what-Siri-alone-can't partner to Gemini or something else.

00:50:20   Here's what I think is happening.

00:50:21   All right.

00:50:22   Go, finish your sentence and I'll tell you mine.

00:50:24   Well, I'm just saying I think they built it so that if they had zero partners, if that's just the way it worked out for this year, it would still be: we can still tell people to buy these things for Apple Intelligence.

00:50:34   And I think it shows.

00:50:36   But I think what's happening specifically in 18.2, because you asked like, why wouldn't Siri fit into this?

00:50:42   And I think the main reason is that Siri does not have a strong large language model component to like be conversational, right?

00:50:50   And that, see my interview with Craig, we went deep on that.

00:50:54   And he definitely hinted at, he didn't hint, he just said, is that where this is going?

00:50:58   Of course, but we've got to figure out kind of the right parts to put that together.

00:51:02   And so because he was very clear, this is the garage example, he was very clear right now, Siri does the garage, it does it very well.

00:51:10   But what we don't want is Siri to go off and not do that thing very well.

00:51:15   So they have to figure out the right way.

00:51:17   And that to me made a lot of sense.

00:51:19   You have billions of users using this thing, got to make sure that it can do the thing that everyone relies on it for, but then also be able to go and do the advanced thing.

00:51:28   One thing though, that I think is happening right now in Siri, so like I just asked Siri, give me some good recipes for meatballs.

00:51:37   I don't know, I always go to meatballs as my, like, go-to test of it, right?

00:51:40   And it searched the web and found that and gave me like the little pop up that has that.

00:51:45   I tried a couple things and it didn't bring up ChatGPT.

00:51:49   But when I asked it to write a poem about John Gruber, it did.

00:51:55   Can you read the poem?

00:51:56   It asked me, and I said yes. What happened to the poem? Hold on.

00:51:59   But it also went away.

00:52:00   Wow.

00:52:01   Hold on, didn't I screenshot that?

00:52:02   Yeah, see, it's clearly by design and I think partly to cover up for limitations in their own system, but also partly because they don't want to even get into that now, like the permanent memory of your interactions with the service.

00:52:15   They really want that blank slate each time you bring it up.

00:52:19   Hold on, I'm asking to do it again.

00:52:20   But I think that what's happening is that there are very specific types of prompts, write a poem or whatever the other ones are, where they're saying, hey, let's go to ChatGPT for that.

00:52:31   But then for some other things where a large language model could be useful, it's not doing that.

00:52:36   And I also wonder, since I asked it to write a poem about John Gruber by text right now, if I had done it verbally, would it read it back to me?

00:52:49   I don't know.

00:52:50   Probably not.

00:52:51   If you do it, if you texted it, it texts the answer, right?

00:52:54   I have found myself, I don't care enough about these things to use them all.

00:52:59   And so I've sort of gone all in with ChatGPT.

00:53:03   And I mainly also, I like the answers.

00:53:06   I think the 4o model is probably the best, or closest to the best, for the sort of things I'm using it for.

00:53:13   But I really love their Mac app.

00:53:16   The ChatGPT Mac app is by far and away the best Macintosh app for any of these bots.

00:53:22   And I know...

00:53:23   I use it all day long.

00:53:24   I love it.

00:53:25   It's just a really well engineered Mac app.

00:53:28   It's not an Electron thing that's a wrapper for their website.

00:53:32   It's a really good, fast, with a great interface.

00:53:36   And it's interesting that it has your whole history with it.

00:53:39   Because I'm logged in and I have a ChatGPT account and I pay 10 bucks a month for whatever.

00:53:45   And there was a...

00:53:47   I don't know if it's a meme or what, but like a thing a couple days ago where people are saying,

00:53:53   "Ask ChatGPT, or ask any of these things, to make a picture:"

00:53:57   How do you imagine a typical day in my life?

00:54:00   And I don't think ChatGPT knows a lot about me, but I'd imagine the picture I got...

00:54:06   It didn't have a lot...

00:54:08   Some of the people's pictures have so much in them.

00:54:10   Mine was like...

00:54:11   Of course all the rooms look idyllic.

00:54:13   It was sort of a rustic, lots of wood, a big leather, old leather couch, which is kind of me.

00:54:20   Like if I'm going to have a couch...

00:54:21   I don't have a couch in my office, but if I did, I probably would have a leather one.

00:54:25   And I like an old looking leather couch.

00:54:27   A typewriter on a desk, not a computer, but a typewriter.

00:54:31   And some... not a lot.

00:54:33   In real life, I've probably got more crap around my office, but some books and papers on the floor.

00:54:39   And I am kind of a mess.

00:54:40   And I do... one of my very favorite, most-used things that I ask ChatGPT is when...

00:54:46   It's like my super powered thesaurus.

00:54:49   Either I know I'm looking for a certain word and I can't think of it, or I want a better version of a word.

00:54:56   I'm always asking ChatGPT for either a specific word I can't think of, or, like, a better word than the one I'm thinking of.

00:55:03   And it's amazing at how good it is at it.

00:55:05   It's so good at it.

00:55:07   And so based on that, it kind of knows I'm a writer and kind of, seemingly knows I'm a little scattered.

00:55:13   And I do research, then build up piles of books, and oh, there are lots and lots of books in my imagined ChatGPT world.

00:55:22   So this is the poem it wrote about you. Now I finally got it working again.

00:55:26   "John Gruber with wisdom keen, in the world of tech he's often seen. With Markdown's elegance and grace, his thoughts in code find their place."

00:55:36   I'm not going to keep going on.

00:55:37   That's pretty good though.

00:55:39   Yeah, that's pretty good.

00:55:40   Okay, the image that ChatGPT did of my world, which was pretty funny.

00:55:47   I mean, I tried it a few of them, but I posted it on threads.

00:55:49   It thinks I'm, and it's not wrong, but it must think like I'm truly like the world's leader of the Cub Scouts.

00:55:56   I don't know what that's called, like the Master.

00:55:59   Scouting America. They've changed the name to Scouting America.

00:56:03   Right, right.

00:56:04   To get rid of the boys and girls separate aspects.

00:56:07   My son, my seven year old did just join the Cub Scouts.

00:56:10   And so I have been using it a lot because it's complicated.

00:56:14   That's like crazy. I'm like, please explain to me.

00:56:17   I've been asking about dens versus packs. I've been asking about what the different den names are, or what the different pack names are in the Cub Scouts, and there's wolves and there's bears and there's this.

00:56:27   And so it really took that to heart, so that in every one I generated, I've got Cub Scout paraphernalia everywhere. I've got patches. I've got kids in the background with Cub Scouts.

00:56:38   And I guess at some point I also asked about basketball. I want to say at some point I was asking about the best way to get a basketball hoop into cement.

00:56:46   Because that was something I was struggling with. Like, I use it a lot for home stuff.

00:56:50   I sent you a picture of mine. I should post it on social media and I'll put links to yours from threads.

00:56:57   Oh wow. I mean this is a beautiful home that you have here from the early, I don't know, 1920s.

00:57:03   Well, it is. You know what though? You know what's funny is in the early years of Daring Fireball, you know, I started the site in 2004.

00:57:15   And there weren't, YouTube wasn't a thing and I've never really done much video and podcasts weren't yet a thing.

00:57:23   So all people knew me by was the writing and I didn't put a picture of myself on the site.

00:57:29   And I started going to things like WWDC or Macworld Expo and people would meet me for the first time.

00:57:37   And I mean like dozens of times people would just say, "Oh, I thought you were really old."

00:57:44   And at the time, it was like 20 years ago, I was only, I don't know, like 30.

00:57:48   So I wasn't really old at all and they were like, "Oh, I thought you were like old."

00:57:54   Somehow ChatGPT seems to have the same impression of me.

00:57:57   You had an old sounding written voice, I guess.

00:58:00   I guess. Because it's not just a typewriter.

00:58:03   Or you were very wise. You were very wise.

00:58:04   It's a manual typewriter.

00:58:06   Yeah.

00:58:06   I also enjoy that there's a map on the wall of mine and it's like a nonsense map. It's not a real place.

00:58:14   You have a lot of plants.

00:58:16   Yeah.

00:58:16   But yeah, I mean it like really, I just sent you mine, but you can see I've got the Boy Scouts, I've got basketballs, it thinks I love bagels.

00:58:24   I don't know about the bagel piece.

00:58:26   Oh yeah, see yours is really, yours are very, very jam-packed with things.

00:58:31   Look how big, they jam-packed it. And I love that little safari compass they gave me on the first one.

00:58:37   Oh, the first one.

00:58:39   Look at that safari compass on the table.

00:58:41   Oh yeah, yeah, I see it. It's like a coaster or a pen or something like that.

00:58:45   And now I'm like, I want that. I want that for the Cub Scouts. I want a safari compass.

00:58:50   Yeah, they've definitely got you as being like the scouting mom. Yeah, definitely.

00:58:54   I mean, it was very appropriate, because that week I had obviously asked a lot about this. But I'm not as involved with the Boy Scouts as this thinks I am.

00:59:05   And in the second one, you've actually got a framed Boy Scout logo on the wall.

00:59:11   Yes, I love them so much.

00:59:13   And also in that second one, you've got a wood-grained sign on the wall, big, with your name on it.

00:59:20   Yep, I love myself. I love myself. I love basketball, apparently.

00:59:24   But you have an iMac. You have an iMac, not a, you know, you have an iMac on a desk and a laptop on your lap and another like a tablet.

00:59:35   A detached keyboard.

00:59:37   Yeah, yeah. So you've got a lot of computers. I don't have any, but oh well.

00:59:43   But just going back to the ChatGPT integration, I think, look, there's something there about how Apple's thinking about this. Obviously we know that they wanted to minimize some of the risk, right?

01:00:00   And whether we want to say they're behind, or whatever the reason is, they're being cautious about how they roll this out. And so they've used ChatGPT, and potentially whatever partners they can, to plug into Siri.

01:00:14   Eventually, as Craig said in that interview, we will see a Siri that has more large language model capability and can converse and can likely do a lot of what we've already been seeing from these other bots that I went to the cabin with.

01:00:29   It's just not comparable right now. And look, we're going to see this from Alexa. I mean, Amazon has been delaying and delaying, but that's going to come soon too. So Apple's going to have no other choice.

01:00:42   Because it is conspicuous. And I thought of it just earlier on the show, when you were trying to remember all four of the companions who you took with you; I think Gemini was the fourth of four that you couldn't think of.

01:00:56   But it does stand out. We're not even talking about Amazon at this point, though clearly they're working on it. And I think Amazon and Apple are the two, I was going to say weirdos, but exceptions.

01:01:11   They're doing things in very different ways from the others. And Apple's is: let's be very deliberate and slow, but we'll start with very simple stuff. So when iOS 18.1 came out, which is the only non-beta version of Apple Intelligence that anybody out there listening has, it doesn't even do much, right?

01:01:35   There's not even that much to it. It's got, like, the writing tools; it doesn't do any of the image generation stuff. They're releasing stuff, but it's a little bit at a time. And I get the feeling that Amazon, rather than releasing a little bit at a time, is sort of holding back, and then they're going to come out with a big thing all at once. I don't know, I could be wrong.

01:01:55   And I think, you know, as much as there was some controversy, not controversy, but when Apple Intelligence launched a few weeks ago, there were many people on the side of: Apple's late, they're behind, they're clearly behind, there's reporting that shows they're behind. That was one camp.

01:02:13   Then there's the camp, which I didn't actually take a stance on in my piece. It was mostly leaning on Craig who was saying, we're doing this to be deliberate, we're deliberately being slow, we want to get it right, we're doing it all right, because we have a responsibility.

01:02:29   My opinion is probably someplace in the middle of those things. But when you think about Apple and Amazon specifically, I mean, the two big assistants that people make fun of and talk about the most are Siri and Alexa.

01:02:45   Google Assistant as well, but I think to a lesser degree. They have the most riding on it. Right? If people wake up and Alexa can't do the basic things.

01:02:58   It can't turn on the lights, or starts doing those things in a ridiculous, crazy way. People are going to freak the hell out.

01:03:06   Yeah, we have them hooked up to do real things.

01:03:10   To do real, talk about the garage. If the garage just doesn't do the thing, or Alexa's not turning on the lights, or Alexa's turning on your fireplace and like, those all are real things, and these companies built massive businesses around them.

01:03:22   Again, Google has this too, but Google's done it weird in a typical Google way. They've put Gemini Live within Assistant, but Assistant still lives there. It's a mess. It's a total mess.

01:03:33   It's Google that I was thinking of when I retracted my use of the word weird for Apple and Amazon and said they're exceptions and doing it differently, because it's Google that's always the weirdo.

01:03:45   But they've done a weird thing too. If you ask Gemini Live to set a timer, it won't. But you can still go back to Google Assistant and set a timer.

01:03:54   And they've worked out a new way with extensions to set a timer, but it's all like, hacked together. They obviously just wanted to get it out the door.

01:04:02   Again, do I think Amazon and Apple were behind and are playing catch up? Yes, but I also think it's not as easy for them to just say, "Rush it out, rush it out."

01:04:13   And I think the proof that Apple was caught flat-footed to some degree, maybe behind isn't quite right, but just caught flat-footed is the clear, obvious—

01:04:26   I mean, at my live show after WWDC, I asked about it and it was one of those questions that I thought, "Ah, there's no way they're going to give me a straight answer."

01:04:34   And they just answered straightly, "Yeah, that's an issue, is the amount of RAM in the device for on-device processing." And yeah, it's like, Apple Intelligence really needs—

01:04:46   whatever the device, iPhone or iPad or Mac, it needs at least 8 gigabytes of RAM, and so devices with less than 8 gigabytes of RAM don't get it.

01:04:54   And wow, I was like, that's a really clear answer to the sort of question Apple executives usually don't answer.

01:05:01   And the fact that so many recent devices, most conspicuously last year's iPhone 15 non-Pros, don't have that much RAM.

01:05:11   And we know that things like that get set, especially for the iPhone because of the massive 100 million units per year scale, whatever it is that's done, a year in advance, right?

01:05:23   Like the iPhone 17s from next year are locked in at this point. I know there's—I've talked to people at Apple where there are certain little things that could happen as late as December or January,

01:05:35   but for the most—I mean, something like how much RAM is on the actual system on a chip, that's already set in stone, probably months ago, if not even longer.

01:05:44   The fact that they put a lot of devices, especially iPhones, out in the world that don't have enough RAM for on-device Apple Intelligence processing is a sign that they were caught flat-footed.

01:05:54   But I also think Apple is kind of—I don't think they're happy about that, and I think if they could go back in time and say, hey, maybe start putting 8 gigs of RAM in the iPhone 14 or 13, they would, and I think they kind of regret it.

01:06:08   But I think overall they're comfortable—and Tim Cook said it in his Wall Street Journal profile recently, that it's very, very clear it's a long-time Apple message, our aim is to be the best, not the first.

01:06:22   And they're comfortable with that. I think Google was very, very uncomfortable culturally being accused of being late or behind, and it sort of panicked, and still does.

01:06:37   And it's just sort of like, okay, here, fine, here's everything we know how to do. And it is a lot.

01:06:42   And look, they had researchers that worked on these transformer—I mean, of all places to be behind, oh crap, they had that reason to feel like this was happening in our own house, everyone in the house, get it together, get it together, everyone.

01:06:54   It's a different situation across the 405 or wherever they are down in geographic—I don't know. California geography, not my thing.

01:07:05   I do think Apple's comfortable with their position, but I will say I really don't like the ad campaign.

01:07:13   I would say the commercials they're doing for Apple Intelligence might be the most disliked Apple advertising campaign that I can remember.

01:07:22   Because I feel like all of them—my wife even says it when they come on. It's sports season, so she sees them, because we don't really see a lot of regular TV commercials otherwise.

01:07:32   But I'm watching football or when the Yankees were still playing, and she's sitting with me on the couch, and she's like, "Why would I want that?"

01:07:39   There was a commercial where somebody—and some of it too, it just makes the human beings in the ads look rude.

01:07:46   There's one where the actress—she's the actress from The Last of Us, and she has a meeting with an agent or somebody in the entertainment industry, and she's like, "Did you read my script?"

01:07:57   She looks at her phone, and there's an email with the script, and she's like, "Summarize." And she's right there.

01:08:02   It's all—A, it's just rude. Top level. It is rude to show up at a meeting where you're supposed to have read the script not having read the script.

01:08:11   It's rude. It's rude and inconsiderate. And B—

01:08:15   You gave me reading materials before this podcast, and I absolutely read them.

01:08:20   I used Apple Intelligence to summarize your text to me.

01:08:25   B, how stupid do you think this other person is when you're staring at your phone, poking at it, and going, "Oh, yeah, yeah, it's a story about coming of age of whatever the plot is." It was pretty good.

01:08:39   Then it's like the commercial's implicit message is that the other person is so stupid that they don't see that you're reading a summary of the script on the fly right in front of them,

01:08:52   and then that you're lying about it and saying that you liked it when you haven't read it. It's all—all of those things are—I don't see any of that as positive, right?

01:09:02   And then there's the one, too, where Bella Ramsey is her name, right?

01:09:05   Yeah, Bella Ramsey.

01:09:06   Where she runs into somebody, and I actually really think that's a good use case, where she runs into somebody, she sees them at the, like, down the hall or whatever, as like, "Remind me of the person I had coffee with at blank café."

01:09:21   I actually would love that, but that doesn't exist right now on the product.

01:09:25   No. That's the one commercial that—and I've seen some people complain about that one, too—that's the one in the campaign that I like the most, because I am often bad with names.

01:09:34   Oh, I'm so bad with names.

01:09:36   I always have been. I'm not worried that it's a sign of dementia creeping up on me, but I've always been bad with names, so I do look forward to a future where I'll get some kind of assistance.

01:09:51   You know, and I wear glasses all the time now, so I have a leg up on that, where if all my glasses do is just tell me the name of everybody I'm looking at or give me a button where I can do it, I would buy those glasses in a second.

01:10:03   I love how we all want that thing, but we know that that is the worst privacy situation in the history of the world.

01:10:10   And also, by the way, maybe Apple is the only one that can deliver that kind of feature with privacy because of the iPhone integration, but probably another conversation.

01:10:20   Everybody—yep, it is a whole other conversation where the whole idea of face recognition is this enormous can of worms for privacy and civil liberties, and I'm not being blithe about those very serious things that I agree with are true, but in my dream world, I want the feature, I want it available, and I want it to be provably private.

01:10:42   I absolutely want this, and in fact, we had a great story in the Journal this week by a colleague of mine, Anne-Marie, and she wrote about how Apple Notes is being used for all these ridiculously funny things.

01:10:53   And I actually featured one in my newsletter today about how people make stickers of their outfits and put them in notes. Anyway, check that out.

01:11:01   My reason for bringing this up was that we had this conversation about notes, and that, to me, that's the most private thing. If that were to ever leak, I would be devastated, and I would be canceled.

01:11:13   Truly. Not even maybe publicly, but just by friends, right? Because I have very specific notes about people's names.

01:11:21   Mom of so-and-so, where's so-and-so, her name is blank, right? That is what I have to do so I don't seem so rude every time I run into them at a practice or something.

01:11:33   I do it too, Joanna. I would be canceled right with you. A lot of mine, the notes like that are in my contacts, not Apple Notes.

01:11:41   Oh.

01:11:42   But there are ones I have some in Apple Notes.

01:11:46   Talk about this system a little bit, because how would you know the name of the person?

01:11:52   I kind of know which people, I have an intuitive sense of who I've saved a contact from.

01:12:00   So, when Jonas was in school for a lot of his friends' parents, they're all in Apple Notes, because I don't have contacts for most of them.

01:12:10   In fact, I think it was just one Apple Note with Jonas's school friends or something like that.

01:12:16   Yeah, I have that. Yes, exactly.

01:12:18   But I have stuff in there that I definitely wouldn't want to come out like somebody has a gap in their teeth.

01:12:25   That's how you remember.

01:12:26   Right, not the worst thing in the world, but it's like the mom with the gap tooth, it's so-and-so's mom.

01:12:34   And it helps me remember.

01:12:36   No, I have the same crap in mine.

01:12:39   It is only-

01:12:40   This is like the parenting nightmare.

01:12:42   Yeah, it's not rude, and I do trust that my Apple Notes are not going to be leaked publicly.

01:12:48   And it is a note to my future self.

01:12:50   It is John Gruber right now making a note to John Gruber in the future after I've forgotten whose mom she is.

01:12:59   And so nobody has ever intended, and that is, oh yeah, the gap tooth mom.

01:13:03   I know, and I'm like, oh yeah, yeah.

01:13:05   And that's-

01:13:06   And you also know you can go search gap tooth mom, and it will come up to me right there.

01:13:10   Because I know that that's-

01:13:11   This is how I use it, right.

01:13:12   Because I remember that's the thing that I would put in the note.

01:13:15   Right, and then you go and you're like, oh yeah, Lucy, right.

01:13:17   Okay, hi Lucy, how are you? And then you don't feel-

01:13:20   Right, but if somebody ever uncovered all of them, it would be like an episode of Curb Your Enthusiasm where everybody's going to be mad at me because there's something in there that's-

01:13:29   And I'm like Larry David, I'm like, I didn't mean it, I didn't think you'd ever see it.

01:13:33   It's just how, and you do have a gap tooth. You do. I mean, let's face it.

01:13:39   This is so, I don't know how we got, well, I do know how we got here, but yes.

01:13:44   But you know what, but it's funny to circle it back. We're not getting that sort of companionship from our AI friends yet, right?

01:13:52   Like you can't keep a secret with ChatGPT like that. Like just between me and you, remember that's the dad who walks with a limp or something.

01:14:00   Right, right.

01:14:01   Or looks really old, you know, like somebody has a really old looking dad or something like that.

01:14:07   You know, these unpleasant things, but that's the thing that I remember. I can't ask Siri about that.

01:14:14   No.

01:14:14   I can't ask ChatGPT. And in the future, I'd like to have a trusted, you know, where every bit of- and Apple's the one who's sort of going towards that.

01:14:24   They haven't said anything about a future version of Apple Intelligence having contextual access to your notes.

01:14:31   I know they're talking about email, like their-

01:14:33   Email calendar.

01:14:35   Yeah. The Halo demo that does not exist in any shipping form in beta yet is that the woman-

01:14:42   Pick up the mom at the airport.

01:14:43   Yeah. When's my mom's flight? When's my mom flying into town?

01:14:46   Yes.

01:14:47   And it involves Siri knowing the emails that her mom had sent and it involves Siri having access to the calendar where maybe you put the flight information and stuff like that. None of that's there.

01:14:59   But if Siri had- or Apple Intelligence, I guess, had access to my notes, it would be very helpful to me, but I really need to trust it.

01:15:08   I mean, because- and I don't think my notes- I'm sure there's a lot of people out there with a lot worse stuff in their notes than mine, but it's private.

01:15:16   Very private. Well, yeah, we have a great story about just all the weird things people are doing in notes.

01:15:23   I have not seen that, but I made a note of it here and I will put it in the show notes.

01:15:28   The other thing, before I forget to bring it up, is one of the interesting examples from your video with these four chat companions is that none of the four that you can have these ongoing conversations with can do set a timer for six minutes.

01:15:48   Right.

01:15:49   And it- I don't know why timers and everybody in our field, me, you, everybody who writes about these things, we all turn to timers as the shortcoming.

01:16:00   And famously, Siri couldn't set more than one timer at a time.

01:16:04   Yes.

01:16:05   I know you've talked about this.

01:16:06   That was a great success.

01:16:07   I feel great success and partial credit to all of us in this industry who forced Apple to create multiple timers.

01:16:15   There's like some engineer who was forced, he's like, "Yeah, I worked on that. I worked on that."

01:16:20   If you were the engineer to have worked on that, me and John and so many others out there are so proud of your work and we know it's thankless, but we are very proud of your work.

01:16:29   I might have, probably not with you, but at some point I think this has come up on this podcast before, but we have HomePods and an Alexa in our kitchen.

01:16:39   And you've met my wife. She is not into our world at all. She doesn't read my site every day. She's not a tech enthusiast.

01:16:48   And me getting permission to have two different talking devices in our kitchen is really out of character for her.

01:16:57   But it's because Alexa for so long was the only one that could set multiple timers and Amy's the one who does cooking where there might be two different things going.

01:17:07   In a kitchen, for even a mildly complex meal, it is actually more unusual if you only need one timer at a time.

01:17:18   And so Alexa, effectively, she's there just for multiple timers.

01:17:22   But at some point they had added multiple timers to HomePod and that was Apple's answer, which was like, every time I would go and check just to make sure it doesn't work or go check with the PR team, they would say, "Well, it does work on the HomePod."

01:17:34   I mean, I practice timer parenting. I don't know if this is a thing. I should probably ask some parenting expert, but I'm constantly setting timers for my kids.

01:17:43   Like five minutes. Yeah, five minutes till we are going to leave here.

01:17:47   You can finish watching your iPad for five minutes and we're going to eat dinner. Five minutes that you can sit on the potty, you know, all the things.

01:17:55   And so I need multiple, and I have two kids, and I've got cooking. Multiple timers are essential to my life.

01:18:01   Yeah. Reading time was mandated. Read a book for 30 minutes. We definitely use that a lot. I mean, Jonas is 20 now, Jesus.

01:18:09   And you're still like, read a book for 20 minutes.

01:18:12   Yeah, but I can't make him do it anymore.

01:18:15   Right. Yeah, I don't think I can make my seven year old read a book for 20 minutes still.

01:18:21   But it was interesting that these other ones, which are so much more advanced conversationally, can't do the device timers and dumb old Siri is pretty good at it at this point.

01:18:35   Yeah, but they can't do all that fundamental device control. The things that I think we would now call more voice assistant than AI companion or whatever.

01:18:45   These are the assistant tasks. Set a timer, set an alarm, play the music, add a reminder.

01:18:52   There are so many of those things that are essential to what we do with Siri or Alexa.

01:18:58   And that's where that conversation with Craig really went, which was that we need to be able to get both things right.

01:19:03   And I assume it's the same thing that's happening at Amazon with Alexa, especially with Alexa.

01:19:08   Like, when did my package arrive? Would I order this thing off Amazon? All of those things we talked about at the beginning of the conversation where we've trained our language.

01:19:18   We know the vernacular and the words to say to get the assistant to do the thing.

01:19:24   Right. Right. And in some ways, it really is the same thing that has made people like us able to make sense of the terminal interface.

01:19:35   Where you have to enter the commands in this order or it's not going to work.

01:19:41   Or if you screw up the RM command, it's going to permanently delete files that you didn't want deleted.

01:19:48   And it's, oh yeah, of course, because that was my fault because I entered the command wrong.

01:19:53   And we're like, ah, curse me. You get mad and you curse your luck. But you'd think, well, it was my fault.

01:19:58   I put the command in wrong. And we, our minds naturally go that way.

01:20:02   And most other people do not. And they shouldn't. They shouldn't be expected to learn the magic way.
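[Editor's note: the `rm` footgun described above can be sketched in a few shell lines. The file names and directory are hypothetical, purely for illustration; `rm -i` is one common guard.]

```shell
# A minimal sketch of the failure mode: rm has no undo, and a small
# typo silently deletes the wrong files, permanently.
mkdir -p /tmp/rm-demo
cd /tmp/rm-demo
touch draft.txt draft.bak notes.txt

# Intended command: remove just the backup file.
rm draft.bak

# A typo'd glob like `rm draft.*` would have taken draft.txt with it
# too -- there is no trash can on the command line.

# Safer habit: -i asks before each deletion; with no "y" on stdin,
# the file is left alone.
rm -i notes.txt < /dev/null || true
```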

01:20:08   And even to learn something as simple as, it makes total sense to me that Siri and Alexa are now very good at setting timers on devices.

01:20:18   And these other ones aren't because these chat GPT and copilot style ones aren't, don't have the device context.

01:20:28   They don't run, they're just apps on your phone, right? So it's kind of, it makes total sense to me.

01:20:34   But to a normal person, it's like, I'm just talking to these things, set a timer.

01:20:38   If you're supposed to be so smart, I could hire the world's worst human assistant.

01:20:43   Somebody who is so inept and bad at their job that I'm going to have to have an uncomfortable conversation and fire them.

01:20:49   But they could set a timer for five minutes if I told them to.

01:20:53   Exactly. And when you look at what Google did, they seem to take that as a moment to say, okay, we'll still have this added on functionality for that.

01:21:03   We need to have that functionality. And I believe now I have to look on the Pixel and I have to set it up.

01:21:09   And I'm sure there are going to be some Android listeners that will say, yeah, that's just an extension now.

01:21:13   You just have to plug it in. But that's still complicated. It's not what I would see Apple doing or even Amazon doing, right?

01:21:21   They know, absolutely know, what the most used things are. Or the most used, most requested things of Siri and Alexa are.

01:21:30   And they're going to make sure that those things do not break.

01:21:33   Yeah, I think so, right. And it's just, they're approaching it from a very different perspective than these other, the other chat makers.

01:21:41   Here, I'm going to hit the money bell one last time.

01:21:44   Ding.

01:21:45   What do you think of this? What do you think of this gimmick? Let the people know.

01:21:49   I mean, it does break up the show.

01:21:51   Yeah, I'm going to, I'm going to stick with it. Anyway, I want to take-

01:21:55   Is this the first time you've used the bell?

01:21:56   Yeah, I used it last week with Merlin. But Merlin's the one who had the bell. I said I wished I had a bell.

01:22:01   But I didn't know. It turned out he had a bell on his desk and he hid it for me. And then I went on Amazon and bought myself a bell.

01:22:09   Anyway, that was the money bell. And I got to thank our third and final sponsor of this episode.

01:22:14   And it is our good friends at Memberful. Memberful is best in class membership software used by many of the web's biggest independent creators, publishers, and even bigger media companies.

01:22:28   It's really, really good. It lets you offer membership perks and exclusive content to your loyal followers and fans, readers, listeners, watchers, whatever it is that you're creating.

01:22:42   Giving you full control over who has access to your blog posts, email newsletters, online courses, podcasts, private community chats, and more.

01:22:53   And with Memberful, you maintain full control over your brand, your audience, and your business.

01:23:00   If you ever want to leave, you can just export the full membership list, take it with you, build your own system, take it somewhere else.

01:23:07   It's a sign of confidence that you won't want to leave Memberful, that they make it so easy to export and leave if you ever wanted to.

01:23:13   Their brand doesn't get put in front of yours. It's not, for example, Substack, where everybody who's on Substack looks like a Substack and says Substack.

01:23:22   Memberful is behind the scenes. It's quiet. Your brand stays in front. People who sign up and are happy members of your site may not even know you're using Memberful.

01:23:33   It's that behind the scenes, which I think is super, super creator-friendly.

01:23:37   Full customization, you can set up and manage your membership program with Memberful's intuitive platform, create multiple membership tiers if you want to, and payment options that cater to different audience segments.

01:23:48   Seamless integration, they integrate with all the tools you already use, like WordPress, MailChimp, Google Analytics. Discord, Discord's a big one where you can gate access to your community Discord for members only through Memberful.

01:24:02   They already have an integration with that to make it as easy as possible.

01:24:06   It's just a great way, and it is clearly the trend for independent media going forward.

01:24:12   And they have great customer support, too, where you can contact somebody for help for your particular case, and they can make recommendations because they know from their experience what sort of things work and which don't for building it and making it successful.

01:24:27   Like which sort of things you should make members only and which ones you should put out there for everybody else so that they want to become members of your site.

01:24:35   They help you with that.

01:24:36   So anyway, where do you go to find out more?

01:24:38   Go to memberful.com/talkshow.

01:24:51   As we head down the homestretch, have you had this experience? I think everybody in our racket dealing with Apple has, where you encounter an issue or a problem or an observation.

01:25:03   And when you're communicating with Apple PR about it, they act like, "Ah!" They don't say you're crazy.

01:25:11   They don't gaslight you per se, but they act like it's like, "Ah, nobody's ever said that before."

01:25:18   Huh, which I guess is sort of gaslight-y.

01:25:21   But for me, I ran into this testing the iPhone 16s in September.

01:25:26   And they didn't say you should install the 18.1 beta, but they didn't discourage it.

01:25:33   And for reviewers who don't know how to do it, they gave instructions.

01:25:37   And so I used them for a couple days with 18.0, which was what they were going to ship with in the box, which I thought was important to note. If you buy it when you first unwrap it, here's what you get.

01:25:49   And then I put the 18.1 beta on, and that's when I first got the new Siri.

01:25:56   And new Siri's voice is definitely more human.

01:26:00   And I still use, I forget, they don't name their voices cutesy style anymore, they just call them like voice one, voice two.

01:26:08   I think I'm a voice three person, which is sort of the traditional, like, original Siri female American voice.

01:26:18   Let's see here, oh nope, it's voice four. American voice four is my Siri.

01:26:23   And it definitely sounds more realistic.

01:26:26   But also is talking to me way too slow. It's so slow.

01:26:33   I have American voice two. I've been playing around with them though, so.

01:26:36   I'm going to start playing around with them.

01:26:38   The colors of the sky fade with the setting sun as the stars begin to shine through the clear night.

01:26:43   Sorry, it always says download it.

01:26:45   The sky fade with the setting sun as the stars begin to shine through the clear night.

01:26:50   The colors of the sky fade.

01:26:52   The colors of the sky fade with the setting sun as the stars begin to shine through the clear night.

01:26:55   The colors of the sky fade with the setting sun as the stars begin to shine through the clear night.

01:26:56   Sorry, you were saying yours is slower, and now I'm trying to download this, and sorry.

01:27:00   I don't hear the slowness in those canned examples when you're trying the voices.

01:27:05   But just day to day talking to it and getting verbal answers back, it seems like new Siri simultaneously does sound more human in inflection.

01:27:17   Which has been like a slow boiling frog as Federighi told you in your interview over 15 years now.

01:27:25   It's gotten more and more realistic incrementally every couple years.

01:27:29   But I find that it talks way too slow and they were like, "Ah, nobody said that."

01:27:34   And maybe I'm the only one who noticed, I don't know.

01:27:36   But then if you go into set it on your phone, you can go into settings, accessibility, Siri and there's a talking speed.

01:27:43   And I've turned it up to like 110%, which to me still sounds a little slow.

01:27:49   But the other thing and Amy agrees with me because at one place I hear the Siri voice all the time is when we are driving and going somewhere and getting directions.

01:27:59   Is the new Siri has like a Gen Z vocal fry that I find annoying and it's not friendly but it'll be like, "Get off at exit 347?"

01:28:16   It's like, "Why? Why are you talking like this?"

01:28:18   And I realize that it is sort of a verbal tic of younger people to sort of inflect like that.

01:28:26   But I don't want my phone talking to me like that.

01:28:29   I don't know. Have you noticed this or is it all just me? Amy definitely notices it.

01:28:33   I don't know if I've noticed this from voice three.

01:28:35   Yeah, I think it might be a voice four thing.

01:28:37   And I'm going to try switching after this, after we're done recording and see if I notice that.

01:28:42   And I just switched too. I've just downloaded.

01:28:45   It says still downloading voice four.

01:28:48   I think this is some sort of bug because I've got very good connection.

01:28:51   It should be downloading or downloaded already.

01:28:54   I noticed something different than that, which was when I first was using the beta of iOS 18.1 in my car.

01:29:01   It would just misunderstand in a real bad way.

01:29:06   It didn't make any sense if I would ask, like, direct me to this or play this.

01:29:13   They were very, very egregious errors.

01:29:17   So that's gotten better, I feel.

01:29:21   I want to keep an eye on that, but I feel like that was my biggest thing about the switch.

01:29:25   Yes, but to your point, yes, and to Apple's credit, it has gotten to sound more natural.

01:29:30   They did really highlight the fact that if you ask something and then you flub or if you like,

01:29:36   "Oh, actually I meant this," it picks up on that.

01:29:39   I've noticed all of that working very well.

01:29:41   And that happens to me frequently. I'll say, "Oh, turn off the lights."

01:29:45   Actually, I mean, I'm not going to say it right now.

01:29:48   I don't know why, because ostensibly as a podcaster, I should be fairly good at putting sentences together without stammering.

01:29:57   But when I'm talking to Siri, I guess I know I'm talking to a device, and so I don't devote my full attention to it,

01:30:03   because I don't have the respect I have for it that I have for either a person I'm talking to physically or, like, here.

01:30:10   I'm conscious of the fact that tens of thousands of people are going to listen to the podcast,

01:30:15   and so it has my full attention in a way that when I'm just asking Siri to open the living room shades in the morning, it doesn't.

01:30:24   Right. I think it makes sense.

01:30:26   And sometimes for me, I'll say, "I want to do this thing," but then I just slightly change it, and that works pretty well.

01:30:34   Yeah, because you change your mind halfway through, because you started the command before you've really thought it through.

01:30:39   You're like, "Oh, no, actually, make it 11 minutes, not seven minutes," or something like that.

01:30:43   Like, "I was going to heat up a piece of pizza, and the oven's not even hot yet, so yeah, seven minutes is not going to be hot.

01:30:49   Give it ten minutes," and it's like you change it halfway through.

01:30:52   Definitely, it's a real noticeable improvement, and also one that I've already taken for granted.

01:30:59   No longer seems impressive.

01:31:01   Which just goes to what Craig was saying, which is that we're just going to keep wanting more and more.

01:31:06   But to be fair, it is a pretty low bar still.

01:31:09   I mean, I think for me, and I highlighted this in my Apple Intelligence review, is that when you say that Siri's getting better,

01:31:16   people expect some of the worst parts of Siri to get better.

01:31:21   And I think that one of the worst parts of Siri for me isn't like, speaking of the great Larry David,

01:31:28   there's that great Larry David scene, which I think you linked to a few months ago.

01:31:31   Yes, yes.

01:31:32   It's not the misunderstanding. I mean, the misunderstanding happens and it's funny, or it triggers all of the devices when you want to ask one,

01:31:39   and it's triggering your computer or the HomePod. Those are all, I think, other Siri quirks that we live with, and it's fine.

01:31:45   For me, it's search the web, right? Something that I think is pretty normal, or I expect it to have the answers, and it tells me to search the web,

01:31:52   is where I think that, like, Siri stupidity lies.

01:31:57   Yeah, totally, and it just feels frustrating. And now that these other bots can do it, it really stands out as an omission.

01:32:09   It's like, don't just tell me to go to Wikipedia. I'm asking you verbally, because I want a verbal answer,

01:32:15   and I'm 100% certain that you could parse the Wikipedia page that you're sending me to and get the answer from the first paragraph.

01:34:23   I know you could do it, and you're just refusing to.

01:32:27   You're lazy. You're being lazy, Siri.

01:32:29   Yeah.

01:32:30   [Laughter]

01:32:32   The Larry David thing was so funny, and I saw an interview, it was him talking to Siri in his car and getting angrier and angrier at the directions

01:32:41   and the argument they're having over trying to get the directions, and somebody asked him about that scene, and he said it was complete, you know,

01:32:49   like almost everything in the show is totally based on real life, except that in real life, he got way angrier.

01:32:55   [Laughter]

01:32:56   And he said, usually the me on the show is actually a worse version of me than the real me, I think.

01:33:02   And he said, in that case, the real me was ready to drive the car off the cliff.

01:33:07   [Laughter]

01:33:09   I mean, I think also, he's just like fully, like he's cursing. I linked to it a few months ago, and some readers got mad because they said it was just completely not suitable.

01:33:18   The curse words.

01:33:19   He's really cursing.

01:33:20   But sometimes you really need curse words, and sometimes when Siri and these things really let you down, there's no other way to express yourself completely.

01:33:31   The other thing I've noticed with the ChatGPT integration with Siri and Apple Intelligence is it's clearly not the ChatGPT you get in the app.

01:33:46   It's just like pure access to the model, which is powerful and does amazing things like write poems that Siri or Apple Intelligence can't do.

01:33:58   But it doesn't have the integration with the live web results that the ChatGPT app does, right?

01:34:05   So there's the whole training-window problem where, oh, this model was trained on data up to the summer of 2022, and that's like where all of its knowledge ends.

01:34:16   So it doesn't know anything that happened after whatever the cutoff date is.

01:34:20   And when you're using the ChatGPT app now, I don't even notice that anymore because behind the scenes it'll go on the web and get live answers for last week's election or sports, the World Series that took place last month or something like that.

01:34:37   And when you're using the ChatGPT integration in Apple Intelligence, you don't get any of that.

01:34:43   It just isn't there.

01:34:44   And again, if that integration in Apple Intelligence wasn't there, I wouldn't miss it because when I want to ask a question like that, I don't go to Apple Intelligence because I know it's not going to work.

01:34:55   I go to the ChatGPT app.

01:34:56   So I'm not even sure why just use the ChatGPT app for ChatGPT things isn't Apple's answer to these questions.

01:35:04   See, I'm just, and again, I'm doing more testing and I'm planning to do something more on this in the coming weeks.

01:35:10   So I haven't really played around deeply with the ChatGPT stuff, but when is it going?

01:35:15   I just don't know if I have the best handle yet on when it is going to go ask ChatGPT.

01:35:20   I mean, I know it's asking ChatGPT in the writing tools now when you specify a prompt.

01:35:25   So if you say write an email to Jon Gruber telling him, I will definitely be on his podcast, but I can only do it at these times, make it professional, blah, blah, blah.

01:35:33   It will use ChatGPT.

01:33:35   But that's how I'd know you used ChatGPT: if it came across as professional.

01:35:39   Right.

01:35:40   That's using ChatGPT there.

01:35:41   But when I go to Siri and I type something, I'm not getting anything going to ChatGPT beyond like my poem ask.

01:35:50   No.

01:35:51   And I feel, and it's one of those things.

01:35:55   I really do think that sometimes talking to Apple, like you and I can through Apple PR in a way that is rare.

01:36:04   It is sometimes like talking to an LLM, because they don't want to give you an answer, in the same way that these LLMs won't just say, and are technically incapable of saying, "I don't know."

01:36:15   Right.

01:36:16   So rather than tell you, "I don't know how to make your cheese stick better to your pizza."

01:36:22   I'm sorry.

01:36:23   I realize that's a problem, but I don't know.

01:36:25   They'll just make up an answer like use Elmer's glue.

01:36:28   And Apple, when you ask them questions like, when does it go to ChatGPT?

01:36:33   They're like, huh, that's a good question.

01:36:36   And because they don't want to answer, instead of saying, you know what, we don't want to give an answer to that, because we want...

01:36:41   I found another example though.

01:36:42   All right.

01:36:43   I know I'm like, I'm like the Apple PR, but it says, it says I asked it to do a gluten-free banana pudding recipe and it went through to ChatGPT.

01:36:54   But when I asked it before, just for a meatball recipe, it did not go to ChatGPT.

01:37:00   See, I don't know how you, I don't know how to explain that.

01:37:03   Live testing here on the show.

01:37:06   Yeah.

01:37:07   And when you interviewed Federighi and of course on camera, he's going to stay on message, but it was a great interview and it was insightful in many ways.

01:37:17   But in terms of getting an answer like that, they just, they know it's a thing and they know when and why it's going to ChatGPT, obviously.

01:37:27   But they don't want to talk about it.

01:37:28   And I think part of it is that they want to be flexible so that if they have a, if their on-device processing is better in three months than it is today, that it will do things on device that it previously went to private cloud compute.

01:37:44   And maybe it'll use private cloud compute for things that previously it was handing off to ChatGPT.

01:37:50   And they don't want to give an answer in November that may not be true in March.

01:37:55   But they don't want to say that either.

01:37:57   I wish that they, in the same way that I really, really wish ChatGPT would just say, "I don't know how to answer that."

01:38:04   When it doesn't know how to answer it, I wish Apple would tell me, "Yeah, we just don't want to answer that."

01:38:09   Rather than talk around it.

01:38:11   Yeah.

01:38:12   And it, yeah, I mean, I think there's a lot of things going on because I'm also using this too.

01:38:16   And it's using obviously the knowledge base of Wikipedia, which it has had before.

01:38:20   So, and it's using whatever other web sources it's already brought in.

01:38:24   So clearly there's some segmentation of "use ChatGPT for X, Y, Z types of prompts."

01:38:32   And to your point, maybe we can ask Apple and they will get back to us.

01:38:37   Or maybe they will just have an LLM response that is just, we are looking into it, but thank you for inquiring about this.

01:38:43   Pretty much.

01:38:45   I don't have anything else that I wanted to talk about, at least on this subject.

01:38:49   And you're, I always appreciate, I love talking to you, Joanna.

01:38:52   I love this conversation. I mean, I took a lot away from it, but especially that I must watch these movies.

01:38:58   I, and I realize now the one thing I will, it's the last thing I wanted to talk to you about, but I'm a little, I'm a little hurt.

01:39:06   I learned only through your most recent video after I, I don't know which one of them, but one of those bots started addressing you as Joe.

01:39:17   And you said, well, it's interesting because I didn't tell it to call me Joe, but my closest friends call me Joe.

01:39:25   Well, you never told me to call you Joe.

01:39:27   You can call me Joe. My family calls me Joe. My friends from like high school, college call me Joe.

01:39:33   My wife sometimes, yeah, everyone calls me Joe.

01:39:36   Yeah.

01:39:37   You can.

01:39:38   All right.

01:39:39   You can do it. You're there. We're there. I didn't feel like I was there with ChatGPT, but I mean, it really took like that.

01:39:44   Yeah. I think my only other thing that I wanted to say was you wrote an amazing essay last week, and I don't know if you've talked about it on your show, but I can interview you about that if you'd like.

01:39:53   We, we, I did talk about it with Merlin Mann last time, the last episode, but that's all right. It's all right. I did. And thank you very much.

01:40:00   The response has been, it's an unusual essay for me, and I will just say it generated way more responses than I usually get to anything I write, and every single one of them, not even 99, but 100% of them that I've seen have just been gracious and wonderful.

01:40:21   And I'm trying to, unlike my usual self who just reads them and gives up on answering, I'm trying to answer everyone, and if anybody wrote and I didn't answer and you're listening, know that I did read it, and I thank you. It was very, very touching.

01:40:35   No, I think it was, it was, I loved it, and I think sometimes we don't get to see the other sides of people. I don't even know if it was so much of that because I feel like I do know you a little bit as a friend in some of your personal life, but I just, the way you've threaded that was so, so wonderful, touching.

01:40:52   Yeah, thank you, thank you. And it's, it's, and I always, Amy always knows, Amy knows that when I'm the most uncertain about publishing something, it turns out to be like one of the best things I've written.

01:41:07   It's, I, my mom died back at the end of June, very end of June, and like her funeral was July 1st, so I, for a while I was saying July, but it was the funeral, it was the end of June, but anyway, it was summer, and I had no words to write about it, but I thought about it, and thought about it, and it's been there, and then the election happened, and my dad lost his ring, and the whole thing just came, it just,

01:41:33   Yeah, it was per-

01:41:34   It just formed in my head, and it's, yeah, that thing where your mom died has been turning in my head for so many months, and it just is like, oh, I need to write this, and I think people might, it might suit the moment for a lot of people to read, and it took me like three days to write, and by the time I got done with it, I was like, ah, this seems self-indulgent. I don't know.

01:41:57   No, I think that's where it was like your story, but I think, at least for me, it was so relatable, I'd also gotten some bad news about a family friend last week, and it was just, everyone was, not everyone.

01:42:08   Wow, unfortunately, not everyone.

01:42:11   Yeah, everyone was sad.

01:42:13   But many people.

01:42:14   A lot of people were sad, and it just was a, it was beautifully written. Actually, I was wondering though that about your writing, and you're very out and open about your politics.

01:42:25   Mm-hmm.

01:42:26   And you must have some readers. I mean, look at the makeup of this country right now. I mean, you must have some readers that don't, that did not agree or did not vote Harris. Do they share with you? I mean, do they?

01:42:40   So that's an interesting, very interesting, I like this bonus interview.

01:42:45   Yeah, this is my podcast that I don't have.

01:42:48   But again, I often hear this when you're on the show, this show, is that people love it because you tend to take over.

01:42:56   I waited two hours. I know, 51 minutes.

01:43:00   And again, I'm often reluctant, my nature is to be reluctant to talk about the behind the scenes thing, but on this particular one, I'm actually happy to. I, if it, so I started Daring Fireball in 2004, and I had very strong opinions politically about the George W. Bush, Dick Cheney administration, and especially the second term.

01:43:25   And I almost never wrote about it on Daring Fireball. And it wasn't because I was just starting this and I don't want to piss people off.

01:43:32   I just thought, even though I had extremely strong opinions, I felt in the natural way, in a way that people who don't agree with me politically still think I should separate them because I come to your site for tech, please leave this stuff off.

01:43:47   And I just, I kind of felt that way about it then. And I also didn't spend much time, even though I really, really like Barack Obama, and I really, his, especially his 2008 election was one of the best days of my life.

01:44:02   Me and Amy and Jonas, Jonas was four at the time, we voted in the morning and flew to Chicago to be with our friend Jim Coudal and his lovely family, and we went to Grant Park for the big, it was like, I don't know, 100,000 people?

01:44:18   I mean, it was an enormous place and it was packed to watch the election results. And I mean, I've had this feeling many times, but I thought he was going to win.

01:44:26   I mean, it's partly why we flew to Chicago, but we didn't know, you don't know. And we're there for his speech and it was remarkable.

01:44:34   And they didn't know who we were, but there's, I'll put a picture here. I'll make a note and I'll send it to you then.

01:44:41   But it was on the front page of the Chicago Tribune the next day, it was a picture of people from Grant Park and you can see me with Jonas on my shoulders and Jim, his son Spencer was a couple years older than Jonas, but was on his shoulders even though he was six or seven at the time.

01:44:59   And you could see Amy and Jim's family, his wife Heidi, were in the crowd. It's not a picture of us per se, but especially Jonas and Spencer as the kids who were there.

01:45:11   I got a, you know, and you can buy, I don't know if the journal does this, but you can buy pictures from the newspaper and we bought, we have a big frame version of the picture. It's just this great moment.

01:45:21   I didn't spend a lot of time writing about Obama in that time either. So it's not just that I hate on Republicans and it just didn't feel the place.

01:45:29   But when Trump got elected in 2016, to me, it crossed a line that's other than politics. It's not just policy and you agree with it or you don't agree with it and it's traditional conservatism and liberalism in a left-right bend.

01:45:45   It's these bigger things like just truth and lies and competence and abject stupidity. Or as I love the word, kakistocracy. I can't say it, but I love the word and it's a word that means government of the least competent people.

01:46:04   And, you know, nominating this idiot Gaetz from Florida to be the attorney general, it goes beyond the definition of kakistocracy. But the whole first Trump administration was full of this stuff.

01:46:17   And then he tried to overturn the results of a fair and free election. That to me isn't political in the traditional sense. It's different.

01:46:27   And I couldn't see keeping my mouth shut about it for four years. And yes, I definitely heard from people who were upset about it, people who still liked Trump while he was the president.

01:46:36   And I guess I lost some number of readers. I mean, I don't really pay attention to analytics that much. It seems like my site is popular and I feel like when I did pay attention to stats, it was not helpful.

01:46:54   And, you know, it was my wife. And this time around, was it any different? I mean, it seems...

01:47:00   What's interesting is when I posted that, how it went, the essay about my mom and never really talking about the election much other than my experience watching the results come in, in a very nerdy, data-driven way.

01:47:27   Some people stopped reading from 2016 to 2020. And other people who I think are more... They disagree and they still voted for them this month, but I think are less passionate about it.

01:47:42   Whatever their reasons for voting for the son of a bitch, they're not as lava-in-the-veins about it, and they just wrote a very nice thank-you note and acknowledged that we have different opinions on it, but that it was a lovely thing to share.

01:47:59   But here we are again, facing down four years of this, and I don't have a strategy for it. I posted it. I don't know if you saw it. I posted last night.

01:48:13   There was Mike Tyson, 58-year-old Mike Tyson is fighting. Now, by the time this podcast is out, it'll be over, but on Friday, November 15th, he's fighting Jake Paul, the YouTuber.

01:48:29   It's a real fight, though. It sounds like a stunt, but at the real weigh-in for this fight, which has to take place before a sanctioned official boxing match, Tyson slapped him in the face, and I linked to the headline at ESPN, Mike Tyson slaps Jake Paul in face, and then I quipped, "The winner of the fight gets to be the next secretary of the treasury," which made me laugh out loud, I thought, last night, sitting on the couch.

01:48:58   I literally laughed out loud before I wrote it, and I thought, "Oh my God, I've got to write that."

01:49:03   And then you realized there's a chance that could be coming.

01:49:06   Yeah, with Matt Gaetz as the nominee for attorney general, I actually think Mike Tyson as secretary of the treasury would actually—if I could have one or the other, I'd rather have Mike Tyson as secretary of the treasury, in all seriousness, than pedophile, creepo Matt Gaetz as attorney general.

01:49:24   I really would. It's turned that much into pro wrestling.

01:49:28   Yeah.

01:49:29   I mean, Hulk Hogan actually went to the Republican convention. He's got to be mad that he hasn't been nominated.

01:49:35   As you say, there's a lineup of people. Kid Rock, there's a lot of people that could be up for some jobs that we don't know.

01:49:40   Yeah, Kid Rock, I mean, why is he—he should be in charge of alcohol, tobacco, and firearms, right? I mean, it seems like he loves them all, so—

01:49:48   Or just education. That's right there for the taking.

01:49:51   It's pretty much the same thing that you just described. Anyway, I didn't want to get into politics. I've just been wondering about it.

01:49:57   No, I realized, for example, you know, there's no way that you are, you know, you work at the Wall Street Journal, and you have a beat, and your day job, you have to stick to the beat, of course.

01:50:07   And other people who are more independent creators who the election came and went, and there's nothing on their site about it, whether pro or con.

01:50:17   I don't pass any judgment on them at all, and you write what you want to, but I realize the way I do that is a bit unique in our field.

01:50:27   I mean, we have very strict standards and guidelines at the Journal, so I'm not even allowed to really share much on politics other than news headlines.

01:50:35   And I mean, I can stretch that, but I just—I've tended not to. I've written some pieces in the last couple weeks, I wrote a lot about text spam.

01:50:43   And I got this crazy amount of text spam a few weeks ago, and it was completely all right-wing, like, insane amount, and I wrote about trying to track down, because I did text "stop."

01:50:56   The story is basically, I texted "stop," and I got a flood within the first 24 hours, I had another 30 messages, and it went on for three days, and I had about 100 messages.

01:51:09   And so I wrote all about this, but they were all from Trump's—from PACs, Trump-supporting PACs, all of those things, and so a lot of people assumed I was a Trump supporter.

01:51:19   And so I got a lot of hate mail from people saying, "You're gonna support him?" And then this week I tweeted something, where I was on NBC News, and I said something about why people are leaving X, they had interviewed me,

01:51:32   and I got a lot of people from the other side saying, "You're this, that," and so I try to let it all be out there, and of course, I've also told my standards people at the Journal, and I've always said, "If these are social issues that affect my life, I'm going to be out there talking about them."

01:51:47   On social media?

01:51:48   Yeah, on social media. I mean, I wasn't quiet around October 7th and what's happened in Israel, and there are certain—you know, I'm not quiet when things are about LGBT rights, I'm not quiet about those things.

01:52:00   People can make their assumptions, but I try to stay away from it all.

01:52:05   Yeah.

01:52:06   But—

01:52:07   Then there's Nilay's prescient, I fear, line, with RFK Jr. apparently being the nominee for Health and Human Services: "A vote for Donald Trump is a vote for school shootings and measles."

01:52:19   Which is very well said.

01:52:21   Oh, I loved it, and it's really—it's both—it's what makes us human is the—I'm going to butcher the paraphrasing, but the F. Scott Fitzgerald line that the sign of a first-class intellect is being able to hold two opposing thoughts in your head at the same time.

01:52:41   So you can both find it funny and find it heartbreaking at the same time, right?

01:52:46   It is. It is a vote for measles.

01:52:48   It is a vote for more school shootings.

01:52:50   It is.

01:52:51   And those are—vaccines are one of the great technologies of all humankind.

01:52:59   It is just absolutely astonishing how many people either died, or went blind, or, with polio, wound up unable to walk from these diseases, from the time human beings existed until vaccines and the wide distribution of them.

01:53:17   And nothing is as electric emotionally as school shootings.

01:53:25   It's thank God that you can—if you want to put on your data analytic cap, you can say very few kids as a percentage are in schools with a school shooting in any given year.

01:53:38   But it's very small consolation when they keep happening over and over and over again and randomly, anywhere, right?

01:53:46   So it's also very much human nature that anything that's randomly reinforced is on your mind.

01:53:55   Yep.

01:53:56   Yeah.

01:53:57   I guess if you want to be really dystopian and to bring the whole episode full circle, the bots, my friend bots, they don't need vaccines.