
ATP

635: An Effective Operator

 

00:00:00   I have a weird problem. I have many problems, lots of which are probably weird.

00:00:05   Well, you've come to the right place.

00:00:06   I have one specific problem that I'd like to discuss. So I have what I've always called,

00:00:12   and I know it's not an official name by any capacity, an AirPods Pro Mark II. So what I

00:00:16   mean by that is there were the AirPods Pro and then there was the second iteration of the AirPods Pro

00:00:23   where it's still a lightning charging case, but it got like the buds themselves got a lot more

00:00:28   smarts. I forget the actual delineations.

00:00:30   That's the AirPods Pro 2, but the 2s, they had like a quiet, like, 2.1 version, or I guess

00:00:37   maybe they called it, like, 2.0 in their notation. So the AirPods Pro 2 have had two versions,

00:00:46   the lightning port case version and then the USB-C case version. You can't just buy a USB-C case for

00:00:53   the lightning port version because it's actually like a slightly different hardware revision. And the

00:00:57   USB-C version, which is still called AirPods Pro 2 but has the USB-C case, that version has that

00:01:03   special, like, lower-latency mode for the Vision Pro.

00:01:06   Yes, yes. I know exactly what you're talking about. So I have had these AirPods Pro Mark II. I know that again,

00:01:12   not the right name, but I'm just gonna call it that because that's what I think of it as.

00:01:14   Well, wait, well, which one do you have? Do you have the first revision of the 2s or the second?

00:01:18   The lightning charging case one.

00:01:20   So they're called AirPods Pro 2, the first ones. I don't know why you want to put Mark in there.

00:01:26   Because I drive a Volkswagen. That's how we talk. Anyways, so my AirPods Pro Mark II, I've had them

00:01:32   for a year or two. I forget exactly when I got them. I believe it was a Christmas gift a couple of years

00:01:36   ago now. I think it doesn't really matter, but I've had them long enough that it's not entirely

00:01:41   unreasonable to replace them. You know, it's been a couple of years or whatever.

00:01:43   The buds themselves still work great. The battery is still lasting sufficiently long for my needs.

00:01:51   I am going to be traveling this week, so maybe I'd change my tune if I have them in for like four or

00:01:56   five hours at a time, which is very unusual for me. But generally speaking, in day-to-day stuff,

00:02:00   the batteries are great. No worries. However, I'm having the first-worldiest of first-world problems.

00:02:05   The charging case doesn't seem to want to Qi charge anymore, which is okay, but it gets worse

00:02:13   because remember what I just told you? It is what kind of charging case, gentlemen? A lightning

00:02:17   charging case. Oh no, I'd replace them. Well, so here's the thing. I concur with your assessment.

00:02:24   However, we keep getting these rumblings that there's going to be a Mark III. I know not the right

00:02:29   term. Mark III coming soon. It's going to come any day now. It's happening. It's happening. And so I feel

00:02:34   like, is now really the right time to do this? I don't know what to do. So we'll see how grumpy

00:02:39   I get while I'm traveling over hooking up the lightning cable, which I can already tell you,

00:02:43   I'm getting pretty grumpy about it. And maybe I'll just like insta-buy some while I'm on my trip or

00:02:47   whatever. You're going to be more upset about missing out when the threes come out. Until then,

00:02:51   just plug them in. Plug them in for a few months. You'll be fine. People plugged in their

00:02:55   AirPods cases for years and we all survived. It was terrible, John. It was terrible, I tells you.

00:02:59   I still plug mine in. I mean, I do have the wireless charger next to my bedside,

00:03:03   every other place in the house. I plug them in and it's fine.

00:03:06   The funny thing is that it does occasionally work. So I don't know what the problem is. I did

00:03:12   try, I looked up like a KBase or whatever about it. And one of the things they said was like disown

00:03:18   it from the iOS side, you know, disconnect and forget it from the iOS side and then mash down

00:03:23   on the button on the back for something like 10 seconds or whatever. And that'll effectively

00:03:26   clear them out and set them up as new, which I did assuming it would not make a difference.

00:03:32   And it made no difference. So, uh, every once in a great while, like one out of every 10 times, it will charge via

00:03:39   Qi, but it generally doesn't. And I am unreasonably annoyed by it. And my brain knows that John,

00:03:47   you are right that I should wait. But my heart is listening to Marco and saying,

00:03:51   your heart's going to be sad when the threes come out anyway, unless you're planning, unless

00:03:54   you're signing up now to also buy the threes when they come out, you're going to feel so much worse

00:03:58   when the threes come out and you're like, Oh, I can't get them. Cause I just bought new twos.

00:04:01   You are right. You are right. You can just get the case replaced. Isn't that like 80 bucks or

00:04:05   something? That'll satisfy your need to buy something. I don't have a need to buy something.

00:04:09   It seems like you do. Uh, yeah, yeah. As, as a quick aside, um, I think we talked

00:04:15   about this on the show, that I've discovered that, um, that electric cars,

00:04:19   the market for electric cars here in America is kind of bottoming out right now. And, um, and I've

00:04:25   noticed that the really unaffordable Porsche Taycan or whatever you call it, the electric Porsche,

00:04:31   um, that is almost affordable when it's bought used. Here we go. This is how it begins.

00:04:37   Speaking of having a need to spend money. I don't need to spend money on a stupid car. My car has

00:04:43   under 30,000 miles on it and I've had it for eight years. It, it gets driven at least every other day,

00:04:49   but I go nowhere. So I have no reason to get rid of my car. I love my car. I don't want to lose the

00:04:56   third pedal. And yet I see that the electric life is kind of calling my name. And I can feel,

00:05:03   I feel like there's, I, I, I see this with other people. I can't remember. There was some podcaster

00:05:08   that was going through the same thing. And I was like, Oh, I know exactly what this feels like.

00:05:11   But, um, anyways, I can feel like my subconscious trying to like convince my conscious brain that,

00:05:18   Oh, it's time. It's not. And it's not, this is a terrible decision. I know intellectually,

00:05:23   it's a terrible time, but our country is crumbling and I'm like, Oh yes, I'm going to buy an electric

00:05:27   Porsche cause I'm a jerk. Uh, it's just such a terrible decision. And yet I can't help myself.

00:05:32   I tried to talk you out of this last episode. I forget if it made it to the air or not,

00:05:35   but I'll, I'll try again. The reason they're cheap is because the second revision, the Mark II

00:05:40   as you would call it, of that car came out and it's so much better. That's why you can find the

00:05:44   quote-unquote Mark ones for less money. Cause nobody wants them. Don't buy it. I know you're right.

00:05:48   You're right about both of these.

00:05:49   Buy the second revision, which will cost way more money.

00:05:52   No, I, Oh God, I cannot afford, no matter how many shirts we hawk, which we will be doing momentarily,

00:05:57   I cannot afford a new Tycon... Taycan. I'm so sorry. I really should figure out how to pronounce that.

00:06:02   That's, that is not the one you should get anyway. Like, you don't... What should I get?

00:06:06   You should get some kind of, like, more normal electric car, like a BMW i3 based on your

00:06:12   driving habits, maybe an i4. No, I don't know. I don't think I can do another BMW. Not only do I

00:06:18   don't, I don't think I can do another BMW and leaving me out of it. I'm pretty sure the historical

00:06:23   commission will 100% veto any future BMW. There's a prayer since it's electric and hypothetically

00:06:30   much less to go wrong. But, um, yeah, I don't, I think she would poop all over that idea.

00:06:35   You know, like, uh, I know you've soured on BMW for a bunch of good reasons, but, uh, Porsche

00:06:39   repair bills and maintenance costs are not good.

00:06:43   Don't ruin my moment. It's not... you're not saying, I want to stay away from those expensive cars

00:06:47   like BMW... hmm, Porsche. Uh, it's so true. So when you said, you know, I have a need to spend

00:06:56   money, I don't have a need to spend money on an AirPods case. I am, I can feel within me the need

00:07:01   to spend money on a stupid car growing and I need, I need John. I need you to convince me not to,

00:07:06   because you're the voice of reason. We all know Marco is going to say just do it, but you're the voice

00:07:10   of reason. You need to tell me not to, because I know you're right. Intellectually. I know you're

00:07:14   right. I already do. I already did. I told you like, this is not the time to buy that particular

00:07:17   car. You just need to wait and you need to save a lot more money based on your apparent

00:07:21   taste in cars. So true. We need to sell a lot more shirts. Make a separate fund. This is the Casey car

00:07:27   fund and put money into it for like five years. That won't be enough, but I hear you. Here's

00:07:32   the thing too. Like you already have like the big family car and what, what you've described

00:07:37   is that your, the role of your car is shorter range and not usually hauling everybody around.

00:07:45   Correct. So you shouldn't have a giant Porsche. Like, you know, ideally you would get like,

00:07:51   you know, something reasonable. Like, you know, I'm a big fan of the Ioniq 5, at least the

00:07:55   way it looks. I've never actually been in one. Um, there's all sorts of like reasonable,

00:07:59   like small-ish EVs that are not that much money. Um, and honestly, again, I, I, I can say from like

00:08:07   what I said last episode and before, like BMW's EV drivetrain is awesome and very mature. And so I,

00:08:15   I think you should maybe look at an i4 if you're itching to buy a car. Um, the problems that you had

00:08:20   with BMW were largely because you had a, uh, one of the higher end engines in the lineup at the time.

00:08:29   Um, and it was out of warranty. And so there are ways you can avoid this. You can get one

00:08:35   in warranty. You can get one that doesn't have an engine. Uh, you know, there's, there's ways to do

00:08:40   this that would be, you know, less of a risky experience and would, would limit your, uh, you know,

00:08:47   potential costs. Like there's ways you can do that. Uh, so I wouldn't, and certainly as John said,

00:08:52   like, I'm not going to like, I wouldn't recommend that you, you know, in fleeing BMW for reliability

00:08:57   concerns, go over to Porsche. Forget it. Oh, no, it's not a reliability thing. It's just the maintenance costs

00:09:01   and the parts. Parts and maintenance are just so expensive for those brands. Like you, you stare at

00:09:05   the dealership for two seconds, hundreds of dollars fly out of your wallet. Like everything is expensive.

00:09:09   I mean, the reason, no, no joke, no snark, no exaggeration. The reason I got rid of the BMW was

00:09:14   because it was going to the dealer every month, every other month, something like that. And every

00:09:19   time it entered the dealer's parking lot, it didn't leave without me being at least a thousand dollars

00:09:24   poorer. And that's not an exaggeration. It was literally a thousand dollars every single time, sometimes many

00:09:29   thousands of dollars. It was bananas. It's absolutely a terrible idea to get a Porsche leaving aside the

00:09:35   fact that I really can't afford it. I just, it's like just barely outside of what I consider to be

00:09:40   reasonable. And so I feel like I can afford it, even though I really can't. Um, but no, Marco, you're

00:09:45   right. You're both right. The Ioniq 5. I, I think I would maybe go a different route, like one of the, um,

00:09:51   other cars that, that shares the same platform, but we're saying the same basic thing. Uh, the, the Ioniq

00:09:57   5 is great for my needs. And this is what you were talking about a minute ago, Marco. For my needs, what I

00:10:02   really want, or excuse me, what I really need is a Chevy Bolt, which is what my parents have. And it's

00:10:07   not a particularly pretty car and it's not a particularly luxurious car, but it is the perfect

00:10:14   car for the sorts of driving I do. But because I'm a car guy, I don't really want one. So I'm just going

00:10:21   to continue to drive my, uh, dino-juice mobile until it falls apart. And when that happens, go test drive

00:10:26   an i4. No, you should, you should, at this point, you should keep driving your current car until it

00:10:30   gets old enough that you won't mind teaching your son to drive on it and then he'll crash it and then

00:10:35   you'll get a new car. Here's the thing. If you think that I need a new car, which I don't, and I

00:10:42   should get an expensive car, which I shouldn't, then one thing you can do to help that process is go to

00:10:46   atp.fm slash store. I did such a great job. I am an expert and a professional. So here's the

00:10:52   thing. The ATP store is back and it is back until Monday, April 28. And this is the second episode

00:10:59   we've talked about it on. So now I get to do my little speech. Here's the thing. Every single time we do a

00:11:04   time-limited sale like this, every single time, there's at least one, usually three to five people

00:11:08   that reach out and say, Oh no, Oh no, I'm that guy. Or I'm that person. Really? I'm that person.

00:11:17   I've done it. I'm the one that missed it. I thought I would remember and I didn't. So if you're driving

00:11:23   your Porsche Taycan or if you're driving your BMW i4, whatever you may be driving, then you should pull

00:11:29   over and go to atp.fm slash store. And John will walk through at least very quickly some of the wares

00:11:34   that we have on offer. And if you're walking, pull over to the side of the sidewalk or get out of the

00:11:39   way of the people that are still walking. Go to atp.fm slash store. John, what is up for grabs?

00:11:45   So this is a reminder of what we've got. The new shirts this time, we've got M3 Ultra shirts,

00:11:49   including our Joke M3 Ultra shirt. And that brings me to the most common question since we announced

00:11:54   the sale last episode. People are always asking about this, but then are asking even more now that

00:11:59   the store is up. They usually say something like, I just got a new Mac and it's got an M-whatever

00:12:04   processor in it. And when you were selling the shirts for the M-whatever processor, I didn't have this

00:12:09   Mac, but that Mac broke. And now I have this one and I want to get a shirt that matches my processor.

00:12:14   And I always tell those people, uh, buy the shirt for the chip that you want, not the chip that you

00:12:18   have, which is a, uh, homage to the, uh, uh, dress for the job you want, not the job you have, uh,

00:12:24   advice from back when people used to dress up for work, like in the eighties and stuff. Anyway,

00:12:28   uh, that's what that is. Uh, the idea is that, you know, if you, you know, don't, yeah, you can buy a

00:12:33   shirt for the chip that matches whatever computer you have, but let's say you don't have an M3 Ultra,

00:12:38   which you probably don't. You can still buy the shirt for it because it's Apple's most powerful,

00:12:42   uh, you know, chip that they make. Um, the, the reason I give this advice is because we don't sell

00:12:49   these M shirts forever. There are lots of chips and lots of variants. The usual way it works is

00:12:54   Apple announces the chip and puts it in a Mac. And then whatever the next sale we have, uh, is

00:13:00   whatever chips were announced in that period, we sell at that point. And we usually don't sell them

00:13:06   again. So if you've seen people like on YouTube or whatever, wearing like a beat up M1 Ultra shirt,

00:13:10   that's because we sold it when the M1 Ultra, like shortly after the M1 Ultra was released

00:13:14   or at least announced. And we haven't sold it since. So don't think you're going to be able to buy an M3

00:13:19   Ultra shirt three years from now, because you probably won't. Uh, I mean, I, so thus far we

00:13:23   haven't brought back, uh, especially like the, the fancy variants. We do have the on-demand store

00:13:28   in between sales. We sell like the M1, M2, M3, M4, but just the plain ones with no modifier.

00:13:32   And those are on-demand stuff. That's not for you. You're listening to the show. You should

00:13:35   buy it in the time-limited sale because the printing quality is much better on these ones

00:13:40   that, you know, we have to take the orders, then they print them, then they give them to you. And

00:13:43   that's it. The on-demand ones, they print as soon as you order them, but the printing quality is not

00:13:47   as good. So we sell them in between if you're desperate for a shirt in between, but really this is,

00:13:51   this is your time to shine if you're actually listening to the program. So if you think you might

00:13:55   ever want an M3 Ultra shirt, this is the time to buy it because the next sale we have,

00:14:00   the M3 Ultra shirt will not be on sale. And unfortunately for the people who are like,

00:14:03   well, what about the M4 Max? That's in the Mac Studio too. Yeah, but it was in a bunch of computers

00:14:06   before and we already sold it. So yeah, we have no M4 Max, uh, or M4 Pro or any of those shirts

00:14:13   available. It's just M3 Ultra and the Joke M3 Ultra. So that's our system. It's not ideal for some

00:14:17   people who like, I didn't know I was going to buy a new computer and now I want the M shirt to match it.

00:14:21   The only solution obviously is to buy every M shirt and then whatever computer you buy,

00:14:25   you'll always, unless you bought an M2 Ultra, sorry about that. Uh, we never made that shirt.

00:14:29   Anyway, uh, we've got a Mac Pro Believe shirt, which is very important for certain disciples of

00:14:34   the Mac Pro. We are trying to, as the kids would say, manifest, uh, a decent Mac Pro. If not,

00:14:39   we'll just get a crappy Mac Pro with an M3 Ultra in it and we'll cry together. Uh, the Pro Max Triumph

00:14:43   shirt is back. Uh, very popular, uh, as we've sold it many times in the past, then it went on a long

00:14:48   break and now it's back and it's still very popular. So check that one out. We've got our performance

00:14:52   shirt in a bunch of colors and our usual ATP stuff and the mugs, which it seems like we're

00:14:56   going to have for the rest of our lives because no one is buying them. Oh well. Oh, and also a hat.

00:15:01   Anyway, that's the, that's the M chip situation. That's the store. This is the second show we're

00:15:06   talking about this on. There'll be one more show, which I believe we are also recording on

00:15:10   a Monday, one more show where we announce this and that's it. So don't think, as Casey

00:15:15   said, that you're like, Oh, I'm sure they'll remind me again. Yeah. We will remind you one

00:15:18   more time. This is your second to last warning. If you want anything from the store, get it now.

00:15:23   And especially if you actually are going to WWDC, please buy a Mac Pro Believe shirt and wear it to

00:15:27   WWDC and find John Ternus and make him look at your shirt until he gets it. Yes. Yes, please. Uh,

00:15:34   also John, one clarifying question for you. Are there chip designs on the back of any of these that

00:15:39   are being sold right now? No chips on the back of anything. Someday I may be inspired to do another

00:15:43   chip, but the M3 Ultra wasn't it. Good deal. All right. Well, let's do some follow-up and let's

00:15:48   start with... can we, can we just skip the tariff stuff? Oh, how we wish we could skip it. We wish we

00:15:55   could all skip it. We wish we could skip this entire presidency, but unfortunately that's not the reality

00:15:59   we live in. Seems not. All right. So let's just dig in. Let me put on my big boy britches and let's just

00:16:04   talk about it. All right. So Trump has decided, and just bear with me here, Trump has decided to exclude

00:16:08   smartphones, computers, and chips from higher tariffs. Reading from the New York Times on the 12th of

00:16:13   April. As we record this, it is two days later on the 14th. Uh, anyways, on the 12th, the New York

00:16:18   Times wrote: after more than a week of ratcheting up tariffs on products imported from China, the Trump

00:16:22   administration issued a rule late Friday that spared smartphones, computers, semiconductors, and other

00:16:26   electronics from some of the fees, in a significant break for tech companies like Apple and Dell and

00:16:31   the prices of iPhones and other consumer electronics. A message posted late Friday by U.S. Customs and Border

00:16:36   Protection included a long list of products that would not face the tariffs President Trump imposed

00:16:40   in recent days on Chinese goods as part of a worsening trade war. The exclusions would also

00:16:46   apply to modems, routers, flash drives, and other technology goods, which are largely not made in the

00:16:49   United States. The exemptions are not a full reprieve. Other tariffs will still apply to electronics and

00:16:54   smartphones. The electronics exemptions apply to all countries, not just China. Still, any relief

00:17:00   for the electronics industry may be short-lived since the Trump administration is preparing

00:17:04   another national security-related trade investigation into semiconductors. That will

00:17:08   also apply to some downstream products like electronics, since many semiconductors come into

00:17:12   the United States inside other devices, a person familiar with the matter said. I didn't realize you

00:17:16   needed a person familiar with the matter to know that particular tidbit, but that's all right. These

00:17:20   investigations have previously resulted in additional tariffs. Again, this was this past Friday. We are

00:17:24   recording on a Monday evening.

00:17:25   Tim Cook's butt-kissing has finally paid off. He gets another exemption. Apple is saved.

00:17:33   Mm-hmm.

00:17:34   See, and that's the thing. Like, this is why, like, look, Tim Cook's obviously a smart guy. He knows how to

00:17:40   play the system. When you are the CEO of any major tech company today, and especially one as big and

00:17:47   important as Apple, the job of being a CEO at that level is somewhat technical, but it is largely

00:17:54   diplomatic. Tim Cook is basically serving as a diplomat between his massive company and countries

00:18:02   around the world, including our own country that he's in. This is the job. However, you know, obviously

00:18:09   we've had lots to say about how he's been doing that job and his other job over the course of the last

00:18:14   few years, but I think in this case, like, he is probably playing this game as best as he can,

00:18:21   given the circumstances that they're in, which are largely his fault. But people who try to play

00:18:28   with Trump always get burned. He probably knows that, but he also probably knows, like, at least he is,

00:18:35   like, able to talk to him sometimes. But Trump is not working on any kind of, like, you know,

00:18:41   long-term strategic level. He's impulsive, and he's a compulsive liar, and he doesn't really know what

00:18:47   he's doing. He basically governs by chaos. Tim Cook is doing the best he can to try to, like,

00:18:54   you know, have his seat at the table and get what he wants done. And by the way, this is not just about

00:18:58   tariffs. It's also about getting the government out of Apple's affairs and, you know, stop suing us,

00:19:02   stop regulating us, etc. Like, it's not all, you know, flowers over there. But this is what happens

00:19:09   with Trump. He turns against everyone. We are here yet again with more chaos. This is how Trump

00:19:16   governs. Inept chaos.

00:19:18   After this announcement, one of the celebratory posts I saw someone make, which was, again,

00:19:25   short-lived, as we'll get to in a moment, was that the two primary, the top two skills that

00:19:31   Tim Cook has are, number one, supply chain; number two, despot management.

00:19:36   Yeah, because he's been dealing with China for, you know, for just decades, right? And so dealing

00:19:41   with China is a very similar situation where there's someone who is, you know, who is going

00:19:47   to do things that don't make sense to you, but you have to kind of kiss up to them, but you don't

00:19:50   really agree with them. And he's been doing that for years and years, which should have been his clue

00:19:54   to maybe hedge his bets against China. But we are where we are. Anyway, so he made his play for it,

00:20:00   as he did in the last Trump administration. How'd that work out for him, Casey?

00:20:03   I mean, it depends

00:20:05   on who you ask, but I would say not great,

00:20:07   personally.

00:20:07   All right, so moving on with regard

00:20:11   to the last few days.

00:20:13   Smartphone tariffs are coming back in a month

00:20:15   or two, says Trump admin.

00:20:16   So this is yesterday, as we

00:20:19   recorded this, on Sunday the 13th.

00:20:20   Smartphone, I'm sorry, I'm reading from The Verge.

00:20:23   Smartphones, laptops, and other products exempt

00:20:25   from Trump's April 9th tariffs

00:20:26   will be lumped in with duties on semiconductors in a month or two,

00:20:30   Commerce Secretary Howard Lutnick told ABC News anchor Jonathan Karl on This Week.

00:20:34   This is not like a permanent sort of exemption, Lutnick told Karl, saying that they will be

00:20:39   subject later to a special focus type of tariff applied to the semiconductor industry,

00:20:43   similar to automotive tariffs Trump has already issued.

00:20:45   When asked if the new tariffs will include products like iPhones, many of which are built

00:20:50   in China, Lutnick said that's correct, and that the goal is to encourage them to

00:20:54   reshore, to be built in America.

00:20:55   It's not like you can open a factory tomorrow

00:20:58   to build iPhones, Karl said.

00:20:59   Yeah, you really can't.

00:21:02   The best and brightest

00:21:03   in this administration.

00:21:05   So this is the, oh, you've got an exemption.

00:21:06   But then it was, I think it was like, was it a day

00:21:09   later, less than 24 hours later,

00:21:10   the person who's supposedly in charge

00:21:13   of something, you know, the Commerce Secretary

00:21:14   comes in and says, oh, well, those are coming back

00:21:17   in a month or two, because we know

00:21:19   you can't just, you know, we want those to be built in America,

00:21:21   but we know it's not like you can build a, open a factory tomorrow to build iPhones.

00:21:23   You can't open a factory in a month or two either.

00:21:26   It's like, I don't know.

00:21:28   It's anytime they try to provide any kind of rationale,

00:21:31   like a rationale that they think will be acceptable to people,

00:21:34   other than like, we're enjoying insider trading,

00:21:36   or we want power over people,

00:21:38   so they will come and beg us for things.

00:21:40   Like whenever they try to explain it in a rational way,

00:21:43   it's like, we want to, we want to reshore manufacturing.

00:21:45   Like, if you let them talk, they either reveal that they have no idea how anything works,

00:21:49   like saying, oh, well, you know, we'll put these tariffs,

00:21:51   and then next month, iPhones will be built in the United States.

00:21:54   Or they can't bring themselves to say that,

00:21:56   because they realize how ridiculous it is.

00:21:57   Anyway, again, I feel like we need a date and timestamp

00:22:00   every time we say anything about this,

00:22:02   because as Marco said, it's chaos.

00:22:04   And just because they said this in the most recent story that's in our show notes,

00:22:07   does not mean that by the time you hear this episode,

00:22:10   there won't be some new news related to this.

00:22:12   It's ridiculous.

00:22:13   I don't know how anyone is expected to run a business or live a life

00:22:16   with this level of uncertainty and chaos on a day-to-day basis,

00:22:19   but this is where we're at.

00:22:20   Yeah, and like, you know, the thing is, like,

00:22:23   obviously, as an economic policy, like, you know,

00:22:26   it seems like, you know, Trump took one econ course ever,

00:22:30   maybe, let's be honest, he just, he probably saw it on TV.

00:22:33   But like, you know, he took like, you know, intro macroeconomic lessons,

00:22:37   and was like, oh, we can adjust prices with tariffs.

00:22:40   Okay, great, that sounds awesome.

00:22:41   Then we can screw people, and they can negotiate with us.

00:22:43   And okay, well, that's, you know, it, as with all macroeconomics,

00:22:48   like, it's pretty complicated, and there's a lot of other factors to consider,

00:22:51   and a lot of side effects that you might not be considering.

00:22:54   And if you're going to do tariffs well, the ways that tends to be done are, you know,

00:23:02   A, somewhat predictable, and B, usually a little bit gentle, or with subtle tweaks.

00:23:10   Like, most economic policy actually does best when it's, especially at the large U.S. government level,

00:23:17   slow, subtle, predictable tweaks.

00:23:20   That's not what this is.

00:23:22   So even if, even in, you know, in the Republican fantasy world,

00:23:27   where this actually makes any kind of sense whatsoever,

00:23:29   even if that was, like, what they were going for,

00:23:32   having it be both, you know, large, sudden tariffs that are seemingly nonsensical,

00:23:41   based on literally, like, impossible math that makes no sense,

00:23:47   and that partially was generated by ChatGPT for them,

00:23:50   and to have them be erratic, and to have them be changing, like, every day, or every few days,

00:23:55   like, oh, now this, oh, now this, oh, there's going to be this exception, now that exception.

00:23:58   Even if it was a good idea to put these tariffs in, which it's not,

00:24:02   but let's, even if it was, you undermine that by having them change constantly.

00:24:07   Because even, again, even if you, suppose you were going to build a factory in America because of this.

00:24:12   First of all, Trump's term is only four years long, and he'll be dead soon.

00:24:17   So, like, there's only, you only have a certain range that you even have to worry about this guy

00:24:22   even being able to spout stuff off in public.

00:24:23   And his range of, you know, his time range of power is probably even shorter than that.

00:24:29   What are you going to do?

00:24:30   Build, like, move all iPhone production to America?

00:24:34   Invest in all these factories only for him to say, like, tomorrow?

00:24:37   Oh, never mind. It's different now.

00:24:40   No one's going to invest in, you know, moving stuff over here

00:24:44   If the environment is so unpredictable and chaotic, no one's going to make long-term investments.

00:24:49   All they're going to do is try to wait Trump out.

00:24:52   Because they know that before too long, either he'll be out of office or he'll change his mind again.

00:24:58   So, this entire, the entire concept of this makes no sense, will not achieve what he wants,

00:25:06   and all it's doing is causing chaos in our country and the financial markets around the world, even.

00:25:11   Which may be what he wants.

00:25:13   I don't think he wants the chaos.

00:25:15   I think he doesn't know how to govern any other way.

00:25:18   No, I mean, anyway, we're going to get too far into it.

00:25:21   But, like, the stated goals of these things are so disconnected from reality,

00:25:25   up to and including the constant refrain that tariffs won't increase prices and that other countries pay for them.

00:25:30   Like, just flat out, unreal things that are not the case.

00:25:34   It's just, what he's going to say is usually not why he's doing the thing.

00:25:39   It's, as you said, like, the uncertainty is the worst part of this.

00:25:44   And I'll just add, as a reminder, that, like, the amount of years and money it would take to build iPhones in America

00:25:52   is double-digit years and a vast amount of money, and that's assuming anyone ever wanted to do it.

00:25:58   But certainly, they're not going to go down that path because we all just assume this is a temporary situation

00:26:05   that will be rectified by the next sane person to hold office.

00:26:08   But you never know what the future holds.

00:26:09   But anyway, this is not the type of environment that attracts investment.

00:26:13   So, there's that.

00:26:15   I think it's worth kind of drilling into something you just said a moment ago.

00:26:20   And I didn't really think about this at first.

00:26:21   I can't remember who I heard talking about this.

00:26:23   I apologize.

00:26:23   But you had said a moment ago, John, that nobody really wants to build the iPhones here anyway.

00:26:28   And I think that that's important to note that I think you're – and maybe I'm out of touch.

00:26:33   I don't know.

00:26:34   But I feel like the average American that I know doesn't particularly want to be screwing in the same screw for 8 to 13 hours a day, you know, 5 days a week.

00:26:43   I think that most Americans I know would find that sort of work or labor to be below them.

00:26:49   And so, even if you could stand up the physical factory, and even if you could do that in a timely manner, which I don't think you really could, where are you going to find the people?

00:26:59   And this is what you were saying, John.

00:27:00   Like, where are you going to find the people to do it?

00:27:01   It's not just the people who are screwing in the screws over and over again.

00:27:04   It's the skilled labor that does all the fancy machines that, you know, machine all the parts.

00:27:08   We don't have them either.

00:27:09   We don't have anything to do that.

00:27:11   And there was – I think you might have seen one of my reposts on Mastodon or something.

00:27:14   Someone did a survey of – a recent survey, as in like a couple days ago, of Americans saying,

00:27:20   do you think it would be better if we manufactured more things in the United States?

00:27:25   And 80% of the American public said yes.

00:27:28   And then the second question was, would taking a job, manufacturing things in a factory be an improvement in your job situation?

00:27:36   And only 20% said yes.

00:27:37   So 80% think we should do it, but 20% say, oh, well, I'll do it.

00:27:41   Everyone always assumes that someone else will be working in the factory.

00:27:43   Yeah, we should totally manufacture more stuff, but I don't want to do it.

00:27:47   Yep.

00:27:47   I mean, and the thing is like because they're thinking of the factory job like, oh, I'm on an assembly line doing a boring thing.

00:27:52   And that is part of the job.

00:27:53   But also, the other reason we can't do it is we don't have people with the skills and the smarts and the experience to do the complicated stuff.

00:28:00   All those CNC milling machines that carefully, you know, make all the little parts that go into the thing.

00:28:06   And then we don't have the supply chain of all the companies that make the screws, that go into the thing.

00:28:10   We're so far behind.

00:28:12   We just, we don't have the capacity to do that in this country at all.

00:28:15   Not just because people don't want the job screwing in the screws.

00:28:18   Every aspect of it.

00:28:19   The only thing we have is the land to build the factories on and probably the power to power them.

00:28:24   But everything else having to do with the factory, we can't do.

00:28:26   We are sponsored this episode by BetterHelp.

00:28:31   You know, therapy is something that I think we undervalue as a society.

00:28:36   And I've always thought that, like, you know, when something's wrong with, you know, your body, you go to some doctor about it.

00:28:43   You know, you go to, if you're, you know, something's wrong with your teeth, you go to a dentist.

00:28:45   There's no glory in saying, like, well, I've never been to the dentist.

00:28:49   Like, that's a ridiculous concept.

00:28:51   We should have that same standard for our minds.

00:28:54   They're so important.

00:28:56   They control so much about our lives.

00:28:57   It's always good to check in with a therapist.

00:28:59   I have personally used therapy and I find it really very helpful.

00:29:03   And I strongly encourage everybody out there.

00:29:05   Give it a shot.

00:29:06   Even if you don't think you, like, quote, need anything, you still go to your doctor once a year for, like, a checkup or a physical.

00:29:12   Give your mind the same care.

00:29:14   Now, traditional in-person therapy can cost hundreds of dollars per session, and this adds up fast.

00:29:19   With BetterHelp Online Therapy, you can save, on average, up to 50% per session.

00:29:24   You pay a flat fee for weekly sessions, and therapy should feel accessible this way.

00:29:29   It shouldn't just be a luxury for, like, the rich people.

00:29:31   With Online Therapy, you can get quality care at prices that make sense.

00:29:35   It can help you with anything from anxiety to everyday stress to more.

00:29:39   Your mental health is worth it, and with BetterHelp, it's within reach.

00:29:43   With over 30,000 therapists, BetterHelp is the world's largest online therapy platform, and it's convenient.

00:29:48   You can join a session with a click of a button so you can fit therapy into your busy life.

00:29:52   And if you're with a therapist that's not working out for you, you can switch therapists at any time, no problem.

00:29:58   Your well-being is worth it.

00:30:00   Visit BetterHelp.com slash A-TechPod today to get 10% off your first month.

00:30:07   That's BetterHelp, H-E-L-P, .com slash A-TechPod.

00:30:11   Thank you so much to BetterHelp for sponsoring our show.

00:30:15   All right, so we had an Ask ATP with regard to cross-platform photo management, and Kevin Militello writes,

00:30:24   My wife and I are a split household.

00:30:26   We use Synology Photos with smart albums.

00:30:28   It backs up when charging.

00:30:29   I see photos of her and my daughter without my wife tagging.

00:30:33   She sees photos of me and daughter that I take without me tagging.

00:30:36   The only manual thing I seem to have to do is tag people in video, which I rarely do.

00:30:40   Then BAPT writes, I would like to recommend Immich as a cross-platform photo syncing solution.

00:30:46   Let me jump in here.

00:30:48   I have been, as you guys know, getting deeper and deeper into the self-hosted rabbit hole,

00:30:53   which is, I think, mostly okay and healthy.

00:30:55   But knowing me, I'll take it too far.

00:30:57   Immich seems to be one of the darlings of the self-hosted community.

00:31:02   And I've not tried it myself, but I've heard many, many, many very, very good reviews of it.

00:31:07   And this is kind of the same thing until the end.

00:31:09   So let me go back to BAPT.

00:31:11   It's open source and self-hosted, and its mission statement is basically to offer a Google Photos alternative.

00:31:15   It's amazing.

00:31:16   It's not as simple to use as Google Photos because you have to host it yourself,

00:31:19   but you do get to keep ownership of your library and pretty much guarantee that it will keep running forever.

00:31:22   It's also ahead of Google Photos in terms of features in some places.

00:31:25   The one big caveat, which it warns you pretty explicitly about, is that it's still under heavy development,

00:31:33   and you must keep backups of your photo library.

00:31:36   But you should already be doing that anyway.

00:31:38   Okay, I hear what BAPT is saying.

00:31:40   I totally get it.

00:31:42   But if Immich is explicitly saying,

00:31:44   Hey, you really want to have another copy of all your photos because bad things may happen.

00:31:49   I'm out.

00:31:50   No, thank you.

00:31:51   Yeah, the self-hosted ones, I didn't recommend any of those just because it seemed like the person was saying,

00:31:55   like, if there's anything manual I have to do, I don't want to do it.

00:31:56   But they also said they were willing to spend money.

00:31:58   So maybe a Synology-type solution would work there.

00:32:01   But like all the sort of like host your own cloud stuff in your own house.

00:32:05   Like that just seems like a lot.

00:32:08   For someone who wants kind of a hands-off solution, or, like, I don't want to have to be bothered with it,

00:32:12   self-hosting is not that.

00:32:14   Like that's why you have other people host things for you.

00:32:16   That's why the cloud is so attractive.

00:32:17   Let someone else run the servers.

00:32:19   But I thought it was worth mentioning there are solutions.

00:32:22   Synology is one and this Immich thing is another one.

00:32:24   If you want to run the cloud thing inside your house and take on that responsibility,

00:32:28   you may be more likely to find cross-platform solutions.

00:32:32   Yeah, and just to be clear, it's pronounced sort of kind of image as far as I'm aware,

00:32:37   but it's actually spelled I-M-M-I-C-H.

00:32:40   And obviously there will be a link in the show notes.

00:32:42   Immich, which is I think a, not a portmanteau, but a play on image.

00:32:47   I thought it was like a German word or something.

00:32:48   Por que no los dos.

00:32:50   I mean, maybe both.

00:32:51   Who knows?

00:32:51   All right.

00:32:52   John, you have some Nintendo Switch 2 follow-up for us.

00:32:55   Let me start reading; interrupt when you're ready.

00:32:57   Kyle McMahon writes,

00:32:58   Yeah, I misspoke in the last episode. I said 4K 120 because I was so excited that it can output 4K to the TV and can do 120 frames per second,

00:33:17   but not at the same time, which makes sense because that is a current generation console capability and Nintendo just doesn't do the current generation console thing anymore.

00:33:25   But anyway, we'll put a link in the show notes to the spec page, which tells you Nintendo's spec page for the Switch 2.

00:33:30   It says that it does a maximum of 4K at 60 frames per second in, quote, TV mode, like when it's docked and connected to your TV.

00:33:36   And it says it supports 120 frames per second in 1080p, but also in 2560 by 1440.

00:33:43   So it can do 120 frames per second in theory at higher than 1080p.

00:33:47   We'll see if any games actually use that.

00:33:49   Then with regard to the kickstand, like elbows or lumps or whatever they are, Stuart Gibson writes,

00:33:53   Regarding the Switch 2 stand nubbins, those exist on the improved stand on the current Switch OLED, too.

00:33:59   They do act as a touch point when fully opened, but also have a multi-point hinge, which allows it to sit fully flush to the back of the console.

00:34:05   See, Marco should have told me this because I've never actually even seen a Switch OLED.

00:34:09   But yeah, the Switch OLED has the exact same things.

00:34:11   The two-point hinge is to, you know, again, help the, if you know how a two-point hinge works, to help it fold out more.

00:34:16   Like, I guess people might be familiar with them from kitchen cabinets, where you open the cabinet and it has a two-point hinge inside it to let the doors open up more.

00:34:22   Anyway, there you go.

00:34:23   That's the nubbins explained.

00:34:25   And then apparently there existed open-world racing games before Mario Kart World. I've heard of all of these except one,

00:34:34   but I don't think I knew that these were open-world racing games.

00:34:36   So we've gotten word from many listeners, Burnout, Need for Speed, The Crew, which I'd never heard of, and Forza Horizon are all open-world racing games.

00:34:45   Italians want you to say Forza.

00:34:46   Yeah.

00:34:47   I think the thing that's notable is that, like, I guess maybe I never thought of Mario Kart as, because I've heard of all these games.

00:34:53   I've never thought of Mario Kart as a type of game because it's, like, what is the world?

00:34:57   Like, in these things, they're generally set on Earth with cities and highways between them and stuff.

00:35:02   And so it's like, yeah, there's a world.

00:35:03   But Mario Kart is like, you're in space on an ice cream cone.

00:35:06   Like, it doesn't lend itself to, like, their, you know, what would the world be doing?

00:35:10   But I guess the answer is it's the Mushroom Kingdom.

00:35:12   Anyway, we'll find out.

00:35:13   Indeed.

00:35:14   And then one of the things that broke a lot of people's brains, including very much my own, was that Mario Kart World has been stated as being $80 MSRP.

00:35:22   And we all thought that was kind of bananas because $80 is a lot of freaking money.

00:35:27   But as it turns out, maybe not?

00:35:29   Yeah, I referenced this graph in the last thing.

00:35:32   I just didn't have the link for it, so I just wanted to put the link in the show notes.

00:35:34   This is the graph I was talking about that shows the inflation-adjusted price for video games.

00:35:38   And you can see $80 is not out of line with what they have historically been.

00:35:43   I remember buying Ocarina of Time for $70 when it was released.

00:35:48   So whatever year that was, what was Ocarina of Time?

00:35:50   I think it was high school for me, so late 90s.

00:35:53   Yeah, anyway, $70 in the late 90s.

00:35:56   I can tell you that $70 in the late 90s does not equal $80 today.

00:35:59   So we're still doing okay, but yeah, some people are angry about it.

00:36:02   But you know, if you look at this graph, I feel like it seems reasonable.

00:36:06   Now, some people complain.

00:36:06   We got one email that said, yeah, inflation-adjusted, it's fine.

00:36:09   But they're trying to trick you because you always have to buy a DLC, and you have to subscribe, and you have to buy the battle pass, and you have to do this.

00:36:15   And it's like, live service games are a totally different ball of wax.

00:36:18   Like, World of Warcraft was charging people subscriptions ages ago, but, you know, and maybe they'll sell DLC tracks for Mario Kart or whatever.

00:36:25   But the bottom line is, especially for sort of like, you know, when you pay $80 for Mario Kart World, you get what you get.

00:36:31   And if you want to buy a DLC later, fine, but you don't have to.

00:36:35   It's a complete game in and of itself.

00:36:36   Same thing with, like, Breath of the Wild.

00:36:37   You buy it, and you get the whole game.

00:36:39   So if they sell the next Zelda thing for $80 or whatever, again, I bought Ocarina of Time for $70 in a gold cart for the Nintendo 64.

00:36:48   I think I had that, too.

00:36:49   Yeah, I will gladly pay similar prices.

00:36:51   But, you know, it's difficult because, quote-unquote, AAA games are expensive.

00:36:58   They always have been.

00:36:58   When you look at this graph, look back at the early days of home consoles with, like, the Atari 2600 and stuff, and look at those prices.

00:37:05   It's inflation-adjusted $140 for, like, I don't know, Tapper or something.

00:37:10   I don't know what game that's supposed to be represented there, but it could be worse is all I'm saying.
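A quick back-of-the-envelope version of that math, as a sketch: Ocarina of Time shipped in 1998, and the CPI index values below are approximate annual averages, so treat the output as a rough estimate rather than the chart's exact figures.

```python
# Rough inflation adjustment using approximate CPI-U annual averages.
# These index values are ballpark assumptions: 1998 ~ 163, 2025 ~ 320.
CPI = {1998: 163.0, 2025: 320.0}

def in_todays_dollars(price: float, year: int, today: int = 2025) -> float:
    """Scale a historical price by the ratio of CPI levels."""
    return price * CPI[today] / CPI[year]

# $70 for Ocarina of Time in 1998 comes out to roughly $137 in 2025
# dollars, so an $80 MSRP today is still cheaper in real terms.
print(f"${in_todays_dollars(70, 1998):.0f}")  # -> $137
```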

00:37:15   Indeed.

00:37:16   Speaking of it could be worse, hey-o, Dropbox and file providers.

00:37:19   Eric DeReuter writes,

00:37:21   The Dropbox file provider API update is not tied to the app version and is not an account-level change either.

00:37:26   So you can have one Mac with a file provider API version and another Mac without it.

00:37:30   It does require a minimum macOS version of somewhere between 12.3 and 12.6, as well as the Dropbox app from late 2022-ish or newer.

00:37:37   What triggers your eligibility beyond the app and OS versions is not clear.

00:37:40   Dropbox support was not helpful in clearing up how you become eligible for the update.

00:37:43   Yeah, this next item will tell you how I ran across this, but I have confirmed that if you are eligible, not only is it not a separate download, but you can change it.

00:37:54   So I recently installed Dropbox on a fresh account, and it did the file provider one sort of by default.

00:38:00   But if you Google for it, which I had to because it's not clear, there is a way to say, I don't want the file provider thing anymore.

00:38:08   And it doesn't involve reinstalling the app.

00:38:10   It just involves signing out of the app and signing back into the app.

00:38:12   And then during the onboarding, clicking some advanced link and then going to this thing and opting out.

00:38:16   And, you know, anyway, it is silly.

00:38:19   The process is silly, but it is nice that it's not tied to your account, the, you know, switching back and forth.

00:38:25   And you don't need to re-download a second version of the app.

00:38:28   And I've never heard anyone tell me that the file provider one is better in any way.

00:38:33   So as long as they keep giving you this option, that's the one I'm choosing.

00:38:38   So, yeah, I took the account that I had installed on and I signed out and signed back in and opted out of file provider.

00:38:43   And it just switched to the other version, and I'm much happier.

00:38:45   Good deal.

00:38:47   And then several people wrote in: you had bought your daughter a laptop, because of tariffs, for school in the fall.

00:38:53   And lots of people wanted to know, what did John do?

00:38:56   Yeah, that actually, that's what I was getting at earlier.

00:38:58   I got that laptop today, and I've been setting it up and setting up Dropbox for people and putting accounts on it and stuff.

00:39:04   I'm actually setting it up now to essentially be the photo laptop for my Long Island vacation because I always take whatever our best laptop is to be the photo laptop for a Long Island vacation.

00:39:12   And that is now the M4 MacBook Air that I got my daughter, but I'm also putting her account on and all that stuff.

00:39:17   Anyway, the specs are boring.

00:39:20   Like, there are really no bad specs for this machine unless you have more data than will fit on the SSD.

00:39:25   What did I do?

00:39:27   Probably not what you should do.

00:39:29   I maxed out the RAM because, I don't know, I just did.

00:39:33   It wasn't that expensive.

00:39:34   You don't need that much RAM.

00:39:36   She's going to be writing papers and browsing the web.

00:39:38   She doesn't need 32 gigs of RAM, but it comes with 16 stock, which is plenty.

00:39:42   You can get 24, but her current one has 24 because that used to be the max.

00:39:46   You kind of see how this goes here.

00:39:48   I get max RAM a lot on machines where it's reasonable.

00:39:50   So I picked 32 gigs of RAM.

00:39:53   I don't know why.

00:39:54   It just makes me feel better.

00:39:55   And then for the SSD, she doesn't have much data at all.

00:39:59   Most of her stuff is cloud synced, but for my Long Island vacation, I like to have enough

00:40:04   space to put a substantial portion of my photo library on the laptop, just so sometimes you

00:40:09   like slideshows of like previous vacations or like, you know, I don't know.

00:40:12   I like having access to, at the very least, the favorites from my big photo collection if

00:40:16   we ever do like photo slideshows or whatever.

00:40:19   So I got a terabyte SSD.

00:40:21   I would have gotten bigger if the prices didn't go up a ridiculous amount after that.

00:40:28   But anyway, there it is.

00:40:30   M4 MacBook Air with all the parts of the chip working, again, because I'm me.

00:40:33   This is, I do not recommend this.

00:40:35   You can just buy the stock one with 16 gigs of RAM and whatever size

00:40:39   SSD you can stomach, because their prices are insane, and you're fine.

00:40:44   But people wanted to know what I got.

00:40:45   So there you go.

00:40:46   32 gigs, one terabyte.

00:40:47   I love that you are so petulantly against having a laptop that you now enlist the rest of the

00:40:57   family to meet your laptop-related needs.

00:40:59   There's so many laptops in this house.

00:41:00   I just, I don't need to have one for myself.

00:41:02   I just need one for one week a year and there's plenty to choose from.

00:41:05   And I, you know, I bring the best one.

00:41:06   When I had a work laptop, I used to bring that as my vacation laptop because it was the best

00:41:10   one in the house.

00:41:11   All right.

00:41:12   Peter Wagoner writes with regard to folding phone aspect ratios.

00:41:17   If you take an iPhone mini and make it a trifold, like the Huawei Mate XT, you get nearly the

00:41:22   same dimensions as an iPad mini.

00:41:24   So this is MKBHD.

00:41:26   There's a video that MKBHD did on the Huawei Mate XT, which I watched, which was

00:41:32   very interesting.

00:41:33   Like, this is not for me, I don't think, but it was a much more compelling video than

00:41:40   I expected it to be.

00:41:41   I left that video thinking, oh, maybe that's not that bad after all.

00:41:44   So it's worth a few minutes of your time if you have some time to kill.

00:41:46   Yeah, that was like the problem we were talking about in his last video of like, if you just

00:41:50   have a, uh, a phone that opens and closes like a book, a foldable phone that opens and closes

00:41:54   like a book, when it opens it's not well proportioned for 16 by nine video, which is what most TV and movies

00:42:01   are around, that aspect ratio.

00:42:02   And so if you compare it to an iPad mini, maybe it's the same in area, but because the aspect

00:42:08   ratio is so much more square, if you try to watch a video on it, and you're like,

00:42:11   my unfold, I unfolded my folding phone.

00:42:13   I have such a big area to watch a video.

00:42:14   It's not actually that big.

00:42:16   Like it's compared to an iPad mini.
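To put rough numbers on that point, here's a sketch; the dimensions are hypothetical stand-ins (a roughly square 7.6-inch unfolded book-style foldable versus an 8.3-inch tablet at about a 3:2 screen), not measured specs for any real device.

```python
import math

def video_area(diag: float, screen_ar: float, video_ar: float = 16 / 9) -> float:
    """Area (sq. in.) of the largest video_ar rectangle that fits on a
    screen with the given diagonal (inches) and width:height ratio."""
    height = diag / math.sqrt(1 + screen_ar**2)
    width = screen_ar * height
    # Letterbox: on a squarer-than-16:9 screen the video is width-limited;
    # min() also handles wider-than-16:9 screens (height-limited).
    video_w = min(width, height * video_ar)
    return video_w * (video_w / video_ar)

# Hypothetical numbers for illustration only:
print(video_area(7.6, 1.08))  # near-square unfolded foldable: ~17.5 sq in
print(video_area(8.3, 1.52))  # 3:2-ish iPad-mini-class tablet: ~27.1 sq in
```

Under these made-up dimensions the two screens have similar total area, but the squarer one yields only about two-thirds as much 16:9 video area.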

00:42:18   So you can solve that by having a trifold because now you have, you know, one extra vertical

00:42:24   strip instead of just opening like a book, take another one of those quote unquote pages

00:42:27   and put it alongside.

00:42:28   The problem is now you have a trifold phone.

00:42:32   It's incredibly thick.

00:42:33   Like you can look at the video to see when you fold all three pieces up, they did a good

00:42:39   job of making the individual pieces thin, but boy, that's big.

00:42:43   So it's quite a trade-off to get a screen that is larger than a phone and also well proportioned

00:42:50   to watch video.

00:42:51   And if you're not watching video on it, I guess, you know, you can use that space for all of

00:42:56   the great Android apps to take advantage of tablet proportions.

00:42:58   Sick burn.

00:43:01   And then finally, Jonathan Clayton writes with regard to Mac mini SSDs, some company from

00:43:07   Malaysia of unknown reputation called Iboff, I-B-O-F-F, claims to have reverse engineered

00:43:12   the SSD modules found in the Mac mini.

00:43:14   They are taking pre-orders for 250 gigs, 500 gigs, one terabyte, and two terabyte modules

00:43:18   for way cheaper than official upgrades.

00:43:21   According to them, they've made a, quote, "legally distinct," unquote, board layout that will be recognized

00:43:26   by the Mac.

00:43:27   They claim to have shipped a batch of orders to people, with a new batch coming off

00:43:31   the line at the end of April.

00:43:33   I watched this YouTube video that they themselves have put up.

00:43:36   So I mean, you know, take this with a serious amount of salt, but I see what they're going

00:43:40   for here.

00:43:41   And it is interesting and reasonably compelling.

00:43:45   Yeah, it's super compelling to me money-wise because I'm thinking of my future computer

00:43:49   and it's like, oh, especially since I'm pressing up against the four-terabyte drive

00:43:54   that I have.

00:43:55   So the next jump is probably to eight.

00:43:56   And you know what happens if you try to buy eight terabytes of SSD from Apple.

00:44:00   It's just, it breaks the bank.

00:44:01   Yep, been there.

00:44:02   And so, I mean, granted, these modules are for Mac minis; maybe they'll do

00:44:07   them for the Studio.

00:44:08   Who knows, and who knows how well they work or whatever, but it seems like

00:44:12   the technology involved should be within the reach of companies like this.

00:44:19   You know, like kind of wish OWC would do them, but they would probably get sued by Apple or

00:44:23   whatever.

00:44:23   Like, it's kind of fly-by-nighty, because we're like, where do they

00:44:26   get the NAND modules?

00:44:27   We talked on a past episode about how those NAND modules are not readily available.

00:44:31   And if you do get them, it's possibly some kind of gray market leftover thing.

00:44:35   Like who knows, who knows where this stuff is coming from.

00:44:36   But technology wise, it is indeed a printed circuit board with components on it.

00:44:40   That is a known technology.

00:44:42   And if you could get essentially the same components as Apple is using and put them on a board that

00:44:47   connects them in the same way, but doesn't have exactly the same traces,

00:44:51   then Apple can't sue you for taking their proprietary layout, you know what I mean?

00:44:53   Like they're trying to sort of clean-room reverse engineer this, because they can

00:44:57   take one out of a Mac mini and they can look at it.

00:44:59   It's like, I see all these components.

00:45:00   I can buy these off the shelf from whoever Apple is buying them from.

00:45:03   The only thing I can't copy is the layout of the traces on their board.

00:45:07   Cause that's work that somebody did on contract for Apple.

00:45:10   So do your own work to do that.

00:45:12   And that's what these people are saying they're doing.

00:45:13   But if you go to the website, they're all sold out.

00:45:16   Who knows if this company will be here in a week.

00:45:19   Who knows?

00:45:19   You know, do you really want to spend, even though it's cheaper than Apple, it may be a

00:45:22   lot of money.

00:45:23   They don't even offer an eight-terabyte.

00:45:26   They only offer up to two terabytes, but I'm keeping my eye on this because I tell you, whatever

00:45:30   next Mac I get, if I can get third party storage for sane prices, I may actually risk it, especially

00:45:39   if Apple offers them at insane prices.

00:45:41   What I would do is go back to the good old days of buying Macs,

00:45:45   when we all said, get the lowest amount of storage you can get and the lowest amount of RAM

00:45:49   you can get, and then just swap it yourself when you get the computer.

00:45:52   Can't do that anymore, but I would love to be back in those days because I could get a

00:45:56   quote unquote cheap Mac with the lowest amount of storage.

00:46:00   And then if I buy one of the third party ones and it's a disaster, oh, well, I'm out that

00:46:05   money.

00:46:05   And then I could just buy the first party one and put it in there.

00:46:07   That's only possible, I believe, on the Mac Pro right now.

00:46:11   Maybe with the, uh, parts program on the Mac Studio as well, and the

00:46:16   Mac mini, but I'm thinking about it.

00:46:18   I'm looking at it.

00:46:19   I can't vouch for this Iboff thing.

00:46:21   We'll put the links in the show notes.

00:46:22   You can look at it and see what you think anyway.

00:46:23   And like I said, they're all sold out.

00:46:24   It's kind of like reading reviews of these hypercars where at the end they always say,

00:46:28   and by the way, they're all already sold out.

00:46:30   Right.

00:46:31   Yeah.

00:46:31   Now, like I said, the video, I think it was like five or 10 minutes, and, you know, it's

00:46:35   not required viewing by any stretch of the imagination, but they made a pretty compelling

00:46:39   argument and case for themselves.

00:46:43   And again, I mean, obviously the company who put up the video about themselves is going to

00:46:47   say that they're amazing.

00:46:47   I mean, I get that this is, you know, self-serving or whatever, but I thought it was an interesting

00:46:51   video, and they went through kind of the behind-the-scenes as to some

00:46:55   of the choices they made, why they made them, and how actually they could have done better.

00:46:58   But, you know, then it would have been a lot more money, and blah, blah, blah.

00:47:01   Uh, again, if you have some time to kill, I definitely recommend spending a few minutes.

00:47:04   So like I said, like the thing that gives me encouragement about this is it's, it's just

00:47:08   a printed circuit board with surface mount components on it.

00:47:10   Like there's no magic.

00:47:11   There's no, like, thing that can only be fabbed by TSMC because it's some super-advanced part; it is

00:47:16   just NAND and a bunch of components and a board that goes with it.

00:47:20   So it seems it's so tantalizingly close.

00:47:22   It's not a standard.

00:47:23   It's, like, proprietary.

00:47:24   We get it.

00:47:24   Like there are lots of barriers to making this, but I feel like it is, it's plausible that you

00:47:29   would get one of these and not have it be a disaster.

00:47:31   And they're so much cheaper.

00:47:32   That's what's tempting.

00:47:33   We were sponsored this episode by Mack Weldon, my favorite brand of clothing and my most worn

00:47:41   brand of clothing.

00:47:42   This is why I wanted to have them as a sponsor.

00:47:44   I wanted to be able to talk honestly and say, I wear Mack Weldon every single day.

00:47:47   All of my underwear is Mack Weldon.

00:47:49   Most of my t-shirts, all of my workout gear, many of my other things, you know, pants, like

00:47:55   long sleeves, sweaters.

00:47:57   They have such a broad product line now and it's all really good stuff.

00:48:01   So my favorite is the silver fabric and they make all sorts of pieces with silver fabric.

00:48:05   What's great about it is that it's, you know, it's a nice cotton blend.

00:48:08   It has a little bit of silver fibers in it, and this makes it antimicrobial, which means it's

00:48:12   really hard to make it ever stink.

00:48:14   So this is very valuable to me in things like t-shirts.

00:48:18   So all summer long, I'm wearing those, but honestly, I'm wearing them in the winter too.

00:48:21   When I'm wearing some kind of like, you know, fancy button down, which they also sell,

00:48:24   I'm always wearing a Mack Weldon undershirt under it.

00:48:27   So there's so many great options at Mack Weldon.

00:48:29   They now have a tech linen fabric and they have a few new tech linen items they've launched.

00:48:34   They're really, really cool, but there's something for everyone here.

00:48:37   And the thing with Mack Weldon is like their stuff lasts.

00:48:40   Like this will not be the cheapest shirt you ever see.

00:48:42   There's a reason for that.

00:48:43   I don't think I've ever worn out anything from Mack Weldon and I wear them every single

00:48:48   day.

00:48:48   It's really good stuff.

00:48:50   So you got to see for yourself, go to MackWeldon.com and you can get 25% off your first order

00:48:56   of $125 or more with promo code ATP.

00:48:59   What I strongly suggest is to go get yourself some really nice t-shirts.

00:49:04   I wear, I personally, I wear the silver V-necks.

00:49:06   They have all sorts of other options as well.

00:49:08   They last, they fit well, they look good.

00:49:11   And look, every single day you're wearing a t-shirt or underwear or all of these things,

00:49:15   socks, you know; you're wearing this stuff all the time.

00:49:17   Make your basics look good and fit well and work great.

00:49:21   And that's what Mack Weldon does for you.

00:49:22   Once again, MackWeldon.com.

00:49:24   Get 25% off your first order of $125 or more with promo code ATP.

00:49:28   That's M-A-C-K-W-E-L-D-O-N.com, promo code ATP.

00:49:33   Thanks to Mack Weldon for sponsoring our show.

00:49:35   All right, so I think we're only going to have time for one topic today because it's going

00:49:42   to be a doozy, and that is: there was a report in The Information a few days ago, as we record,

00:49:48   about how Apple fumbled Siri's AI makeover.

00:49:51   There's a lot in this article, and you can see a summary of

00:49:59   it on MacRumors.

00:50:00   I think, you know, John has done the yeoman's work of splitting this into different topics.

00:50:07   All right, so John entitled this particular section, The Blame Game, which I think is

00:50:11   perfectly apt.

00:50:12   And it says in the information article, some of Apple's struggles in AI have stemmed from

00:50:17   deeply ingrained company values.

00:50:19   For example, its militant stance on user privacy, which has made it difficult for the company

00:50:22   to gain access to large quantities of data for training models.

00:50:25   But an equally important factor was the conflicting personalities within Apple.

00:50:28   More than half a dozen former Apple employees who worked in the AI and machine learning group

00:50:33   led by JG, a group known as AIML for short, told The Information that poor leadership

00:50:39   is to blame for its problems with execution.

00:50:41   They singled out JG's lieutenant, Robbie Walker, who oversaw day-to-day operations for Siri as

00:50:46   lacking both ambition and an appetite for taking risks on designing future versions of the voice

00:50:51   assistant.

00:50:51   Among engineers inside Apple, the AI group's relaxed culture and struggles with execution have

00:50:57   even earned it an uncharitable nickname, a play on its initials, AIMLESS, A-I-M-L-E-S-S.

00:51:04   Which is almost A-I-M-L-I-S-S, which would have been very funny.

00:51:07   That's not a good sign, by the way.

00:51:09   Like, just when the group that's responsible for Siri is, like, other people in the company

00:51:14   are, like, snickering at it behind its back.

00:51:16   That's usually not a good sign.

00:51:18   Honestly, I kind of took that as promising, that, like, you know, we've always felt from

00:51:24   the outside, like, Siri is so bad.

00:51:27   Does Apple not realize how bad it is?

00:51:30   So it is kind of comforting to know, like, no, lots of the company did realize how bad it

00:51:34   was.

00:51:35   Like, it isn't just us.

00:51:36   Well, it's like corporate infighting, though.

00:51:38   It's like, the group that's responsible for it, like, those people stink.

00:51:41   They're not...

00:51:42   Well, anyway, you can continue.

00:51:43   But, like, remember the org chart diagram, from maybe the early 2000s, that showed, like, how each

00:51:51   company is organized?

00:51:52   Do you remember this image?

00:51:54   I was going to say it's a GIF.

00:51:54   It wasn't animated, but it might have literally been a GIF as in a single image.

00:51:59   The only two I remember: the Apple one had Steve Jobs in the middle.

00:52:04   He was, like, the hub of a wheel with lines radiating out from him, which is, like, yeah,

00:52:09   no, that seems right.

00:52:11   You know, it's like a star, starburst coming out from Steve Jobs.

00:52:14   And the Microsoft one was a bunch of different divisions all holding guns on each other.

00:52:18   Oh, yes.

00:52:19   I do remember this now.

00:52:20   Yeah, yeah, yeah, yeah.

00:52:21   It's like some amount of rivalry within your company is healthy, like everyone wants to do

00:52:25   better than the other groups.

00:52:26   But it shouldn't be to the level where maybe they're undermining each other, or maybe there's

00:52:33   a group that everyone knows is low-performing and the other groups get angry about it.

00:52:37   And that seems like what's happening here.

00:52:39   You can continue.

00:52:40   The dim internal view of the group in many parts of Apple is a stark contrast to that

00:52:44   of Apple's software engineering team, which Federighi has overseen since 2012.

00:52:48   It has built up a reputation for efficiency and execution with its work on Apple's operating

00:52:53   systems and messaging, photo, mail, and other apps.

00:52:55   Former Apple employees have described Siri as a hot potato, continuously passed between different

00:52:59   teams, including those led by Apple's services chief Eddy Cue and by Federighi.

00:53:03   However, none of those reorganizations led to significant improvements in Siri's performance.

00:53:07   So this is another thread of this.

00:53:09   Obviously, when you have these stories, The Information sources this to a bunch of people

00:53:13   who used to work at Apple but don't anymore.

00:53:15   So anytime you're talking to ex-employees, you have to wonder, do they have an axe to grind?

00:53:19   What is their perspective?

00:53:20   Were they coming from the AIML group?

00:53:22   Were they coming from the groups that were mad at AIML?

00:53:25   What are their biases?

00:53:26   But it is interesting to see that Federighi's group actually within the company seems to

00:53:31   have a reputation for getting things done.

00:53:33   And I can see that.

00:53:34   We have complaints about what they're doing with software and the quality and all sorts of

00:53:38   other things.

00:53:38   But the bottom line is they do ship stuff.

00:53:40   And the complaint about AIML is that they're aimless and they're not

00:53:44   accomplishing anything.

00:53:46   Federighi, it seems, when given a task, do this thing every year, have a new revision of these

00:53:52   operating systems with these features and so on and so forth, seems to actually be

00:53:57   executing on that and producing results.

00:53:59   What is the quality of the results?

00:54:02   We get this steady drumbeat of email.

00:54:04   I don't know if it's the same three people who have just a grudge against Federighi, but

00:54:07   they're always like, why doesn't anyone ever blame Federighi?

00:54:10   Is it because he's in the videos and he's got nice hair and everyone thinks he's funny?

00:54:13   It's like, oh, it's like our fun uncle.

00:54:15   We never say anything mean about him.

00:54:17   But he's in charge of all software.

00:54:18   So anytime you have a software complaint, shouldn't you be blaming Federighi?

00:54:22   Maybe, yeah, we tend to focus our blame more on like, well, what about the people who

00:54:26   do the UI stuff?

00:54:27   Like, that's not Federighi's specific job.

00:54:28   He may oversee that, but there is a dedicated group for interface design and I save most of

00:54:34   my ire for them when they do something that I don't like in the interface or whatever.

00:54:37   But in the end, yes, he is responsible for all software, but not all all software.

00:54:41   For example, until recently, he wasn't in charge of Siri, although he had been at some point

00:54:46   in the past, and so had Eddy Cue.

00:54:47   So it's not always clear.

00:54:49   Like, if you think, well, he's in charge of all software and you don't like Siri, therefore

00:54:51   it's his fault.

00:54:52   But you might not have known that there was this AIML group; same thing with Vision Pro, which was

00:54:57   off in its own little corner before it sort of got reintegrated.

00:54:59   So the internal structure of how Apple arranged things is not always clear from the outside.

00:55:04   And I mean, this is just another data point.

00:55:08   Again, maybe these are people from his groups or of course they're going to say, oh, Craig's

00:55:11   great.

00:55:11   We are in our group.

00:55:12   We always did everything perfect.

00:55:13   It's those AIML people that are terrible.

00:55:15   Like, you got to take all this with a grain of salt.

00:55:17   We don't know what their biases are.

00:55:18   But I have personally not heard anything about Federighi that didn't align with this.

00:55:27   That essentially he does get things done according to what he thinks he should do.

00:55:31   And you could disagree with his tastes and decision making, but he is an effective operator within

00:55:37   the organization of Apple.

00:55:38   Like he is not, he does not struggle to accomplish the goals that he sets out for himself or the

00:55:43   people set before him within Apple.

00:55:45   Again, you could say maybe the goals that he's setting or that are being set for him are the

00:55:50   wrong goals, but he is actually executing them, which is part of the job of someone at his

00:55:54   level.

00:55:55   So that's why I personally usually don't come for Federighi because it's not because of his

00:56:00   hair.

00:56:00   It's because it seems like he can get things done.

00:56:06   And if I, like for someone like that, I feel like I could persuade him to get the things done

00:56:11   that I think should be done.

00:56:12   But in the end, I think he's an effective executive.

00:56:15   Yeah.

00:56:17   You know, I don't know.

00:56:19   And I don't want to get too caught up in this tangent that I'm now starting, but I've heard

00:56:24   mixed about Federighi, mostly very good for sure.

00:56:27   And certainly his whole like stage persona, I, I just adore.

00:56:31   Like, I think you said this a moment ago, he's like everyone's favorite, you know,

00:56:34   fun, cool uncle.

00:56:35   But he's not like that in meetings.

00:56:37   No, from what I gather, he's not like that in meetings.

00:56:40   Unlike, uh, JG. You have to try to say his full name at least once

00:56:43   per episode, Casey.

00:56:44   Do you want to try?

00:56:44   Uh, yeah.

00:56:45   So it's John Giannandrea, I believe.

00:56:47   Yeah.

00:56:47   And then we, we just got reports from people who've worked with him that he's just the

00:56:50   nicest person.

00:56:50   He's very good.

00:56:51   You know, but like, we've discussed this, uh, at length, and there's

00:56:55   actually a recent New York Times article on this very topic that I didn't bring the

00:56:58   link for because I didn't think we'd go off on this tangent. But it's like, oh, you're such a

00:57:01   nice guy, but maybe you can't be a nice guy and be effective within an organization.

00:57:05   No, I haven't heard that Craig is not a nice guy, but you also have to be ruthless.

00:57:11   You have to, you know, be a serious person.

00:57:15   To quote, uh, uh, what's it called?

00:57:19   Um, the HBO show with the billionaires who are smarter than real billionaires.

00:57:22   Yes.

00:57:23   Thank you.

00:57:24   Succession.

00:57:24   Succession.

00:57:25   I love you all, but you're not serious people.

00:57:26   Craig Federighi is a serious person.

00:57:27   Uh, it doesn't mean that, you know, you should be mean and cruel to people, but you

00:57:32   do have to be firm and clear in your direction and decisive and so on and so forth.

00:57:37   And again, everything I've heard is that Craig is smart, technical, decisive, going to make

00:57:45   wrong calls occasionally, but again, an effective operator within the organization of Apple.

00:57:50   And JG?

00:57:50   All I've heard about him is that he's a really nice guy, very smart, very knowledgeable.

00:57:53   And all I've seen from the outside is that he's not able to effectively operate within Apple.

00:57:57   Yeah, it's very true.

00:57:59   The people I've spoken to within Apple have almost universally said good things about Federighi

00:58:04   and not to say that he's perfect by any means, you know, like you were saying a minute ago,

00:58:07   but that he's very, very smart.

00:58:09   And I've heard several people say that he is one of the smartest people that they know

00:58:12   and apparently has an uncanny ability to understand really complex technical problems almost immediately.

00:58:18   It's apparently incredibly impressive. Now, I don't really know Federighi.

00:58:22   I met him once for two seconds and he had no idea who I was, which is what I would expect

00:58:27   and did expect.

00:58:27   Um, but he's, he seems like a genuinely nice person.

00:58:30   And by the way, I want to say, as someone who's worked in big companies with big executives,

00:58:34   whenever I hear those stories of like, oh, Craig, he really knows his technical stuff,

00:58:37   blah, blah, blah,

00:58:38   this is how I take it, because I've experienced this.

00:58:40   What it means is that he's not entirely clueless, which is, like, the normal state for an executive;

00:58:46   like, he used to be a programmer,

00:58:47   but he's been out of it for so long or whatever.

00:58:50   But he's not entirely clueless, and he understands that sometimes it actually is important

00:58:55   to know, like, the broad outlines of what the deal is.

00:59:00   But also I think he's the type of person who would feel the urge to, as he said in various

00:59:06   interviews, to like dive into the code himself.

00:59:08   And he should absolutely never do that.

00:59:09   Correct.

00:59:10   Because his skills are not up to date, because that's not an effective use of his time,

00:59:14   because better programmers work for him, and because that's not his job.

00:59:17   And I don't think he does that.

00:59:18   Like, to be fair. But people are amazed when an executive knows anything.

00:59:22   I think Craig mostly understands that it's good that he has that background and that background

00:59:28   should be part of his decision-making because you see so much in big companies where there'll

00:59:32   be a technical executive who has been an executive for enough time that they no longer

00:59:37   think technical matters should be part of their decision-making at all.

00:59:40   And they should be, they should always be an aspect.

00:59:43   It doesn't mean he needs to understand everything from top to bottom.

00:59:45   It means he needs to ask smart questions to people who know the real answers, who give him

00:59:50   information that he then incorporates into his decision-making process.

00:59:53   And people read that as like, he knows everything.

00:59:54   He could do all of our jobs.

00:59:56   He could not, he can't do all of your jobs.

00:59:57   Like that's never the case.

00:59:59   He's not supposed to do all of your, he's supposed to do his job.

01:00:01   But there's a little bit of hagiography of like, oh, he's so technical and understands

01:00:05   everything.

01:00:06   It just means that he is properly taking into account technical considerations

01:00:12   that worse executives discard because they're like, well, I don't need to understand that

01:00:17   at all.

01:00:18   And in fact, no one needs to understand that.

01:00:19   And in fact, that's not even important.

01:00:21   All I need to know is the finished product and I don't need to understand anything about how

01:00:24   it comes to be.

01:00:24   And, uh, Craig, uh, luckily does not subscribe to that theory.

01:00:28   All right.

01:00:31   With regard to John Giannandrea's, uh, Siri leadership: in 2018, when JG arrived from Google

01:00:37   to run the newly formed AI group, his hiring was seen by the tech

01:00:41   industry as a coup for Apple.

01:00:42   And I remember this, we talked about this, that, oh man, they pulled like the king of AI

01:00:46   from Google.

01:00:47   Oh, I never would have expected it.

01:00:48   And I mean, I stand by that from what we knew at the time, even actually from what we know

01:00:52   now. Uh, even before Giannandrea took control of the assistant, members of the group working

01:00:56   on Siri felt like second-class citizens at Apple. Siri's engineers were frustrated by the

01:01:00   software engineering team's control over iOS updates, feeling they weren't prioritizing

01:01:04   fixes for Siri.

01:01:05   So this is a problem of when you're not in the one big software group, like, well, isn't

01:01:09   everything under software at various times?

01:01:11   No.

01:01:11   And so Siri, for whatever reasons was off to the side.

01:01:14   And that means that like, if, you know, I can understand the frustration of like, well,

01:01:20   there's this train going every year where they release a new version of iOS and a new version

01:01:23   of Mac OS or whatever.

01:01:25   And is the Siri team even in the meetings leading up to that?

01:01:28   And what, what does the Siri team want out of the platform?

01:01:30   Or do they just kind of have to like, are they just getting shoved along by the freight train

01:01:36   that is iOS N+1 that's coming down the line?

01:01:36   Uh, when you see stuff, I mean, everything in this article is kind of like organizational

01:01:41   dysfunction, but it's like, well, is Siri an important part of iOS?

01:01:45   Then maybe it should be better incorporated in the process that produces a new version of

01:01:51   iOS every year.

01:01:51   If the Siri group feels like they can't get what they want out of

01:01:57   the iOS updates, and their complaints go unheard, it's like, well, you know, Craig's in charge

01:02:02   of iOS and Craig's not in charge of you.

01:02:04   And so talk to your executive, and your executive talks to Tim,

01:02:06   and then Tim talks to Craig. Like, going up the chain to try to get Tim Cook's attention

01:02:09   to go back down the org chart into Craig's org to get what you want is never going to work.

01:02:13   And so this is maybe not the correct shape for the organization.

01:02:16   If you consider Siri to be an integral part of iOS, which it seems like organizationally

01:02:20   Apple did not, which is kind of crazy.

01:02:23   Like, Siri is a foundational technology.

01:02:28   It is as foundational as many other foundational technologies in Apple's OSes, things like

01:02:34   Foundation at the code level, you know, Swift core libraries, networking libraries.

01:02:40   Siri is on that level.

01:02:42   Uh, and you know, like iCloud, iCloud is also integrated in everything.

01:02:45   Like it should be treated in a similar fashion as that.

01:02:48   And it's, it's just wild that they have been outside for so long.

01:02:53   Like it was an acquisition, right?

01:02:54   So it kind of makes sense.

01:02:55   They'd be off in their own little corner.

01:02:56   But if anything, Siri has become more integral to Apple's products because we just got done

01:03:02   talking about this in a couple of past episodes.

01:03:04   Apple keeps introducing products that essentially hinge on Siri to be any good starting with like

01:03:09   the HomePod, but continuing to like this HomePod with a screen thing and other, like other things

01:03:14   that you can just talk to or Vision Pro.

01:03:16   We'll get to that in a second.

01:03:17   Like, things that don't have a screen, or that are convenient to use with a verbal

01:03:21   interface. And, you know, who knows if they're using quote-unquote real Siri or whatever

01:03:25   they use underneath.

01:03:25   But the point is Apple's branding for like a voice thing that you can talk to, to do stuff

01:03:28   is Siri.

01:03:29   And at this point now, the bad quality of Siri is literally holding up products.

01:03:34   They can't release the HomePod with a screen until they get the better Siri because it's

01:03:38   been designed with that in mind and that's not ready yet.

01:03:40   And it's just a really bad situation.

01:03:42   The software engineers for their part felt the Siri team couldn't keep up with supporting

01:03:46   new features that came out of Federighi's group.

01:03:48   In some ways, Giannandrea stood out among his colleagues at Apple.

01:03:52   Those who have worked with him described him as relaxed, quiet, non-confrontational, a contrast

01:03:55   to many other members of Apple's executive team, some of whom were known for their

01:03:59   demanding type A personalities.

01:04:00   Federighi's tough and demanding management style contrasts with JG's laid back approach.

01:04:05   When they were in meetings together, Federighi is known to bombard his colleagues with questions

01:04:08   while JG tends to do more listening.

01:04:10   After he joined, some of his colleagues told Giannandrea that he should shake up Siri's

01:04:14   leadership, but he didn't do so.

01:04:15   Yeah.

01:04:16   So Federighi being described as type A and demanding, it seems like maybe that personality type is

01:04:24   more successful within Apple. Especially like, JG joined up and was put in charge of

01:04:31   Siri.

01:04:31   I feel like he didn't read the moment like, oh, you've just come in from the outside.

01:04:36   You're a hired gun.

01:04:37   Your job is essentially do the thing that we couldn't do.

01:04:40   We have had Siri for a while.

01:04:42   We haven't done well with it.

01:04:43   You know, Google Assistant is better.

01:04:45   We're hiring you.

01:04:46   Come fix things.

01:04:48   Not shaking up the leadership.

01:04:50   It reads like, oh, do I really want to rock the boat?

01:04:53   Maybe I'll settle in here, figure out where things are, which kind of makes sense.

01:04:56   You don't come in and fire everybody because you don't know which person, you know, who's

01:04:59   responsible for what, but coming in and having your colleagues say, well, of course you're

01:05:03   here.

01:05:03   You are.

01:05:04   You're a big executive.

01:05:06   You report directly to Tim.

01:05:06   Uh, what's, what's your plan for shaking up Siri's leadership?

01:05:09   And, uh, and the plan is, oh, I don't plan on doing that.

01:05:12   Maybe he just thought it was the right call, but it reads as in this narrative of like,

01:05:17   well, maybe he was just a little bit too nice to do the shakeup that was required or

01:05:21   maybe didn't know what needed to change or I don't know.

01:05:23   But like, that's, that's, it's a difficult situation.

01:05:26   Like coming in as the fixer, like we've hired a big, uh, important person from outside who's

01:05:31   going to come fix the thing that we haven't been able to fix internally.

01:05:33   In some ways you expect that person to rock the boat.

01:05:36   And it seems like JG didn't do that.

01:05:38   With regard to Robbie Walker, one Siri leader often criticized by colleagues was Walker who joined

01:05:43   Apple in 2013 and became responsible for its daily operations at the end of 2022.

01:05:47   In the eyes of his critics, Walker was unwilling to take big risks on Siri and focused on metrics

01:05:52   that didn't move the needle much on its performance rather than having a grand vision for overhauling

01:05:56   the voice assistant.

01:05:56   For instance, he often celebrated small wins, such as reducing by minute percentages, the

01:06:01   delay between when someone asked Siri a question and when it responded.

01:06:04   Another pet Walker project was removing the "Hey" from the "Hey Dingus" voice command used to invoke

01:06:09   the assistant, which took more than two years.

01:06:12   Let me read that one more time.

01:06:14   Another pet Walker project was removing the "Hey" from the "Hey Dingus" voice

01:06:19   command used to invoke the assistant, which took more than two years to accomplish.

01:06:25   I mean, that, look, that is a funny headline.

01:06:27   Obviously that's not the only thing they were doing for two years.

01:06:31   But I mean, I think this what this what this reporting shows and, you know, obviously we

01:06:31   But I mean, I think what this reporting shows, and, you know, we

01:06:44   with an axe to grind.

01:06:45   So, you know, I'm sure there's lots of different opinions on how this team should be led and

01:06:49   what it should be doing.

01:06:50   And the people who were on the losing end of those fights are the ones talking to the information

01:06:53   and saying like, this is how this is why they were all wrong or they're all idiots or whatever.

01:06:57   But that being said, it sure does seem like the priorities of the Siri team have been focusing

01:07:09   on features that are like really kind of surface level, you know, paint job kind of features

01:07:14   without really fixing the underlying problem.

01:07:17   And we've seen that on the outside.

01:07:19   Like that seems like that seems right to us because what we've seen on the outside is

01:07:23   like, great.

01:07:23   This year, Siri has, you know, say a new voice.

01:07:27   OK, I mean, or new animation, right?

01:07:30   A new animation.

01:07:31   OK, that's nice, you know, but it still doesn't work a lot of times and gives bad answers and

01:07:36   is slow and unreliable.

01:07:37   And like none of those things seem like they were getting nearly enough work on them, whereas

01:07:44   Apple will come out and say, like, hey, you don't have to say "Hey" anymore.

01:07:49   OK, well, first of all, that wasn't really a problem.

01:07:52   Second of all, now that you don't have to say hey anymore, it actually triggers itself automatically,

01:07:57   erroneously, way more often than it did before.

01:08:01   So you'll be doing something that has nothing to do with it, you know, in the next room over

01:08:06   and your HomePod will say hmm or it'll start playing random music when you weren't talking

01:08:13   to it or it'll in the middle of your conversation with someone else, but it and say, I'm sorry,

01:08:19   I don't I can't look at that right now.

01:08:20   Like, look at your iPhone or whatever.

01:08:23   So they're doing things like, OK, these are minor kind of, you know, cosmetic or interface

01:08:29   types of things.

01:08:30   You shouldn't be spending that much time on that if the fundamentals are bad.

01:08:35   And if you are, you know, tight for engineering resources or, you know, money

01:08:42   from the higher-ups, which we'll get to, like, if you're tight on that, maybe things like that

01:08:47   are not the best use of your time.

01:08:48   Maybe like, you know, making Siri look and sound a little bit better or improving the

01:08:53   interface, you know, in some relatively minor way, like dropping the "Hey."

01:08:57   That to me shows bad leadership like that.

01:09:01   That just shows like the leadership is not properly allocating time and resources.

01:09:06   They're not prioritizing the big problems.

01:09:08   They're, like, you know, moving around some Titanic deck chairs instead of

01:09:12   actually fixing the problem.

01:09:13   And hopefully that's all changed now.

01:09:16   But this I mean, this article, the reason why this hit so hard is because this kind of

01:09:22   confirms what we've all suspected all this time, which is what is going on in Siri and

01:09:28   why won't it ever actually get better in meaningful ways?

01:09:31   And I think here's a bunch of reasons why.

01:09:34   Yeah, I agree with you about the Hey Dingus thing, removing the "Hey."

01:09:38   Like, oh, it took two years.

01:09:39   Like, that's just how long projects take.

01:09:40   Like, it doesn't again, it's not the only thing they were doing.

01:09:42   They tried to do it for one release.

01:09:44   It missed.

01:09:44   They said, we'll schedule it for next year because it's not pressing.

01:09:46   That seems perfectly normal to me.

01:09:47   I don't think that is damning evidence.

01:09:49   If anything, the most damning evidence against removing the "Hey" is that they made

01:09:54   it optional, because it does cause accidental activations.

01:09:56   So maybe it wasn't such a great idea to begin with.

01:09:58   But anyway, the two-year timeline on that is fine.

01:09:59   It doesn't bother me at all.

01:10:01   That's just every, everything they do.

01:10:02   They try for one year.

01:10:04   It doesn't make it.

01:10:04   They go for the next year.

01:10:05   It's fine.

01:10:05   Reducing the delay, making Siri respond faster.

01:10:10   That's important.

01:10:11   We complain about that all the time, but as Marco pointed out, yeah, okay, do that.

01:10:15   But also you see like the ship is going down, right?

01:10:19   Like, like sure.

01:10:20   By all means, do rearrange the deck chairs, but there is another problem that is much bigger.

01:10:25   So I don't think any of these things shouldn't have been done.

01:10:27   In the meantime, they should have been doing stuff like: Siri

01:10:30   doesn't respond that fast.

01:10:31   Can we improve that?

01:10:32   And sure, you've got a team trying to get rid of the "Hey," because that could

01:10:36   pay dividends for a more conversational product, like a thing with the screen, or could make

01:10:40   HomePods better.

01:10:40   Like, by all means do all those things. But I don't think that's what the entire team

01:10:44   was doing, and whatever they were doing, it was not what they needed to do, which is

01:10:48   seriously overhaul Siri.

01:10:50   And to that end, there's more to this section, believe it or not.

01:10:53   Indeed.

01:10:54   And last year, Walker dismissed an effort by a team of engineers to use LLMs to give

01:10:58   Siri more emotional sensitivity.

01:10:59   So it could detect and give appropriate responses to users in distress.

01:11:03   Walker told colleagues he wanted to focus on the next release of Siri rather than commit

01:11:07   resources to the project.

01:11:08   Without his knowledge, the project's engineers bypassed him to continue working on those capabilities

01:11:12   with the software engineering group's safety and location team.

01:11:15   Warning, warning, another giant red flag.

01:11:18   If someone in charge of some important group of things tells some part of their group not to

01:11:23   do a thing and they go off and do it anyway in cooperation with another group, that is

01:11:27   major organizational dysfunction.

01:11:29   They're essentially disobeying the person who's supposed to be leading them.

01:11:34   It's like when, you know, your dad says no,

01:11:38   so you go to mom to try to get a yes.

01:11:39   That's not a healthy organization, right?

01:11:43   The fact that this even was allowed to happen, right?

01:11:45   Because this is the thing I believe what they're referring to is the thing that we've mentioned

01:11:48   on the show.

01:11:48   And Casey brought it up like, have you noticed that Siri is more emotional now?

01:11:52   I think this is a thing that shipped and did actually appreciably change something.

01:11:57   And again, maybe that wasn't the thing they should have been working on, but it's, it's

01:12:00   a dysfunction, how they went about doing it.

01:12:03   And Walker's getting a lot of crap in a lot of these reports, because,

01:12:06   you know, I assume he's under JG and more directly in charge of Siri, and he was apparently

01:12:12   not setting appropriate priorities.

01:12:14   Now, were these his priorities or were they given to him by his management?

01:12:18   Who knows?

01:12:18   Um, but yeah, when you say don't do this and they go do it anyway with like, you know,

01:12:25   with Craig's group or whatever, like that's something is going wrong.

01:12:28   And I feel like, uh, like Walker should have been telling his boss, like, Hey, this is what's

01:12:34   going on here.

01:12:35   I told them not to do this and they're doing it anyway, but they're doing it with another

01:12:37   group.

01:12:37   But like, are we in charge of Siri or are we not?

01:12:39   Like maybe the current organization is not right.

01:12:41   And this should bubble up to the level of like, I don't know, companies that do like

01:12:45   reorgs every year, that's not healthy either.

01:12:47   But you do have to occasionally look at your org chart and say, in light of our current, uh,

01:12:54   products and our future plans, does this organization make sense?

01:12:57   And I think for years and years, Siri has not been correctly homed.

01:13:01   The article references it as a hot potato that had been bouncing around.

01:13:04   That doesn't reflect its importance to the company's current products.

01:13:08   And certainly doesn't reflect its importance to future products.

01:13:10   And with the advent of LLMs, it definitely doesn't reflect the future direction of

01:13:14   the company.

01:13:15   So, I mean, to Apple's credit, they did eventually figure this out and rearrange things.

01:13:20   And hopefully the new arrangement is better.

01:13:21   But yeah, like anytime I've ever seen that happen, like, uh, my boss says no, but this other

01:13:27   person over here says yes.

01:13:28   So we'll do it.

01:13:29   Like, I don't think it's in secret or whatever, but we'll do it with them.

01:13:32   And then just everyone grudgingly allows it.

01:13:33   Someone needs to be noticing this is happening and saying, this is a sign of an unhealthy

01:13:38   organization.

01:13:39   We need to fix this.

01:13:40   Yeah.

01:13:41   Yeah.

01:13:42   Uh, it's weird too, because I am, and we are, so dissatisfied with the state of Siri

01:13:49   today that the point that was made earlier is true.

01:13:53   And it is worth remembering that there was a time that Federighi had Siri.

01:13:58   There was a time that Eddy Cue's team had Siri, and it was trash during those times too.

01:14:04   So I do think Craig has a well-earned reputation for delivering.

01:14:09   And certainly Rockwell does, uh, after delivering the Vision Pro, because whether or not you think

01:14:14   the Vision Pro is a good product, it is a very, very impressive product, and he got

01:14:20   it done.

01:14:21   And we're going to talk more about him in a minute.

01:14:21   Um, but just because Federighi is pulling this back under his wing, it doesn't necessarily

01:14:26   mean it's going to get any better.

01:14:27   And it is better aligned with like, if he's in charge of all the software and it's an

01:14:31   incorporated thing, it makes sense.

01:14:32   And like, I don't know how long he had it and how long Eddy had it.

01:14:34   And the most important thing is, when Siri was homed in various other places, it was

01:14:39   not quote-unquote obvious what should be done. With the advent of LLMs, it's like, oh,

01:14:44   their capabilities are a good fit with what we're trying to do with

01:14:49   Siri.

01:14:49   And so now the direction is clear: something, something, LLM.

01:14:53   Now it's much clearer than before, when it was like: we bought this company, we don't even know this

01:14:57   technology.

01:14:57   They produced this voice assistant for us and it's okay, I guess.

01:15:01   And we can try to improve it, but it wasn't so clear.

01:15:03   Like, I mean, obviously Google Assistant was doing better and Alexa was doing better.

01:15:07   So arguably Apple should have hired away Google's person.

01:15:11   Like they eventually did, or someone from Amazon earlier and asked them, Hey, why are your assistants

01:15:15   so much better than ours?

01:15:16   And they probably would have said, oh, because it's structured nothing like Siri and that company

01:15:19   you bought had a good idea back in 2008, but we have better ideas now, but they didn't

01:15:23   do that.

01:15:24   They said, well, we've got this product.

01:15:25   We're kind of stuck with it.

01:15:26   It's, it is what it is.

01:15:27   It's not entirely clear how it could make like a quantum

01:15:31   leap to be better than it is.

01:15:32   So it got bounced around. But yeah, once JG appeared and once LLMs appeared, it's like,

01:15:37   now we know how we can make Siri better.

01:15:40   And the failure to do that is why this article exists.

01:15:45   We are sponsored this episode by HelloFresh.

01:15:47   With HelloFresh, you get farm fresh pre-portioned ingredients and seasonal recipes delivered right

01:15:53   to your doorstep.

01:15:54   Skip trips to the grocery store and count on HelloFresh to make home cooking easy, fun, and

01:15:59   affordable.

01:15:59   That's why it's America's number one meal kit.

01:16:02   Look, it's not easy to find time to eat well.

01:16:05   HelloFresh gives you 50 wholesome, hassle-free meals to choose from each week delivered right

01:16:10   to your door.

01:16:11   And if you don't even have time for that, their new ready-made meals go from the fridge

01:16:15   to your fork in just three minutes, using the same high-quality ingredients and restaurant-worthy

01:16:20   flavor you expect from HelloFresh, but with none of the work.

01:16:23   They have all sorts of options to suit whatever you're going for.

01:16:26   They have prep-and-bake meals.

01:16:28   This new lineup comes together with minimal mess and only five minutes of prep, so your oven

01:16:32   does most of the work, not you.

01:16:33   And the 15-minute meals are done in just three simple steps, so you can eat better without

01:16:37   all the hassle.

01:16:38   HelloFresh gives you so many good options.

01:16:40   I personally have used HelloFresh, and I've been very impressed by it.

01:16:44   I really enjoy, you know, the food quality, the convenience, and I love having these faster

01:16:49   options.

01:16:49   I just, I love that.

01:16:50   Because so often, recently especially, I've been super busy, and to have these faster

01:16:54   options, it's just been so nice.

01:16:56   Also, you've heard us talk about Green Chef in the past.

01:16:58   Green Chef is now owned by HelloFresh.

01:17:00   So with a wider array of meal plans to choose from, there is something for everyone.

01:17:04   You can switch between the brands, and you can do whatever you want, you know, with all

01:17:08   their very many, many, many options they have.

01:17:11   So check it out today.

01:17:12   HelloFresh is a great plan.

01:17:15   Get up to 10 free meals and a free high-protein item for life at hellofresh.com slash ATP10FM.

01:17:24   One item per box with active subscription.

01:17:26   Free meals applied as discount on first box.

01:17:29   New subscribers only.

01:17:30   Varies by plan.

01:17:31   That's up to 10 free HelloFresh meals.

01:17:34   Just go to hellofresh.com slash ATP10FM.

01:17:39   Thank you so much to HelloFresh for sponsoring our show.

01:17:42   HelloFresh, America's number one meal kit.

01:17:44   Cross-group tensions.

01:17:50   We've kind of bounced off of this so far.

01:17:52   Other resentments between JG's AI ML group and Federighi's software engineering group also

01:17:56   built up.

01:17:56   Some in the software engineering group were annoyed by the higher pay and faster promotions their

01:18:00   colleagues in the AI group were receiving.

01:18:02   And they were bitter that some engineers in the AI group seemed to be able to take longer

01:18:05   vacations and leave early on Fridays while they in the software engineering group faced

01:18:10   more punishing work schedules.

01:18:11   Distrust between the two groups got so bad that earlier this year, one of Giannandrea's

01:18:15   deputies asked engineers to extensively document the development of a joint project so that

01:18:19   if it failed, Federighi's group couldn't scapegoat the AI team.

01:18:22   It didn't help.

01:18:24   Can I just for a minute here?

01:18:26   Yeah, yeah, yeah.

01:18:27   We're going to, this is going to keep showing how like, you know, terrible the AI team is

01:18:31   probably, but I do want to step back by one paragraph and say the workaholic culture is

01:18:38   not helpful.

01:18:39   No.

01:18:39   The idea that, like, software engineers in Federighi's org resented the AI org because

01:18:45   they were able to have weekends and evenings; like, that's just healthy.

01:18:52   And, you know, Silicon Valley has kind of this, you know, this, this workaholism sickness and

01:19:00   obsession.

01:19:00   And there is this culture of, you know, if, if you're doing either really important work

01:19:06   or you're in some kind of crunch mode, like you can't take a weekend, you got to work all

01:19:11   the time.

01:19:11   And if you don't, you're not a team player or you're not good enough or you're not devoted

01:19:15   enough.

01:19:16   That's just, I mean, first of all, I find that incredibly unethical.

01:19:21   And second of all, it's unnecessary and tends to produce worse output.

01:19:25   So the idea that like the AI team was like lazy or whatever, because they were taking weekends

01:19:32   and occasionally vacations and leaving early on Fridays.

01:19:36   Oh my God.

01:19:38   What?

01:19:38   Like if the only way you can get good output is to overwork your people, you have a management

01:19:45   problem, a serious management problem and a cultural problem.

01:19:48   So, you know, I won't stand up for the Siri team on a lot of things, but I'll stand up

01:19:53   to them for this.

01:19:54   Like this is something that like, this is not a major deficiency.

01:19:58   They have many, many, many other problems.

01:19:59   This is not one of them.

01:20:00   So this, uh, the kind of, this type of work culture, this aspect of work culture does exist

01:20:06   on a spectrum.

01:20:12   And I think the one problem that Apple has had probably forever, but certainly in the Jobs

01:20:16   2.0 era and onward, is kind of the same problem that the game industry has.

01:20:16   The game industry is even more notorious for crunch and burning through people; not

01:20:21   so much workaholism, but just, like, you know, a wood chipper that

01:20:25   you throw employees into.

01:20:26   Right.

01:20:26   Uh, and part of the reason the game industry has that reputation is the thing that Apple

01:20:32   falls victim to a little bit as well.

01:20:35   People really love games.

01:20:37   You know, someone who's like, I grew up playing Doom.

01:20:41   I want to work in the games industry.

01:20:42   I want to work for id Software.

01:20:44   I mean, that's a dated reference.

01:20:45   But anyway, um, acquired many times over by this point.

01:20:48   Um, people who get into the game industry, much like the entertainment industry, like they

01:20:52   really want to be there.

01:20:54   And not only do they really want to be there, there is not an infinite number of slots.

01:20:58   It's not the same as Hollywood where there's a tiny number of slots or whatever, but like

01:21:01   it's the games industry is big enough to absorb a lot of people, but it is not trivial to break

01:21:07   into, especially if you want to be doing something that's not menial.

01:21:09   Like I want to be the designer for the next big AAA game.

01:21:12   Well, how many jobs are there that is the designer of the next big AAA game?

01:21:16   There's not hundreds of those jobs in the world because there's not hundreds of next

01:21:20   big AAA games every year.

01:21:21   Um, so you have an industry that people really, really want to be in, and you take these employees,

01:21:27   often young and very enthusiastic, and you hire them onto a company.

01:21:30   And the thing is they want to work their butts off.

01:21:33   They want to be workaholics.

01:21:34   They want to stay on the weekends.

01:21:36   They want to like make friends at work and do all this stuff where like they don't, these

01:21:39   people don't understand work-life balance and they want to do that.

01:21:43   And the manager's like, Oh, they want to do it.

01:21:45   Yeah.

01:21:46   We'll just make that the culture, I guess.

01:21:47   And then boy, look at this.

01:21:48   We get our games out sooner and we can just, if, if we're missing a date, we'll just make

01:21:52   them work 24 hours a day, seven days a week.

01:21:53   And, and we'll call it crunch and it'll just be part of the industry and it'll be great.

01:21:57   And that's one of the worst things about the games industry that it grinds people up like

01:22:01   that.

01:22:01   Apple has that problem, a smaller version.

01:22:03   People really want to work in Apple because they admire Apple and they like its products.

01:22:07   And so they get there and they're enthusiastic and they'd be like, we're working on the next

01:22:10   version of iOS.

01:22:11   And this is amazing.

01:22:12   We're doing this amazing thing.

01:22:13   And they care about it so much that they want to come in on weekends.

01:22:15   They want to work on it.

01:22:16   And it seems like the culture at Apple was like, oh, well, these people really want to

01:22:20   work on it.

01:22:20   I mean, I'm not going to tell them, no, they can't come in.

01:22:22   And, and so they come in and they work their butts off and they burn out.

01:22:25   And as Marco pointed out, that's bad management.

01:22:28   And it's tempting to do that because you have these people who are enthusiastic and who really,

01:22:32   really want to work there.

01:22:33   But like so much of Apple culture is sort of based on the premise that of course, everyone

01:22:37   wants to work for us because we're Apple and we're a great company.

01:22:39   And our, our, everything about our company, our, our buildings are great and beautiful

01:22:43   and our products are great and beautiful.

01:22:44   And don't you want to just be a part of that?

01:22:46   And no, you're not going to get any credit.

01:22:48   Jobs got rid of that too or whatever, but don't you just want to be a part of it so

01:22:48   The jobs get rid of that too or whatever, but don't you just want to be a part of it so

01:22:51   much so that you're just going to work your butt off.

01:22:53   And like, you have to figure out a way to balance employees' enthusiasm for doing their

01:23:00   job, which is a good thing with not grinding through them like human meat because it's not

01:23:08   good long-term for anything, but it's so tempting.

01:23:11   Even if you're a good manager, even if you mean well, like your incentives are aligned to

01:23:15   be like, well, one or two weekends shouldn't hurt.

01:23:17   And when it comes time to review people like, well, that person does come in a lot.

01:23:20   And so into that environment, you add someone like the AIML group, which is from a more like

01:23:24   maybe let's say academic background, or let's say a more Google-like background.

01:23:28   Google historically, if maybe not today, had a much different reputation, which was like,

01:23:33   come work here and you won't have insane hours.

01:23:35   And we won't expect you to be crunching on the next big release of whatever.

01:23:39   And you'll be able to just kind of float around and follow your interests and go from group

01:23:42   to group.

01:23:43   And it's much more like a college campus type of atmosphere, much more relaxed.

01:23:47   And Apple people probably look at that and say, look at those people.

01:23:50   That's why they can't ship any products.

01:23:51   We're so much more serious here.

01:23:52   I'm going to say that the correct position to be in is somewhere between the lackadaisical early

01:23:57   Google thing and the current Apple philosophy.

01:24:01   And I can understand how a group like AIML inside Apple looks to the rest of Apple like

01:24:06   a bunch of, like, floofy, head-in-the-clouds college kids, especially since, like,

01:24:12   you know, Apple needs to hire people who know AI stuff.

01:24:15   So they're giving them more initial pay, giving them big promotions, and they work less than

01:24:20   we do.

01:24:20   That's where the resentment comes from.

01:24:21   It's like, well, you know, someone at Apple recognized that if you want people who understand

01:24:26   this AI stuff, we need to hire them away.

01:24:28   We need to give them a lot of money.

01:24:29   We need to get them to come here.

01:24:31   And we can't grind them up like meat because we're trying to attract them.

01:24:34   Like we need to get them here.

01:24:35   Right.

01:24:35   And the poor Apple engineers who are just grinding away at some Objective-C code are like, you

01:24:39   know, we're making the actual product here.

01:24:40   Like, yeah, but there's a million of you waiting.

01:24:42   If you leave, there's 20 more kids coming out of school who want to do the exact same

01:24:45   job you do.

01:24:45   So keep working hard.

01:24:47   But anyway, over here at AIML, we give promotions freely and big pay packages, and everything's so much

01:24:54   more relaxed.

01:24:54   You don't have to worry about shipping a product.

01:24:56   Just work on these big problems and publish your papers.

01:24:58   And yeah, that can lead to cross group tension.

01:25:01   And it is like, so that's documenting what you're doing so that when the thing fails, you know

01:25:07   who to blame.

01:25:08   You don't want anyone in your organization spending any amount of time on essentially butt

01:25:13   covering because that's what that is.

01:25:15   It's like, I am already thinking that this project is probably going to fail.

01:25:19   And when it does, I want to make sure no one can blame me.

01:25:22   So let me write down everything that happens in every meeting.

01:25:24   So then they can say, oh, look here, it wasn't our fault.

01:25:26   How about you try to make the product succeed instead?

01:25:28   Right?

01:25:29   How about you work together to make a success instead of planning for a failure?

01:25:33   Like, so this is incredibly unhealthy.

01:25:36   Just, just incredible.

01:25:37   Like, like being instructed to do that or like having that take place.

01:25:42   It's like, someone in management should go, wait a second, you told them to do what?

01:25:45   Same team, guys. Like, how about we make the product not fail?

01:25:49   Like, and this is maybe like from bitter history.

01:25:52   It's like, well, when we tried to do that last time, we got blamed for all the failures

01:25:54   and everyone called us aimless.

01:25:55   And so we're going to make sure that we don't get blamed this time.

01:25:57   And it's just so unhealthy.

01:25:59   Like this whole, this whole org needs to be in like family therapy together.

01:26:03   Yeah, they really do.

01:26:05   Uh, all right.

01:26:07   So continuing on from the article, it didn't help the relations between the groups.

01:26:10   When Federighi began amassing his own team of hundreds of machine learning engineers

01:26:13   that goes by the name Intelligent Systems and is run by one of Federighi's top deputies,

01:26:17   Sebastien Marineau-Mes.

01:26:19   I got to stop again.

01:26:20   Here we go.

01:26:21   He did what?

01:26:22   He hired his own team of hundreds of machine learning engineers because he didn't like AIML.

01:26:30   How, how, how does that happen?

01:26:33   How is there, does he have enough discretionary budget that he's like, you know what?

01:26:37   The Siri team's not getting the job done.

01:26:38   I should hire some of my own machine learning engineers.

01:26:41   A couple hundred would do it.

01:26:42   Don't you think?

01:26:43   What?

01:26:44   How, how many are in AIML?

01:26:46   How much money does Federighi have?

01:26:48   Does Tim Cook know that there's a team being made with hundreds of machine learning engineers

01:26:52   in Federighi's org?

01:26:53   None of whom are working on Siri, but also there's the Siri group.

01:26:56   Like Apple is, you know, famously sort of tight with money internally in terms of they don't

01:27:02   give the highest salary.

01:27:03   They don't like hiring tons of people.

01:27:04   Their teams are small and yet somehow Federighi gets a hundred, hundreds, multiple hundreds

01:27:09   of machine learning people.

01:27:10   Because I mean, obviously there's machine learning stuff that's part of the OS group.

01:27:13   It's not all Siri.

01:27:13   Like I'm not saying he shouldn't have any machine learning people, but this seems like a duplication

01:27:18   of effort that should have been another huge set of red flags of like, shouldn't like having

01:27:24   machine learning expertise and stuff, shouldn't, shouldn't we have them all kind of like,

01:27:28   aren't we a functional org?

01:27:29   Why should Craig have his kingdom of them and JG have his kingdom of them?

01:27:33   And just, it's just, as this article went on, it was a forehead smacker.

01:27:39   You're like, oh my God. And setting aside people with axes to grind

01:27:44   and grudges or whatever, just like explaining, uh, Intelligent Systems or whatever.

01:27:48   Like, I think there's a perfectly good reason for that group to exist.

01:27:51   I'm not saying it shouldn't exist at all.

01:27:52   I'm saying, but there's Siri over there.

01:27:55   Like, shouldn't the Intelligent Systems people, like, have lunch with them and say, what are

01:27:58   you guys doing?

01:27:59   Hey, what are you guys doing?

01:27:59   Maybe should we, should we work together and like, should we be cooperating more closely

01:28:03   than we are?

01:28:04   Should we have the same boss as you go up the org chart before you get to Tim Cook?

01:28:08   The answer to that is yes.

01:28:10   Yeah.

01:28:11   Yeah.

01:28:11   It's, it's really not great.

01:28:13   Um, it, it, uh, let's see.

01:28:15   Okay.

01:30:15   Over the years, Intelligent Systems has trained its own models and built demos that

01:30:18   enabled users to control apps with voice commands, often without help from the Siri team.

01:30:21   That created tensions within the Siri group.

01:28:24   You don't say. In one internal Apple presentation, a member of Intelligent Systems showed a slide

01:28:29   depicting an animation of two mountains smashed together and flattened, which some saw as a

01:28:33   subtle dig at Giannandrea's hill climbing philosophy.

01:28:36   Yeah.

01:28:36   I called that one out because he's been a fan of hill climbing, which is a machine learning,

01:28:39   uh, term of art.

01:28:40   Uh, and I'm not sure how two mountains smashed together is saying hill climbing is dumb.

01:28:44   You shouldn't do it, but snarking at each other across the organization and your slide

01:28:49   presentations is yet another sign that things are going wrong.
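Since hill climbing comes up here as a term of art: it's a simple local-search optimization strategy, and the standard knock on it is that it gets stuck on local peaks. Here's a minimal illustrative sketch in Python; the toy objective and function names are made up for this example, not anything from Apple's systems.

```python
def hill_climb(score, start, neighbors, max_steps=1000):
    current = start
    for _ in range(max_steps):
        candidate = max(neighbors(current), key=score)
        if score(candidate) <= score(current):
            return current  # stuck: no neighbor improves, a local optimum
        current = candidate
    return current

def score(x):
    # Toy objective: a small hill peaking at x=0 and a taller one at x=10.
    return max(1 - abs(x), 5 - abs(x - 10))

def neighbors(x):
    return [x - 0.5, x + 0.5]

# Starting on the small hill, hill climbing stops at its peak (x=0.0)
# and never finds the taller hill at x=10: the classic local-optimum trap.
print(hill_climb(score, start=0.0, neighbors=neighbors))
```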

01:28:51   And yeah, I can understand how, like, so it's just the teams feel like they're not getting

01:28:55   what they want out of Siri, but they do want to control things with the voice.

01:28:58   So they sic their private little team on it and show a demo of voice control.

01:29:03   And then JG is like in the room and someone was like thinking that's part of Siri, but

01:29:07   it's not part of Siri.

01:29:08   That's just, I can imagine the Siri team being territorial and saying, anytime you talk to

01:29:12   a thing at Apple that you're calling Siri, it should be going through our group.

01:29:14   And they'll say, well, we can't go through your group because you guys stink.

01:29:17   So we did it all ourselves. And that's not healthy.

01:29:20   Well, and this is the kind of thing, like when you have groups fighting with each other like

01:29:24   this at the same level, you know, you have, like, Federighi's

01:29:28   organization versus JG's organization, like this is a problem that Tim Cook would

01:29:33   need to handle.

01:29:34   Like this, like when you have these two orgs fighting, like if they can't work it out amongst

01:29:38   themselves, then the person above them has to step in and say, hey, quit it.

01:29:44   This is ridiculous and wasteful.

01:29:46   Instead, let's, you know, let's make some changes to do this right.

01:29:50   Like that, and this is one of the problems with like Tim Cook generally seems like he

01:29:57   does not like to get involved with things like this.

01:30:00   What we hear a lot is that Cook's approach to conflict management is don't bring me conflicts.

01:30:07   That's what we hear many, many times.

01:30:11   So what happens?

01:30:12   The conflict just gets pushed downward in the organization into areas like this, where like

01:30:17   you have clear, obvious dysfunction.

01:30:20   Like when, when these two orgs are both kind of like fighting to do the same things because

01:30:25   they don't like each other and don't like the way that each other are doing things like that's

01:30:28   terrible.

01:30:29   Like that's obvious dysfunction.

01:30:30   It's not good for the products.

01:30:32   It's not good for the company.

01:30:33   It's not good for the orgs.

01:30:34   But yet, because conflict cannot be managed correctly above them, it's pushed down into these

01:30:41   little fiefdoms and fights within them.

01:30:43   And that's on Tim Cook.

01:30:45   That is a hundred percent on Tim Cook.

01:30:48   I don't think it's a hundred percent because I think don't bring me conflicts is a valid

01:30:52   philosophy, but the whole point is it's supposed to breed people below you that can resolve

01:31:01   conflicts because they know they can't bring them to you for you to solve.

01:31:04   So they have to solve them.

01:31:05   And I, that's why I think some of the failures also with Federighi and JG, because they both

01:31:10   reported to Tim Cook, and Tim Cook says, don't bring me conflicts.

01:31:13   They should understand that to mean,

01:31:14   we have to resolve this conflict ourselves.

01:31:16   And they simply failed to do so.

01:31:18   And you could say, well, he, you know, Tim should figure out that they're failing there

01:31:23   and help them.

01:31:23   But like, that's, that's delegation and organization.

01:31:26   And the one, like the one thing I think about Tim Cook in conflict is one of his very early

01:31:30   moves was, in fact, to resolve a conflict, which was Jony Ive versus Scott Forstall.

01:31:35   And he went for the more harmonious agreement, which is like, if I get rid of Forstall, everyone

01:31:42   else agrees they can get along.

01:31:43   If I keep him, people are still going to be angry.

01:31:45   Therefore, let's go for group harmony.

01:31:47   That was an extremely high-level decision that involved Jony Ive, which is as close as you

01:31:52   could get to like an important employee who's not Steve Jobs or the CEO.

01:31:55   And so I understand why he, uh, participated there, but I can also understand his philosophy

01:32:00   of don't bring me conflicts.

01:32:00   And so that's why I leave some of the blame for this to in fact be with, uh, Federighi and

01:32:05   JG, because maybe, maybe it's not them that were having the disagreement, but it was the underlings

01:32:10   and whatever, like you can chase it down in the end.

01:32:11   Obviously it's Tim Cook's fault in the end.

01:32:13   Everything is Tim Cook's fault in the end because the CEO is in charge of everything.

01:32:16   But I don't think the don't-bring-me-conflicts approach is a non-workable strategy.

01:32:22   It's just, it's kind of like a trust, but verify.

01:32:24   You have to also occasionally check in to see that the conflicts are in fact being dealt

01:32:31   with at a level below you and you're not just being shielded from them.

01:32:34   Like that's, we don't know how much communication goes up and down the chain or what people discuss,

01:32:37   but I can totally see like, cause it's such a big organization.

01:32:40   If everyone, when they're in meetings with Tim, just complained about the other groups,

01:32:45   like, the whole meeting could just be about which groups think the other group is

01:32:48   letting them down or whatever.

01:32:49   And he just wants to hear how you're succeeding.

01:32:50   And so they should be motivated to work together.

01:32:53   And just, this seems like a failure of that system to work.

01:32:57   But I would also say like the, the whole idea of don't bring me conflict, you are right

01:33:04   that like that should breed the, you know, the people under them to, you know, work things

01:33:09   out amongst themselves.

01:33:10   But I think it's, it's based on a fantasy that that will always be fine or, or that is even

01:33:16   the best approach.

01:33:17   I think what happens in reality is if you say at the top, don't bring me conflict, that

01:33:22   doesn't prevent conflicts from happening.

01:33:24   It just prevents them from being resolved by the top, which means that they have to kind

01:33:30   of like shove solutions together below that level.

01:33:34   Not necessarily in a good way.

01:33:36   Like what, what happened here?

01:33:38   This, this was a conflict that they seemingly could not resolve.

01:33:42   So what happened was they solved it in a terrible way.

01:33:46   Like they, they kind of worked around the conflict, kind of exacerbating it, making worse outcomes,

01:33:51   you know, less efficiently.

01:33:53   And like, this was a terrible solution.

01:33:56   And so I, I do feel like the, you know, this should have involved higher levels stepping

01:34:02   in and they didn't, and that's a failure on their part.

01:34:05   I mean, they eventually did like, as with so many things, uh, as, as I think you've said

01:34:10   on past episodes a couple of times, Apple will eventually, assuming it stays alive, figure

01:34:15   out what it's doing wrong and correct it, but it's really slow.

01:34:17   And so, I mean, look, they just, this is an example of presumably Tim Cook stepping in

01:34:23   and resolving a conflict years, years too late.

01:34:26   Not, not even close to like in a timely manner.

01:34:29   Uh, yeah.

01:34:30   And you know, it's the other, you know, Ed Catmull book thing: success hides problems.

01:34:33   Uh, it's also very difficult to deal with these things when everything's going gangbusters,

01:34:38   even if it's not these particular things that are going gangbusters, but just, oh, the company

01:34:41   is successful.

01:34:41   Everybody loves iPhone, stock price going up. That hides a lot of this stuff because that

01:34:46   can lead people to do the things that they've done here.

01:34:49   We're just like, well, I'll just hire my own machine learning team and we'll work it

01:34:52   out and blah, blah, blah.

01:34:53   And it's just, it's difficult.

01:34:55   This is part of the difficulty of being, you know, one step down the org chart from a CEO

01:35:00   who says, don't bring me problems because I'm sure Craig tells his lieutenants the same thing.

01:35:04   It is a reasonable system.

01:35:06   And the opposite is way worse, which is, all you ever do is hear people complaining about

01:35:11   problems.

01:35:12   Like the whole point of you getting the bazillion dollars is, like, solve it.

01:35:16   That's what we're paying you for.

01:35:17   Don't bring, don't bring every problem to your parents to have them resolve it.

01:35:20   That's not a functional organization either.

01:35:22   But yeah, this was a misstep.

01:35:24   And the, like, it's one of the difficulties of being Tim Cook is how thin your attention

01:35:30   is spread.

01:35:31   There are so many more things that he has to think about besides the quality of Apple's

01:35:35   products, which is a weird thing to say.

01:35:37   But when you're the CEO, things like where are we manufacturing them and what are we doing

01:35:41   financially and how are we hiring things and antitrust and government, like there's just

01:35:47   so much to think about that, you know, this one can sneak up on you in a few, in a few dozen

01:35:53   years if you don't pay attention to it.

01:35:55   So yeah, that's what, that's why we're, again, that's why we're talking about this is, this

01:35:57   is definitely one of the ones that did fester, didn't get addressed in a timely manner,

01:36:03   and they finally did do something about it.

01:36:03   And now we get to see the, uh, the postmortem or the, the, uh, the glimpses of the postmortem

01:36:09   from the outside.

01:36:10   With regard to the when-will-my-mom's-flight-arrive demo during WWDC: during an onstage

01:36:15   demo at WWDC 2024, when an Apple executive asked Siri when her mom's flight would land, the

01:36:20   voice assistant accessed her email and real-time flight data to provide the current arrival

01:36:23   time.

01:36:23   She then asked Siri to remind her about their lunch plans and the assistant plucked the

01:36:27   details from her iPhone's messages and plotted a route from the airport to the restaurant.

01:36:30   Among members of the Siri team at Apple, though, the demonstration was a surprise.

01:36:36   They had never seen working versions of the capabilities.

01:36:40   Woof.

01:36:41   At the time, the only new feature from the demonstration that was activated for test devices was a pulsing

01:36:46   colorful ribbon that appeared on the edges of the iPhone screen when a user invoked Siri.

01:36:49   Oh my word.

01:36:51   Yeah.

01:36:52   So this is some interesting wording from The Information here, and you can read it in a way

01:36:56   that is not as ridiculous as it sounds.

01:36:58   Uh, because what they're saying is like the Siri team was surprised by this demo because they'd

01:37:02   never seen it working.

01:37:04   It doesn't mean they weren't working on it.

01:37:05   It just means they knew it didn't currently work.

01:37:07   And so they didn't expect to see it demonstrated because why would you demonstrate a thing that

01:37:11   doesn't work?

01:37:11   So all of a sudden you see it in the demo.

01:37:12   Maybe you've been working on it for a year, for two years, but you know, it's not in a state

01:37:16   where you can demo it.

01:37:17   And so they show the demo and you're like, we'd never seen that working.

01:37:20   And then this other part here, I felt like it was straight from The Information, from

01:37:23   the author of the article, saying at the time, the only new feature from the demonstration that

01:37:26   was activated for test devices was the pulsing colorful border.

01:37:31   It doesn't mean the only thing that had been developed was the pulsing colorful ribbon thing

01:37:36   or whatever.

01:37:36   It's that that's the only thing that was activated for test devices, again, within Apple or whatever.

01:37:40   Um, so I think this is just basically an example of the team being blindsided by a demonstration

01:37:46   that essentially put them behind the eight ball to say, we know you've been working on this.

01:37:50   We know it's not ready where we can actually demo it, which is why we're not going to show

01:37:54   it to the press at all, but we are going to put it in the keynote.

01:37:56   And as I said in past episodes, I feel like this is a type of thing that has happened many

01:38:01   times at Apple.

01:38:02   Not that I'm endorsing it or saying it is a good thing, but from the famous Steve Jobs saying

01:38:07   we're going to open source the FaceTime protocol and everyone on the FaceTime team going, wait,

01:38:10   what?

01:38:11   It's all the way down to half the other demos we've seen where we know the software is not

01:38:16   ready to be shipped, but they demo it anyway.

01:38:19   And they say this is shipping in iOS 5 or whatever.

01:38:21   And the team is like, oh God, I guess we have to, I guess we have to get this thing working

01:38:25   in time for iOS 5.

01:38:26   Cause Steve Jobs just went on stage and said it's going to come out in iOS 5.

01:38:29   Uh, this smells to me like that, but in case you needed confirmation that that demo was way

01:38:36   out ahead of where the product was.

01:38:38   And that also that the team responsible for the feature was not apparently intimately involved

01:38:44   in giving a go/no-go decision, because I guess product marketing does that, or it's, it's

01:38:48   above their pay grade to decide, is this going to be in the keynote or not?

01:38:51   But can you imagine being in the audience and being on this team and going, you know, as Marco

01:38:55   would say, bleep.

01:39:01   With regard to not-invented-here syndrome: despite the company's experimentation with

01:39:05   OpenAI's models, Apple managers told their engineers in 2023 they couldn't include models

01:39:10   from outside companies in final Apple products and could only use them to benchmark against

01:39:14   its in-house models.

01:39:16   Building large Apple models meant to compete with OpenAI was the responsibility of JG's group.

01:39:21   However, they didn't perform nearly as well as OpenAI's technology, according to multiple

01:39:24   former Apple employees who used the models in 2023 and 2024.

01:39:28   In his new role overseeing Siri, Federighi has already shaken things up and, in a departure

01:39:34   from previous policy, has instructed Siri's machine learning engineers to do whatever it

01:39:38   takes to build the best AI features, even if it means using open source models from other

01:39:42   companies in its software products, as opposed to Apple's own models.

01:39:45   So this is not shocking, that Apple only wants to use its own models.

01:39:49   It totally seems like an Apple-y thing, but also, setting aside whether it's an Apple-y

01:39:52   thing, I think it's actually a fairly prudent thing because especially back, you know,

01:39:58   a couple of years ago, but even today, there's so much uncertainty about the stuff that goes

01:40:04   into models and people getting sued over the things the models were trained on or whatever.

01:40:07   It's prudent to be able to have an answer when someone says, hey, what did you train that

01:40:14   model on?

01:40:14   It's one of the reasons Adobe makes such a big deal out of training all of its models

01:40:20   that are part of Photoshop on content that Adobe owns or licenses because they have an answer.

01:40:25   If someone says, what did you train that on?

01:40:26   They'll say, look, I can show you it's this and we paid for it all.

01:40:29   And we like, we have license to it all.

01:40:30   Like it's not, we didn't like pirate a bunch of books or torrent a bunch of movies.

01:40:34   Like I think that is a prudent thing to do.

01:40:37   And of course, Apple being like, of course, we're going to use our own stuff.

01:40:39   We're not going to use their things.

01:40:40   It's where Apple will do everything ourselves.

01:40:42   So it makes sense.

01:40:44   You know, and we'll get to it in a little bit.

01:40:46   The whole deal.

01:40:46   Okay, well, you did that, but your models aren't as good as OpenAI's.

01:40:49   So now what?

01:40:50   And so in this crisis situation, Federighi is being more pragmatic and saying, all right, well,

01:40:55   this has gone on long enough that so far no one has been sued out of existence.

01:41:01   So if we use an open source model, I guess, worst case, whoever's responsible for building

01:41:06   that open source model, like, they get sued. And it's like, at this point

01:41:11   we have lost the ability to be as prudent as we want to be.

01:41:17   It's more important now to fix things.

01:41:20   And that means taking on more risk.

01:41:22   That means using open source models.

01:41:23   So be it.

01:41:25   So that again, it seems like Federighi being pragmatic, but it is not a situation Apple really

01:41:30   wants to be in.

01:41:31   Yeah, I agree.

01:41:32   I think this is, this was not the wrong call for the stage Apple

01:41:38   was in. Like, you know, you gotta figure, like, the position Apple's in in the market.

01:41:43   If anything goes wrong, they get raked over the coals.

01:41:48   Like they have so many eyes on them.

01:41:50   If there's anything that somebody could sue them for, you know, any kind of infringement,

01:41:56   they're on the hook for that. You know, they wouldn't

01:42:00   want, probably, to involve themselves with other people's models shipping in their products.

01:42:05   Like that, that is just the reality of their position in the market.

01:42:08   It would be too risky for them.

01:42:09   So I, even though like, I think the right thing to do would have been for Apple to have

01:42:16   developed their own models earlier than they did.

01:42:20   And what seems to have happened is they realized too late that this kind of product was a big

01:42:25   deal and they needed to have something in this area.

01:42:27   And so they rushed and put them together.

01:42:29   And in a rush, you might think, which maybe we'll get to later in a rush, you might think

01:42:34   maybe we should use other people's models to speed this along.

01:42:37   But I understand why, given Apple's, you know, position and, you know, the way they usually

01:42:43   do things, I understand why they didn't want to do that at that point.

01:42:46   All right.

01:42:47   And then on the 11th of April, which was Friday, the New York Times had its own article.

01:42:54   And I don't have the title in front of me.

01:42:56   What was the title?

01:42:57   The title was much broader than this, but the only part that I thought was interesting or

01:43:01   relevant was this tidbit, which is specifically about what we've just been discussing.

01:43:05   Right.

01:43:05   The title of the article, there we go, is What's Wrong With Apple?

01:43:08   Kind of a big topic.

01:43:09   And they went all over the place.

01:43:11   And I didn't think most of it was worth repeating, but this part was relevant.

01:43:14   Yeah.

01:43:15   This is Tripp Mickle, who had recently written a book about the design team after Jony Ive,

01:43:19   which I personally didn't really care for, but that's neither here nor there.

01:43:23   Anyways, reading from his article, the AI stumble was set in motion in early 2023.

01:43:27   JG, who was overseeing the effort, sought approval from the company's chief executive, Tim Cook,

01:43:31   to buy more AI chips, known as graphics processing units, or GPUs.

01:43:34   At the time, Apple's data centers had about 50,000 GPUs that were more than five years old,

01:43:38   far fewer than the hundreds of thousands of chips being bought at the time by AI leaders

01:43:42   like Microsoft, Amazon, Google, and Meta.

01:43:44   Mr. Cook approved a plan to double the team's chip budget, but Apple's finance chief,

01:43:48   Luca Maestri, reduced the increase to less than half of that.

01:43:51   Mr. Maestri encouraged the team to make the chips they had more efficient.

01:43:55   A lack of GPUs meant the team developing AI systems had to negotiate for data center computing power

01:44:00   from its providers like Google and Amazon.

01:44:02   The leading chips made by NVIDIA were in such demand that Apple used alternative chips made

01:44:06   by Google for some of its AI development.

01:44:08   I should also add that after the article was published, well, they added the following later on.

01:44:13   After this article was published, Trudy Muller, an Apple spokeswoman, said the company had fulfilled

01:44:17   Mr. Giannandrea's budget request for GPUs over time rather than all at once.

01:44:22   She said Mr. Maestri had never asked the team to make its chips more efficient.

01:44:25   So if you're in the New York Times, you actually get Apple to respond.

01:44:28   Isn't that interesting?

01:44:29   Funny how that is.

01:44:30   If you're anybody who's not the New York Times, good luck.

01:44:32   Yeah.

01:44:34   So the earlier bit about, you know, Apple wanted to make its own models, but people inside Apple

01:44:39   said that the ones JG's group was making weren't as good as OpenAI's.

01:44:42   Seems like he was trying to do the right thing, which is like, look, everyone else who's making

01:44:46   their own models, they're spending tons of money buying GPUs.

01:44:50   We have to do that as well.

01:44:52   Can we do that?

01:44:53   And Tim Cook said, yeah, but not as fast or as expensively as you want.

01:44:58   And I also imagine the CFO probably is not telling people what they should do in terms of chip

01:45:03   efficiency.

01:45:03   But it is an interesting contrast with DeepSeek, which was forbidden from getting the best of the

01:45:08   best GPUs due to like import restrictions and stuff.

01:45:12   And so they had to figure out a way to essentially do what the CFO, the former CFO, was saying here,

01:45:18   which is like, can you just be more efficient with the GPUs that you have?

01:45:21   DeepSeek said, yeah, I think we can figure that out.

01:45:23   Apple did not figure that out.

01:45:25   But this is, I don't blame JG for this at all.

01:45:28   It seems like he was trying to do the right thing.

01:45:30   It seems like the organization, though, was like not ready to commit the amount of money

01:45:37   that other companies were.

01:45:38   So OpenAI is like, this is the only reason we exist.

01:45:40   We're raising money and we burn it all on GPUs, like, we're just in the phase where

01:45:45   we're like gathering as much investment as we can.

01:45:47   And we dump all that investment into training models.

01:45:51   And if we fail, we fail.

01:45:52   But like, that's the state we're at.

01:45:53   And Apple is like, well, that's not really currently our business.

01:45:57   We have another business that works really well.

01:45:59   Have you seen our services chart?

01:46:00   We sell a lot of iPhones.

01:46:01   Then these people over here are telling us we need to spend $200 billion on, you know,

01:46:06   GPUs in the data center.

01:46:08   Can we not do that or do it more slowly?

01:46:12   And can you just like use Google TPUs?

01:46:14   I think we had a story many months ago about Apple using Google's resources, you know, instead

01:46:21   of buying your own stuff, buying your own GPUs, renting time on somebody else's.

01:46:24   And you can do that as well.

01:46:25   But that is less money efficient and it's not what most of the big players are doing.

01:46:31   Google uses its own.

01:46:32   They probably get really good rates on using TPUs because they're their own things.

01:46:36   Open AI is buying tons of stuff itself and using stuff from Azure that Microsoft is letting

01:46:42   them use.

01:46:42   So I feel like this is, this is sort of like evidence of Apple, uh, not correctly prioritizing

01:46:51   Siri, AI, ML, LLM stuff to the degree that would produce

01:47:01   a successful result.

01:47:03   Cause I'm not going to say they didn't invest enough to do it because, again, DeepSeek figured

01:47:07   out a way to do it with older hardware.

01:47:10   Apple could have done that, but didn't, uh, but yeah, this is, this is not as bad as everything

01:47:15   else that we saw, even though all the things like, oh, they, he wanted to buy stuff, but

01:47:18   Tim Cook wouldn't let him, you know, like they, the company didn't invest as heavily or as

01:47:23   quickly as in hindsight, it seems like they should have.

01:47:26   But if all that other stuff that we just went through about organizational dysfunction didn't

01:47:29   exist, they still would have done way better with the resources they were given.

01:47:32   Yeah.

01:47:34   I don't know.

01:47:34   It's to me, like it's hard.

01:47:36   Look, I've said before, I don't, I don't want to drill it in too much here, but like

01:47:39   I really think this is like when Microsoft missed mobile, like Apple missing LLMs and

01:47:44   other similar like modern AI techniques.

01:47:47   I really think this is as big and, and he will be judged harshly.

01:47:52   Tim Cook will be judged harshly in the future for like, they missed AI, like Apple missed AI

01:47:57   the same way Microsoft missed mobile.

01:47:59   And this I think is a, is a good example of maybe why that happened or, or the mindset that

01:48:05   was going on when it was happening.

01:48:06   Like Apple should have been investing in this way earlier.

01:48:10   And I think what's going to end up happening is their products will be uncompetitive in lots

01:48:18   of important ways that will grow in importance over time.

01:48:21   And they will either have significant, you know, losses to their market share in the future

01:48:28   potentially, or they will have to spend a huge amount of money buying AI companies or buying

01:48:34   a big AI company down the road to, to make up for what they're, what they're lacking

01:48:38   here.

01:48:39   And either way, that's going to be way more expensive than it would have been for them to

01:48:43   get on this earlier.

01:48:43   But it just seems like they, their heads were not in the right place over the last two years

01:48:49   to realize that this is an area they should have been investing very heavily in.

01:48:52   Instead, they seemed dismissive.

01:48:55   They seem like they were, you know, they were full of hubris and

01:49:00   they were dismissive.

01:49:00   They thought there's no use for this stuff.

01:49:03   This is all jokes.

01:49:04   This is all, you know, useless, you know, BS generating machines.

01:49:07   And I think they've missed a lot here.

01:49:09   They are still missing a lot.

01:49:11   Now they are scrambling to catch up because like Microsoft missing mobile, Apple missed AI

01:49:17   and who knows when it will be, if they ever catch up, uh, I'm guessing it's going to be

01:49:23   a while.

01:49:23   So that, you know, this Times article about, like, you know, Luca Maestri

01:49:29   denying them GPU budget.

01:49:31   If Apple knew what they were dealing with back then, they should have put all, like,

01:49:38   they should have put as much money into it as it took.

01:49:40   Certainly they had the money and instead they're, you know, they were being stingy because they

01:49:45   didn't respect this as a problem.

01:49:47   I hope they respect it now as a problem.

01:49:50   I hope.

01:49:51   Um, but they're already really far behind for reasons like this.

01:49:55   This is one of the most difficult decisions for a big company to make because there's always

01:50:00   going to be some industry fad or things that are getting a lot of startup investment.

01:50:04   And it's easy in the beginning to look at those and say, nah, we'll see how that turns out.

01:50:09   Like if it ends up to be a thing, then we'll look at it, but we can't go chasing after every

01:50:13   startup.

01:50:13   So when you see like how much money OpenAI was investing in its efforts, Apple's going

01:50:18   to look at that and say, well, of course you're investing in that, OpenAI.

01:50:20   Cause you literally, this is literally your whole company.

01:50:22   It's like, this is your shot.

01:50:24   You're taking it.

01:50:25   But like, we shouldn't be investing the same amount that OpenAI is because it's not

01:50:29   a bet the company thing for us.

01:50:30   It is.

01:50:30   It is the whole company for OpenAI.

01:50:33   So they're, they shouldn't be the model for us because look at all the other companies

01:50:36   doing things.

01:50:36   Like we can't invest at that level.

01:50:38   Anytime, anytime there's a startup or a bunch of startups in the thing, getting tons of money,

01:50:42   we can't chase that and say, well, we need to be investing the same amount as they do because

01:50:45   maybe they'll fizzle out or maybe it won't be that big a deal.

01:50:48   Or maybe we can just buy them the one that's successful or whatever.

01:50:50   So it's so hard as a big company to know which ones you should pay attention to and actually

01:50:57   which ones you should invest in, and like how, like maybe you don't do it the second they

01:51:00   do, but like, how long do you wait to decide, okay, well, OpenAI has been putting tons of

01:51:05   money in this and actually they're kind of getting results.

01:51:07   And I think that's kind of what happened with Craig Federighi, like the story of his revelation about

01:51:12   ChatGPT of like, oh, they've made a product out of it and the product is good.

01:51:15   We need to get on this train and turning the ship is not, I mean, he's not the CEO.

01:51:20   But even if he was like, maybe at that point you already have already waited a little bit

01:51:24   too long and it seems like they just have been slow to react to this and have been reacting

01:51:30   in a more prudent big company kind of way of like, oh, you need more GPUs.

01:51:35   Okay.

01:51:35   You can have more, but do you need them that fast? And maybe you can rent some instead.

01:51:39   And it's like, it's like Bill Gates with the Internet Tidal Wave memo, right?

01:51:45   When someone's sufficiently high up in an organization has that eureka moment,

01:51:50   which by the way, you know, lower level employees are constantly having eureka moments

01:51:54   about things they think the company should do.

01:51:55   And sometimes they're right and sometimes they're wrong, but we only notice when an

01:51:59   executive of a sufficient high level has that eureka moment and says, oh my God, the internet,

01:52:03   we need to turn the whole company around because guess what?

01:52:06   I've just realized that the internet is going to be a really important thing and we should

01:52:11   get on that.

01:52:12   Arguably Apple was even slower than Microsoft in that regard.

01:52:16   It just didn't matter because they were so small and in the process of dying.

01:52:18   So no one cared what they did, but Microsoft was like all hands on deck.

01:52:22   Bill Gates went away on a vacation, came back and says, internet, go.

01:52:26   And he did that for security.

01:52:28   He did that for mobile.

01:52:29   And like, you know, sometimes you're too late.

01:52:31   Sometimes you're too early.

01:52:33   Sometimes you're just on time.

01:52:34   But like there's that time before that where it's so hard to know, like, is this going to

01:52:39   be the next big thing or is this not?

01:52:41   Or what should we be doing?

01:52:42   And like, you know, with Apple, we've always talked about the car program of putting all

01:52:46   that money into the car.

01:52:47   It's like, make big bets and they're going to pay off, like, oh, well, right.

01:52:51   Or even, even the Vision Pro, like that.

01:52:53   It's a big bet.

01:52:54   It's not currently paying off.

01:52:55   It's not even entirely clear that that's going to be the next big thing, but it is kind of

01:52:58   cool.

01:52:59   Um, we think of Apple as this big giant sack of money, which they are, right?

01:53:03   And we say, well, you've got all this money, like it's doing you no good.

01:53:07   Just, you know, sitting there accumulating, like you might as well make these big bets.

01:53:11   I do wonder like in the pitch to say, okay, someone is convinced that actually, you know,

01:53:18   chat, have you seen ChatGPT, Tim?

01:53:19   Like, we should put money in this.

01:53:21   It only seems like a lot of money because we don't have Apple money, right?

01:53:25   If you're looking at the percentage of Apple money and do that math, like there were those memes

01:53:28   of like, uh, you know, a billionaire spends a hundred million dollars,

01:53:33   it's the same as you spending $5, right?

01:53:34   It's just like, should we do that?

01:53:37   Like, I know you don't, you don't get rich by spending all this money, but it's, it's

01:53:42   so like Apple was so late on this.
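To make that percentage-of-the-money math concrete, here's a quick back-of-the-envelope sketch in Python; the specific dollar figures are illustrative assumptions, not numbers from the show, and the exact equivalence depends entirely on the net worths you plug in.

```python
# Back-of-the-envelope version of the proportional-spending analogy above.
# All numbers are illustrative assumptions, not figures from the episode.
billionaire_net_worth = 200e9   # a hypothetical $200B fortune
big_purchase          = 100e6   # the $100M spend in the analogy
your_net_worth        = 100e3   # a hypothetical $100k household

fraction   = big_purchase / billionaire_net_worth   # 0.0005, i.e. 0.05%
equivalent = fraction * your_net_worth              # $50.00

print(f"{fraction:.4%} of net worth, like you spending ${equivalent:.2f}")
```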

01:53:44   And from this reporting, even when they had decided that was a thing they wanted, they

01:53:48   were still kind of acting like they needed to pinch pennies.

01:53:51   And it's like, you spent like billions of dollars on a car that never shipped.

01:53:56   You're going to slow roll the GPU rollout.

01:53:59   Come on.

01:53:59   Like just, uh, it is frustrating, but I think it is hard even, even when you have so much

01:54:06   money, maybe especially when you have so much money to, uh, get out of the mindset of like

01:54:11   the dragon on the hoard of like, yeah, I have all this money because I don't spend it on

01:54:15   every single thing that people want me to spend it on.

01:54:16   But anyway, um, unlike mobile, I still continue to think that, um, so far, this is not guaranteed,

01:54:23   but so far, it doesn't seem like a thing that threatens Apple's platform because in theory,

01:54:27   if they get off their butts and do something more sane, they could be quote unquote, the

01:54:32   best platform for hosting voice assistants.

01:54:34   They're not currently that, but technically they could be.

01:54:36   And ChatGPT does not seem to be threatening them as a platform, or any other company threatening

01:54:41   them as a platform yet, because they don't have a platform or a product other than like

01:54:45   a webpage that you can talk to and type things into, but that's not guaranteed to be true

01:54:49   forever.

01:54:49   And so I really hope, uh, the, uh, dragon slumbering on its hoard of money has now woken up and we'll

01:54:55   see some better results in the coming years.

01:54:57   No, honestly, like I think ChatGPT has amazing products and, and I think it's way beyond

01:55:03   a webpage like at this point, but they don't, but they don't replace the iPhone.

01:55:05   They don't replace the iPhone, but I think they have the ability to replace a lot of

01:55:11   iPhone apps, uh, and a lot of iPhone features.

01:55:13   Yeah.

01:55:13   Like for instance, like take a, take 30% from all of them.

01:55:16   Oh yeah.

01:55:17   But like, you know, like in my car, like whenever I'm driving and I

01:55:21   need some knowledge, I have a question about something, I will say, you know,

01:55:26   Hey, Hey thing, ask ChatGPT.

01:55:30   And then my question, I say that every single time now, because it's so much better than Siri

01:55:35   that now at least Siri has the hook where it can just shell stuff out to ChatGPT, you

01:55:39   know, not always well, not always at the right time.

01:55:41   By the way, the reports have shown that asking ChatGPT directly has way better results than

01:55:48   asking Siri to ask ChatGPT, which doesn't make any sense, but that's the only thing we've

01:55:52   seen because it's a, you know, man bites dog type of story.

01:55:55   We don't see the other ones, but like, it boggles my mind

01:55:57   that that should ever be the case.

01:55:58   Like, shouldn't it be the same, but somehow it isn't.

01:56:00   Right.

01:56:01   But regardless, like, you know, like ChatGPT via CarPlay, via Siri, is way better than

01:56:07   Siri ever was.

01:56:08   Um, you know, when, if you're actually on the device and able to use it directly,

01:56:13   you know, mapping ChatGPT to like the Action Button is a pretty common thing on iPhones.

01:56:18   On the Mac,

01:56:19   of course, you can just have a tab open with the webpage, or they have a Mac app that you can have a

01:56:23   shortcut key to, and it can integrate with other stuff.

01:56:25   Like I think of all the AI companies, you know, the different, whatever model is on top changes

01:56:31   week to week almost.

01:56:32   Uh, but ChatGPT or OpenAI consistently has the best products in front of their model.

01:56:38   Usually like they, they really do well with like, how can we take this technology, this

01:56:44   amazing, you know, model that we have and actually make useful user facing features and

01:56:48   compelling user facing features.

01:56:49   Usually ChatGPT is the, is the leader in that area.

01:56:52   And I, I really do not think it's out of the realm of possibility.

01:56:58   Maybe down the road, Apple buys OpenAI for just a ridiculous, massive amount of money.

01:57:04   That's after OpenAI buys Jony Ive's company, right?

01:57:07   Cause together, that's another story maybe we'll have in a future episode, but, uh, that

01:57:11   seems like it's coming down the pike too.

01:57:13   Oh God, I sure, I sure hope that isn't how we get Jony Ive back into Apple.

01:57:16   No, I think he'll just take the money, he'll take the money and be gone by then.

01:57:20   But, uh, you know, cause they're, they're working on whatever, like presumably something

01:57:24   that's like the AI Pin, but not crappy.

01:57:25   Uh, but yeah, OpenAI buys his company.

01:57:27   He becomes even more fabulously wealthy, just disappears into the wilderness, and then Apple buys

01:57:32   OpenAI.

01:57:32   I mean, I don't know.

01:57:33   OpenAI at this point, Apple shouldn't buy it at current valuations, but we'll see.

01:57:39   We'll see how that goes.

01:57:39   But Apple is still trying to do it on their own.

01:57:42   And honestly, I think, uh, it is plausible.

01:57:45   They still can do it on their own because, as you noted, like, OpenAI has good products,

01:57:49   but the products they have, I was going to say, Apple should have no problem doing products

01:57:53   just as good, but you know, these days, who knows?

01:57:56   But anyway, it's not out of sight.

01:57:57   Apple's pretty good at making software products too.

01:57:59   So maybe they could do that.

01:58:02   And the question is, all right, well, what about the model part?

01:58:04   Well, I, it's not entirely clear that, that OpenAI's lead over the rest of the industry

01:58:09   is insurmountable, but it's still, in the grand scheme of things, early days.

01:58:12   So we'll see how this turns out.

01:58:14   Well, and, and I think it's less, I mean, I think the lead is, is more, uh, insurmountable

01:58:20   than I think a lot of commentators give it credit for because yes, other people can indeed

01:58:25   go, you know, train a model and, you know, give somebody enough money.

01:58:29   They can train a model from scratch and they can do okay.

01:58:31   So far that has happened a few times, but.

01:58:34   We are already very rapidly moving past just the model itself being really good into lots

01:58:41   of different ways to use the models, to hook them up in different ways, to engineer their

01:58:46   prompts in different ways, to do all sorts of things with, you know, what, what you're

01:58:49   feeding them and how you're feeding it to them.

01:58:50   And then what your steps are that you're taking to call them multiple times.

01:58:53   Like there's all these different techniques, like it's getting more and more complicated

01:58:56   very, very rapidly.
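As a concrete illustration of that call-them-multiple-times point, here is a minimal sketch in Python of one common pattern, a draft/critique/revise pipeline. call_model() is a hypothetical stand-in for whatever completion API you'd actually use, not a real vendor call, and the prompts are made up for the example.

```python
# A minimal sketch of the "call the model multiple times" idea:
# draft -> critique -> revise. call_model() is a hypothetical placeholder.
def call_model(prompt: str) -> str:
    # Stub so the sketch runs; a real version would hit an LLM API.
    return f"[model output for: {prompt.splitlines()[0][:50]}]"

def draft_critique_revise(question: str) -> str:
    draft = call_model(f"Answer concisely: {question}")
    critique = call_model(f"List factual or logical problems in:\n{draft}")
    return call_model(
        "Rewrite the answer, fixing the listed problems.\n"
        f"Answer: {draft}\nProblems: {critique}"
    )

print(draft_critique_revise("When does my mom's flight land?"))
```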

01:58:58   This is a, you know, rapid development happening, huge advancement happening.

01:59:02   And OpenAI is pretty, it's usually at the head of the pack, and, you know,

01:59:09   before too long, their lead is going to be pretty large in ways that are not easy to replicate.

01:59:15   So things like mind share among people, what, you know, habits, apps, you know, actual like,

01:59:21   you know, integrations they have, like they're growing, growing, growing.

01:59:25   I hope Apple can do a good job themselves, but I, I fear that for all the same reasons that

01:59:33   Siri has sucked all this time, it seems like they don't have the culture or the talent to

01:59:42   properly enable something like this to grow in their company.

01:59:46   For, in many, many ways, AI companies and the development of AI-type stuff seem at odds with

01:59:54   Apple's, with Apple's culture.

01:59:55   And Apple as a company doesn't seem to give it the respect that I think it deserves.

02:00:00   And so I don't think Apple will ever do a really great job with AI.

02:00:04   What I think is most likely here is similar to, you know, things like, you know, Apple Music.

02:00:11   Apple Music is not an amazing music service.

02:00:14   I was just about to bring that up to say that I no longer had confidence that Apple can make

02:00:18   a digital jukebox style application that's, that's competent.

02:00:21   Remember when they used to be good at that?

02:00:23   Right, like Apple, like Apple Music is fine.

02:00:27   It's not great.

02:00:28   I would not say it's fine.

02:00:30   It's sub fine.

02:00:30   Yeah, fine.

02:00:31   We'll call it that, whatever it is.

02:00:33   But like, Apple Music is not great.

02:00:35   It's one of the biggest music services.

02:00:37   Why?

02:00:37   Oh, you're talking about the service.

02:00:39   I was talking about the app.

02:00:40   Okay, sure.

02:00:40   Oh, yeah.

02:00:40   Yeah, the, the, the, the services, again, it's fine.

02:00:43   It's not great, but it's fine.

02:00:45   But Apple gets by with that because of their lock-in.

02:00:49   They have so much lock-in infrastructure now that they can get by with a series of mediocre

02:00:56   offerings and the lock-in keeps us all using them and, you know, gives them the user base

02:01:00   they need.

02:01:01   Their 30% makes it hard for other people to compete.

02:01:03   Their technical boundaries make it hard for other apps to even integrate in the same ways.

02:01:07   So, like, they don't really need a great AI story.

02:01:12   They just need a functioning AI story.

02:01:14   Just like, you know, Apple Music is not a great music service.

02:01:17   It functions.

02:01:17   It's a functioning music service.

02:01:19   And that lets them compete really effectively against others in the area, not because they

02:01:23   have the best offering, but because they have unfair advantages they give themselves.

02:01:27   That's how this is going to work, too.

02:01:29   I don't see any outcome here where Apple has a great AI company.

02:01:33   Like, I just don't see it.

02:01:34   I hope that through various forces, internal and external, I hope they are forced to let

02:01:41   companies that are good at it, like OpenAI, put their models into different places in iOS.

02:01:48   Like, you know, replace Siri or integrate better, like, the way the current one's done, like,

02:01:52   maybe more extended.

02:01:53   Like, I hope that becomes the direction they go and Apple becomes more of the platform company

02:01:58   that it's actually, you know.

02:01:59   Apple as a platform is a very, generally a very good situation.

02:02:04   Like, when Apple makes really good OSes, they make really good developer frameworks, like,

02:02:09   they make really good APIs.

02:02:11   Like, they're really good at that stuff.

02:02:12   It's when they start getting into all the services stuff and all the, you know, gatekeeping and

02:02:17   rent-seeking and anti-competitive behavior.

02:02:19   That's when we see the worst of Apple.

02:02:21   I don't think AI is going to do well for them.

02:02:25   Like, I think it's going to be a second-rate feature, just like Siri has been.

02:02:30   Like, it's been okay at best and usually pretty bad.

02:02:34   But what do you see in Apple's culture that's any different now compared to, you know, the last 10 years

02:02:41   that will enable this to become a great AI company?

02:02:44   I see nothing different.

02:02:45   I see it being the exact same, you know, handful of people running the company, the exact same

02:02:51   handful of people in all these leadership roles who don't really understand AI that well, who

02:02:57   don't tend to do very well at big data problems and machine learning types of problems or big

02:03:02   services.

02:03:03   Like, that's, these are not Apple's strengths.

02:03:05   They do an okay job sometimes.

02:03:07   But then, you know, Google and other companies usually totally outclass them in a lot of these

02:03:11   areas.

02:03:12   That's probably what's going to happen here.

02:03:14   But at least it'll be better than having no AI features and no AI models or the first

02:03:21   round of Apple Intelligence stuff so far, which has been like, you know, barely there.

02:03:24   But I wish that they would do something a little bit bigger and actually try to make something

02:03:31   really great themselves.

02:03:32   Like, if you're going to, if you're going to become, if you're going to own the whole platform

02:03:36   and lock out everyone else in lots of different ways, you got to do a really good job yourself

02:03:41   in every possible area.

02:03:43   That's not a great strategy.

02:03:44   But if you choose to do that strategy, please do a better job.

02:03:46   And so I hope they like, you know, really, I hope I'm totally wrong about this.

02:03:51   I hope like in, you know, in two years we do an episode and we, and somebody points out

02:03:54   how down I was about their AI prospects.

02:03:56   And it turns out they, they turn out awesome.

02:03:58   That'd be amazing.

02:03:59   I would love that.

02:04:00   I don't see any evidence that they've changed

02:04:06   anything that it would take for them to change to get from here to there.

02:04:09   It seems like moving AI stuff under Federighi is good.

02:04:14   Like that's better than where the, wherever the heck it was before.

02:04:17   Like it's better than this situation, but they're not really making like the really big,

02:04:21   like cultural changes or leadership changes at larger levels to really, to really make them

02:04:27   a different kind of company.

02:04:29   You know, what we get with Tim Cook being a diplomat, as we were saying, as I was saying

02:04:33   earlier in the show, like he's a diplomat and that is what the company needs a lot of the

02:04:37   time, but he's really not a product person at all.

02:04:41   And people keep apologizing for that.

02:04:45   And people keep saying like, oh, well, you know, he doesn't need to be a product person.

02:04:48   He delegates that, but I don't think he delegates that very well because he fundamentally doesn't

02:04:54   understand or respect it enough.

02:04:57   We're seeing issues like, like missing AI.

02:05:00   We're seeing things like, don't bring me conflicts.

02:05:02   Like we're, we see things like that, that are just like, I don't know if Apple is too big

02:05:08   to have a product person at the head now.

02:05:09   It might be, but I wish it, I wish it would have a product person again.

02:05:14   I really, I miss having Apple have that kind of spirit in it.

02:05:19   I'm not saying it had to be Steve Jobs forever, but we lost a lot when we lost Steve.

02:05:25   And one of the things we lost is a really good product person.

02:05:28   Who is the head of product at Apple right now?

02:05:31   I don't know.

02:05:32   I have no clue.

02:05:35   And it's like the Craig Federighi situation, where it's, it is good to have an executive

02:05:40   who doesn't have to, you know, know how every technical thing works, but comes

02:05:47   from a technical background and incorporates that into his decision-making because he considers

02:05:52   it an area where he can have insight and something to add.

02:05:57   And Tim Cook, again, as I always said, to his credit, realizes that he doesn't really

02:06:01   have anything to add on the product front because it's just who he is.

02:06:05   Right.

02:06:05   Ideally, if you build your own CEO construction kit, right, you'd pick one that has product

02:06:10   knowledge.

02:06:10   And, you know, if you're building your VP, you pick one with technical knowledge, but you

02:06:14   know, you can't have everything.

02:06:15   Like you got Tim Cook's amazing skills in certain areas that led him to become the CEO, but he has

02:06:21   deficits as well.

02:06:22   And this is one of them.

02:06:23   And so I, you know, every time I see him, and this is a story for maybe next week, every

02:06:28   time I see him trying to introduce product opinions, I think that he should not.

02:06:31   Oh yeah.

02:06:32   So he is who he is.

02:06:34   But yes, it would be better if we had a CEO who had all of Tim Cook's good qualities, but

02:06:40   also a bunch of good qualities that he doesn't have.

02:06:42   And, you know, again, he does the only thing he can do, or the only

02:06:47   thing I think he should do, which is delegate that. Not because the CEO should always delegate

02:06:51   that, but because Tim Cook personally has no choice but to delegate it, because it is

02:06:55   not one of his strengths.

02:06:57   And yeah, there's an increasing number of things that are

02:07:01   wrong with Apple that you look at and you think, this is never going to change

02:07:06   until a bunch more people retire.

02:07:07   Yeah.

02:07:08   Which will happen.

02:07:09   Time marches on.

02:07:10   There will be new leadership, but there are so many things that

02:07:14   you just don't see possibly happening with the current leadership, because they just don't

02:07:18   believe in it, and you wouldn't want them to do something they don't believe in, because

02:07:21   the heart's not in it.

02:07:22   But sometimes that means they've kind of outlived their usefulness to the company and

02:07:25   you need somebody who has a different opinion.

02:07:26   And maybe we could have a whole show on that at some point of just like all the things that

02:07:30   require new leadership for Apple to change and what they should be.

02:07:33   And they're so big and so counter to everything

02:07:37   every person on the Apple leadership page believes in that you just know they're not going to

02:07:40   happen until those faces change.

02:07:41   All right.

02:07:42   Thank you to our sponsors this episode, BetterHelp, Mack Weldon, and HelloFresh.

02:07:47   And thanks to our members who support us directly.

02:07:49   You can join us at atp.fm slash join.

02:07:52   One of the many perks of membership is ATP Overtime, our weekly bonus topic.

02:07:57   This week on Overtime, we're talking about Sony's new 2025 TVs.

02:08:03   This is a big new lineup and you know how much all of us, especially me and Casey, love

02:08:08   talking about TVs.

02:08:09   So it's going to be a very me and Casey heavy Overtime this week.

02:08:14   Well, one of us wants to get a new TV and it's not me and it's not Casey.

02:08:17   You can join us at atp.fm slash join to hear this and every other Overtime plus our member

02:08:23   specials and get a discount on our current merch and all sorts of other fun benefits.

02:08:27   Atp.fm slash join.

02:08:28   Thanks, everybody.

02:08:29   Talk to you next week.

02:08:33   Now the show is over.

02:08:36   They didn't even mean to begin.

02:08:38   'Cause it was accidental.

02:08:40   Oh, it was accidental.

02:08:43   John didn't do any research.

02:08:46   Marco and Casey wouldn't let him.

02:08:49   'Cause it was accidental.

02:08:51   Oh, it was accidental.

02:08:53   And you can find the show notes at atp.fm.

02:09:00   And if you're into Mastodon, you can follow them at C-A-S-E-Y-L-I-S-S.

02:09:09   So that's Casey Liss.

02:09:10   M-A-R-C-O-A-R-M-E-N-T.

02:09:14   Marco Arment.

02:09:16   S-I-R-A-C-U-S-A.

02:09:18   Syracusa.

02:09:20   It's accidental.

02:09:22   Accidental.

02:09:23   They didn't mean to.

02:09:26   Accidental.

02:09:27   Accidental.

02:09:28   Tech podcast.

02:09:31   So long.

02:09:34   So speaking of all this AI stuff, and getting to what Marco was talking about in terms of mindshare among people who are not listening to tech podcasts.

02:09:44   What do they come to think of?

02:09:46   What becomes the Kleenex of voice assistants at this point among, you know, the non-tech enthusiasts?

02:09:53   Siri does have mindshare as the one that's bad.

02:09:58   Yeah.

02:09:59   It's the punching bag.

02:10:00   Yeah.

02:10:02   And so that, you know, it's not like these things aren't penetrating to the mass market, but they're penetrating in a bad way when it comes to Siri.

02:10:09   But anyway, I was reminded of this when I was watching one of the many channels I follow. I watch car rebuilding channels, and I watch channels about people who are either rebuilding boats or live on boats or are sailing on boats, which is very strange for someone who gets massively seasick and will never be on a boat.

02:10:23   But anyway, that's what I watch on YouTube.

02:10:25   And I'm watching this one channel that I've been watching for years and years, and it used to be like, hey, we live on a boat.

02:10:31   How does that work?

02:10:32   And I was like, oh, that's cool.

02:10:33   How does that work?

02:10:34   Anyway, that was years ago.

02:10:35   Now they're at the stage where they're upgrading their boat.

02:10:40   They're building a new boat.

02:10:41   Their old boat, they just docked it somewhere, and they're constructing a new boat from scratch, which is a very fascinating thing.

02:10:48   Doesn't involve a lot of being on the water.

02:10:49   It involves a lot of being in a workshop, but that's what they're doing.

02:10:51   And I'm watching mostly to see if they all die because they don't know what they're doing.

02:10:55   They're building their boat.

02:10:56   There are videos that I watch and I just shake my head and I go, you're all going to die.

02:11:00   It's like the scene in Jaws that you two probably can't recall, because you probably haven't seen the movie or can't remember it.

02:11:05   Anyway, I've never seen it.

02:11:07   Yeah.

02:11:07   Anyway, setting aside whether they all die on their boat that's been shoddily constructed.

02:11:11   In a recent episode, they were faced with a problem where they were, I don't know, putting a tube inside another tube and filling the gap between them with epoxy, to make a tube for the prop shaft or whatever.

02:11:22   Anyway, the YouTuber, the person running the channel, made it a point to have an entire episode saying, here's how I decided to solve this problem.

02:11:34   And, you know, the tools I used or whatever, and they've made videos like this for years.

02:11:38   But this time, AI was in the mix.

02:11:42   And so they used both ChatGPT and DeepSeek, which kind of surprised me, that DeepSeek had enough mindshare to even come up as an option.

02:11:51   I think maybe they made this video when it was like all in the news because there's a big delay when the videos come out, right?

02:11:55   It's not happening in real time.

02:11:56   So DeepSeek was in there as well, but also ChatGPT.

02:12:00   And this is a person like fabricating real things.

02:12:04   And they did like a screencast sort of like a conversation with ChatGPT and asking like, if I have a tube this diameter and another tube with this diameter inside it and it's this long or whatever, what's the area of the thing?

02:12:16   Like it's math that a middle schooler could do, right?

02:12:18   It's not complicated math.

02:12:20   But their solution to doing that math was to ask ChatGPT.
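
For reference, the math in question is simple geometry: the volume of the gap between two concentric tubes is the annulus area times the length. A worked sketch, with made-up numbers since the video's actual dimensions aren't given here:

V = \pi (R^2 - r^2) L

With, say, an outer tube of inner radius R = 30 mm, an inner tube of outer radius r = 25 mm, and length L = 500 mm (all hypothetical), that's V = \pi (900 - 625) \cdot 500 \approx 432{,}000 \text{ mm}^3, or roughly 0.43 liters of epoxy.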

02:12:23   And anybody who has used LLM-powered things to do math knows, in general, it is not their strength.

02:12:28   They are large language models, not large math models.

02:12:31   As Marco pointed out recently, they're not as simple as that anymore.

02:12:34   There's a million different things being orchestrated.

02:12:37   And most of these things have some kind of math engine behind the scenes that they will feed the math stuff to.
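
To make the orchestration point concrete, here is a minimal sketch, in Python, of the "feed the math stuff to a math engine" idea. This is an assumption about the general pattern, not how ChatGPT or DeepSeek actually implement it; the routing rule and function names are invented for illustration:

import ast
import operator
import re

# Operators the deterministic evaluator is willing to handle.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def safe_eval(expr):
    """Evaluate plain arithmetic by walking the parsed syntax tree,
    allowing only numeric constants and the operators above."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("not plain arithmetic")
    return walk(ast.parse(expr, mode="eval"))

def answer(question):
    # Route bare arithmetic to the math engine; anything else would go
    # to the language model (stubbed out here).
    if re.fullmatch(r"[0-9.\s+\-*/()]+", question.strip()):
        return safe_eval(question.strip())
    return "(handed off to the LLM)"

# The tube volume with the hypothetical numbers from above:
print(answer("3.14159 * (30**2 - 25**2) * 500"))  # 431968.625

The point of the split is that the arithmetic branch is deterministic and checkable, while the model branch is not, which is exactly the distinction the boat videos gloss over.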

02:12:42   But still, they were asking it to do things and then going, okay, and writing it down.

02:12:49   And similarly, there was another episode a little bit later, which is like, I have something that I have an electric motor and it gets hot and I need something to cool it.

02:12:56   And I want to build it out of aluminum.

02:13:00   How big of a, like, essentially heat sink do I need to dissipate this amount of heat in this amount of time?

02:13:04   And they're having this conversation with ChatGPT and DeepSeek saying, okay, here's the thing.

02:13:09   Tell me what this is.

02:13:10   And they would say, okay, well, I made these assumptions about the temperature of the water and the temperature of the thing coming from the engine.

02:13:17   And here's what the area you need, blah, blah, blah.
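
For what it's worth, the back-of-the-envelope version of that heat sink question is a steady-state convection estimate; every number below is invented purely for illustration. The surface area A needed to dissipate power Q across a temperature difference \Delta T with convection coefficient h is:

Q = h A \,\Delta T \quad\Rightarrow\quad A = \frac{Q}{h \,\Delta T}

So dissipating, say, Q = 500 W into water at \Delta T = 10 K with h = 500 W/(m^2 \cdot K) would need A = 500 / (500 \cdot 10) = 0.1 m^2 of wetted surface. The catch is that h itself is an assumption that can swing by a factor of several depending on flow conditions, which is exactly how two chatbots quietly making different assumptions can land a factor of three or four apart.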

02:13:18   Not surprisingly, the two different LLM products came up with wildly different answers off by like a factor of three or four.

02:13:25   The thing that bothered me about this whole thing is I'm watching this waiting for the point where they say, but of course, this is all just like a fun exercise.

02:13:32   And I'm not going to actually build something based on what ChatGPT told me to do, especially since, as I've just proven by asking two different LLMs and getting wildly different answers,

02:13:39   I can't just go by what they say, because I don't know which one to pick.

02:13:43   And the fact that they're so different makes me think that I don't know enough of what I'm doing to make use of these tools.

02:13:48   And yet I saw him starting to cut aluminum.

02:13:50   I'm like, well, wait, did we skip a step here where you talk to someone who knows what they're doing?

02:13:55   Like, don't.

02:13:56   Here's my thing.

02:13:57   I worry that people think ChatGPT is the same as Googling something, which is not helped by the fact that the above-the-fold thing on Google search results is now, and has been for years,

02:14:10   the thing where Google tries to get the answer for you.

02:14:11   But with the advent of LLMs, it is much more prominent and much more likely to, you know, look like it's giving you the answer.

02:14:18   There was an article I read ages ago.

02:14:21   I wish I had the link.

02:14:21   It was like, don't try to use ChatGPT to win an argument. Like if someone says, I think Tom Cruise is, you know, 60 years old.

02:14:30   I think he's 50 years old.

02:14:31   And we'll settle this by asking ChatGPT.

02:14:33   That doesn't settle anything.

02:14:36   All you did was put some words into a machine

02:14:40   that new words came out of.

02:14:41   But you're exactly where you started off, which is like, well, it's plausible.

02:14:45   Is it right?

02:14:46   Maybe we should look up somewhere where we think has a slightly higher chance of knowing Tom Cruise's age.

02:14:51   Wikipedia, IMDb, Callsheet, like, are they necessarily right?

02:14:56   No, but you kind of know, in general, how frequently does Wikipedia know the age of famous people?

02:15:02   You have no idea, in general, how frequently does ChatGPT know the age of famous people because it'll say all sorts of things.

02:15:08   And so if you're trying to figure out the area between two cylinders or how big a heat sink you need for the cooler for your boat

02:15:15   or what Tom Cruise's age is, if you've asked ChatGPT and said, well, there you go, I figured it out.

02:15:20   It feels so much like when we all first got smartphones and someone would ask them a question and we just Google it and say, look, I got the answer.

02:15:26   I just Googled it.

02:15:27   But that, even though the actions are the same, I took out my phone from my pocket, I typed in a thing, I hit a thing and I read a result.

02:15:34   It's not the same.

02:15:35   It's, and is it because, oh, people need to know how LLMs work?

02:15:39   They need to know, like, the intricate details of LLMs?

02:15:41   I just feel like there's a disconnect between what LLMs are actually good for and what people are using them for.

02:15:48   And that disconnect may be dangerous, kind of in the same way that there's a disconnect between what driver assistance functions are actually good for and what people use them for, and why they end up dying by going under a tractor trailer because they're not using the tools, quote, correctly.

02:16:01   But the tools are essentially designed to tempt them into not using them correctly, and are sometimes advertised in ways they're not meant to be used.

02:16:07   I feel like we're at that stage with LLMs, and I really don't want this person to build anything on their boat based on something ChatGPT said, because ChatGPT doesn't know what a boat is or how to build anything, please.

02:16:17   And I'm not saying that the first non-LLM, non-AI search result on Google is the right answer all the time, either.

02:16:23   I'm saying maybe you have to do more research than Googling to figure out how big the heat sink needs to be on your boat.

02:16:28   That's all I'm saying.

02:16:31   Noted.

02:16:32   Have you seen this in real life?

02:16:33   People saying, oh, I just looked it up in ChatGPT.

02:16:35   My kids do it and I sternly say no.

02:16:38   No.

02:16:39   That's not surprising.

02:16:40   Bad child.

02:16:40   No, you have not looked it up.

02:16:42   Is that right?

02:16:43   I don't know.

02:16:44   Do you spray them with a water bottle too?

02:16:46   It's like studying for a test.

02:16:48   Like, oh, you know, everyone uses, like, the NotebookLM-type things where you tuck your notes into it and it makes a study guide.

02:16:53   Like, that's actually pretty cool.

02:16:54   But it's like they show me things like that and they say, well, what does it say there?

02:16:59   I'm like, well, but I don't know if this is right.

02:17:01   This is just what ChatGPT said.

02:17:03   I'm like, oh, it's fine.

02:17:03   I'll probably be okay for the test.

02:17:04   And they're probably mostly right.

02:17:06   But again, don't fabricate things on your boat.

02:17:09   I would say don't put that answer on a test unless you actually know it's right.

02:17:12   Like maybe study from the actual study guide materials instead of running them through the LLM and hoping it doesn't mangle them.

02:17:17   But I find it extremely bothersome.

02:17:19   And no, I know you think, oh, you're a crotchety old man.

02:17:21   Eventually LLMs will get it right.

02:17:22   Maybe they will, but they're not getting it right today.

02:17:25   So right now my crotchetiness is correct.

02:17:27   But I mean, like, is that that different from doing a random web search and landing on some random web page?

02:17:35   Yeah, it's that different because the random web page, I think people have better instincts for its sketchiness.

02:17:39   Not as good.

02:17:40   Like, oh, you think people know which web pages are good?

02:17:43   Well, that's kind of the job we gave Google, like what Google is supposed to do, like how many people link to it and blah, blah, blah.

02:17:47   I know there's SEO and so on and so forth.

02:17:49   But reputation wise, I think people have heard of Wikipedia and understand its reputation.

02:17:54   Like, that's why we have institutions like, you know, as crappy as we may complain about them, the New York Times, Wikipedia, IMDb, whatever.

02:18:01   Like, it's the reason why, if we landed on some person's GeoCities page that says Tom Cruise is 50 years old, we trust that less than landing on Wikipedia.

02:18:10   And I know people aren't good at making that determination, but it's something to hang your hat on.

02:18:14   People understand institutions.

02:18:15   And the problem is, to get back to your earlier point, Marco, is that I think people are starting to think of ChatGPT as the institution that has a reputation in the same way that Wikipedia or the New York Times does.

02:18:25   And that is unfounded, because ChatGPT is many things, but, like, ChatGPT, you know, OpenAI, the company or the product, they don't have any information of their own.

02:18:35   They are just gathering information from elsewhere, compressing it, grinding it up and spitting it out to you in some form that looks plausible.

02:18:41   But it is not the same thing as what Wikipedia or the New York Times are doing with their information.

02:18:45   And so even though you may trust the brand ChatGPT and it has never steered you wrong, eventually you're going to be getting glue on your pizza and not knowing it.

02:18:53   That was Google.

02:18:56   Sorry.

02:18:56   Sorry, OpenAI.

02:18:57   I know you didn't do that one.

02:18:58   But you do say some very bad things and you do make people with three hands.

02:19:01   So there's that.

02:19:02   Beep, beep, beep.