PodSearch

Connected

537: Do You Know Apple?

 

00:00:00   Hello, Federico, how are you?

00:00:30   But yes, everything's cool. He just had a conflict. There's no, it's not baby time yet. As far as I know, it's not, it's not, as far as we know, it's not. I talked to him like 10 minutes ago. He didn't say anything. So, uh, he just had a conflict today. You would think, you would think he would tell you. I would hope so. Yeah. Yeah. He's like, oh, I've had a four-year-old this whole time. A couple of years ago, a couple of years ago, one of my favorite, uh, video game podcasts, uh, Triple Click, uh, Jason Schreier, uh,

00:00:59   co-host of Triple Click and also a reporter at Bloomberg, um, surprised, uh, his two co-hosts with his, uh, second, uh, baby that was actually born months before and he didn't tell them. And it was an incredible moment on the show. Uh, that's awesome. Yeah. That was, that was wild. Uh, but yeah, hopefully Mike will not try to copy that approach.

00:01:22   Yeah. It's like the time you told us that you had been using a PC for like, well, I wouldn't say that was as dramatic as revealing a whole baby, but you know, yeah.

00:01:37   Okay. They're slightly different. Yeah. A little bit, a little bit different. Uh, we got some followup.

00:01:42   That's what we do here at the top of the show. Um, and many people have written in about their own iCloud woes. This was, uh, the bulk of the episode last week, talking about my, uh, my whole iCloud situation with my legacy version of myself.

00:02:04   And the new iCloud family. And I couldn't do the additional space. Um, so it seems like a lot of people are in that situation. People are adding their legacy accounts to their iCloud families.

00:02:16   I will say, you know, a week or a week and a half in, however long it's been, uh, it seems like the dust has all settled except on my Mac and the Mac App Store. I can't update any apps. So.

00:02:30   Okay.

00:02:31   I got to deal with that.

00:02:32   Sure. Why not? I mean.

00:02:34   You know, it asked me to log into the purchasing account, which I don't want to do.

00:02:39   Have you thought that maybe the Mac App Store is so bad there's nothing to update?

00:02:43   Well, currently I've got four.

00:02:45   Oh, you do see the updates. Okay.

00:02:47   To be fair, two of them are Safari plugins.

00:02:51   Okay.

00:02:52   And one is an Xcode plugin. So it's, it's, uh, it's a ghost town in there.

00:02:58   Look at, look at you being a developer and everything.

00:03:00   I know, right?

00:03:01   The Xcode plugins. I didn't even know you could get Xcode plugins.

00:03:04   Well, it's, it's, uh, it's not really a plugin. It's an extension for the simulator.

00:03:07   Oh, it's still.

00:03:08   It lets you do fancy stuff.

00:03:10   Okay.

00:03:11   I'm a developer now.

00:03:12   I'm the developer now.

00:03:14   Yes.

00:03:14   Um, yeah.

00:03:17   So, I mean, uh, lots of people are dealing with this.

00:03:20   Hopefully it was helpful if you find yourself in this situation.

00:03:22   Uh, a couple of things that I didn't mention that some of them I didn't realize at the time

00:03:27   that we recorded.

00:03:29   Uh, a couple of them I just didn't get to because of time.

00:03:32   Uh, one is that I lost all my TestFlights because.

00:03:34   Oh no.

00:03:35   I was signed into TestFlight.

00:03:38   With that old Apple ID.

00:03:40   Yeah.

00:03:41   Because TestFlight follows your purchases ID, apparently.

00:03:44   Um, I mean, thankfully, like I'm friends with most of the people whose TestFlights I run.

00:03:50   Uh, but I'm, I'm choosing, I'm choosing to view this as a clean start.

00:03:57   I believe you will call it a blessing in disguise, right?

00:04:01   Yeah, that's right.

00:04:02   I like you, I'm sure, uh, I had a bunch of stuff in there that I, you know.

00:04:09   Oh yeah.

00:04:10   Expired apps.

00:04:11   Uh, I mean, going way back to like years and years ago.

00:04:15   I kind of, I kind of want to look now.

00:04:17   Let's see.

00:04:17   Yeah.

00:04:18   See if you can find the oldest thing in your TestFlight.

00:04:20   Um, and the two most important ones for me, uh, I have the login to App Store Connect.

00:04:26   So like, oh, I can just add myself back to Widgetsmith, you know, and Underscore's other apps.

00:04:30   But, well, uh, this is actually quite perfect because, uh, so, uh, the oldest is TV Forecast,

00:04:40   which is odd because I still use it.

00:04:42   Excellent app.

00:04:43   Um, Parcel, uh, the package tracking app.

00:04:47   But I just think the, uh, these are like, uh, build expired and build removed because the

00:04:52   developers have new betas.

00:04:54   Like, um, these are like older versions.

00:04:56   But the first app that said tester removed is, uh, DEVONthink To Go version 3.

00:05:03   They removed me from, from DEVONthink.

00:05:07   That's a bummer.

00:05:08   Yeah.

00:05:09   Once you, once you enter DEVONthink, you never leave unless you're you, I guess.

00:05:13   Some real gems in here though.

00:05:15   Some real stuff.

00:05:16   Yeah.

00:05:17   Yeah.

00:05:17   I'm sure.

00:05:18   Um, so anyways, uh, if I was on your TestFlight and you'd like me back, please let me know.

00:05:23   Um, I also was rebuilding my music library.

00:05:28   Some people had some suggestions on maybe what was causing this or what the deal was.

00:05:33   Um, our friend Zach had mentioned, uh, they said that, okay, maybe some of these are like

00:05:39   pre-DRM-free purchases that maybe weren't upgraded.

00:05:43   I don't really know what the, what the issue was.

00:05:45   But again, like the TestFlight thing, blessing in disguise, like, let me just rebuild my library.

00:05:50   I knew that I had music that was not in the store.

00:05:53   So I have a copy of my old music library on my desktop.

00:05:56   And I'm just like, as I'm like finding things, it's like, oh yeah, let me just add this back

00:06:00   to the library.

00:06:01   Um, but a couple of things to note in rebuilding your Apple Music library.

00:06:06   And my previous library dates back to like when I first started using iTunes in like 2002,

00:06:13   like a lot of stuff in there I hadn't listened to in a long time that I sort of cut loose.

00:06:17   But first thing, music encoded for Apple Music in a lot of cases sounds way better than what I had

00:06:26   in my library.

00:06:27   Like, you know, I had some ripped CDs and I did iTunes Match at some point, but you know,

00:06:32   who knows what the deal was, you know, is some, I've noticed on a couple albums like,

00:06:37   oh, this sounds better.

00:06:38   Or maybe it's been remastered at some point and I had an old version or whatever.

00:06:42   Um, but the thing that hurt me and Federico, I think you in particular will appreciate this.

00:06:47   Too many of my favorite albums now have 20th anniversary versions.

00:06:53   Oh yeah.

00:06:53   Like, like quite a few.

00:06:56   And they're, you know, they're remixed or like Death Cab has done it with a couple and

00:06:59   they've added like demos to the end of it, which, you know, I'm all for.

00:07:02   And it's like fun to hear alternative versions of songs and stuff.

00:07:05   But I just noticed as I was like going through adding things, I was like, oh, oh, oh goodness.

00:07:11   Like a lot of this, a lot of, a lot of pain in the past, you know?

00:07:20   Uh, so TestFlights, uh, Apple Music, um, Mac App Store, Mac App Store and anything else?

00:07:30   So far, everything else has been pretty smooth.

00:07:33   You know, I said, I said last week that some like media wouldn't play back.

00:07:37   That's all basically been settled.

00:07:39   Um, I watched a couple episodes, a couple of the first couple episodes of Mr. Robot over

00:07:43   the weekend that I had purchased, you know, years ago on my legacy account.

00:07:47   They just played just fine.

00:07:48   That show is holding up.

00:07:50   I mean, I'm only like three episodes in again, but it's really good.

00:07:53   Remember?

00:07:55   Did you watch Mr. Robot or was it Mike?

00:07:57   Oh no, I watched Mr. Robot.

00:07:59   Maybe he did too.

00:08:00   I feel like we talked about it back in the day.

00:08:01   I don't, do you see Mike as a Mr. Robot person?

00:08:04   I can't remember.

00:08:06   I thought one of you did and the other of you didn't.

00:08:08   So maybe, maybe he didn't.

00:08:10   Hmm.

00:08:11   Hey, if you miss Mike on, on this week's episode, here's, here's, here's something that you can,

00:08:16   that you can send him.

00:08:17   Um, uh, use your favorite, uh, generative AI, uh, service.

00:08:22   Uh, it can be ChatGPT, can be Gemini, can be DeepSeek, which we're going to talk about.

00:08:26   Well, it depends on what country you're in.

00:08:28   Well, it depends on what country and ask your favorite AI product to put together a short description

00:08:33   of what, uh, role Mike Hurley could play in Mr. Robot.

00:08:37   Wow.

00:08:39   You know, you know, what, what, uh, if Mr. Robot could feature Mike Hurley as a recurring character,

00:08:45   what would he do?

00:08:46   That's incredible.

00:08:48   Uh, I'm asking GPT-4o right now what it says.

00:08:51   Yeah, you're asking.

00:08:52   Okay.

00:08:52   Uh, do you mean, I'm just going to read a couple of these to you.

00:08:56   Okay.

00:08:57   Okay.

00:08:57   Uh, it's a list.

00:08:59   Number one, the underground broadcaster, uh, title.

00:09:03   Mike could play a podcaster or radio personality who operates a secret channel spreading anti-corporate

00:09:09   propaganda.

00:09:09   Yes.

00:09:10   Helping fsociety communicate with the masses.

00:09:12   Incredible.

00:09:13   Okay.

00:09:14   Number two, corporate insider turned whistleblower.

00:09:19   Ooh, okay.

00:09:20   Because he used to work at a bank.

00:09:22   I guess so.

00:09:23   Given his real world experience running a media business, Mike Hurley could be a high-ranking

00:09:28   E Corp executive in charge of internal communication who slowly realizes the company's dark secrets

00:09:34   and leaks them.

00:09:37   And there are a couple of others here.

00:09:39   Uh, this one's, I think, my favorite after the first one.

00:09:42   Tech culture commentator.

00:09:44   In a meta twist, Mike could play a fictionalized version of himself hosting a tech podcast that

00:09:50   dissects the rise of E Corp and the decline of privacy, inadvertently influencing key players

00:09:55   in the show.

00:09:56   Okay.

00:09:57   Well.

00:09:58   Uh, and then the final one, a surreal take, uh, Elliot's consciousness voice where Mike

00:10:04   is never physically present, but his voice is, his voice constantly plays in Elliot's mind

00:10:09   as an inner monologue or guide.

00:10:12   Fascinating.

00:10:13   But why, why would Mike's voice be in Elliot's mind?

00:10:17   Um, okay.

00:10:18   Sure.

00:10:20   What kind of role do you think would fit him best?

00:10:22   I'm not engaging with you, ChatGPT.

00:10:23   Okay.

00:10:23   Uh, yes.

00:10:25   I like that a lot.

00:10:26   We have some other, I don't know what this is.

00:10:29   This is follow.

00:10:30   Follow in.

00:10:32   Follow in.

00:10:33   I don't know.

00:10:34   Uh, follow FM.

00:10:36   Yes.

00:10:37   Uh, we've dropped the FM out of our name.

00:10:41   We are now officially just Relay.

00:10:44   Uh, there's a link in the show notes to a blog post I wrote over the weekend.

00:10:48   Uh, we got our redesign live: new logo.

00:10:52   You see it this week in the show art, uh, slightly tweaked show art, new logo.

00:10:56   And, uh, I'm really, really happy with this.

00:10:59   And if you are a member of Relay, which you can do, there's a link in the show notes to

00:11:05   get Connected Pro, which is the longer, ad-free version of the show.

00:11:07   All memberships come with access to Crossover, which is this feed where we, we publish, uh,

00:11:14   members only episodes.

00:11:15   And in February, I'm going to be interviewing JD Davis, the designer who redid Relay's branding.

00:11:21   And, uh, so that'll be fun in the Discord.

00:11:24   I'll ask for questions in February, but, um, looking forward to that conversation with JD.

00:11:28   JD is awesome.

00:11:29   He did a great job with this and, uh, I'm really, really happy with it.

00:11:33   Nice.

00:11:34   Yeah.

00:11:34   I think it looks cleaner, uh, modern, fresh.

00:11:39   I really like this change and it looks, it looks very good at small sizes and it's still

00:11:44   recognizable even at a small size.

00:11:46   So yeah, did a, did an excellent job.

00:11:48   Yes.

00:11:48   Yeah.

00:11:49   That was one of the goals was like, how does this look small?

00:11:52   Uh, 'cause usually podcast artwork is, on your phone, 400 pixels across.

00:11:56   So, uh, yes, glad to have that, have that out.

00:12:00   I did want to take just a second, um, and kind of talk about the times that we live in.

00:12:08   We were going to do this last week, but the three of us felt like we wanted a few more days

00:12:12   to kind of put some thoughts together.

00:12:14   Um, we know that a lot of people in and beyond our community are hurting given the state of

00:12:23   things in the U.S. and really across a bunch of other countries.

00:12:26   Yeah.

00:12:26   Yeah.

00:12:27   Um, and we want to do what we always have done, uh, provide a place to hang out, talk about

00:12:32   the topics we love with people who respect each other.

00:12:36   And the thing I'm, I'm honestly most proud of is Relay's community.

00:12:41   And in that community, there's, uh, a strong shared belief that everyone should be respected

00:12:47   and that we should all practice compassion.

00:12:50   And that comes out in a thousand different ways in our community.

00:12:53   Um, and one thing that it means is that there's no room for hatred of other people in our community.

00:12:59   And unfortunately, feelings like that, uh, towards, uh, individuals have become more mainstream and

00:13:07   that's troubling to us.

00:13:08   And we want to be a place that we can, again, hang out with our friends, talk about the things

00:13:13   we love and be in an environment where we can be, uh, where we can know we can be safe, whoever

00:13:18   we are or wherever we come from.

00:13:20   And, uh, you know, we're not turning our shows, uh, into anything that they're, they're not

00:13:24   already.

00:13:25   Um, we're going to keep doing what we do, uh, but we did want to, uh, to tell people

00:13:29   that, you know, that's where we stand.

00:13:31   And we think that, uh, a lot of the stuff in the world right now is, uh, not okay.

00:13:36   And we want to use our platform where we can to, uh, protect those who are in our community.

00:13:43   If I may just add some, uh, personal, uh, context, um, these are my personal opinions, but I feel

00:13:51   like using the exact necessary words is important.

00:13:56   So, uh, I want to start from my personal belief, uh, that trans rights are human rights.

00:14:04   And I personally find what is happening in the U.S. despicable and, uh, dehumanizing.

00:14:11   And I cannot even, because I cannot imagine, I, I personally cannot imagine what it must feel

00:14:18   like to, to feel like, uh, other people want to make you and your identity feel unjustified

00:14:29   and, and, uh, like something that, that shouldn't exist.

00:14:33   I cannot imagine that, but I, I, I can relate to that feeling of feeling powerless and feeling

00:14:46   like there's nothing you can do.

00:14:48   And to an extent, I, you, Mike, we, at this network are, uh, are, are, you know, it's not

00:14:57   like Connected can change the U.S. government, right?

00:15:00   Uh, if they only would adopt the bill.

00:15:03   If, if, if only, if only they would do it.

00:15:05   Um, but there's a couple of things that I will, that I would say, um, now more than ever,

00:15:14   I feel like, and unfortunately these discussions are trickling down to Italy as well, you know,

00:15:20   because we have a government who very much sympathizes with, with, with the, with the American government.

00:15:28   And so those discussions about human rights, uh, they're happening here as well.

00:15:32   Uh, but I feel like now more than ever, uh, it's important if you feel like it to, to be, to be organized, to dissent and to, and to seek a safe space in real life and online.

00:15:46   Um, and I think, um, you know, um, I think it's important to find your people and to, and to not accept what is going on.

00:15:57   Um, and I know that, um, and I know that I am saying this from a position of, of privilege myself.

00:16:03   Um, because I, you know, my identity is not at risk and, but I think it's important for people to be together and to feel those things together.

00:16:14   And the second thing is that from my position and I think from our position, um, it's important to continue to provide the, the utility.

00:16:34   If, if every so often we, we share something useful, that's also my hope, but I think it's, it's important.

00:16:41   Like once a quarter, once a quarter, maybe a segment about task managers.

00:16:45   Uh, but I think it's, it's important.

00:16:47   It's important to at the same time, keep doing what we do because in, in any dark time, you still need to find that light that never goes out to, to quote a famous song.

00:17:02   Um, I think it's important to, to continue to provide that service for people who want it because like, you know, I, I think in any, in any,

00:17:11   difficult period of your life, uh, you need to have something that brings you a little joy.

00:17:19   And so, uh, I think it's why now more than ever, we gotta keep doing what we do.

00:17:26   And, uh, you know, if I could hug my listeners, I would.

00:17:33   Um, and this is like, this is one of those moments where I really miss not having a live show and seeing people in person.

00:17:41   Um, and, but yeah, that's all I wanted to say.

00:17:46   No, it's, it's, it's, it's well said.

00:17:48   I agree a hundred percent.

00:17:49   And it, it is something that I agree with you.

00:17:53   It is hard to, it is hard to put myself, uh, in the place where I'm, I'm told that who I am shouldn't exist.

00:18:03   Yeah.

00:18:04   But that doesn't mean that we can't, um, first of all, empathize with it to the degree that we can, but also, um, provide a place where everyone can feel safe and comfortable and welcome.

00:18:16   And that's what we want Relay and MacStories, um, to be.

00:18:21   And look, our, our Discords are the best places on the internet and our communities are incredible.

00:18:26   Yes.

00:18:27   And, um, you know, we're still going to joke about iPod socks.

00:18:32   Yeah.

00:18:32   Yeah.

00:18:33   Uh, exactly.

00:18:34   Uh, sound pretty good to me.

00:18:36   Yes.

00:18:38   So we, we are literally going to talk about iPods.

00:18:41   Uh, we are.

00:18:42   Yeah.

00:18:43   For the next stop.

00:18:44   Well, not socks, uh, but we are going to talk about iPods and, and more specifically, uh, Steven,

00:18:50   I wanted to ask you what's your budget for, um, iPods looking like these days, uh, because if you have an opening, uh, you could participate in the, uh, Sotheby's auction for the custom iPods from the late Karl Lagerfeld.

00:19:11   It's a massive collection of how many, 500, like how many, 300, I'm not sure.

00:19:18   500 iPods.

00:19:20   Um, of all kinds, including a, a, the, the, like this diamond-encrusted iPod and microphone, there's multiple color variations.

00:19:32   I did.

00:19:33   I didn't even know that Karl Lagerfeld was collecting iPods.

00:19:37   So this entire story, uh, that this collection is now, uh, up for, uh, sale at an auction is new to me.

00:19:44   Were you familiar with the existence of this iPod collection?

00:19:48   No, I wasn't.

00:19:49   Okay.

00:19:50   I mean, I was familiar with the name, you know, uh, seeing Karl's name float around, but had no idea.

00:19:58   And, uh, that goes deeper than we think it did.

00:20:01   So we'll get to that in a minute.

00:20:02   But, um, right now the, uh, bejeweled, uh, iPod and microphone, 500 euro.

00:20:12   So, you know, we'll see how high that goes.

00:20:14   A little rich for my blood right now, but it's, uh, this is incredible.

00:20:19   And you can go through here and like so many iPods.

00:20:22   There's some custom, custom, like colorways of the first gen nano, which Apple just shipped

00:20:29   in white and black.

00:20:30   There's red, pink, blue, and yellow.

00:20:33   And I'm not normally like a yellow fan.

00:20:37   Yellow iPod looks good though.

00:20:39   But the yellow iPod looks sick.

00:20:42   Yes.

00:20:43   It's so good.

00:20:45   Um, I'll have a link in the show notes to, uh, to some of these.

00:20:49   And, uh, it's, it is just, it's just incredible.

00:20:55   You were talking about this on Mastodon and then we learned a lot more.

00:21:00   So what, what was uncovered?

00:21:01   Uh, that apparently Karl Lagerfeld obviously was very much into iPods.

00:21:07   I saw some posts and some old like stories saying that he was basically treating iPods as, um,

00:21:15   as like, uh, cassette tapes.

00:21:17   So each iPod would be loaded with a specific type of music or a specific collection of albums

00:21:23   and so forth, which, I mean, if you are really into music, and given the

00:21:30   storage limitations at the time, I can sort of see that, you know,

00:21:35   if you have an unlimited budget and you want to be fashionable and you are a fashion icon,

00:21:40   in fact, and you're really into music and you, and you'd be like, you know what, I am

00:21:45   going to accumulate hundreds of iPods and each of them, I will treat as a cassette tape.

00:21:51   Cool.

00:21:52   But what was interesting is that apparently, uh, Lagerfeld had a whole team of people, uh,

00:21:58   dedicated to managing these iPods because obviously you needed to sync them with, with iTunes.

00:22:04   And so the story goes that there was an entire Xserve RAID to run this massive iTunes library

00:22:14   and sync hundreds of iPods to this library.

00:22:18   Yeah.

00:22:19   Have you seen an Xserve RAID?

00:22:21   Are you familiar with this?

00:22:23   Never.

00:22:24   Never, never seen it in real life.

00:22:25   That's because you've never come to my house.

00:22:26   I've got one.

00:22:28   Um, but yeah, uh, terabytes of music, which, I mean, think about the timeframe of the iPod,

00:22:35   right?

00:22:35   Like that's, uh, that's pretty wild.

00:22:37   We weren't, we weren't, you know, doing eight-terabyte, uh, MacBook Pros at the time.

00:22:43   So absolutely incredible.

00:22:46   Uh, man, what a move.

00:22:51   It's like, oh, this is my iPod team.

00:22:53   On one hand, like, I feel like I, I shouldn't be allowed to judge because like I do keep

00:23:00   several video game handhelds for specific types of games.

00:23:05   Sure.

00:23:06   But on the other hand, I don't have a hundred handhelds.

00:23:10   Like I don't even have 20, I think.

00:23:13   But I mean, I'm also not Karl Lagerfeld.

00:23:16   So it kind of cancels out, you know?

00:23:20   Um, I don't know though, it's, what a power move to have hundreds of iPods and servers to manage

00:23:27   the syncing to them. Just the entire story is incredible.

00:23:32   Uh, yeah.

00:23:33   Uh, there are also some modified, I just found them some modified iPod minis, including an

00:23:40   all white and all black model, which look really cool.

00:23:45   Are you sure you don't want to try just like even getting one of these?

00:23:50   How much are the minis?

00:23:51   The minis are at 400, 400 euros.

00:23:53   One of, just one.

00:23:55   Just a random one.

00:23:57   Just be like, hey, this, this used to belong to.

00:23:58   You could, you could, you know, you could touch it and be like, this iPod was also touched

00:24:03   by Karl Lagerfeld.

00:24:04   And now I am touching it.

00:24:06   You know?

00:24:07   This click, this click wheel was once clicked by Karl Lagerfeld.

00:24:12   Yeah.

00:24:13   There's a whole collection of third gen iPods, which we know were the best ones.

00:24:18   Mm-hmm.

00:24:20   And those have like stickers on the back with notes, I guess, with maybe what was on them.

00:24:24   Wild.

00:24:25   Have you ever attended like an auction in real life?

00:24:29   I have not.

00:24:30   I've always wanted to.

00:24:31   Me too.

00:24:32   And I've always wanted to get into a bidding war with somebody, but that only ever happens

00:24:35   on eBay.

00:24:35   It's one of those things that, you know, I want to do at some point in my life.

00:24:39   Yeah.

00:24:40   Like getting into a taxi and be like, follow that car, but it's a random car and you got to pretend

00:24:45   that it's important, you know, that sort of, that sort of thing, but also like attending

00:24:50   an auction and getting into a bidding war for like a totally random item.

00:24:54   Like it doesn't have to be important and you have to be like really and oddly into it, you

00:25:01   know, that'd be, that'd be fun.

00:25:03   Like, sir, are you sure you want to offer 3000 euros for this hat?

00:25:11   And I'll be like, yes, I need the hat.

00:25:14   I do.

00:25:15   I do want to do that.

00:25:16   I do think it'd be fun, but no, I've, uh, I've never even attended one.

00:25:23   It's kind of a bummer.

00:25:24   Yeah.

00:25:25   Yeah.

00:25:26   Maybe, you know, maybe one day.

00:25:29   Maybe one day.

00:25:31   This episode of Connected is brought to you by 1Blocker.

00:25:36   1Blocker is a premium content blocker for Safari on iOS and the Mac.

00:25:42   In addition to blocking obtrusive ads, 1Blocker can block trackers, annoying pop-ups, EU cookie

00:25:48   notices, comment sections on blogs and YouTube, and so much more.

00:25:52   And they're thrilled to introduce the new 1Blocker 6.

00:25:55   It has an updated design with an entirely new interface to make the app even more intuitive.

00:26:02   Plus, you can expect improved blocking and free ad blocking, because users can now enable one filter at no cost, including advanced blocking for YouTube.

00:26:12   Then upgrade to premium to unlock all the filters and automatic weekly cloud filter updates.

00:26:17   I've been using 1Blocker across my devices for a long time.

00:26:22   One of my favorite things about it is you can create custom rules for domains that may not fit in 1Blocker 6's other options, but their built-in options are extensive and the new redesign makes it easier to use than ever.

00:26:35   1Blocker was featured in the Mac App Store by Apple under the best Safari extensions, and it's available for the iPhone, iPad, Mac, and Vision Pro as a native app.

00:26:45   You can get premium for just $1.24 per month, billed at $14.99 a year, or go for a lifetime license.

00:26:54   And a single purchase unlocks 1Blocker across all your Apple devices, so you can share premium with up to five family members.

00:27:01   For a limited time, 1Blocker is offering listeners of Connected one month of premium for free.

00:27:08   Premium unlocks all features across iOS, macOS, and visionOS, and can be shared with up to five family members.

00:27:15   Go to 1blocker.com slash connected and use the promo code CONNECTED.

00:27:20   That's 1blocker.com slash connected, the link is in the show notes, and use the code CONNECTED for one month free.

00:27:27   Our thanks to 1Blocker for the support of the show.

00:27:30   All right, Stephen, have you heard the good news of DeepSeek?

00:27:37   I have seen the word a lot this week, but you're like, you're wired in.

00:27:43   Yeah, yeah, I've been tracking this entire story.

00:27:46   And it's going to be a long segment, and Stephen allowed me to do my research and to take my time.

00:27:54   So, hey, I think it's going to be an interesting one and a fun one, but if you're not interested, well, that's your problem, not mine.

00:28:03   Skip to the closing.

00:28:04   It's going to be exciting this week.

00:28:05   It's going to be exciting because, yeah, it's going to be an exciting moment for the show.

00:28:11   So, in case you haven't seen it, or in case you've seen this name repeated to death and you couldn't be bothered to look into it,

00:28:20   DeepSeek is this new large language model, this new chatbot that is supposedly rivaling in performance OpenAI's ChatGPT, Gemini from Google, and Claude by Anthropic.

00:28:36   And not just in performance, but also in how it was put together.

00:28:38   So, I've been trying to make sense of what multiple sources are saying.

00:28:46   It's very hard to pin down exactly, also because of the language barrier, and also because everybody's saying different things.

00:28:57   I'm going to try my best and sum it up.

00:28:58   DeepSeek was put together, more specifically, the underlying large language model is called DeepSeek V3, by a quant.

00:29:05   This is a word that I learned this week: quant, which is short for quantitative hedge fund.

00:29:15   It's basically a hedge fund in China that uses, sort of, like, advanced mathematical operations and whatnot to track market movements and, you know, whatever hedge funds do.

00:29:28   Yeah, I think they go around trimming people's bushes.

00:29:31   Yes, that's what they do.

00:29:34   So, come on.

00:29:36   Yes, Stephen, please let me move on from whatever that was.

00:29:42   So, this company, they were using machine learning and large language models to predict, you know, market trends and stocks and all, you know, that fancy money stuff.

00:29:53   But they had assembled a collection of GPUs.

00:29:57   And so, the story goes that they started at some point a couple of years ago as a side project training a large language model called DeepSeek that they were going to use in their main business.

00:30:18   Then, last year, they started training with a lower number of NVIDIA GPUs than are typically used in America, in American AI labs to train large language models.

00:30:35   They started training version 3 of their DeepSeek large language model with a reasoning version called DeepSeek R1.

00:30:46   And they have published, so this is, I'm getting to why everybody's losing their mind.

00:30:52   They have published a white paper, and this model has open weights, which sort of means open source, but I don't kind of want to get into that.

00:31:04   They have published this model and this reasoning model for free.

00:31:10   You can use it for free.

00:31:11   There's a white paper that you can read, and it goes into the details of how it was put together.

00:31:16   Now, for those unaware, a reasoning model means that there's a version of DeepSeek.

00:31:23   When you go to the DeepSeek website or you're using the DeepSeek app on your phone, you can enable R1, which takes a little bit longer

00:31:34   to execute, but shows you the CoT, or chain of thought, of the model.

00:31:41   It shows the model, quote-unquote, thinking and trying to understand your problems and basically thinking out loud about the query that you just asked.

00:31:50   Now, most people haven't been exposed to a reasoning model because both Google and OpenAI, I believe they only make it available if you pay for either Gemini Advanced or ChatGPT Plus and Pro.

00:32:05   So ChatGPT, which is obviously the most popular in the world, they don't give you a reasoning model because you've got to pay, and most people don't pay for ChatGPT.

00:32:13   And with ChatGPT free, you only get GPT-4o.

00:32:18   So what's impressive here, according to the theoretical story of DeepSeek, is that the company used a much lower number of GPUs.

00:32:30   So for a much, much lower cost to train this high-performance large language model.

00:32:40   And they did that by basically using constraints.

00:32:46   There are, you know, these new export laws in the United States that were put forth by the Biden administration, I believe,

00:32:57   that basically prohibit NVIDIA from exporting, you know, too many GPUs to China.

00:33:05   So through, you know, the cluster of GPUs that DeepSeek already had before the laws were enacted,

00:33:12   and also apparently, now we're getting into speculation territory,

00:33:16   apparently through black market channels, they were able to, so some people are saying 2,000 GPUs,

00:33:24   some other people are saying it's 5,000, other people are saying, no, it's 15,000.

00:33:29   But apparently, it's a much, much, much lower number than what, again, supposedly OpenAI and Anthropic are using.

00:33:37   Like, I saw someone reporting that OpenAI is using half a million NVIDIA GPUs in their data centers.

00:33:44   So the theory would be like, imagine that for one-tenth of the horsepower and one-tenth of the cost,

00:33:53   again, in theory, because it's not like DeepSeek is publishing their finances,

00:33:57   but imagine that for one-tenth of the power and one-tenth of the cost,

00:34:01   DeepSeek was able to match and exceed the performance of ChatGPT o1.

00:34:07   ChatGPT o1 being the basic version of the reasoning model.

00:34:13   Yeah.

00:34:14   How could they do this?

00:34:16   Well, I was going to say, there's speculation that maybe they didn't do it all on their own, right?

00:34:24   So, yes.

00:34:25   So, there's also speculation.

00:34:27   So, this is where we get to the...

00:34:30   It's a complicated story, so I'm going to try and break it down piece by piece.

00:34:36   What's clever and interesting about DeepSeek is that DeepSeek is obviously...

00:34:42   Like, it's not like DeepSeek, they invented the transformer model or the idea of a large language model.

00:34:49   Sure.

00:34:50   They built on the foundation of something that was invented by American companies, right?

00:34:58   It was invented by Google.

00:34:59   I believe the first paper on the transformer model, you know, credit goes to Google engineers for that.

00:35:06   And then later came OpenAI, which had ChatGPT as a product based on that.

00:35:10   And I believe OpenAI was also first with a reasoning model.

00:35:13   So, DeepSeek...

00:35:17   This is what's fascinating, that DeepSeek optimized the foundation created by American companies.

00:35:24   And they did that by applying constraints to their engineering teams that were forced to work with less power and less money.

00:35:36   And they did that by basically sort of putting a unique spin on how large language models can be trained.

00:35:46   I'm going to try and simplify here because it gets boring fast and also because, like, I'm not an engineer.

00:35:52   But my understanding is that typically there's a lot of supervised training where humans are actually supervising the model and, like, feeding the model the correct answers and the correct process when they're training.

00:36:07   There's supervised training and then there's reinforcement learning where basically you are applying a reinforcing technique to say, yes, model, good job.

00:36:19   This is the right answer.

00:36:21   Basically, DeepSeek only used the reinforcement learning to train their large language model.

00:36:28   And apparently they used DeepSeek itself to train DeepSeek V3 and to train R1.
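The distinction is easy to make concrete: supervised training hands the model the correct answer, while a pure reinforcement signal only hands back a score. A toy Python sketch of the reward-only idea (an illustration, not DeepSeek's actual pipeline, which uses far more sophisticated policy-gradient methods):

```python
import random

random.seed(0)

# The "model" must learn which of three canned answers is good.
# It is never shown the right answer, only a scalar reward after each attempt.
answers = ["garbage", "okay-ish", "correct"]
reward = {"garbage": 0.0, "okay-ish": 0.5, "correct": 1.0}  # the grader
prefs = {a: 0.0 for a in answers}  # the model's learned preferences

for step in range(500):
    # Sample an answer, favoring higher-preference ones (explore vs exploit).
    weights = [pow(2.0, prefs[a]) for a in answers]
    choice = random.choices(answers, weights=weights)[0]
    # Reinforcement signal: just "how good was that?", never the right answer.
    prefs[choice] += 0.1 * (reward[choice] - 0.5)

best = max(prefs, key=prefs.get)
print(best)  # only "correct" earns positive reward, so its preference rises
```

The point is the information flow: the grader never reveals the answer, yet repeated scoring is enough for a preference to emerge.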

00:36:39   So, and this is where we get to what you just mentioned.

00:36:43   Apparently, OpenAI is now speculating that DeepSeek used training data from ChatGPT to train DeepSeek V3.

00:36:56   So, basically, the news of the day is that OpenAI is mad because DeepSeek used data from ChatGPT to train their model, which is kind of ironic: the company that scraped terabytes and terabytes of data from the open web is now upset that a Chinese startup scraped their data to train their model.

00:37:20   There's a, what do they call it, poetic justice?

00:37:23   Something like that?

00:37:24   Yes, it's kind of beautiful.

00:37:27   Anyway, the result of the debut of DeepSeek last week: so, DeepSeek V3 launched in beta last month and came out officially last week alongside the reasoning version, R1.

00:37:40   It took the world by storm.

00:37:42   By that, I mean that it became the most downloaded app on the App Store, that the entire tech and AI industry lost their minds.

00:37:52   Because, like, they were like, how can this Chinese company, that's coming out of nowhere, spoiler, it's not coming out of nowhere, people who are really into AI knew what DeepSeek was up to.

00:38:03   It's just the rest of the world realizing now that there's a public product.

00:38:07   But in any case, as a result of the debut of DeepSeek, they basically took how many hundreds of billions of dollars off the U.S. market?

00:38:21   Yeah, it was wild.

00:38:23   So, yeah, we're looking at basically, like, if you consider the NVIDIA stock that was down 17% or something.

00:38:42   Google was down, Meta was down, a bunch of other tech companies were down.

00:38:47   Only Apple was up 3%, but we'll get to that later.

00:38:51   There is a conspiracy theory here somewhere.

00:38:56   And if I were into conspiracy theories, I would kind of believe it. Which is, like, wouldn't it be fun?

00:39:05   Because you could say, like, what if this entire narrative about DeepSeek doing this for cheap and doing this with, like, one-tenth of the power?

00:39:14   What if all of it is fake?

00:39:15   What if the entire narrative is fake?

00:39:18   What if this little story that they put together was a tactic by the Chinese government to put a little dent into the NASDAQ, into the U.S. market, and wipe off almost a trillion dollars in a single day?

00:39:34   Wouldn't that be fun?

00:39:35   Now, that's a fun conspiracy theory.

00:39:37   I mean, I am definitely not an expert in any of this.

00:39:43   Yeah.

00:39:44   But, you know, there's also the angle of, like, was it a stock market move?

00:39:49   Or, like, is it just China doing China things, right?

00:39:53   Like, we're going to get to this, but, you know, many, well, some countries, and the U.S. Navy, are like, hey, don't use this.

00:40:02   This is very much wrapped up in those international politics that are above our pay grade.

00:40:10   But, you know, there is an element to consider that, you know, what they say should be taken with a grain of salt until it's verified by other people.

00:40:19   Just as the same as we should verify what OpenAI and Google and these other companies say, right?

00:40:25   Like, these AI companies make really grand statements and you've got to put them to the test.

00:40:32   But, so, even if we set the potential politics of it all aside, I think it's interesting to look at DeepSeek and specifically the white paper that the company published.

00:40:45   Because it does paint an interesting picture for the future of American companies and the future of large language models and also the future of open source.

00:40:55   So, it is undeniable that the DeepSeek team used a series of really creative approaches to build this large language model.

00:41:08   So, for example, they were able to work with one-fourth of the memory consumption because instead of using 32-bit floating point operations, they used 8-bit floating point operations.

00:41:21   So, that's like four times less RAM when you're running this in your data center, which is fascinating.
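The arithmetic behind that claim is simple enough to check yourself (with an illustrative parameter count, not DeepSeek's actual configuration):

```python
# Back-of-the-envelope memory math: one weight stored as a 32-bit float
# takes 4 bytes; stored as an 8-bit float it takes 1 byte.
def weight_memory_gb(num_params: int, bytes_per_param: int) -> float:
    return num_params * bytes_per_param / 1e9

params = 70_000_000_000  # a hypothetical 70B-parameter model

fp32 = weight_memory_gb(params, 4)  # 32-bit floating point
fp8 = weight_memory_gb(params, 1)   # 8-bit floating point

print(f"FP32: {fp32:.0f} GB, FP8: {fp8:.0f} GB, ratio: {fp32 / fp8:.0f}x")
# FP32: 280 GB, FP8: 70 GB, ratio: 4x
```

Weights are only part of the memory bill (activations and optimizer state matter too), but the four-to-one ratio on the weights themselves is exactly this bytes-per-parameter arithmetic.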

00:41:29   Now, about the reinforcement learning. So basically, like, I was listening to a podcast about this yesterday.

00:41:40   They were convinced that they could do just reinforcement learning without supervised learning, but then they realized that the model was not responding correctly and it was continuing to, like, mix and match English and Chinese in the same sentence.

00:41:54   And so, for DeepSeek V3, they started from scratch again and they did, like, all reinforcement learning with some supervision in the final stages of the process.

00:42:05   So, even the story that it was entirely based on reinforcement learning is not correct if you read the white paper.

00:42:11   So, really some interesting approaches that, as a result, like, there is a product: you can go to DeepSeek on the web, or you can download an app on your phone, and you can talk to it.

00:42:25   And I can tell you, because I have been testing all of these things, that the performance of DeepSeek most definitely rivals o1 and the latest Google Gemini 2 advanced experimental models.

00:42:46   So, the performance is up there, now, does that performance come from scraping the same data sources as OpenAI and Google?

00:42:55   I'm sure.

00:42:55   Was it made possible because they used ChatGPT responses to train DeepSeek?

00:43:01   I don't know.

00:43:02   But, whatever it is, it works.

00:43:05   And I think there's a few interesting takeaways, and also some other things to consider, which I'll get to in a minute.

00:43:13   First of all,

00:43:16   I think it's undeniable that this large-language model and this product was built because of the work that had previously gone into large-language models in America.

00:43:32   Once again, it's not like DeepSeq has invented a completely new technology.

00:43:38   They have taken an existing technology and said, but what if we optimized the costs?

00:43:43   And what if we optimized the performance?

00:43:46   What result would come out of that?

00:43:47   So, they optimized the foundation of something that was created in America.

00:43:52   And I think it is undeniably eye-opening for American companies that maybe you can achieve this kind of result with far fewer resources.

00:44:02   And, you know, people who have been keeping an eye on the open-source AI scene knew that this was going to happen.

00:44:08   If you follow, you know, if you follow blogs and podcasts and YouTube channels about this stuff, you knew that this was months in the making.

00:44:17   This idea of a free and open-source model coming out and matching the quality of o1.

00:44:22   It's been like, even the people who saw the DeepSeek V3 beta a couple of months ago said, oh, when this launches, it's going to be a big one.

00:44:31   But it is eye-opening for both OpenAI and Google to have this kind of competitor.

00:44:36   And I think, realistically speaking, what's going to happen here is that OpenAI will have to move up the timeline for the release of ChatGPT o3.

00:44:46   Sam Altman has already said on X that they will accelerate the timeline of o3.

00:44:53   And o3, by the way, is the even more advanced reasoning model that they announced in December.

00:45:01   Such terrible names.

00:45:02   Yeah, they have terrible names.

00:45:04   Terrible names.

00:45:05   OpenAI also said they will make o3-mini, which is the smaller model based on o3, available for free, at least up to maybe 100 free queries per week.

00:45:23   And I think on the Google side, because like these are the two biggest players right now, right?

00:45:29   It's in the sort of models that you can pay for.

00:45:33   It's OpenAI with ChatGPT and Google with Gemini.

00:45:36   And I think what Google will need to do, Google will need to make an even bigger deal of their reasoning model, which is Gemini 2.0 Flash Thinking.

00:45:46   These names, man.

00:45:48   These names.

00:45:49   Slightly better than o3, okay?

00:45:51   Slightly better.

00:45:52   Flash 2 thinking.

00:45:53   It's slightly better.

00:45:54   But I think Google, the whole interaction with Gemini, when you have to pick a model from a drop-down menu, is just as bad as OpenAI.

00:46:08   And right now, I think both Google and OpenAI will need to clean up their list of models and have something simple.

00:46:17   Because DeepSeek is showing that you open the app, you have one model, you tap a button that says, do you want the reasoning one or not?

00:46:24   And that's it.

00:46:26   And so I think both Google and OpenAI will need to simplify and make available to people a thinking model of some kind.

00:46:36   And I will keep an eye on Google.

00:46:37   I think a lot of people are under the assumption that Google Gemini is as bad as Google Bard used to be.

00:46:43   And I can tell you that it's not like that anymore.

00:46:46   Especially the, what's it called?

00:46:49   The Gemini 2 Advanced.

00:46:50   Oh gosh, Stephen, you're going to make fun of this once again.

00:46:54   I'm sorry, but look, I'm not the guy in charge.

00:47:00   Gemini 2.0 Experimental Advanced version 1206.

00:47:06   It's not my fault.

00:47:08   I'm sorry.

00:47:09   It rolls right off the tongue.

00:47:10   But that model is really, really good.

00:47:13   That version of Gemini is really, really good.

00:47:15   So I think both Google and OpenAI are feeling the pressure from DeepSeek.

00:47:24   There's another conversation to be had about the safety practices of DeepSeek and the idea of censorship by the Chinese government and the CCP sort of looming over DeepSeek.

00:47:42   First of all, it seems pretty much clear that the folks at DeepSeek don't have a safety team, don't have any safety practices in place in terms of, like, how exactly was DeepSeek trained, especially if DeepSeek trained itself via pure reinforcement learning.

00:48:03   They haven't published anything in terms of like how safe it is, what kind of content was it trained on.

00:48:09   They don't have a dedicated safety team or even person that we know of.

00:48:15   So that's a big question mark.

00:48:18   And also, try and ask anything about the Chinese government or Chinese history or Taiwan, Taiwan independence or the Tiananmen Square massacre.

00:48:30   And you will be able to see in real time DeepSeq R1 exposing its thoughts and censoring itself in real time.

00:48:43   Yep.

00:48:44   You will literally see the chain of thought stop once the model realizes that you're asking about the Chinese government or Chinese history.

00:48:53   It'll say, ah, let's talk about something else.

00:48:56   Not great.

00:48:58   What's interesting, though, is that DeepSeek is open source and you can host it somewhere else.

00:49:05   You can download the model and put it on your own server or your own machine.

00:49:12   Now, when you run the model yourself or when you're using DeepSeek hosted by somebody else, for example, I believe Perplexity, they are hosting DeepSeek R1 in the United States.

00:49:24   I have seen other services already implement a DeepSeek option.

00:49:27   And once again, you can run it yourself.

00:49:31   In that case, it will not be censored.

00:49:35   It'll happily tell you about the oppressive Chinese government and the Tiananmen massacre in the past.

00:49:41   And it'll tell you about Taiwan.

00:49:43   And it'll tell you about anything.

00:49:44   So interesting that obviously DeepSeek, when it's running on the DeepSeek service in China, it's censored.

00:49:53   But when you run it yourself, it's not.

00:49:55   And obviously, like, most people are not going to download DeepSeek and run it on their computers.

00:50:05   Which is why, for example, the Italian privacy watchdog, which is an entity of the Italian government, has banned DeepSeek from the App Store.

00:50:16   So the DeepSeek app is gone from the Italian App Store.

00:50:19   And they have issued a warning to DeepSeek and the parent company in China, asking for details about privacy collection, user data collection and retention, and about their privacy policies.

00:50:34   And, you know, what exactly is DeepSeek doing with the data from Italian citizens?

00:50:39   What are they collecting?

00:50:40   Why?

00:50:41   And where are they storing that data?

00:50:44   So this kind of story, I think you will see it happen in more and more countries in the next few days.

00:50:51   Because for most people, once again, this came out of nowhere.

00:50:55   Even though people who are tuned into the AI industry knew that this was going to happen.

00:51:00   For people, like, I mean, I saw it everywhere.

00:51:03   On the Italian newspaper, on newspapers, on Italian news websites.

00:51:07   It's everywhere.

00:51:08   And so this kind of thing, like, governments getting into this and being like, hey, hold on a second.

00:51:13   This model, this AI from China.

00:51:16   What's it doing with the data of our citizens?

00:51:18   I think it'll happen in more and more places.

00:51:20   I think it will, too.

00:51:22   And, I mean, the upside to this sort of thing is, like, if these models can be created and trained on less hardware, that's good.

00:51:32   It's good for the environment.

00:51:33   It's good for smaller companies wanting to get into it.

00:51:35   Good for competition.

00:51:37   I can't help but think that some of the freak out over this, at least in the U.S., is that, oh, this has been an American company thing.

00:51:46   And now it's in China and there's some freak out around that in corners of the U.S.

00:51:53   But I'll say this, that undeniably, the DeepSeek team applied some really clever engineering constraints to come up with DeepSeek V3 and DeepSeek R1.

00:52:10   That's, I think, objectively undeniable, especially if you read the white paper.

00:52:15   And they, for sure, have documented some fascinating techniques that, because of open source, I'm sure others will copy and implement.

00:52:25   And that sort of, it's the rising tide that, what's it, that floats all boats?

00:52:30   Like, what is it?

00:52:30   Basically that.

00:52:32   Like, everyone benefits from that innovation.

00:52:37   But, if you're running a product that hundreds of millions of people use, you still need to run this somewhere.

00:52:49   And you still need to scale this service.

00:52:52   Like, the scale that Google has and the scale that OpenAI has.

00:52:57   I mean, it's no surprise that the DeepSeek website has been barely accessible for me.

00:53:03   Not because I'm in Italy, but, like, it was often down.

00:53:06   The DeepSeek chatbot was not loading.

00:53:09   I've seen people having all kinds of issues with the DeepSeek hosted API.

00:53:15   Like, you still need to, if you really want to be the next model, you still need to match the arguably incredible performance of American models.

00:53:28   Because they have a whole impressive infrastructure. Like, you know, set aside the fact that it's bad for the environment.

00:53:35   But when you look at it as an object, it's undeniable that it's, you know, something that is scaling to how many hundreds of millions of people and how many billions of requests per day.

00:53:47   From a purely web engineering perspective, it's remarkable, right?

00:53:54   And so, it'll be fascinating to see if the DeepSeek company can do that.

00:53:58   I saw some reports saying that they are using these NVIDIA GPUs that are not the H100s that, like, most companies are using; they're the H800s, which are smaller and not as powerful.

00:54:13   I also saw some other reports saying that they're using some Huawei CPUs.

00:54:18   It's a really fascinating infrastructure that they have put together.

00:54:24   But, here's the thing.

00:54:25   Once again, I think the most important aspect to understand is that DeepSeek optimized something that already existed.

00:54:35   Yeah.

00:54:36   I think, though, that the angle that OpenAI and Google will use to continue to justify their capital and operational expenses will be: well, but if you want to push the bleeding edge of this stuff, like, if you want to come up with the next new thing that then others will copy, we'll need the money and we'll need the horsepower.

00:55:02   So, it'll be interesting to see how this shakes out. Like, is something like DeepSeek enough for most people?

00:55:10   I mean, OpenAI, you know, they literally are now, what's it called?

00:55:14   Project Stargate in the US?

00:55:15   Like, that 500 billion project, you know, backed by Microsoft, by Oracle, by the US government and others, like, building a massive data center.

00:55:25   Like, that's a lot of money going into this.

00:55:28   And that's why it is scary to have this little Chinese startup come up and say, hello, we have this and it's free and we spent less than $10 million on it.

00:55:40   Goodbye.

00:55:41   It's, you know, it's interesting.

00:55:43   It is.

00:55:45   Yeah, and the timing really couldn't be worse for some of those people asking for billions of dollars.

00:55:52   Like, I understand why they would be freaking out.

00:55:55   What is also fascinating here in this multi-layer AI onion that we just unpeeled is the Apple angle.

00:56:08   For a couple of reasons.

00:56:11   Now, obviously, as we have established, Apple is at least a couple of years behind.

00:56:17   Apple doesn't even have a large language model to begin with.

00:56:20   And here we have companies, you know, doing reasoning models, o3, Gemini 2, DeepSeek.

00:56:28   Like, Apple doesn't even have a basic large language model in the first place.

00:56:32   So, setting that aside for a second, Apple also has a problem in China.

00:56:37   Because they want to roll out Apple Intelligence in China, but they cannot, because the Chinese government doesn't allow models like ChatGPT or Gemini.

00:56:48   You know, obviously, you know, this is the same as social media.

00:56:52   You have to use specific services to be approved by the Chinese government.

00:56:58   So, there's a couple of interesting angles here.

00:57:01   The first and more obvious one, I think, is that DeepSeek could be the "in" that Apple was looking for to be able to offer Apple Intelligence in China with an extension.

00:57:17   Basically, the equivalent of the ChatGPT extension in the U.S. and other markets where Apple Intelligence is available.

00:57:23   DeepSeek could be the Chinese equivalent for it.

00:57:27   A Chinese company with a high-performance large language model could be the next extension for Apple Intelligence to be approved in China.

00:57:37   I could see that.

00:57:38   It would also be fascinating to have another approach and to see Apple acquire DeepSeek.

00:57:45   Now, Apple doesn't often make big acquisitions.

00:57:48   But DeepSeek is a small company.

00:57:51   The talent, you know, DeepSeek came out of a hedge fund.

00:57:55   They have a small team of engineers working on this.

00:57:58   Again, I saw some reports saying it's less than 50 people.

00:58:02   Who knows what is correct these days anymore.

00:58:04   But it would be interesting for Apple to say, well, to kickstart our own large language model, maybe we could acquire this.

00:58:14   And it would fit Apple's approach: it's high performance, it's more cost effective, and it consumes less energy than an equivalent ChatGPT or Gemini.

00:58:33   It would be interesting to see Apple consider an acquisition of DeepSeek.

00:58:37   Although I have also long thought that eventually Apple will acquire Mistral, which is the France-based AI company.

00:58:48   But I do think that eventually Apple will need to acquire some kind of large language model.

00:58:54   We'll see.

00:58:58   But I think, more realistically, I could see a DeepSeek integration in China to get Apple Intelligence out the door in that market.

00:59:05   I think I pretty much covered it all, except, oh, there's one final thing that I think it'll also be something that we need to keep an eye on in the short term, which is the, oh, there's actually a couple more things I want to say.

00:59:25   The first one is, I saw this article on The Information about how a bunch of different businesses and, what do they call it, Stephen, SaaS companies?

00:59:34   Yeah, software as a service.

00:59:36   Yeah, yeah.

00:59:37   A bunch of these SaaS companies are, like, jumping ship to the DeepSeek R1 API because it's much cheaper to run than OpenAI or Google or Anthropic.
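Part of why that jump is cheap engineering-wise: DeepSeek's hosted API advertises OpenAI-compatible endpoints, so switching providers is roughly a matter of changing a base URL and a model string. A sketch of the idea (no network calls, just the request shapes; the model names are the publicly documented ones at the time of this episode):

```python
# Build an OpenAI-style chat completion request for any compatible provider.
def chat_request(base_url: str, model: str, prompt: str) -> dict:
    return {
        "url": f"{base_url}/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

openai_req = chat_request("https://api.openai.com/v1", "gpt-4o", "Hello!")
deepseek_req = chat_request("https://api.deepseek.com", "deepseek-reasoner", "Hello!")

# Same payload shape; only the endpoint and model string differ.
print(openai_req["url"])
print(deepseek_req["url"])
```

In practice most SDKs expose this as a single `base_url` parameter, which is exactly what makes provider-hopping on price so frictionless.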

00:59:51   Which, by the way, Anthropic, what's going on at Anthropic these days?

00:59:54   Like, what are they doing?

00:59:55   Like, I feel like we haven't seen news from Claude in how many months now?

00:59:59   I don't even know.

01:00:00   I think they're mostly trying not to get sued by content makers.

01:00:04   No, that's Perplexity.

01:00:07   That's Perplexity.

01:00:08   That's right.

01:00:08   Yeah.

01:00:09   Yeah.

01:00:09   To be fair, Anthropic is the only company that's actually thinking in a more serious way about safety.

01:00:14   So that's commendable of them.

01:00:16   I do think, I saw some people say that, and this is where I disagree with the Apple community consensus.

01:00:26   I saw people say, oh, actually, Apple being late to the game with large language models is a good thing.

01:00:35   Because Apple now can be an aggregator of AI services and large language models.

01:00:43   I think, I cannot use the expression that I want to use on the show.

01:00:47   But you can imagine what I want to use.

01:00:52   I don't believe that.

01:00:54   Because let me tell you, if Apple were not, if Apple hadn't been caught flat-footed in this space,

01:01:03   if Apple could offer a large language model now, don't you think that they would?

01:01:09   Yeah.

01:01:10   And also charge for it with a more expensive and powerful version?

01:01:15   You would think that if Apple could, if they were in a position to do this, they would say, nah, we're good as an aggregator.

01:01:23   Let's, you know, that's our strength.

01:01:25   The App Store is our strength.

01:01:26   I mean, do you know Apple?

01:01:27   Do you know modern Apple?

01:01:28   Do you think Apple relishes the idea of being an app aggregator for real?

01:01:35   And second, the whole idea of aggregation, if you're just a, if you put the value in Apple as being a distributor of apps

01:01:47   rather than the provider of core services on your phone,

01:01:53   you know what, today, the more I use these AI tools, it's never been easier for me than it is today to switch from an iPhone to an Android device if I wanted to.

01:02:08   And that's because all the work that I've done in ChatGPT and Claude and Gemini, it's all based on a web service.

01:02:16   It follows me around.

01:02:17   And same with DeepSeek.

01:02:19   I can use DeepSeek on my phone.

01:02:20   I can use DeepSeek on Android if I wanted to.

01:02:22   I can use DeepSeek on Windows if I wanted to.

01:02:25   So this whole idea of like, no, well, actually, it's not that Apple is behind.

01:02:29   It's that Apple is an aggregator.

01:02:31   It's that Apple is late to the game.

01:02:32   And this is good for them because Apple gets recognized as the App Store.

01:02:37   I mean, come on.

01:02:38   That is like, you know, the classic tale of the fox and the grape.

01:02:44   Like, no, absolutely not.

01:02:46   If Apple could offer a large language model today, they would.

01:02:49   Now, all these AI tools, this is the, I think, the greatest threat to Apple right now.

01:02:57   It's not necessarily the fact that they are behind in large language models because I could see Apple coming out late and offering something that is, you know, powerful and much more integrated.

01:03:10   The greatest threat to Apple right now is the web and the fact that all of this is web-based.

01:03:16   And web-based means inherently it's portable and you can take your data and you can take your identity and you can take your projects and it doesn't matter the phone that you're holding.

01:03:27   It doesn't matter the computer that you're using.

01:03:30   The web is also the number one thing they should be worried about as a platform maker, right?

01:03:37   I mean, just looking at the things on my dock, a whole lot of them, just web apps and thin wrappers, right?

01:03:44   Like, there is a world that could be coming where being a vendor of an SDK doesn't matter as much as it does now.

01:03:57   Yeah, I think this is, this is something that I've been thinking about a lot lately.

01:04:01   I think over the past five years, we were looking somewhere else, you know, in the Apple community and we haven't kept an eye on what exactly was going on in the web industry and in progressive web apps.

01:04:17   And I think it's incredible what is possible today.

01:04:21   And that's, you know, I think in hindsight, 20 years from now, we'll maybe look back at this and be like, that's a problem that Apple ignored.

01:04:29   And it, and it calcified into what became a big issue for them, which is developers and a whole new generation of young developers being fed up with the rules of the app store.

01:04:41   I mean, like, you know what, I'm going to build whatever I want to build and I'm going to do it in Google Chrome and in a web browser and on the web when there are no rules.

01:04:51   And the progress that has gone into the kinds of modern web apps that you can build today.

01:04:56   I mean, look at Notion, right?

01:04:58   Look, look, look at all these services that are now taken for granted in any workspace.

01:05:02   And it's all web-based and it's all web first, actually.

01:05:06   I saw yesterday a post by someone who was running DeepSeek in a web browser with JavaScript-based acceleration.

01:05:14   I was like, you know, like, incredible.

01:05:17   Dude, Finn has it running on an iPhone using Core ML.

01:05:21   Like, wild.

01:05:24   Yeah, so I think the whole idea that, no, actually, this is great for Apple because it, you know, ties Apple to this identity of the App Store as an aggregator of the best stuff.

01:05:38   Well, what about the Google Play Store then?

01:05:41   Like, it's literally the same idea.

01:05:43   You go to the Google Play Store, like, but do you, like, I don't think of Google as the Google Play Store.

01:05:50   No.

01:05:50   Yeah, so we'll see what happens.

01:05:54   Also, like, yet another reason that's a bad take.

01:05:58   What part of Apple's business has the most risk of being blown up by regulation around the world?

01:06:05   It's the app store.

01:06:06   Like, and Apple is clinging on to their model.

01:06:10   We've talked about this a lot.

01:06:12   They've clung on to that model at their own peril around the world.

01:06:17   Like, what are we doing?

01:06:18   Yeah, yeah.

01:06:20   So, it'll be interesting to see what Apple does here.

01:06:23   Not just because of the threat of web-based products, where it doesn't really matter which device you're using.

01:06:30   And Apple is a hardware maker.

01:06:32   The threat of open source.

01:06:35   Like, what happens?

01:06:36   And this obviously extends to Google, extends to OpenAI, extends to Microsoft.

01:06:41   Like, what happens if the best models are not closed, proprietary models, but they are open-source models?

01:06:46   Like, what happens then?

01:06:47   What happens to the economy?

01:06:49   What happens to all of these investments?

01:06:51   Is Silicon Valley going to continue to freak out?

01:06:53   Look, you know, anyone who's been following this space for the past few months will know that the curve of progress from the open source community and from the open source models is only going up and up and up.

01:07:07   And it's, you know, I can see why, you know, Sam Altman puts up a good face on Twitter.

01:07:13   But, you know, did you see the report that, for example, at Meta, they have assembled multiple war rooms to figure out how exactly DeepSeek built the model?

01:07:25   And I can see that because it's, you know, and Meta is also doing open source, obviously.

01:07:31   Yeah, with Llama.

01:07:32   Yeah.

01:07:33   Man, even if, you know, the DeepSeek story ends up being that they actually had more GPUs than they said.

01:07:44   Actually, they spent more money than the narrative initially said.

01:07:48   Which is definitely possible.

01:07:50   Which is definitely possible.

01:07:51   But regardless, what's in the white paper is in the white paper.

01:07:54   And regardless, the approach and the engineering was clever.

01:07:59   And so, I think it'll have repercussions.

01:08:03   It'll have consequences in, you know, it'll ripple out to the entire industry and it'll be fascinating to see.

01:08:08   But it'll also be fascinating to see, since we are an Apple show, what Apple does here.

01:08:12   And I am convinced the most realistic option is that they will partner with DeepSeek in China.

01:08:18   Yeah, we will see.

01:08:21   I am sure someone at Apple is looking at it, for sure.

01:08:24   Oof, thank you for letting me do this.

01:08:28   Yeah, no, thank you.

01:08:29   When we talked the other day, you're like, I want to talk about this.

01:08:32   I'm like, sweet.

01:08:32   I'm not going to pay attention.

01:08:34   Just like, let you sink this information into my brain.

01:08:38   So, thank you for your service.

01:08:43   There's no denying we've had some rocky outros for the podcast recently.

01:08:50   And our friend Rob has made a new tool that lets you design your own outro for the podcast.

01:08:59   It's called 123 Outro Maker.

01:09:01   Incredible.

01:09:02   Perfect name.

01:09:03   Rob is the best.

01:09:05   There's a link to that in the show notes if you want to make your own outro at home.

01:09:10   But for now, I'm going to give it a shot.

01:09:12   You can leave feedback for the show at connectedfeedback.com.

01:09:16   You can join at getconnectedpro.com and get longer ad-free versions of the show each and every week.

01:09:22   This week, we talked about task managers.

01:09:24   And I issued an apology because I messed something up on the show last week.

01:09:29   You can find our work elsewhere.

01:09:32   Federico is the editor-in-chief of MacStories.net.

01:09:35   And he is Viticci across the fractured social media landscape.

01:09:39   Mike is not here, but we still love him.

01:09:42   He's the host of many shows on Relay.

01:09:44   You can check out his work at Cortex Brand.

01:09:46   They have some new Sidekick products out this week.

01:09:50   And you can find him, Mike Hurley, across social media.

01:09:55   You can find my writing at 512pixels.net.

01:09:58   I co-host MacPowerUsers here on Relay.

01:10:01   It comes out each and every Sunday.

01:10:02   This coming Sunday is our review of Apple Intelligence.

01:10:06   We spent a lot of time preparing for that episode.

01:10:12   I think it's a really good one.

01:10:13   So look for that on Sunday.

01:10:14   And you can find me on social media as ISMH86.

01:10:18   I'd like to thank 1Blocker for supporting this episode.

01:10:23   And Federico, until next time, say goodbye.

01:10:25   Arrivederci.

01:10:26   Bye, y'all.