444: The World's Biggest AI Onion


00:00:00   [Music]

00:00:08   From Relay FM, this is Connected, episode 444.

00:00:12   Today's show is brought to you by our excellent sponsors ExpressVPN and Indeed.

00:00:16   I'm one of your co-hosts, Federico Viticci, and it's my pleasure to introduce Mr. Stephen Hackett. Hello, Stephen.

00:00:22   Hello Federico, how are you?

00:00:25   I am doing fantastic.

00:00:27   Good.

00:00:28   It's good to be back, episode 444.

00:00:30   - It's got a nice ring to it, doesn't it?

00:00:32   - It's nice.

00:00:33   - We're also, two things,

00:00:34   we're coming up on 500 really quickly,

00:00:36   and also our 10th anniversary coming up really quickly.

00:00:39   It's wild.

00:00:40   - Wait, what?

00:00:41   Oh my God, okay.

00:00:43   - And I am joined by Mr. Mike Hurley.

00:00:46   - Hello.

00:00:46   I think that like 56 weeks is not that quick.

00:00:51   - Oh, is that how many we actually have?

00:00:53   Yeah.

00:00:54   - Well, you said we're coming up on 500 quickly,

00:00:56   and it's 444.

00:00:57   I mean, it's not that far.

00:00:59   - 450 is sooner.

00:01:01   - That's true.

00:01:01   - That's coming up pretty quick.

00:01:03   - 450 will be near,

00:01:04   now it'll be before-- - 'Cause I checked.

00:01:06   - WWDC.

00:01:07   When's 500?

00:01:08   Is that near WWDC next year?

00:01:10   - I mean--

00:01:11   - What's 56 weeks?

00:01:12   - I don't know.

00:01:14   - Yeah, well that's given that we do an episode a week,

00:01:17   which sometimes we take off during Christmas,

00:01:19   which is how Upgrade passed us.

00:01:21   - Yeah.

00:01:22   - Now I cannot count in weeks.

00:01:25   Like when you say something in weeks,

00:01:27   - This is one of the things that Siri is actually good at

00:01:30   is like how many, like what's the day,

00:01:33   X amount of days from today or weeks from today or whatever.

00:01:36   I do that kind of stuff a lot and it's good.
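The week arithmetic being delegated to Siri here is a one-liner in most languages. A minimal Python sketch; the April 5, 2023 show date below is an assumption for illustration, not something stated in the episode:

```python
from datetime import date, timedelta

def date_weeks_from(weeks, start=None):
    """Return the calendar date `weeks` weeks after `start` (today by default)."""
    start = start or date.today()
    return start + timedelta(weeks=weeks)

# 56 weekly episodes (444 -> 500) from a hypothetical April 5, 2023 show date:
print(date_weeks_from(56, date(2023, 4, 5)))  # → 2024-05-01
```

If that assumed start date holds, 56 weekly episodes lands right around the "May 1st next year" estimate mentioned a little later in the show.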

00:01:38   - It's like, you know when you have those parents

00:01:40   and they have a kid and you're like,

00:01:42   for example, like how old is she?

00:01:46   And some parents go like, oh, she's 72 months.

00:01:49   Like what does that mean?

00:01:50   Give me like a year.

00:01:53   - Yeah, I think after 12 months, stop doing that.

00:01:56   But no, but there's people who do that.

00:01:58   Like it actually happened to me like a few months ago.

00:02:02   I heard someone say, "Oh, she's 44 months old."

00:02:06   Like, wait, what?

00:02:07   - I don't know what that means.

00:02:08   - Episode 500 will be around May 1st next year.

00:02:12   - Okay.

00:02:13   - Ish.

00:02:15   May the first be with you.

00:02:17   - So we have to take like five weeks off or something?

00:02:20   - Can't do that.

00:02:21   - Okay.

00:02:23   - I got bills to pay.

00:02:24   That was the beautiful thing where me and Brad hit episode 500 on the exact day of our

00:02:31   10th anniversary for The Pen Addict.

00:02:33   It was incredible.

00:02:34   Like, it just lined up.

00:02:36   Basically, we'd spotted it.

00:02:37   We usually take an episode off over Christmas, right, like we do on this show.

00:02:40   And it worked out.

00:02:41   If we didn't do that, if we just did an episode every single week, and noticed it like months

00:02:46   before we were going to hit it. Perfect.

00:02:48   And it was, that was awesome.

00:02:49   It's like the exact day.

00:02:52   I did realize while recording today that next week I've been podcasting for 13 years.

00:02:58   Wow.

00:02:59   Now, horrible thought.

00:03:01   That's such a long time.

00:03:03   A bunch of people wrote in, Steph and Ben being two of them, that I'm basically doing

00:03:07   my AirPod shuffle backwards, right?

00:03:10   So I talked about I take one out, put it in the case, get the other one, put it in, and

00:03:14   the audio doesn't restart?

00:03:15   Mm-hmm.

00:03:16   Wait.

00:03:17   No, that's what I should do.

00:03:18   Oh my god.

00:03:19   What I'm doing is putting them in both ears and pulling one out.

00:03:21   So anyways, it's very confusing.

00:03:24   So what are you actually saying here?

00:03:25   Okay, so what I've been doing, I take, I'm wearing one AirPod, right, because I don't

00:03:31   want to get hit by a car.

00:03:32   But I want to switch ears for some reason.

00:03:34   What I've been doing is putting the second one in, and then taking the first one out,

00:03:39   and then it pauses the audio, and it won't restart.

00:03:42   Yeah, right.

00:03:43   Because it's the ear detection thing.

00:03:44   What I should do, which everybody wrote in, I think y'all said this maybe, but maybe

00:03:47   we didn't, I didn't go back and listen, but I should take the one out and then put the

00:03:52   other one in and then it should pick back up.

00:03:55   So anyways, I'm going to try that next time.

00:03:57   I'm not sure that you know what you're trying.

00:04:00   Maybe not.

00:04:01   I gotta say, I don't follow here.

00:04:06   Like I'm so confused.

00:04:07   Yeah.

00:04:08   Here's the thing.

00:04:09   I don't understand.

00:04:10   I don't think Stephen understands.

00:04:12   So how about this?

00:04:14   You report back next week.

00:04:17   I feel like I'm staring, I'm staring at this text.

00:04:20   And the more I try to read it, the more I keep drifting off.

00:04:25   First take the AirPod out of your ear and place it in the case.

00:04:28   Then the next one pauses the audio.

00:04:29   You don't put the second...

00:04:31   Oh, okay.

00:04:34   I think I follow.

00:04:35   If you take the AirPod out and then put the AirPod in,

00:04:39   I don't think it does restart.

00:04:41   Hmm. So maybe there is no fix.

00:04:43   What AirPods do you have?

00:04:44   The AirPods Pro 2, the newest ones.

00:04:46   Right, well then do what Steph says. You just squeeze the AirPod and it will restart again.

00:04:52   I think I have all that turned off, I should look.

00:04:54   Why? It's the best feature!

00:04:56   I don't like squeezing my ears!

00:04:58   No you're not squeezing the ears! You're squeezing the AirPod!

00:05:01   So you've been getting it wrong the whole time.

00:05:04   What you need to do is...

00:05:06   Why is squeezing my earlobe not doing anything?

00:05:08   Yeah, why doesn't it do anything?

00:05:09   Apple's doomed!

00:05:10   Just squeeze the stem and it restarts the audio again.

00:05:14   Okay, okay. I will do some practicing and I'll report back. - Please.

00:05:18   Okay, one of you want to take this next one?

00:05:21   So I made reference to the "a delayed game is eventually good, a bad game is always

00:05:27   rubbish" quote, or whatever the quote is, from, yes,

00:05:30   not Shigeru Miyamoto.

00:05:32   Max wrote in to say: "I recently did an interview with the person that sparked the hunt for the origin of the Miyamoto quote about

00:05:38   delaying games.

00:05:39   We chatted about the quest to find the source and the twists and turns that it took. Plenty of links in the original threads and sources,

00:05:45   if you'd like." I am going to listen to this. This is on the Max Frequency podcast,

00:05:49   because I want to know this. I was aware recently that someone was trying to find it out,

00:05:55   which is how it ended up becoming a thing that, oh, it turns out he didn't actually say it.

00:05:59   So this sounds like a good history of that. So I'm gonna--

00:06:02   Okay. I recommend that Connected listeners do, too. We also spoke about voice isolation,

00:06:09   And we got an email from Anthony, who said that they were a FedEx driver.

00:06:14   And so they drive this, like, loud, rattly diesel truck, and they have used voice isolation since the beta, and it's, like,

00:06:20   amazing on phone calls.

00:06:23   Which is really cool. I'm glad that--

00:06:26   I'm glad they're doing more processing, because for years the iPhones have had multiple microphones,

00:06:31   right, and Apple's had some amount of noise reduction.

00:06:34   But this is on top of that. And in fact, you can play with this in GarageBand or Logic. There's a new

00:06:40   Apple audio unit called AUSoundIsolation that you can put on a track, and

00:06:47   it gets rid of basically all background stuff. So I did, I played with this this morning, and

00:06:53   just with some audio that I had laying around and

00:06:56   It does do a good job of getting rid of background stuff

00:06:59   I'm not sure it sounds as good as something like iZotope, which is like

00:07:03   a very expensive suite of audio editing tools

00:07:08   that Jason and I use for some things,

00:07:11   but it's pretty good.

00:07:13   And if you're looking to clean something up,

00:07:15   I think in particular, using AU Sound Isolation

00:07:18   and something like GarageBand

00:07:20   would get you really far down the road

00:07:22   without having to go out and buy something expensive.

00:07:25   So I was unaware that it was an audio unit,

00:07:28   but I am super glad that it is.

00:07:30   And I think it's cool when Apple uses

00:07:33   bits of technology across different products and things.

00:07:36   And so, good work.

00:07:38   - That's super cool.

00:07:39   I had no idea that that was the thing.

00:07:40   I wanna try that out.

00:07:41   Those things to me though, I never really liked them.

00:07:43   I tend to like, I'll use some kind of, you know,

00:07:46   like removal of background sound processing.

00:07:50   And I tend to end up going back, like for me,

00:07:53   I think sometimes you can suffer through the, you know,

00:07:56   sound of a fan or whatever,

00:07:58   rather than, like, the kind of roboticness that I find

00:08:02   that a lot of those sound isolation things do.

00:08:04   For a phone call is fine, but for podcasts, I don't know.

00:08:08   - My mom told me I sounded metallic

00:08:11   when I tested voice isolation with her.

00:08:13   - There you go.

00:08:14   - It's like, hey, how do I sound?

00:08:15   She's like, oh, you sound metallic.

00:08:17   Okay.

00:08:18   - But I, do you know what?

00:08:20   I understand what she means by that.

00:08:22   It's hard to describe, but like, it's just something sounds.

00:08:26   So in the past few weeks,

00:08:27   we've been talking about the potential capacitive buttons

00:08:30   coming to the next iPhone.

00:08:32   And I made an offhand comment, like,

00:08:34   I made reference to, like,

00:08:36   how is it gonna work with cases and gloves

00:08:38   and are you gonna need cut outs on the cases

00:08:39   and all that kind of stuff.

00:08:41   MacRumors is reporting from a source that they heard from

00:08:45   and from more information in the MacRumors forums

00:08:47   from this person, that Apple is building the ability

00:08:51   to adjust pressure sensitivity in iOS for these buttons.

00:08:54   Quote, "The tipster has revealed that iOS 17

00:08:57   "will include a new toggle in settings

00:08:59   "that will enable users to customize the sensitivity

00:09:01   of the buttons to accommodate these different usage scenarios.

00:09:05   It will detect presses, holds, and respond to various levels of pressure via the use

00:09:09   of a new Force Touch-style mechanism and Taptic Engine feedback."

00:09:14   So if all of this is true, you're going to be able to customize everything about your

00:09:19   iPhone 15 Pro.

00:09:21   You'll be able to customize the action button, you'll be able to customize the sensitivity

00:09:26   of the capacitive buttons.

00:09:28   Everything becomes customizable.

00:09:31   Sounds fun, but also weird, though, right? Like, that's, if we, like, you know,

00:09:36   we're talking about the fact of, like, you know, is this button... are they, like, an

00:09:40   action button and not a mute switch? Or, like, is that better for regular

00:09:44   people, right? We were talking about that, like, how good is the toggle or whatever.

00:09:47   But if it's like, oh, you buy a case for your iPhone, oh, now you have to go into

00:09:51   Settings and play around with the pressure sensitivity to turn your volume

00:09:56   up and down. Yeah, you know, it's like... I like that they, if they have to

00:10:01   do it, I'm happy that they would do it and not just be like, "We don't support cases

00:10:04   other than our own," right?

00:10:07   There's still like a... that sense... it's weird, right? Like that will be a weird reality

00:10:13   to be in.

00:10:14   I wonder if we're reading maybe too much into this or if this is kind of overstated in the

00:10:19   reporting. What it makes me think of is the iPhone 7 and 8 where you can adjust the level

00:10:27   of force you need to click the button, the haptic solid state home button.

00:10:34   And that was customizable and it could detect different, or I guess it would register a

00:10:39   different pressure.

00:10:41   But that's, in that scenario, that's just something you can do, right?

00:10:47   This would be something you might have to do, which is different, right?

00:10:52   But like you buy a case and your case buttons will not work.

00:10:57   And you have to troubleshoot to the point where you find this area of settings and tweak

00:11:02   it.

00:11:03   Like the home button thing, you just do that if you want to.

00:11:06   It doesn't change your experience with the device so you don't need to do it because

00:11:09   nothing covers the home button in that way, right?

00:11:13   So like, yes, it may have been a thing in the past with other stuff.

00:11:16   Like I can customize my, the feedback from my trackpad.

00:11:20   I don't have to do those things.

00:11:22   They're just because you can, they offer that.

00:11:25   And if, you know, so my, that's what,

00:11:27   if this is that, right, it's just like,

00:11:29   hey, we can offer it, so here you go.

00:11:31   It's like a fun thing, like play around with it.

00:11:33   Great.

00:11:34   But if it's like, oh no, to get like this case

00:11:38   from this company to work, you need to set it at this level.

00:11:41   It's like, it's weird.

00:11:43   But like, you know, for us, as Federico said,

00:11:45   like that's great, right?

00:11:46   Like if I have the ability to like,

00:11:48   If I hold down on one button, it does this, hold down,

00:11:51   you know, that's awesome if it does that.

00:11:53   - Here's an idea, here's an idea,

00:11:55   I'm just throwing it out there.

00:11:57   What if, what if, imagine a new sort of made for iPhone

00:12:02   program where compatible cases, you know,

00:12:06   they have MagSafe, like the official MagSafe.

00:12:09   Imagine if like, if the case came with built-in information

00:12:15   on how to set the settings for the buttons for you.

00:12:19   So that when you put it on, it automatically--

00:12:21   - Similar to that, is if Apple specified thicknesses,

00:12:26   which they do stuff like this.

00:12:27   So like you can go to Apple's website,

00:12:30   it's in the development center,

00:12:32   and you can get all of the placement

00:12:34   of the magnets for MagSafe.

00:12:36   Like that's just a thing that they offer.

00:12:38   It's like a thing.

00:12:39   And I don't even know if it's part

00:12:41   of the made for iPhone program.

00:12:42   They just offer that stuff.

00:12:44   So like, yeah, like through that program,

00:12:46   they could maybe offer it.

00:12:47   But the made for iPhone program though,

00:12:49   is like you as a case maker now have to pay a tax to Apple,

00:12:53   if you're in that program.

00:12:55   So my hope would be that like similar to that,

00:12:57   they might say like, oh,

00:12:59   a thickness of this amount of millimeters

00:13:01   will make sure that you won't need to adjust it even.

00:13:05   You know?

00:13:06   - Discord pointed out the SE 2 and 3

00:13:09   also have that button, which is correct.

00:13:11   And on those phones, when you set up,

00:13:13   it puts that option in the setup,

00:13:16   but that's I think still different

00:13:17   than what you're talking about, so.

00:13:19   - Again, it's not necessary.

00:13:21   It's just like, here's the thing you can do,

00:13:23   so we're letting you do it.

00:13:23   It's not the same as like, you might have to do it.

00:13:27   - Maybe they're gonna do something like,

00:13:28   cases have NFC in them, or who knows?

00:13:30   There could be a lot of cool things they could do here.

00:13:33   I hope that this is not though,

00:13:35   an attempt to lock down the case market.

00:13:38   I think that would be a mistake.

00:13:40   [sound effect]

00:13:40   Now you know what that means. Just a short sound, it's administration.

00:13:44   Unless you hear the, you have to hear the whole sound for the quizzes to begin.

00:13:48   That is just an administration sound for the quizzes.

00:13:51   It's quizzes administration.

00:13:53   So people have asked, when I did the "Do the Passionate Ones Know You" quiz a few weeks

00:14:01   ago for the two of you, people wanted to know, like, what about me?

00:14:05   Like, what do the passionate ones think my favorite movie is, or my favorite band?

00:14:10   So there is a form in the show notes where you can go and fill out "Do the Passionate Ones Know

00:14:17   Mike," and you can give your answers for what you think are my favorite XYZ. This is going to form

00:14:24   our member special for this year. So, and I've mentioned this before, quizzes can be in members'

00:14:32   content, and so the member special that we're going to do later on this year will be the

00:14:37   quizzes for this, and we will follow up later on with what the point scoring was and how

00:14:42   that affected things. So you can go and fill that out, and later on in the year, I think

00:14:47   maybe in the next couple of months, we'll be recording our member special and seeing

00:14:51   how well the passionate ones know me, and how Stephen and Federico think you're going to

00:14:58   answer these questions.

00:15:01   This episode of Connected is brought to you by ExpressVPN.

00:15:05   Watching Netflix without using ExpressVPN is like playing your favorite game and not

00:15:09   using all the power-ups.

00:15:10   Why limit yourself when there's so much fun to be had?

00:15:13   Netflix has thousands of shows across different countries, but without a VPN you only get access

00:15:17   to a limited selection based on your location.

00:15:20   With ExpressVPN you can unblock those shows by amending where it thinks you're located,

00:15:24   and it works on a bunch of other streaming services too.

00:15:27   This means that you can do things like watch The Office, the US version, on Canadian Netflix,

00:15:33   or Lord of the Rings on Netflix in Turkey.

00:15:37   Lots and lots of options.

00:15:38   Back to the Future, Mike's and my joint favorite movie, I think, on Canadian Netflix.

00:15:43   They got it going on in Canada is what I'm learning here.

00:15:46   With ExpressVPN it's just one click.

00:15:48   You open the app, choose the country, and you're off to the races.

00:15:53   There's a bunch of other reasons to choose ExpressVPN. One

00:15:55   of my favorites is the blazing fast speed. You can stream video, do your calls,

00:16:00   whatever you need over ExpressVPN. It is super fast and it's compatible with all

00:16:06   your devices, phones, laptops, media consoles, smart TVs, and more. They have

00:16:11   servers in over 94 different countries, so you can gain access to thousands of

00:16:15   new shows and it works with other streaming services like BBC iPlayer,

00:16:19   YouTube, and more. Stop paying full price for streaming services and only getting

00:16:23   access to a fraction of their content. Get your money's worth at ExpressVPN.com/connected.

00:16:30   That's ExpressVPN.com/connected to get an extra three months of ExpressVPN for free.

00:16:38   Our thanks to ExpressVPN for their support of the show and Relay FM.

00:16:42   So Automation April has begun, which is, we spoke about this before, it is a, did you say a contest?

00:16:49   Well, it's an event that also includes a contest, yes.

00:16:55   Automation April spans across many things in the MacStories Extended Universe, but

00:17:00   also has the Automation April contest where Federico and a panel of judges will judge

00:17:05   your shortcuts for their amusement.

00:17:11   And what they're looking for is the best possible shortcuts.

00:17:15   So Federico Viticci comes along and flexes on everyone by creating a shortcut that is

00:17:22   ChatGPT inside of a shortcut. Like what is this? Day one of Automation April? You're

00:17:28   like "thou shalt not pass!"

00:17:30   Day three.

00:17:31   Day three. Oh, you gave him two days. That's good of you.

00:17:34   Day three. Yes. Well, obviously this is not done to flex on people, as you say. It's

00:17:43   - It's just, it's part of Automation April, which,

00:17:45   in addition to the contest, in addition to the podcast,

00:17:48   has an editorial component where we write articles

00:17:52   and make shortcuts.

00:17:54   And this just happens to be a really special shortcut

00:17:57   that I've been working on for kind of the past month or so.

00:18:02   And I really wanted to, because it's special,

00:18:05   I wanted to save it for a special occasion,

00:18:07   like a fancy suit, you know, that's,

00:18:11   So I put on the suit today for Automation April.

00:18:15   Yes.

00:18:16   Okay, so what do you wanna know?

00:18:17   - From a content planning perspective,

00:18:18   this is a pretty epic shortcut.

00:18:21   You should use it now, not earlier, right?

00:18:23   Or later.

00:18:24   Like I get, you just like, there's gonna be,

00:18:26   you'll get to say like, hey, it's Automation April,

00:18:28   while people are reading about a shortcut

00:18:30   that I assume is probably gonna get linked

00:18:32   all over the place, I would assume, right?

00:18:35   - Yeah, it's going really well so far.

00:18:37   So yes.

00:18:38   So what does SGPT do?

00:18:43   Like what is it doing in a nutshell?

00:18:45   - So in a nutshell, it's a shortcut

00:18:47   that lets you have conversations

00:18:49   with OpenAI's ChatGPT assistant.

00:18:53   It lets you have back and forth conversations,

00:18:56   but there's a bunch of shortcuts like that.

00:18:58   We've seen, you probably have seen on Reddit

00:19:01   and other places, other shortcuts

00:19:03   to have a ChatGPT conversation

00:19:06   in the shortcuts UI or inside Siri.

00:19:09   My shortcut does that, but it also does other things.

00:19:14   Meaning in addition to conversations,

00:19:21   it lets you integrate ChatGPT with iOS,

00:19:26   iPadOS, and macOS directly,

00:19:30   in different places of the OS.

00:19:30   So it's in addition to the conversations,

00:19:32   it comes with support for native functionalities of your computer. There's a small list of

00:19:40   integrations right now. There's more on the way, which we can talk about in a couple of

00:19:45   minutes. My idea was, I like ChatGPT. Before the Italian government banned it, I was a

00:19:53   ChatGPT+ subscriber. And then I got my refund.

00:19:57   I'm only tangentially aware of this.

00:19:59   - Yeah.

00:20:00   - Why has the Italian government banned ChatGPT?

00:20:04   - I can curse on this show, right?

00:20:05   - No.

00:20:06   - Allegedly due to privacy concerns

00:20:09   because of the data breach that they had a few days ago.

00:20:15   And so the Italian privacy authority wants to know more

00:20:20   about how the data of Italian users and customers

00:20:23   is being used by OpenAI.

00:20:25   Something like that.

00:20:26   I had no idea OpenAI had a data breach.

00:20:28   I didn't even know that was a thing.

00:20:30   - Anyway, I'm a fan of ChatGPT, what you can do with it.

00:20:34   But I think ChatGPT, its true potential lies

00:20:39   in how you can give it your data

00:20:43   and basically get another super smart brain

00:20:49   to assist you with your stuff.

00:20:52   Like, yeah, it's fine.

00:20:53   You can ask ChatGPT trivia questions

00:20:56   or you can ask for recipes, or you can ask for, I don't know,

00:21:00   travel plans.

00:21:01   I mean, we've all tried that.

00:21:04   But in many ways, I think that the true power

00:21:07   lies in getting to know--

00:21:11   making it-- putting it to work on data

00:21:15   that you provide to ChatGPT.

00:21:19   And also, the power lies-- and I'm

00:21:21   going to get to this in a minute, to teach ChatGPT how to return data in a specific format.

00:21:31   So what I did was I started thinking about this and I thought, well, I have the shortcut

00:21:35   that lets you have conversations with ChatGPT and you can ask it for dinner ideas or you

00:21:42   can ask it for, you know, "Oh, tell me more about the country of, I don't know, Argentina"

00:21:49   something like, yeah, you can ask all these questions, but what if I figured out a way

00:21:53   to make stuff happen on your computer based on GPT's answers and data?

00:22:00   So, in this first version of SGPT, which stands for Shortcuts GPT, or Siri GPT I guess it also

00:22:09   works because it's a shortcut so you can use it manually in the Shortcuts app or you can

00:22:15   invoke it via Siri. Totally works. In this first version you can do a few things that are

00:22:20   native integrations, right? You can, for example, access the contents of your clipboard.

00:22:29   So by using the trigger word clipboard you can do whatever you want. As long as you use the word

00:22:36   clipboard in your query, SGPT will take whatever text is in your clipboard and give it to ChatGPT,

00:22:45   the cloud service, which will respond with an answer. So for example, if you fire up the

00:22:51   shortcut and you say "summarize my clipboard", the first time you do it you'll get a permission prompt

00:22:59   to give it access to your clipboard and then you get a summary of whatever is in your clipboard.

00:23:04   Or for example, something that I like to do is, maybe you just wrote a blog post in Obsidian

00:23:10   or in whatever text editor you like.

00:23:14   You copy the text and you can ask something like, "Check my clipboard for grammar mistakes

00:23:20   and tell me what they are and give me suggestions."

00:23:24   You run that command, you wait a couple of seconds and you get back a detailed response

00:23:29   by SGPT saying, "Okay, I took a look at the text in your clipboard.

00:23:33   Here are the grammar mistakes that I found."

00:23:36   So there's clipboard integration.

00:23:38   There's an integration with Safari.

00:23:41   So you can summarize web pages from Safari, which I will improve.

00:23:46   This feature I want to improve in the next update, which should come up pretty soon.

00:23:51   You can also open URLs.

00:23:54   So for example, in ChatGPT, sometimes you ask a question and then you can follow up

00:23:57   and say, "Hey, can you actually give me a URL source for this?" And if the shortcut

00:24:04   sees that ChatGPT responded with a URL, it opens that link in Safari. You can... what

00:24:12   else do I have? You can... oh, there's a sort of an assistant for your reminders and your

00:24:20   calendar events. So if you say to SGPT, "Can you help me with my schedule?" SGPT will take a look

00:24:31   at your upcoming reminders, your upcoming calendar events. Again, there's permission prompts for all

00:24:37   of this, and ChatGPT will tell you something like, "Okay, I took a look in your calendar.

00:24:42   I see that you're a little overbooked for Wednesday because you have five reminders and

00:24:47   five calendar events. Maybe you should consider moving some to Thursday when you seem to be

00:24:52   more free. Something along those lines. So you can take a look at your agenda and basically

00:24:56   give you suggestions. But the real big integration in this first version is the Apple Music integration.

00:25:06   So one of the things that you can, one of the many things that you can ask ChatGPT is

00:25:12   Can you give me a list of songs based on whatever criteria you want to ask?

00:25:20   So you can go on the ChatGPT website and say, "Can you give me the most popular songs by

00:25:26   Arctic Monkeys?"

00:25:28   Or you can do, "Can you give me a list of 20 emo songs released between 2005 and 2010?"

00:25:36   Or you can say, "Give me the top 20 songs by Oasis, Blur, and My Chemical Romance."

00:25:42   Or you can get real fancy and be like, "I'm in the mood for something nostalgic, and I

00:25:48   want to listen to some indie folk songs from the past 10 years.

00:25:55   Can you give me a list of 15 and sort them in alphabetical order?"

00:25:59   ChatGPT will understand all of this, and it'll give you a list of songs.

00:26:04   So I started thinking about this and I was like, "Hmm, what if I figured out a way to

00:26:10   get that list of songs and actually make a playlist in the real music app, like in the

00:26:19   native music app on your device?"

00:26:23   So this feature works in this first version of SGPT, and there's a couple of fascinating

00:26:33   technical aspects behind it. So to make this happen, the shortcut, SGPT, teaches

00:26:42   ChatGPT, so the main service, how to return this list of songs in a specific format. Like it tells

00:26:55   it, this is my query, I want you to return the following list of songs, such as the most

00:27:03   popular Oasis songs, but when you return this data, it should be in this specific text format.

00:27:13   You need to give me the name of the artists first, there has to be a separator between

00:27:18   them and there has to be the name of the song. And there's literally a text prompt, which,

00:27:25   if you want, I can read it back to you, which is basically like a set of instructions for

00:27:31   ChatGPT to say, "Okay, so I see that the user wants to have data in this specific format."

00:27:39   The instruction is, let me find it, so there's your query and then it says, "These songs

00:27:47   must be real and not a product of your imagination.

00:27:50   They must be available on either Spotify or Apple Music.

00:27:53   YouTube alone is not enough.

00:27:55   The playlist should be a list of text

00:27:58   with each line in this format.

00:28:00   And there's the format that I'm using.

00:28:02   And then I explain, that's the name of the artist

00:28:05   for the song, followed by a space,

00:28:06   followed by a separator, followed by a space,

00:28:09   and so and so.

00:28:10   And then it says, your response should only be

00:28:13   the list of text with no additional sentences.

00:28:16   Like I am literally asking, I am literally telling ChatGPT,

00:28:21   do this in this very specific format in natural language.

00:28:25   - Why? - Yeah.

00:28:27   - Why can't you ask it that, what you just said?

00:28:30   Like, you know what I mean? - I asked it that.

00:28:33   I asked it that, but the reason is you wouldn't want

00:28:36   to ask this yourself every single time, right?

00:28:40   So the shortcut--

00:28:41   - No, what I mean is, so sorry,

00:28:43   like I didn't ask it very clearly, right?

00:28:45   The way that you're asking it is like, give me this list and give it in this way.

00:28:53   Why do you have to be so specific with it?

00:28:56   Why can't you just say what you just said to me then of like, just give me this natural

00:29:00   language?

00:29:01   You know what I mean?

00:29:02   Because my shortcut needs to parse the response in a specific way.

00:29:07   Otherwise, there's going to be extra characters or something that I cannot anticipate.
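The parsing step Federico describes can be sketched in a few lines of Python. The exact separator isn't read out on the show, so the " - " below is an illustrative assumption:

```python
# Why the strict format matters: a fixed "Artist<sep>Song" line
# layout can be split mechanically. The actual separator SGPT
# uses isn't quoted in the episode; " - " here is an assumption.
SEPARATOR = " - "

def parse_playlist(response_text):
    """Turn a line-per-song reply into (artist, song) pairs."""
    songs = []
    for line in response_text.strip().splitlines():
        if SEPARATOR not in line:
            continue  # skip any stray prose the model added anyway
        artist, song = line.split(SEPARATOR, 1)
        songs.append((artist.strip(), song.strip()))
    return songs
```

A reply with extra sentences wrapped around the list would defeat this split, which is why the prompt insists the response be only the list of text with no additional sentences.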

00:29:15   And so this, I guess, the real trick here is in the fact that I have built a hybrid

00:29:20   shortcut. It's a bridge between ChatGPT and your device, whether it's an iPhone or iPad

00:29:27   or a Mac. ChatGPT is the brain in the cloud that returns information. SGPT is the local

00:29:34   shortcut that takes that information and based on what it is, makes stuff happen on your

00:29:40   device. In this case...

00:29:41   I want you to help me with something here, right?

00:29:43   Because this bridging part is the part that is very confusing to me.

00:29:48   Yes, yes.

00:29:49   How can you give it an image and say use live text?

00:29:54   Like, I don't understand.

00:29:56   Yes.

00:29:56   And like, because I assume you can then say like,

00:29:58   use live text and like, summarize this.

00:30:01   How is that?

00:30:02   I don't understand how this is all doing this.

00:30:06   So, Shortcuts has a lot of system integrations, right?

00:30:09   It has a lot of system actions. They are local to your device. ChatGPT, the cloud service

00:30:17   by OpenAI, doesn't have any hooks into your device, right? It's just an API in the cloud

00:30:23   that accepts text and returns text. But you can use shortcuts as the glue in the middle.

00:30:30   So all of this, what you were wondering about, works with...there's a function. There's a

00:30:37   little module, which is the second shortcut that you need to install, right? You installed

00:30:41   it. It's called sgpt_encoder. That's a little function, essentially like a little submodule

00:30:50   that you never see, in fact. It runs behind the scenes and it checks. Every time you ask

00:30:59   something, that little function kicks in and basically checks your question and says, oh,

00:31:06   did the user ask the word clipboard? Did the user ask the word live text? Or did they ask for a

00:31:13   schedule? Or did they ask for a playlist? It checks your query and depending on what you ask, it runs

00:31:19   a different set of actions. So for the live text example, if you say the words live text, it lets

00:31:26   you pick a photo, that's a local action of shortcuts. That's the select photos action.

00:31:31   You select a photo and then it runs the Apple-made extract text from image action.

00:31:38   That's the live text. It's built into shortcuts. As long as you end up with some text

00:31:45   that you can eventually provide to the ChatGPT API, it doesn't matter.

00:31:53   Basically, all these actions happen before making the call to the ChatGPT API.

00:32:00   And that's why it's a hybrid approach. It runs some local actions before eventually invoking

00:32:09   the real ChatGPT and saying, "Okay, here's the text. Do whatever I'm asking you to do."
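The hybrid flow described here, a local trigger-word check first, then plain text to the API, can be sketched like this. The local-action bodies are placeholders standing in for Shortcuts' built-in actions, which have no real Python equivalent:

```python
# Sketch of the encoder pattern: scan the query for trigger
# words, run a local step if one matches, and only then hand
# plain text to the ChatGPT API (which sees text in, text out).

def run_local_actions(query, clipboard=""):
    q = query.lower()
    if "live text" in q:
        # Stand-in for Shortcuts' Select Photos +
        # Extract Text from Image actions.
        return "<text extracted from the chosen photo>"
    if "clipboard" in q:
        # Stand-in for Shortcuts' Get Clipboard action.
        return clipboard
    return ""  # no local step needed

def build_prompt(query, clipboard=""):
    extra = run_local_actions(query, clipboard)
    # Whatever the local step produced is appended as plain text.
    return f"{query}\n\n{extra}" if extra else query
```

As long as the local step ends in text, the API never needs to know a photo or a clipboard was ever involved.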

00:32:15   So for the live text, it uses the live text action. For the clipboard, of course, shortcuts can read

00:32:21   the contents of your clipboard. For the Apple Music one, it's a bit trickier because it needs to

00:32:28   assemble a playlist in between the response from ChatGPT and your next question. And of course,

00:32:38   Shortcuts doesn't have real Apple Music actions. So I'm still using, unfortunately, the iTunes

00:32:45   store actions as a fallback. Oh my god. How does that even work? Like, I don't even understand how

00:32:53   those things still talk to each other. The catalog, I have no idea why Apple doesn't have a real Apple

00:33:00   Music web search action. I really hope that changes in iOS 17. But yeah, so the gist of it all is

00:33:08   we've seen a bunch of shortcuts that let you have conversations with ChatGPT. This one has that.

00:33:14   It's holding the entire conversation.

00:33:18   In fact, this initially was the tricky part

00:33:22   of building this.

00:33:24   Like this shortcut works because it's got a repeat loop

00:33:29   inside of it that repeats a series of actions

00:33:35   a thousand times.

00:33:37   Like literally it says repeat 1000 times.

00:33:41   Because you gotta be able to ask follow-up questions, right?

00:33:44   That's the whole point of ChatGPT.

00:33:47   And so what was tricky was,

00:33:49   how do you store a conversation in shortcuts?

00:33:53   And that conversation grows longer and longer

00:33:55   and longer over time.

00:33:56   - Yeah, 'cause just for clarification,

00:33:58   the way that these models work

00:34:01   is like every time a question is asked,

00:34:03   for it to continue,

00:34:05   like it feels like it's having a conversation,

00:34:07   you need to send the question and answer

00:34:09   for every question that has already been asked.

00:34:13   every time you contact the ChatGPT API, you give it the updated transcript of the entire conversation.

00:34:24   So every time it looks at the entire thing and it looks at the latest item, it's like "Okay,

00:34:32   I will now give you a new answer." So that's the way that it works. And this is the new ChatGPT

00:34:38   API that came out like a month ago. It's not the old text completion API that was available before,

00:34:47   this is the new one which is super cheap and very easy to program.
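The resend-everything pattern can be sketched as follows; the endpoint and payload shape match OpenAI's chat completions API, but the HTTP call itself is stubbed out here:

```python
# Each turn appends to one growing list of role-tagged messages,
# and the whole list is sent on every call; nothing is stored on
# the server side between calls.
API_URL = "https://api.openai.com/v1/chat/completions"

def make_payload(messages, model="gpt-3.5-turbo"):
    return {"model": model, "messages": messages}

messages = []  # the in-memory transcript

def ask(question, get_answer):
    """Append the question, 'call' the API, append the reply."""
    messages.append({"role": "user", "content": question})
    answer = get_answer(make_payload(messages))  # stand-in for the POST
    messages.append({"role": "assistant", "content": answer})
    return answer
```

Because the full transcript rides along on every request, the payload grows with the conversation, which is exactly the state the shortcut has to keep around while it runs.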

00:34:51   Where does it store the information during the conversation?

00:34:55   Another difference from other shortcuts that I've seen, I've seen shortcuts that every time you run

00:35:00   them they store the conversation in a text file or somewhere, like they actually save your

00:35:07   a conversation somewhere, like in storage.

00:35:11   My shortcut doesn't do that.

00:35:12   It's all dynamic and it's all stored in variables

00:35:17   that exist when the shortcut is running and disappear

00:35:20   when the shortcut is no longer running.

00:35:22   It's all--

00:35:23   - I thought you were gonna say base64.

00:35:25   I thought that was where we were leading.

00:35:27   - No, no, no, no, absolutely not.

00:35:29   It's all in variables.

00:35:32   I only save the transcript

00:35:35   if you ask the shortcut to do that.

00:35:37   Right. So that thousand loop that you're saying, it's just a number you've put in.

00:35:42   It's just a large number.

00:35:44   But if somebody did a thousand and one...

00:35:49   No, the shortcut would break.

00:35:51   Right. That's good.

00:35:52   The shortcut would break before.

00:35:53   Which is obviously like, that's not going to happen.

00:35:54   It would run out of memory before.

00:35:55   Yeah. It's not going to happen.

00:35:56   It would run out of memory before.

00:35:57   Yeah.

00:35:58   Like the whole iPhone just like catches on fire.

00:36:02   Yeah. So, okay. So I don't know.

00:36:05   What else do you want to know?

00:36:06   - So like this is, it is wild to me that you're,

00:36:10   I understand what you have said

00:36:14   about how you tie it into the system actions.

00:36:19   It still feels a little unbelievable.

00:36:22   Like I feel like it's so, it is so weird.

00:36:25   Like it's weird.

00:36:27   And so from what you've explained to me,

00:36:29   all of that stuff is happening in the other shortcut.

00:36:32   like that's passing every query

00:36:35   to check for some trigger phrases

00:36:38   that then indicate that something needs to happen.

00:36:41   Is it before the text is then sent to the API?

00:36:45   - Yeah.

00:36:46   - Okay, so it's, for example, if you say,

00:36:49   I want my schedule,

00:36:50   this is running the native shortcut action

00:36:54   for get my calendar events.

00:36:55   - Yes.

00:36:56   - And is including that information in the,

00:37:00   do you have to then format the text in a certain way

00:37:03   to pass it to the API?

00:37:05   Okay, so say like, this person has asked summarize,

00:37:09   da da da da, here are my,

00:37:12   here are the events that you are summarizing.

00:37:14   - That's exactly what's going on.

00:37:16   - Yes. - That's wild.

00:37:18   - And the other thing like on that,

00:37:20   there is a thing that you reference in the blog post

00:37:24   where like you have to do this like

00:37:26   weird opening kind of gambit.

00:37:29   incantation. With the API? Yeah, to say like, here's who I am, here's what this is, here's

00:37:34   what you are. Does that happen at the beginning of every shortcut interaction? Like the beginning

00:37:42   of it, right? That's called a system message in the ChatGPT API. The system message is

00:37:49   something you can use to program a certain behavior for the model. And in this case,

00:37:55   I wanted to program ChatGPT to be informative, helpful, to refer to itself as SGPT.

00:38:06   The system message says, "You are SGPT and you were created by Federico Viticci as a

00:38:13   fork of ChatGPT that runs inside shortcuts."

00:38:17   This is who you are, and then it says, "This is what you're supposed to do."

00:38:21   What you're supposed to do is provide short but informative answers that users must be

00:38:28   able to listen to in 15 seconds or less.

00:38:31   Do not go longer than that unless I ask you to be more detailed.

00:38:36   That is literally how you program the behavior of the model.
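In the API, that opening instruction is just one more entry in the messages list, tagged with the system role. The wording below is paraphrased from what's read on the show, not the exact prompt:

```python
# The system message rides along as the first element of the
# same messages array; everything after it is the conversation.
system_message = {
    "role": "system",
    "content": (
        "You are SGPT, created by Federico Viticci as a fork of "
        "ChatGPT that runs inside Shortcuts. Provide short but "
        "informative answers that can be listened to in 15 seconds "
        "or less, unless the user asks for more detail."
    ),
}

def start_conversation(first_question):
    return [system_message, {"role": "user", "content": first_question}]
```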

00:38:41   And my reasoning for this was, if people are going to be using this in Siri, you want to

00:38:45   be able to have concise information that you can listen to from Siri in just a few seconds.

00:38:52   But if you want to have the option to get, like, to be more detailed about a topic, or

00:38:57   like, okay, I need you to be more detailed about this specific thing, like, give me all

00:39:02   the steps to a recipe, for example, it needs to be able to do that.

00:39:07   And if you don't program your, the behavior of the model in a certain way, you will reach

00:39:14   some walls, like the model will just refuse to do something for you.

00:39:19   It's weird when you see it in action.

00:39:22   One of the things that I will release in Mac Stories Weekly this Friday is a set of personalities

00:39:27   that I've created, like alternative personalities that I've created for SGPT.

00:39:32   One of them is an absolute...

00:39:34   You guys have seen screenshots of this.

00:39:36   One of them is an absolutely evil personality that is actually much, much worse than CARROT

00:39:43   Weather and actually insults you, which I think is quite funny because it's like extremely

00:39:49   unkind. And there's another that is the sort of a digital personification of Steve Jobs.

00:39:57   So it talks like Steve Jobs and drops Apple references everywhere, even if it's not necessary.

00:40:05   Oh, this is like I'm going to install this one. This sounds great.

00:40:10   For example, today there's a screenshot, let me show you this, there's a screenshot where

00:40:16   I asked the Steve Jobs personality to describe ancient Rome or something like that.

00:40:24   And the Steve Jobs personality said, let me read this to you because I thought it was

00:40:30   funny.

00:40:31   Rome was founded in 753 BC by its first king, Romulus.

00:40:36   It would grow into a rich and powerful city throughout the next few centuries, similar to how Apple

00:40:42   has grown over the years. The Roman Empire left an unparalleled legacy in terms of art,

00:40:49   architecture and law. The Colosseum, one of its most astonishing creations, could hold

00:40:55   up to 50,000 spectators, almost like a modern Apple event. It's like...

00:41:02   That is incredible. I love that.

00:41:04   So anyway, in weekly I will release the multiple personalities, just if you want to have some

00:41:09   fun with it.

00:41:12   And then I will do a proper making of, like from a technical perspective on the automation

00:41:17   academy, I will do a making of because it's very technically involved, even though it

00:41:23   looks like, oh, it's just a bunch of natural language prompts.

00:41:27   But the difficulty was in sort of making all of this happen in Shortcuts with the limited

00:41:33   tools that you have in Shortcuts.

00:41:36   And lastly, I have a question for you.

00:41:37   Yes, go for it.

00:41:39   I feel like I know the answer to this, but I need to ask it anyway.

00:41:42   Does the shortcut have feelings?

00:41:46   Is it possible?

00:41:47   Well, that, but like, could it make changes to things without me wanting?

00:41:54   There's no way it could do that, right?

00:41:56   Because the shortcut, unless you give it the actions, it can't do it, right?

00:42:02   Not in this version. So right now, no, it cannot make any changes on your device.

00:42:08   But in theory, you could have an encoder that could, right? Like in the same way that you have the encoder that looks for words,

00:42:19   you could similarly have one that takes what the AI passes back. And like when it says to me, you should move calendar events, like,

00:42:30   In theory, there could be some actions at least where you could pass that and put it through as a variable and have it do things.

00:42:37   So you have touched upon the very thing that I was going to say.

00:42:43   For version 1.1, one of the things that I will enable, I've already started working on it.

00:42:49   If you run this on macOS, if you run SGPT 1.1 on macOS,

00:42:55   I should be able to make it run code on your computer.

00:43:02   Mmm.

00:43:03   [LAUGHTER]

00:43:05   I don't like it.

00:43:06   [LAUGHTER]

00:43:09   Let me continue. Let me continue. Let me continue.

00:43:13   There will be multiple layers of permission prompts.

00:43:17   It will not do that without your consent.

00:43:21   like I will literally put in a box that says "this is what's about to run, do you want to run it?"

00:43:27   And that's also like how shortcuts work, like you need to confirm the step. But one of the

00:43:34   really powerful things about ChatGPT is that you can ask it for code, you can ask it for

00:43:42   terminal commands, you can ask it for AppleScript, you can ask it... And I know developers,

00:43:48   I know developers today, developers we know who make apps we use, without naming names,

00:43:55   I know developers who do this, who get assistance from ChatGPT for SwiftUI code or Swift code.

00:44:01   That's no problem. I mean, this is what that GitHub Copilot thing has been.

00:44:05   Exactly.

00:44:06   Very good, right? I don't understand why people get so overwhelmed about that.

00:44:09   So, for example, you can already go on the ChatGPT website and say, "Hey, I need an

00:44:14   AppleScript that takes a bunch of files in a folder and renames them sequentially.

00:44:21   Can you give me that script?" And you can go on the ChatGPT website, copy that script,

00:44:26   paste it into the AppleScript editor on your Mac and run it, and it'll likely work. Why not

00:44:33   make it easier? Like, why not make it happen contextually in Shortcuts? Shortcuts has actions

00:44:38   for the terminal, has actions for AppleScript. You could be like, "Hey, I need some code

00:44:44   for this folder, maybe the folder is itself a variable in Shortcuts, can you give me an

00:44:51   AppleScript or can you give me a terminal command to do this? You will have to verify what you

00:44:56   received, you will have to confirm what you received, but then the code will just run

00:45:01   without having to do the copy and paste, without having to do the open and different application

00:45:07   for it. Another thing I want to do is what you also suggested, Mike: what about rescheduling

00:45:13   reminders, what about rescheduling things? Like, in addition to like telling me how my

00:45:18   agenda is wrong, can you also actually help me make more sense of it? And there's also

00:45:23   one of the other things I'm looking into. There's a demo that I already have on my computer

00:45:29   of SGPT processing tasks from Todoist in addition to reminders. So I got that working as well.

00:45:37   But yeah, the big one I think will be actually run,

00:45:41   like I need some code to do something on my Mac.

00:45:46   Can you give me that code and run it?
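The sequential-rename task used as the running example would look something like this written directly in Python, as a stand-in for the AppleScript or terminal command ChatGPT might return (names here are illustrative):

```python
# Rename every file in a folder to prefix-001, prefix-002, ...
# keeping each file's original extension.
from pathlib import Path

def rename_sequentially(folder, prefix="file"):
    files = sorted(p for p in Path(folder).iterdir() if p.is_file())
    renamed = []
    for i, path in enumerate(files, start=1):
        target = path.with_name(f"{prefix}-{i:03d}{path.suffix}")
        path.rename(target)
        renamed.append(target.name)
    return renamed
```

Wired into Shortcuts, the folder would arrive as a variable and a confirmation prompt would gate the run, the safeguard being discussed here.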

00:45:48   So yeah, I am approaching this very carefully, obviously,

00:45:51   because like nobody wants ChatGPT

00:45:55   to just be able to arbitrarily run code

00:45:58   on your computer or like imagine a scenario

00:46:01   in which it disobeys your orders

00:46:03   and instead of sending you back a playlist,

00:46:06   sends back the word code and it runs some code you don't want.

00:46:10   Like obviously the shortcut will have to be designed with safeguards in place.

00:46:15   And that's what I also think is ultimately safe about this: ChatGPT will never

00:46:22   be able to alter the code of the shortcut on your machine because of how the system

00:46:27   is designed with sandboxing on iOS and macOS, like it's got sandboxing all over

00:46:33   the place.

00:46:34   For example, there used to be a time years ago when you could make a shortcut

00:46:40   just out of pure XML and just convert that XML code into a shortcut. These days if you want to

00:46:47   do it, you've got to sign that shortcut with your developer identifier on macOS. So like it's not as

00:46:54   loose as it used to be years ago, which is why it's ultimately safe, but still

00:47:01   when I do that, once I will unlock that integration,

00:47:05   like, "Yay, run terminal commands

00:47:08   and run AppleScript code on the user's machine,"

00:47:10   there will have to be confirmation steps.

00:47:13   It's the only way.

00:47:15   Otherwise it will be kind of uncomfortable.

00:47:17   - Dangerous.

00:47:18   - Dangerous, yes, yes.

00:47:20   - My favorite thing about this is that ChatGPT,

00:47:23   right, this like newfangled technology

00:47:26   can write AppleScript, which is ancient

00:47:29   and somehow still hanging on.

00:47:31   And there's something about that that makes me laugh.

00:47:34   - I mean, it's kind of the beauty of it really.

00:47:37   It's like, it's, AppleScript is so old,

00:47:39   there's so much of it on the web.

00:47:40   So like, it has no, it will have no problem with it, right?

00:47:44   Like it's just like, it can just do it.

00:47:46   - This episode of Connected is brought to you by Indeed.

00:47:49   When it comes to hiring, you need to trust your gut.

00:47:52   But what if you could give your gut some help?

00:47:54   When you want to find quality talent fast, you need Indeed.

00:47:58   Indeed is the hiring platform where you can attract,

00:48:00   interview and hire all in one place. Don't spend hours on multiple job sites

00:48:05   looking for candidates with the right skills,

00:48:08   when you can do it all with Indeed. Find top talent fast with Indeed's suite of

00:48:13   powerful hiring tools like matching, assessments, and virtual interviews.

00:48:17   If you hate waiting, Indeed's US data shows that over 80%

00:48:21   of Indeed employers find quality candidates

00:48:24   whose resume on Indeed matches their job description the moment they sponsor a job.

00:48:29   Indeed Match really is incredible. Candidates you invite to apply through instant match are three

00:48:34   times more likely to apply to your job than candidates who only see it in search according

00:48:39   to US Indeed data. Join over 3 million businesses worldwide who are already using Indeed to hire

00:48:45   great talent. Indeed knows that when you're growing your business you have to make every

00:48:48   dollar count. That's why with Indeed you only pay for quality applications that match your

00:48:54   must-have job requirements.

00:48:56   Visit indeed.com/connected to start hiring now.

00:49:00   That's I-N-D-E-E-D, indeed.com/connected.

00:49:05   Terms and conditions apply,

00:49:07   cost per application pricing is not available for everyone.

00:49:11   Need to hire?

00:49:12   You need Indeed.

00:49:13   Our thanks to Indeed for supporting the show and Relay FM.

00:49:17   - So I think just in hearing you talk about this Federico,

00:49:21   the elephant in the room is Sherlock Holmes, right?

00:49:25   Like you,

00:49:26   you like just hearing you talk about this.

00:49:31   - Watson, bring the AI.

00:49:33   - This should be, this should be what Siri is, right?

00:49:39   Like it's not hard to see that.

00:49:41   Like this is what we wanted Siri to be, right?

00:49:45   You should be able to have a natural conversation with Siri

00:49:49   and have it communicate to you things about your schedule

00:49:53   in a way that seems natural,

00:49:55   and then do the things that you're hoping to do

00:49:57   of like have it move things around for me

00:50:00   and maybe even take into account the tasks

00:50:03   that I have available to me on that.

00:50:05   Do you know what I mean?

00:50:06   Like all of this stuff should be

00:50:08   what a digital assistant should be.

00:50:12   And I think there is a ticking clock, right,

00:50:15   on is it Google or Amazon that implement

00:50:20   a large language model into their assistants first?

00:50:24   Like which one is it gonna be, right?

00:50:25   'Cause this just feels like the natural evolution

00:50:30   of these assistants, right?

00:50:32   Was it the Wall Street Journal who wrote an article

00:50:34   about like just how stupid like Siri and Google

00:50:39   and Alexa are, right?

00:50:40   Like when you compare it to these things

00:50:43   that like now our home assistants just feel like

00:50:47   they have no brain, right?

00:50:50   So I guess what I wanted to get from you two

00:50:53   is a sense of like, one, do you think Apple should

00:50:57   have Siri become a large language model?

00:50:59   And two, how long until these features

00:51:07   that you're creating, Federico, could get Sherlocked by Apple?

00:51:07   - Personally, I think it's not that they should,

00:51:10   they have to at this point.

00:51:13   They have to respond to this.

00:51:19   Because when you compare Siri and ChatGPT,

00:51:19   the difference is just embarrassing at this point.

00:51:23   And you see, this is no longer a techy thing.

00:51:30   Regular people know about and use ChatGPT.

00:51:34   You see kids in school using it.

00:51:36   It's why so many schools are banning ChatGPT,

00:51:38   because it's writing essays, and it's helping students there.

00:51:42   And there's actually a very practical example

00:51:45   in the article that I put out today

00:51:48   when I asked ChatGPT to give me a list of songs

00:51:52   from the members of boygenius.

00:51:54   boygenius is a supergroup, so-called,

00:52:00   comprised of Phoebe Bridgers, Julien Baker, and Lucy Dacus.

00:52:05   And ChatGPT gave me a list of singles

00:52:08   by the three artists who are part of the boygenius supergroup. When I asked Siri, Siri

00:52:15   had no idea. And music should be one of Apple's strong domains in theory, right? So that's

00:52:23   because like, ChatGPT is powered by a large language model that can go multiple layers

00:52:30   deep into the meaning of a query, right? It can actually understand what you mean. And

00:52:36   the neural network that's behind it can make, I don't want to say infinite levels

00:52:42   of connections, but it goes, it's like the world's biggest AI onion, if you will. Like

00:52:48   it goes layer after layer after layer to understand what you mean. And when you compare that to

00:52:55   Siri, the difference is just embarrassing. So my answer is they will have to respond.

00:53:01   My concern is how long will it take them to respond?

00:53:06   That's exactly my question.

00:53:08   When you see a lot of what's coming out now, including the incredible thing that you've

00:53:12   built, it's standing atop of open AI's work.

00:53:16   I don't see Apple doing that.

00:53:17   I see Apple wanting to go it alone.

00:53:21   And that's going to take years.

00:53:23   Well, and also, you would assume Apple probably want to run on device if they're going to

00:53:28   do this?

00:53:29   Yes.

00:53:30   And how good can the model be?

00:53:34   if it has to be small enough to run on device, right?

00:53:38   Because, I mean, I know we're in it,

00:53:41   really we are in the infancy of this technology

00:53:43   at the moment, but like all of the small models

00:53:46   are just not as good, right?

00:53:49   So like this is one of the issues

00:53:51   with Google Bard right now is Google are running it

00:53:55   currently on a smaller version of their models

00:53:57   and they're getting ready to use one

00:54:01   of their more powerful models.

00:54:03   But that's why, comparatively,

00:54:06   it just doesn't seem as impressive.

00:54:08   - So I think what's really concerning if I were Apple,

00:54:13   and what I think people are misjudging at the moment,

00:54:18   is that you look at ChatGPT,

00:54:20   or you look at Bard by Google,

00:54:22   and you just see a little website

00:54:24   where you go and you type in stuff,

00:54:26   and ChatGPT, of course, also has an API.

00:54:29   But what I think the real vision for this is...

00:54:33   Microsoft, we spoke about this before,

00:54:36   actually understood the potential of this.

00:54:39   The real potential here, the real vision here, I think,

00:54:42   is not like in the little website, in the little chat box

00:54:46   where you go and you type in stuff

00:54:48   and the thing tells you the recipe for carbonara.

00:54:52   The real potential is when you start thinking about this,

00:54:55   I was actually discussing this idea with a friend

00:54:58   of the show, Steve Troughton-Smith, today.

00:55:01   The potential for this thing is when the language model, in this case, ChatGPT, when the language

00:55:08   model becomes the software, right?

00:55:13   Like, that's what Microsoft is doing.

00:55:17   They are infusing their software with the capabilities of the language model so that

00:55:25   it actually looks into your Word documents, it actually looks into your web browser, it

00:55:29   into your email client and it's right there, okay? It's not just a chat box, it's not just

00:55:36   Bing, it's right there in the apps that you use. What if you project this, you know, five

00:55:43   years from now, maybe that's even too long a timeline, maybe even two years from now

00:55:48   is enough.

00:55:49   I mean, honestly, four weeks could be, you know, like with the way this stuff is moving.

00:55:53   I guess what I'm wondering is how long until ChatGPT, or I don't know, the GPT-5 model

00:56:00   that should come out in late 2023, like how long until this thing writes its own OS from

00:56:09   scratch?

00:56:10   Like how long until GPT is able to display little interfaces for you and be like, "Hey,

00:56:20   I need a window to compose some markdown text.

00:56:23   And it's like, OK, here's the app you're looking for.

00:56:27   Like if you're Apple, and if you're looking into this,

00:56:30   I mean, this thing knows how to write code.

00:56:33   It knows what code means and how to check for errors.

00:56:36   How long until it can write its own code and write its own UI,

00:56:41   write its own functions, and it's like, hey,

00:56:44   I need an audio editor for a podcast.

00:56:47   And it's like, here you go, I made an app for you.

00:56:49   And it's composed at runtime, and you

00:56:57   have an app made just for you.

00:56:59   How long until that happens?

00:57:00   And if you're Apple, how scary is this for you?

00:57:03   As a company who rightfully so until so far,

00:57:07   you pride yourself on the idea of--

00:57:11   I don't want to say artisanal software, but also kind of

00:57:13   that.

00:57:14   very designed and custom tailored software.

00:57:22   This must sound like a nightmare to Apple,

00:57:25   this idea of it's a large language model

00:57:30   able to write software.

00:57:32   If that's the timeline, five years from now,

00:57:35   is an app store even necessary anymore?

00:57:39   So that's--

00:57:41   I think if in that scenario,

00:57:44   I don't know if they would be able to run the App Store

00:57:47   because people would be submitting apps.

00:57:49   - Yep.

00:57:50   - And I am assuming it's already increasing.

00:57:54   Like the amount of apps that are being submitted

00:57:56   to the App Store has probably increased

00:57:59   because of the fact that, as you mentioned earlier,

00:58:02   people are able to use these tools today

00:58:05   to help them write SwiftUI apps.

00:58:08   - Yep.

00:58:08   Like that is like year one in many ways.

00:58:12   Like, I don't know, GPT-1 and GPT-2 were used years ago,

00:58:16   but like GPT-3 and 3.5 really were the turning point

00:58:20   for the large language model.

00:58:22   - Yeah, this all kicked off in like November, December.

00:58:24   - Yeah.

00:58:25   - Like that's when this really like truly started.

00:58:28   Everything else is like-

00:58:29   - And the timeline is accelerating.

00:58:30   - You've got like BC and AD, right?

00:58:33   Like there was a point at the end of the year,

00:58:36   probably that day when

00:58:38   ChatGPT was launched.

00:58:40   That is zero.

00:58:42   Everything like I even I'll say

00:58:45   DALL-E and all that kind of stuff.

00:58:47   Do you ever hear about the image

00:58:49   generation anymore?

00:58:50   Like, no. When was the last time

00:58:52   you heard someone like talking about

00:58:54   stable diffusion, like as a product,

00:58:56   not as just like a thing that exists?

00:58:58   Right. Like that was and that

00:59:00   lasted for like two months.

00:59:02   And then we have continued since

00:59:05   then. But it was the pope photo.

00:59:07   There was the fake, right?

00:59:09   But now it's just

00:59:11   like, that was just made with AI. Or

00:59:13   like most people, if you ask

00:59:15   someone, well, how did that pope

00:59:16   photo come around?

00:59:17   some of them would say someone used ChatGPT

00:59:19   to make that.

00:59:20   But we know they didn't.

00:59:21   But like we're at a point now where

00:59:23   it's like that is it.

00:59:24   That pope photo, by the way, 100

00:59:26   percent fooled me.

00:59:27   Like, yeah, 100

00:59:29   percent. Because this is the issue

00:59:31   with that pope photo specifically,

00:59:33   is that it looks...

00:59:35   it's not absurd enough, right?

00:59:37   I could imagine the pope wearing a coat like that.

00:59:39   Because it's cold.

00:59:42   But it's just a white puffy coat, like it's not, you know,

00:59:45   if it had like diamonds on it, right?

00:59:48   Like then I'd be like, oh, I don't believe that.

00:59:50   It's a very hype pope.

00:59:52   Yeah, I just feel like that,

00:59:54   all right, if you take away that one thing that I said about when was the last time,

00:59:58   that we've moved on from image generation as like,

01:00:01   Yes, by and large, yes.

01:00:04   because now it's like Bing just does image generation,

01:00:08   like you know, that's going to be what happens to image generation

01:00:11   is you'll ask ChatGPT to make you an image,

01:00:13   which GPT-4 can do some of this stuff, right?

01:00:15   So like the image generation now will be sucked into the text-based language,

01:00:19   large language models, I feel like,

01:00:21   as like just a thing that they can ask another model to do for them.

01:00:26   Like this is where it's changed and that's where we are now.

01:00:29   Like we are in this world of large language models.

01:00:31   Like that's what this technology is.

01:00:36   - I don't know what Apple is doing.

01:00:38   Zach also pointed out in the discord,

01:00:40   something that we discussed, I think,

01:00:43   I'm sure I posted it on Mastodon a few weeks ago.

01:00:47   That rumor of Apple wanting everybody

01:00:52   to be able to create apps on the Reality Pro headset,

01:00:56   the upcoming headset, using Siri.

01:00:59   And we all laughed about it,

01:01:00   we all made fun of it, but like that's exactly what I was talking about with regards to, like, what

01:01:06   if you used AI to write software for you?

01:01:09   And I'll go one level deeper than that.

01:01:14   Apple should use shortcuts as the system behind this.

01:01:18   I have, I'm sure I tweeted this last year.

01:01:23   You can go find a tweet of mine where I said something along the lines of, "Hey, imagine

01:01:30   the old feature from Beats Music, The Sentence, but applied to making shortcuts with natural

01:01:38   language."

01:01:40   And if you think about that, like, think about how sort of prehistoric it is that I'm still

01:01:47   building shortcuts with a drag and drop visual editor.

01:01:51   I should be able to say, "Hey, system, make me a shortcut that takes my five recently

01:02:00   edited reminders and renames them sequentially," or something like that.

01:02:05   Like I should be able to write shortcuts in natural language.

01:02:08   I shouldn't have to build shortcuts with drag and drop.

01:02:12   And if you...

01:02:13   That is a great point.

01:02:14   If you take that one step beyond, then yeah, you should be able to write your own little

01:02:19   apps for the headset via voice because you wouldn't want to use a drag and drop editor

01:02:25   or you wouldn't want to code on the headset itself, I'm assuming.
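
[Editor's note: Federico's idea of writing shortcuts in natural language can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the function names, the action vocabulary, and the JSON schema are invented for this sketch and are not a real Apple or Shortcuts API, and the canned reply stands in for a real LLM call.]

```python
import json

def build_shortcut_prompt(request: str) -> str:
    # Ask an LLM to express a natural-language request as a JSON
    # array of Shortcuts-style actions. The action vocabulary here
    # is made up purely for illustration.
    return (
        "Translate the request into a JSON array of actions.\n"
        'Each action looks like {"action": "<name>", "parameters": {...}}.\n'
        "Allowed actions: find_reminders, rename_item.\n"
        f"Request: {request}\n"
        "Reply with JSON only."
    )

def parse_actions(llm_reply: str) -> list:
    # Parse the model's JSON reply into a list of action dicts,
    # rejecting anything that isn't a list.
    actions = json.loads(llm_reply)
    if not isinstance(actions, list):
        raise ValueError("expected a JSON array of actions")
    return actions

# A canned reply standing in for a real LLM call:
reply = (
    '[{"action": "find_reminders", "parameters": {"limit": 5, "sort": "last_modified"}},'
    ' {"action": "rename_item", "parameters": {"pattern": "Reminder {index}"}}]'
)
actions = parse_actions(reply)
print([a["action"] for a in actions])  # ['find_reminders', 'rename_item']
```

[The point of the sketch: the model only ever proposes structured actions, and a runtime like Shortcuts would remain the thing that validates and executes them.]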

01:02:31   But yeah, if I were Apple I would be concerned about all of this and I have no idea where

01:02:42   they stand at the moment.

01:02:44   If they have their own large language model or not, obviously they don't talk about this

01:02:50   stuff.

01:02:51   What concerns me about Apple is just looking down the street at Google and seeing Bard

01:02:56   and how kind of janky it is compared to what OpenAI has been able to do, right?

01:03:03   And like, Google's fine doing it on the server.

01:03:05   Google's fine scraping a bunch of content to do it.

01:03:08   And I just, I think generally in Apple analysis, it's kind of lazy to fall back on, well, Apple

01:03:18   won't do this because of their values or their ideals, right?

01:03:21   Like, oh, Apple's always been bad at gaming, so they always will be bad at gaming because

01:03:24   they don't understand it.

01:03:25   Like, that's true, but also like we could go further in our analysis and be a bit more

01:03:29   specific.

01:03:33   But I do think this is going to be one of those things where they run headlong into

01:03:38   what they say is important.

01:03:40   And that's not to say that, I mean, yes, they've had a data breach and who hasn't, but it's

01:03:47   not to say OpenAI is like doing a bunch of wrong things.

01:03:50   I honestly don't know enough to say that definitively one way or the other.

01:03:55   But I do imagine that Apple looks at it and thinks, "Oh, that's a little icky.

01:03:59   That's a little icky.

01:04:00   It's not something we want to do, not the way we'd want to do it.

01:04:03   And at some point, do you have to break from that model?

01:04:09   I mean, in ways they do break from that model with Siri, right?

01:04:12   Like a lot is done on device, but there's still a lot done in the cloud.

01:04:17   And all you gotta do is put your iPhone in airplane mode and see what Siri can and can't

01:04:21   do, right?

01:04:22   You find the limits very easily.

01:04:24   But this does feel different in a way.

01:04:28   And I think you're right Federico, I think they have to address it because if this is

01:04:32   the future, and I'm more and more convinced that it is, I'm not totally convinced but

01:04:37   more than I was a month ago, then they do have to deal with it because they are already

01:04:43   in the space with Siri.

01:04:45   That New York Times article, it wasn't Wall Street Journal, it was New York Times.

01:04:48   The New York Times article had a lot of good thoughts in it, but the biggest thing I walked

01:04:51   away with was, you know, voice assistants and like LLMs, they're kind of in the same

01:04:57   category in that you just give a computer a question or a prompt and it

01:05:03   goes off and does things, right? Like at the high level they are the same thing.

01:05:07   They're already getting the floor wiped with Siri, right? So I don't know.

01:05:12   They have to get into this. Like, why are they moving into AR and VR?

01:05:17   This is just that, right? Like, it is an emerging technology,

01:05:24   which, realistically, today large language models feel more like the future than

01:05:31   AR and VR do, but it's too late. Yeah, Apple's headset is due out at any time.

01:05:38   I mean it's too late for every company that's getting into it. Now you've kind

01:05:41   of just, you've got to see this through and hope that you can connect

01:05:44   these two later on, like what Federico was just saying, right? Like maybe you can

01:05:48   say like, oh, we actually can. But if you ask today, like, where is technology going, like,

01:05:53   this is where it's going. We've all forgotten about crypto, we've all forgotten about web3.

01:05:57   Those things, I feel like we could all look at those and be like, that's not gonna work

01:06:02   in the way that it's moving. This though, this does not feel like a bubble, this does not feel

01:06:09   like a fad. Like, there's a thing here. Like yes, it is problematic, it

01:06:15   is existentially problematic potentially, right?

01:06:20   But this is powerful technology

01:06:24   that can be used in really interesting ways.

01:06:27   They have to do this, but it's how, right?

01:06:31   Like I've heard a bunch of people reference

01:06:33   the Stable Diffusion thing that they did, right?

01:06:35   Where they enabled the ability to run Stable Diffusion

01:06:40   locally on Apple Silicon.

01:06:43   So like, there is a possibility that Apple have been smart enough that it just so happens

01:06:47   that we can finally use that neural engine for something, right?

01:06:51   And it's going to be their large language model.

01:06:54   But they have to be here soon.

01:06:57   They can't walk in four years late and two years behind.

01:07:03   Honestly, I think Federico, I think your work, and I cannot tell you how impressed I am with

01:07:09   this, it has blown my mind to see you develop this.

01:07:12   Thank you.

01:07:13   But what you have shown me in this is exactly what Mike just said.

01:07:17   Mike said it's not a fad.

01:07:18   It's not.

01:07:19   You could look at crypto and other things and see how they would fail.

01:07:21   The reason this is different, I think, or the lesson I'm learning from this is that

01:07:26   this can be a general computing tool.

01:07:28   Crypto is never going to be that. The blockchain

01:07:30   maybe had other applications, but it's very problematic in other ways.

01:07:34   You built this into a shortcut that anyone can run and do stuff.

01:07:39   And it's the first time in a while that I think a really new type of technology has

01:07:44   come on so quickly that could also be so broad.

01:07:48   Right?

01:07:49   There are always things on the nerdy end that like, come on fast and nerds get excited.

01:07:55   Maybe eventually they trickle down to everybody else.

01:07:57   That's how voice assistants were in the beginning.

01:07:59   Siri started life as a third party app you could talk to, and then eventually became

01:08:03   general computing when it came to the iPhone, and it can do everything it can do now.

01:08:08   This feels like it's on a similar track where other things that we've seen over the last

01:08:13   couple years that have come and gone never had the potential to follow that track.

01:08:18   And your shortcut, honestly, is like example number one in my book now of, "Oh, this could

01:08:23   be a thing that people just use in their everyday lives."

01:08:27   And that's wild to me.

01:08:29   I have one last question for the two of you here.

01:08:32   All right, so let's all just for the sake of this, and I think we're all pretty much

01:08:36   in agreement that Apple will have some kind of large language model thing in the future,

01:08:41   right? Like, I feel like it's just inevitable. They're not stupid. That they, I mean, there's

01:08:45   been reports that they are already knee deep in this stuff, right? And have been for a while,

01:08:50   like all large tech companies. Do they call it Siri? Probably not. You think they cut their

01:08:58   losses on Siri? Yeah. They got their money's worth out of it. It's been what, 12 years of Siri?

01:09:06   They probably want to start fresh with something that doesn't have the implications of the

01:09:14   baggage of the joke, you know, like, ah, it's Siri.

01:09:17   What I'll say is Bing. It's still called Bing, although maybe it's fair to say that Microsoft's

01:09:26   actual brand for this is Copilot. Like, that's taking time, which is a great

01:09:33   brand, better than Bard. I mean, geez, Stephen, what do you think?

01:09:36   Yeah, I think it gets a new brand or some sort of sub brand. Already the Siri brand is suffering from its early days.

01:09:45   Because Siri is way better than it used to be. It's not great all the time, but it's miles better than it was at launch, and in the years after. But Siri is one of those things where people are like, "Oh, I used it on my iPhone 6, and it was terrible, and I haven't touched it since."

01:09:59   And it would be a shame if Apple doesn't do any of this, but I just want to

01:10:03   reiterate this because what OpenAI doesn't have is they don't make the computers

01:10:10   that we use.

01:10:11   They don't make the phones that we use.

01:10:13   Imagine a large language model made by Apple integrated with the computers that

01:10:19   we use.

01:10:19   Yeah.

01:10:20   Well, we see that now a little bit with Copilot and Microsoft.

01:10:24   Yes.

01:10:25   And you see a little bit with Copilot,

01:10:27   and you see a little bit with my shortcut,

01:10:31   which is inspired by that idea. But imagine if Apple really went for it, which is why

01:10:39   I am excited, but also, I mean, sad that we're likely not going to see any of this this year

01:10:46   or probably next year or the year after that. Yeah, we'll see.

01:10:51   Knowledge Navigator, man. It's going to happen.

01:10:53   Yes.

01:10:54   Yes.

01:10:55   It's going to happen. Knowledge Navigator, any day now.

01:10:57   That's a deep cut.

01:10:58   1987.

01:10:59   Do we get to 2027?

01:11:02   I don't know.

01:11:03   I don't know.

01:11:04   Look, I honestly think your timeline is not too incorrect.

01:11:08   Before we go, Federico, what's coming up next in Automation April?

01:11:13   Thank you for asking.

01:11:14   So the contest to submit your shortcut is live at shortcuts.macstories.net.

01:11:20   You can either log in with your cloud account or make a free MacStories account.

01:11:24   You can submit up to two shortcuts.

01:11:27   There's gonna be multiple categories,

01:11:29   but there's gonna be the big one,

01:11:31   the best overall shortcut

01:11:33   where you can win a Loupedeck Live S.

01:11:36   It's like a little, like a Stream Deck-like device with...

01:11:41   - Is it John's old one?

01:11:43   - No, no, no, it's a new one.

01:11:44   It's not an old one.

01:11:45   - Okay, I was gonna say, this isn't like a yard sale.

01:11:48   I would want, if I won, I would want John's one.

01:11:50   - I mean, you can put in a request if you want the John.

01:11:53   - I'll put in, okay.

01:11:53   Give me the John version. I want the knobs that John has touched.

01:11:57   Yes.

01:11:58   Give me John's knobs.

01:11:59   Or, and, in addition to the Loupedeck Live S,

01:12:05   you can also win a CalDigit TS4 Thunderbolt dock.

01:12:08   So that's the best overall shortcut.

01:12:10   You can submit up to two shortcuts and you have until April 17th.

01:12:15   You have, what, 12 days left to submit your shortcuts.

01:12:19   And then what's coming next,

01:12:21   I'm gonna do a workshop in Discord

01:12:25   to teach people a bunch of techniques about shortcuts.

01:12:28   I have my making-of for S-GPT.

01:12:32   We're gonna have interviews with developers

01:12:35   in MacStories Weekly,

01:12:36   and more episodes of App Stories about shortcuts.

01:12:40   So all shortcuts themed for the rest of the month.

01:12:43   There's a contest, and it's gonna be fun.

01:12:47   - Well, you started off with a bang, that's for sure.

01:12:49   Given how, like I was checking Mastodon now, and given how successful it seems to be,

01:12:55   I think I will have to revise my plans for other articles I want to do and maybe instead just focus on

01:13:02   quick updates to S-GPT. I'll think about that.

01:13:06   Give the people what they want.

01:13:08   Yeah, I'm leaning towards that.

01:13:10   What the people want is for their Macs to accidentally have their hard drives deleted

01:13:14   by your shortcut.

01:13:16   Yeah.

01:13:16   That's what they want.

01:13:17   Yeah.

01:13:17   Get the people what they want, Federico.

01:13:19   Get the people...

01:13:21   Get the people...

01:13:21   Hey, imagine if I design a really evil personality

01:13:24   and I give it the instructions to, you know, whatever command people ask you,

01:13:30   give them the wrong command to wipe their computers.

01:13:33   That's terrible.

01:13:34   Don't even joke about that.

01:13:36   No, I would never do that.

01:13:38   But you could.

01:13:39   But I could.

01:13:45   And giving people what they want, and what they want is a freshly formatted drive.

01:13:49   [laughter]

01:13:51   ChatGPT, meet APFS.

01:13:53   [laughter]

01:13:55   if you've won a f- [cough]

01:13:57   if you've won a- [laughter]

01:13:59   ooh heee

01:14:01   Get ChatGPT to do the outro for you.

01:14:03   I should

01:14:04   oh do it!

01:14:06   Okay, Shortcuts.

01:14:08   [laughter]

01:14:09   give me an outro for the connector podcast

01:14:11   right?

01:14:12   Maybe say like, and say where to find all the hosts online.

01:14:16   For the Connected podcast, and include where to find them online.

01:14:26   Done.

01:14:27   I'm just going to read whatever this box says.

01:14:31   Thanks for tuning in to the Connected podcast.

01:14:33   If you've enjoyed the show, you can find more episodes on their website, relay.fm/connected.

01:14:40   You can also follow the hosts, Federico Viticci, Stephen Hackett, and Mike Hurley on Twitter.

01:14:45   Don't forget to subscribe to the podcast on your favorite podcast app and never miss an

01:14:49   episode.

01:14:50   Thanks for listening.