PodSearch

Connected

509: Get Good

 

00:00:00   [Music]

00:00:07   From Relay FM, this is Connected.

00:00:10   Episode 509, today's show is brought to you by NetSuite, Squarespace and Ecamm.

00:00:16   I am your Annual Chairman, Mike Hurley, and I have the pleasure of introducing to you Federico Viticci. Hi Federico.

00:00:23   Hello Annual Chairman, how are you?

00:00:25   I'm very good, I'm very good indeed.

00:00:28   We are without the American today.

00:00:31   We are, we are. We're celebrating Independence Day by being independent from Stephen.

00:00:38   Right, right, yes. We are free from the shackles of, you know...

00:00:45   American tyranny.

00:00:47   American tyranny, yes. Europe is finally independent again. Wait, is that...

00:00:53   I think so, I think that's what it means.

00:00:55   Right, do I have it backwards?

00:00:57   Do you know, because this has been a funny thing to kind of poll some of my friends,

00:01:01   do you know that we have an election tomorrow in the UK? You're aware of this?

00:01:06   Wait, for what? No.

00:01:07   Prime Minister, we have our general election tomorrow. This is very strange to me.

00:01:11   Wait, what? Nobody knows about it.

00:01:13   Yes, everyone I keep talking to about this doesn't know about it, and I'm not sure what this says.

00:01:20   I knew everything about the French elections, which we've been following in Italy,

00:01:28   like it's been all over the news. I literally had no idea you had elections tomorrow.

00:01:33   For the Prime Minister!

00:01:34   For the Prime Minister, well, we'll see. In our election, the general election,

00:01:38   you're voting for your Member of Parliament, and then whoever has the most members of Parliament,

00:01:45   like the political party, they become the governing party, and then they say who the Prime Minister is.

00:01:49   We don't vote for the Prime Minister directly.

00:01:52   I had no idea, and I can tell you, I do read Italian news websites and watch the news every day,

00:01:59   multiple times a day. I can tell you, it hasn't been mentioned.

00:02:04   So I have a theory about why I don't think anybody knows about this.

00:02:07   This election is not exciting at the moment because it is essentially a foregone conclusion what's going to happen.

00:02:14   And so that's my theory on it anyway, and I think you will know about it on Friday

00:02:20   because it's most likely to be the biggest landslide victory in British politics is the way it's looking right now.

00:02:26   Which is...

00:02:27   For who?

00:02:28   Labour party, who are a more liberal party.

00:02:31   So the Conservatives are going to be obliterated is how it's looking right now.

00:02:36   And that is a bit of a change, right, from a few years ago.

00:02:40   Well, yeah, I mean the Conservatives won the last four elections, which is about 15-20 years.

00:02:46   Oh, wow.

00:02:48   Which tends to be how it goes here.

00:02:50   Personally speaking, that seems like good news.

00:02:54   Yes, I think so.

00:02:55   It is for me.

00:02:57   Because, you know, that's kind of...

00:02:59   Labour have more policies that align with my views.

00:03:03   So, I mean, that's, you know, obviously I've already voted.

00:03:07   It doesn't bother me. I've already voted for Labour. I voted by post like two weeks ago or something.

00:03:12   I wish we could do that in Italy. The system is so antiquated here.

00:03:15   Oh, we moved to postal votes during Covid and it's one of the best decisions I've ever made.

00:03:20   Because now I just get to... Everything just comes to me in the mail and we just tick it and send it back.

00:03:25   Yeah, yeah. Well, honestly, I don't think I am a misinformed or disinformed person.

00:03:32   Like, I follow the news and I had no idea.

00:03:36   Yeah, but this is... It's interesting. I just feel like nobody seems to really know about it.

00:03:42   And I have my theories about why that's happening, but it is just really interesting.

00:03:47   I think, in Europe especially right now, there's a swing towards the right.

00:03:53   And I think that because Britain's swinging in the other direction,

00:03:57   it doesn't necessarily fit in with the, like, consensus news stories.

00:04:03   But I think that come Friday, it will be world news.

00:04:07   Well, I mean, obviously it will be world news. It's a new prime minister, so I freaking hope it will be world news.

00:04:12   Just like, no one bothers talking about it at all.

00:04:16   But I think that if it goes the way it's going to go, which is like a landslide,

00:04:22   then everyone's going to know about it. But anyway, you're not here for that.

00:04:27   No one's here for this. They want follow-up.

00:04:30   Federico, how would one manage six or seven tabs?

00:04:34   Ah, this has been... This is... It's not in the past. This is still happening.

00:04:40   This is a journey, OK? And I heard from a few people.

00:04:45   There's even a surprise twist at the end that I will mention in a couple of minutes.

00:04:50   There are two main existing options that I want to mention.

00:04:55   The first one that was sent to us, also the developer reached out to me over email,

00:05:01   is, and I should have thought of this, WebRoulette by Impending, the folks who make Clear.

00:05:07   Now, WebRoulette, this is almost the perfect idea for me,

00:05:13   but there's a few things that prevent me from using WebRoulette for my six or seven websites

00:05:19   the way that I want it to behave.

00:05:22   The idea is you set up WebRoulette, you enter your favorite websites,

00:05:26   and then you just swipe through them.

00:05:29   And it's like there's basically no UI chrome around the web page.

00:05:34   You just swipe and you're like a roulette. You're cycling through your favorite tabs.

00:05:39   And it's almost perfect because I thought, OK, so I'm going to set up the URLs that I need,

00:05:45   and I'm going to cycle through all the websites that I want to read.

00:05:49   Now, my main issue here is that by default it looks like when you tap on a link

00:05:57   on one of my six or seven websites, those links always open in new tabs.

00:06:05   Now, WebRoulette doesn't have, like I mentioned, it doesn't have any UI

00:06:10   which works both in favor and against the experience of the app, I think.

00:06:15   So let's say that I'm reading Platformer and I click on a link that Casey added

00:06:21   in one of their stories and the link opens in a new page in WebRoulette.

00:06:26   Now, the app has no concept of like favorites or pinned tabs or main tab.

00:06:34   It just opens a new page. And then at that point I need to cycle through all the pages,

00:06:40   including the ones that opened from the links.

00:06:44   So this is almost perfect. I wish there was a way to say, let me check out this link,

00:06:51   but then return to the previous one and close the link that I opened,

00:06:55   like some concept of hierarchy between tabs, because really what I want is

00:07:01   this is the main tab and I always want to return to it.

00:07:06   Like I always want to return to the homepage of Platformer, to the homepage of Chorus FM,

00:07:12   to the homepage of Matt Birchler's website, birchtree.me.

00:07:17   Like I always want to return to the main view, but WebRoulette doesn't have a concept of like,

00:07:22   OK, this is your main tab and these are your children tabs.

00:07:27   I like to think that there are some passionate ones out there who are like

00:07:32   noting down every time you mention one of these websites and they're like planning out

00:07:37   like what are the six, right? Like it's almost like Thanos Infinity Stone style.

00:07:42   Like there's like one left and they're just not sure what it is.

00:07:45   You know, like you every time you mention like, ah, that's one of them, you know,

00:07:48   and they're just like ticking them off one by one.

00:07:50   Yeah. Yeah. The next item in the list,

00:07:55   which I think is maybe what I'm going to use for now,

00:08:00   we'll get to this in a minute, is Quiche Browser.

00:08:04   I hope I'm pronouncing it right.

00:08:07   I mean, I'm assuming so. Like, that is the name of a food, quiche.

00:08:12   I don't know how it could be.

00:08:16   Yeah. Quiche Browser, which Niléane previously reviewed on MacStories.

00:08:21   There's an iPad version in TestFlight. Right now it's iPhone only.

00:08:27   This is a very customizable browser,

00:08:32   but not in the way that iCab, for example, is,

00:08:37   which is, to an extent, maybe too customizable, to the point where it looks complicated.

00:08:42   Yeah. Quiche is customizable. Lots of settings, but maybe just the right amount of settings.

00:08:48   And what I like about this is that you can mark certain tabs as your favorites.

00:08:53   And when you open a new tab, you can just see your favorites.

00:08:57   Otherwise, I mean, it's a regular browser, so it opens links in separate tabs.

00:09:02   You can search your history.

00:09:05   But what I like here is that, first of all, I can have, like, the idea of a

00:09:12   miniaturized Safari that I can use just for these six or seven tabs.

00:09:19   It doesn't have to be my main browser and I can customize it in a way that I like.

00:09:25   For example, I can customize the bottom address bar.

00:09:30   So there's multiple toolbar styles that you can choose from.

00:09:35   And one of the styles that you can use is one with the address bar

00:09:40   at the bottom that shows you both the URL and the title of the story.

00:09:46   So in the same toolbar, I can see both the address and the title of the article

00:09:51   and the favicon of the website. And you can tweak all sorts of settings,

00:09:56   like colors, tint, like accent colors for like the address bar.

00:10:02   You can move around the buttons. You can do all kinds of things.

00:10:05   And it's very native to iOS. Like it feels like a very native app, which I appreciate.

00:10:12   It's iPhone only, which right now, you know, a bit of an issue.

00:10:16   There's an iPad version in TestFlight, but doesn't have all the features of the iPhone app.

00:10:22   So it's a bit of a work in progress.

00:10:24   But I think if I were to pick an app that is on the App Store right now,

00:10:29   like publicly available, I would go with Quiche.

00:10:32   However.

00:10:34   However.

00:10:36   What we wanted to happen happened, really.

00:10:38   What we wanted to happen happened.

00:10:41   A friend of the show, listener of the show, indie iOS developer Jonathan Ruiz created what could be,

00:10:51   and I'm not exaggerating here, what could be App of the Year material.

00:10:56   So right now this app is in TestFlight and it's called Ticci Tabs.

00:11:02   That's it. App of the Year. Done. We can stop.

00:11:06   Ticci Tabs is a browser for your favorite tabs, and it has exactly the features that I want.

00:11:19   Very simple UI, available on the iPhone and on the iPad.

00:11:24   On the iPad, it uses a sidebar. On the iPhone, it presents you with this start page.

00:11:29   That is a list of your six or seven tabs.

00:11:32   I mean, you can go longer than six or seven.

00:11:34   I am going with six or seven.

00:11:36   You can see your favorite websites, then you tap on one, and it opens in Safari View Controller.

00:11:42   That's it.

00:11:43   It syncs with iCloud.

00:11:45   It lets you rename websites.

00:11:47   It shows you the favicons of the websites.

00:11:50   And that's literally it. There's nothing else.

00:11:55   It's perfect.

00:11:56   It's a simplified launcher for websites that uses Safari View Controller.

00:12:01   I love it.

00:12:03   My only issue, and this is in TestFlight, and I'm guessing that Jonathan listens to the show, given the existence of this app.

00:12:10   I don't think you could be sure of anything more.

00:12:13   Yeah.

00:12:14   He just so happened to come across this idea.

00:12:19   My only problem now is that, and this is actually like, it's not even Jonathan's fault, to be fair.

00:12:26   So there's plenty of websites these days that use the CMS called Ghost.

00:12:35   You've heard of the Ghost CMS, like a WordPress?

00:12:39   I'm aware of this mostly because I know that a lot of people moved to Ghost from Substack.

00:12:44   After Substack. Yeah, exactly.

00:12:46   One of the websites that moved to Ghost is Platformer by Casey Newton.

00:12:51   Birchtree, another of my favorite websites, also uses Ghost.

00:12:58   Now, I have a subscription to both of those websites.

00:13:02   However, those subscriptions, those membership programs, they don't work like Memberful, for example, which is what we use at Mac Stories and Relay,

00:13:10   where you create an account and you have a username and you have a password.

00:13:14   Instead, with Ghost, you sign up for those memberships and you never create an account.

00:13:22   You just put in an email address.

00:13:23   I hate this so much, by the way.

00:13:25   And every time you want to log in.

00:13:27   I hate it.

00:13:28   Say, on a new device or with Ticci Tabs, in a new browser, you need to enter your email address and you get emailed a login link.

00:13:38   Login links suck. I hate it.

00:13:40   They suck.

00:13:41   I hate it.

00:13:42   It's horrible.

00:13:43   Stop being a coward and make a password system.

00:13:47   Or give me a passkey or something.

00:13:49   Anything.

00:13:50   Anything.

00:13:51   Really, just don't email me a login link.

00:13:53   Now, the problem is that Ticci Tabs uses Safari View Controller, and Safari View Controller doesn't let you paste a login link.

00:14:06   So for me right now, there is no way to log into Platformer or Birchtree, because they use Ghost, with my login link in Ticci Tabs,

00:14:17   because there's no way for me to paste and navigate to the login link.

00:14:21   So these login links, are they different every time?

00:14:24   They are different every time.

00:14:26   OK. All right.

00:14:28   They're different every time and they expire after 24 hours.

00:14:32   So, yeah, I don't know how Jonathan could fix this, given the limitations of Safari View Controller.

00:14:41   But otherwise, I mean, obviously, thank you, Jonathan, for making this happen.

00:14:46   I think, you know, this app is very much in beta right now.

00:14:51   It's like version 0.5 is in TestFlight.

00:14:54   And Jonathan, I should mention, is also the developer of an excellent utility called Bridges, which is a very niche app for like managing links in all kinds of formats.

00:15:06   Like, you can copy links as rich text, Markdown, HTML, even JSON, if you need to do that.

00:15:13   And it's got excellent Shortcuts integration.

00:15:15   So I have really high hopes for Ticci Tabs to become like a very niche web browser utility that also happens to have Shortcuts integration.

00:15:24   But, I mean, obviously the problem is these Ghost publications.

00:15:28   I don't know how to log into them using Safari View Controller.

00:15:33   So, yeah.

00:15:35   But yeah, as you can tell, there have been developments for my six or seven tabs.

00:15:39   And I'm glad I brought this issue to the show because now, look, we're making the economy move, right?

00:15:46   A new app is happening.

00:15:48   We are.

00:15:51   We're making the economy move.

00:15:53   Connected is good for the economy.

00:15:55   Okay?

00:15:56   So we're creating jobs.

00:15:58   Oh, my God.

00:16:02   I think I might have blown out my microphone.

00:16:04   That one really got me, man.

00:16:06   Oh, boy.

00:16:07   So I'll keep you posted on how this goes.

00:16:09   Speaking of moving the economy, the Vision Pro App Store is now available in the UK.

00:16:17   Thank you to listener Tim, who let me know.

00:16:20   I have now swapped over to my UK Apple ID and it's going the way I thought it was going to go, which is very sad.

00:16:28   I have to delete and redownload every app that I want to use because otherwise I won't get updates on them anymore.

00:16:34   So this is going to be a project for me one day where I essentially set it all up.

00:16:39   I will essentially be setting up my Vision Pro again.

00:16:42   So, you know, pray for Mojo.

00:16:45   I'll get it done, but it's not going to be fun.

00:16:48   But I'm happy now that I don't have to do the management of a US Apple ID, you know?

00:16:54   You know, I had the thought today of, like, should I maybe consider stopping using my US Apple ID and just consolidating everything into an Italian Apple ID?

00:17:11   And then I thought about it for 30 seconds while I was driving and realized, no, I'm never going to do it.

00:17:16   There has never been a worse time for you to do that than right now.

00:17:19   Yeah.

00:17:20   Because, you know, we're going to talk about AI again.

00:17:25   I'm just priming the pump so everyone knows we're talking about AI today.

00:17:29   But like if you at least want to see what Apple Intelligence features are like, right, as part of your review process, you may need your US Apple ID to do that.

00:17:40   Or any potential feature, right? Because it's not even just that: if you want to use screen mirroring, maybe you'd need your US Apple ID to do that.

00:17:51   So I think right now would be the one time that you definitely should not do it.

00:17:56   But yeah, this is in my future now.

00:17:57   I'm happy that I have the Vision Pro App Store in the UK.

00:18:00   I'm especially looking forward to my solo top adventures that will be in a couple of weeks time.

00:18:06   I have a shipping date of next Friday, I think, for my second solo.

00:18:11   Okay.

00:18:12   Because I think the 12th of July is the Vision Pro launch day in the UK.

00:18:18   And so I'll be getting it then.

00:18:20   But yeah, I will now start kind of plugging away on moving my Vision Pro over to be like just a UK Vision Pro, which I'm excited about.

00:18:26   Nice. Nice.

00:18:28   I wanted to mention a post about highlighting journalism on Mastodon that John had on MacStories.

00:18:36   It seems like you guys are included in, like, some kind of beta program with official Mastodon about...

00:18:44   So see if I got this right.

00:18:47   Essentially, any link that is posted on Mastodon can have a byline associated with a Mastodon user.

00:18:57   So even if it's shared from an account that isn't theirs, they will always be tagged to that link. Is that correct?

00:19:03   Right. Right. So the Mastodon team reached out to us a few weeks back.

00:19:08   They were working on this addition to the Open Graph meta tags.

00:19:14   You know, the metadata that you add to the HTML source of a webpage?

00:19:19   And it's called the Creator tag.

00:19:22   So it basically identifies for each link the creator's profile on the Fediverse. That's the idea.
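
For reference, the mechanism Federico is describing matches Mastodon's documented author attribution: a single meta tag added to a page's HTML head. The snippet below is a sketch based on that public documentation; the handle is a placeholder, not a real account.

```html
<!-- Placed in the <head> of each article page; the handle below is a placeholder example. -->
<meta name="fediverse:creator" content="@author@example.social">
```

When a link to the page is shared on Mastodon 4.3 or later, and the publisher has been approved for attribution, the link preview card shows a byline pointing to that fediverse account.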

00:19:31   So whenever John publishes something on MacStories, when that link gets shared on Mastodon, and on versions of...

00:19:40   I think this is a Mastodon 4.3 feature, and Mastodon 4.3 is still not officially out.

00:19:47   It's in nightly beta version, whatever.

00:19:51   But once that shows up, for any link, you will see John's profile on the official Mastodon web app, in the official Mastodon app.

00:20:02   And I think it's also going to be supported by third-party apps.

00:20:06   In fact, I believe Ice Cubes, the third-party Mastodon client, already rolled out an update today with support for the creator profile.

00:20:16   That app, man. Wild. Ice Cubes is like the iCab for Mastodon.

00:20:23   Ice Cubes... like, the developer of Ice Cubes, I wish I could be that productive.

00:20:29   Yes, I don't know how they do it. It's truly incredible.

00:20:32   Yeah, yeah.

00:20:34   It feels like they defy the ability of App Review.

00:20:38   You know? Like, App Review should take longer than it takes to get these features implemented. It's incredible.

00:20:45   Yeah. So yeah, I think this is a very cool idea, and I'm obviously grateful that we were included along with The Verge and MacRumors in this initial rollout.

00:20:55   And yeah, I mean, this is also already rolled out on mastodon.social, so any MacStories link there will show you the profile of the author at MacStories, and when you click it, you will find our respective Mastodon accounts.

00:21:10   Very cool.

00:21:11   In the future, it sounds like Mastodon is working on a way to, instead of having to manually review and whitelist every website, just have a self-serve mechanism where I assume publications will be able to verify,

00:21:26   "Yes, we want to verify that the authors of this article are indeed okay with having the creator tag show up on Mastodon with these links."

00:21:39   I don't know what that system is going to look like.

00:21:42   You feel like they kind of have to make that, right? Like, it seems counter to Mastodon's whole thing.

00:21:47   No, no, no. They said, "Yeah, we're going to do it." Like, it needs to be automated, because it would be useless otherwise.

00:21:57   So yeah, go check it out. mastodon.social. You can see it in Ice Cubes. You can see it, and obviously it depends on which instance you're on.

00:22:06   If you are on the nightly Mastodon releases, that should be fully supported. Otherwise, you just got to wait a little.

00:22:13   I don't know what's going to happen to mic.social. You know what I mean? That thing out there, it's just living its life.

00:22:20   I didn't know there were nightly builds of Mastodon. I had no idea.

00:22:24   I actually think I use... what's the self-hosting site that I use? Masto.host?

00:22:30   Masto.host, yeah.

00:22:31   I think they do it for you.

00:22:33   Oh, yeah.

00:22:34   Yeah, which is nice because I once got an email where it was like, "Update." And I was like, "I don't know. I'm not the Masto.host people. I don't know what to do."

00:22:42   And they're like, "Don't worry. We got this for you." I'm like, "Yay!" I didn't know what to do. Luckily, they handled it.

00:22:48   This episode is brought to you by NetSuite. Here's some quick math.

00:22:54   The less your business spends on operations, multiple systems, and delivering your product or service, the more margin you have and the more money you keep.

00:23:02   But with higher expenses on materials, employees, distribution, and borrowing, everything costs more.

00:23:08   So to reduce costs and headaches, smart businesses are graduating to NetSuite by Oracle.

00:23:13   NetSuite is the number one cloud financial system, bringing accounting, financial management, inventory, and HR into one platform and one source of truth.

00:23:22   With NetSuite, you reduce IT costs because NetSuite lives in the cloud with no hardware required. You can access it from anywhere.

00:23:29   So you cut the cost of maintaining multiple systems because you've got one unified business management suite.

00:23:34   You improve efficiency by bringing all your major business processes into one platform, slashing manual tasks and errors.

00:23:41   Over 37,000 companies have already made the move. So do the math. See how you'll profit from NetSuite.

00:23:48   This just sounds great. If I had the kind of company where I had to worry about all of these separate things, for sure.

00:23:55   Like if you had lots of employees and all this kind of stuff, I would want it all in one platform.

00:24:00   Because I've been inside of corporations where this stuff is split up and it always leads to either duplication or errors.

00:24:07   By popular demand, NetSuite has extended its one-of-a-kind flexible finance program for a few more weeks.

00:24:12   Head to NetSuite.com/connected. That is N-E-T-S-U-I-T-E dot com slash connected.

00:24:19   One last time, NetSuite.com/connected to find out more.

00:24:23   And thanks to NetSuite for their support of this show and all of Relay FM.

00:24:27   I have been looking forward to this conversation all day with you.

00:24:33   Oh, okay.

00:24:34   Because I know that talking about AI and feelings about AI is something that a lot of people don't want right now.

00:24:43   Because it's, you know, it's like it's pseudo-political or just political depending on how you approach it.

00:24:50   It concerns the death of content.

00:24:54   You know, like there's just like a lot of stuff in there.

00:24:57   And also it's just like a conversation that kind of just continues to move and revolve without significantly changing in any way.

00:25:04   You know, because that's just where it is right now.

00:25:07   But I find it a very interesting intellectual topic to discuss with smart people.

00:25:14   And the reason we're talking about it today is that the MacStories team, I think namely you and John, have produced an open letter on AI.

00:25:26   You wrote, it's like a post on MacStories where you wrote something which is kind of addressing the MacStories audience.

00:25:34   And then you published underneath an open letter that I believe John wrote?

00:25:38   Yes.

00:25:39   Correct.

00:25:40   So it's kind of like your part is for people that know you and John's part is for people that don't.

00:25:46   I'll read a quote from you, which was my favorite quote in your introduction.

00:25:51   "If ignored, we fear that these tools may lead to a gradual erosion of the open web as we know it, diminishing individuals' creativity and consolidating knowledge in the hands of a few tech companies that built their AI services on the back of web publishers and creators without their explicit content."

00:26:07   And I think-

00:26:08   Consent.

00:26:09   Consent. Thank you. Thank you for correcting me about what you said, which is probably the best kind of correction you can get.

00:26:16   You know, and I think that there are a lot of different takes to have, but this is one of the key ones right now, and I think it's the reason we're talking about this: these AI tools will do nothing except consolidate power around the largest tech companies, which is what makes it different to any other technological innovation we've had, especially around the web.

00:26:43   Yeah, that was the way that I wanted to frame my introduction.

00:26:48   The idea of, you know, usually, and there are exceptions, but usually technology has been in service of helping people create more.

00:27:00   I'm coming at this from a purely creative perspective.

00:27:04   Usually technology, I mean, look at me, right? Look at you.

00:27:08   We're doing what we do because of technology.

00:27:12   And I am concerned that this new technology and many parts of it will replace what we do.

00:27:24   And so I'm coming at it from a purely selfish perspective, but it also will prevent a new generation of Ticcis and Mikes and Stephens from existing in the first place.

00:27:39   I want to come back to that second part in a minute, but just first off, can you talk about what you and John are aiming for here?

00:27:50   Like what you've done and what your aim is with this open letter, like who it's going to, who it's for and what you hope to achieve from it?

00:27:59   Right. Yeah, that's a good question.

00:28:04   And there's a few reasons.

00:28:07   One of them was that it was for us.

00:28:12   It was a way for me and John to at least temporarily put a cap on this whole discussion and have a single thing we could point to and say, this is what we think.

00:28:24   This is what we believe in. A way to sum up our thoughts in a single place that we could share with people.

00:28:32   Then it was also something that we wanted to do to send to politicians, to send to our representatives in the US and in the EU to make our, and by our I mean MacStories, voice heard.

00:28:55   And to share our opinion with them and to share our concerns with them, hoping that maybe somebody is paying attention and they could ask us to clarify our position.

00:29:07   They could ask us to explain better what the concern is.

00:29:11   And they could maybe ask us to help them understand why you, an Italian citizen running an American company, paying taxes both in the US and in Italy, why are you concerned?

00:29:27   And I would love to clarify my position and I would love to explain.

00:29:31   Because, you know, there's nothing I can do as an individual.

00:29:35   And as I wrote in my intro, we're a small media company. We cannot sue anybody. We're not the New York Times. I am not launching a class action lawsuit. I'm not doing any of that.

00:29:48   It really is only the New York Times, the only company that can afford to do it.

00:29:52   No, there's other companies doing it.

00:29:53   No, but I think as a news outlet or as a media company, I feel like they are economically the only one in a place where they could manage to do it.

00:30:04   So there was, yeah, there was like the angle of like maybe somebody who has the power to start a process, not even to make changes happen, to start a process and to start a proper discussion.

00:30:20   Maybe somebody will pay attention to this email or to this letter.

00:30:24   And third, I would say we also wanted to do it for other publishers, because I think there's other folks who are in a similar situation to ours.

00:30:40   And maybe they were looking for something to use as a frame of reference to also say, yes, I also agree with those MacStories folks, you know.

00:30:57   And I think John and I, we've had many discussions over the past few weeks, and I'll get to that process in a few minutes.

00:31:08   But we felt like, look, we're going to get a lot of criticism for this.

00:31:13   And we are.

00:31:15   And it's OK, because there's something that I strongly believe in, you know, an idea that my mom taught me when I was a kid,

00:31:31   is that you should never be afraid when you're right about something, when you've done nothing wrong, when you know in your heart that you're right about something you believe in.

00:31:43   You shouldn't be embarrassed and you shouldn't be concerned and you shouldn't be afraid of people telling you you're wrong or making fun of you.

00:31:50   And I'm OK with people making fun of me because I don't care, because I feel deep inside of me that I'm right.

00:31:58   So that was sort of the reasons why we did it.

00:32:02   But, yeah, I mean, obviously a lot of people are not liking it.

00:32:07   Some people are liking it. But there's something that I wanted to say before we talk about the thing itself.

00:32:18   This has been a process for the past few weeks, and it made me understand that I now live with a new regret in my life.

00:32:31   And we all have regrets, right? For things we haven't done, for things we should have done, you know, for mistakes we've made.

00:32:38   We all live with regrets to an extent.

00:32:40   I mean, God bless people who don't. I mean, I'm jealous, but I think, to a degree, we all do.

00:32:50   And now I have a new one. And my regret is that I wish I cared about all of these sooner.

00:32:59   I wish I paid attention earlier.

00:33:03   I wish I was not distracted, you know, two years ago by the cool factor of ChatGPT, and like ignored or just didn't inform myself about, like, the issues and why some people were so upset about it.

00:33:21   I wish I had done it and I didn't do it. And there's nothing I can do about the fact that I didn't do it then.

00:33:28   But I can do something now. Is it too late? Probably.

00:33:33   But I prefer to be late than always silent about it.

00:33:41   So, but I regret it and I want to say it because I was wrong and I should have done it and I didn't.

00:33:48   I don't. I understand. You know, I'm not telling you not to feel the way you feel.

00:33:53   I don't think that you should hold yourself to any hard standard there.

00:33:58   Because realistically, if you had made this point then, it would have made less of an impact than it is making now.

00:34:06   I wholly believe that to be the case because I think if we go back to the end of 2022 or whatever, when this stuff was kicking off, it was different.

00:34:19   It didn't feel the same and people didn't know as much. Like, I don't really feel like you should, you know, regret the situation.

00:34:29   I genuinely believe that, with all of the tools that you have at your disposal, this is the best possible time for you to be talking about this stuff.

00:34:37   Yeah, maybe. And yeah, so in thinking about all of this and like for the past few weeks, I've been, you know, talking to a lot of people and reading a lot.

00:34:48   And I think I have landed on something that made me understand.

00:34:58   And there's an article that I want to discuss with you in a few minutes.

00:35:03   I understood this idea that is at the core of this feeling of being uncomfortable about this stuff, these AI features.

00:35:16   Like, why is it that even the simplest feature, like a grammar checker, or the thing that rewrites a sentence for you, like the writing tools that are coming with Apple Intelligence, for example...

00:35:29   And when I look at those, I feel uncomfortable and I feel a sense of uneasiness about them.

00:35:35   Why is it that even the simplest feature makes you feel that way?

00:35:40   I've been doing a lot of, and it sounds stupid, but I've been doing a lot of analysis about this stuff, because I'm supposed to be reviewing these functionalities at some point.

00:35:49   And I think I get it now. There's an excellent article by Sebastiaan de With, one of the designers of the excellent Halide and Kino apps.

00:35:58   Sebastiaan has a blog now, which is a really, really good read.

00:36:03   It may become one of my six or seven websites. And Sebastiaan shared this idea a few days ago, in a post, about how getting there is a slog.

00:36:20   You know, Sebastiaan is coming at it from the perspective of a designer.

00:36:27   And he writes, like: when I was a kid, you know, I was using a pirated version of Photoshop and I started designing my first flyers for, like, the local pub or the local concert venue.

00:36:42   And I felt inside of me like that was the thing I wanted to do for a living.

00:36:47   And those flyers sucked and they were probably terrible.

00:36:51   But it was that process that got me started on the career path that eventually became my job as an accomplished designer.

00:37:02   And when I read that... the idea that getting there is a slog, and the slog is part of the process.

00:37:09   Right. And I could see myself in that idea.

00:37:14   When I started MacStories in 2009, my English was horrible.

00:37:18   Like, it was honestly terrible.

00:37:23   And the process of improving my English and getting feedback from people.

00:37:28   And to this day, I still get emails from Chris Pepper.

00:37:31   Bless you, Chris, for sending those typos.

00:37:34   You know, if you're a blogger in the Apple community, you know what I'm talking about.

00:37:39   But that is the process, right?

00:37:42   Yes, that is, I think, what... and look, this is going to get philosophical as a topic.

00:37:52   Bear with it.

00:37:54   I think what makes us human is the ability to learn from mistakes and reason over the mistakes.

00:38:02   Right. To understand what was wrong about, like, a piece of work or a piece of art or whatever.

00:38:11   Reason over them and improve in the future.

00:38:15   My concern is that without the slog, without being a 16 year old kid designing flyers in a pirated copy of Photoshop,

00:38:27   or, you know, without the slog that I felt as a 19 year old writing blog posts about Mozilla Firefox with the really, really, really terrible English.

00:38:38   And making mistakes and being corrected by people and understanding.

00:38:43   My brain connecting neurons and understanding: why was this verb wrong?

00:38:50   Why was this tense wrong? Without that process, that multi-year process,

00:38:56   I wouldn't be where I am today.

00:38:59   And my concern is that, if we apply this to other tasks and to other jobs and to other industries,

00:39:09   will we have a new generation

00:39:14   of creators learning from their mistakes? Take the flyer for a pub or a concert venue.

00:39:24   The pub owner has a choice: they can hire, for 50 dollars, the neighbor's kid who says they're skilled with Photoshop,

00:39:39   you know, very much underpaid,

00:39:41   versus going to ChatGPT and asking it to design a flyer.

00:39:47   Right? I mean, you know what's going to happen. The pub owner is going to use ChatGPT and have a much better result from DALL-E.

00:39:55   No doubt about it.

00:39:58   But that kid will never have their first job, will never be underpaid for that terrible flyer,

00:40:07   and may never become a designer in the first place because they will never have their first work experience.

00:40:14   So that's what I was saying a few minutes ago, like the selfish perspective of like, I care about it for me,

00:40:22   but also I care about it because, if even the simple tasks are replaced by a machine,

00:40:34   it means that we are not forming a new generation of kids, a new generation of creators, who do a horrible job at those simple tasks.

00:40:46   But it's the slog that forms the creators of tomorrow.

00:40:51   And if we don't have those, eventually we're going to run out of creators. And I think that's bad.

00:40:57   So that was my perspective.

00:40:59   Yeah, I want to touch on that and come at it from a slightly different angle.

00:41:03   I want to start by actually reading a quote from John's open letter, which by the way, do not sleep on John Voorhees' open letter.

00:41:10   Yeah, don't.

00:41:11   I could imagine a lot of people read your part of the article and then were like, this next part is going to be full of legalese and it's for the…

00:41:18   No, no, it's not.

00:41:19   Read it. It's actually simpler, right?

00:41:22   It's intentionally simpler. I was talking to John about this today and I was giving him his praise; he did an incredible job.

00:41:29   So he says, "Left unchecked, this devaluation of internet culture will undermine the ability of today's creators to earn a fair wage from their work and prevent the next generation of creators from ever hoping to do the same."

00:41:42   So I think my feeling now, which I think is slightly different to yours, but not way off, is we are going to be fine.

00:41:52   Me and you, we're going to be fine.

00:41:54   Because we have spent the last 15 years building an audience that values our opinions and personality.

00:42:02   And in an increased AI world, I actually believe that opinions and personality will become more valuable.

00:42:11   But that also, the proliferation of AI tools and the speed and the fact that there will be a lot of "content" made by machines will make it so much harder for somebody to gain a foothold the way that we have.

00:42:33   I believe this is more important for the future people than it is for me.

00:42:40   And this is something that I've clarified over the last couple of years.

00:42:44   When this stuff started to come around, I had concerns for my own job as well as others.

00:42:51   I don't have concerns for my job. I have concerns for the next generation.

00:42:59   So my concerns are that there will be too much content for people to be able to break through, and everyone is going to be suspicious of any new content. They're not going to believe it.

00:43:11   I'm seeing this all the time now. I think I mentioned this in the show recently.

00:43:17   If you see any brand post something that is illustrated, people think it's AI.

00:43:24   Even if it isn't, they just think it is. Being able to get noticed and be respected is going to become so hard.

00:43:33   But also, I hadn't thought of it in that way. I think that's a great take from Sebastiaan.

00:43:39   This idea of the slog reminds me of the Malcolm Gladwell-popularized 10,000 hours idea.

00:43:46   You familiar with the 10,000 hours thing?

00:43:49   Do you know who Malcolm Gladwell is? He's an author. He popularized a thing many years ago.

00:43:56   He wrote a whole book that focused on it, which was: to get good at anything requires 10,000 hours of practice.

00:44:03   The specifics of it aren't important, but you can take the idea of it. Relay is 10 years old.

00:44:13   I believe I started getting good at podcasting 10 years ago, but I had been podcasting for five years before that.

00:44:23   I believe it takes a lot of practice to get good at things. I don't think that is a wild thing to say.

00:44:32   But, as you point out, if you write something and then go, "improve this for me,"

00:44:40   you're not learning the tools necessary to be able to produce.

00:44:44   Maybe it's not important to you. For example, I use ChatGPT and I will use it today when I write the description for this show.

00:44:53   Because my grammar is not very good in general, but it doesn't bother me enough to learn.

00:45:00   It's just not a thing that I need enough that it matters to me.

00:45:06   But if I wanted to be a blogger, that should matter to me.

00:45:11   Similarly, I would never use an AI podcast editing tool.

00:45:17   Because then I'm not going to get good at the editing.

00:45:21   You've got to get good.

00:45:24   Difficulty settings, who needs them? Get good.

00:45:28   But I believe in the power of that. In the practice making perfect.

00:45:36   You have to put the time in. And these tools are a jump.

00:45:41   And of course these tools will help raise some people to a level that they can't get to on their own.

00:45:49   But I don't know if it can give you the foundation you need to go out and do a thing.

00:45:58   It's like the difference between using a calculator or not.

00:46:03   Which is fine. I'll use a calculator and I can do my math. The computer can help me.

00:46:09   But if what I was doing in my job relied on this calculator all the time, I should at least understand the theory

00:46:17   of how I get to the calculation, rather than just blindly relying on the calculator.

00:46:24   Now for me, I can blindly rely on calculators because what I do does not require that fundamental knowledge.

00:46:30   But it's in the same way that I should know how to edit a podcast.

00:46:35   I should know what it takes to remove filler words.

00:46:39   I should know what it takes to be able to fix audio.

00:46:43   Because that is intrinsically important to the work that I do.

00:46:47   I shouldn't just feed it all into a machine and let the machine do it for me.

00:46:51   Because I don't believe in the near future it's going to produce good enough results.

00:46:57   And also, it doesn't help me get better.

00:47:02   I've worked with a lot of young and aspiring podcasters over the years.

00:47:07   And one of the things I always tell people is when you start, you should edit.

00:47:13   You should edit your own audio because you'll hear in yourself the things you don't like.

00:47:18   And that proliferates and feeds back into what you will talk about.

00:47:22   You become a better content producer if you edit yourself. I'm sure this is the same in writing.

00:47:28   If you edit your own writing you become a better writer, because you stop wasting your own time on things you shouldn't be talking about.

00:47:35   And people need to put the work in to get good.

00:47:43   And I think there's also a matter of if you decide you want to do something for a living

00:47:48   and if you think, "Oh, I want to be a podcaster" or "I want to be a writer" or "I want to be a painter."

00:47:58   In theory, your purpose... I mean, let's face it, as humans, ultimately we are driven by a selection of impulses.

00:48:14   And arguably the strongest one is the pursuit of happiness.

00:48:18   And so in theory we do what we do, sure, to earn money.

00:48:23   But especially in a creative field when you are creating something, it's the process of creating that should bring you happiness.

00:48:33   I find it very hard to... and that is when burnout happens, for example: when you are a creator and you feel like you have to create

00:48:45   and it doesn't bring you happiness anymore. Now that's a problem, especially in a creative industry.

00:48:51   And so it's at the very core of being a creative person, whether you produce text or images or audio or music, whatever, film, right?

00:49:00   At the core of the creative process is the fact that the process brings you happiness.

00:49:06   You're happy to work and to make mistakes and to improve and to see, "Oh, this actually sounds better than before."

00:49:13   That is the thing that brings you joy.

00:49:15   And then there's the final work and you reach the final work and you think, "Oh, this actually sucks."

00:49:19   Because this happens to all of us. And then you get people saying, "What do you mean it sucks? It's beautiful."

00:49:24   And then you move on to the next thing. This happens to me all the time.

00:49:27   It's the process. And the reason why I've been doing this job for 15 years is that I love the process.

00:49:33   I love the grind. I love doing it. I love having a rough draft and sending it to somebody and being like,

00:49:40   "Hey, dude, this paragraph doesn't make sense."

00:49:43   It's that thing that brings me joy and to have that replaced by something that automates it for me to a large extent,

00:49:57   to an extent where maybe I'm not even involved anymore.

00:50:00   You're just taking the happiness away from me.

00:50:03   That is the thing that I love.

00:50:08   And it's a hard feeling to convey because I think, and I've been thinking about this, right?

00:50:12   I think a lot of the criticism that I'm getting, that John is getting, that you are getting for Cortex.

00:50:18   I've been in these mines, I checked today, since September 2022.

00:50:24   I still get on a daily basis new comments on the videos, the YouTube videos that me and Grey did,

00:50:32   like what the podcast put on YouTube. We did the ethics of AI art and then something about marionettes destroying…

00:50:38   People are still like, "These guys are wrong. They don't know what they're talking about." Today.

00:50:42   Two years later.

00:50:44   Yeah. So I've been thinking about this and I think a lot of the feedback that we're getting comes from people who say,

00:50:52   "Why are you against tools that make me more efficient? Because I just want to be faster.

00:50:59   I just want to be done with my work faster."

00:51:04   And you attacking or criticizing these tools means that you don't care about people's time or efficiency.

00:51:12   And I think I understand now what maybe the issue is.

00:51:18   And that is there is a gap, there is a chasm between people who create for a living and people who just need to do something for their job.

00:51:32   My mom, for example, works at this national office for like, it's called the Automobile Club.

00:51:43   It's not a fancy thing. It's the office that regulates license plates, paying car taxes, that sort of stuff.

00:51:48   It's like the DMV.

00:51:50   It's like a DMV, but not as horrible as the DMV. My mom doesn't create anything. My mom is not a creative person.

00:51:58   My mom sits down at her desk and gets her job done. She doesn't need to create anything. She likes her job.

00:52:05   There's parts of her job she would like to be faster.

00:52:09   And she wouldn't care about... like, if you go to my mom, her name is Beatrice, and you're like, "Hey,

00:52:16   would you like this tool to be faster at filling out forms about license plates?"

00:52:21   She'd be like, "Yes, please." Me? And here's the point I'm driving at.

00:52:30   And this probably sounds stupid to people like my mom.

00:52:36   It's the inefficiency that makes me happy. It's the being slow.

00:52:42   It's the tweaking and refining and creating slowly.

00:52:48   You know, the inefficiency is actually the thing. And this sounds like a paradox,

00:52:59   but the inefficiency is what I love most about what I do.

00:53:03   The worst part for me is when I publish an article, because it means I'm done with it.

00:53:11   You know, I enjoy, to this day, I enjoy working on drafts of my stories a whole lot more than publishing them,

00:53:22   because it means I'm done. And I know that it sounds silly, right?

00:53:26   Because I mean, what do you mean you're not happy? You're done.

00:53:28   Yes, I'm happy that I'm done. But what I do, what I set out to do was writing, not publishing.

00:53:39   You know what I mean? Like, I enjoy creating more than the final output.

00:53:45   And so if you take the creation part away from me, it just makes me sad.

00:53:51   That is something that I've realized over the past few weeks. Why is it?

00:53:55   And so I look at writing tools, you know, to go back to my original question.

00:53:59   I look at writing tools and like, "Why does it make me uncomfortable?"

00:54:02   And now I think I understand why it makes me uncomfortable, because it takes away a tiny part of the inefficiency that gives me pleasure.

00:54:13   And that's why.

00:54:15   And like, you know, here's the thing, right?

00:54:19   I feel like, if these tools were limited to these productive use cases, I'd be more fine with it.

00:54:25   And I am, like, for productive use cases. This is where, again, I feel like we've been trying to impress this upon people recently.

00:54:33   I don't know how well it comes across.

00:54:35   There is a spectrum of belief that is held by kind of all of your favorite podcasters.

00:54:39   Me and Federico do not share the same views, right, on this subject.

00:54:44   We're close, but not exactly.

00:54:47   Where I am less concerned about productivity use cases for AI; creative use cases for AI concern me more.

00:54:56   But you are more concerned than I am about how the datasets have been generated.

00:55:02   Where, like, I am sympathetic to your view, and I believe, like you guys do, that it is theft.

00:55:08   But I am also, I'm just less hard-lined on it.

00:55:12   Where there is an element where I'm like, I don't believe that this is right, but I'm not doing anything about it, right?

00:55:21   Other than like talking about it.

00:55:23   But I'm not doing like what you and John have done.

00:55:25   Like I'm not drafting an open letter to send to politicians, right?

00:55:30   But like, there's two problems.

00:55:34   One, these tools will be used for creativity, and that upsets me, because it should be creativity that feeds creativity.

00:55:39   There is no passion in the middle of it, and I don't like that.

00:55:44   There's nothing human in the middle of it.

00:55:46   But the other thing is, like, for the people like your mom who would want that tool, the problem is, it will remove her job, right?

00:55:53   Like, this is the thing: if your job is made easier by AI, you are heading down the road of your job being replaced.

00:56:03   Yeah, yeah, yeah.

00:56:04   And look, I mean, my mom is going to retire in three years anyway.

00:56:07   Yeah, she's good. She's good.

00:56:09   It won't come that soon, I don't believe, but it's coming.

00:56:12   But yeah, yeah.

00:56:15   And yeah, I feel like I understand, I don't want to sound like some sort of spiritual guide or whatever, you know, telling people that, oh, actually it's all about, you know, dude, it's all about creativity.

00:56:36   Like, I don't mean to sound like that.

00:56:39   Creating text is my job, and so is, you know, speaking into the microphone.

00:56:47   Like, you may prefer, over what we do, something like the Perplexity daily show, which is an AI-generated podcast that you can find anywhere.

00:57:01   It's an actual podcast generated by Perplexity, spoken via ElevenLabs voices, and you can listen to it every day.

00:57:10   And look, maybe you like it because it's, you know, a summary of tech news and it's, I'm going to use everybody's favorite word, it's objective and it's to the point.

00:57:25   But I think that there is, and you know, or you could say, I love listening to AI generated music.

00:57:38   That is also a thing that you can do.

00:57:41   Soon enough, you will be able to, I'm sure, look at AI generated short films and TV show episodes.

00:57:50   Did you see that Toys R Us thing?

00:57:52   I did.

00:57:53   That's terrible. It was so bad.

00:57:55   I did.

00:57:56   I did.

00:57:57   But I strongly believe that, I think this is a concept that I've shared on the show before.

00:58:05   That is also one of my core beliefs.

00:58:07   It sounds so simple and stupid and I'm going to sound like a child saying this.

00:58:12   I think we, as people, like being around and being in contact with other people.

00:58:21   And there's something about being inspired by somebody else's art.

00:58:27   Whether that's a documentary or a blog post or a podcast or a film or whatever.

00:58:35   There's something so unique about being inspired by, and connecting with, somebody else who creates art,

00:58:42   or going to a concert, that I don't think can be replaced by an AI that doesn't have feelings.

00:58:56   Some of my best, some of my favorite memories are going to concerts and after the concert,

00:59:04   hanging out in the back of the venue and sort of waiting for artists to come out and shake their hands and taking a selfie with them.

00:59:15   You know, it's these things.

00:59:16   And maybe while you're there, you get to know somebody and maybe that somebody becomes your fiancé five years from now.

00:59:23   It's that randomness of people's connections that I think drives the human race and drives society.

00:59:35   And to lose that for a more efficient, more objective, infinitely available, machine-generated art.

00:59:46   I think it's just sad, honestly.

00:59:49   Yeah, that's what I think.

00:59:52   I think it's sad.

00:59:54   Before we move on to more of this, I do want to just read this one last part that I wanted to pull out from John's letter.

01:00:02   Quite simply, it's theft, which is something as old as AI is new.

01:00:07   The thieves may be well-funded and their misdeeds wrapped in a cloak of clever technology, but it's still theft and it must be stopped.

01:00:15   I just love "cloak of clever technology."

01:00:17   A turn of phrase.

01:00:19   John is good with words.

01:00:21   This guy, this guy.

01:00:23   Really good.

01:00:25   This episode is brought to you by Squarespace.

01:00:29   The all-in-one website platform for entrepreneurs to stand out and succeed online.

01:00:33   Whether you're just getting started or managing a growing brand, you can stand out with a beautiful website, engage with your audience and sell anything.

01:00:40   Your products and services are the content that you create.

01:00:42   The human content that you create yourself.

01:00:45   Your own art, your own words.

01:00:47   You can publish this with Squarespace.

01:00:49   They have everything you need, all in one place, all on your terms.

01:00:52   You can start, easier than ever before, with a completely personalised website with Squarespace's new guided design system, Squarespace Blueprint.

01:01:00   You choose from one of their professionally curated layout and styling options and you'll be able to build a unique online presence from the ground up.

01:01:06   Tailored to your brand or business and optimised for every device.

01:01:10   You can easily launch your website and get discovered fast with integrated, optimised SEO tools.

01:01:15   So you show up more often in web searches and grow the way that you want to.

01:01:20   You can also integrate flexible payments.

01:01:22   If you want to sell products, they have great tools for allowing you to do this.

01:01:27   You can make checkouts seamless for your customers with simple but powerful payment tools.

01:01:31   You can accept credit cards, PayPal and Apple Pay.

01:01:34   And in eligible countries, offer customers the option to buy now and pay later with Afterpay and Clearpay.

01:01:40   You can really hit the ground running with website design with Squarespace using their tool called Fluid Engine.

01:01:46   This is their next generation website editor.

01:01:49   It's never been easier for anyone to unlock their creativity with the use of Fluid Engine.

01:01:54   You choose your website's starting point and you can customise every design detail.

01:01:58   The colours, the fonts, the layout, everything.

01:02:00   But it's all with drag and drop technology for desktop or mobile.

01:02:04   You can stretch your imagination online with Fluid Engine, included in any new Squarespace site.

01:02:08   I love how Squarespace grows with you.

01:02:11   You can start with just a blog and your writing.

01:02:14   But then maybe you want to sell some merchandise, and you can set up an online store to do it.

01:02:18   They've got everything you need.

01:02:19   And then maybe you want to send out your content via email.

01:02:22   You can have Squarespace email campaigns.

01:02:24   It all just grows with you over time.

01:02:26   They have all of the tools that you're going to need.

01:02:28   It's the perfect starting place.

01:02:30   It's the perfect growing point too.

01:02:31   Go to squarespace.com.

01:02:33   You can sign up for a free trial and you can build your website.

01:02:35   Then when you're ready to launch, go to squarespace.com/connected

01:02:38   and you'll save 10% off your first purchase of a website or domain.

01:02:42   That is squarespace.com/connected.

01:02:45   And when you decide to sign up, you'll get 10% off your first purchase.

01:02:49   And you'll be showing your support for the show.

01:02:51   Our thanks to Squarespace for the continued support of this show and all of Relay FM.

01:02:55   You shared something with me, and you said you'd really appreciate it if I read this.

01:03:01   It's called "Context, Consent and Control, the 3 Cs of Data Participation in the Age of AI."

01:03:12   Can you explain to me what this article is and where it came from?

01:03:15   So it's an article by Eryk Salvaggio about this:

01:03:19   trying to understand why people like me, for example,

01:03:25   why do we feel this vague discomfort with AI and why do we feel so anxious about AI?

01:03:38   Sometimes in a way that is so overwhelming and so difficult to put into words that we sound like lunatics.

01:03:45   And I know that I have felt that way.

01:03:50   We've spoken about this and I was able to identify some of the things that were making me uncomfortable,

01:03:56   but there are still so many other aspects of this entire industry that I find difficult to articulate.

01:04:05   And this article tries to get to the bottom of these three principles, the three Cs:

01:04:14   Context, Consent and Control, that are common threads for a lot of folks looking at AI products

01:04:23   and feeling uncomfortable and trying to explain why.

01:04:28   So I really loved it. There's a couple of parts I wanted to pull out.

01:04:35   Unfortunately, anti-AI anxiety often presents itself through vague and inexpressible discomfort.

01:04:44   This is something that is overwhelming people, often in ways they cannot capture in precise language.

01:04:49   If we don't have the language to describe what's happening, we certainly don't have the language to articulate solutions.

01:04:55   And I know this from myself, going back to when I first started talking about this two years ago and even to today,

01:05:01   where it's hard to get across my feelings on this when they're so emotion-led and I just sound hysterical.

01:05:11   And then there are people that come in and they use their smarts to make me sound like I'm stupid.

01:05:18   This is what I've been dealing with for two years.

01:05:20   Where they're like, "But don't you see?"

01:05:22   Or like, "Oh, this guy, he's just worried about his own job!"

01:05:25   And like, you know, whatever.

01:05:26   And exploring our biases is important with this, right?

01:05:31   And I have owned up to this from the beginning of my feelings about this, but people still like to throw it back at me.

01:05:38   I have a bias towards creative work.

01:05:41   Oh yeah.

01:05:42   But the thing that I've expressed today, and you have too, is what's changed for me is my bias remains, but I don't feel threatened anymore.

01:05:52   So, for example, right?

01:05:57   The idea that my work could have been taken and fed into a thing, it annoys me, but I don't feel threatened by it.

01:06:04   Because I feel very confident in why people come to the shows that I make.

01:06:11   I have no doubt that there is a certain sector of podcasts that would be obliterated by AI-created podcasts.

01:06:21   I don't believe it's the content that I make.

01:06:24   Because I believe the content that I make is not general.

01:06:30   It is specific.

01:06:32   And part of its specificity is the people.

01:06:35   And I just don't think that that can be replicated.

01:06:40   And I can't imagine the people that choose to listen to this show would be happy with an AI-generated version of this show.

01:06:48   Because I just feel like they would understand they're not going to get the same thing.

01:06:53   But nevertheless, right?

01:06:55   If you are someone who is worried about this stuff, if it causes anxiety in you, it is hard to express something when it's so complicated.

01:07:07   Yeah.

01:07:08   Yeah.

01:07:09   That is something that I felt.

01:07:11   And something that I still feel.

01:07:13   And part of the reason why we wanted to publish the letter was to be able to convey in words, in a single page, what we felt.

01:07:25   And I think this article, it is to this day the best thing I've read on the subject, from my perspective.

01:07:34   Like, every single item of these three Cs, I could see myself agreeing with that.

01:07:45   And I think, starting with the idea of context.

01:07:51   Basically, understanding when my data is used, and where and by whom it gets used.

01:08:03   You extracted this quote from the article.

01:08:06   "Why does sharing a work on a website turn it into facts, while sharing the work with my friends keeps it a work of art?"

01:08:15   This, I think, is a really important point.

01:08:19   And I think it is a key point for me in this whole conversation.

01:08:26   Something that I keep hearing from people, including today.

01:08:30   I got an email a few hours ago about this very concept.

01:08:36   And the email said something along the lines of, "What makes it different?"

01:08:42   You know, if I'm a person that reads MacStories and learns from you, and learns from your articles, learns from your shortcuts.

01:08:50   And then I go on about my life, and that knowledge reshapes, rewires my brain.

01:08:59   And I end up creating a work that is unintentionally inspired by the knowledge that I accumulated from MacStories.

01:09:08   First of all, I'm glad that sounds like a possibility.

01:09:11   But the person said, "Do I owe you a cut of what I make with my work?"

01:09:18   Because I was inspired by MacStories.

01:09:21   Why is it that you draw the line between people reading MacStories versus an artificial intelligence scraping your website?

01:09:32   And reusing part of that knowledge for future tasks.

01:09:37   And that is a very important point for me.

01:09:44   Because there's something that I believe in, which is the fundamental difference.

01:09:53   That I'm surprised that maybe...

01:09:57   I'm surprised to be getting this sort of email.

01:10:00   Because I think it's quite obvious from my perspective to see the difference between people learning something.

01:10:11   And being inspired by other people, versus becoming, unintentionally from MacStories' perspective,

01:10:21   part of a corporation's scheme with their large language models.

01:10:28   I decided to write articles that people read.

01:10:37   And they can, as far as copyright law goes, do whatever they want with it.

01:10:43   I cannot come to your house and knock on your door and be like, "Hey guy, I want you to forget my article."

01:10:51   That sounds ridiculous, right?

01:10:53   Or even the idea of, if we think about fair use.

01:10:57   Fair use was a problem in and of itself for reasons that predate AI.

01:11:03   I feel that what we're doing on this episode today is fair use.

01:11:07   We are reading some quotes from an article, but then we are talking about it for 15 minutes.

01:11:13   Where I feel like you can tell in this scenario that we have had thoughts about this article and we're talking about them.

01:11:19   My hope is that you will go and read this article now.

01:11:22   But I feel like this is the implied thing.

01:11:25   This to me is the implied thing.

01:11:27   If you put something online, you open yourself up to a scenario where someone can do what we're doing now.

01:11:32   I feel like this has been a known thing for a long time because it's very human.

01:11:36   That someone may quote you. This is very human.

01:11:40   It's very human.

01:11:41   And remixes, it makes me think of the excellent series by Kirby Ferguson, "Everything is a remix."

01:11:50   This is what humans do. This is what humans have been doing forever.

01:11:54   We have always, always, and we will always continue to remix something that has been done, produced, or come before us.

01:12:03   We will always do that. It's what we do as a species.

01:12:07   We learn from things we have seen and we remake them to an extent that sometimes is fair, sometimes in the case of plagiarism is not.

01:12:17   But the difference, you know, when I got that email, which I want to respond to, the difference is that I don't want to become...

01:12:29   I don't want my text to become part of a system, to become part of a machine.

01:12:39   Where I'm using the word "machine" in the abstract sense.

01:12:42   I know that it's not a machine, you know, moving gears left and right, up and down.

01:12:47   But to become part of a system that may be putting people out of jobs.

01:12:54   I feel uncomfortable with that.

01:12:56   I set out years ago to publish a website for people. I posted about this on Mastodon, I think a couple of weeks back, or maybe last week, I don't recall.

01:13:04   I said something along the lines of...

01:13:07   I publish a website. If I wanted to open an English school to teach English to bots, I would have done that.

01:13:16   I decided to write a website for people.

01:13:20   I am not...

01:13:23   At no point did I make a decision to say, "Oh yeah, I want MacStories to be one of the sources for large language models.

01:13:35   And I want a tiny part...

01:13:37   You know, I want to be one of the many tokens used by an LLM to learn the English language."

01:13:45   Which, by the way, is a thing that you can check, whether your website is part of a dataset, and yes, MacStories is part of this dataset.

01:13:52   That is not a decision that I made.

01:13:57   And so, the context of how my work is getting used has stepped outside of the boundaries of my editorial decisions.

01:14:12   The content was taken from me without my consent, which is the next item in the article.

01:14:20   I wasn't told about it until after the fact.

01:14:25   And I was never consulted upfront.

01:14:29   I knew that I was publishing a website.

01:14:32   I didn't know that that content may also be taken by a machine, by an AI system, and used for training.

01:14:42   And I don't feel comfortable with that.

01:14:45   Right?

01:14:47   Just like, for example, you know, there's...

01:14:52   I remember, what was it, a couple of years ago, a few years back...

01:14:59   I don't mean to get political, I'm just using this as an example.

01:15:03   I believe Trump was using... Was it a Bruce Springsteen song that was used at one of his rallies?

01:15:10   I think so.

01:15:11   And the artist said, "Look, I'm not comfortable with that."

01:15:15   It is an artist's right to be able to say, "Look, I'm not comfortable with you using my art for this."

01:15:22   Right?

01:15:24   So, yeah, context, I think, is key to understanding the difference between humans using reasoning over your creation versus a machine just taking it.

01:15:37   So, first C, context.

01:15:40   When and where can my data be used and by whom?

01:15:43   That is an established thing that you understand.

01:15:46   Then we move into consent.

01:15:48   Inform me of how my data is going to be used.

01:15:52   For me, the thing that I took away from this is people may want to give their data to be trained on for varying purposes.

01:16:02   I think of Creative Commons, right?

01:16:04   The idea of Creative Commons basically being a layer on top of copyright where you can essentially give a license, kind of like open source, too.

01:16:16   You're saying, "You can use my work for X, Y, and Z, but not A, B, and C."

01:16:21   But people aren't given a choice.

01:16:24   There's no consent, right?

01:16:26   So, the context is you've shared something on the web and you believe it to be used for a certain thing.

01:16:31   But then maybe somebody wants to use it to train their large language model, but they don't give you the choice.

01:16:37   For me, I was thinking about this, what would my consent model be like?

01:16:42   For me, if somebody wanted to take my podcasts, right?

01:16:48   Let's just imagine that I'm having the say on this, right?

01:16:51   But for me, just my voice in my podcast.

01:16:54   If they wanted to take that, transcribe it, and use that data to train an intelligent assistant, that would not bother me.

01:17:02   It just doesn't.

01:17:04   That thing just doesn't bother me.

01:17:06   What would bother me is if they wanted to take my voice and make a synthetic voice out of it.

01:17:11   That would bother me.

01:17:12   Obviously.

01:17:13   Right?

01:17:14   But I don't get to be able to make that choice because nobody is asking.

01:17:19   No one's asking.

01:17:21   I don't know, but I'm convinced that my podcasts have been used to create the Whisper model that OpenAI has.

01:17:30   For sure.

01:17:31   Because why wouldn't they?

01:17:33   For them, it would be stupid not to do that.

01:17:35   It's just stuff that's out there.

01:17:36   You can just scrape RSS feeds and you've got tens, hundreds of thousands, millions of hours of audio.

01:17:42   It's just free and available.

01:17:44   But I just want to be able to say that I'm okay with something before it's been done.

01:17:48   And that comes back to one of the inciting events of all of this, which is what Apple did.

01:17:52   Where they gave you a way to consent, but only after they'd done it. It's like madness.

01:17:57   Which John so beautifully described in the open letter as a thief going to a shopkeeper after they've robbed their shop and offering them a lock.

01:18:10   After the fact.

01:18:12   Which is effectively what Apple has done from my perspective.

01:18:15   Here's a lock, because you've got to stop me from doing this again.

01:18:19   What about the first time?

01:18:22   Okay, thank you.

01:18:24   But yeah, we go back to that original point from a few minutes ago.

01:18:29   Now that it's all said and done, unfortunately, that Apple as an example.

01:18:36   You mentioned Unconnected a few weeks ago when I was not on the show.

01:18:45   Where, you know, you said, "I don't want to speak on Federico's behalf, but I feel like the Apple thing was sort of the last straw," you know, that it sort of pushed Federico over the edge.

01:18:59   And yeah, that is true.

01:19:02   I felt like I was not expecting it and it caught me off guard.

01:19:07   And there was also something that Stephen said. You know, I think it's pretty much been established that the three of us have very different opinions about this stuff.

01:19:21   But Stephen said, you know, something along the lines of, "I don't think it's useful to have an absolute stance on anything at this point."

01:19:38   And I disagree with it.

01:19:40   I think I do have an absolute stance on how I feel about my content being taken from me without my consent.

01:19:50   Because, for example, now I feel very uncomfortable about the fact that part of MacStories was crawled and scraped, and maybe a part of it was used to build the model for Writing Tools.

01:20:07   And maybe, and it sounds silly, but maybe that means that, you know, six months from now, there will be somebody who needs to write something and instead of going to an editor and, you know, paying them $100, they just say, well, you know, I could just use the AI to do it for me.

01:20:29   Why do I need a person?

01:20:31   And that makes me feel very uncomfortable knowing that I unintentionally played a part in it.

01:20:41   It makes me feel very uncomfortable and I think it's very wrong.

01:20:46   So, yeah, consent, I think is something that I take an absolute stance on.

01:20:52   Because it's wrong.

01:20:53   And I think when you see something that you think is wrong, you should say it.

01:20:57   Because nuance only goes so far.

01:21:03   Because then you risk playing into the, oh, both sides have their arguments kind of, you know, position.

01:21:12   And I think when it comes to this, you know, Mike, you literally did it two years ago.

01:21:20   The ethics of AI art.

01:21:22   Like, you know, you've been thinking about this stuff for two years and when it comes to these ethical matters, I think you got to call it.

01:21:29   When you see something that is wrong, you got to say it.

01:21:32   And like, you know, with what you said about the inciting event, the reason you were surprised about it made perfect sense.

01:21:38   Because leading up to WWDC, we were talking on this show over multiple weeks about where is Apple going to license their data set from.

01:21:46   And we were referencing their photo training because it just seemed natural to us that they would have done it differently.

01:21:54   And really, they kind of didn't.

01:21:57   Like, they kind of waved their hand around about certain things that they've licensed, which seem to be imagery.

01:22:03   But text, they just took the same stance as everybody else, which is just like, hey, if it's on the web, it's fair game.

01:22:09   So then the last point is control.

01:22:13   Allow me to refuse the use of my data for certain uses.

01:22:17   So, you know, consent and control feel like two sides of the same coin, right?

01:22:22   Where, like the consent piece, but in the sense that even where someone hasn't asked for it, you still get the ability to be like, no.

01:22:32   Right? And that you have some kind of control over it.

01:22:37   And here's, I think, a really great way of exemplifying how unlikely it is that we're going to get this control.

01:22:44   So this is from Eric's article.

01:22:48   "Many from my generation can't understand why single moms get fined $1.5 million for downloading MP3s while tech companies can now hoover up our YouTube videos and have seats on government panels about regulating AI."

01:23:00   Yeah.

01:23:02   Kazaa and Napster were never going to be in the room regulating the use of MP3s, right?

01:23:10   That was never going to happen.

01:23:12   How we've gone from that to the scenarios we have today is incredible if you think about it.

01:23:19   Where you should be in the room, right?

01:23:21   Because it was the RIAA or whatever, but they were working on behalf of the record labels, on behalf of the musicians, and they were in the room trying to get rid of the piracy, right?

01:23:34   Where the equivalent would be that you, or a representative for people like you, should be in the room.

01:23:39   And I'm sure there are some, but they're also the same people around the same tables as Apple and Google and OpenAI, and they're trying to regulate themselves.

01:23:48   And Ben Thompson always talks about this and it's such a great point.

01:23:52   Of course they want to regulate because then they get to protect their advantage.

01:23:56   They regulate around their current systems and have the money and resources to be able to change around any regulation because they're in the room helping make it happen.

01:24:06   And it entrenches them, because even other companies that want to do the same thing can't.

01:24:11   It's like madness,

01:24:13   what we're in.

01:24:14   Yeah.

01:24:15   So yeah, I don't really know.

01:24:18   Obviously we have not solved anything with this discussion, but I hope that we have clarified our positions and I hope we have given people who maybe disagree with us a way to understand how we feel and how we think.

01:24:35   I strongly believe, you know, I very strongly believe in, I love people.

01:24:46   This is a very simple sentence to say, but I love people for all the weird, beautiful things people do.

01:24:53   I love being around people.

01:24:56   I love talking to them.

01:24:57   I love arguing with them.

01:25:00   There's something to it; otherwise, I mean, I would just stare at my computer all day and I would never leave the house.

01:25:09   I would live alone.

01:25:10   I, you know, I wouldn't consume art.

01:25:15   I would go live in a cabin in the woods and die alone and never talk to anybody.

01:25:21   But I very much believe in the power that people have when it comes to obviously creating art because I'm a creative person.

01:25:33   But also in people being able to find a common ground to understand each other.

01:25:41   At least I hope so.

01:25:44   And I think I want to make it clear, I understand that for a lot of folks, these AI services sound and seem and are incredible.

01:25:58   And I also understand how for a lot of non-creative industries, whereby non-creative, like I'm simplifying here, but I mean, you're not blogging, you're not recording the podcast.

01:26:11   I understand how for science, for cancer research, for, you know, complex mathematics, for all kinds of industries that go beyond the Mac stories and go beyond relay,

01:26:25   why large language models are the future and are just an incredible new thing to have.

01:26:33   But at the same time, I urge you, if you can, to stop for a second instead of opening Mastodon, because I know you are, or sending an email.

01:26:44   Stop for a second and think about a potential future in which you don't get to send that email anymore,

01:26:54   in which you don't get to post that Mastodon or Threads post anymore, because there's nobody to email it to, right?

01:27:03   Just think about a future where Mike and I are probably okay, as Mike said, because we're fortunate to have an audience.

01:27:13   But for a second, just imagine that maybe your kid, 20 years from now, you know, wants to listen to a podcast in between class breaks at college,

01:27:24   and all they have is some flavor of a large language model that, instead of making stupid jokes about, you know, the Rickies,

01:27:37   or, you know, teaching the Italian language, or saying very dumb jokes, as we do on this show, is just an LLM spitting out facts.

01:27:50   And if you're a listener of this show, and you disagree with us, but maybe at the same time you have come in the past to one of our live shows, right?

01:28:03   I think it's very possible that there's people who have come to our live shows and also disagree with us on AI.

01:28:11   And for a second, to understand our perspective, think about a scenario in which there's no live show to come to anymore.

01:28:19   And maybe, and I hope, because I believe in people, I hope that you will understand why it's sad to not have it anymore.

01:28:32   Because, once again, I do think that Mike and I are probably fine, but I hope that there will be new, you know, new creators, 20 years from now,

01:28:45   doing this and sharing dumb jokes and meeting each other, you know, falling in love, getting married, whatever, all the things people do.

01:28:53   So yeah, I hope you can understand our perspectives now, and if you don't, it's fine, but we wanted to have this conversation.

01:29:03   Alright, this episode is brought to you by Ecamm. Ecamm is the leading live video production and live streaming studio built for the Mac.

01:29:13   Ecamm does all aspects of video, not just live streaming. It's perfect for simplifying your workflow.

01:29:18   It's easy enough to get started quickly, but powerful enough that you can create just about anything with video.

01:29:24   You can do it all with the Ecamm app. Whether you're streaming, recording, podcasting or presenting, everything is there in Ecamm,

01:29:31   including support for multiple cameras and screen sharing, plus a live camera switcher that lets you direct the show in real time.

01:29:39   You can stand out from the crowd with high quality video, add logos, titles, lower thirds and graphics, share your screen, drop in video clips, bring on interview guests, use a green screen, and so much more.

01:29:50   Ecamm Live does it all. And to be able to have this power on the Mac is rare; Ecamm is incredible in what it does.

01:29:57   I was talking to Stephen before the show and he was telling me how much he loves Ecamm Live.

01:30:02   He told me that he loves how it integrates with the Mac's excellent media features, and that it's so easy to mix and match video playback with HDMI capture and screen capture and more, layering them all together in an easy to use interface.

01:30:14   And just some exciting news, Ecamm for Zoom is now available for everyone on the pro level plan, so you can automatically send Ecamm's live audio and video output into a Zoom meeting, Zoom webinar or Zoom event.

01:30:28   You can add up to eight Zoom participants as camera sources in your broadcast or recording. Plus, you can automatically create individual participant audio and video recordings and add Zoom chat messages to your broadcast or recording as text overlays.

01:30:42   Ecamm's members are entrepreneurs, marketing professionals, podcasters, educators, musicians, church leaders, bloggers and content creators, streamers of all kinds.

01:30:51   Get one month free today at Ecamm.com/zoom using the code connected. That's a whole month free of Ecamm Live at Ecamm.com/zoom with the code connected.

01:31:06   Go there now and check it out. Our thanks to Ecamm for their support of this show and Relay FM.

01:31:13   So to wrap up today, I want to circle back a little bit about the open letter.

01:31:18   Okay.

01:31:19   By the way, we had like four other topics today that we've just thrown out.

01:31:22   Yeah, yeah.

01:31:23   I didn't know. You know, we're just doing our thing over here.

01:31:26   So this is an open letter that has been sent to Congresspeople in the US and will be sent to some MEPs in Europe, in Italy specifically, I think, right?

01:31:38   Will be sent to some Italian representatives on the EU committee about AI.

01:31:44   And did you take a similar tack with the letters to the Congresspeople? Like, how did you, or John, how were the Congresspeople chosen?

01:31:53   John found the senators that sponsored AI legislation publicly and was obviously able to find their web pages.

01:32:06   And it was a bit of a process in the US because John tells me that these websites are horrible.

01:32:12   One of them wouldn't even let him paste the letter because the website kept saying that it was a security risk to be able to paste plain text.

01:32:21   And so John had to mail physical letters to senators just to make sure that they were getting them because their web systems are a bit old, it seems.

01:32:32   I think given what I've seen on the EU website that I can just email people because each of them, they have an email address.

01:32:41   I mean, the open letter had a lot of links in it. How do you deal with links in a printed form?

01:32:47   Oh, the printed version has citations at the bottom of the letter.

01:32:51   What a lawyer that man is.

01:32:53   Yeah, yeah, yeah.

01:32:55   Obviously in sending these things out, you are, like, you know, you are making yourself known to these people.

01:33:04   If you were called to testify, would you do it?

01:33:08   A hundred percent.

01:33:10   Okay.

01:33:10   Yes. Yeah. Yeah. If that even ever happens and I get to share my take on what happened, I mean, it's all out in the open.

01:33:26   I feel like if I can play any role in because I strongly believe that we do need legislation when it comes to regulating training and data sets and rethinking.

01:33:42   I mean, obviously, like the elephant in the room here, which is very much outside the scope of this episode.

01:33:51   The concepts of copyright and fair use are in deep need of a rethink.

01:33:57   And I mean, let's, you know, let's face it.

01:34:01   They were old before.

01:34:05   It's become just a hundred times more complex now.

01:34:10   But yeah, I think I would love to do it. And I do think we need, at a fundamental government level, to put in place guardrails for publishers and creators to make sure that we protect,

01:34:28   like I said, the economic viability for people to be able to create art and to publish content for a living, instead of being replaced by large language models.

01:34:43   And I think, you know, smarter people than me, I'm sure, could work on a framework so that the two entities, people and large language models, can coexist without one of them suffering from the existence of the other.

01:35:03   I have a lot of respect for what you two are doing, like, you know, forming new opinions, clarifying those opinions and then speaking so clearly and loudly about them.

01:35:13   Even, you know, as we were talking, under what is, I have no doubt, mounting criticism.

01:35:18   I'm sure people are happy for you and support you too, but sometimes it can be hard to see that for the noise.

01:35:25   Yeah, I know. There's something, I've been listening obviously to Cortex for many years, since the beginning, really. What was it, 2015?

01:35:36   Yeah.

01:35:37   Yeah. And there was something that Gray said in either episode one or two that has really stuck with me all these years. Something along the lines of you put out something on the Internet and you get 100 items of feedback, 99 compliments and one criticism.

01:35:57   And the way the human brain works, you fixate on that one criticism out of 99, you know, items of praise.

01:36:07   But it's OK. It's part of the job. And like I said, when you are honest with yourself about something that you believe in, you shouldn't be scared of saying it.

01:36:21   If I were a hypocrite and, you know, said or wrote things that I don't fully believe in.

01:36:30   I would feel horrible. I would feel physically sick. But that's not what I do. And so I can take the criticism; I've taken it before, I can take it now.

01:36:40   It's fine. I just hope that, you know... and I am getting those emails, people being like, "Oh, I'm unsubscribing from Club MacStories because I'm tired of you and John criticizing AI."

01:36:55   Like I'm seeing that. It's OK.

01:36:58   I hope that we can work toward a solution in this conversation. And that includes, you know, something that I was pretty disappointed by.

01:37:12   When it comes to our community, when it comes to our industry, I was pretty disappointed by the fact that Apple has stayed silent on all of this.

01:37:20   And I hope that that will change. I hope that we will get, at some point, some kind of comment from Apple about what they think of the training that they have done and how they have approached this.

01:37:32   I was, and I am, pretty disappointed by that.

01:37:36   So I hope that we will be able to work toward a solution, and if that solution, you know, also includes working with regulators to share our feedback, I'm up for it.

01:37:51   Yeah.

01:37:53   All right. Thank you for listening to this week's episode of Connected. You should go read Federico and John's post.

01:38:00   It'll be in the show notes for this week's episode. But you can also go to macstories.net, where you'll find Federico, and he is @viticci online.

01:38:09   V I T I C C I. He also hosts a bunch of podcasts. We were talking in the pro show about one of my new favorite podcasts, which is NPC: Next Portable Console, which Federico and John host together with Brendon.

01:38:21   Wonderful.

01:38:22   If you want to find Stephen online, well, I hope he'll be back next time. I don't know, maybe he'll have too much freedom and go wild with it.

01:38:31   You can never tell. He's at 512pixels.net and he is @ismh86. Really think about that for a second.

01:38:39   I am @imyke, I M Y K E. You can hear this show, my show, and other shows here on Relay FM.

01:38:46   You can also find my product work over at cortexbrand.com. Thank you to NetSuite, Squarespace and Ecamm for the support of this episode.

01:38:54   And thank you to our members who support us with Connected Pro. You can go to getconnectedpro.com and get longer, ad-free episodes each and every week.

01:39:04   We'll be back next time. Until then, say goodbye Federico.

01:39:08   Arrivederci.