PodSearch

The Talk Show

362: ‘Grand Scale Foot-Shooting’, With Anil Dash

 

00:00:00   The saying that's been in my head ever since you kindly agreed to be on the show today,

00:00:04   and I was thinking about the stuff I wanted to talk about,

00:00:07   is the famous Chinese blessing that is in fact a curse, "May you live in interesting times."

00:00:17   Got what you're wishing for, right?

00:00:18   And if that does not in a single sentence summarize November 2022, I don't know what else does.

00:00:28   I feel like I keep hearing that phrase, and I keep hearing, "The dog who caught the car it was

00:00:32   chasing."

00:00:33   That's very much true. So if you don't mind, and I don't talk electoral politics often on this show,

00:00:41   but it's been an eventful month, and I kind of feel it ties into the discussion of Twitter,

00:00:46   which is going to be the main thrust of the show. I don't know if you've heard, I'm presuming that

00:00:53   you've heard.

00:00:53   What? What happened? What?

00:00:55   But I don't think that they are unrelated at all.

00:00:58   No, they're inseparable.

00:01:00   Right. Former TV game show host and New York real estate magnate Donald,

00:01:05   let me check the spelling here, Trump, announced that he'd be running for president last night.

00:01:10   Surprising, absolutely nobody.

00:01:12   Yeah, can't believe it.

00:01:13   But there is a weird thing going on, where the good news for me and you,

00:01:18   probably most people listening to the show, but I realize that because this is not a political show,

00:01:22   there's certainly some people who lean Republican who listen, and I'm not trying to turn this

00:01:27   opening segment into a go blue power hour. But you know, the Democrats had a good election.

00:01:33   Everybody, I don't know any-

00:01:35   Shockingly good.

00:01:35   Yeah, shockingly good. Historically, in the US, where we have elections on a very regular schedule,

00:01:42   as opposed to like a parliamentary system, every four years, we have a presidential election,

00:01:47   and in between, we have what we call the midterm. And historically, whichever party holds the White

00:01:53   House, the midterms tend to go to the opposition party. And it makes-

00:01:57   Well, usually they get shellacked. It's like it's not a slight tilt.

00:02:00   It's usually they get their butts handed to them.

00:02:02   Right. And most recently, or among the most recent, but the one that almost doesn't seem

00:02:10   intuitive, given how we think of him and his presidency is Barack Obama,

00:02:15   who finished two successful terms and left with very high ratings and is still,

00:02:20   I would say universally acclaimed as the most popular political figure on the left in the

00:02:26   United States and whoever is in second place-

00:02:28   In generations.

00:02:29   In generations.

00:02:30   Maybe since Kennedy or something. Yeah.

00:02:31   Yeah. And Kennedy obviously left on martyr's terms. So it's hard to compare. Whereas Bill Clinton,

00:02:39   who left with very high approval ratings, was much more controversial because of his personal

00:02:46   predilections, for lack of a better term, among other things.

00:02:50   And I don't think the years have been kind to his legacy, even amongst his supporters.

00:02:54   Right. But he did leave office, let's just say, you know, he's very popular,

00:02:58   but not in a way- so let's say, how long are we out of Obama? So we're six years post-Obama, right?

00:03:04   Is that right?

00:03:05   That sounds right.

00:03:05   So six years post Bill Clinton would be 2006. He felt more like, not history,

00:03:12   but-

00:03:12   Wasn't really relevant.

00:03:13   Bit of a relic.

00:03:14   Yeah. Already in 2006. And 9/11 was sort of a snap of the fingers change of the dynamics

00:03:22   of everything anyway. And it had nothing to do with Bill Clinton, but Barack Obama, very popular,

00:03:28   he got killed in the midterm elections, two years in.

00:03:32   Yeah. Yeah. Again, almost at a historic level.

00:03:35   Right. Like, just devastating rejection of his policies or revert back to the center,

00:03:42   whatever you want to call it. Democrats did very well this year for a midterm. There are

00:03:47   two opposing explanations. I mean, I know all politics is very complicated and multi-

00:03:53   I thought we were going to go straight to, and therefore it's time to fully socialize Twitter.

00:03:57   No, that's not- okay, sorry. I just- full communism, we're taking it over,

00:04:03   this is now the state media.

00:04:04   Sorry, I shouldn't jump the gun. Sorry. Okay.

00:04:09   One explanation is that Donald Trump is still obviously the leader of the Republican party,

00:04:14   and he is so deeply unpopular, and has insisted upon fealty from his political

00:04:24   followers to promulgate the big lie, a true capital B, capital L, Big Lie out of the big lie

00:04:34   playbook of demagoguery, authoritarianism, whatever you want to call it, that he actually

00:04:41   did not lose the election that he very clearly lost. It wasn't that close. It was closer

00:04:50   than I would like it to be in various states, but when you overall look at it, it wasn't even that

00:04:54   close. He lost. And again, I don't know if you recall, but before he left office, he attempted

00:05:03   what I honestly believe was a coup. Yeah, a coup to stay in power is the right historical term. So that's one

00:05:10   explanation of why the Republicans lost in this midterm is that he's still a leading figure,

00:05:15   his party still hasn't completely rejected this nonsense and said, "No, you lost, you big dummy,

00:05:21   let's move on." The other is that earlier this year, the Supreme Court overturned Roe v. Wade,

00:05:30   which had been 50 years of legal precedent, making abortion, legalized abortion, the law of the land

00:05:37   in all 50 states coast to coast, all US territories, and that this was so unpopular.

00:05:45   And this is, I think, where you're going with the dog catching the car, right? That for your and my

00:05:49   lifetime, and the lifetimes of most of the people who are going to be listening to us, if not

00:05:54   their entire lifetimes, this has been a hot button issue in United States politics every year. It's

00:06:00   never abated. It doesn't ebb and flow. It's always red hot. And it has been incredibly effective for

00:06:10   Republicans because there is a certain portion of their base, the evangelical vote in particular,

00:06:16   who is electrified by this issue, who truly believes that they want abortion to be illegal,

00:06:22   and that they could use that issue every single election to drum up a certain portion of the

00:06:29   electorate who would vote on that issue.

00:06:31   Right. There's a substantial number of single-issue voters who are galvanized by that to the exclusion

00:06:36   of everything else.

00:06:36   To the exclusion of everything else. And therefore, if what Republicans wanted to do in a state level

00:06:43   or at a national level had nothing to do with abortion or even healthcare at all, just something

00:06:48   to do, let's say, with the tax rates, which often comes up, but they could galvanize some portion,

00:06:55   a significant portion of voters who care about nothing else, or at least while they might care

00:07:01   about other things.

00:07:02   They might compromise on everything else for this.

00:07:04   Right. And so that strategically, the best thing that could happen to them, according to this

00:07:11   theory of politics, would be for Roe v. Wade to remain the law of the land forever, but keep it

00:07:20   out in front of the voters and keep saying, "We'll keep nominating judges who are skeptical about it,

00:07:25   and we'll keep doing it."

00:07:26   Always on the cusp.

00:07:27   Always on the cusp.

00:07:28   James Bond never dies. He's just always in danger.

00:07:32   Cinematically, I think it's one of the great scenes of movie history is in Indiana Jones and

00:07:38   the Last Crusade with, of course, Steven Spielberg as the director, and it's near the end, and

00:07:43   there's the Holy Grail. And this temple is falling apart, and Indiana Jones has fallen into this

00:07:49   crevice, and there's his father is holding onto his hand. He's slipping, but he's got one finger

00:07:55   on the Holy Grail, and he can almost tilt it. And if he could just tilt it, maybe—

00:07:58   I can almost get it, Dad.

00:07:59   —a quarter inch, just that! And his dad finally calls him by the name he prefers, "Indiana, let's

00:08:07   go." Or I forget the exact line, but he gives it up. Let it—

00:08:10   I think he says literally, "Let it go."

00:08:11   "Let it go." And—

00:08:12   Yeah.

00:08:13   But that finger on the cusp of the whole—we're so close, we're so close—

00:08:18   has been—now, where those two theories—and they've both—

00:08:23   And this is that one rare Spielberg movie where the dad's actually sticking around, right?

00:08:26   So it's as deep as it gets.

00:08:28   It truly is. I think it's a great movie.

00:08:31   Yeah, it is a good one.

00:08:33   And yeah, it's true that that's the one where the dad—yeah, it's deep on that level. In the

00:08:39   Spielberg-y—and when we look back on his whole career, it'll stick out for that reason, too.

00:08:44   Of course, both of these things can be true. That Donald Trump's overall unpopularity with

00:08:49   the overall electorate, combined with the overwhelming majority of U.S. voters who favor

00:08:56   some level of legalized abortion, including support for Roe v. Wade remaining the law of the land,

00:09:04   and even while Roe was the law, there obviously have been, especially in recent years,

00:09:10   state-by-state restrictions.

00:09:12   Yeah, it's been effectively illegal for a lot of people for a long time.

00:09:15   It has been.

00:09:15   Or—

00:09:16   —difficult.

00:09:16   Illegal by your and my perspective from Pennsylvania and New York, right? But from

00:09:22   where we might be going was legal. So they're not in conflict. Both things could be true.

00:09:27   But the interesting confluence of them is that Donald Trump personally was aghast when the

00:09:36   Supreme Court Dobbs decision that overturned it leaked. The leaked version was other than—

00:09:42   Was pretty accurate.

00:09:43   It was like a few bits of copyediting different.

00:09:47   It's like the final draft, not final, final draft.

00:09:51   Right. Well, the one thing—every once in a while, it's not like I sit there and read

00:09:56   Supreme Court decisions as a hobby, but it is interesting to me as a layperson to the law

00:10:03   how well-written they are.

00:10:04   They're great writers. They're incredible writers.

00:10:06   Right.

00:10:06   The journalist Dahlia Lithwick, who is one of my favorite journalists, period, and she

00:10:11   covers Supreme Court law for Slate, and you see in the way she quotes it, that level of

00:10:16   fluency they have where they know they are many times just literally writing history.

00:10:20   Correct. And she's been at Slate for, ooh, a long time.

00:10:25   Yeah, a hen's age.

00:10:26   Yeah, for all the turnover in the media world, Dahlia Lithwick at Slate covering the Supreme

00:10:31   Court has been a mainstay, and you're correct. I will say this. I, of course, did not agree

00:10:36   with his politics almost at all. I also feel that he was quite a bit of a hypocrite in terms

00:10:43   of his proclaimed adherence to certain principles and the way he voted in certain cases where

00:10:51   it went against his way.

00:10:52   But Antonin Scalia, even by the standard of Supreme Court justices being good writers, was an

00:10:58   exceptional writer.

00:10:59   I mean, like, really, I mean, just read some Scalia decisions and it's like, oh, man, this

00:11:05   guy had, like, a columnist's bones in him for, like, a turn of phrase.

00:11:09   He's a good enough writer that it was exasperating that he was wrong about everything for me.

00:11:13   Yeah.

00:11:13   Like, it's one of those things where you're like, ah, yeah, it'd be better if you sounded

00:11:16   like a caveman while being, while pushing all my buttons.

00:11:19   But as soon as that decision leaked, and it turned out the leak was going

00:11:25   to be their decision, it also then leaked from sources close to President Trump down in

00:11:31   Mar-a-Lago that he was aghast at this and thought this is going to be a disaster for

00:11:37   Republicans with suburban women and young women.

00:11:40   Yeah.

00:11:41   Yeah.

00:11:42   And guess what?

00:11:43   Yeah.

00:11:44   It was.

00:11:45   So in some ways, these two things, while they could both be true, they're sort of in conflict.

00:11:50   And then there's a part of you that wants to say, my God, Trump, you're the one who

00:11:55   nominated three of these fucknuts.

00:11:57   So, yeah, I mean, I think there's a sort of, I'm very skeptical of, like, grand unifying

00:12:03   theories and everything is 12 dimensional chess and everybody has a plan and understands

00:12:07   what's going on.

00:12:07   I think we ascribe historical narratives retrospectively and like, this is why this

00:12:12   happened, right?

00:12:12   And it's like, you got to get at least 15, 20 years, maybe 50 years clear or something

00:12:17   and then that narrative forms.

00:12:18   But this is one of those things where I think there is a common thread and believe it or

00:12:22   not, I think this actually even ties to the Twitter story, which is you have true believers,

00:12:26   right?

00:12:26   Obviously, there are people who are anti-abortion in America.

00:12:29   And like I said, that's the, almost the animating principle of their political view, social

00:12:33   view of the world.

00:12:34   And also importantly, a big part of their identity, right?

00:12:37   That's how they sort of form groups and make friends and navigate the world is this is

00:12:41   their sense of who they are as a person.

00:12:44   And then a much larger adjacent audience for whom some of that is there, some part of principle

00:12:51   there, but also there's an animating force.

00:12:54   And this is again, one of the things where like, just objectively, it is much more common

00:12:57   on the right, which is like, I want to own the libs.

00:13:00   And it's a weird, it's like a, I guess it's very intoxicating.

00:13:03   Like it's a thing I actually struggled with.

00:13:04   I'm saying like, you know, I get mad at people I politically disagree with, but when I'm

00:13:08   not thinking about that issue, I'm not thinking about them.

00:13:10   I'm not sort of saying, how can I rub my hands together and find a way to make their

00:13:14   life worse today?

00:13:15   But it is almost like kind of the sport, right?

00:13:18   It feels very much like that.

00:13:19   Like it's fun.

00:13:20   It is fun for people who hate the Yankees to hate Yankees fans.

00:13:23   Right.

00:13:24   Yeah.

00:13:25   It's a really, and so like that's kind of the best analogy I can think of.

00:13:28   And also there's been this sort of similar dynamic around popular culture, right?

00:13:32   So people who are stans of a certain artist on social media, like they have to perform

00:13:37   their fealty to the artist, even if it's kind of like an opposition to something else.

00:13:40   Like, I don't think as far as I know, there's no actual animosity between Beyonce and Taylor

00:13:45   Swift, but their fans kind of have to hate each other because that's the way you play

00:13:49   fandom right now.

00:13:51   That is that's how you perform fandom right now.

00:13:53   And so that thing about the, like, own-the-libs piece, I think it's a big piece of where

00:13:57   the impact of the policy or even the substance of the policies are irrelevant

00:14:01   compared to, does it antagonize the right people in a way that I find satisfying and

00:14:06   that performs in group sort of identification for us.

00:14:09   And what's funny about this is, I mean, like there's not funny at all, obviously in terms

00:14:13   of like real world policy and real world impact.

00:14:16   But what's striking about this is this is kind of impacted the most powerful people

00:14:20   in the tech industry, like the wealthiest, some of the wealthiest people who've ever

00:14:23   existed in the history of humanity are juicing themselves up on the same, like kind of own

00:14:29   the libs energy and getting that far divorced from reality, even when they're going to

00:14:33   face a backlash.

00:14:34   And it's really striking because I don't think Trump and his people in his circle are

00:14:39   under any delusions that they're going to be liked by people.

00:14:41   I mean, they're like, we know who our audience is and we care about them and everybody else

00:14:45   we explicitly don't care about.

00:14:47   And that's the game, right?

00:14:48   Striking thing though, is like, Musk is in this bubble and this is, there's Peter Thiel.

00:14:54   There's sort of this cohort, these guys that have gotten high on their own supply as they've

00:14:58   become incredibly wealthy and sort of detached in the real world where they've kind of lost

00:15:03   the plot.

00:15:04   Right.

00:15:04   And I think about this sort of when Musk was building SpaceX rockets, he's not like,

00:15:09   I care about your alignment with me more than I care about whether the rocket goes up. And

00:15:13   we've turned the corner fully into, like, rocket goes up is secondary to literally your fealty

00:15:19   to me. As we record this, he's on like his fifth consecutive day

00:15:23   of loyalty pledges

00:15:24   he's demanding from engineers, of all people, right?

00:15:27   That's sort of the least inclined personality type to say, let me lead with fealty.

00:15:32   Right.

00:15:32   And so it's such a, it's such a striking thing.

00:15:34   You're like, well, how did they get to this place that is obviously grand scale foot shooting?

00:15:39   Like it's just an unbelievable misread of like cultural standards and norms around technical

00:15:46   culture.

00:15:46   And you realize it's because they've been sort of hyping themselves up and the folks

00:15:49   you've got in the room, like a Jason Calacanis, who's fairly well known within the industry,

00:15:53   but I think outside the industry is not known.

00:15:55   He's a doofus and he's been a doofus for like 20 years.

00:15:58   People have been like, ah, that guy's kind of a chucklehead.

00:16:00   He's not dumb per se, but he's definitely a kiss ass, right?

00:16:04   Calacanis, for people who don't know or vaguely remember his name, started Weblogs, Inc.

00:16:11   I believe was his endeavor.

00:16:13   Right.

00:16:13   Well, he really started with Silicon Alley Insider, which was sort of the trade mag for

00:16:17   New York during the dot com bubble, which is both true and damning with faint praise.

00:16:21   But then Weblogs, Inc.

00:16:24   was like when blogs first took off and "blog" became, it was like a fresh word

00:16:30   in everybody's mouth.

00:16:31   And it was trying to be the rival to the Gawker network.

00:16:33   So you had Gizmodo and you had Engadget and this was a rival.

00:16:36   I mean, you know, meaningful rivalry.

00:16:37   And more or less though, he would find anybody, any blog that had gotten any traction in some

00:16:45   measure of success, then Weblogs, Inc.

00:16:47   would come out with a clone of it.

00:16:48   And yeah, yeah, very explicitly.

00:16:50   There was no, there was, I mean, it was very straightforward.

00:16:53   It's like if Nick Denton and Gawker won in a category, then we're going to do that category

00:16:57   too, with like a very obvious imitation of it.

00:17:01   It was sort of the first generation Samsung phones, like, boy, that looks a lot like an

00:17:05   iPhone.

00:17:06   Right.

00:17:06   And it was like this.

00:17:08   And again, everything eventually mixes together.

00:17:10   And there were very talented people who worked at Gizmodo, which was the clone and Engadget.

00:17:16   Gizmodo was the Gawker one.

00:17:17   Oh no, that's right.

00:17:18   That's how confused I am.

00:17:19   That's how confused I am.

00:17:21   Exactly.

00:17:21   Well, I have friends who had worked at both.

00:17:24   They were both good, but that was basically the idea was you come up with one.

00:17:27   And then all of a sudden Nick Denton got like a car blog and now Weblogs Inc had a car blog.

00:17:32   And then as soon as blogs became not really the hot thing anymore, he just lost interest

00:17:37   in it and moved on.

00:17:38   100%.

00:17:39   Sold it to AOL and then skipped out.

00:17:41   Yeah.

00:17:41   And some, now he's in Elon Musk's inner circle.

00:17:44   Well, and the texts have come out and this is the kind of thing where like in the substance

00:17:48   of it, it doesn't matter that much, but it's been telling because the texts that were

00:17:51   subpoenaed for Musk for another trial.

00:17:53   So they included these texts with Calacanis and it was this very obvious ass kissing.

00:17:58   Right.

00:17:58   And it was kind of striking because I was like, I would think, well, one, if you're the richest

00:18:03   man in the world, I'm sure you're constantly getting people kissing your butt because they

00:18:06   want to get close to you.

00:18:08   But I would think that would naturally put you off of it because it's very obvious.

00:18:11   And yet here they are in there.

00:18:13   And I was like, man, this guy doesn't have any friends.

00:18:15   And that was sort of the same thing as when Musk had done SNL, because it was just

00:18:20   brutally unfunny.

00:18:21   Right. It's just a really horrible misread.

00:18:24   And it's the kind of thing where it's like, oh, you don't have any friends because somebody's

00:18:28   going to tell you like, this isn't working, man.

00:18:30   Like this isn't, you don't, it's not winning.

00:18:32   Like this isn't your lane or whatever the language is.

00:18:34   But like, it was such a striking thing because I'm like, that's a rough place to be in if

00:18:39   you don't have that backstop.

00:18:41   And it is striking because you look at, there have been similarly powerful tycoons, right?

00:18:45   The most pertinent for you, Steve Jobs.

00:18:47   Right.

00:18:48   And certainly a domineering presence, certainly somebody who held no truck with bullshit from

00:18:54   people and could bowl people over.

00:18:55   But clearly the kind of person that could take feedback.

00:18:59   And I mean, actually, I think the thing he would tend to do is, the next day, come back

00:19:01   and sort of say, oh, that was my idea, but he took the feedback and heard the point where

00:19:06   somebody's like, that's not it.

00:19:06   That's not working.

00:19:07   And also stayed in his lane.

00:19:10   Like he was not trying to be a cultural figure.

00:19:12   He was not trying to date supermodels.

00:19:13   He was not trying to do the things that are the trappings of losing the plot about your ego

00:19:18   in the world.

00:19:19   And he had a huge ego, but not in that lane.

00:19:21   Well, let's say this.

00:19:23   I would say most obviously it almost indisputably would be his leadership of Pixar where, yes,

00:19:32   where by all accounts, he never tried to, he never stepped in and told them how to make

00:19:40   movies.

00:19:40   Now they famously, and John Lasseter has said this, Ed Catmull has said this, other directors

00:19:45   have said this where, you know, while Jobs was alive.

00:19:48   And in those like early years of Pixar, I know it's not the earliest years of Pixar as

00:19:53   a company, but Pixar post-independence.

00:19:55   Yeah.

00:19:56   Yeah.

00:19:56   In between Toy Story and Disney acquiring Pixar, that they, and everybody who's ever worked

00:20:03   at Pixar, always talks about: it's story.

00:20:05   It's not about the computers.

00:20:06   It's not about the actors, it's story.

00:20:08   And every Pixar movie has always had a point where they realize while they're in production,

00:20:14   uh, the story, we got to start over.

00:20:17   The story breaks here, 40, 45 minutes into it or 50 minutes into it or somewhere between

00:20:23   act two and three.

00:20:23   Oh, this really breaks.

00:20:26   And that there have been, there were times where they would show Steve Jobs the rough

00:20:29   cut and he would have like, just like two comments, but it was because, Hey, Steve,

00:20:33   take a look at this.

00:20:34   And they asked him, and they said like often he would come up with

00:20:39   very insightful, but very brief comments, not acting like, Oh, I can direct movies.

00:20:44   I can do line readings better than Tom Hanks.

00:20:47   Yeah.

00:20:47   That sort of is the idea of, like, expertise.

00:20:50   Right.

00:20:50   And I think, and you look at the, also the legacy of execs surrounding Jobs where like

00:20:56   there is not historically a better global supply chain manager than Tim Cook probably

00:21:02   ever.

00:21:02   Right.

00:21:03   Right.

00:21:04   Jony Ive speaks for himself.

00:21:04   I mean, and I think those things is like, people like that are drawn to a leader who's

00:21:08   going to give them the shot to do the thing where they can be the best in the world at

00:21:11   what they do.

00:21:11   Right.

00:21:12   Nobody's going to be what he is in a keynote.

00:21:14   And there are zero people in the world who know how multi-million scale social networks

00:21:19   work, who would tell you Jason Calacanis is the guy to get in the room.

00:21:22   All right.

00:21:24   Let's step back before we keep going with Twitter.

00:21:26   I'm going to go out in the lane now.

00:21:29   I've got, I've already got probably some crow to eat here on Twitter and I'm not

00:21:33   afraid to admit when I'm wrong.

00:21:34   I would like to think that's a good trait.

00:21:36   I've often said that the way to be right all the time is to hopefully be

00:21:42   correct most of the time, but to have the self-awareness and humility to recognize

00:21:46   when you're wrong and then go back, figure out why you're wrong and say, Oh, I was

00:21:50   wrong.

00:21:50   And now here's what I think.

00:21:51   And if you can do that, you can come pretty close to being right all the time.

00:21:55   I've probably said some things that have already not aged well about Twitter under

00:22:02   Elon Musk, but let me go out on a limb here politically and say something publicly

00:22:06   that I have been saying to friends privately for over a year, which is that I do not

00:22:11   believe Donald Trump is going to be the nominee for the Republicans in the next

00:22:16   presidential election.

00:22:17   And I don't even think it's going to be close.

00:22:19   He can announce, you know, he did yesterday, but I don't even think it's going to come

00:22:24   to the point where it's some kind of close.

00:22:26   Oh, which the next slate of whatever.

00:22:29   I think by the time the actual primaries start in early 2024, he'll already be out of the

00:22:34   race.

00:22:34   It'll be a runaway.

00:22:35   So this is interesting because I was talking to a friend of mine who is an immigrant

00:22:38   who's been very successful in the U.S., and again, I hate grand unifying theories,

00:22:42   but this is another one where I'm like, Oof, this one it's hard for me to argue against,

00:22:46   which is that, and this is the thing where immigrants always have the insight into

00:22:49   American culture that like we miss growing up here.

00:22:52   Cause I grew up in Pennsylvania too, right?

00:22:54   Is he's like, you know, Americans hate losers.

00:22:56   It's like, they don't mind a comeback story.

00:22:59   They don't mind the wander in the wilderness and then, now you're going to try and

00:23:02   regain your title, but they really hate losers and they hate sore losers.

00:23:08   Like it's sad and it's gross and they don't want to get any on them.

00:23:11   And that was the thing where I was like, man, if that ain't the truest truth.

00:23:14   And especially in traditionally like conventional masculinity, conservative American identity,

00:23:21   you know, I grew up like I, you know, my, my girlfriend drove me to my senior prom in

00:23:25   an F-150, right?

00:23:26   Like this is where I'm from.

00:23:27   Although she was driving.

00:23:28   So they probably got mad about that.

00:23:29   But the thing is that like that identity cannot stand the idea of a loser.

00:23:36   Like a loser, like a sore loser is who I got to say is my guy.

00:23:41   Like, no, like, no, I'm not going to do it.

00:23:43   I'm not going to do it.

00:23:44   Right.

00:23:44   Cause like we're super fickle.

00:23:45   Like I like my team when they're winning.

00:23:48   And so I think that thing where like the nuanced, like it definitely like folks like me who

00:23:53   are like now the big city liberal media types are like, oh, it's sometimes you have a downtime

00:23:58   or whatever, but like the people who got mad about Luke Skywalker taking a sabbatical in

00:24:02   the Star Wars sequels, right?

00:24:04   Like they're not trying to find a guy who's like still real mad about something that happened

00:24:09   years ago as their leader.

00:24:11   And I was like, that is a very plausible argument.

00:24:14   And that does lead me to believe your assertion about a nomination doesn't look good because

00:24:19   it's like, he's got loser stench all over him.

00:24:21   The, what I'm predicting is not based on Trumpism declining, which I hope it does. The people who are

00:24:26   into it, they're not going away, but I would love it if it did.

00:24:28   And I would love it if whoever the Republican nominee ends up being is un-Trumpy

00:24:34   in as many ways as possible.

00:24:36   Even though I know that the least Trumpy Republican who might win, let's say like a Mitt Romney,

00:24:43   right?

00:24:43   Like I would not have been a fan of a Mitt Romney or a John McCain presidency in numerous

00:24:49   ways, but I also respect both men.

00:24:51   And that's part of living in a democracy is I understand that as long as my health

00:24:57   maintains for the remaining decades of my life, there are going to be Republican presidents

00:25:03   and Democratic presidents.

00:25:04   And I'm probably not going to like any of the Republicans, but I at least hope that

00:25:07   whoever they are, are honest people with the best intentions for the country.

00:25:13   And I think Trumpism is a political slant that that's not true for.

00:25:17   But yeah, no, it's a poison to the system, right?

00:25:19   It's not a balance of this vision of democracy versus that one.

00:25:23   It's like, let's break the system.

00:25:24   But that's it.

00:25:25   But, but so I ideally and talking, you know what I mean?

00:25:30   Speaking of ideals, political ideals, I care more about that, but I'm just saying as an

00:25:37   observer who's, you know, overall pretty good at predicting elections.

00:25:42   I actually won some money betting on this one on PredictIt.

00:25:48   No, I mean, it wasn't just because I was the market is wise.

00:25:51   I wasn't just voting based on who I hoped would win, but who I actually thought would win

00:25:55   like here in Pennsylvania.

00:25:56   I thought I think John Fetterman will pull this out.

00:25:59   And he did.

00:25:59   But that's another one again, where it's like having grown up around the folks who would

00:26:03   be voting for him.

00:26:03   I'm like, Oh yeah, this guy's got it.

00:26:05   Yeah, it's a no brainer.

00:26:06   Oh, and if he hadn't had the stroke, I mean, he would have run away with it.

00:26:09   Yeah, I think he's going to be all right on that front.

00:26:10   But anyway, it's that he's a loser.

00:26:13   And Trump knows it too.

00:26:15   And I think that's why he's so latched on to this big lie theory is that it's his only

00:26:20   chance because otherwise he lost and you can't.

00:26:23   These two things can't both be true.

00:26:26   You can't say Joe Biden totally sucks.

00:26:30   And I lost to the guy who totally sucks.

00:26:34   You should vote for me again next time.

00:26:36   Look at my power.

00:26:37   Yeah, it doesn't work.

00:26:38   It doesn't work.

00:26:38   He's a loser.

00:26:39   And you're so right.

00:26:41   When you're an outside observer, like an immigrant and you just observe it, you just say, Oh,

00:26:44   yeah, Americans hate losers in a way that other countries maybe don't have.

00:26:49   Nobody will be empathetic about it.

00:26:51   Yeah, because they're like, Well, everybody takes a turn.

00:26:53   Everybody's a loser sometimes.

00:26:54   Or we're like, No, I don't.

00:26:55   I'm in straight denial.

00:26:56   We've only ever been winners.

00:26:58   You look at our narratives around national greatness.

00:27:00   And I would say I bet the majority of Americans believe we've never lost a war.

00:27:05   Oh, I grew up thinking that.

00:27:07   I mean, we were probably taught that in school.

00:27:09   Yeah.

00:27:09   And it's just like a fascinating like delusion, which I honestly kind of find charming.

00:27:14   Like, yeah, obviously, there's really there's a lot of really negative effects to it.

00:27:17   But like as a personality trait for a country, it's like, Oh, that's like, oh, that's kind

00:27:21   of sweet.

00:27:21   Like you have this weird fixation that like you have to be the best and you won everything

00:27:25   and you're a winner all the time.

00:27:26   So anyway, that's my theory.

00:27:28   But I also just see a certain irony in it, in that it was exacerbated in this midterm

00:27:34   in a way that Trump himself exactly predicted.

00:27:38   And actually, his political instincts are astute.

00:27:42   I mean, sadly, he read the moment very well.

00:27:44   But he also has zero impulse control.

00:27:47   And so I think it's a really weird combination, right?

00:27:51   Usually people who are good at understanding the political temperature, like, and therefore,

00:27:55   I will use that to decide what I do in response.

00:27:58   But that's not him.

00:27:59   Let's get back to leaders who have very poor impulse control in a moment.

00:28:06   But I am going to take a break here and thank our first sponsor, our good friends at Kolide.

00:28:10   That's K-O-L-I-D-E.

00:28:12   If you are listening to the talk show, the odds are very good that at some point you

00:28:17   your company is going to go through an audit like SOC 2 or ISO 27001.

00:28:25   I don't know what those things are because I don't work for a corporation, but they

00:28:28   sound terrible to me.

00:28:30   And when you do go through one of these audits, you have to answer some tough questions about

00:28:34   endpoint security, like do all of your company laptops have their disks encrypted?

00:28:39   Does everyone have the company's password manager installed?

00:28:42   Do you have a system in place to monitor and maintain compliance throughout your cross

00:28:48   platform fleet?

00:28:50   Even if you're confident, the answer to all of those questions is yes.

00:28:55   The bigger question when you're facing an audit is, can you prove it?

00:28:59   If you're not quite sure how you'd go about proving that, you need Kolide.

00:29:04   Kolide is an endpoint security tool for Mac, Windows, and Linux devices that does things

00:29:11   MDMs cannot do, and it gives you visibility you need to achieve and maintain compliance.

00:29:16   Best of all, Kolide doesn't resort to surveilling your employees or locking down their devices.

00:29:22   Instead, it works with your end users to resolve the issues, to teach them, and it relies on

00:29:29   their cooperation and informed consent.

00:29:32   You can meet your security goals and pass your audit without compromising on privacy

00:29:37   or a relationship with your employees based on trust.

00:29:40   Visit kolide.com/thetalkshow to find out how.

00:29:44   If you follow that link, they will hook you up with a goodie bag that includes a free

00:29:48   t-shirt just for activating a free trial.

00:29:51   That's kolide.com/thetalkshow.

00:29:57   My thanks to them.

00:29:58   You don't want to mess around with SOC 2 compliance. You gotta get it right.

00:30:02   SOC 2? See, I never know how you pronounce it.

00:30:04   So, for the first time in my life, I work at a publicly traded company.

00:30:07   I've never had a grown-up job. I've been in startups my whole career.

00:30:10   And now I work at Fastly, and they're a publicly traded company, and they do all the grown-up

00:30:14   stuff, and it's like, "Oh, this is fascinating."

00:30:16   It's actually very interesting.

00:30:18   And it's the contrast to organizations that don't have grown-up leadership that wants

00:30:21   to do things right.

00:30:22   What was I talking about?

00:30:23   Billionaires with poor impulse control.

00:30:25   Although, actually, one of them is a legit billionaire.

00:30:27   Yeah, his name is Elon Musk.

00:30:29   All right, let me read a tweet from my good friend.

00:30:31   Everybody.

00:30:32   The internet's good friend, Cabel Sasser.

00:30:34   Oh, yeah.

00:30:35   Yeah, he tweeted at me, I believe it was this morning.

00:30:38   Nope, it was yesterday.

00:30:39   But he's a good friend.

00:30:41   He was the first live guest.

00:30:44   When the first time I ever did a live episode of this show, Cabel was my guest.

00:30:47   Good friend, and I can take this in all the good spirit.

00:30:50   But he said on Twitter, "Every day, I am champing at the bit."

00:30:54   He got "champing" correct, of course.

00:30:56   Yep, not "chomping."

00:30:57   "I am champing at the bit to make a small run of,

00:31:00   quote, 'I'm more optimistic about Twitter's future than I have been in years.'

00:31:07   John Gruber.

00:31:08   T-shirts."

00:31:08   Do they taste like crow?

00:31:12   They taste like claim chowder.

00:31:15   Claim chowder, yeah.

00:31:16   You got chowder'd, man.

00:31:18   You got chowder'd.

00:31:19   Well, I did say that.

00:31:20   And I would not say that today.

00:31:28   I still am not as pessimistic about Twitter's future as many people are.

00:31:35   But perhaps one of the reasons I wanted to have you on the show is maybe you'll talk

00:31:39   me into being even more pessimistic than I should be.

00:31:41   I want to diagnose why you said it.

00:31:44   All right, can I dig into that?

00:31:47   Absolutely.

00:31:48   That's a great question.

00:31:49   All right, number one, my theory of Elon Musk.

00:31:53   And I'm not disabused of it entirely yet.

00:31:56   Although I've obviously-- this is the part that I would have to backtrack the most on,

00:31:59   is regardless of Twitter, but Elon personally, is that I see Elon Musk as an unreliable narrator,

00:32:07   a highly unreliable narrator.

00:32:11   And that this is--

00:32:12   As a strategy.

00:32:13   It's an intentional--

00:32:14   Partially strategic.

00:32:16   And again, that is-- I've said this on podcasts before, too, that-- and some people, when I

00:32:21   said it, made any comparison to Steve Jobs, some people object to it because people still

00:32:27   admire Steve Jobs.

00:32:28   And now that he's passed, there's a martyr aspect to it.

00:32:31   And people see it as sacrilegious or offensive.

00:32:34   You can't be critical.

00:32:35   To say that there's any similarity between a despised figure like Elon Musk and a revered

00:32:41   figure like Steve Jobs.

00:32:42   But with Steve Jobs, the canonical example to me is when they first came out with the

00:32:47   iPod that had a color screen, and the reason-- why would you add a color screen to these devices?

00:32:55   And it was, well, you can now, in addition to your music, you can sync your photo library--

00:33:00   But we're never adding video.

00:33:01   Yeah, and in the hands-on area after the keynote, and Steve Jobs was there, and they actually

00:33:08   let the media ask questions at the time.

00:33:10   Yeah.

00:33:11   And they said, hey, Steve, these photos look great on the iPod, but why not?

00:33:17   What about video?

00:33:18   And he said, ah, nobody wants to watch video on a screen that small.

00:33:21   And then 12 months later, they came out with an iPod with the same size screen that, in

00:33:28   addition to showing your photos, played video.

00:33:31   Now, in addition to-- it's not like, oh, in those 12 months, they realized that.

00:33:37   The way that these lead times worked, the day that Steve Jobs said no one wants to watch

00:33:41   video on a screen that size, he knew damn well that next year's iPod is already well

00:33:48   into some phase of--

00:33:49   In development.

00:33:50   --of engineering.

00:33:52   It was halfway through development and was in product testing.

00:33:55   At the quantities they were making iPods at that time, you don't add a feature like that

00:34:00   six months beforehand.

00:34:02   He knew that they were coming out with a video playing iPod.

00:34:05   He was an unreliable narrator for strategic reasons because he didn't want to say anything

00:34:09   that would discourage people from buying an iPod today and waiting a year for one that

00:34:14   played video.

00:34:14   Why not have them buy one today and then buy another one a year from now to get the video?

00:34:19   So I do see Elon Musk as an unreliable narrator strategically in the same way.

00:34:25   But I also see him as an unreliable narrator in a completely unstrategic way, which is

00:34:32   his lack of impulse control.

00:34:34   And he'd be the first to admit it, that he at times has gotten high and tweeted things.

00:34:39   Sure.

00:34:40   Which is not a thing I think we would say about Jobs.

00:34:43   No, and not a thing you would say about the CEO of any company or--

00:34:49   Any of the other previous richest people in the world in the last several centuries.

00:34:53   Right.

00:34:53   It is not something that those people in those positions would tend to do.

00:34:58   Right.

00:35:00   And so therefore, my basic idea of why I felt like, "Hey, this might be good for Elon

00:35:06   Musk to buy Twitter, take it private, alleviate it of any shareholder pressures, and seriously

00:35:15   rejigger and do a lot of stuff."

00:35:18   Yeah.

00:35:19   I want to sort of share a couple theories of how we get to--

00:35:23   Because this is not me dragging you.

00:35:25   Like, I have made these bad calls many times too, being on social media for as long as

00:35:31   you have been too.

00:35:31   And I think there's a couple things I've seen in terms of the pattern of like, "Why

00:35:34   do we get it wrong?"

00:35:35   Or "Why do we get optimistic about these things?"

00:35:37   I think the first is to the unreliable narrator point.

00:35:40   Visionaries don't deal in facts, right?

00:35:44   Because if you say factually, "I'm going to make a phone and we're going to sell a

00:35:48   hundred million of them," it's an irrational statement.

00:35:51   And every person who's ever said it except for like two was lying.

00:35:56   And so you have to have an irrational belief, or whatever, "We're going to finally revive

00:36:02   the electric car market."

00:36:03   These are irrational statements.

00:36:04   And so--

00:36:05   Or "We'll have millions of electric cars on the road by 2017."

00:36:09   Right, exactly.

00:36:10   Right.

00:36:11   So these are things where you have to believe beyond what's rational.

00:36:14   And I mean, I've been a founder and run companies and obviously nowhere near that

00:36:17   scale.

00:36:17   But in terms of like, we have a platform that people have made millions of apps on.

00:36:22   And five years ago, you said people are going to make millions of apps and they're going

00:36:24   to do it on the regular old web.

00:36:26   That was an irrational statement with Glitch.

00:36:28   And so I think that's a thing where like, I have a tremendous respect for that.

00:36:31   And I think there's something like, this is what we tell our kids, right?

00:36:34   Dream big and shoot for the stars, right?

00:36:35   Like, that's a really-- that is actually a good thing.

00:36:38   And it's indistinguishable, narratively, from:

00:36:43   I'm surrounded by a bunch of kiss asses in a boardroom and we're all high on our own

00:36:48   supply or just high.

00:36:49   And our takeaway is we can do anything we want, right?

00:36:53   And so because those are so structurally similar, then the question is like, how can we distinguish

00:36:59   when it's in the former visionary category and not in the latter, like I said, high on

00:37:04   your own supply category.

00:37:05   And I think part of it is we have a weird erroneous belief in the genius myth.

00:37:10   And again, this is one of the things where Jobs being an outlier has set us up for a

00:37:15   whole generation of people that think they're him to make the same mistakes because they

00:37:19   learned the wrong lessons about the narrative that was applied to him, which is that they

00:37:23   have this idea of an ineffable genius that a person can have and it's all dudes that

00:37:27   some guy can have that is applied across different domains and genres and contexts and problem

00:37:33   spaces, right?

00:37:35   And the thing that I love is this conversation about the Twitter and Musk coming in.

00:37:39   They're like, "Look, this isn't going to be that hard for him.

00:37:41   It's not rocket science."

00:37:42   Right?

00:37:43   And I'm like, "You're right.

00:37:43   It's harder.

00:37:44   It's harder than rocket science."

00:37:46   Because when you launch a rocket at SpaceX, you're only launching one at a time and nobody's

00:37:52   trying to take the rocket down.

00:37:53   There's only gravity, right?

00:37:55   Whereas you have bad actors that are trying to take Twitter down or share misinformation

00:38:00   messages and the problem is largely social, cultural, political, as much as it is technical.

00:38:05   And it's really hard.

00:38:06   It's a really hard problem, right?

00:38:08   And so this set of skills is not applicable across these domains.

00:38:13   And that's a thing that is not popular to say.

00:38:16   And also the important thing here too is who are the deputies?

00:38:21   Who are the people that surround you?

00:38:22   Are they as enamored with the mission of doing this work?

00:38:26   And is it the thing that gets them up in the morning?

00:38:28   Because I think everybody, again, I don't know much about SpaceX and I'm not some huge

00:38:34   fan of theirs, but they've been getting rockets in the air a long time.

00:38:34   It's a hard problem.

00:38:35   I guarantee you everybody in senior management there is obsessed with getting rockets in

00:38:39   the air, right?

00:38:40   There's no question about it.

00:38:41   You can tell.

00:38:42   It wouldn't get there if that wasn't happening.

00:38:45   And there is nobody in the inner circle around Musk that is like, "I want to enable

00:38:51   this new kind of conversation."

00:38:52   And the tell was he came in talking about the blue checkmark, which is a concern that

00:38:56   is an obsession with a tiny percentage of people, even on Twitter itself, let alone

00:39:01   in society as a whole.

00:39:02   And also a particular obsession of the sort of extremist tech tycoon circle jerk that

00:39:09   he's in, right?

00:39:10   Where they're like, this is the kind of thing they think is really important.

00:39:13   And they kind of, this is why journalists hate us and all this kind of stuff.

00:39:16   And you're like, normal people are just like, I go on here to see what's happening with

00:39:19   sports scores, maybe like what's happening at an awards show.

00:39:22   That's just like some detritus on the screen while I do it.

00:39:25   It's chrome in the UI.

00:39:26   It's not a thing that I think about conceptually as having to do with my sense of self or what's

00:39:31   important in the world.

00:39:32   But they started with this thing.

00:39:33   You're like, of all the problems this company had, that was so low on the list of things

00:39:37   that were broken to start with.

00:39:40   I will say that in these, what are we at?

00:39:43   Three weeks?

00:39:44   Three weeks into the Musk era?

00:39:46   In these very long three weeks.

00:39:48   Yeah, incredibly.

00:39:48   Long Musk era.

00:39:49   That was the moment where I first, my stomach sunk and I thought, oh, well, this guy's a

00:39:54   chump.

00:39:55   Um, no, but let's, I don't want to move past the, uh, your, your keenly asked question

00:40:01   of why would I say that I felt more optimistic when it was officially made, it was officially

00:40:07   going to be handed over to Musk.

00:40:08   Why did I feel more optimistic about Twitter than I have in years?

00:40:11   All right, let me go back to that.

00:40:12   I don't want to, it's, it's worth examining.

00:40:15   So let's put Musk aside.

00:40:17   Twitter as it stood, I thought was in a terrible position.

00:40:22   It, and I love Twitter.

00:40:24   I think it is from the moment it has been something special right from the start, from

00:40:28   the start, as soon as I saw it, it's the rare thing.

00:40:31   And I tend to see, I tend to be confused by new things.

00:40:35   And if I don't get it, I don't want to sign up for it.

00:40:37   I got it right away.

00:40:38   It's like, oh, you just get a name and then you have a box and you hit a button and then

00:40:42   it's like, you've said a thing and then it has a URL and everybody can see that you said

00:40:47   the thing.

00:40:47   And the limit is bloggers.

00:40:49   It was better in some ways than what we had.

00:40:51   Yeah.

00:40:51   And the limit was, it was fascinating.

00:40:55   In fact, I was so enamored of the 140 character limit that I was opposed to raising it to

00:41:00   280.

00:41:01   But I realized the most John Gruber thing I've ever heard in my life.

00:41:04   I remember that.

00:41:04   And I was like, yeah, a lot of writers were, I remember Stephen King and JK Rowling both

00:41:10   were too.

00:41:11   Again, I don't want to, again, there's a whole discussion we could have.

00:41:14   Fraught examples, but sure.

00:41:15   Let's keep it.

00:41:16   But well, Stephen King, let's just say that Stephen King was not a fan of it because they

00:41:21   saw the, I think I said, I think I said this on Dithering.

00:41:27   So because that's a paid podcast, I'm not, yeah, or maybe it was on, well, whatever.

00:41:33   I said it recently, but I'll say it again.

00:41:34   To me, fitting a complex thought into 140 characters satisfied the same part of

00:41:42   my brain that solving Wordle every day does.

00:41:44   Right.

00:41:45   And it's a puzzle.

00:41:46   And it's like, ah, crap.

00:41:48   This thing I want to tweet, this funny joke, I've got to get it in this box.

00:41:53   Yeah.

00:41:53   But Christ, I mean, they would tell you like a game.

00:41:58   They would tell you how far away you were.

00:41:59   Like, that's not even close.

00:42:00   You're at 211 characters.

00:42:02   And it's like, it's puzzle solving in community with people.

00:42:05   Right.

00:42:05   Now I had this where I had a link blog.

00:42:07   Actually, I had one before you, as a matter of fact, I had a link blog and I love because

00:42:10   my design was crappy.

00:42:11   I had a very finite amount of space.

00:42:13   It might've been 140 characters virtually.

00:42:15   And I loved writing headlines there.

00:42:17   Cause it had to fit into this little sidebar on the crappy blog layout that I had.

00:42:21   And I did that for like four years and it was such a joy cause it was exactly that thing,

00:42:26   which is like, I have this much space to get.

00:42:29   And for me, it was like to get this story across and also have it like be funny.

00:42:33   And it's probably like the best training grounds I could have had for Twitter.

00:42:37   Yeah, that was possible.

00:42:38   So I, you know, I've learned my lesson.

00:42:40   280 did not break Twitter.

00:42:42   It still is a game because it turns out many thoughts are 300 or more characters and still

00:42:46   feels like a game.

00:42:47   But I think having a limit period when I know that there are numerous people who just think

00:42:52   that you should be able to post this, why have any limit at all?

00:42:54   Computers can handle lots of texts.

00:42:56   And I think that would be disastrous because I know, again, no offense to anybody who's

00:43:01   ever sent me a long email.

00:43:03   I love the email I get from readers, but people have, sometimes people have a hard time editing

00:43:09   and putting what could be a 280-character observation into less than 2,800 words.

00:43:17   So I love Twitter.

00:43:18   I still like using it.

00:43:19   It is the, it is and remains the only communal forum for Daring Fireball readers that has

00:43:26   ever existed.

00:43:26   I mean, if it does go away or becomes unusable, I mean, I'll find something else, but I want

00:43:32   Twitter to thrive.

00:43:33   I think it's worth saving, but I feel like they've never had great leadership.

00:43:38   I feel like, or it's always been a kind of a clown car when people like Ev Williams were

00:43:45   nominally in charge.

00:43:47   They were under, it's not that I think Ev didn't get it or want to see it go in a good

00:43:52   way, but they were under pressure from like financial sources.

00:43:56   And well, and also, I mean, one of the things I think about, like, you know, I was, I still

00:44:00   talk to Ev a lot, Jason Goldman who was early there, like all those early Twitter folks,

00:44:04   except for Jack, I still have some degree of relationship with.

00:44:06   And I think one of the things is like, they were in over their heads from day one.

00:44:10   Like it was, again, it would be irrational.

00:44:13   They were the opposite.

00:44:14   They weren't like, we are going to make a text box that you type in and it will control the

00:44:19   political and media conversation of the world in five years.

00:44:22   Right.

00:44:23   Like that's not, they weren't Babe Ruth pointing at that fence.

00:44:25   Right.

00:44:25   Right.

00:44:25   It is like a thing that happened to them.

00:44:27   And then they're sort of holding on.

00:44:29   Right.

00:44:29   They are like, they're a dog that caught a car they didn't know they were chasing.

00:44:32   Right.

00:44:34   And so I got, I have so much more empathy for that, which is like, oops, a multi-billion

00:44:38   dollar company happened to us.

00:44:40   Like I have way more empathy for that than the person that's like, I set out to do this.

00:44:43   I think part of why I'm always skeptical about the approach people have to either competing

00:44:47   with it or fixing it is like, it's a real hard thing to do on purpose.

00:44:51   Nobody's ever done it on purpose.

00:44:53   And so, so I think that's really important, but I think the other thing that, that underpins

00:44:57   a lot of this too, is the, you know, you and I are of a cohort of folks that came into

00:45:02   technology sort of pre-internet right.

00:45:05   And then the rise of the internet, I think justified why we liked computers right back

00:45:10   when they used to be called.

00:45:10   And then what came of that, and this is keep in mind, like we are in the cohort too.

00:45:15   That is mostly in political power, mostly in professional power.

00:45:19   Like most organizations are run by people who came up from that same kind of gen X cohort

00:45:24   that we did or late gen X cohort that we did.

00:45:26   And there's this baked in kind of Clinton era techno solutionism, which is like, we'll

00:45:33   throw software at it and that'll fix it.

00:45:34   And that's happened in politics; it's happened in these areas.

00:45:37   And I think it's really closely related to that idea of the genius creator, right?

00:45:41   You know, like if you can get something to compile in X code, then you can solve any

00:45:45   problem, right?

00:45:46   Even if it's a domain you've never heard of or aren't good at.

00:45:49   And that's an intoxicating thing that suits, like I have been poisoned by this.

00:45:56   It suits my ego to believe that is true.

00:45:58   Even though demonstrably, it is false, sometimes catastrophically false.

00:46:03   And so that's the thing that I'm very, and why I asked you that question about, like,

00:46:07   why do we say this?

00:46:08   Cause like I have been, I was seduced by that same siren many times when I built social

00:46:13   media tools that I think you still use.

00:46:16   I was like, Oh, and then this will democratize publishing.

00:46:19   And like, it was true.

00:46:20   And we did get voices like yours and complete denial about the downside risk of like, also

00:46:27   it might amplify conspiracy theorists or hate groups or whatever else that came along with

00:46:31   it.

00:46:32   And we took, I literally had a job that was to say, we're going to enable people to do

00:46:36   things like daring fireball.

00:46:37   We're going to enable great voices like John Gruber.

00:46:39   And I was happy and proud to do it.

00:46:42   And then when people use it to make hateful things, I was like, Oh, that's not our fault.

00:46:45   That's just human nature.

00:46:46   Well, and think back to go before publishing on the internet to earlier computer technologies,

00:46:52   like the laser printer, where all of a sudden anybody could print out eight and a half by

00:46:58   11 output that looked professional.

00:47:02   It didn't look like it came out of a typewriter.

00:47:04   It didn't look, it wasn't dot matrix.

00:47:05   It was indistinguishable from the most expensive thing you could do.

00:47:08   Right.

00:47:08   And that there were companies, big companies who had infrastructure that was all based

00:47:14   around the typewriter or dot matrix printers for their stuff.

00:47:17   So that you, the person, the Joe or Jane computer user who just had access to a laser printer

00:47:24   could get better looking output than bigger name people.

00:47:27   Right.

00:47:28   And of course, anybody, you could print anything on your laser printer, right?

00:47:32   So the person whose politics you agree with could just as easily make a very lovely

00:47:37   looking flyer as somebody whose politics you completely disagree with and nobody thought,

00:47:42   well, that's the printer's fault.

00:47:44   And I think that was the same mentality that we all had in the early years of publishing

00:47:49   and creating publishing tools on the internet.

00:47:53   I think there was also a naivete about the legitimizing effect that everybody having

00:47:57   the same tools would do if you couldn't tell the difference, right?

00:48:01   It's the same fonts and the same weight of the paper and all that kind of stuff between the

00:48:07   whack job and the expert.

00:48:09   The fact that we were relying so much on the social signals is, I mean, again, you know

00:48:12   this better than anybody having been a designer.

00:48:14   We take our credibility cues from the packaging.

00:48:18   Even people who claim that they don't care about design don't even realize how subliminally

00:48:24   they are affected by it.

00:48:25   Right.

00:48:25   They can't resist the cues that we've been trading.

00:48:27   And again, for good reason historically.

00:48:29   There were lots of, like, there was a lot of time where like, okay, this is well-produced,

00:48:32   it's in the magazine.

00:48:32   That means that there was a gatekeeper, which is bad, but it also means there was an editor,

00:48:36   which is probably good.

00:48:37   And so that thing of that real balance, we weren't taught, right?

00:48:40   That level of fluency, the meta level of fluency and the communications and media, we weren't

00:48:45   taught.

00:48:46   And so, because nobody had encountered it before, like the last time that came around,

00:48:49   you know, it was the printing press.

00:48:51   And so to come back around to this, we're like, oh, I didn't realize this was the stakes.

00:48:56   And I think that there's a sort of a denial still.

00:48:59   I know, like, again, I went through this, or like, I was the arms dealer giving people

00:49:03   these weapons and in denial about the implications of it until it was being misused.

00:49:10   And I think that still permeated.

00:49:12   And also, like, we were of the cohort that got to see these impacts first, right?

00:49:17   We were early and got to see it go through.

00:49:19   So everybody is a couple of years behind, depending on when they got to social media,

00:49:22   maybe several years behind in realizing these same things.

00:49:25   And so that lag is an opportunity for the Musks of the world to sort of say, hey, look,

00:49:31   software is software.

00:49:32   If I can make the dashboard on your Tesla, which he doesn't do, but, you know, he takes

00:49:37   credit for, then I could certainly make a couple tweets appear in a way that you like.

00:49:41   And you're like, these are completely different problem spaces.

00:49:44   The fact that they both involve writing some code does not mean they're the same kind

00:49:47   of problem at all.

00:49:48   Very much.

00:49:51   Well, who was the CEO before? Gil Amelio was the CEO before Steve Jobs.

00:49:55   And again, I see a similarity here where I see Twitter as being worth saving.

00:49:59   I do not think that before this entire Musk saga started that they were on a path anywhere

00:50:06   good.

00:50:06   And I thought it was so disappointing when Jack Dorsey stepped down as CEO to focus.

00:50:15   Well, and sold out his workers.

00:50:17   I mean, the wild thing is that Jack was playing both sides of the deal.

00:50:21   This was actually one of those things where, like, again, I don't get surprised either;

00:50:24   this industry, I have no misgivings about what it is.

00:50:27   It's a multi trillion dollar industry, but I did not think he would be saying one thing

00:50:32   in public and to his workers and be doing a backdoor deal to sell to Musk at the same

00:50:38   time, just straight up.

00:50:39   I mean, it's a lie and it's a rare thing.

00:50:41   Like again, like you can be critical of CEOs for many valid reasons, but actually they

00:50:46   very rarely will bald face tell a lie to the world about that kind of thing.

00:50:51   They will about are we going to do a video iPod or not, but not about like that kind

00:50:55   of thing.

00:50:55   That was really stark and striking.

00:50:57   And one of the sort of like the first in the series of jaw dropping dominoes that started

00:51:02   to fall around all this.

00:51:03   And I just I don't know Parag Agrawal, who was the CEO for what, a year, I guess, or

00:51:09   almost a year.

00:51:09   Although he'd been CTO for a while.

00:51:11   I know well, but naming him CEO to me without knowing him or knowing that it just seemed

00:51:17   very clear to me that he was not the person he wasn't going to do what I think needed

00:51:22   to be done to fix Twitter.

00:51:23   And again, it's not my specific ideas like, Oh, you should name me John Gruber, the CEO

00:51:28   or or pay me to be a consultant for a couple of weeks so I can put it all on a whiteboard.

00:51:33   It's not that I had a plan.

00:51:34   It's just that I would know it when I saw it and steady as she goes.

00:51:40   We'll just keep going the way we were, which was more or less Twitter for the last few

00:51:44   years wasn't going to do it.

00:51:47   And so, I mean, I've had some conversations with Parag over the years, and I have a little

00:51:52   bit of a window into some of the technical choices because actually to this, as far as

00:51:56   I know, I'm sure after this comes out, Musk will delete it.

00:51:58   But if you go to the Twitter developer site, they say, get started trying it.

00:52:01   It's a Glitch app, and I run Glitch.

00:52:03   So we see that.

00:52:04   And so we had a chance to talk about the platform and like he's a technologist.

00:52:10   One thing that's important to understand with Jack having picked Parag as a successor is

00:52:14   it's very rare.

00:52:15   You almost never see a publicly traded company, multi-billion dollar company where somebody

00:52:20   goes from CTO to CEO.

00:52:22   Technology is not the path by which you become executive of a technology company.

00:52:28   That's a really, there's a million things I can unpack from that statement, but that

00:52:31   is just a factual statement.

00:52:33   And yet that happened there.

00:52:35   And also, Parag started Bluesky, which is their sort of attempt at making a protocol

00:52:40   that Twitter could run on to sort of decentralize things.

00:52:42   And we can talk about Mastodon and Fediverse and all that stuff too.

00:52:45   But conceptually, not the wrong idea, really interesting idea.

00:52:48   And he was much more committed to it than Jack was.

00:52:50   I mean, Jack was sort of like, yeah, we'll fund it, we'll do it.

00:52:52   But Parag was a person that understood how it worked, was interested in it.

00:52:55   So I don't think conceptually he was impossible to have tackled the problems, but I do think

00:53:01   he was taking an incremental approach.

00:53:03   And this is one of the things that I think I pull away from the valid part of what you're

00:53:06   saying is an incremental approach wasn't going to fix what was broken with Twitter.

00:53:09   They needed a more vigorous shaking.

00:53:12   Yeah, and that's exactly what we're trying to make.

00:53:14   Would you consider Satya Nadella an exception to that as well, though, as a technologist

00:53:19   who became the leader of Microsoft?

00:53:21   Yeah, I mean, I have a lot of complicated feelings about the pattern in our industry

00:53:26   of Indian immigrant CEOs becoming CEOs at these companies when they're in duress, right?

00:53:32   Yeah.

00:53:33   There's a phrase that people use around, especially, women becoming CEOs: the glass cliff.

00:53:37   It's sort of related to the glass ceiling, right?

00:53:39   Of as soon as things get bad enough, we'll bring you in.

00:53:41   And it helps that the...

00:53:42   We'll let you drive the company over the cliff that it's obviously inevitably heading

00:53:47   towards all of it.

00:53:47   Exactly.

00:53:47   And then you'll take the fall.

00:53:48   But I mean, I think it's one, I actually do think the pattern is real, but I also think

00:53:52   of like at Google, things were not in a dire place.

00:53:55   Microsoft, you could argue things have been stagnant for a long time, which for their

00:53:58   culture is a dire place.

00:53:59   And then obviously Twitter was in serious duress.

00:54:02   There is a rare path as a technologist.

00:54:05   Microsoft is an outlier place because...

00:54:07   They value technology to such a profound degree.

00:54:10   And also they started as a developer tools company and have essentially gone back to

00:54:14   becoming a developer tools company, especially with the acquisition of GitHub and all the

00:54:18   other things they've done.

00:54:19   So it's a really interesting thing where I think in a lot of ways, that was a return

00:54:24   to the starting point after having had a lot of digressions over the years.

00:54:28   That is a really interesting thing because they were...

00:54:30   I mean, this is a tangent, but I think a relevant one.

00:54:33   Microsoft made developer tools for all major platforms.

00:54:37   That was what they did to start with before Windows, before DOS, before anything.

00:54:41   And then would make sure that the apps that you created on those were pretty easily portable

00:54:45   to any place that their platforms were running.

00:54:47   And to an approximation, that is what they do now, where their apps run on every platform

00:54:52   and they own the entire developer stack top to bottom: where coders write their code,

00:54:57   where they host their code, and, they hope, where they run their code.

00:55:01   That's a really...

00:55:02   That thing, I think, explains the exception of why technologists can come up.

00:55:06   But I think there's also this broader pattern of, was the problem with Twitter about technology?

00:55:10   And it's not, it's about culture, right?

00:55:12   It was really about like, what does it want to be culturally?

00:55:14   There was obviously a huge cultural battle about what it is.

00:55:16   It had become a signifier, I think largely because of Donald Trump, of the larger cultural

00:55:22   battles that are happening in America and around the world.

00:55:24   And those are problems that are uniquely ill-suited to trying to solve through technical formats.

00:55:32   Like it is not an API problem that people are having political battles in your platform.

00:55:37   But basically that was my thinking, was that this...

00:55:39   Parag was obviously not the person to give Twitter the shakeup.

00:55:43   Whether it's because maybe he had the ideas to do it, but he obviously didn't have the

00:55:47   political clout to do it.

00:55:49   Right?

00:55:49   Well, also there is an inherent conservatism to being a CEO of a publicly traded company,

00:55:54   especially under an FTC consent decree, which leads to taking an incremental approach.

00:55:59   He might have had, I don't know, this is not a conversation I've ever had with him or anybody

00:56:03   else that was at that level, but he may have had a radical vision.

00:56:06   And even if he did, and he had board buy-in to pursue it, the grown up right way to do

00:56:14   it would be build the plan, iterate over time, make sure things are secure, reassure your

00:56:20   advertisers, build a plan about what your staffing is going to look like.

00:56:24   Those are the things you do.

00:56:25   And so there's no way to know now because of how things have played out, but it is not

00:56:31   at all impossible that he had a good plan and was executing it and we couldn't see it.

00:56:34   And that making your decisions based on what people are tweeting at you is actually a bad

00:56:39   way to make decisions.

00:56:40   And so I think that's because like you look at, again, going back to the Jobs example,

00:56:43   you know, when he goes to the sort of first, after the return from the next exile, he goes

00:56:48   to Macworld and he's talking to people and it's not an obvious immediate turn, right?

00:56:53   Like it is a short time to effectively when the iMac comes out, right?

00:56:57   But on the way there, he didn't say anything that said we're doing this radical change.

00:57:02   He said, we have to get it together.

00:57:03   There was no hint that a Microsoft partnership was coming.

00:57:06   There was no hint, like the things that were seen as these radical bold moves were not

00:57:11   communicated until they were communicated.

00:57:12   And so I think that's the thing that like a turnaround for an organization is a very

00:57:16   hard thing for any leader to do.

00:57:19   And you actually, there's the right way to do it is to not be disruptive until you know

00:57:23   what you're doing.

00:57:24   I remember, I don't know how to explain it.

00:57:28   I just, again, it's an, I know it when I see it.

00:57:30   And when Gil Amelio was named Apple CEO after a run of bad, even worse CEOs, Michael, Michael

00:57:39   Spindler famously, right, right, right, right, right.

00:57:42   At one point it was reported in a book that he, it was, they had like an upcoming quarterly

00:57:47   report and they thought he was in his office and he wasn't.

00:57:50   And where is he?

00:57:50   And it turns out he was in his office and he was under his desk crying.

00:57:54   I mean, it was bad, bad, relatable, and really did not.

00:57:58   Well, if you listen to it, he wasn't there long, but he did a lot of damage to the company.

00:58:02   But when Gil Amelio was named, he said some things that were right.

00:58:06   You know, like he obviously didn't like totally not get the Mac or Apple.

00:58:11   And I think the one thing I remember him saying was that he saw Apple as the Maglite of

00:58:16   computers and that sure, most people aren't going to buy, aren't going to, if they're

00:58:21   going to go buy a flashlight, they're not going to spend the money on a Maglite, but

00:58:25   the people who really want a good one are going to, and that's Apple's role in the

00:58:30   computer business.

00:58:31   And in broad strokes, that's right.

00:58:33   It's not factually incorrect, but it's a horrible story to tell.

00:58:35   And it's also, that alone does not make you the right person to be CEO of Apple.

00:58:40   That's right.

00:58:40   That's right.

00:58:41   And actually there's a meta point here too, which is Apple is the epitome of a company

00:58:46   for whom the story has to come first because it is what sustained them in the low points

00:58:50   and is what made the high points possible.

00:58:52   And so when you're bad at telling the story, you are a failure in the role, even if your

00:58:57   business strategy is right.

00:59:00   And this is true of Twitter as well.

00:59:02   The reason Twitter is a household brand, despite not being that big, certainly compared to

00:59:06   Facebook, but even compared to like Pinterest or, and they handed TikTok to the world by

00:59:12   killing Vine, right?

00:59:14   Like their failures are billion dollar companies.

00:59:16   And so, so like this is the thing is like, you have to be a master storyteller.

00:59:22   Jack is good at personal storytelling, but not at the company storytelling.

00:59:25   Still doesn't like to tell a story that he, for 20 years, he has not wanted to be a public

00:59:30   figure.

00:59:30   So like that wasn't the guy.

00:59:32   So like that part about like, that's why the story has been filled in by press and media

00:59:35   and politics and the rest of the world is there.

00:59:38   And then you get somebody in who wants to be the story at the expense of the product

00:59:42   and the team.

00:59:43   And it's a failure.

00:59:44   And I think that's that thing that like the people who built the myth around Musk, like

00:59:48   he has fans, which is a weird thing for CEO.

00:59:51   Cause again, even like going back to the jobs example, obviously there are lots of people

00:59:55   who are jobs fans and acolytes and stuff like that.

00:59:57   But he didn't have a fan culture in that way when he was alive because he didn't cultivate

01:00:05   it.

01:00:05   Like he was fine if people liked his work and appreciated it, but he wasn't carrying

01:00:09   himself.

01:00:09   Like he wasn't doing TV appearances and having people falling over him and creating that

01:00:13   kind of cult of personality. He famously did very little of it.

01:00:16   Really.

01:00:17   Right.

01:00:17   I often say this, another often recurring theme here on the podcast is me saying, boy,

01:00:24   you know, I've been writing Daring Fireball for over 20 years now, but boy, I wish I'd

01:00:28   been writing it for 25 because, you know, there's a lot of stuff when the story starts.

01:00:35   And there's, you know, and I'm honest about it.

01:00:38   And I would, I will be very honest that if I had been writing about Apple when they acquired

01:00:43   Next, you know, I definitely, I was as keen an observer of the company as I am now as

01:00:49   I write Daring Fireball.

01:00:50   I was still that obsessed with the company and was perhaps even if possible, even more

01:00:56   so, cause I was so worried that they were going to go under and I didn't know what

01:00:59   I would do with myself.

01:01:00   You know, what computers would I use?

01:01:03   When they bought Next, my thought was, I don't know if this is the right decision.

01:01:09   I was sort of more honestly hoping they would buy Be, you know, the famous thing was that

01:01:14   they could have bought Be.

01:01:16   Be had the technical correctness again, right?

01:01:19   And I thought it was actually more technically correct than the Next operating system.

01:01:23   Totally.

01:01:23   Totally.

01:01:24   Right.

01:01:24   And it was appealing at the superficial level, but again, Gassée was not this storyteller

01:01:28   that Jobs is.

01:01:30   Right.

01:01:30   And I didn't own a BeBox, which is still, it's a great name.

01:01:33   What a great name, but I had used one and I knew that they ran on the PowerPC platform

01:01:39   and famously they had this demo, which in 1995 or six or whenever they did the demo,

01:01:44   had four windows on a 15 inch or 17 inch display running four different videos at the same

01:01:52   time.

01:01:52   Full motion video.

01:01:53   It was incredible.

01:01:54   It was incredible.

01:01:54   One of the great demos of all time.

01:01:56   It was jaw dropping.

01:02:00   So I was of the opinion that they should have bought Be and not Next.

01:02:00   I thought Next was old news already and had sort of lost their workstation war to Sun

01:02:06   and that a workstation OS wasn't really the right thing for consumers.

01:02:11   I was familiar with Be.

01:02:12   I knew Be had better color support.

01:02:15   Eh, I would have been, you know, I will admit I was on the record of thinking they should

01:02:19   have gone that way, but when they went the Next route instead, I at least thought, even

01:02:25   though I thought I wish they had bought Be instead, I thought at least this gives them

01:02:30   a chance.

01:02:31   And also it wasn't hard to believe that Steve Jobs could understand Apple.

01:02:35   Right.

01:02:36   No, like that was not the part where we're like, I don't know if he knows the culture.

01:02:41   Like that is not the challenge.

01:02:42   And you know, famously in hindsight, he was not instantly named CEO.

01:02:46   He was an advisor and then used political clout, and Amelio was ousted, and then

01:02:53   he took the CEO title.

01:02:55   Which is very Twitter-esque actually now that you think about it.

01:02:57   Right.

01:02:57   Like that's that Dick Costolo tweet about like the first day of COO, step one, undermine

01:03:03   the CEO.

01:03:04   Yeah, first day.

01:03:05   He was the interim, you know, the joke was he was the iCEO, you know, interim CEO.

01:03:12   I'll be the temporary CEO just while we find a permanent CEO.

01:03:17   Right, he was working real hard to find a replacement.

01:03:18   But, you know, he's admitted to his biographer, you know, that he did it because he was

01:03:23   uncertain that even he could save Apple and he just did, you know, that way.

01:03:26   You don't want to get the stench of failure on you, right?

01:03:29   Nobody likes a failure.

01:03:30   Nobody likes a loser, right?

01:03:31   Yes, exactly.

01:03:32   And that's the thing is, I mean, I think an important part of this whole story is Next

01:03:38   also would have failed, right?

01:03:41   They were definitely going to fail.

01:03:43   Right.

01:03:44   Like they had no negotiating leverage except his storytelling ability, but it's not like

01:03:47   Next was going to win.

01:03:48   And I think that's part of it is like, he did not want to have a failure because America

01:03:53   hates losers.

01:03:54   And, and, and also because, you know, he was a child of immigrants and I'm sure we're

01:03:58   all aware of that.

01:03:59   But I think that's this part that is really interesting with Musk.

01:04:05   It's like Musk is very clearly in his flop era.

01:04:07   It is dead obvious, right?

01:04:10   Like you jumped the shark going on SNL, you jumped the shark having the like the musician,

01:04:15   you know, a girlfriend and then the breakup and the whole thing.

01:04:18   Like, like he is every cliché of the, like the middle-aged guy.

01:04:22   Like he already had the hair plugs, like the whole thing about being in his flop era.

01:04:26   And then this is the like grasp at relevance to sort of be like, I can do this.

01:04:32   I'm this genius.

01:04:33   And also a lot of Silicon Valley guys are obsessed with having a third win because then

01:04:38   you're in jobs territory.

01:04:39   Right.

01:04:40   Right.

01:04:41   Right.

01:04:41   Right.

01:04:42   Because then it proves that weren't you weren't a fluke, right?

01:04:45   You know?

01:04:46   So, but basically that's my feeling is, you know, Steve Jobs and Next wouldn't have been

01:04:50   my first choice in 1997, but I thought at least this gives them a chance, especially

01:04:55   if Jobs stays around and, you know, it turns out that was right.

01:04:58   And Jobs also was not in a cultural moment where tech CEOs were cultural figures.

01:05:04   Oh, definitely.

01:05:04   And where people were obsessed with them, like they were nerds and unpopular in 97.

01:05:09   And that's really, really important because you had to want to do it.

01:05:12   And he was a person who did date musicians because they liked it.

01:05:16   It wasn't cool.

01:05:17   Right.

01:05:18   And so like, this is a really, like this, the thing again, where the sort of the fun

01:05:21   house mirror version of this is like, you know, the, the, the difference between Grimes

01:05:26   and any given folk singer that, that, you know, Jobs is attached to Joni Mitchell, I

01:05:31   believe.

01:05:32   Yeah.

01:05:32   Joan Baez, maybe both.

01:05:33   Joan Baez, not Joni Mitchell, but, but, but Joni Mitchell, different vibes, but, but

01:05:37   Joan Baez.

01:05:38   And that's the thing is like, Joan Baez was not going for somebody cool and not going

01:05:43   for the richest guy in the world.

01:05:44   Well, like if they were, I don't know if they were ever photographed together while

01:05:47   they were out, but if they had been every, it would have been there's Joan Baez with,

01:05:51   I don't know who, some guy who works in technology, some handsome young man.

01:05:54   Yeah, right.

01:05:55   Exactly.

01:05:55   And, and so, so that's really, really important, right?

01:05:57   Because, because the, the thing that people can't understand is like, these things are

01:06:01   shaped the same, aren't they the same?

01:06:03   And yet the cultural signifiers are the exact opposite, right?

01:06:06   Like he absolutely could talk fluently about music to Joan Baez in a way where they're

01:06:11   credibly spending time together because they care about creativity and culture.

01:06:15   And that is the opposite of by the time I became the richest person in the world, I

01:06:20   could find a cool musician to date me.

01:06:22   And, and, and that part, like, and that's the same reason you have the difference in

01:06:26   this approach to, and then like, and I'm not somebody who idealizes Jobs.

01:06:29   Like I have lots of criticisms of him, but I think that part about like the story has

01:06:33   to be true at some level, if you're going to tell a good story.

01:06:36   And in the story of like, when Elon Musk was a little boy, he dreamed of building a

01:06:41   communication network so people could share their ideas with each other is not true.

01:06:45   Yeah.

01:06:45   Well, before we move on, I will just say that that's the last part of my opt- I'm

01:06:51   more optimistic than I have been in years was at least with Elon Musk, I thought he

01:06:56   will shake up the company.

01:06:57   It needs a shake up.

01:06:58   And he, he, he himself obviously uses and loves Twitter.

01:07:03   And I think that gives him a chance.

01:07:04   I don't think he's good at it.

01:07:05   No, and we'll get to that.

01:07:07   You know, that might be a good sign.

01:07:09   But I do think that that's been a profound problem for Twitter for its entire lifetime,

01:07:14   that it's never been led by people who really seem to use it.

01:07:20   You know, I mean, Jack used it, but not, I'm not saying you have to tweet prolifically.

01:07:26   I mean, maybe Jack used it enough, but Dick Costolo didn't really use it at all.

01:07:30   No.

01:07:30   And the thing is he would have been good at it, but I think the culture is such that he

01:07:34   couldn't have done it.

01:07:34   Yeah.

01:07:35   I've met Dick.

01:07:35   I met him at a South by Southwest, I think a long time ago, but he's very fun.

01:07:39   He's a funny guy.

01:07:39   He really is.

01:07:40   Yeah.

01:07:40   He's a brilliant, he did stand up comedy actually.

01:07:43   Now that I think about it, I mean, that's how funny, you know, like successfully, not

01:07:47   like in a cringy Elon Musk.

01:07:49   Not on Open Mic Night.

01:07:50   Well, not Open Mic Night, not Elon Musk hosting SNL; he actually, you know, could do it, but

01:07:57   therefore could have been, you know, comedians in my opinion tend to be excellent at tweeting.

01:08:03   Well, and also like it's the skill is especially for improv comedy, reading the room.

01:08:10   Right.

01:08:11   And this is this really pertinent thing about, you know, how do you become, how do you have

01:08:16   the right set of skills?

01:08:17   But you know, I think there's just such a, it's funny because all these examples keep

01:08:21   coming back to culture.

01:08:22   You talk about comedy, you talk about music, you talk about the arts.

01:08:25   Like these are things that are about culture,

01:08:27   influence in what humans are passionate about, and those are not the, you know, the drivers

01:08:34   for this, this sort of like outside acquisition.

01:08:37   It was ego.

01:08:37   And also the ambivalence about it is the indication.

01:08:39   Like I'm not passionate about this thing.

01:08:41   And those are the, again, zero question that, you know, a turnaround by Steve Jobs is, is

01:08:48   he passionate about the computers that Apple builds?

01:08:51   Like it's, it's like why he gets up in the morning, right?

01:08:55   Lives, breathes, and eats it.

01:08:56   And, and if you're like, I wasn't sure if I wanted to buy it or not, but then I got tied

01:09:00   into the deal and I had to go through with it.

01:09:02   You're like, man, that is, that's the opposite energy.

01:09:05   Everybody knew and still knows that the reason Keynote is one of the best apps Apple has

01:09:10   ever made and that anybody's ever made is that Steve Jobs himself was a diehard user

01:09:14   of Keynote.

01:09:15   And in fact, he had been using Keynote before there even was Keynote.

01:09:20   There was some app that Next made, you know, that, that was the roots of Keynote.

01:09:23   And in between the Next years and when they announced, okay, we're going to come out with

01:09:28   this iWork suite for the Mac, you know, with Keynote and Pages and Numbers.

01:09:33   Numbers.

01:09:33   Yeah.

01:09:34   He had been using what is Keynote throughout all that time.

01:09:37   He had a team of people making it.

01:09:39   It was that I bet he had great product feedback.

01:09:41   Oh, I'm sure because he used it all the time.

01:09:43   Yeah.

01:09:43   All right.

01:09:45   Anyway, let me take a break here.

01:09:46   The holidays are approaching, so it's time to start thinking about what you're going

01:09:50   to get for gifts for your loved ones.

01:09:51   And I'm sure some of them are hard to shop for.

01:09:54   If you're looking for a gift for somebody who's hard to shop for it, but you know, they

01:09:57   love coffee.

01:09:58   Look no further than a personalized coffee subscription from Trade Coffee.

01:10:03   Trade Coffee is a coffee subscription service that makes it so simple to discover new coffees

01:10:09   and make your best coffee at home every day.

01:10:12   Trade partners with the nation's top rated independent roasters to send you coffee that

01:10:18   they know you'll love fresh to your home and on your preferred schedule.

01:10:23   You want a bag every week.

01:10:24   You want a bag every two weeks.

01:10:26   You want it every 10 days.

01:10:27   You can get it.

01:10:28   You want to pause it because you're going on vacation.

01:10:31   Do you want to change from weekly to biweekly because you're actually not going through

01:10:37   it fast enough?

01:10:38   It's all easy to do on the fly.

01:10:40   Whether you already know what type of coffee you like, or if you're new or your giftee

01:10:46   is new to specialty coffee and need some help, Trade makes it easy and convenient to discover

01:10:51   new coffees.

01:10:52   It's a perfect gift for loved ones.

01:10:54   It's a perfect gift for anybody who's a coffee lover.

01:10:56   It's also a perfect gift for yourself if you just want fresh coffee just magically delivered

01:11:02   to your door on a regular schedule.

01:11:04   And it really is terrific coffee.

01:11:05   I had some at the beginning of the podcast right here today.

01:11:08   Treat yourself or the coffee lover in your life with Trade Coffee right now.

01:11:13   Trade is offering listeners of the talk show a total of $30 off a subscription and access

01:11:20   to limited time holiday specials at www.drinktrade.com/the-talk-show.

01:11:27   That's www.drinktrade.com/the-talk-show for $30 off www.drinktrade.com/the-talk-show.

01:11:36   Ah, all right.

01:11:39   That was why I was optimistic.

01:11:42   So you know what's interesting too is there's a broader thing that's happening in like some

01:11:47   parts of tech, the sort of like I said, the tech tycoons.

01:11:49   I think if you look at Musk, you look at Peter Thiel, you look at the sort of Andreessen

01:11:54   Horowitz VC folks, they've kind of radicalized each other into the opposite of what they

01:12:00   used to be.

01:12:01   So there was a narrative that venture capitalists would say at the beginning of like the web

01:12:05   2.0 era, which is like, you know, you're the founders, you're the visionaries, you're the

01:12:09   geniuses, we just want to fund the brilliant things that you make so that they can change

01:12:14   the world.

01:12:15   And that was a very common refrain.

01:12:17   They all said that.

01:12:17   And then you go back, actually, this is probably the beginning of the year, so more than a

01:12:22   couple months ago now.

01:12:22   Andreessen Horowitz put out really a political platform.

01:12:26   They called it like the, you know, America Rebuilding or something like that.

01:12:31   And it's a very explicitly strongly political document about the way that, you know, capital

01:12:38   should be allocated as a country and what we should invest in, what the resources should

01:12:41   be, and the rest.

01:12:41   And they sort of said, if you conform with this political platform that we are advocating,

01:12:51   then we will write you a check.

01:12:52   Right.

01:12:54   And this sort of even carried out in their partner Chris Dixon talking about web 3.0 and

01:12:57   the same kind of thing, which is like a very explicit political and economic set of goals

01:13:02   and an approach.

01:13:04   And if you conform with that, then we'll cut you a check.

01:13:07   But it's akin to the old days when you would get a loan from, you know, for your mom and

01:13:12   pop company from the bank.

01:13:14   You're like, I want to open a flower shop.

01:13:15   And they're like, okay, we're going to tell you what flowers to sell.

01:13:17   And if you sell those, you can have a lot.

01:13:19   And it's the opposite.

01:13:21   And it's a rare thing where you go like a 180 from the narrative of half a generation

01:13:27   before of like, whatever you are, you know, where our job is to support you and give

01:13:31   you resources to.

01:13:32   This is what we're commissioning, right?

01:13:35   Like we're the Medicis.

01:13:36   And we expect you to deliver this.

01:13:39   And I think that inversion of power of like the sort of the big central capital folks

01:13:43   saying, we want you to follow our, you know, sort of political whims is a really stark

01:13:52   and striking change.

01:13:53   And it's very different.

01:13:54   It leads to different things being made and a different sort of allocation of resources

01:13:59   to what people focus on.

01:14:01   But it's such a, I think it's a direct cause of why we're at the moment that we're at

01:14:08   with not just Musk, but sort of across the board in tech is like, if you had the ambition

01:14:14   to sort of do something really interesting and empowering for people, you know, the first

01:14:20   test is going to be well, but does this, you know, help us do what we're trying to do.

01:14:23   And I think the good example is to look at, you know, in the last generation, one of the

01:14:28   big wins for the venture capitalists is something like Uber, right?

01:14:30   And, you know, there's a lot of, I think everybody's hashed out a lot of the pros and the cons

01:14:34   like you can, you know, you can Google that stuff, but there are a couple of inarguable

01:14:39   things about Uber.

01:14:40   One of which is that it's never made money and it's never come close.

01:14:42   And that the investors who invested early in Uber have made billions of dollars off

01:14:50   of it going public.

01:14:50   And also that it's had, you know, a transformative and deleterious effect on the workers who

01:14:57   drive cars and used to make money doing so.

01:15:01   And that thing of you can make lots and lots of money as an investor by sucking all of

01:15:09   the money out of a market, even to the harm of the people who were in that business before.

01:15:12   And it doesn't matter if the business ever makes money because you'll still get ahead.

01:15:17   And then of course we've seen sort of extreme versions of that with the crypto world.

01:15:21   Where, like, again, people are losing their shirts in the crypto crash.

01:15:24   You know, Andreessen Horowitz has had the most profitable fund they've ever had in their

01:15:29   history investing in the web three stuff.

01:15:32   This is a really striking shift from how we used to think about making technology.

01:15:36   And then it leads to the takeaway, which is you should tell people what to do because

01:15:41   you'll make lots of money that way, even if it all crashes and burns.

01:15:48   You're really cheering me up here, Anil.

01:15:48   I'm sorry, but this is why, you know what, this is why I wasn't optimistic about Musk

01:15:53   coming in.

01:15:53   This is why I didn't have the same hope you did.

01:15:57   Like this will shake things up because I'm like the direction you're shaking when you

01:16:02   shake things up matters.

01:16:03   And I'm like, I don't believe that he wants to solve the problem that Twitter was born

01:16:07   to solve.

01:16:07   And that's still my lens of like, why did you start doing this?

01:16:11   And I'm like, if he had been skeptical of the other thing, because he is one of those

01:16:15   people who says you should make stuff.

01:16:15   Cars are real.

01:16:16   Rockets are real.

01:16:17   You should make stuff.

01:16:18   Right.

01:16:19   And then I'm like, okay, okay.

01:16:20   But how are you approaching this?

01:16:23   And it was not, I know where we're headed.

01:16:25   This is the vision.

01:16:26   This is the story.

01:16:28   Which would have been what it looked like if this is the thing he cared about.

01:16:30   Well, let me read a tweet from you.

01:16:33   I think you tweeted it too, but I've got it on your Mastodon account in front of me.

01:16:36   This is from April.

01:16:38   Back in April of this year, you wrote people, this is an April.

01:16:42   This year would be, I think after Musk announced that he wanted to buy Twitter.

01:16:49   I think it was the day of the first conversation about it.

01:16:51   All right.

01:16:51   There hadn't been anything.

01:16:52   And there had been a couple of weeks before where he might invest in Twitter.

01:16:55   They might put him on a board.

01:16:56   He said, F you.

01:16:57   We don't want you on the board.

01:16:59   This, that.

01:16:59   And then he was like, you know what?

01:17:00   Screw it.

01:17:01   I'm just going to make an offer to buy you and take you private.

01:17:04   And here is what you wrote in April.

01:17:06   People are really not realizing how F-ing terrible.

01:17:10   You wrote the real word.

01:17:11   I'll just say F-ing in case people are trying.

01:17:14   I swear enough on the fly on this podcast.

01:17:16   But I'll try to keep this clean for people who listen to the...

01:17:20   I was cursing.

01:17:21   I'm sorry.

01:17:21   Right.

01:17:21   People are really not realizing how terrible the Musk era of Twitter is going to be.

01:17:27   They see it as some amusing novelty when it's actually going to reveal how many places Twitter

01:17:32   was actually making good choices because those will end.

01:17:35   So, you know, being right points to you at least so far.

01:17:39   But you know what?

01:17:41   You know what actually inspired that was a couple of years ago, I'd had an interaction

01:17:48   with one of the guys who hosts the Chapo Trap House podcast.

01:17:52   And those guys at that time, I don't know how they feel now, but they really didn't like me.

01:17:56   And it is one of those, like, you know, people being dicks to you on Twitter kind of things

01:18:02   where I was like, Oh, this sucks.

01:18:03   And then one of the guys, Felix reached out to me and he's like, Hey, you know folks at

01:18:08   Twitter, right?

01:18:09   And I was like, yeah.

01:18:09   And he's like, we found these folks that are reporting kids in Saudi Arabia who are

01:18:18   gay and reporting them to the authorities because it's illegal to be gay there.

01:18:22   And they can be imprisoned and even sentenced to death.

01:18:25   And, you know, it's something he's very fluent in and cares sincerely about.

01:18:29   And he, you know, the fact that he reached out to somebody who he had been publicly dunking

01:18:33   on for months is an indication of sincerity.

01:18:36   And I was like, yeah, you know what?

01:18:37   Like none of this Twitter beef means anything.

01:18:39   These kids are in harm's way.

01:18:41   And so I connected him to the right folks and, you know, kudos to him for raising the

01:18:45   issue and getting it handled.

01:18:47   And kudos to Del Harvey who used to be in charge of trust and safety at Twitter for

01:18:52   doing the right thing on her end.

01:18:53   And this was based on these teens in Saudi Arabia, what they were doing on Twitter, either

01:18:58   direct messages.

01:18:59   So they were on Twitter.

01:19:00   And I don't know the full circumstance because I don't read all of the relevant languages,

01:19:06   but what was clear to me was they had not outed themselves.

01:19:12   And that the people who were doxxing them as being gay had like real world visibility,

01:19:20   like physical world visibility into who they were.

01:19:23   And so knew the implications of what they were doing there.

01:19:25   It could not be a more stark and obvious example of what these platforms can do and things

01:19:29   that Silicon Valley insiders don't think about, which is, you know, real people's real lives.

01:19:37   The example I always go to is the Facebook one, which is, you know, Amnesty International,

01:19:42   which could not be more credible and respected globally, has said that Facebook played a

01:19:47   central role in the enabling the Rohingya genocide in Myanmar because of its information

01:19:53   policies.

01:19:54   And I know Rohingya people, right?

01:19:56   This is not abstract to me.

01:19:57   And this is like, you know, these can be my cousins.

01:20:00   They look like my family members.

01:20:01   And this is a thing that doesn't get talked about very much in the Musk conversation.

01:20:07   Nobody said, what are you going to do?

01:20:09   Because he said, you know, I'm about free speech and then one of his early policies,

01:20:12   actually around this time that I wrote that message was, you know, anything that's legal

01:20:16   to share is going to be legal on Twitter, right?

01:20:19   Because that's what free speech means.

01:20:20   And it is absolutely 100% legal to out a gay child underage in Saudi Arabia and lead them

01:20:31   to the destruction of their life.

01:20:34   That is a legal thing to do.

01:20:35   And it is immoral.

01:20:37   Right.

01:20:39   It's a terrific example because it's clearly not the right policy for Twitter.

01:20:43   Right.

01:20:44   It's just such a crystal clear example.

01:20:46   And it's also one that is very familiar to people who practice, like there's trust

01:20:49   and safety organizations now as industry organizations, where people have this trade

01:20:54   craft across many different platforms and learn from each other and practice how do

01:20:58   you do this stuff and what are the concerns and what do you need to be aware of?

01:21:01   And there are people that have been doing this for 20 plus years, you know, and, you

01:21:06   know, that that's something where like you, you can become an expert in it.

01:21:10   And in doing so you can save people's lives.

01:21:12   And these are not people who were consulted or even considered in this transaction.

01:21:18   And now they're all gone.

01:21:19   All of them at Twitter, every person who had knowledge of this domain of problem is gone.

01:21:24   And I didn't know that was going to happen in April, right?

01:21:28   Like, but I knew that he wasn't asking about it.

01:21:31   And that is exactly the sort of thing that I have a terrible, you know,

01:21:36   in these long three weeks.

01:21:38   And again, just to go back, I'll just, it's my domain of knowledge, but to go back to

01:21:46   Steve Jobs and coming into Apple, he knew they needed to shake up.

01:21:49   They, at the time, I think I just reread the article.

01:21:53   I think Apple laid off 4,100 people at some point after the Next reunification, you know,

01:21:58   which is a lot of people, you know, substantial percentage of the whole company, a substantial

01:22:02   percentage, but it was extremely measured, you know, and famously, I mean, this is not

01:22:10   like, oh, that sounds like a good story in hindsight, but I'm sure it didn't work out

01:22:13   that way.

01:22:14   But no, it really did work out that way where Jobs thought, I'm probably almost certainly

01:22:18   going to have to lay off the entire hardware design wing of the company because their

01:22:26   hardware, everything they make, looks like shit.

01:22:28   So obviously they all suck, but didn't just axe them upon taking CEO, went and met them

01:22:36   and found a young designer who was leading them named Jony Ive and saw that they had

01:22:42   all sorts of wonderful ideas that the company just wasn't making and that there was an enormous

01:22:48   amount of talent.

01:22:49   And he also, famous Jobs also famously then said that he was extremely surprised and it

01:22:54   gave him, once he got to know the company, how much software engineering talent was still

01:22:59   there because he just assumed that all of the, Apple was in such trouble that surely

01:23:06   none of the, there are no good software engineers left because they would have left.

01:23:10   They already left, right?

01:23:11   No, and it turned out they love the company and its goals and its ideals so much that

01:23:15   they still were there and it was talent to be tapped.

01:23:18   Right, they were just enduring shipping things that they were crappy.

01:23:22   Right, and without, just as an outside observer, with the timeline of Twitter's layoffs since

01:23:31   Musk took over, you don't have to be an insider to think that they laid people off willy-nilly,

01:23:43   you know?

01:23:43   And there's reports even that they laid people off and then realized some of the people they

01:23:48   laid off were actually essential and asked them back 24 hours later.

01:23:51   I mean, I have a friend I've known for quite a while who had that experience.

01:23:56   I don't know anybody firsthand, but I know I can confirm secondhand that yes, I know

01:24:01   somebody secondhand who that story is actually true.

01:24:05   It wasn't just one person, it was quite a few.

01:24:08   That's not how you do layoffs, wisely, you know?

01:24:13   Well, it's also just the inhumanity of it all.

01:24:15   All of it was needlessly cruel.

01:24:16   I mean, I think the same day that Twitter began its layoffs, they had layoffs at Stripe,

01:24:19   the payment company.

01:24:21   And Stripe is obviously not Twitter in a million ways.

01:24:25   One, it's very behind the scenes, sort of technical, but two, it's actually much bigger

01:24:30   business in terms of dollars.

01:24:32   And it's always awful to people being laid off and to go through that experience.

01:24:41   They were so thoughtful about it.

01:24:43   I mean, they really articulated what mistake they'd made, that it was a mistake, that

01:24:47   the leadership is the ones who are accountable for why this has to happen in the first place,

01:24:51   but here's what we can do about it.

01:24:52   They built alumni email addresses for people to be able to be reachable and be able to

01:24:57   connect with one another because one of the most dehumanizing aspects of a layoff is you

01:25:02   lose contact with your coworkers who were the people around you.

01:25:05   All those things they sort of thought through, and I thought, you never want to have to do

01:25:10   it, but if you are going to have to do it, here's how you can do it.

01:25:13   And it was the same day.

01:25:15   It was the same day.

01:25:16   It was within the same 24 hours.

01:25:17   And you can't make up the serendipity or the contrast.

01:25:22   Yeah.

01:25:23   Yeah.

01:25:23   And it was stark too, because I also think, so Patrick and John Collison, the brothers

01:25:28   who founded Stripe, they're Irish.

01:25:30   And I think a big part of this too is also the culture, right?

01:25:33   They come from a culture where you're supposed to treat people with dignity at work.

01:25:39   There's a limited number of hours you work and people are supposed to be paid and there's

01:25:42   a social safety net and any manner of considerations; it's a very proud worker culture in Ireland.

01:25:49   And so I think that informs their sense of obligation and sort of social responsibility

01:25:57   around it.

01:25:58   I don't want to be too reductive.

01:25:59   Obviously, it's also they made a good choice and their leaders make good choices all the

01:26:02   way down, but that contrast sort of draws.

01:26:05   There's no reason that these have to be different.

01:26:07   There is no reason that these had to be so stark or contrast in terms of the humanity

01:26:11   of it.

01:26:12   And this is why I say, I think one of the galvanizing forces for all this is that poisonous

01:26:18   analog to owning the liberals, which is within the tech tycoon circles in that conference

01:26:27   room with David Sacks and Jason Calacanis and all these folks that Musk has around him.

01:26:31   This is how we show we're strong, which is the sort of classic thing that weak and insecure

01:26:36   people do.

01:26:37   We show we're strong by bullying these people who we have power over.

01:26:40   Right, that doing layoffs in the most dehumanizing.

01:26:45   I don't think there's any other word for it.

01:26:46   And again, you can come up with a hypothetical science fiction scenario of the worst, but

01:26:52   in practical real world terms, it's hard to imagine how Twitter could have done it in

01:26:57   a more dehumanizing way.

01:26:59   You know, with emails going out at midnight that said, here's what you do.

01:27:04   You check your email at eight in the morning and it's either going to find out if you got

01:27:08   a job still.

01:27:09   Yeah, it's just I mean, I would even come up with an idea like that.

01:27:16   I mean, it's and what you know, how what was the rush?

01:27:22   The rush wasn't the however bad Twitter's finances are and they're not good.

01:27:30   It's famously one of the problems is overall in the history of the company, they're unprofitable.

01:27:35   And I do think they're bloated.

01:27:36   I've said this, you know, I'm sure they're quite bloated.

01:27:40   You know, the layoffs were the right move, but there was no reason to do it on 12 hours

01:27:47   notice.

01:27:47   You know, it's just not even close, right?

01:27:50   It's not even worth perseverating on the point here on the podcast.

01:27:53   It is.

01:27:54   And other than to point out the point you made, which is that the cruelty of it was the point,

01:28:00   you know, that that doing it in this fashion wasn't a happenstance or oh, they didn't even

01:28:07   it didn't really occur to them that this is a sort of cruel, cruel way of doing it.

01:28:15   Well, and it's a performative cruelty for their peers, right?

01:28:18   This is this thing that they're sort of, I think, we're going to see some one-upmanship on this.

01:28:23   I think the others in that cohort are going to do sort of similarly depraved things, you know,

01:28:28   in the months and years to come because that standards now been set.

01:28:32   He's being cheered on by his fanboys for this.

01:28:35   Like, let's keep in mind.

01:28:36   It's not like people's reaction to this was the human and, you know, rational thing where

01:28:41   you sort of say, gosh, these are people with lives and families.

01:28:45   And why are you acting this way?

01:28:46   Like his fans are like that shows them.

01:28:49   This is us getting back at, I don't know, woke culture or whatever their argument is.

01:28:53   And I guess, you know, I did not see this coming.

01:28:57   I, you know, I'm not a David Sachs fan.

01:29:00   I don't know him that well, but I know of him, you know, and I know that he was close to Musk,

01:29:04   but I really did not expect David Sacks to be at Twitter headquarters Friday night, you know,

01:29:10   interviewing engineering managers and making a list of, yeah, that guy, you know, seems okay.

01:29:18   Keep him.

01:29:21   She doesn't; just get rid of her.

01:29:21   You know, I did not expect him to be playing a role like that.

01:29:25   I did not expect massive layoffs to happen within a week.

01:29:29   Wait, obviously nobody could do that within a week, right?

01:29:32   No, no, you can't.

01:29:33   And it's the whole point.

01:29:34   How else do you find out that there's a Jony Ive in the design department

01:29:38   other than taking your time?

01:29:39   There's the right pace.

01:29:41   It's not, oh, we have all the time in the world.

01:29:43   You know, urgency is different; urgency is okay.

01:29:52   Acting as though the building is on fire is not, when it's not.

01:29:55   I just didn't see it coming.

01:29:59   And I think I put too much faith in the fact that he obviously knows,

01:30:05   Musk knows what it's like to lead very large companies that need

01:30:11   very talented people working for them to succeed, right?

01:30:17   There is no way to create a rocket ship that works, let alone to innovate

01:30:24   and have a rocket ship that can go up and then come back down and land on a raft,

01:30:29   as opposed to going, you know, and again, you make breakthroughs like that.

01:30:32   I forget what you were talking about earlier on the show where we look back

01:30:36   in, you know, years and laugh, but in hindsight, it's crazy that the way

01:30:41   that we've been putting things, satellites, into space for 50 years

01:30:46   involves massively expensive rockets just ending up at the bottom of the

01:30:50   Pacific or Atlantic oceans, you know?

01:30:52   Yeah.

01:30:52   Yeah.

01:30:52   That's slightly wasteful.

01:30:54   Right.

01:30:54   That's a high five.

01:30:56   That's the team mission.

01:30:57   That's the victory.

01:30:58   Mission control is high-fiving each other.

01:31:00   You know, it's Miller time, you know, celebrate.

01:31:02   It was a successful launch.

01:31:03   We just dropped a $50 million missile in the ocean.

01:31:07   Right.

01:31:09   You make breakthroughs like that by acquiring and holding talented people.

01:31:17   Yeah.

01:31:19   So what I'll give is that people talk about coders, right?

01:31:22   Or programmers.

01:31:23   Are they talking about one?

01:31:24   I think there's a, again, amongst the bubble that he's in and the sort of

01:31:28   most extreme of right wing media.

01:31:30   I think there's this perception of like, Twitter was like 90% content moderators

01:31:34   just trying to shut down anybody saying anything nice about Republicans.

01:31:38   You know, millions of sort of this weird, distorted thing.

01:31:40   And it's like, it's mostly a bunch of coders.

01:31:41   Like, it's a technology company, but also, you know, you'll know

01:31:46   what this is, maybe not everybody else: the SREs, the site reliability

01:31:50   engineers, this is a very specific discipline within large tech companies,

01:31:56   but it is, you know, what it sounds like, it's about reliability.

01:32:00   It's making sure that the very many complicated

01:32:04   systems that run these modern internet platforms keep running.

01:32:09   And it's an extremely demanding, extremely technical practice, and very

01:32:18   frequently, you know, relies on being on call, just like doctors are, right?

01:32:22   It's like, if this thing blows up, you got to come in and fix it.

01:32:24   And Twitter has arguably some of the best SREs who have ever done this

01:32:31   work anywhere.

01:32:32   You know, you put them up against Google, you put them up against, you know,

01:32:35   Facebook, anybody you want to put up there and keep in mind, Google and Facebook

01:32:39   are much, much larger than Twitter.

01:32:41   As much as attention as Twitter gets, it is nowhere near in the same league as

01:32:45   the trillion dollar companies like Amazon and Google.

01:32:47   And yet some of the best SREs who've ever done the work have been at Twitter

01:32:53   historically.

01:32:53   And it's because of the challenge, because it is that hard to be the real

01:32:58   time information engine for the world.

01:33:01   And that group being decimated, like I can obviously articulate the story

01:33:08   around, you know, the content moderators.

01:33:09   We can all understand that, but this is the kind of thing where you have, you

01:33:13   have the rocket scientists, right?

01:33:15   You have the people who can do this unique task in the world in a way better

01:33:18   than anybody else has done it to the point where all the other platforms are

01:33:21   leaning on their technologies that they invent.

01:33:24   They're using the same architecture, you know, and those folks are all gone, or,

01:33:29   worse, cut willy-nilly.

01:33:31   People are feeling like the guy to the left of me and the person to the right

01:33:34   of me both got picked off and I'm still here and I don't know why I survived,

01:33:37   which is a lot of people's feeling.

01:33:39   It's so destabilizing.

01:33:42   And that's what I honestly find surprising, given that, you know, Musk

01:33:48   isn't coming from nowhere. I realize that Tesla's problems aren't similar to

01:33:55   Twitter's, but they're similar in the way that they need to,

01:33:59   like I said, acquire and hold on to talent.

01:34:01   Why in the world, given his leadership these three weeks, would anybody of talent take

01:34:09   a job at Twitter going forward?

01:34:12   I mean, SREs are leaving Tesla because of this. Like, this is the

01:34:16   thing, it is a community that exists beyond one company and they all

01:34:20   talk, and you go into that practice because you value stability.

01:34:26   That is your job.

01:34:28   The thing you're doing there, the thing you're sacrificing your nights and weekends for

01:34:31   and missing out on movies with your kids for, is because you think it's important

01:34:34   to offer stability to the world and the technology that people use.

01:34:37   And there is nothing less stable than this.

01:34:41   Like it could not be a more perfect way to undermine their values at a human

01:34:45   level, their sense of purpose in the world.

01:34:47   And as you alluded to earlier to demand personal fealty from engineers.

01:34:54   I mean, again, to me, demanding personal fealty is no way to

01:34:58   lead any group, anything.

01:35:00   Yeah.

01:35:00   It's always a sickness, but in this case, it's particularly the worst sickness you

01:35:04   could possibly have.

01:35:05   And I've often said this, I have many friends.

01:35:07   I know you do too.

01:35:08   I'm just, and I'm sure most of the, many of the people listening are engineers.

01:35:12   And certainly almost everybody who listens to my show knows software

01:35:16   engineers or hardware engineers.

01:35:17   I've always thought engineers are among the most interesting.

01:35:22   If you just, if that's all I know about you is that you're an engineer of some

01:35:25   sort, then there's a much higher, way higher chance than with a random person that

01:35:33   I would enjoy talking about politics with you, whether we agree or disagree,

01:35:38   whether you have voted Republican almost every time in your life.

01:35:43   And I've voted Democrat most of the time in my life.

01:35:46   I bet we could have a really interesting discussion about the

01:35:50   differences because the engineering mindset is to look at it analytically and

01:35:56   rationally and talk about actual problems and actual solutions, and to actively

01:36:02   try to take the emotion out of it.

01:36:06   Right.

01:36:06   And politics famously, it's, you know, nobody can completely detach emotion from

01:36:10   it, but you can try.

01:36:12   And that's what can give an interesting discussion between people who disagree

01:36:17   and probably will not come to agreement simply by having the discussion.

01:36:21   But that mindset is exactly the sort of thing that, you know, click

01:36:25   this link to say you agree that you're going to go totally

01:36:30   hardcore.

01:36:30   I mean, I'm not as offended by his memo that he sent out last night as some, some

01:36:35   people are totally outraged.

01:36:36   I see some of what he's saying there.

01:36:38   I certainly wouldn't have used the words that he used.

01:36:41   I think calling it going hardcore, I don't know how you could.

01:36:46   Yeah.

01:36:46   What's the show on HBO, Silicon Valley, right?

01:36:50   They, they mock, they mock Silicon Valley culture.

01:36:53   If I were in the writers' room.

01:36:57   This is more extreme than any parody.

01:36:58   Yeah.

01:36:59   If I were in the writers' room and the episode had the thinly veiled

01:37:04   Elon Musk caricature writing an email saying that their culture was going to be,

01:37:09   quote, totally hardcore,

01:37:10   I would say that's, that's too much.

01:37:13   We, you know, I get it.

01:37:14   So corny.

01:37:15   I mean, that's the other thing too, that I think is really important in all this

01:37:17   too.

01:37:18   So what more personifies the tech bro attitude? The guy's got no taste,

01:37:24   right?

01:37:24   Like this is this thing.

01:37:25   It's like, this is corny.

01:37:26   Right.

01:37:27   This is embarrassing.

01:37:28   That's an embarrassing thing to put in the subject line of an email. Who talks

01:37:32   like that in 2022?

01:37:33   We, right.

01:37:34   That's boomer behavior.

01:37:35   Why are you doing that?

01:37:36   Inspiring the whole company, that, you know, we need to build Twitter

01:37:42   2.0.

01:37:43   The company is not in good shape.

01:37:45   You know, we've had to let go of a lot of people.

01:37:48   So now there are fewer of us than there were.

01:37:51   We need to work harder than we've been working to build this thing.

01:37:56   There you go.

01:37:56   There, you know, I mean, I mean, the other thing is you would do that first,

01:38:00   right?

01:38:00   That's not the thing you would do after chaos for a month.

01:38:03   You can, you know, saying we need to work harder and, and come up with better

01:38:08   ideas and we need to do a better job of execution and ship new features faster.

01:38:13   All of it.

01:38:13   You could say that.

01:38:14   And that's the CEO.

01:38:16   And every CEO in the industry is saying that right now.

01:38:18   That's the CEO as the coach of the team, inspiring the team to, to play at the

01:38:23   highest level that they're capable of.

01:38:25   But it's, we got to be totally hardcore.

01:38:29   Totally.

01:38:29   It's ridiculous.

01:38:30   I also want to, I want to put out something.

01:38:31   So as we're recording this, I'm getting ready tomorrow.

01:38:34   I'm going to be at this.

01:38:35   This is going to be like the most political episode you've ever had.

01:38:38   I'm at this Obama Foundation, a democracy forum.

01:38:41   And they're, you know, sort of wrangling with all the different things that are

01:38:43   threatening democracy around the world.

01:38:45   No big topic there.

01:38:46   And we have conversation about, you know, misinformation, disinformation, media

01:38:51   manipulation, all these sort of related topics, but obviously Twitter is a huge

01:38:54   part of that conversation.

01:38:55   And there's an interesting thing where like Obama had done a speech at Stanford

01:39:00   in the spring of this year.

01:39:01   And it got, you know, a little bit of coverage, not actually that much,

01:39:05   because it's sort of, you know, I think people feel like they already know what he's

01:39:08   going to say.

01:39:09   And it was in his way, you know, very professorial and even handed and, you

01:39:15   know, like these are things that are not entirely compliments, but like, it's

01:39:18   fine.

01:39:18   He did what he does and he's a great speaker.

01:39:20   But in there, there was a really interesting section that jumped out to me,

01:39:25   which was he talked about, you know, everybody's focused on the algorithms

01:39:29   and what gets, you know, amplified and whether they're being fair to everybody.

01:39:33   But he's like, it's a market failure that we care this much about any one

01:39:37   platform, whether it's Twitter or anything else.

01:39:40   And I think it was right after Musk had talked about, you know, started saber

01:39:43   rattling around Twitter.

01:39:44   And it was a really astute analysis or is a really sharp point, which is the fact

01:39:51   that we care about any one of these platforms means the market isn't

01:39:54   competitive and that we're not inventing enough new ways to communicate and

01:39:58   connect and that we don't have enough, you know, and those weren't exactly

01:40:01   his words.

01:40:02   He didn't say it the way I would articulate it, open protocols, open standards,

01:40:05   interoperability, right?

01:40:06   Like, like nobody feels like beholden to their email provider.

01:40:10   Right.

01:40:10   And it was really stunning because I was like, one, it's true.

01:40:16   It's an absolutely correct diagnosis, right?

01:40:18   Like it shouldn't matter this much if Twitter gets broken.

01:40:21   You know what I mean?

01:40:22   Like it actually is a failure of the ecosystem that there isn't some other

01:40:26   player and, you know, in competitive markets, like it is, it has been

01:40:30   absolutely phenomenal for the iPhone that Android is such a strong ecosystem

01:40:35   and so innovative, right?

01:40:36   Like that's a great thing.

01:40:37   That is a competitive, fiercely competitive market market.

01:40:40   Even if there's a little bit of unfair play by the two big players who run it,

01:40:43   right?

01:40:44   But like it's inarguably competitive and Twitter does not have any direct

01:40:49   competitors, right?

01:40:49   Facebook is not, nobody said I'm fed up with Twitter.

01:40:52   I'm going to go to Facebook.

01:40:53   Not one person, right?

01:40:54   Even though everybody has a Facebook account, that's telling these are not.

01:40:57   The closest thing they have to a direct competitor would be Instagram, because

01:41:01   the basic paradigm is sort of similar, but nobody uses it

01:41:05   in the same way.

01:41:05   Well, they are socially and culturally right, right.

01:41:07   And because of Instagram's design and again, I could go on and on about it,

01:41:13   but it really does matter what you start as.

01:41:17   Even when you started, you know, when Instagram started as a, you know,

01:41:20   three, four, five people and Twitter was, you know, just a weird side project

01:41:26   of Odeo and there were only a half a dozen people working on it.

01:41:30   But the basic idea that little kernel that grows into this massive billion

01:41:35   user thing that Instagram has, and I know Twitter only has 200 million users or

01:41:40   whatever, but that matters.

01:41:42   It was, but yeah, you're right.

01:41:44   There is no direct competitor, but that's also

01:41:47   true of Instagram too, right?

01:41:48   We could do a whole podcast episode about the, the defacement of Instagram

01:41:52   over the last few years by Facebook.

01:41:54   I don't recognize, I don't use Instagram very much.

01:41:56   I go in like maybe every couple of months.

01:41:58   It's never the same twice and it is never what I think it is.

01:42:01   It's it, you know, and again, there is no real competitor, you know, you know,

01:42:06   I've moved to Glass, and I had the founders of Glass on the podcast a

01:42:09   couple months ago, but they don't describe themselves as a competitor to

01:42:13   Instagram.

01:42:13   No, they're doing a different thing that happens to be photos, right?

01:42:15   I actually have been really heartened by, like, I've been seeing Flickr

01:42:20   a lot lately since this sort of chaos.

01:42:22   And with the NASA launch of Artemis, talking about, you know,

01:42:27   innovation and competition.

01:42:28   The official NASA account photos are on Flickr, because Flickr has got

01:42:32   this, you know, the Flickr Foundation, where they're sort of preserving

01:42:35   historical photos over time.

01:42:37   And it was just such an interesting thing where I was like, look at that 20

01:42:40   years in, this is the place that, you know, our, our preeminent space agency

01:42:46   finds is a good home.

01:42:47   But going back to this point about the competition, right?

01:42:50   Like, that was a really key point that I think got lost in

01:42:55   Obama's conversation, but also what was telling was the reaction on Twitter

01:43:00   from Marc Andreessen, who is as powerful a figure as the tech industry has, was

01:43:06   Obama's telling you to shut up.

01:43:08   It's literally what he said, right?

01:43:09   Right.

01:43:10   And again, think of the contrast of a generation ago, venture capitalists

01:43:15   would say, we want robust competitive markets.

01:43:17   We'd like to fund the next Twitter.

01:43:19   If you've got a competitor Obama's right, build that competitor with us, right?

01:43:24   We need more competitive markets.

01:43:25   Who's the radical upstart?

01:43:27   This is what Silicon Valley's narrative was supposed to be.

01:43:29   We, we, we are disruptors.

01:43:31   We challenge the status quo.

01:43:33   We make radical things.

01:43:34   We think different.

01:43:35   That's what we are.

01:43:36   Right.

01:43:37   And instead he hears somebody saying at Stanford, right?

01:43:42   Where Google and Yahoo and all these things come from at Stanford.

01:43:45   We need to be more competitive and have more dynamic markets and invent new

01:43:49   things.

01:43:50   And he lies and says, this guy's telling you to shut up.

01:43:54   And that is not how it used to go.

01:43:56   Right.

01:43:57   That's just different.

01:43:57   And I think that change is so stark and so obvious that like, and it's hard

01:44:02   because like, I think people process these things as inherently political.

01:44:05   As soon as you hear Obama's name, you think, Oh, this is a political statement.

01:44:07   And it's like, it is not like, I am a very boring middle-aged dad who has been a

01:44:12   CEO.

01:44:13   I'm not a radical person structurally in what I am in the world.

01:44:17   And yet I'm like, I like that story that I should be able to be an entrepreneur

01:44:21   and be competitive in technology.

01:44:23   And the person who's telling me I can't is the guy who wrote a political tract

01:44:26   and said, we will funnel billions of dollars to anybody who builds in compliance

01:44:30   with our political tract.

01:44:31   And that guy is not Obama.

01:44:34   That guy's Andreessen.

01:44:34   All right, let me take one last break here.

01:44:36   Thank our third and final sponsor of the show.

01:44:38   It is our very good friends, or at least my very good friends at Squarespace.

01:44:42   Ah, Squarespace, you know what that is.

01:44:44   It's the all-in-one platform to build, design, host, update, and publish a

01:44:52   website, everything from domain name registration to choosing from an untold,

01:44:59   uncountable number of great professionally designed templates that you can tweak to

01:45:04   your heart's content.

01:45:05   And just all the features that you can think of.

01:45:10   They have member areas where they make it easy if you're a creator to monetize

01:45:14   your content and your expertise in a way that fits your brand with a member area.

01:45:18   You can do, you know, unlock new revenue and do stuff like have gated content like

01:45:25   videos or online courses or newsletters that are only available to your paying

01:45:31   members.

01:45:32   They have their own built in analytics, which is just a terrific design, just one

01:45:36   of the best analytic designs I've ever seen for a dashboard.

01:45:39   How much traffic you getting?

01:45:42   Where is it coming from?

01:45:43   What parts of your site are getting traffic?

01:45:45   Which parts aren't?

01:45:47   Online stores, you want to host a store?

01:45:50   Squarespace can do everything from having the actual catalog to doing the actual

01:45:55   e-commerce parts of the transaction part of having an online store.

01:45:59   You name it, they've got it.

01:46:02   What do you do to find out more?

01:46:04   Go to squarespace.com/talkshow.

01:46:09   You go to squarespace.com/talkshow.

01:46:12   You get a free trial, 30 days, full featured.

01:46:16   Everything you can do at Squarespace, you can do.

01:46:18   There's no watermark on the site that says, oh, you're in trial mode or something

01:46:22   like that.

01:46:22   Nope.

01:46:22   It's the real deal for 30 days.

01:46:25   And then when you're ready to launch, just remember that same offer code, talkshow,

01:46:30   and you will save 10% off your first purchase.

01:46:32   When your free trial is up, you can purchase a whole year at once and save 10% just by

01:46:37   going to squarespace.com/talkshow.

01:46:40   How bad do you think it is at Twitter?

01:46:45   Do you think it's... the people I talk to, every single person is leaving.

01:46:49   There's not one person I know.

01:46:50   And I know a lot of people at Twitter.

01:46:52   Well, I used to know a lot of people at Twitter.

01:46:54   Now I know a handful.

01:46:54   There's not one who is saying I'm staying.

01:46:58   And it's funny because even there's people, this is a really interesting thing.

01:47:03   I know a good number of people there who are like, this is my job.

01:47:05   This is what I do.

01:47:06   A lot of them had stints at other tech companies, right?

01:47:10   So they worked at Google or they worked at Microsoft or something.

01:47:12   And they're like, this is my stint at Twitter.

01:47:15   And they were not like, I'm somebody who's very politically opinionated and has

01:47:20   a point of view and is very mission-driven. They're like, it's my job.

01:47:23   It's what I do.

01:47:24   They have been radicalized by this.

01:47:27   Like they have come out, and what they say, because they know me,

01:47:31   they're like, all this stuff you've been ranting about all these years,

01:47:34   now I get it.

01:47:35   Like there wasn't some opt out.

01:47:37   There wasn't some way to like, not think about what is the political agenda of these

01:47:41   guys or what are they saying to each other?

01:47:42   Cause like I used to tune all that stuff out.

01:47:43   Who cares what somebody says about Elon Musk?

01:47:45   Like I just do my job and I'm fine.

01:47:46   And and in every case, what galvanized them and what radicalized them is cruelty to their

01:47:55   coworkers.

01:47:56   Like there's nothing more effective at making people suddenly have a really strong motivation

01:48:04   about what they want to see in the world

01:48:05   than them seeing good people harmed, you know, and that's really, really consistent.

01:48:10   And I'm sure you probably hear the same from people.

01:48:12   Talk to you.

01:48:14   They're like, I guess, you know, I'll be okay.

01:48:15   I'll figure it out.

01:48:15   But I cannot believe, you know, what I hear is like, my pregnant coworker is in, you

01:48:21   know, disarray.

01:48:22   Like they don't know if they're, they're going to be able to hold it together.

01:48:24   I hear a lot, you know, being of Indian descent, from a lot of Indian workers who are here

01:48:30   on visas, and their being in America is contingent on them having a job, and their job is at

01:48:37   risk.

01:48:37   And they don't know.

01:48:38   They said they're like, I don't know what's going to happen day to day.

01:48:42   And in many cases, their family back home in India is dependent on them sending money home

01:48:46   to them.

01:48:46   And so the idea of, like, your entire... immigration is a brutal process.

01:48:51   It's a terrifying process.

01:48:53   And the idea of all that being put at risk, even though you did your job, even though

01:48:58   you have, in some cases, a skillset that they say they want.

01:49:01   Well, it's just unfathomable.

01:49:04   It says a lot too, that we're obviously, I mean,

01:49:07   I'm not Warren Buffett here giving, you know, amazing insight into the market, but we're

01:49:13   obviously in a moment where the entire industry is tightening, right?

01:49:18   Yeah, for sure.

01:49:19   It's a button down moment.

01:49:21   Successful companies are laying people off.

01:49:24   Stripe is a very successful company and they had a layoff.

01:49:28   And they're being cautious.

01:49:28   I know Facebook has had a bad couple of years, but they're still very profitable.

01:49:34   They laid off 11,000 people, which is more than all of Twitter, all of Twitter, you know,

01:49:40   and Facebook laid off 11,000 people.

01:49:43   And I mean, Amazon just made a show of laying off 10,000 people, and it's not like, you know,

01:49:49   although in fairness, Amazon will churn through 10,000 people in that time period anyway.

01:49:53   So they might just be taking credit for what was going to happen.

01:49:55   Maybe, but it sounds like some of the people at Amazon are in the product division.

01:50:01   Like they're going to sort of get out of the business of making a bunch of their own.

01:50:06   I don't think they're going to entirely abandon it, but I think that we're going to see fewer,

01:50:10   you know, Fire, whatever, or Alexa devices.

01:50:16   Hopefully I didn't set off anybody's device by saying it.

01:50:19   So you know there's engineering tightening going on there, but yeah, the bottom line

01:50:26   though is this.

01:50:28   If you didn't get laid off at Twitter, if you were still there and you're choosing to

01:50:33   leave on your own, you're doing it at a time when it's probably the hardest it's been in five

01:50:39   years or will be in the foreseeable future to find another job because the places that

01:50:46   are-

01:50:46   Yeah, this is a tough moment.

01:50:47   Right.

01:50:47   Apple is not laying anybody off, but made a show of mentioning on their quarterly call

01:50:54   last week or 10 days ago that they are instituting an effective hiring freeze, so they're not

01:51:00   hiring.

01:51:00   Yeah, everybody's either frozen or going slow or whatever, at the best case.

01:51:04   Whereas, I don't know, even just a year or two ago really, certainly for a while, the industry

01:51:11   was in a hiring craze, right?

01:51:13   And if you had the talent and, even better, a resume that had a couple of good, well-known

01:51:24   companies on it, you had the ability to jump ship and go from Twitter and land a new job at Google

01:51:30   or something like that.

01:51:31   Right, right, you can just sort of switch gears pretty easily in most technical roles.

01:51:35   Whereas now it's the opposite.

01:51:37   It is probably one of the worst times we'll see, hopefully, to do that.

01:51:41   Nobody would do that lightly.

01:51:43   And I do think that's the part that I find.

01:51:46   So I just did not-

01:51:48   I don't like Elon Musk.

01:51:50   I'm not a fan.

01:51:51   I never followed his Twitter.

01:51:53   I think his jokes are stupid.

01:51:54   The guy's bad at Twitter, which I think should be relevant, but that's a side point.

01:52:01   I mean, we're talking about thousands of people going through duress, but also, guy's a shitty

01:52:05   tweeter.

01:52:05   Sorry, I'm like, you know,

01:52:10   But I really do think, I think you agree.

01:52:12   I mean, this is-

01:52:13   I'm actually stepping on your toes here, because this is your job.

01:52:16   I don't have employees, but this is literally your job.

01:52:18   I think you will agree with me wholeheartedly that for any company in tech, the single most

01:52:25   important thing is acquiring and holding talent.

01:52:30   It is-

01:52:31   Yeah, it's inarguable.

01:52:33   It's the hardest thing.

01:52:34   It's the hardest thing.

01:52:35   And also, in a great way, tech workers historically have been amongst the most empowered.

01:52:42   Coders have been amongst the most empowered workers of recent years in a great way because

01:52:47   they have this valuable set of skills, and there is such demand, and they're starting

01:52:52   to seize that power, which I think is incredible.

01:52:54   But this is that-

01:52:55   I think this is also part of it, is that amongst Musk and his cohort of these insiders that

01:53:02   I keep talking about, they really want to crush that worker power.

01:53:06   They really want to crush that sense of solidarity and organizing that's happening everywhere.

01:53:11   And so this is setting an example of like, if this is the, what they would say, quote

01:53:16   unquote, "wokest workforce," then we have to crush them the most and set this tone that

01:53:22   these people can-

01:53:23   And I do think-

01:53:25   Well, I mean, I think actually in the fullness of time, there will be nothing that more galvanizes

01:53:31   the union movement in the tech industry than Elon Musk's mismanagement of Twitter.

01:53:35   I wonder.

01:53:36   It could have that effect.

01:53:40   I think that in our field, I think what we're seeing with the people who remain at Twitter

01:53:46   leaving on their own, that you don't need a union to organize.

01:53:50   This isn't me arguing against a tech company like Twitter unionizing the workforce, but

01:53:57   I'm just saying part of it-

01:54:00   Well, I think it's for everybody that can't do it, right?

01:54:01   Because part of it is some of the groups that he targeted most, like content moderation

01:54:08   and some of the human rights groups and all that stuff that were inside the organization.

01:54:12   None of those are coders, so they don't have that same sort of power.

01:54:16   But they do have the ability.

01:54:17   But the one thing that's different in today's world, and especially the company like Twitter

01:54:21   that is a communication company, people know how to talk to each other, right?

01:54:26   That people within Twitter are communicating and they can organize in an ad hoc fashion

01:54:32   in a way that wasn't possible generations ago, right?

01:54:37   Well, yeah.

01:54:38   The only way that people who worked at General Motors could organize together on the factory

01:54:42   floor was through an official organization like a union.

01:54:45   There wasn't a Slack that they could talk to each other on.

01:54:50   Well, yeah, except, well, as we've seen with Twitter, right?

01:54:52   Slack is a tool that's owned and controlled by the company, and therefore they can monitor

01:54:57   and surveil and fire you for what you say in Slack.

01:54:59   And it's there.

01:55:00   The other thing is just legal protections, right?

01:55:02   Like we have the NLRB, we have labor laws, and the API, for lack of a less technical analogy,

01:55:10   the API for accessing those protections for labor, is organizing.

01:55:14   And so I think, you know, I'm agnostic as to like the specific implications.

01:55:19   Like I look, I think it's telling, for example, the Amazon union in Staten Island is an independent

01:55:24   org.

01:55:24   It's not affiliated with any, it's not like SEIU or whatever.

01:55:27   It's like an independent org that, you know, the workers put together.

01:55:31   And I think that that's a really sensible and modern and dynamic way to do it.

01:55:37   And so that makes sense.

01:55:37   And again, that's a good example where like the myth of Silicon Valley that I grew up

01:55:43   on would have been like, oh, you independently organized a major new institution, and were

01:55:47   able to transform one of the biggest incumbent companies in the world by doing so?

01:55:51   Like that's disruptive.

01:55:52   That's great.

01:55:53   But that's not the reaction to Chris Smalls and the team in Staten Island of like, wow,

01:55:58   look at what technical innovators you are, you know, Bezos is like, we have to crush

01:56:02   you under our heel or I guess a chassis.

01:56:04   And sort of similarly, I think of this as like, I don't, I'm agnostic as to what structure

01:56:08   it takes at Twitter, but I think the key takeaway people are going to have is we got to stick

01:56:13   together because they will try and pick us off and wear us down if we don't.

01:56:17   I'm just flabbergasted at how obviously stupid it is, you know, and product decisions

01:56:24   can be reversed.

01:56:25   So I think his whole, the whole, you mentioned the whole blue check mark thing and being

01:56:30   worried about it and taking that.

01:56:33   Even giving a shit about it.

01:56:34   It's such a corny thing to care about.

01:56:36   It really is.

01:56:38   I always thought, I mean, I've heard people, you know, talk about people who

01:56:41   have them as being the blue checks, and I think it's so silly. I have one because,

01:56:48   you know, I write the site, and, you know, I got mine automatically after Mat Honan was

01:56:53   hacked a couple of years ago and they just found all the people who were like Mat Honan

01:56:57   and I, you know, I fit that profile, and they verified us so that our accounts

01:57:02   couldn't be hijacked, because they realized, oh, people, you know, of my profile,

01:57:06   it'd be a juicy target.

01:57:09   It would be a juicy target to take my Twitter account, even if you only have it for six

01:57:13   hours and tweet a bunch of crypto nonsense or whatever the scam of the day is.

01:57:17   But you can make a bad decision, a product decision, and then you can reverse it.

01:57:23   And that is the sort of thing that I actually think is what had stagnated Twitter.

01:57:32   You know, their fear of making decisions, you know, like the years they spent talking

01:57:37   about an edit button for editing a tweet, you know, they launch it.

01:57:44   They should have launched it years ago and then if problems showed up with it, oh, well,

01:57:48   then change it.

01:57:49   You know, you can always take it back.

01:57:54   You can try things, and take things back if they don't work.

01:57:54   So here's a, here's a counter example.

01:57:55   They had Fleets, which was not a wild success, but they built it, it is what it is.

01:57:59   And then within Silicon Valley, and again, that sort of Andreessen Horowitz cohort, they

01:58:06   got obsessed with Clubhouse.

01:58:07   We're all going to Clubhouse.

01:58:08   It was a big, big hype bubble for a while, especially with the Web3 folks.

01:58:12   And Twitter pivoted really quickly and built Spaces and killed that category.

01:58:20   Like, it still exists.

01:58:21   People do Spaces.

01:58:22   I like a lot of them that I join in on.

01:58:23   It's got kind of a live podcast feel to it.

01:58:26   But even if it's not a huge product, absolutely, they ripped the market out of the hands of Clubhouse.

01:58:32   Like they sucked all the air out of the room instantly.

01:58:34   - You remember, I mean, we're old, you know, but remember chat lines?

01:58:39   You would before, you know, this was before the computers could handle anything as complicated

01:58:45   as audio, but you can call.

01:58:46   - In caveman days.

01:58:47   - You could call like a local phone number and, you know, and up to 50 people or 100

01:58:53   people or whatever, you know, some capability, 100 people could be on the chat line at once

01:58:58   talking to each other in a group.

01:58:59   And it was, you know, I wasn't really into it.

01:59:02   And of course, you know, you can predict the way a lot of them went and which sort of things

01:59:07   people chose to talk about anonymously, but people enjoyed it.

01:59:11   It was a thing.

01:59:12   But, and you know, Clubhouse had that feel.

01:59:15   I get it.

01:59:15   I was, you know, I've done some.

01:59:17   I, but here's the point.

01:59:21   - Product is just- - Twitter shipped that feature

01:59:25   really quickly, really effectively and won the market.

01:59:28   - Yes.

01:59:28   - This contradicts the narrative that Musk is painting.

01:59:32   And part of it is like, they can't acknowledge that without acknowledging that Clubhouse

01:59:36   is a failure, despite having as much backing of, you know, again, Andreessen Horowitz

01:59:42   and that cohort as is possible.

01:59:44   I mean, they went all, all in.

01:59:46   - Right.

01:59:46   - Money, attention, time, themselves, like they're, you know, you have the VCs themselves.

01:59:50   - Yeah.

01:59:51   - Showing up on there every night to try and prop the thing up.

01:59:53   And despite that got, you know, their butts handed to them by the culture at Twitter that

01:59:58   allegedly can't ship anything.

01:59:59   - Right.

02:00:00   The one thing that I just see is that there's no product decision that Elon Musk could mandate

02:00:10   that would permanently wreck Twitter.

02:00:12   Cause it could always be reversed.

02:00:14   Now there are product decisions or content moderation decisions, which is, which to me

02:00:19   for Twitter is a product decision, right?

02:00:24   Content moderation, Nilay Patel wrote, I think that was the headline.

02:00:24   It is, that is their product.

02:00:26   - Yeah.

02:00:27   - Or it's the product of all of these networks.

02:00:28   - Right.

02:00:29   But being unable to hire and retain people is not a bug you can undo.

02:00:32   - It is not.

02:00:33   And the loss of institutional knowledge cannot be recovered on a dime.

02:00:40   But in the longer term, being a place where talented people don't want to work and where the people

02:00:45   who are left don't want to stay, you cannot recover from.

02:00:50   - The well is poisoned.

02:00:51   - Trust me, as somebody who, you know, as we talked about an hour ago, I was optimistic

02:00:58   about Twitter.

02:00:58   I, my pessimism about Twitter has nothing to do with the product decisions that Elon

02:01:05   Musk claims to still be wanting to make or has made so far.

02:01:08   It is entirely the pessimism, the despair over what he has done to the staff,

02:01:16   you know, and the...

02:01:17   - Yeah, the workers in the culture.

02:01:18   - Right.

02:01:18   And again, they obviously needed to lay off a lot of people.

02:01:21   I know that from the people I know who work at Twitter, that they were vastly overstaffed

02:01:28   and that there were a lot of people who really didn't do much.

02:01:31   You know, they had an entire team whose job was to do the search text box,

02:01:36   not search, not a search team, just the text box on the website where you type what you

02:01:42   want to search for.

02:01:43   There was a whole team behind that.

02:01:45   That's almost comically in violation of Brooks's law, you know.

02:01:52   - I'm not, maybe, I don't know.

02:01:54   I mean, you know, I've built some text boxes in my day.

02:01:56   - Well, I don't know anybody who's worked at Twitter who doesn't think that they were

02:02:02   overstaffed.

02:02:03   - I agree.

02:02:04   Everybody says Twitter's overstaffed and everybody says where they're overstaffed is that team

02:02:08   over there.

02:02:08   - Yeah, yeah.

02:02:09   - That's been true.

02:02:11   That literally was the feeling at Twitter when it was 20 people.

02:02:14   - Right.

02:02:15   - So, and I just say this again with nothing but love and appreciation for what they've

02:02:20   all done.

02:02:25   But like, you know, I spoke at the first Twitter developer conference, right, Chirp,

02:02:25   this was like 2011.

02:02:26   And actually this is a sidebar, but it's such a great story because nothing epitomizes Twitter

02:02:31   better.

02:02:31   In the early days, Twitter didn't have its own app.

02:02:33   Just to recap, I know you know this, but just to sort of tell people, Twitter didn't have

02:02:37   its own app.

02:02:38   And so there were a lot of apps in the app store that all of a sudden you can post on

02:02:41   Twitter using this.

02:02:41   And it was confusing.

02:02:44   Some of them had great features.

02:02:45   Some of them were terrible.

02:02:46   And so Twitter, probably rightfully from a strategy standpoint, decided we're going to

02:02:52   make our own official apps called Twitter.

02:02:54   Nobody else can call their app Twitter.

02:02:56   And we're going to build them for Android and iOS because we want people to use it.

02:02:59   And they decided to announce this on the eve of their first ever developer conference where

02:03:05   every developer in the world who had built a client for Twitter would be in the room.

02:03:09   - Or at least following along intently from home.

02:03:12   - Absolutely, or watching with rapt attention.

02:03:15   And they opened, literally, the thing they set themselves up for was, we're just going

02:03:20   to knife all of you in the back.

02:03:21   Just wanted to set that up.

02:03:22   That's the starting point.

02:03:24   All right, now let's start the show.

02:03:25   I have never seen anything like it in my life.

02:03:28   And I had to speak at this thing.

02:03:29   I was like a developer.

02:03:30   We did an analytics app I did with Gina Trapani.

02:03:34   And I was like, I got to follow that?

02:03:39   You got to be kidding me.

02:03:40   You just told all these people, we are eating your lunch, go home.

02:03:43   It was incredible.

02:03:45   And I just was like that.

02:03:46   And at the time, Twitter was still young enough that you didn't know that this would be a

02:03:49   harbinger of things to come in terms of their strategy over the years, especially for developers.

02:03:56   But it just felt like, now in retrospect, I'm like, oh, that's the moment Twitter became

02:04:00   Twitter.

02:04:01   Because then immediately after that, being on stage for that, I went backstage and will.i.am

02:04:06   from the Black Eyed Peas was there at the developer conference.

02:04:09   And I was like, sure, of course.

02:04:10   Why not?

02:04:10   Yeah, that makes sense.

02:04:12   Yeah, let's get it started.

02:04:16   All right.

02:04:16   So where do we think Twitter's going?

02:04:18   I think I was on this line of thinking.

02:04:24   And now you've more or less convinced me over the course of the show that I'm never wrong

02:04:28   about Twitter.

02:04:28   I got to tell you, I hate it.

02:04:29   It's a horrible curse that I actually can always predict what's going to happen.

02:04:33   And it's horrible.

02:04:34   But that fundamentally the company is now on a perhaps irrevocable sunken--

02:04:42   There's some kind of death spiral going on.

02:04:45   I don't think Twitter ceases to exist.

02:04:47   The bird is going to be around.

02:04:48   The logo is going to be around.

02:04:49   There'll be an app.

02:04:50   But in terms of its cultural relevance and its importance and centrality to the media

02:04:55   and political ecospheres, I don't think that you can recover that because I think, one,

02:05:00   so we'll sort of say this, we're talking now in mid-November '22.

02:05:05   I think within the next 90 days, you're going to have a massive downtime outage or instability

02:05:15   on the platform, the likes of which we haven't seen since the Fail Whale days.

02:05:19   Or some kind of catastrophic exposure of DMs through--

02:05:25   That's the other thing I was going to say.

02:05:26   I think people can't conceive of-- there was that moment a couple of years ago, people

02:05:30   seemed to have already forgotten about where all of the verified accounts got theoretically

02:05:35   at risk.

02:05:35   And so they shut them all off from being able to post.

02:05:38   And all these big accounts got hacked.

02:05:40   And that was when they had a lot of teams working on this stuff.

02:05:44   With nobody at the wheel, the idea of, say, 10 million accounts having all of their direct

02:05:51   messages dumped due to a security bug or a really, really major account getting taken

02:05:58   over in ways that people-- they're not a bunch of kids that immediately tweet out cryptocurrency

02:06:03   spam, but instead are deliberately doing credible misinformation.

02:06:08   These are attack vectors that we haven't reckoned with in any major platform.

02:06:12   We've never seen this kind of vulnerability.

02:06:14   And then that's all if the whole thing stays running.

02:06:16   It's not at all unlikely that you just have one dusty old server in the corner

02:06:24   that falls over, because computers love to fail.

02:06:28   And all of a sudden, the guy who knows what box to kick doesn't work there

02:06:34   anymore.

02:06:34   And Twitter's not exactly famous for having a very simple diagram of how the back end

02:06:42   actually works.

02:06:43   Right, right.

02:06:44   Yes, this very clean architecture diagram of like, oh, you simply push the reset button

02:06:47   here, right?

02:06:48   Oh, yeah, just let me stand in front of the whiteboard here.

02:06:50   It'll take me about 30 seconds.

02:06:51   You got this, and you got that, and then this one goes here.

02:06:54   And then when you make a call, it all goes.

02:06:57   Legendarily complex architecture.

02:06:58   One of the hardest problems in computer science is getting the tweets out.

02:07:01   And everybody who knew about it for sure is gone, and everybody who remains has got one

02:07:07   foot out the door or is terrified.

02:07:10   So that's like a real thing, like I said, that I think is going to start in the next 90

02:07:18   days.

02:07:19   And then after that, the question is like, what happens?

02:07:21   It's kind of like if you've ever been on a bike or a motorcycle, it starts to wobble,

02:07:25   right?

02:07:26   You got a real short window where you got to get it back up on two wheels, or the wobble

02:07:31   gets worse and worse and worse.

02:07:32   And that's the question is like, how bad is the wobble after that?

02:07:37   And in the meantime, the flourishing of my good old friend, the old open web, it's wild.

02:07:45   Like Mastodon and Fediverse and all these things are just popping up.

02:07:48   And people are like, yeah, it's confusing and weird, but so was Twitter in the beginning.

02:07:52   It's not for everybody, but neither was Twitter in the beginning.

02:07:55   Still isn't.

02:07:55   Twitter is still a niche product.

02:07:57   And so all these folks who were laying in wait for 5 and 10 and 15 and 20 years, building

02:08:05   open protocols and open standards and weirdly named Mastodon apps and servers and stuff

02:08:12   are like, all right, come on over.

02:08:13   It's not the smoothest experience in the world, but it's here, and we're going to be here,

02:08:17   and we'd love to welcome you and try some weird stuff, and you're going to have fun

02:08:20   again.

02:08:20   And also you're not going to get run over with ads and whatever.

02:08:24   That's a really interesting thing that I would not have guessed.

02:08:29   I'm somebody who's been, I've talked about this over the years.

02:08:32   I am somebody who's loved to kind of be like, well, I think open protocols should win and

02:08:36   people should make open source tools, but it's like, I'd given it up.

02:08:39   You know what?

02:08:40   We tried it, it didn't work.

02:08:41   Bummer about that.

02:08:42   It should have worked.

02:08:43   And then, you know, it's like the third act of like some Marvel movie or something where

02:08:48   like, what?

02:08:49   The ragtag band of misfits came together and saved the day?

02:08:54   Like it's a really very heartwarming thing to see.

02:08:56   I have worried for so long that podcasting was the last open thing that would really

02:09:04   take root and that it was, at the time it seemed like, oh, what a great idea.

02:09:08   What another great idea for the open web, right?

02:09:10   When podcasting started and now in hindsight, I've been thinking for a while, oh my God,

02:09:14   it's a miracle that podcasting exists the way it does today as this open thing.

02:09:19   What you said there is so good.

02:09:21   Speaking of having fun with computers, I would love for you to just, real briefly, tell me, tell our

02:09:30   audience, tell us, tell everybody about Glitch.

02:09:32   Oh, sure.

02:09:32   Let me plug my stuff.

02:09:34   No, I mean, what Glitch is, is a community where you can go in your web browser to glitch.com

02:09:41   and build a real app or website in 30 seconds for free.

02:09:45   And it feels as joyous as the first time you built a Myspace page back in the day or, you

02:09:54   know, saw something cool on the internet that you made.

02:09:55   And it is very informed by the fact that from the beginning, the web was supposed to be not

02:10:01   just something you consume, but something you create.

02:10:03   And then the first web browser that Tim Berners-Lee made could read and write the web.

02:10:06   And what's been amazing was six years ago, we started talking about that with glitch.

02:10:06   And what's been amazing was, six years ago, we started talking about that with Glitch.

02:10:10   And it was like this kind of cross-your-fingers, hope-it-becomes-real thing.

02:10:13   Today, Glitch has over 2 million developers signed up.

02:10:20   And it's everything from, for example, all these people discovering Mastodon and the

02:10:24   Fediverse, they want to find their Twitter friends and bring them over to this sort of

02:10:29   new platform.

02:10:30   The most popular tool for doing that is called Fedifinder.

02:10:33   And it was made by a guy named Luca Hammer, who is a Glitch user.

02:10:36   And he made this app in a couple hours on Glitch, like a very quick amount of time.

02:10:40   And hundreds of thousands of people have used it to migrate their follower and friends lists

02:10:45   from Twitter to Mastodon.

02:10:47   And that's one app on Glitch out of the millions.

02:10:49   There's also stuff where people are building VR stuff.

02:10:52   And anything you could imagine that-- and also work stuff.

02:10:57   Someone wants to build a Slack bot to be able to get their sales reports into a Slack channel,

02:11:01   they're using Glitch for that too.

02:11:03   But it is this, frankly, very idealistic idea that I was not convinced all these people

02:11:12   would buy into still, because I had bought the story that the web had closed up.

02:11:16   And people don't create the web themselves anymore.

02:11:18   But what it's proven is it's just like food.

02:11:23   We were talking about that earlier.

02:11:24   All of us have fast food sometimes.

02:11:28   In the airport, you're going to have some McDonald's, it's going to be fine.

02:11:30   And if all you ever eat is the factory farmed fast food, you are not going to feel good.

02:11:35   And the things we remember in our lives, at the end of our lives, are what are those great

02:11:41   meals we had, surrounded by people we love, made by people who love us, that was a cuisine

02:11:46   that's part of our community, or part of our culture, part of our tradition.

02:11:50   And the same thing could be true of the web that we spend our time on.

02:11:54   Is what if an app that we use every day, or a site that we go to every day, was made by

02:11:59   somebody we know or that we love, is part of a community that we're part of, was made

02:12:04   by people that we can name?

02:12:06   How many apps on your phone were made by somebody you know who made it?

02:12:09   Not enough.

02:12:10   And so Glitch is that place, and it's been something really special.

02:12:14   Our team is incredible.

02:12:16   I've been so lucky to attract that talent we talk about, those kinds of people that

02:12:20   are hopefully feeling very empowered as workers, but also that build something out of the

02:12:26   sincere desire in their heart that they give a new generation the same web that we grew

02:12:31   up on, which is this is a place I can make something for the world.

02:12:34   And do it, you know, like if the idea fits in an evening worth of hacking, you know?

02:12:40   Yeah, and you don't have to spend your life learning to code with some new language and

02:12:44   setting up a development environment.

02:12:45   I mean, I love doing that stuff sometimes, like if you just want to tinker, but if you're

02:12:48   like, "I got an idea and it would just be cool to kick the tires and see if I can put

02:12:51   it up out there," maybe it's a joke.

02:12:53   Like, God forbid you make a website just because it's funny.

02:12:57   Yeah, I'm so heartened, likewise, to see some of that mentality coming back, and you

02:13:04   know, it's just fantastic.

02:13:07   And Anil, thank you for joining me.

02:13:09   What a fantastic discussion.

02:13:11   What a fantastically fun discussion of a totally disheartening subject matter.

02:13:19   I think we're in a moment where that pain and transition and turmoil that everything

02:13:24   is going through could catalyze the thing that made us optimistic about tech in the

02:13:30   first place.

02:13:30   And so that's the thing where I sort of see some hope.

02:13:33   And I am really grateful that you have me on because I also have been inspired by like

02:13:38   20-plus years of you saying, "I can tell a story in my own voice on my own website and

02:13:45   maybe have some impact on this space and this industry."

02:13:48   It still gives me chills to think about, like, we can put those tools in people's hands

02:13:53   and they can tell a story that has an impact on the world.

02:13:55   So I'm grateful you do it, and I appreciate you validating that part, too.

02:13:59   That is very kind of you to say.

02:14:01   All right, let me also thank our sponsors for the show.

02:14:04   Let's see if I can do it off the top of my head.

02:14:06   We had Squarespace, where you can build your own website, Kolide, where you can manage

02:14:13   your fleet of Mac, Windows, and Linux devices, and Drink Trade Coffee, where you can buy

02:14:21   yourself a coffee subscription or get one as a gift for someone you know who loves

02:14:24   coffee.

02:14:25   Anil, thank you.

02:14:26   Thank you.