PodSearch

ATP

55: Dave, Who Stinks!

 

00:00:00   Insert music here. Oh, you're not gonna play it for me. I want to hear the music. I'm not set up. You'll hear it

00:00:04   Damn it, man

00:00:06   boo

00:00:08   Okay, so the first item of follow-up is when we talked about Final Draft and the Scriptnotes podcast last week

00:00:15   and we got a lot of feedback on that also directly from the two hosts of that podcast and

00:00:20   As it turns out, I definitely, and perhaps also Casey,

00:00:25   attributed to the two hosts statements that were not theirs.

00:00:29   In particular, discussing their stance on software pricing.

00:00:33   After listening to the podcast, I was under the impression

00:00:35   that they thought software, if not should be free,

00:00:39   should definitely be much lower priced.

00:00:41   And I cited an example on the podcast of them saying,

00:00:44   well, Apple's operating system is free.

00:00:46   That was not the host of the podcast that said that.

00:00:48   That was Joe Jarvis.

00:00:49   He was the product manager from Final Draft

00:00:52   who was on the podcast.

00:00:54   I know how people feel when they say they can't identify our voices as separate things

00:00:57   because you're listening to the podcast for the first time, and you're, you know, 15, 20 minutes

00:01:01   into it, you lose track of who's speaking. So, I think Casey said that in the last show,

00:01:06   and I said that, and Marco tried to correct us, and none of us could remember for sure at the time,

00:01:10   but Marco was right. I could.

00:01:11   Yes, yes. Well, you weren't all that insistent. I said, "I thought that was them," and Casey said,

00:01:16   "Yeah, I thought that was them too," and you said, "I thought it wasn't," but none of us had

00:01:19   the podcast in front of us. And speaking of that, there is actually a transcript of this podcast,

00:01:23   which Marco will put in the show notes, that will give you a text version so you don't have to

00:01:28   listen to the podcast. I know people sometimes don't want to listen to audio and scrub to the

00:01:32   parts. You can just look at the text version, which is much faster to read and find the discussion

00:01:36   of this topic. So I apologize to both of those guys for making it sound like they thought all

00:01:40   software should be free. And in fact, John August sells some software of his own, and so he is not

00:01:45   just an observer of the software industry, but he's also a participant on the other side of the coin.

00:01:50   Yeah, yeah, I definitely was part of screwing that up. And I will say that I was completely

00:01:56   in love with listening to that particular episode of that particular podcast because

00:02:00   was it, is it Marc Madnick? Is that right? Yeah. Who is the CEO and co-founder of Final

00:02:06   Draft, and to my ears was straight out of New York. I can tell his voice from the others, definitely.

00:02:13   And so that's what I was going to say is he was the only one I could, without question,

00:02:17   place who was who. Everyone else, I was taking a shot in the dark.

00:02:22   Yeah, we got a lot of feedback from people who are talking about the screenplay format

00:02:28   and people who mentioned many alternate apps for screenplays. And it looks like that ecosystem

00:02:33   is actually pretty vibrant and seeing some new life in terms of alternate applications

00:02:37   and alternate formats for writing screenplays. So it's actually much more lively than it

00:02:45   might seem if you just listen to this and think everyone is stuck with Final Draft,

00:02:48   that the young and upcoming people are seeking out alternatives to Final Draft and those alternatives

00:02:54   exist. And in fact, there's a format, kind of like Markdown, that you can use to produce a screenplay,

00:03:00   and, like Markdown, any application that can just edit text can edit that format,

00:03:04   and then you can convert from that format to the "real" screenplay format. So that's really opened

00:03:09   up the field to a lot of other editors. Which I also mentioned during last week's show.

00:03:14   Yeah, you mentioned it by name, but I didn't know what the name was, so I didn't know, you know,

00:03:18   if you had said it's like Markdown for screenplays, I would have been like, "Oh,

00:03:21   maybe you said that." This is where you can insert the clip of you saying that.

00:03:24   I don't think I did, but yeah, I don't think I talked—I think I gave one sentence about

00:03:29   mentioning this Fountain format, but yeah, otherwise the follow-up is, "I was right."
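
For anyone who hasn't seen it, here is a small, invented snippet in the style of the Fountain format; it is only illustrative, and the exact markup rules are defined by the Fountain spec itself rather than by this example.

```
INT. RECORDING STUDIO - NIGHT

Three podcasters argue about last week's follow-up.

JOHN
The transcript will settle it.

MARCO
(smugly)
I was right.

CUT TO:
```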

00:03:36   Go ahead.

00:03:36   Oh, we'll get to when that's not so true. Don't you worry.

00:03:39   There has been plenty.

00:03:42   if you had been more insistent, like, "Casey and I both got it wrong, and who said that,"

00:03:45   and you're like, "I think it was the other guy." You weren't sure it was the other guy either,

00:03:49   or sure enough to say, "No, it totally wasn't that guy."

00:03:51   So I think I did, but that's okay.

00:03:54   You can insert that clip. There you go. I'm just trying to give you places where you can insert audio.

00:03:58   [Music]

00:04:02   There were parts in there. The other part of it is that Craig and John did at various times say

00:04:09   things like, "and we know the price of software is going down and things that

00:04:12   used to cost a lot now cost less," but they were saying it as a sort of offering up of the

00:04:17   idea that they understand where the final draft guys are coming from, not that these were positions

00:04:22   they agreed with. So if you read the text, it is much more clear that they are trying to provide,

00:04:28   like, trying to say that, "I empathize with your situation. I understand that in the software

00:04:32   market software is apparently being devalued," and then they would say, "but," and then go on.

00:04:36   I think Craig is still closest to walking that line and saying that he doesn't think an upgrade

00:04:41   just for Retina should be worth paying for. But again, it gets back to the specifics of this one program

00:04:46   that they don't like that hasn't been updated for a long time and not a general statement about all

00:04:51   software. Yeah, and I think that specific complaint about Retina being the thing that should have been

00:04:58   free, I don't agree with them on that. If they want to complain that the updates in general

00:05:06   to Final Draft haven't included enough new features

00:05:08   to be worth their price, that's a different story.

00:05:10   And charging for what is really a bug fix,

00:05:13   that is also a different story.

00:05:15   But Retina is actually supporting new hardware,

00:05:20   and that's not necessarily, quote, a bug fix.

00:05:22   And it's like if you have to support a new version of the OS.

00:05:26   You don't give that to everybody for free

00:05:28   if they're all running a really old version.

00:05:29   But I think their problem with this

00:05:33   was all tied up in the overall problem of Final Draft being crappy and not really having

00:05:39   any kind of development pace for a long time.

00:05:42   All right. So the other bit of follow-up that we have is regarding whether or not it's

00:05:49   wise to treat warnings as errors in production code. And I believe this came from the goto

00:05:57   fail conversation, is that right?

00:05:58   It was kind of a separate branch, yeah.

00:06:01   It was a separate topic, but yes.

00:06:03   There is the culture of using the -Wall flag

00:06:08   or the -Weverything flag or the various permutations thereof

00:06:12   that enable the really esoteric warnings in C,

00:06:16   or esoteric as you would say.

00:06:18   And that kind of morphed into my position

00:06:22   that, with my web development now,

00:06:25   I'm building my PHP framework such that all warnings and notices, even in production, are treated

00:06:33   as exceptions.

00:06:34   I was doing it in development for the last few years, but now even in production all

00:06:38   warnings and notices are exceptions.
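
As a rough sketch of the pattern Marco is describing, and not his actual framework code, the standard PHP way to elevate warnings and notices to exceptions looks something like this:

```php
<?php
// Minimal sketch: treat PHP warnings and notices as exceptions,
// even in production. Not Marco's actual framework code.
set_error_handler(function (int $severity, string $message, string $file, int $line) {
    // Respect the @ suppression operator / current error_reporting() level.
    if (!(error_reporting() & $severity)) {
        return false;
    }
    // Turn the warning or notice into a throwable exception.
    throw new ErrorException($message, 0, $severity, $file, $line);
});

// After this, something like fopen('/no/such/file', 'r') throws
// instead of emitting a warning and returning false.
```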

00:06:40   And I asked the listeners during last week's show, because John and Casey, you guys severely

00:06:46   disagree with that, saying, to paraphrase your arguments, if you'll permit me,

00:06:50   that in practice, in a production environment,

00:06:54   once you get beyond a one-person operation,

00:06:57   that gets really tricky.

00:06:58   You might have other people deploying PHP updates

00:07:01   to the server and breaking the site in the middle of the night

00:07:04   and maybe it's not that important to break the site

00:07:08   for some minor reason, it's not worth it

00:07:11   and it's better to keep everything up as much as you can

00:07:14   and just log the warnings.

00:07:15   So I asked the listeners,

00:07:19   if they worked in a big organization

00:07:22   that either has a really important web presence

00:07:24   or is known for being really good at tech,

00:07:27   like Google or Amazon,

00:07:28   what's your policy

00:07:29   in your organization?

00:07:34   And please let me know in the feedback form.

00:07:36   And boy, did they.

00:07:38   Oh my God.

00:07:40   And I think it's pretty safe to say that

00:07:43   not only did I lose this argument,

00:07:46   I'm pretty sure I actually lost it unanimously. Well, it's not—there's no winning or losing

00:07:51   based on what people say. A bunch of people sending feedback doesn't mean anybody won or

00:07:55   lost anything. I've been thinking about it, and I think I can frame this. The reason Casey and I

00:07:59   kept disagreeing with you and the reason we had a disagreement on the last show is it's all about

00:08:02   the framing of the thing. The thing I put in the show notes, which I'll talk about at the end, but

00:08:06   the major framing of the topic was that in the beginning, like, the topic came up, and

00:08:14   you mentioned something about it. In the beginning, both Casey and I stipulated that there are

00:08:17   certainly situations where it is both feasible and the right thing to do to elevate warnings

00:08:22   to errors in production. And we gave the example of you. You control everything about it. You're

00:08:27   a one-man shop. You're doing the client and the server side. You're not running a bank,

00:08:34   it's a podcast app. It's reasonable, at least. And then you stipulated that there are institutions

00:08:40   in which it is unfeasible to do this because of something about like it's a different department

00:08:46   that controls that or the developers are too far away from it or things have to be up all the time

00:08:53   and that's the most important thing. And you stipulated that if there's some sort of ailment

00:08:57   in the enterprise type company, it's also a practical concern that you might have to do that.

00:09:01   But every time Casey and I tried to come up with a scenario where it wasn't just like something

00:09:07   that you have to do because of real-world practical concerns that are just such a shame,

00:09:11   but that it was actually the right decision, like not because of some sickness in your organization,

00:09:16   but actually because the arrangement of things in this perfectly healthy organization or such,

00:09:20   that this is the correct move, you would say, "Oh, I don't know." Basically, you kept creeping

00:09:24   towards an absolutist position, and whenever we tried to cite an exception, you would say,

00:09:27   "I'm still not convinced." So I think despite all the feedback, because all the feedback you could

00:09:32   dismiss and say, "Well, all those people belong to organizations that have some sort of sickness,"

00:09:35   which most of them probably do, let's be honest. Giant organizations that have some sort of

00:09:39   organizational sickness that requires them to do this. You know, isn't that a shame? Yes,

00:09:43   I understand it's a reality in your life, but it's not really the right thing to do.

00:09:47   And what I was trying to convince you of last time that I think is still the case is that

00:09:51   there are situations where it is actually the right thing to do,

00:09:55   not to elevate warnings to exceptions in production. And the difficulty of convincing

00:10:01   of that I think has a lot to do with us not—since this came up as a tangent—us not sort of being on

00:10:07   the same page ahead of time as to what the heck is a warning? What do you mean by a warning? We got to

00:10:11   it from the warning flags in the compiler, but that's obviously not what we're talking about,

00:10:14   because that's when you're compiling the program before you've deployed it. There is no—those

00:10:17   are different kinds of warnings. And I was trying to think of a way to get us to agree on what

00:10:24   warnings are, but that's almost impossible to do because we have to talk about specific technologies

00:10:28   involved and where do they come from and I thought of like the most extreme example is like imagine

00:10:32   there was just a random number generator in your code and 0.001% of the time on this particular

00:10:37   line it would emit a warning and then you would elevate that to an exception in production and

00:10:40   you'd be like well of course I'm not going to turn that on because I'm just signing myself up for

00:10:43   downtime. That is pretty much what warnings are like in many situations: in libraries that you

00:10:51   didn't write and have no control over, in languages or runtimes, or in pass-through code that

00:10:56   could emit warnings.

00:10:57   And I think in that situation, that absurd situation,

00:10:59   obviously, you would agree that if there's something randomly

00:11:01   emitting warnings like that, no, you

00:11:03   wouldn't want to elevate those in production.

00:11:05   You wouldn't want to turn them into exceptions,

00:11:06   because then you're just signing up for downtime.
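
As a toy version of that thought experiment, with entirely hypothetical code rather than anything from a real library:

```php
<?php
// Hypothetical library code you don't control: roughly 1 in 100,000 calls
// emits a warning that has nothing to do with an actual error.
function third_party_helper(): string
{
    if (mt_rand(1, 100000) === 1) {
        trigger_error('Advisory: consider using the newer API.', E_USER_WARNING);
    }
    return 'result';
}

// If warnings are elevated to exceptions in production, about one request
// in 100,000 now dies here for no user-visible reason.
echo third_party_helper();
```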

00:11:08   And lots of other people who wrote in,

00:11:10   who belong to these organizations

00:11:11   that have these problems that cause them not

00:11:13   to be able to do what you think is the best practice,

00:11:16   described why they think it was the right decision.

00:11:19   But I don't think any of them were convincing,

00:11:22   in a way that would have convinced you or should have convinced you

00:11:25   that it really is the right thing to do. So I'm wondering, after reading all those things, were you convinced

00:11:29   that this is actually the right thing to do, to not elevate them to exceptions in some situations,

00:11:34   and not just merely that it's something that poor suckers have to do?

00:11:38   Now, before you answer that, Marco, let me

00:11:40   jump in. There's, excuse me, there's a couple of quotes, just three quotes that I'd really,

00:11:44   I'd really like to read really quickly that I think really kind of nail this home. And I,

00:11:49   unsurprisingly, I agree with everything John just said. The first quote is from Alex TALL,

00:11:54   and TALL was in all caps: "In practice, the C-whatever-O, CIO, CEO, etc., doesn't give a

00:12:00   flip about anything but their own agenda. In the end, Marco has the correct idea, but outside of

00:12:05   small and extremely well-structured organizations, the practice falls apart." And I think that that's

00:12:11   exactly what John and I are both saying. That yeah, you know, in certain cases—

00:12:13   Well, no, but that's the guy who's in an organization that has problems.

00:12:16   Like, he's admitting that his organization has problems that cause him to have to do what

00:12:21   he knows isn't the best practice because, you know, oh, such a shame. Like, that's

00:12:24   not really what I'm saying.

00:12:26   Okay, but I guess the way I read that was, Marco's idea is absolutely right. It's

00:12:34   just not always applicable.

00:12:35   No, I don't think his idea is absolutely right all the time. That's what I'm saying.

00:12:39   Isn't that what I just said?

00:12:41   Here's, well, kind of. Well, keep going. Read the other feedback because I don't

00:12:45   know if…

00:12:46   Okay, so the other two were, secondly, there were two different engineers from Amazon that

00:12:50   wrote in. And I'm paraphrasing kind of the mutual message that both of them said. And

00:12:58   so this is not verbatim, but they said, "I get why Marco says what he's saying, but

00:13:03   when you deal with this kind of scale, something is always broken." And so in the example

00:13:08   of Amazon, there's so many moving parts that no matter what you're doing, no matter

00:13:12   what happened, there was always going to be something that isn't quite right. And so

00:13:19   You kind of have to plan for that, and there's not a lot you can do about it.

00:13:22   And then finally, anonymous said, "We don't guarantee uptime with code quality.

00:13:28   We guarantee uptime by having an institutional strength in reacting to problems."

00:13:33   And this is kind of getting into a different tangent, but I just thought it was a very

00:13:35   interesting point as well, that even if your code has no smell to it whatsoever, that doesn't

00:13:41   necessarily guarantee uptime.

00:13:43   and really guaranteeing uptime is more about just being able to react to issues better.

00:13:49   So anyway, those are the three quotes I have. So I apologize, Marco or John, feel free to carry on.

00:13:53   Well, so before Marco has a chance to respond, I want to give what I thought was the strongest

00:13:58   argument sort of buried in our feedback and that many people have offered. And it gets back to kind

00:14:04   of like the Final Draft thing, where the CEO of Final Draft was coming on that

00:14:08   podcast and trying to make all his problems your problems. And a lot of the people who were

00:14:12   discussing the problems with their big organizations that make this unfeasible also said,

00:14:17   basically, more or less, that even if we could do this, it's essentially taking your

00:14:21   problem, and your problem is basically "I want to ensure that my code is warning-free, because I think

00:14:25   that will produce higher quality code," and turning it into your customer's problem because,

00:14:30   you know, once you turn that warning fatal in production, you have some sort of, you know,

00:14:36   avoidable downtime that causes you to, you know, well, I'm sure we're going to go fix it now,

00:14:41   because once there's downtime, everyone runs around like their hair is on fire. And that will

00:14:44   make sure that we don't let human nature take its course. And if we merely log the errors,

00:14:48   maybe we'll just ignore them and they'll build up in a log. So this is a way to guarantee

00:14:51   that we have code quality. But the way you're guaranteeing is at the expense of your customers.

00:14:55   Someone just wrote in the chat room-- I lost the name. I think it was-- what was it? What

00:15:00   something? Anyway, someone just wrote in the chat room, "Enjoy this downtime. I didn't trust myself

00:15:04   to take a warning seriously." Like, that's the message to your customers. And what I would say, the

00:15:11   strongest argument in this vein is that

00:15:13   by elevating warnings to errors in production and

00:15:17   making your problems your customers' problems, if that's the only way to ensure that you stay warning-clean,

00:15:23   that in itself is a sign of an unhealthy organization. And a healthy organization

00:15:27   would have a policy to log warnings and have them addressed in a timely manner, which was suggested in the last show and a lot

00:15:33   of people wrote in and said this was their policy. So I think if this is the only way you can avoid the

00:15:38   pitfalls of human nature and make yourself address these things, that's a

00:15:42   sickness that's worse than the sickness that lets you not do that because a lot

00:15:47   of these people who were in organizations that did seem very competent

00:15:49   say we have a policy, a warning gets logged, it makes a ticket someone has to

00:15:53   deal with in a fixed amount of time and they can execute on that policy. So

00:15:57   that's their stopgap against human nature, not pushing it off onto their

00:16:01   customers and causing them problems. And that's why I think there are actually

00:16:03   situations where it is the correct move, not just the pragmatic one, not just the

00:16:08   unfortunate "I'm sad I had to do this" one, but actually the correct, optimal, no ifs, ands,

00:16:12   or buts about it policy.
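
A loose sketch of that log-it-and-make-a-ticket policy; queueTicket() here is a made-up stand-in for whatever issue tracker an organization actually uses:

```php
<?php
// Sketch of the alternative policy: warnings don't take the site down,
// they get logged and turned into a ticket with a fix-by deadline.
// queueTicket() is hypothetical; a real version would call JIRA, GitHub, etc.
function queueTicket(string $kind, string $detail, string $fixBy): void
{
    error_log("TICKET[$kind] due within $fixBy: $detail");
}

set_error_handler(function (int $severity, string $message, string $file, int $line): bool {
    $record = sprintf('[%s] %s in %s:%d', date('c'), $message, $file, $line);
    error_log($record);                     // keep serving the request
    queueTicket('warning', $record, '48h'); // enforce a deadline instead of downtime
    return true;                            // tell PHP the warning has been handled
});
```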

00:16:14   Mostly, I'm not going to argue this anymore because I've clearly been proven wrong. And

00:16:20   I can see, I understand and accept a lot of the kind of arguments that have been put up.

00:16:26   Just to clarify though, like when I'm talking about warnings in production, like the kind

00:16:32   of warnings that I see on my apps are usually things like MySQL truncated a value because

00:16:38   I passed one that was too long and my application code is not validating that for length properly.

00:16:45   Or something else with MySQL, like the connection was dropped in the middle of a transaction and

00:16:50   my library didn't reconnect.

00:16:53   And those are, like, that kind of thing, they're both the kind of thing that I want to know

00:16:58   about.
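
For the specific MySQL cases Marco mentions, one possible approach, sketched here with placeholder connection details, is to make the database layer itself strict, so truncation becomes a hard error rather than a warning buried in a log:

```php
<?php
// Sketch: have MySQL reject over-long data instead of silently truncating it
// and emitting a warning. Host, database, and credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION, // driver errors become exceptions
]);

// With STRICT_ALL_TABLES, inserting a too-long value is an error,
// not a "data truncated" warning.
$pdo->exec("SET SESSION sql_mode = 'STRICT_ALL_TABLES'");
```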

00:16:59   And so I think you're right, John, that yes, making it your customer's problem is not

00:17:03   ideal and yes if you have the structure in place where if warnings just get logged in

00:17:09   a database somewhere and then you're required to act on them, that's great. And if you have

00:17:14   that kind of discipline within your organization, that's awesome. But I think so many organizations

00:17:20   don't have that in practice. There are big organizations that can have procedures

00:17:27   like that. They don't always, but usually big organizations are more

00:17:33   likely to have something like that. Where this tends to fall apart, where

00:17:37   a lot of software methodologies fall apart, and a lot of discipline falls

00:17:44   apart in our industry, is in the small and medium-sized shops. And in that kind of situation

00:17:50   where you don't have a lot of procedures in place, you don't have a lot of infrastructure

00:17:53   in place, you don't have giant different teams

00:17:57   doing different parts of it, it's maybe you have

00:17:59   between two and 20 people in your organization.

00:18:02   That size of organization, that's a lot of times

00:18:04   where discipline goes out the window.

00:18:07   And if you get a bunch of logged warnings in production,

00:18:10   you might not fix them, or you might not even see them.

00:18:13   And so I do think there are a lot of situations

00:18:17   where enforcing this, it's blowing up production

00:18:21   with warnings as a disciplinary tool to combat human failings. I think there's still a place

00:18:28   for that. I will yield to the counter-argument on this being the right idea for everybody,

00:18:35   and even for a larger organization. I think that's been proven that that's probably bad.

00:18:39   But I think you could apply this kind of rule of crashing in production on everything to

00:18:47   more than just me. I think there's more of a place for it.

00:18:50   Yeah, it's not just you, but you're like, again, Casey and I stipulated there are situations where

00:18:55   that's valid. In all cases, the correct solution we're trying to find is the one that plays to the

00:18:59   strengths of the particular situation. The strengths of a small shop, a one or two man

00:19:04   shop, a shop working on something where it doesn't matter if it's down for a little bit, like a game

00:19:07   or something that's not super critical. That is not necessarily a strength, but the shape

00:19:14   of that beast is that you can take advantage of this—since you're a small shop and doing something

00:19:21   non-essential, you can take advantage of this to make sure that you don't ignore the errors,

00:19:25   because you're basically hacking. It's one of those self-hacks, like when you do something to

00:19:29   remind yourself to exercise all the time. It's one of those life hack things or whatever, but

00:19:34   on a small scale. And that makes perfect sense. And then in the larger organizations, they have

00:19:39   different strengths. And one of the strengths of a larger organization is they have the ability to

00:19:43   have policies to be imposed on people. It's very hard to have a policy imposed in a five or 10

00:19:47   person company because who's going to be the big guy imposing the policies? It's going to be your

00:19:51   friend who sits next to you and you're not going to take it seriously. That's a weakness of small

00:19:54   organizations and a strength of a big one. So you have to pick the solution that's appropriate for

00:19:58   you. And we think it's not just for Marco, but for other organizations where it may be possible

00:20:02   that this may also be the best solution. The tricky part, and it gets back to what I wrote

00:20:07   in the show notes, is what is the nature of warnings? Without

00:20:12   agreeing on that, it's very hard to have a discussion about this, because transaction

00:20:16   aborted or value truncated, everyone would agree that—I mean, regardless, even like,

00:20:20   value truncated, I think that should be elevated to an exception in almost all situations. But the

00:20:26   warning about like, "Are you sure you want to use this function?" Because a lot of times,

00:20:31   people use this function, and really, they mean that function. And that happens at runtime,

00:20:35   because you never hit that code path or a particular value, and you're totally like,

00:20:38   "Yes, I use the right function. Don't worry." That's why I get back to the random number generator.

00:20:42   That's the extreme thing. Some code that you didn't write and some library that you don't control

00:20:47   decides that it wants to send you a nice friendly message, depending on if it gets, you know,

00:20:51   the data value is over five, and it's past 3 p.m. on a Tuesday, and the system clock says it's past

00:20:57   the year 2010, which I thought would be the distant future, and it decides to emit a warning.

00:21:00   I would consider that a problem if you didn't plan for that and catch it.

00:21:03   John "Slick" Baum: No, but I mean, if the warning is like, "Are you sure you wanted to call this

00:21:07   function, maybe you meant another one. Like it's advisory, it's guessing. Like you're

00:21:11   not the, if you could find the person who wrote this warning, you'd be like, stop writing

00:21:14   that warning. Like, because they're human beings writing these warnings, and the warnings

00:21:18   may have nothing to do with any kind of erroneous or unexpected situation, but merely maybe

00:21:23   giving advice, right? And advice is fine in lots of situations. Like, again, get back

00:21:28   to the random number generator. A lot of times, warnings to me feel like they might as well

00:21:32   have just been, "My code was running fine," and some random number generator decided,

00:21:37   "I'm going to send you a friendly message right now," that had nothing to do with anything, that

00:21:41   was not useful, and the only reason I have to react to it is to add whatever I need to add to

00:21:44   make that warning go away if I can." Because sometimes you can't, because it's in a compiled

00:21:48   library that you use that you don't control, that's not acting in a way that is unexpected or wrong,

00:21:53   but that author decided to give you advice about something. And that is the worst kind of warning,

00:21:58   the worst kind of runtime data-driven warning that you don't control.

00:22:02   And so that's why I think a lot of people have different positions in this,

00:22:04   because when they think of warnings, it's like, "Oh, it's something I have to react to."

00:22:07   And when I think of warnings, I often think of, "The only reason I have to react to this is to

00:22:12   shut this person up because they don't know our code, or they're trying to be helpful and give

00:22:15   advice or suggest a different way that we could do things." And it's like, "I don't want your

00:22:19   suggestion now at runtime. I need to blank that out of the logs because I can't get rid of it,

00:22:24   and it's not telling me anything useful. So as with most things when we're discussing this,

00:22:30   the hard part is trying to agree on the premise. That's why it's easier to talk about a specific technology or a

00:22:34   specific version of something, because then you can kind of say, "Here's the set of warnings

00:22:37   that come out. Are we okay elevating these all to exceptions?" And then you have to look at the size

00:22:41   of your organization and all that other stuff. But if you just say warnings in general,

00:22:44   everyone's got a different picture in their head of what we're talking about.
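
One hedged middle ground for the advisory-warning problem John is describing, sketched with an invented ignore list, is to swallow the handful of known advisory messages you have decided are noise and elevate everything else:

```php
<?php
// Sketch: filter out known advisory warnings from code you don't control,
// and elevate everything else. The patterns below are made up.
$ignoredAdvisories = [
    '/consider using the newer API/i',  // hypothetical library advice
    '/past 3 p\.m\. on a Tuesday/i',    // John's joke warning
];

set_error_handler(function (int $severity, string $message, string $file, int $line) use ($ignoredAdvisories): bool {
    foreach ($ignoredAdvisories as $pattern) {
        if (preg_match($pattern, $message)) {
            return true; // swallow the advisory; nothing actionable here
        }
    }
    // Everything not on the list is treated as a real problem.
    throw new ErrorException($message, 0, $severity, $file, $line);
});
```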

00:22:47   All right. We are sponsored this week by something pretty cool,

00:22:52   something pretty new here. It's called In Flux and this is a music album from Brave

00:22:58   Wave Productions that highlights the diverse and ever-changing nature of music. It's a

00:23:03   blend of chiptunes, rock, electronica, and more, and it features diverse musicians from

00:23:07   around the world. Brave Wave's music is dedicated to exploring the interplay between

00:23:12   video games, music, and nostalgia. You might know them from their previous World 1-2

00:23:17   albums. In Flux's theme is collaborations between the composers of the East and West,

00:23:21   featuring original music from Manami Matsumae, Tim McCord of Evanescence, Keiji Yamagishi,

00:23:27   Akira Yamaoka, Saori Kobayashi, and more. Get "In Flux" today on iTunes, Bandcamp, or

00:23:34   their online store at bravewave.net. So go to bravewave.net for more info. Now, rather

00:23:39   than keep talking about this music for another 90 seconds, I asked them if we could just

00:23:42   play a couple of samples from the album, and they said yes. Here are some samples from "In Flux"

00:23:47   showing off two different styles in the album, and there's a lot of different styles on here.

00:23:57   [Music: samples from In Flux]

00:25:06   All right, that was from In Flux by Brave Wave. Go to bravewave.net to preview more of it or buy a copy. Once again,

00:25:13   that's bravewave.net and the new album is called In Flux, two words. Thanks a lot to Brave Wave for sponsoring our show.

00:25:20   You know, I'll always have a soft spot in my heart for chiptunes because it's straight out of childhood.

00:25:25   You know, that's when even I cared a lot about

00:25:27   Nintendo and things of that nature and

00:25:30   now I'm not that into gaming, but I'll occasionally stumble upon a chiptune album like this one and just

00:25:36   absolutely love it, if for no other reason than the nostalgia factor.

00:25:41   You wouldn't give me the bleeps and boops theme song, but we'll accept

00:25:45   Yeah, do you want to sponsor, John? I heartily endorse bleeps and boops.

00:25:53   All right, so I guess we should talk about software methodology.

00:25:58   But first, do you want to talk about any of these new bugs that have come out, or Apple buying

00:26:03   Nintendo?

00:26:04   What else happened?

00:26:05   A lot happened this week.

00:26:06   CarPlay?

00:26:07   We're not going to do that before Software Methodology?

00:26:09   CarPlay!

00:26:10   Right.

00:26:11   CarPlay was interesting.

00:26:12   Well, we could do Software Methodologies first if we want Casey not to explode.

00:26:17   Although, you know what?

00:26:19   I'll tell you what.

00:26:20   Do you hear the fans from my Mac Pro?

00:26:23   I do not.

00:26:24   Exactly.

00:26:25   Wait, is that your new Mac Pro?

00:26:28   Yep.

00:26:29   Wait, what?

00:26:30   You got it?

00:26:31   It came today.

00:26:32   Oh my god.

00:26:33   Are you serious?

00:26:34   Yep.

00:26:35   Are you...

00:26:36   I don't even know what to say right now.

00:26:40   I am so serious.

00:26:41   You need to say something about software methodologies, Casey.

00:26:44   Now is your time.

00:26:45   Okay, so...

00:26:46   It's pretty good, by the way.

00:26:48   Don't!

00:26:49   No!

00:26:50   Stop!

00:26:51   Oh, I hate you so much.

00:26:52   All right, so a long time ago...

00:26:54   Yeah, it's fantastic.

00:26:55   Here, I posted this.

00:26:56   Here, here's a picture.

00:26:57   You can actually see it, just as proof. You waited just for this very moment, didn't you? I did.

00:27:04   I've kept it off of Twitter all day. I hate you so much. So, software methodologies. I was gonna say,

00:27:13   if this is the longest troll ever, half of me admires it and half of me is about to get in the

00:27:18   car and drive to that town in which you live and murder you. Oh, God. And this is why I kept it off

00:27:26   Twitter all day. For you, our listeners, to hear Casey's genuine reaction because he

00:27:31   really didn't know that I had it yet.

00:27:33   I hate you so much. All right, so a long time

00:27:36   ago in a galaxy far, far away on a podcast very similar to this, we talked about, we

00:27:43   touched upon the idea of talking about software methodologies. And what we mean by that is,

00:27:49   you know, how do you write software? How do you do that? Especially in a group atmosphere,

00:27:53   which means Marco probably doesn't do this very often anymore and hasn't done it in a

00:27:59   long time. And I haven't prepared anything specific about this, so I'm just kind of going

00:28:05   to kind of ad lib. But a lot of my time, my professional time, has been spent, actually

00:28:14   all my professional time has been spent working in teams. And I've found over the years that

00:28:19   there are very, very, very many different ways of going about authoring software. And

00:28:25   I should say right now that if you're not the kind of person that really gives a crap

00:28:29   about how to write code, then this might be applicable to you anyway, because many of

00:28:35   these things that we're about to talk about are actually applicable to just about any

00:28:40   project. And so there's a couple of different—well, there's many, many, many different ways

00:28:46   of going about this, but a couple of very, very obvious ones. And the way that I wrote

00:28:51   most of my code in my career is by using a technique called waterfall, which is to say

00:28:59   you do all the planning up front. And so that means you do a lot of planning up front, you

00:29:05   do a lot of thinking up front, you have a lot of meetings up front, and you pretty much

00:29:11   do everything you can before you write a line of code, and you do that all first.

00:29:16   And you do no code until you are ready to pretty much just phone it in, because you've

00:29:20   specified almost everything up front.

00:29:25   And the alternative to that, or the most obvious alternative to that, is something called Agile,

00:29:30   which is to say, you just kind of fly by the seat of your pants and see what happens.

00:29:35   So it's—they're very, very different, and there's pluses and minuses to both.

00:29:42   But before we dig into that, John, what do you use in your day-to-day job today?

00:29:48   I don't use any method or methodology, depending on what you think is an actual word, that

00:29:57   you could name with a proper noun, with a capital letter.

00:30:01   We use some vocabulary from the world of Agile,

00:30:03   but it's kind of pointless.

00:30:06   And there's no real--

00:30:09   we have a system, and we have processes,

00:30:11   but we're not following any kind of methodology

00:30:15   from a book or a paper.

00:30:17   Even within the organization, the processes that we have

00:30:20   are mostly methodology agnostic.

00:30:24   So I would not put us into any one of these bins.

00:30:28   In past jobs, I've in fact never been at a company that has adhered to a particular

00:30:34   system for doing software development that I could name with a capital letter.

00:30:40   So let me instead answer that.

00:30:43   Well, okay, I'm sorry.

00:30:44   I should ask Marco.

00:30:45   What do you do, Marco?

00:30:46   Or what did you guys do at Tumblr?

00:30:50   The answer to both of those is the same, which is a long, awkward silence.

00:30:55   All right, so I've had the benefit and detriment of using kind of a little bit of everything.

00:31:08   I've used Agile in a strict sense, I've used Agile in a not so strict sense, I've

00:31:13   used Waterfall in a strict sense, Waterfall in a not so strict sense.

00:31:17   And what's interesting is there's a lot—it doesn't really matter what methodology you

00:31:24   use or method or whatever that you use. A lot of it falls down to the team and what

00:31:29   the team is comfortable with. And so I've had a lot of experiences where I've tried

00:31:36   to use Agile, and doing so in client work, in my personal experience, is going to be

00:31:44   made or broken by the client. So the way agile works, and I probably was being a bit flippant

00:31:51   earlier when I said it's all about flying by the seat of your pants. That's really

00:31:53   not it at all. The way it works is you spend a little bit of time up front planning the

00:31:58   next couple of weeks, or the next sprint as it's called. And so you spend that time

00:32:03   figuring out what are we going to do for the next couple of weeks. And the way that typically

00:32:07   works is you come up with something called user stories, which is to say, you know, as

00:32:12   a user of this online banking system, I would like to make a deposit. And somebody will,

00:32:18   or you as a team will decide, all right, how long do we think that will take? But not in

00:32:22   terms of hours, which is a typical consulting way of doing things, but instead in terms

00:32:27   of something called points. And points are not arbitrary, but not really defined either.

00:32:33   So what that means is, typically you'll say, "Okay, of all these user stories, we agree

00:32:37   that depositing a check is, for whatever reason, a one-point story." That's very, very, very

00:32:42   simple. And so we will say that we will judge all of our other stories based upon the difficulty

00:32:50   of this story, this one point story.

00:32:53   So in the beginning you do sprint planning and you say,

00:32:56   okay, for the next two weeks,

00:32:57   we think we can cover 20 points worth of effort.

00:33:01   And so what do we think we're gonna do?

00:33:04   And so you'll figure out these are the things

00:33:07   that we're going to put in the current sprint,

00:33:11   and then we'll have a backlog of things

00:33:13   we'll get to if we can,

00:33:15   and an icebox of things we'll get to

00:33:16   way in the future if possible.

00:33:18   And you do your sprint usually for two weeks, but not always.

00:33:21   And you do your sprint for two weeks and you try to figure out, you know, you try to get

00:33:25   all these things done.

00:33:26   And at the end of that sprint you'll see, okay, well, we didn't, we didn't do as

00:33:31   much as we wanted, so actually we ended up only doing 18 points worth of work.

00:33:38   And we'll consider 18 points our velocity for the next sprint.

00:33:42   And this is all very boring on the surface, but over a couple of sprints you get to figure

00:33:48   out what is your team's velocity. And your team's velocity can get you to a position

00:33:55   that you can actually plan how much work you're going to do in the future. And that's the

00:33:59   power of agile, is when you've gotten a couple sprints under your belt, and you've

00:34:03   gotten your velocity, and it's relatively repeatable and reliable, and then at that

00:34:09   point you can predictably figure out, okay, given all the work we have left to do, then

00:34:15   how much time will it take?
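
The arithmetic behind that kind of velocity-based forecasting is simple; here is a sketch with invented story names and numbers, just to show the shape of it:

```php
<?php
// Sketch of the velocity math: points completed per past sprint, averaged,
// then used to forecast the remaining backlog. All values are invented.
$completedPerSprint = [18, 22, 20];   // points finished in each past sprint
$backlog = [
    'Deposit a check'           => 1,
    'Transfer between accounts' => 3,
    'Statement PDF export'      => 5,
    'Fraud alerts'              => 8,
];

$velocity    = array_sum($completedPerSprint) / count($completedPerSprint); // avg points per sprint
$remaining   = array_sum($backlog);                                         // points left
$sprintsLeft = (int) ceil($remaining / $velocity);

printf("Velocity %.1f points/sprint, %d points left, roughly %d sprints (~%d weeks) to go\n",
    $velocity, $remaining, $sprintsLeft, $sprintsLeft * 2);
```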

00:34:18   And so I've done agile several times,

00:34:22   all but one of the times it failed spectacularly.

00:34:27   And the reason, there are many reasons

00:34:29   it failed spectacularly, but in my personal experience,

00:34:32   the most obvious and hugest reason

00:34:37   that it failed spectacularly is because our client

00:34:40   didn't really get it, which probably falls down to us

00:34:44   not really explaining agile properly to our client.

00:34:49   But the one time it worked well, our client really got into it and really understood,

00:34:54   okay, these point things don't really translate to ours, and they don't really translate

00:35:00   directly to anything specific.

00:35:04   But these points are kind of like currency.

00:35:08   And if I decide out of the blue that I really want some new feature, and if I ask the team,

00:35:17   "All right, how long do you think this will take?" and they say, "Well, it's going

00:35:20   to be three points," I know as the product owner—I know as the client, as the product

00:35:26   owner that I'm going to need to take away three points worth of effort if I want to

00:35:31   shimmy in these new three points that I've just come up with.

00:35:36   And when it got to the point that we and the client both understood that points are currency

00:35:43   and really had faith in the system, it worked unbelievably well.

00:35:49   But generally speaking, none of that happens.

00:35:51   Instead, what you end up with is scrummerfall, which is you do a bunch of work up front,

00:35:55   have stand-ups every day, say you're working in agile, and none of it works, and it's

00:35:59   complete disaster.

00:36:00   That's basically all I have to say about that.

00:36:02   That sounds awful.

00:36:05   Is this what working with other people is like?

00:36:07   - Which part?

00:36:08   I'm asking honestly, which part sounds awful?

00:36:11   - All of it, the coins, it sounds condescending,

00:36:14   the points, I mean whatever.

00:36:15   I mean it's like, it just sounds,

00:36:18   and I mean there's been so much said and written

00:36:21   and experimented with and tested over the years

00:36:24   about how to organize and manage engineering tasks like this

00:36:29   sorry, programming tasks, sorry to the real engineers

00:36:31   out there.

00:36:32   (laughing)

00:36:34   And most of these systems boil down to ways that are easily exploited for laziness or

00:36:42   personal gain, and/or ways that are just very obfuscated and potentially condescending or

00:36:51   infantilizing.

00:36:52   And I think I would have a hard time with some of this.

00:36:56   Like…

00:36:57   It's not infantilizing, it's just having a boss.

00:36:58   I know it sounds the same from your perspective.

00:37:03   "Someone else telling me what to do? I'm not a baby. Get out of here."

00:37:07   So what—is there a specific thing or things that you take issue with? Because your reaction,

00:37:13   to be honest, is not unreasonable, especially knowing the frame of mind you're coming

00:37:17   from.

00:37:18   It's hard to explain. I think just the whole—the whole setup sounds like—it sounds like a

00:37:26   lot of people need something to do with their jobs who aren't necessarily programming

00:37:30   all day, whether they're managers or whatever you want to call them. And managers have a

00:37:37   role. Good managers are very, very helpful, but there are a lot of managers out there

00:37:42   who aren't good. And so much of this stuff sounds like the creation of mediocre managers

00:37:52   trying to occupy their time and prove themselves worthwhile by coming up with some kind of

00:37:56   system, some kind of procedures and frameworks and abstractions over people doing work and

00:38:03   the process of building software. And maybe one of the reasons why these things often

00:38:10   fall apart or don't work very well is because, like you just said, okay, well, this feature

00:38:15   -- you said in an ideal case, you can take the points of people's productivity and you

00:38:19   can say, okay, well, this feature will cost three points. That's just -- all it is is

00:38:23   estimating time, right? And we're always—our entire industry, myself included, is horrendous

00:38:28   at estimating time. And so, like, is it really any different to say, "Oh, that'll take two weeks,"

00:38:33   or "That'll take, you know, 30 man-hours"? And are any of them even accurate?

00:38:38   It's not really estimating time as much as it is estimating difficulty, and that's the

00:38:42   key difference. And if I were you, I'd be kind of sucking air through my teeth and being like,

00:38:47   "Is there really a difference there?" But there is, because you're saying relative to other

00:38:52   things, this is either a little bit more difficult or a whole lot more difficult. And so a one-point

00:38:59   story, we all agree as a team, including QA, including everyone, we all agree as a team,

00:39:05   this is not very hard. Whereas an eight-point story, and usually use the Fibonacci sequence,

00:39:10   so it's what is it, one, two, three, five, eight, something like that. You know, this eight-point

00:39:14   story is many orders of magnitude more difficult than that other one-point story. And so it's less

00:39:21   about estimating time than it is difficulty. And the theory is you take time out of the

00:39:27   equation, and that's what that velocity is all about. Because over a couple of sprints,

00:39:30   you realize, "Okay, we bit off 40 points worth of work, but holy crap, we only did

00:39:36   20. So realistically, we shouldn't sign up for 40 points any more sprints. We should

00:39:41   sign up for 20." And over time, things become a lot more predictable. And the other thing

00:39:47   that you said, which makes perfect sense, but I think I'm doing a pretty crummy job

00:39:51   explaining Scrum and Agile, is that, oh, it's all about giving managers something to do.

00:39:57   Well, not all about, but in part about giving managers something to do. And in fact, it

00:40:01   actually, to some degree, neuters the traditional project manager in that you're no longer beholden

00:40:09   to a Gantt chart, which is possibly the most evil thing ever created. And instead, the

00:40:15   The whole idea of Scrum and Agile is the team is the one in power. And if there is a project

00:40:22   manager, their job in life is, and to be honest, it is what I think it should be, which is

00:40:28   to get obstacles out of the way. And the best project managers I've ever, ever, ever worked

00:40:34   with do two things. Well, three things actually. Number one, they get obstacles out of the

00:40:40   way. Number two, they advocate on behalf of the client to our team. So they are the client's

00:40:49   representative whenever the client isn't around, and sometimes even when the client

00:40:52   is around. And number three, they advocate on behalf of us to the client. So if the client

00:40:59   is like, "Dude, you guys got to be able to do more than 20 points in the sprint. Really.

00:41:03   I mean, come on. This stuff isn't that hard." It's a project manager's job to kind of

00:41:07   step in and say, "Well, no." Whether or not you think it's difficult, the fact of the matter is,

00:41:11   history shows us, data shows us, that we can only do 20 points of sprint. So this is the way it's

00:41:18   going to have to be. And to think anything else would just be impractical and irresponsible.

00:41:22   Does that make any sense at all? It totally does. But what bothers me about systems like this is I

00:41:28   look at this and I say, "Well, why do you have to call them points? Why does it have to be this

00:41:32   concept of this currency or this, like, why does this have to be another level of indirection or

00:41:36   or an abstraction above what it really is,

00:41:39   which is people working, man hours.

00:41:41   This to me sounds a lot like the culture of Java,

00:41:48   which has infected PHP as well,

00:41:51   of making tons and tons of deep class hierarchies

00:41:54   and classes on top of classes,

00:41:55   inheriting from classes and factories

00:41:56   and abstract factories and all this crap.

00:41:58   And when you're really trying to do something

00:42:00   that's a lot simpler than that,

00:42:01   it doesn't need all of that.

00:42:02   And so when I look at some of these things,

00:42:06   it's easy for me to get turned off by it.

00:42:10   And honestly, there's probably tons of value here

00:42:12   that I'm not seeing because I'm an idiot in this regard.

00:42:16   I've never, I'm completely inexperienced

00:42:18   in following any kind of formal methodology.

00:42:20   In all of my programming jobs,

00:42:22   even when I work with other people,

00:42:24   we never follow any methodologies closely.

00:42:26   We would, kind of like what John said earlier,

00:42:29   we would borrow occasional things

00:42:31   and like, oh, we'll try a few weeks with this

00:42:33   or try a couple of months with this

00:42:34   and it wouldn't ever stick. So I don't know what I'm talking about in this area, so keep

00:42:39   that in mind. I mean, normally I don't know what I'm talking about with a lot of things,

00:42:42   but this time I'm actually admitting it, so that should mean something.

00:42:45   [laughter]

00:42:46   No, but your questions are completely reasonable, and so to answer one of them, you know, why

00:42:51   points, why that level of indirection, why not just speak in hours? And it's because

00:42:55   of exactly what you said, which is that developers are unbelievably, indescribably bad at coming

00:43:02   up with accurate estimates. And so the whole idea of points is, like I was saying earlier,

00:43:06   it's an order of magnitude of difficulty. And you can kind of construe how many hours

00:43:11   a point will be after a few sprints when you say, "Okay, well, sprint is two weeks. Casey

00:43:16   has done 10 points worth of work every two weeks, so that's about five points a week."

00:43:22   So you can extrapolate that out to figure out what hours is. But the idea is to take

00:43:28   away any sort of measure of time and just argue about difficulty and track difficulty

00:43:36   so that time kind of falls out of that equation. I'm not sure I'm doing a great job describing it.
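
The extrapolation Casey describes is just division; a quick sketch using his illustrative numbers plus an assumed 40-hour work week (the hours figure is an assumption, not part of Scrum itself):

```php
<?php
// Sketch: turning an observed velocity back into rough hours per point.
// "10 points every two weeks" is Casey's illustrative figure; 40 hours/week is assumed.
$pointsPerSprint   = 10;
$sprintLengthWeeks = 2;
$workHoursPerWeek  = 40;

$pointsPerWeek = $pointsPerSprint / $sprintLengthWeeks;                       // 5 points/week
$hoursPerPoint = ($workHoursPerWeek * $sprintLengthWeeks) / $pointsPerSprint; // about 8 hours/point

printf("~%.1f points per week, roughly %.0f hours per point\n", $pointsPerWeek, $hoursPerPoint);
```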

00:43:42   I think there's a lot of reasons why this agile methodology is BS. And I think I have my own

00:43:46   cynical take on agile and other methods as well. But I want to get to a link that someone put into

00:43:52   the chat room whose name I will get this time because I froze my scrollback, Wonder Matt,

00:43:57   put in a link to the thing that I wanted to mention, which is—maybe, I don't know if I've

00:44:01   already used this—Fog Creek Software, Joel Spolsky's company, introduced this thing as part of—I

00:44:07   forget which product it's part of. I guess I could click the link to find out. But—

00:44:10   Yeah, the evidence-based scheduling,

00:44:12   it was FogBugz a while ago. I think it was version five or something. It was a while ago.

00:44:15   Yeah. And it's a similar take on this, but it's even more kind of data-driven

00:44:21   in that on an individual developer basis, they asked that individual developer to estimate how

00:44:26   long something will take.

00:44:26   And basically, each individual developer

00:44:29   gets a reputation within the system of how good they

00:44:31   are at estimating how long things are going

00:44:33   to take for them to do.

00:44:35   And if there's an easy weakness to pick out

00:44:37   of an agile system that uses points in the way Casey

00:44:39   described, it's that programmers aren't interchangeable parts.

00:44:43   And your crappiness at estimating

00:44:45   the difficulty of a task during one sprint

00:44:47   probably has very little bearing on your crappiness

00:44:49   of estimating difficulty of an entirely other task

00:44:52   on another sprint.

00:44:53   So you may think the entire team has

00:44:55   capacity of 20 points when really person A has a capacity of 50 points on his own if he's doing

00:45:01   a feature that involves OCR. But if he's doing a feature that involves pull-down menus, his point

00:45:05   capacity is much lower. But he's able to estimate both of them really well. So you think you'll be

00:45:09   able to, "Oh, we should be able to figure out exactly how many points we have." But the points

00:45:14   vary wildly. Being able to estimate the difficulty, who you estimate it for, difficulty for the team,

00:45:20   difficulty for you, who it's assigned to has such a big difference. That's why a lot of these things

00:45:24   fall down. And evidence-based scheduling is trying to say, what can we do to take the human element

00:45:30   out of the equation and say, let everybody lie, let everybody be crappy. But even then, you don't

00:45:34   have like, okay, they were crappy about estimating for this type of thing, but what about that type

00:45:39   of thing? Maybe they're better at estimating that. And you're hoping it's going to home in on some

00:45:41   kind of average, but I have the feeling that unless you find yourself doing the same kind of task over

00:45:46   and over again, which would be super boring and most good programmers don't want to do,

00:45:49   it'll be difficult to get something really predictable out of that system.
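
A rough sketch of the evidence-based scheduling idea John is describing, using the general Monte Carlo approach rather than FogBugz's actual algorithm, with invented historical data:

```php
<?php
// Sketch of evidence-based-style scheduling (not FogBugz's real implementation):
// scale new estimates by randomly sampled historical estimate-accuracy ratios,
// and report a distribution instead of a single date. All numbers are invented.
$historicalRatios  = [0.8, 1.0, 1.3, 2.5, 1.1, 0.9]; // actual hours / estimated hours, past tasks
$newEstimatesHours = [4, 8, 16, 6];                   // this developer's estimates for upcoming tasks

$trials = 10000;
$totals = [];
for ($t = 0; $t < $trials; $t++) {
    $total = 0.0;
    foreach ($newEstimatesHours as $estimate) {
        // Each trial assumes the developer is as wrong as on some random past task.
        $total += $estimate * $historicalRatios[array_rand($historicalRatios)];
    }
    $totals[] = $total;
}
sort($totals);

printf("50th percentile: %.1f hours, 90th percentile: %.1f hours\n",
    $totals[(int) ($trials * 0.5)], $totals[(int) ($trials * 0.9)]);
```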

00:45:54   All right, so Marco, do you want to tell us about something else that's really excellent?

00:45:58   And then John, I'd like you to pick apart all of the arguments I just posited.

00:46:02   Sure thing.

00:46:03   We are also sponsored this week by our friends at Ting once again. Ting is mobile that makes

00:46:08   sense. They're a no-BS, simple to use mobile service provider from the people at Tucows,

00:46:14   the company behind Hover. Go to our special URL, ATP.Ting.com to learn more. They have

00:46:20   They have great rates and there's no contracts

00:46:23   and no early termination fees.

00:46:25   You own your device outright and then they have a pay

00:46:27   for what you use pricing model.

00:46:30   So here's what you do.

00:46:30   You pay a base price of six bucks per month per device

00:46:33   and then whatever you use on top of that in minutes,

00:46:36   texts and data, they will just bill you

00:46:39   for whatever cheapest bucket they have that fits that number.

00:46:42   So for instance, if you use 100 megs of data this month

00:46:46   and a gig next month and then the next month after that

00:46:48   drop down back to like 200 megs.

00:46:50   Each month you'll pay a different price,

00:46:52   just whatever you use you'll pay for that.

00:46:54   So you don't need to guess what you'll need in advance,

00:46:56   you don't need to raise your data cap

00:46:58   before you go on a big trip and then lower it

00:47:00   when you get back and of course you'll forget

00:47:01   and so then you have two more months of paying

00:47:02   the high rate that you didn't even use.

00:47:05   You just pay for what you use,

00:47:06   they bill you for the cheapest bucket

00:47:08   that you fit in and that's it.

00:47:10   They even have new lower prices.

00:47:12   If you checked them out in the past,

00:47:13   check them out again because they even

00:47:15   just lowered their rates.

00:47:16   So for instance, two gigs of data is just $29.

00:47:20   500 megs of data is just $12.
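
(To make the bucket arithmetic concrete, here is a tiny Swift sketch using only the numbers quoted in this ad: the $6 per-device base and the $12/500 MB and $29/2 GB data buckets. The real rate card has more buckets and also covers minutes and texts, so treat this as an illustration of the math, not Ting's actual billing logic.)

```swift
// Illustration only: the $6 base and the two data buckets quoted above.
// Ting's real rate card has more buckets plus minutes and texts.
let baseFeePerDevice = 6.0
let dataBuckets: [(capMB: Double, price: Double)] = [
    (capMB: 500, price: 12.0),
    (capMB: 2048, price: 29.0),
]

/// Bill the cheapest data bucket that fits this month's usage.
func dataCharge(forMegabytes used: Double) -> Double? {
    return dataBuckets.first { used <= $0.capMB }?.price
}

// A 100 MB month and a 1 GB month land in different buckets:
let lightMonth = baseFeePerDevice + (dataCharge(forMegabytes: 100) ?? 0)  // 6 + 12 = 18
let heavyMonth = baseFeePerDevice + (dataCharge(forMegabytes: 1024) ?? 0) // 6 + 29 = 35
print(lightMonth, heavyMonth)
```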

00:47:23   So to see how much you can save with Ting,

00:47:25   go to ATP.Ting.com and check out the savings calculator.

00:47:29   You can enter in your last few bills

00:47:31   of whatever your actual usage was

00:47:33   from your existing phone carrier,

00:47:35   and it'll show you how much Ting will save you on average

00:47:37   and then over time.

00:47:39   And if you're stuck in a contract with someone else,

00:47:43   and you have to pay an early termination fee

00:47:45   to get yourself over to Ting,

00:47:48   they will actually give you 25% of it back

00:47:52   in service credit up to $75.

00:47:55   So like Hover, Ting has great customer support.

00:47:57   There's a no hold, no wait phone number.

00:47:59   You call them up anytime during the business day

00:48:02   and a human being picks up the phone

00:48:05   who is right there and ready to help you.

00:48:07   It's really a great system, great customer support.

00:48:11   And we can think of various ways you can use it.

00:48:13   For example, if you're a developer,

00:48:14   maybe you want some test devices activated

00:48:18   on cell networks but don't want to pay a lot of money,

00:48:20   or if you're just a regular user and you're tired

00:48:22   of paying a really high phone bill for a lot of capabilities

00:48:24   you're not really using, check out Ting.

00:48:26   Go to ATP.Ting.com.

00:48:28   They're compatible with any Sprint phone

00:48:30   because they're a Sprint MVNO here in the US,

00:48:33   and if you have a Sprint phone, bring it over.

00:48:36   If it's compatible, go to their site to see the list.

00:48:38   They also will sell you a new or used phone

00:48:40   for pretty great prices, so check it out.

00:48:42   ATP.ting.com. Thanks a lot for sponsoring the show once again.

00:48:46   They should have a song with beeps and boops with a name like Ting.

00:48:50   Or just Bells.

00:48:53   That's true.

00:48:55   Two on the nose.

00:48:56   Oh, goodness. All right, so really quickly before John destroys all of the points I just made,

00:49:01   friend of the show David Smith said earlier in the chat, and I'm quoting,

00:49:05   "Scrum/Agile's purpose is to attempt to try and extract consistent or predictable performance

00:49:10   out of an uncontrollable situation. It is better than nothing, but it is no replacement for

00:49:14   talented, motivated developers who can get things done in any environment. Methodologies help reduce

00:49:19   the impact of individual talents on the overall outcome." And I think that that really makes a lot

00:49:23   of sense, and I completely agree with that. But with that said, Jon, tear me apart.

00:49:28   It's not you. It's like the overall

00:49:32   concept of having a particular method by which you develop software. Like, the reason Marco has

00:49:39   a sort of visceral reaction against it, and I think most programmers do when they're in

00:49:43   that type of environment, is that if you've spent any time programming, whether on your

00:49:48   own or even just in small groups, even with just one or two other people, especially in

00:49:52   small groups where you kind of get a feel for like, "This is what it's like to make

00:49:55   software. This is what the experience is like." You learn kind of what works and what doesn't

00:50:05   in terms of getting the job done, independent of any schedules and stuff like that.

00:50:09   What works is having a bunch of talented people who are motivated and excited about the thing

00:50:16   that they're doing. And so all these methodologies, in some respects, are attempts, usually good-hearted

00:50:23   and sometimes vaguely effective attempts, to take, for example, mediocre programmers and make them

00:50:29   into great programmers, or to take great programmers who don't care about something and make them care

00:50:32   about it. Everyone wants, like, what we want is what happens when you get a bunch of talented

00:50:37   people working on the same product, who are good at their jobs, who are excited about what they're

00:50:42   going to do. And large companies say, "Well, we can't do that because we're too big and people

00:50:47   aren't excited about our stuff and it's hard to get that many people who are talented." Or even

00:50:51   if you could get that many people who are talented, they don't get along with each other in big enough

00:50:54   groups. So we need some sort of method to arrange all this so that we get some sort of somewhat

00:51:01   predictable performance out of what we've got.

00:51:03   What we've got is maybe we've got really talented people,

00:51:05   we've got not so talented people.

00:51:06   Maybe we've got people who are excited

00:51:07   and people who aren't excited.

00:51:08   Maybe we've got people who disagree

00:51:09   about how things are gonna go

00:51:10   and we have a whole bunch of people telling them what to do

00:51:12   so they don't even get to decide

00:51:13   which direction the product goes in

00:51:14   and we want them to do that in a consistent manner.

00:51:16   And so if you're in this environment, you're like,

00:51:19   you start to feel, if you just left us alone

00:51:22   and got rid of Dave who stinks,

00:51:25   then we would get this thing done.

00:51:26   But instead we have to go through these stupid steps

00:51:28   of this methodology and these user stories

00:51:30   these meetings and these points and going through all this stuff and you're just like

00:51:33   "Look, I'm a software developer, I know what it takes to get good software. Just put

00:51:38   me in charge of the world and I'll tell everyone exactly what to do, just me and my three

00:51:40   friends." But everyone wants to get back to that thing where it's just you and a couple

00:51:43   friends in a room working on your program which is how lots of great software is made

00:51:48   and all this methodology stuff seems like it's pointless busy work that doesn't actually

00:51:53   make anything better and I think a lot of the times it is pointless busy work that doesn't

00:51:56   make anything better and the only thing that it makes better is that, as people pointed

00:51:59   out in the chat room, people higher up in the org chart can point to, "Well, we've been following this

00:52:03   methodology, and it has predictable results, and look at our charts, we know how many points

00:52:09   are in our velocity and blah, blah." It's all kind of like covering your ass so that you're not going to

00:52:12   get fired because, well, we have a methodology and this is what we do. But in effect, everyone

00:52:16   in that organization might feel like this stupid methodology is making us less efficient, less

00:52:22   happy, making us do worse work than we would if we just, again, fired Dave because he stinks,

00:52:26   and let three of us go off for a weekend and we'll solve your freaking program." That's how

00:52:31   programmers feel. The cowboy coding, like, "Stand aside. I know exactly what I'm going to do."

00:52:34   That's the tension between the feeling that any experienced programmer has, that if you just

00:52:40   let us do what we need to do, we could get it done. And the reality that if you're on a large

00:52:47   project, you can't do that. This is where this tension is. And some people are trying to apply

00:52:52   a methodology to make order out of this chaos. And the people who are contributing to the chaos

00:52:56   always feel like anything you impose on me is making things worse and they get grumpy about it.

00:53:01   And some people will get religion and say, "Oh, I'm going to really get into this process and I

00:53:05   like it and it's predictable." But the bottom line is no large organization is ever going to achieve

00:53:10   the ideal of a handful of talented, like-minded people communicating well. And even in that

00:53:16   article that I put in the show notes, it's like, "Why software methodologies don't work," even if

00:53:19   if the ideal is like, "Oh, we'll just focus on communication and make sure we have personalities

00:53:22   put together," those solutions are not scalable to organizations of hundreds of people. No way for

00:53:28   people to do a job together at peak performance is scalable to hundreds of people because inevitably

00:53:34   there's going to be personality conflicts, there's going to be disagreements, and that's why you

00:53:38   inevitably have to fall back to some kind of methods. Now, for things that are regularized,

00:53:42   which has been pointed out a million times, like building a road or building bridges or things

00:53:47   people have been doing for hundreds of years that are very simple, you know, pretty much

00:53:51   success and failure wise, there needs to be a span over this course of water, it needs

00:53:55   to carry this much weight, it needs to withstand these conditions, it needs to last this long,

00:53:58   we've done this too many times before, we know exactly the steps that are required to

00:54:01   take, still things can get screwed up and we can try new methods they might mess up,

00:54:04   but in general it is way easier than even the simplest software product where no one

00:54:08   has made this specific thing before with these specific requirements, people don't even know

00:54:11   what the requirements are and they're going to change a million times, like software is

00:54:15   so much more complicated than pretty much anything else humans do except maybe parenting,

00:54:18   that it's impossible to predict what's required to go into it.

00:54:23   And so, we're willing to accept essentially a massive reduction in peak performance just for

00:54:29   some grasp on like, "Can we get some predictability out of this?" Even if it means that we're working

00:54:36   slower. And once you get to a certain size in a public company, you're like, "I don't care how

00:54:41   much crappier it makes us. As long as we're still able to stay in business, I want everyone

00:54:45   to follow whatever methodology, A, B, C, and D, or whatever processes, even if I know that

00:54:50   this is going to send the best programmers out of the company because they don't want

00:54:53   to work for us anymore, and we're not going to get the peak performance that we would.

00:54:57   It's just better than the unpredictable chaos, because if you're not one of the programmers,

00:55:00   but you're one of the people in charge, you could very quickly feel like nobody's in charge,

00:55:05   nobody's at the wheel, you're just up there and you just whisper down to the programmers

00:55:08   well, "Hey guys, you think you could make our program do this?" And then you just wait to hear

00:55:13   a reply with your fingers crossed. That's not a way to run an organization. So these rules always

00:55:17   have to apply, but they're always going to feel like they're making things worse. And in many

00:55:22   respects, they're wishful thinking that you are going to somehow come up with the correct process

00:55:27   to turn this huge team of people into the equivalent of 17 rooms full of five people who

00:55:32   are really motivated. You know, the funny thing about everything you're saying is, and this is

00:55:38   was pointed out to me on Twitter by Chris H. and he's right, is that the origin of Agile

00:55:46   was to try to get away from oversight by management. And so the whole idea of Scrum and Agile is

00:55:54   that you self-manage as a team. The team is self-governing. You establish the team norms

00:55:59   upfront. We're always going to be on time to meetings, which never actually happens.

00:56:03   We're always going to pay attention during meetings, which never actually happens.

00:56:08   It's trying to get everyone to get along and agree, but you can't do that. That's not how human relations

00:56:14   work. If we just all agree that we'll be friendly and motivated and work well together, then we will.

00:56:19   That's not how it works. It's like coming from the outside and saying, "It would be nice if

00:56:27   the interaction with our groups were this way." And then if you just say that to each other,

00:56:32   that doesn't make it so. If these two people still hate each other, they're always going to hate

00:56:35   each other. These two people disagree strongly about this technical issue. Telling them that

00:56:39   they shouldn't disagree is not going to work. It's just kind of like shifting the things around on

00:56:46   the table. Like, okay, well, we don't want people who aren't programmers telling us, so I'm sure if

00:56:49   we just let all the programmers sort it out, they'll do fine. No, you just move those dynamics

00:56:53   to a different group of people, a different, even more passive aggressive group of people.

00:56:56   So I take it, Jon, that you've never been in a situation wherein you feel like the method

00:57:06   improved the product.

00:57:09   Processes sometimes improve the product. And like I said, for large organizations, you

00:57:13   have to have processes and you have to have methods because if you don't, it's just wildly

00:57:18   unpredictable. And it depends, especially on your mix of people. Like the chestnut

00:57:22   that you always hear is, if you have a bunch of great programmers, then you can

00:57:25   do whatever the hell you want, and you'll be awesomely successful. If you have a bunch of

00:57:28   mediocre programmers, you need to apply processes and methodology. And if you have a bunch of crappy

00:57:33   programmers, you need massive methodology application. And then, in the end, in

00:57:39   all of those situations, you end up with a product of similar quality, which I don't know if I believe

00:57:43   that. And I don't know which one of those things is better. But most companies are a mix of those

00:57:49   types of things. And with something as complicated

00:57:54   and unpredictable as software, I don't think there's any avoiding that dynamic: that small groups of

00:57:58   focused, really smart people can do great things in short periods of time. Things that larger groups

00:58:02   of less motivated, less experienced, less talented people can never do. Like it's not like, okay,

00:58:08   well, it takes these five people in a room a year to do this. But if I have 300 people and three

00:58:13   years, I can equal them. No, you will never equal them with 300 people if you don't have the right

00:58:16   five, for example. It's because it's not a predictable thing because it's not something

00:58:21   you can systematize. Someone in the chat room was saying, oh, come on, there's plenty of things

00:58:24   that are more complex than software. Natural things, sure, but not many man-made things. I would

00:58:30   say a 747, I mean, the consequences for errors in a 747 are much greater, but software is much

00:58:34   more complicated. The consequences are usually stupid and pointless and nobody cares, which is

00:58:38   why we get away with this. But software is insanely complex in terms of what number of states can this

00:58:47   thing be in and how many transitions from one state to another can it go through. I mean, think

00:58:51   of the freaking halting problem. We can't even reason about, make basic statements about, arbitrary

00:58:56   programs that you're going to consider. Programming is different by nature than most things that people

00:59:02   do. It just so happens that the consequences are usually not that serious. And that's when you see

00:59:06   programming where there are real consequences, like missile control systems or things in

00:59:10   planes, hopefully. They have massive process applied to them. They're willing to sacrifice

00:59:15   productivity and job satisfaction and not have the smartest people to say, "Look, we have

00:59:20   crazy requirements about how everything must be done in the most conservative fashion possible.

00:59:25   And it's like, some people look at it and say it's a miserable existence, but that's our only tool to

00:59:29   say, "We would like to make a program, but we would also like it not to fail, like ever. And so,

00:59:35   we are going to apply the methodology nuclear bomb, or nuclear, if you want to pronounce it correctly,

00:59:41   to this problem. And we are going to make it miserable for like, no one would sign up for

00:59:46   this. You can't use these features. You're not allowed to ever allocate memory. We have the

00:59:50   system for doing—I mean, for the space program and stuff like that. That's our only tool.

00:59:54   That's what our tool is for, for trying to make it so that we can minimize bugs and make it reliable,

01:00:01   but it destroys productivity. You can't use that same methodology that you used for the Mars rover

01:00:05   software. You can't use that to make WhatsApp. You'll be out of business. You can't even use

01:00:12   it to make iOS, for crying out loud. They would never produce a product. How many years? And they

01:00:18   would never get the people who wanted to work on it. So that's the place of methodology.

01:00:25   It's an evil that is necessary to the degree to which you demand the predictability of your

01:00:32   software. And even then, you have things going out of orbit because of unit conversion errors and

01:00:35   bugs like that. We're never perfect, right? So I feel for the people who want methodologies to make

01:00:43   things better. But I mostly see it as the only tool we have to try to fight against the inherent

01:00:49   chaos of writing software. And as per usual, I think you've hit the nail on the head. And

01:00:55   coincidentally, I was about to bring up that there are instances where Waterfall, which among

01:00:59   software developers is considered to be evil in almost all cases,

01:01:04   is absolutely the correct answer. And in fact, as I kind of hinted at during the

01:01:12   Debug episode that I was on with Guy and Rene.

01:01:15   Nice plug.

01:01:16   I do what I can.

01:01:17   It was pretty good, actually. I'm like halfway through it, though.

01:01:20   Oh, thank you. So, yes, when I was working on some stuff where failure was not an

01:01:27   option, pretty much everything we did was waterfall. A lot of planning up front,

01:01:33   a lot of meetings up front, code reviews, and all of these things. And that was in order

01:01:38   to prevent exactly what you described, John. You know, it was to prevent a poor unit conversion

01:01:45   or something along those lines. And in that situation, it was absolutely necessary. It

01:01:50   was absolutely necessary, absolutely the right answer, and absolutely the right way to get

01:01:54   that project done. The problem was that, as a developer, especially one who tends to

01:02:00   want to sling code rather than talk about slinging code, it was incredibly neutering

01:02:07   to me—is that a word? It doesn't matter. Anyway, it made me feel like I could never get anything done

01:02:12   because I just had to talk about getting things done. And that was very frustrating, but I don't

01:02:17   begrudge my then-employer for doing things that way. It was absolutely the right call. It's just,

01:02:22   it wasn't the right call for me.

01:02:24   Waterfall is like almost impossible with anything that's

01:02:28   reasonably complex because nobody knows what the correct design for it is. No matter how much you

01:02:32   talk about it. You could talk for three years to come up with a design, and once you start

01:02:38   implementing it on a third day, you're going to go, "Oh, we didn't think of that," because the

01:02:44   running program is too complicated for everyone to keep in their head. What happens at that point is,

01:02:49   do you bravely plow forward with the waterfall design? Do you go back to the drawing board and

01:02:54   start it all over again, or do you just make some little tweak and then you end up with this thing

01:02:57   that's kind of misshapen? It's like putting together a piece of furniture and one

01:03:00   piece of wood is bent so it barely fits. It's like, "Hey, we followed the plan, right?

01:03:05   And slot A into tab B." Yeah.

01:03:08   Yeah, and coincidentally, Marco, to bring him back into the conversation, has dealt

01:03:12   with a lot of that lately. Not as much waterfall stuff, but you've dealt with the reality

01:03:20   and realization that, "You know what? Maybe I need to throw away a lot of work I've

01:03:24   done and do it all over again."

01:03:27   Oh yeah, I mean that's, I mentioned in the after show a few, about four episodes ago,

01:03:34   I can tell you it was about a month ago, and I don't even, I think I even cut it out of

01:03:38   the final edit, but the live listeners probably heard me a month ago say that I was rewriting

01:03:44   the Overcast sync engine because I discovered some sync shortcomings and I was rewriting

01:03:49   the engine to be much better and everything else.

01:03:55   And of course, in hindsight, I can see why this is a bad idea.

01:04:00   While rewriting the sync engine and the whole protocol with which it synced to the server,

01:04:04   I also took the opportunity to break up the data model for the two key models of the app,

01:04:12   which is podcast and episodes.

01:04:15   I broke off parts of those that were the user parts of them into separate models.

01:04:20   And this required such massive changes to the code on both sides, server and client.

01:04:25   At the same time, I was also changing the sync protocol

01:04:27   and trying to make it generic so I could use

01:04:30   the one master superclass/library functions

01:04:33   to sync anything.
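
(A rough sketch of the kind of split Marco is describing, with hypothetical names rather than Overcast's actual code: the feed-derived data stays in one model, and the per-user state moves into its own model keyed by the same identifier, so only the user-state objects have to flow through the sync engine.)

```swift
import Foundation

// Feed-derived data: identical for every user, comes from the podcast feed,
// never needs to sync.
struct Episode {
    let guid: String
    let title: String
    let audioURL: URL
    let duration: TimeInterval
}

// Per-user state: the only part that has to round-trip through the sync
// engine, keyed by the same episode GUID.
struct EpisodeUserState {
    let episodeGUID: String
    var playbackPosition: TimeInterval
    var played: Bool
    var deleted: Bool
    var modifiedAt: Date   // lets the server pick a winner on conflicts
}
```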

01:04:35   And the result was I lost a month, basically.

01:04:41   And last night, I decided,

01:04:44   as I was sitting there staring at the parts of my app

01:04:49   still on the floor, and as I was trying to rework

01:04:54   the changes back into the app, I realized

01:04:59   I was throwing away so much working behavior

01:05:03   and so much nuanced complicated stuff,

01:05:06   especially on the iOS app, that depended on the old model.

01:05:10   Meanwhile, this whole time on my carry iPhone,

01:05:13   my main iPhone, I was using the old version of the app

01:05:16   from a month ago because this whole time

01:05:19   I couldn't put a new build on yet 'cause it was broken.

01:05:22   And for the whole month, it's been fine.

01:05:24   It's been working great.

01:05:26   And I've been sitting here using this build

01:05:28   for this supposedly unshippable alpha software.

01:05:31   I've been using the same build for a month

01:05:32   every day heavily and it's been perfectly fine.

01:05:35   And I realized, you know what?

01:05:37   Now that I'm like two thirds of the way

01:05:39   into this massive sync change, I realize now

01:05:43   that even if I get it all back together,

01:05:45   it's gonna be too complicated, too fragile,

01:05:49   and not at all maintainable.

01:05:52   And so even if I finish this,

01:05:53   the result will be worse than what I had before.

01:05:56   I was wrong when I thought it would be better.

01:05:58   In practice, it's not better.

01:06:00   So therefore, I decided to go back to what I had

01:06:05   and I spent today basically reverting back

01:06:10   to the old version from a month ago

01:06:12   and merging in a few, basically cherry picking by hand,

01:06:17   a few of the improvements I made

01:06:18   that will fit in the old system just fine.

01:06:21   And so I've lost, I wasn't doing this the whole month,

01:06:25   I was doing some stuff I could keep,

01:06:26   but I would say I probably threw away

01:06:29   two and a half weeks of work,

01:06:31   because the result is actually better the old way.

01:06:34   - I think you're only on step one of four

01:06:37   of the Brent Simmons syncing development.

01:06:39   (laughing)

01:06:40   Because I think at this point now you have to also

01:06:42   throw away this one again and try the other one,

01:06:44   and then you have to revert back again to the previous,

01:06:47   if you're following the Brent Simmons

01:06:49   Vesper Sync Diary plan of syncing software development,

01:06:52   you've got a long road ahead of you.

01:06:55   And to be fair, to bring this very briefly back

01:06:58   to methodologies, no methodology would have

01:07:00   fixed that in my estimation.

01:07:03   They would have made it take longer.

01:07:04   Yeah, you're absolutely right.

01:07:06   Agile would have--

01:07:07   Or it would have forced you to ship

01:07:08   something you didn't want to ship because in areas

01:07:11   where there are constraints.

01:07:12   That's what I was getting at is that a lot of times,

01:07:15   methodologies are seen as the way to solve your problem,

01:07:18   but your problem really might be that the one developer you have working on a sync system has not written

01:07:22   a sync system before. And there's no methodology that's going to make that go faster, make it come

01:07:27   out better. There are lots of practices that you can adopt. It's not like saying, "Oh, it should

01:07:32   just be the wild west." For example, there's practices that should be imposed on people that

01:07:36   I think everyone would agree on. I don't know, pick the language you're using, like JavaScript with the

01:07:40   stupid use strict token or whatever. If you decide that's a reasonable thing to do, and I think it is,

01:07:44   and it doesn't affect your compatibility, then saying everybody

01:07:46   has got to put "use strict" in the JavaScript that they write,

01:07:49   deciding on your compiler flags, having

01:07:50   some sort of common naming convention and indenting

01:07:53   style, all the easy things that people can agree on.

01:07:56   That stuff counts, and it helps everybody.

01:07:59   But people say, all right, that's good.

01:08:02   Coming up with agreement on these standards helps.

01:08:04   If we come up with agreement on even bigger, more sweeping

01:08:07   changes, surely we'll get a proportional benefit.

01:08:09   And you don't.

01:08:11   I mean, that's where we should start talking about testing,

01:08:13   is that testing is definitely something you really need and is great.

01:08:17   And testing taken to its logical conclusion of test-driven development

01:08:21   is certainly better than having no tests at all, but you can't just keep

01:08:25   cranking that dial until it's like, now, just everything will be perfect all the time, because if a little

01:08:29   bit of testing helps, and a little bit more testing is great, then if we have 100% code coverage

01:08:33   and test everything, we've solved the problem of software development. And you haven't.

01:08:37   And unit testing is something that I think is

01:08:41   a little bit, I don't know if taboo is the word I'm looking for, but not a lot of developers

01:08:45   that I know are really into it.

01:08:48   And I would argue that test-driven development is taking it a bit too far in my personal

01:08:53   opinion, but writing comprehensive unit tests is the same sort of thing as code reviews,

01:09:01   where at first I was like, "Oh, God, really?

01:09:04   This is a thing we have to do?

01:09:06   Everyone has to look at my code and try to tell me why it's wrong, and they don't realize

01:09:10   that I'm actually right.

01:09:11   Now I got to convince all these people, but in the end it actually worked.

01:09:14   Code reviews were extremely, extremely interesting.

01:09:18   And I always learned something from it, even if I never changed my code.

01:09:21   And most times I did change my code.

01:09:23   Unit testing is a similar thing where it's like, Oh God, do I really need to write unit

01:09:26   tests for all these things?

01:09:27   I run it a few times.

01:09:28   I give it a few example inputs and that should be enough.

01:09:31   Right.

01:09:32   But unit testing is an unbelievably awesome way to make sure that not only that what you've

01:09:38   written works, but also that it will stay working. And that's what's extremely powerful

01:09:45   about it. And you can take it to the nth degree, which is test-driven development, or you can

01:09:49   use it where appropriate. And one of the things that I need to explore more in Objective-C

01:09:55   is looking at Xcode's unit testing framework, because I've never really played with it.

01:09:59   But there are great frameworks in Java and in .NET, like NUnit, for example, and actually Microsoft

01:10:05   has their own copied version of that.

01:10:08   - Of course they do. - Of course they do.

01:10:10   That'll allow you to do those sorts of things.

01:10:12   And I agree with you, Jon,

01:10:13   that having these unit tests is a really valuable thing.
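
(For anyone who hasn't used Xcode's built-in framework, here is a minimal XCTest case in Swift against a made-up helper function. The function and test names are hypothetical, but the shape, subclassing XCTestCase, test methods whose names start with "test", and XCTAssert calls, is how the framework works, and it's the same basic idea as JUnit or NUnit on other platforms.)

```swift
import Foundation
import XCTest

// Hypothetical helper under test: format a duration in seconds as "m:ss".
func formattedDuration(seconds: Int) -> String {
    return String(format: "%d:%02d", seconds / 60, seconds % 60)
}

// Xcode discovers XCTestCase subclasses and runs every method whose
// name begins with "test".
final class DurationFormattingTests: XCTestCase {
    func testWholeMinutes() {
        XCTAssertEqual(formattedDuration(seconds: 120), "2:00")
    }

    func testPadsSingleDigitSeconds() {
        XCTAssertEqual(formattedDuration(seconds: 65), "1:05")
    }

    func testZero() {
        XCTAssertEqual(formattedDuration(seconds: 0), "0:00")
    }
}
```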

01:10:17   - Every methodology has something about it

01:10:20   that you can take away from it that's good.

01:10:22   Waterfall has some good things about it,

01:10:23   as in let's think about what we're doing

01:10:25   before we type.

01:10:25   Agile has some good things about it.

01:10:27   Let's take smaller steps,

01:10:28   because if you describe something as one big giant step,

01:10:32   you have no idea what's involved in it.

01:10:33   If you break it up into smaller steps,

01:10:35   then you have an idea. That's from the Gantt charts as well, but also you get that experience in

01:10:38   agile. Test-driven development, a lot of developers left to their own devices won't do tests. So if

01:10:43   you try to indoctrinate them into this crazy cult of test-driven development, then they'll learn to

01:10:47   write tests. And I think test-driven development is actually not that far off from something that

01:10:52   everybody should do. It's just that it's so far off from what people would do on their own that

01:10:57   it seems crazy at first. And I'm a big proponent of massive amounts of testing. What you run into

01:11:03   eventually with test-driven development or any kind of testing thing is that tests are also code.

01:11:09   Like, it's not like another infallible person comes and writes the tests, right? And it doesn't

01:11:13   mean you shouldn't write tests. That doesn't invalidate tests, but like, that's the limit

01:11:16   that you hit. The limit that you hit at a certain point is that your, you know, tests are code.

01:11:21   You're not infallible when you write them either. And the more of them you write,

01:11:24   the more difficult changes become, both in good and bad ways. And like, so every one of these things,

01:11:29   if taken too far, can have problems.

01:11:32   And that's what, again, what we're looking for is like, oh,

01:11:34   just if we've got those two or three guys who really kind of

01:11:36   know what they're doing and have experience with each

01:11:38   of the things they're going to be doing

01:11:40   and have done something like this before

01:11:41   and can kind of get the balance and just firing on all cylinders

01:11:45   and the project is small enough to fit in all their heads

01:11:47   collectively.

01:11:48   And then they're super geniuses and they

01:11:50   make this great 1.0 product with this great potential feature

01:11:53   and they hand it off to another group.

01:11:54   We're all trying to recapture that.

01:11:57   It's like trying to recapture your youth.

01:11:59   You're never going to get there again.

01:12:02   You're never going to be like you

01:12:03   were when you were 16 years old.

01:12:04   It's like, if we could just apply

01:12:06   this series of constraints.

01:12:08   But there's wisdom in every experience

01:12:10   that you've had that you want to apply.

01:12:11   And in some respects, it doesn't matter

01:12:13   if you're in an organization that's

01:12:15   like, whoever's in charge of things

01:12:16   says agile is the way we go.

01:12:17   Whoever's in charge of things says

01:12:19   we've got to do test-driven, or we have to do pair programming, or we have to do reviews.

01:12:20   or we have to do reviews.

01:12:21   Like someone's leaning heavily on whatever button or dial

01:12:24   or accelerator they think is the best thing.

01:12:26   In the end, what matters is the people in the group,

01:12:30   the dynamics between the people, their motivation for what

01:12:32   it is they're doing, and their skills and experience.

01:12:35   Have they done something like this before?

01:12:37   And almost any methodology that you apply to them

01:12:39   will appear to work, because they would have

01:12:42   been fine in any situation.

01:12:43   And those same people would choose

01:12:46   to write tests because they know it's good.

01:12:49   There's no methodology that omits tests entirely,

01:12:51   because that's insane.

01:12:51   There's no methodology that omits any planning upfront,

01:12:54   because we hate waterfall. Like, there's always something, some piece of every one of them, like,

01:12:59   no matter what methodology you apply to a bunch of good people, they will pull the parts from each

01:13:05   one of them. The problems only come in when it's like, you know, how long ago was like,

01:13:09   "No Silver Bullet" written, and you know, the Fred Brooks stuff? Like, that's so long ago. Like,

01:13:13   we've known this for so long, it's just people keep reaching for that brass ring and saying,

01:13:17   "This time I've got it." And every time it's like, you don't have it, but you've discovered

01:13:21   something new that maybe we can take away from what you're preaching and move forward.

01:13:25   So, in summary, you know, methodologies can help, but really, they're not going to fix anything,

01:13:31   and it's all a bunch of hocus pocus. With that said—

01:13:34   We should suggest—oh, you want to know the sponsor? I was going to say that both of us should

01:13:38   convince Marco to write way more tests than he's currently writing.

01:13:41   Which is to say, more than zero?

01:13:43   I don't know what that number is. I wasn't going to say. I'm just saying.

01:13:46   Which is to say, more than zero?

01:13:47   Yes, I would suggest that.

01:13:50   I would too, I really would. But anyway, Marco, what else is really cool these days?

01:13:54   It is once again our friends at Squarespace. Squarespace is the all-in-one platform that

01:13:59   makes it fast and easy to create your own professional website or online portfolio.

01:14:04   For a free trial and 10% off, go to squarespace.com and use offer code.

01:14:08   Now you guys know it's a new month. January was Marco, February was Casey. March,

01:14:14   Offer code for Squarespace is critical.

01:14:17   Nice.

01:14:18   Everyone else gets a name.

01:14:21   I just get a truncated version of my old podcast.

01:14:24   All right.

01:14:25   Well, I guess people can spell it better than my name, I hope.

01:14:27   Well, maybe critical is your identity.

01:14:29   I was going to say that's so critical of you right then and there.

01:14:33   I guess hypercritical is too long to fit in the database column they store the coupon

01:14:36   codes in.

01:14:37   Right.

01:14:38   And maybe they throw a warning and it blows up their whole app.

01:14:41   Squarespace has constantly improved their platform with new features, new designs, and

01:14:44   even better support.

01:14:45   They have beautiful designs for you to start with and all the style options you need to

01:14:49   create a unique website for you or your business.

01:14:52   They have over 20 highly customizable templates for you to choose from, and these templates

01:14:57   have won numerous design awards from prestigious institutions.

01:15:00   Now Squarespace is very easy to use, but if you need any help, they have an amazing support

01:15:04   team of over 70 people in New York City that works 24 hours a day, 7 days a week.

01:15:10   and even they have won numerous awards.

01:15:13   So Squarespace starts at just $8 a month.

01:15:16   Now they have this cool feature called Squarespace Commerce

01:15:19   where you can build your whole,

01:15:20   you can build an online store

01:15:22   that sells physical or digital goods.

01:15:25   And it's this whole storefront capability

01:15:28   built into Squarespace.

01:15:29   So it looks like your site, you can theme it,

01:15:31   you can customize it, and you don't have to build

01:15:33   the shopping cart, the tracking for all the inventory

01:15:36   and all that crap, they do all of that for you.

01:15:39   Building a store is a lot of work, they do it all for you.

01:15:42   And Squarespace Commerce is included

01:15:44   in every Squarespace plan.

01:15:45   So if you wanna use it, great, no additional charge.

01:15:48   If not, you just wanna make a site, a portfolio,

01:15:51   whatever the case may be, you can do that too.

01:15:53   All this starts at just $8 a month.

01:15:55   You get a free domain name with that purchase

01:15:57   if you sign up for a year up front.

01:15:59   And so you can start a free trial today.

01:16:02   No credit card required for that free trial.

01:16:04   It's a real free trial, you don't have to like

01:16:07   give them a credit card and hope that you remember,

01:16:09   otherwise you'll get charged, nothing like that.

01:16:11   Real free trial at Squarespace.

01:16:13   Go to squarespace.com and use offer code critical

01:16:17   to learn more and to start building your site today.

01:16:19   Now one more thing, if you hurry up,

01:16:22   they are interviewing designers and engineers

01:16:27   'cause they wanna hire 30 designers and engineers

01:16:29   before March 15th.

01:16:31   Now right now, this episode's gonna be released on March 6th

01:16:34   So, you know, hurry up, basically.

01:16:37   But if you interview for an engineering or design position

01:16:39   before March 15th, they will invite you and your partner

01:16:44   to be New Yorkers for the weekend.

01:16:45   They will fly you out, put you up

01:16:47   at one of the city's best hotels

01:16:48   and give you a long weekend of being a New Yorker.

01:16:51   Go to restaurants, attractions, et cetera.

01:16:53   And they will pick up the whole tab for it.

01:16:55   They've been voted one of New York City's greatest places

01:16:57   to work for two years running.

01:16:58   So put them on your short list.

01:17:00   They're looking to hire, again,

01:17:01   30 engineers and designers by March 15th.

01:17:03   So hurry up, go to beapartofit.squarespace.com

01:17:07   to learn more and apply.

01:17:08   So thanks a lot to Squarespace once again.

01:17:10   Remember, squarespace.com, offer code critical.

01:17:12   - If you've been hearing Squarespace ads on podcasts

01:17:15   for months but not going to the site to sign up,

01:17:17   now's the time to do it with my code to show the other two.

01:17:20   (laughing)

01:17:22   - I wonder if I can get them to tell me,

01:17:24   like in a couple of months, tell me which of our codes won.

01:17:26   - Which of our, yeah, and it's too late for you two.

01:17:28   So now everybody pile on my code

01:17:29   so when Marco gets his information,

01:17:31   I will be the overwhelming winner.

01:17:33   Well, I don't think our codes stop working. I'm pretty sure the codes still work. It's

01:17:36   just, you know, the--

01:17:37   I don't know. That would be bad. Because then, like, whoever had the first code would have

01:17:41   the accumulation of, you know, they had months of users signing up. Anyway, everyone used

01:17:46   my code.

01:17:47   So, Casey, was the--after all this time, was the Methodologies talk on this show everything

01:17:54   you hoped it would be?

01:17:56   No, it actually really wasn't as exciting as I thought.

01:18:00   You think it's over, though? Are we done?

01:18:02   Are we? I don't know. We don't have to be.

01:18:04   Oh, we don't have to be. I don't really have that much more to say, to be honest, after

01:18:07   all that.

01:18:08   Well, I have a question for each one of you.

01:18:10   I mean, we are going to have nine episodes of follow-up.

01:18:13   Maybe Marco can abstain, but what methodology or part of a methodology have you found to

01:18:19   make the biggest improvement in how you feel about coding or how you write code?

01:18:27   So, Marco, do you have an answer for that?

01:18:29   It doesn't have to be a formal one, just anything you've done.

01:18:33   Honestly, what has improved my code the most by far has been open sourcing it.

01:18:41   Not necessarily because of the contributions I get, which are good, but I don't get a whole

01:18:44   lot of them.

01:18:46   It's mostly that if I know I'm going to be open sourcing it, I hold myself to a higher

01:18:50   standard and I reconsider my decisions more.

01:18:55   So by editing myself more and by pushing myself to a higher level of discipline for these

01:19:01   very important modules that I'm open sourcing, that has helped me dramatically.

01:19:07   And as an independent developer, I don't really have code review, I don't have pair programming.

01:19:15   There's a lot of methodology type stuff that I just can't do as one person. And a lot of

01:19:21   of the stuff I could do, like having tons of tests, I could do a lot of that stuff,

01:19:26   but it would take so much of my time and it would slow me down so much that it might not

01:19:32   be worthwhile or practical for me to do that. And so I don't do a whole lot of the other

01:19:38   stuff, but certainly, yeah, open sourcing by far, that's helped me more than anything.

01:19:44   So for me, I would say that I think Scrum, which is to say the thing where you get together

01:19:54   either physically or on the phone for five to fifteen minutes each day and talk about

01:20:01   what you've done, what you're doing, and what stands in your way, that sounds wonderful

01:20:07   in theory.

01:20:09   And in practice, never works.

01:20:13   Even in my company where we take Agile extremely seriously, and of course all the people who

01:20:20   think I'm wrong about everything are laughing right now, but we do take Agile very seriously.

01:20:24   And whether or not you think I know what I'm talking about, I assure you that they

01:20:27   do.

01:20:28   And so they take Scrum very seriously, and we have Scrums every single day.

01:20:32   And even though we take all this stuff so seriously, Scrums never last the 15 minutes

01:20:38   they're supposed to last, and they always go into tangents that they're not supposed

01:20:41   to go in.

01:20:43   And so Scrum has never helped. However, that one project where we had that perfect storm

01:20:49   of willing and capable developers, willing and capable QA, willing and capable PM, and

01:20:56   willing and capable product owner/client, when all of us really went all in on agile

01:21:03   and really bought into it and really believed in it and really took it seriously, it was

01:21:10   fantastic because it allowed us to roll with the client's ever-changing requests,

01:21:18   which to be fair, they were actually very good to us and didn't really change things

01:21:21   that often. But when they did, the product owner, the client would come to us and say,

01:21:26   "Oh, I really want to do this thing. How many points is that?" And we'll go and talk

01:21:31   for a few minutes. "Okay, it's eight points. Oh man. All right, let me figure out what

01:21:35   eight points I want to get rid of and I'll get back to you." And we didn't have to

01:21:39   argue with them. There was no scope creep. There was no, "Well, if you'd like this to happen,

01:21:46   you're going to have to take some other stuff out." "Well, how much other stuff?"

01:21:49   "We don't know. You're just going to have to take out other stuff." "Well, I need to know how much..."

01:21:53   None of that awkward conversation happened. They took it upon themselves to realize, "Well,

01:21:57   you know what? You've told me that this thing that I really want is eight points. And so I know that

01:22:05   I need to take away eight points

01:22:07   from what's currently on the docket."

01:22:09   And oh my God, it was so wonderful.

01:22:11   I mean that genuinely.

01:22:12   It was so wonderful.

01:22:14   And I think this is exacerbated by the fact

01:22:16   that in consulting, you have this client versus,

01:22:21   well, not versus necessarily,

01:22:23   but you have your client and your own team.

01:22:26   And sometimes that can be an adversarial relationship.

01:22:29   But when we were all on the same page with Agile,

01:22:32   It was so wonderful because we were truly, honestly partners in getting this project done, and it was great.

01:22:38   And so to answer your question, Jon, the one time of the 10 or so years I've been working,

01:22:43   the one time that Agile really, really, really, really stuck, it was incredible.

01:22:50   But to be fair, I've tried Agile many, many other times, and it hasn't really worked out that well.

01:22:55   And at best, it was a distraction, and at worst, it was a hindrance.

01:23:01   Yeah, I would say for me, it's hard to pick,

01:23:03   because a lot of things-- like, there's

01:23:05   a lot of steps and things that I've done that have helped.

01:23:08   I mean, I recognize Marco's thing with open sourcing stuff.

01:23:12   Like me, having a lot of my code be open source

01:23:14   early in my career, though now it is super embarrassing

01:23:17   that it's out there and it's terrible.

01:23:19   That really helped for all the reasons Marco said.

01:23:21   Like, you just feel the pressure to make it better.

01:23:23   And especially when you're a young developer working

01:23:25   on your own, like I was on just like random open source stuff,

01:23:29   you need some kind of external motivation because you don't have a boss telling you to do it,

01:23:33   or maybe you don't have other experienced programmers trying to get you to do stuff.

01:23:36   But I think the thing that has made the biggest impact on me in terms of how I develop software

01:23:42   was as I started to do more and more testing. And not in any particular thing, not particularly

01:23:49   test-driven, not particularly even unit tests versus integration tests or his function tests,

01:23:52   test-driven, not particularly even unit tests versus integration tests or functional tests,

01:23:59   when I think it was like one or two jobs ago, at some point there was some fairly complicated

01:24:03   project that they wanted with like, you know, just a big long description of how it's supposed

01:24:09   to work written entirely by non-technical people. So of course they have no ideas about feasibility

01:24:13   or anything like that. And they needed it in a super short time frame. And they're like, you

01:24:18   know, this is super important. We know it's really complicated, but like, you know, we had all these

01:24:21   meetings about it. Like, what's it going to take to get this thing done? And my reaction to being put

01:24:26   in that situation to be the lead guy on this project was immediately to revert to kind of

01:24:34   like, not test driven entirely, but to say, I'm going to need a massive amount of tests.

01:24:39   More testing than that: my tests have to be great, and they have to be awesome, and I have to really

01:24:42   concentrate on testing because that's the fastest way to get this thing done on time with the fewest

01:24:47   bugs, so on and so forth. And the fact that that was my reaction shows that I had sort of learned

01:24:53   through bitter experience that testing is not like eating your vegetables, it's not like a

01:24:58   luxury that you can only afford to do if you have the extra time or whatever, but rather

01:25:03   when you're under the gun is when you really need to pull that out of your back pocket and not be

01:25:07   religious about it and say like, "Oh, I'm not going to take a step until I've got a failing test," and

01:25:14   all this other stuff. Not that crazy, but just for me to feel confident that I can move at my

01:25:19   fastest pace, I have to be sure that the code I'm writing is correct and the code that I've written

01:25:24   remains correct during this entire development process. And I've never been under quite those

01:25:29   same constraints before, but now, like, that's an example of a turnaround in methodology.

01:25:33   I think people who do pair programming have the same feeling sometimes where it's like,

01:25:37   I'm not going to do that all the time as the most extreme thing, but when push comes to shove,

01:25:41   I know which dials I can turn on myself to get my best performance, and I will choose from those

01:25:46   things. And I think testing is the one that probably, I don't even know how, it probably

01:25:51   came up because of the test driven development hype and everything, but I would not have thought

01:25:55   of that on my own to be the thing that I should do when I was a young programmer, but eventually came

01:26:00   to have it as one of the tools in my toolbox. And I lean on it heavily now and I try not to preach

01:26:05   it to other people, but I say it should be something that you do from time to time to know how it

01:26:10   affects your work. Yeah, I agree. And it's funny because formal unit testing,

01:26:16   is something, and also formal integration testing,

01:26:19   is something that I find often gets punted

01:26:23   if you're running out of time or budget.

01:26:25   Additionally, performance testing is another example

01:26:29   of something that gets punted if you're running out of time

01:26:31   or running out of budget.

01:26:33   But all of those things are extraordinarily important

01:26:37   to give a deliverable that you're really truly proud of.

01:26:40   And it's a hard thing, man, when somebody's looking at you,

01:26:45   be it a project manager or a client, saying, "Oh my goodness, you really, really, really

01:26:50   need to get this thing shipped."

01:26:52   And you say, "No, my test classes aren't complete yet.

01:26:57   There is not enough code coverage, so you need to leave me alone."

01:27:00   It's a hard thing to sell, and it's a hard thing to say, but so often, if you don't get

01:27:05   that right up front, you'll pay for it later.

01:27:08   Same thing with performance testing.

01:27:09   "Oh, well, you know, we don't need to worry about that.

01:27:11   We shouldn't have but five users at a time."

01:27:13   Something weird happens, and the next thing you know you have 500 users at the same time and your

01:27:17   website comes to a screeching halt. Yeah the key thing is to recognize when each tool is appropriate

01:27:24   so for the example that I cited, I was handed a big giant, you know, Word document, it's always

01:27:28   Word documents, of this complex system, not written as a programming spec by any means but merely

01:27:34   written as, like, you know, a fantasy scenario: wouldn't it be cool if, and what about this, and

01:27:40   this would do this and this would do that, really complicated stuff, it involved tables or whatever;

01:27:43   it was kind of like unintentional waterfall, where it was like a big idea of a complex system

01:27:49   that is just now in the kind of blue-sky stage, but then this software needs to be shipped

01:27:55   in like a very short period of time, like less than a month. And that's a case where you can say,

01:28:01   "Look, if you agree this is how this needs to work, and you think you've got it all down,

01:28:06   the fastest way for me to do this is to lean heavily on testing, because there are so

01:28:12   many complicated scenarios that I need to run through, and it's not like I'm going to get 100%

01:28:16   coverage, it's not like I'm going to test every possible iteration of input, but at the very least

01:28:21   this document here describes many different situations and how they interact, and I can test

01:28:25   every single one of those things. It's the only way I'm going to be able to ratchet my way through

01:28:29   this code, and that's different than a situation of like, "well we're not quite sure what we want to

01:28:33   make it, but it's going to kind of be like this," and in that case writing all these tests would just

01:28:36   be like pinning yourself down with spiderwebs while you're trying to to move to an unknown

01:28:41   destination. So you have to recognize like, when do I have enough information to really pin this

01:28:45   down with tests right now? Or when do I have to do kind of a more fast and loose agile type thing of

01:28:50   oh, just get something up and running. And we'll look at it, and we'll poke around that. And we'll

01:28:52   see. Like, in that case, you're wasting your time writing and changing and writing and

01:28:57   changing tests for something you're not even sure how it's supposed to work. So in all these cases,

01:29:01   like you can't take you can't use the same tool for all different situations, you have to know,

01:29:05   like, for example, in Marco's case, when he's doing the UI, like, oh, let's look over there,

01:29:10   how does this transition look? Maybe this button should be over there. Maybe this actually shouldn't

01:29:12   be a button. It should be a slider. Let me try this with a gesture." Writing tests for all those

01:29:17   cases would just slow him down. It would not help. Whereas, for example, the sync code, which can be

01:29:21   largely faceless, and very complicated in many different states, is an ideal opportunity to do

01:29:27   a very complicated series of data-driven tests and maybe even some fuzz testing to be confident

01:29:33   that this tiny little kernel of stuff that you've made works the way you expect it to.
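
(A sketch of what that data-driven, fuzz-style testing could look like for a faceless sync primitive, again with hypothetical types rather than Overcast's real ones: instead of hand-picking a few examples, generate lots of random states and assert the invariants you actually care about.)

```swift
import Foundation
import XCTest

// Hypothetical faceless sync primitive: last-writer-wins merge of per-episode state.
struct EpisodeState: Equatable {
    var position: Double
    var modifiedAt: Date
}

func merge(_ a: EpisodeState, _ b: EpisodeState) -> EpisodeState {
    return a.modifiedAt >= b.modifiedAt ? a : b
}

final class SyncMergeTests: XCTestCase {
    // Fuzz-ish and data-driven: lots of random pairs, checked against
    // invariants rather than a handful of hand-written cases.
    func testMergeInvariants() {
        for _ in 0..<1_000 {
            let a = EpisodeState(position: .random(in: 0...3600),
                                 modifiedAt: Date(timeIntervalSince1970: .random(in: 0...1_000_000)))
            let b = EpisodeState(position: .random(in: 0...3600),
                                 modifiedAt: Date(timeIntervalSince1970: .random(in: 0...1_000_000)))
            let merged = merge(a, b)
            XCTAssertTrue(merged == a || merged == b)                      // never invents data
            XCTAssertGreaterThanOrEqual(merged.modifiedAt,
                                        max(a.modifiedAt, b.modifiedAt))   // never goes backwards in time
            XCTAssertEqual(merge(a, a), a)                                 // idempotent
        }
    }
}
```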

01:29:37   Oh yeah, and one of the reasons why I don't write a lot of tests is because most of the

01:29:42   code I write, and especially most of the difficult, tricky code I write, is the former, which

01:29:49   is, it's UI stuff that's much harder to test and write tests for and maintain those tests

01:29:55   as I change things, and as I refine the design, as I add buttons and features and move stuff

01:30:00   around, it's much harder stuff to test for. I don't end up writing a lot of the easily

01:30:06   testable faceless module math code. That's very unusual for me to write a lot of that.

01:30:10   Well, you've got the F-something DB. It's not FMDB, it's FCDB?

01:30:15   It's FCModel. And somebody else wrote tests for that. This is why open source is great.

01:30:20   **Matt Stauffer** Yeah, but I'm saying that's another thing that open source does, because

01:30:24   most open source projects have a culture... I was going to say most of them have a culture of testing,

01:30:28   but I'm always depressed when I download something, untar it, run configure and make, and then run

01:30:34   "make test" and it says, "No, 'make test'? I don't know what you're talking about." And I'm like,

01:30:37   "Oh, come on, guys." But yeah, in most of the newer software communities, in dynamic languages especially,

01:30:44   and even JavaScript stuff, testing is part of the culture. And if you release something

01:30:47   as open source, it's expected this social pressure to say, "Where is the test suite? How can I tell

01:30:53   that this works on my system? I want to immediately run the test suite."

01:30:56   I mean, Perl is super crazy on the testing thing where it won't even install modules if they don't

01:31:02   pass their tests unless you know the secret incantation to tell it don't bother running the

01:31:05   tests. And that runs into all the anti-patterns that I talked about, like, well, tests are

01:31:09   software too, and the people who wrote them aren't infallible, so a lot of times the module won't

01:31:13   install because the tests won't pass, but the tests are wrong because they were written by

01:31:17   another fallible human. And you can go too far in that direction as well. But overall,

01:31:22   it's a good idea to test until it hurts, basically, and then back it off one notch.

01:31:30   Oh goodness. Alright, so are we good?

01:31:37   Thanks a lot to our three sponsors this week, Influx by Bravewave, Ting, and Squarespace,

01:31:44   and we will see you next week.

01:31:46   [music]

01:31:47   Now the show is over, they didn't even mean to begin, 'cause it was accidental. Oh, it

01:31:56   was accidental.

01:31:58   John didn't do any research.

01:32:00   Marco and Casey wouldn't let him.

01:32:03   Cause it was accidental.

01:32:05   It was accidental.

01:32:07   It was accidental.

01:32:09   And you can find the show notes at ATP.fm.

01:32:14   And if you're into Twitter, you can follow them at

01:32:19   C-A-S-E-Y-L-I-S-S. So that's Casey Liss, M-A-R-C-O-A-R-M-E-N-T, Marco Arment, S-I-R-A-C-U-S-A, Siracusa.

01:32:49   Thank you for letting me have my moments. And yeah, I wanted to talk about it too, but I think the

01:32:53   problem is software methodology is too big a topic. It's like "software development." So that's

01:32:57   going to be the development show: "Right. Discuss it. Software. What do you think, guys?" Yeah. And

01:33:02   the other thing is, it occurred to me that I was describing my experience with Agile, but I know

01:33:10   that we're going to, or I'm going to get so much email. Because, "Oh, that's not how Agile is. Oh,

01:33:16   you're calling Scrum Agile and Agile Scrum and you don't know what you're talking about." Oh my god.

01:33:20   I am not looking forward to that at all. I'm just glad because last week when I asked people to

01:33:25   tell me what their organization did about warnings in production, I don't think we've ever gotten

01:33:31   that much feedback. And it was like, seriously, every time I check my email all day long,

01:33:36   there'd be three more and they were each five paragraphs long. It was actually a lot to keep up

01:33:42   with. Well, when you solicit, like, "Tell me about the problems that you have at your work,"

01:33:46   people will respond. We didn't even solicit feedback in the show about software methodologies.

01:33:50   Well, we'll get the things of like, "If you were doing it right, you would have put the butter in

01:33:55   the coffee," or whatever, the whole nine yards. It's kind of like, "No, you don't understand. My

01:33:59   religion is the true religion, and you're doing it ever so slightly wrong. If only you didn't eat

01:34:03   shellfish, you'd be fine." That's what I was trying to get at the beginning: the whole point

01:34:09   of software methodology is to attempt to impose order

01:34:14   and to make things as good as they could ever possibly be.

01:34:17   Like, no software methodology can possibly do this.

01:34:19   Like, there is no silver bullet.

01:34:22   There's nothing you can do to make it as good

01:34:24   as it can possibly be.

01:34:25   You just have to decide what is the appropriate level

01:34:28   of annoyance for the given task at hand.

01:34:30   And then there are many things to choose from,

01:34:32   and some are better than others.

01:34:33   And people will write in to tell us that.

01:34:35   But I don't think we're going to get any absolutists who say,

01:34:37   like, you know, pair programming is the only way to do programming.

01:34:40   Anyone not doing pair programming is doing it wrong, and there's no way anyone who's

01:34:44   not doing pair programming can be as good as someone who's doing pair programming.

01:34:47   Oh, you say that.

01:34:48   You say that.

01:34:49   Well, I don't think we will, because they won't listen to our show.

01:34:52   The thing is, that's the thing we didn't discuss here.

01:34:56   Unless you have a time machine, you can't A/B test these things, because software is

01:34:59   like, it's always different.

01:35:02   You have to get the same exact people solving the same exact problem in the same exact environment

01:35:05   with the same exact knowledge.

01:35:06   and by the time they're done with that project,

01:35:07   they have new knowledge, and you can't find people

01:35:09   who are identical who have this,

01:35:10   like it is impossible to validly A/B test

01:35:12   because everything software engineers do

01:35:15   is novel in some small way.

01:35:17   - Yeah, that's one of the problems

01:35:20   with a lot of these things.

01:35:21   Like, you know, and this, I mean, this is a problem

01:35:23   in so many fields and so many actions,

01:35:27   but like when you can't really scientifically test

01:35:31   this stuff very easily, if at all,

01:35:34   then you might do something in your methodology

01:35:37   and ascribe your success or failure to that thing

01:35:41   when in fact it had little to do with it

01:35:42   or nothing to do with it.

01:35:43   And that's why I look at so many of these things

01:35:46   with skepticism, like okay, you did it this way

01:35:49   and it worked, but if you did it some other way,

01:35:51   would it have also worked?

01:35:52   And did it work because of the way you did it

01:35:55   or did it work because of the people

01:35:56   and the conditions they were in?

01:35:58   And there's all these uncertainties.

01:36:00   like I recently got into the world of high-end headphones.

01:36:04   And I mean, the audiophile world is comical

01:36:08   in the things that it believes in.

01:36:10   And I go on HeadFi,

01:36:13   the biggest headphone enthusiast forum,

01:36:16   and I have trouble relating to the people there

01:36:19   because they're all talking about

01:36:20   how they upgraded the cable to this new cable

01:36:23   and how they can hear all sorts of things

01:36:25   that cables can't actually make you hear.

01:36:27   Cables don't vary that much, and it's all psychological, and you're

01:36:32   spending $400 on a headphone cable. And like, I can't understand these people because I

01:36:38   know scientifically like what they're thinking is invalid and what they're saying is wrong

01:36:44   and they're wasting their time and money doing things that are just totally placebo. And

01:36:50   I wonder how much of the methodology stuff is that same way where you know, you can't

01:36:56   really say that helped you. Yeah, I think individual people can tell which parts of a

01:37:03   methodology they're using are helpful to them, basically by gauging their own performance.

01:37:07   They're like, "I've gotten similar problems like this in the past, and I performed this

01:37:12   way on it, and it's taken me this long, and I've had this much difficulty, and I know

01:37:16   when they told me to do this, I felt that I was going to have difficulty with it, but

01:37:20   I did this new thing this time, and it helped me do it better." And that's on an individual level.

01:37:24   Like, I decided to use testing, or I had a friend look at my code

01:37:27   before I checked it in, so we kind of had informal code

01:37:29   reviews, or I clarified the requirements.

01:37:31   Like, that's just-- becoming a programmer

01:37:36   and having experiences, you learn

01:37:37   what all those little things are.

01:37:38   And that's also part of the frustration,

01:37:40   is if you're put into an organization

01:37:41   where you're told to do x, y, and z,

01:37:43   but not this thing that you know that helps you,

01:37:45   and you're like, oh, well, I really

01:37:47   do better when I'm pair programming,

01:37:49   or I can't stand pair programming,

01:37:50   it makes me unable to work.

01:37:52   But the organization wants to have rules for everybody.

01:37:56   They don't want you to just do whatever the hell you want,

01:37:58   because then they feel like they can't manage anything.

01:38:00   And that's the difficulty of the situation.

01:38:02   Even without A/B testing, I feel like individual programmers

01:38:05   can get a feel for the things that help them

01:38:07   and want to repeat that as appropriate.

01:38:09   Did you see that link up in the chat room, by the way,

01:38:12   with the audiophile Ethernet cable?

01:38:13   - Oh my god, yes.

01:38:15   - Dalton posted that to app.net today,

01:38:17   and he said he couldn't tell whether it was a joke

01:38:19   or not either, and I can't tell either.

01:38:21   I can't like this.

01:38:21   That's the thing about these sites.

01:38:23   It seems like it's gotta be, but then you never know.

01:38:27   - Anything that is made for audiophiles

01:38:29   that promises better sound quality,

01:38:31   that has no basis in science,

01:38:33   has no basis in ABX testing, and costs a lot of money,

01:38:37   is probably not a joke.

01:38:38   That's the law of audiophile crap out there,

01:38:41   is like, yeah, it's out there, they're serious,

01:38:44   and people are actually buying it.

01:38:46   Probably not a lot of people,

01:38:47   but enough people to make it worth selling

01:38:49   a thousand-dollar six-foot cable.

01:38:51   Yeah, that is absolutely ridiculous.

01:38:52   That's gotta be a joke, come on.

01:38:54   Hey, do we want to do a quick accidental neutral on CarPlay, or do we want to save that for

01:38:58   next week?

01:39:00   We can do it.

01:39:01   Honestly, I really don't think there's much to say about it yet.

01:39:04   I think—

01:39:05   Why do you even say that?

01:39:06   You know how this goes.

01:39:08   That's the Top Gear equivalent of how hard can it be.

01:39:11   Right, exactly.

01:39:13   Yeah, I mean, we knew about what they were calling iOS in the car last summer.

01:39:19   and we knew this was in the works,

01:39:20   and there's some stuff in the iOS 7.1 SDK

01:39:24   that's in beta still.

01:39:25   There's some stuff in there

01:39:27   that looks like it's probably related.

01:39:29   And so this is not really a huge surprise.

01:39:33   And then even like a month ago,

01:39:36   I think it was Steve Troughton-Smith,

01:39:37   one of the good iOS hackers, who was leaking

01:39:40   like screenshots of the simulator running

01:39:44   with the CarPlay interface.

01:39:46   Anyway, so it's not big news that this is happening.

01:39:51   The important news, I think, is that,

01:39:55   is twofold.

01:39:57   One, that Apple's actually kind of working

01:39:59   with the car manufacturers to give a variety

01:40:01   of input methods.

01:40:02   The Ferrari one that was demoed was a resistive touchscreen.

01:40:05   The Mercedes one used a control wheel

01:40:07   and the screen wasn't a touchscreen at all.

01:40:09   And then some other one used a touchscreen.

01:40:11   And so there's, like they're showing off,

01:40:13   there's a variety of different screen inputs,

01:40:15   which is important because if you look at the car screen control landscape, as we've

01:40:18   discussed many times, it's all over the map. And not everybody uses touchscreens; our beloved BMW

01:40:26   doesn't use them at all. Lexus also stopped using them recently. I believe

01:40:30   that will probably also filter to Toyota if it hasn't already, where these like little

01:40:35   remote knobs or pointer things are being regarded

01:40:40   as safer than touching the screen, and they certainly work a lot better. So it's nice

01:40:44   that Apple is kind of adapting to what's out there to a degree. It's also not surprising

01:40:50   that what they want is, and what CarPlay seems to be, is literally AirPlay, basically. It's

01:40:58   like, iOS is creating the entire interface as a video and streaming it to the car's navigation

01:41:04   system and taking over the video screen.

01:41:06   Well, it's two-way AirPlay because it takes input in the other direction.

01:41:09   Yeah, yeah, yeah, sure.

01:41:11   So it's basically AirPlay video output, at least, so that--

01:41:14   yeah, so Apple's just taking over the screen.

01:41:17   So it's not surprising they're doing this.

01:41:21   It's not surprising they got a few car manufacturers to say,

01:41:24   we will do this soon.

01:41:26   What will be news-- and this is what I said in my post about

01:41:28   it-- what will be news is when a lot of cars have shipped

01:41:32   with this, because that's still up in the air.

01:41:34   That could still not happen, or that could still take a long

01:41:37   time to happen.

01:41:38   And if a few car makers introduce it on a few of their high-end models for model year

01:41:46   2014 and beyond... For instance, BMW has this thing called BMW Apps; if you want

01:41:53   that in your car, you have to pay extra money in most cases on most of the models.

01:41:57   And so if the other manufacturers offer CarPlay, but it's a $500 option or a $200 option,

01:42:04   if it's an option at all, if it costs extra money at all,

01:42:07   and it's only on certain models,

01:42:09   How long is it going to be before a lot of cars

01:42:12   actually have this on the road?

01:42:14   That could be a while.

01:42:16   So there's two sides.

01:42:17   As a user, if you buy a car with it, that's great.

01:42:20   It works for you.

01:42:20   As a developer, I think it's potentially

01:42:23   a lot less interesting for a while

01:42:25   unless it gets a meaningful install base.

01:42:27   I was kind of depressed by the announcement,

01:42:29   because what it showed me is that, like television thus far,

01:42:34   cars are another area where

01:42:38   Apple is not going to be able to make things better as much as we want them

01:42:40   to.

01:42:41   Like television, they have the little puck, and it does some things,

01:42:44   but it doesn't solve the whole television problem.

01:42:46   And you're like, oh, I was really hoping Apple would just go in there

01:42:48   and solve everything like they did with music, but they're not.

01:42:51   And in cars, same thing.

01:42:52   They're coming into-- we know, like television,

01:42:54   it's all a screwed up situation, and everything there is screwed up.

01:42:56   We're like, oh, god, Apple, please save us from these terrible car

01:42:59   interfaces, and they can't.

01:43:00   They just can't.

01:43:01   What they're doing is saying, please

01:43:05   let us have access to your existing screen

01:43:07   where you run your existing software

01:43:08   and let us take over it briefly.

01:43:10   But that's like saying, "Please, Jony,

01:43:13   design the iPhone for us, but all you get

01:43:16   is what's on the screen.

01:43:17   We're going to tell you what's surrounding it,

01:43:19   whether there's any buttons, what they look like,

01:43:21   how big it is, how much it weighs,

01:43:22   what the battery life is.

01:43:23   But make a great product for us."

01:43:25   And as we've discussed when we discussed car interiors --

01:43:27   was that on Neutral or on this show? I don't remember. It was Neutral,

01:43:30   I believe. Like, the things that make a car interior, it's a holistic thing. It's

01:43:35   everything about it, from the driving position and the visibility to how many physical

01:43:40   controls there are, to how many virtual controls, to how they interact with each other, to the

01:43:44   software, the hardware. Like, it is one big whole thing. And it's like, unless Apple can

01:43:49   design that entire thing, they're not going to be able to ever really solve the problem.

01:43:52   And they can't design the entire thing because they're not a car maker, the same way they

01:43:54   can't fix television, because they don't own all the programming and they don't own all the

01:43:58   television sets. And, you know, I guess they're doing the best they can. In an ideal scenario,

01:44:03   the integration required for this will shrink to some ten-cent chip that everybody just has

01:44:08   in their cars, and they use that same chip to do interaction with Android and whatever else, and

01:44:13   it's just like, well, when you buy a car in the future, whatever smartphone you have -- which we'll

01:44:17   just call phones, because all phones are smartphones by this point -- will be able to throw up its UI on

01:44:22   the screen and make something out of it. But that interface is always going to be less

01:44:27   satisfactory than it would be if there was one design thought for the entire thing. You

01:44:32   can't just make some rectangle on your car and say, "This is the rectangle where smartphone

01:44:36   stuff happens," and maybe there's some buttons on the steering wheel that activate and do

01:44:39   stuff. Like, you are never going to get the right, really good interface, with interaction

01:44:44   between everything, unless someone is designing the whole thing. And I believe car manufacturers

01:44:48   are not thus far capable of designing the whole thing, and Apple is not allowed to design

01:44:52   the whole thing. So we're kind of stuck with this crap for now, I guess.

01:44:56   Yeah, and the other thing I've noticed is I had a heated but friendly debate with Dave

01:45:02   Nanian, who writes the excellent software SuperDuper, which if you don't have, you

01:45:07   should. And he was extolling the virtues of the Tesla interface. And I have not interacted

01:45:16   with it myself, but he was saying that it's extremely good and it's touchscreen done right.

01:45:23   And I have, on Neutral, quite vehemently said I hate touchscreens in cars. And I stand

01:45:30   by that to this day, but I've not seen the Tesla interface. And Nanian, among others --

01:45:36   he wasn't the only one, but he was an example of somebody who came out of the woodwork saying,

01:45:39   "Oh my God, really, the Tesla interface is so good and this is so bad." And in a

01:45:44   series of tweets -- or this might have been a direct message between the two of us --

01:45:47   he said to me, "It's like Mac OS on the phone. Do we want that?" Or another example

01:45:53   is iOS on the Mac. Is that really what you want? And I'm not sure if I agree

01:45:58   with him or not, because although I really don't like the name CarPlay, the screens I

01:46:04   saw I liked. But his point is absolutely interesting and worth thinking about. Like, is this really,

01:46:11   is this really the right answer?

01:46:13   Yeah, I think the right answer has to involve more than just that screen.

01:46:17   Just look at the little home button, like the UI home button.

01:46:20   Why is that there?

01:46:21   Because they can't make a hardware home button.

01:46:23   Because they're confined to that little screen.

01:46:26   It's sad that their interface is probably going to be better than the iPod integration

01:46:29   that car makers have been shipping for years, simply because, look, it's not rocket science,

01:46:33   people.

01:46:34   Can't you just integrate?

01:46:35   We gave you this interface to integrate with our iPods.

01:46:38   I've got it in my Honda.

01:46:39   You can show the track name, you can go next and previous track.

01:46:42   But you still did it badly and you got the details wrong.

01:46:44   So just give us a screen and we'll do an iPod,

01:46:46   and we'll do Siri integration.

01:46:48   And like in many respects, as many people pointed out,

01:46:51   there are advantages to this over the existing systems

01:46:53   because like now I can upgrade my phone

01:46:54   and I automatically get something better with CarPlay

01:46:56   and all that stuff.

01:46:57   But if, for example, your car has a really slow UI

01:47:02   and CarPlay is like laggy on it,

01:47:04   or if you buy your 200 grand Ferrari

01:47:06   and it's got a resistive touch screen,

01:47:07   upgrading your phone's not gonna fix

01:47:09   the resistive touch screen.

01:47:10   And upgrading your phone's not going to

01:47:11   add a physical button. And if your car UI has one touchscreen, and they use that

01:47:15   touchscreen for climate controls, but then you have the CarPlay thing on it,

01:47:18   you have to get out of CarPlay just to change the temperature, and you wish they

01:47:20   just had a knob -- it's not gonna fix that either. So I really, really wish there

01:47:25   was, like, sort of a single-minded, holistic approach to the interface. And

01:47:29   that's why I think Dave likes his Tesla so much, because that's the case where

01:47:31   the car maker does do the whole thing, and they do concentrate heavily on

01:47:36   touchscreen-type interfaces, and it does come together in a manner better than

01:47:39   most other cars. But I mean, the Tesla, for example -- I think it should have more

01:47:43   physical controls, and it's definitely version 1.0 software, and it's not as

01:47:46   snappy as an actual iPad is. And it's like, we see that the technology for

01:47:50   all this is there; it's just not the correct arrangement of the people

01:47:54   with the technology and the people who make the cars to make it happen.

01:47:59   Yeah, I don't know. I'm very, very intrigued by this. I think it has a lot of

01:48:04   potential. One of the things I'd noticed is that, to me, the look of iOS 7 made a

01:48:10   lot of sense on screen. I don't know why. Not to say that I

01:48:17   dislike it on iDevices, but I actually really, really, really liked the look in

01:48:23   the Ferrari and Mercedes and Volvo video demos. The bar is low.

01:48:29   I mean, I could show you some screenshots of my Honda's UI. It

01:48:34   looks like a web page from 1995. It's not good. My favorite part of the Ferrari video

01:48:39   was when the Apple rep who was demoing it, she's showing that you can switch back to the Ferrari

01:48:46   interface. She's like, "Well, if you want, you can tap here to switch to the FF interface,

01:48:52   but we'll just go right back." She showed it for a second and it just looked awful. And this is a

01:48:59   brand new Ferrari, and the interface looked horrendous, and then she switched right back to this nice,

01:49:04   clean Apple world.

01:49:05   Yeah, because Ferrari's expertise is not in user interface design for the touch screens

01:49:10   in their cars.

01:49:11   That is not where their strengths lie, and it's never going to be.

01:49:15   In this respect, I bet Ferrari's like, "Oh, finally, someone could take over some of this

01:49:18   stuff."

01:49:19   But still, it's like not everyone's going to have a phone or plug it in, so you have

01:49:22   to have an interface there.

01:49:24   The standard is so low for these interfaces that they're--

01:49:29   I don't know what the solution is.

01:49:31   Like, again, Tesla, at the very least,

01:49:33   they're taking it seriously and saying,

01:49:35   we're going to have a different UI.

01:49:36   We're going to take ownership of it.

01:49:37   We're not just going to have some sort

01:49:39   of generic touch screen, and we're

01:49:41   going to have some third party write some software

01:49:43   and put our logo above it in a different background color

01:49:46   and call it a day.

01:49:47   I know other companies do much more than that,

01:49:49   but it's a difficult problem.

01:49:51   It's difficult on many levels to figure out what

01:49:53   the best balance is between physical controls and touch controls and all the modern features

01:49:56   people expect and integration with their devices. So I think CarPlay is a step forward, but it's

01:50:02   a small one. Like Apple TV was a step forward too, but it's a small one.

01:50:05   I think it's worth asking what problem this

01:50:10   solves for everyone involved. Now, you know, to me, a lot of the audio functions of it

01:50:16   are solved perfectly well by Bluetooth. And Bluetooth is extremely convenient, because when

01:50:22   done right, and it's usually done pretty well in cars, surprisingly, because there's not

01:50:26   much to it -- when done right, I have Overcast on my phone, and if I'm in my house, and I'm

01:50:34   washing dishes or whatever, I'll play a podcast. And then if I go to my car to go to the store,

01:50:38   I get in the car, turn the car on, and just start driving. And a few seconds later, my

01:50:43   podcast starts playing through the car speakers. When I turn the car off, it pauses and saves

01:50:47   my position. And then when I go back inside, I can resume it and play inside. And all of

01:50:54   that works without having to plug the phone into the car, without having to wait, without

01:50:58   having to launch anything. It just, I get in the car, the phone happens to be in my

01:51:02   pocket, it's always in my pocket, and it just works.

01:51:04   Do you have a way to pause the audio?

01:51:06   Yeah, through the car's built-in wheel controls and stuff. Because, you know, because the

01:51:11   car, using the Bluetooth protocol, the car can signal basic things like play/pause,

01:51:15   you know, track forward, track back, stuff like that.
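
[Editor's note: for anyone curious how those steering-wheel and Bluetooth play/pause and track-skip signals actually reach an app: on iOS they arrive as remote-control commands, and the track name the car displays comes from the now-playing info the app publishes. This is a generic sketch in Swift using MPRemoteCommandCenter and MPNowPlayingInfoCenter; the PodcastPlayer protocol is a made-up stand-in, and none of this is Overcast's actual code.]

```swift
import MediaPlayer

// Made-up player interface; only the calls used below are assumed.
protocol PodcastPlayer: AnyObject {
    var isPlaying: Bool { get }
    func play()
    func pause()
    func skipForward(seconds: TimeInterval)
}

final class RemoteControlBridge {
    private let player: PodcastPlayer

    init(player: PodcastPlayer) {
        self.player = player
        let center = MPRemoteCommandCenter.shared()

        // Bluetooth controls, headset clickers, and the lock screen all land here.
        center.playCommand.addTarget { [weak player] _ in
            player?.play()
            return .success
        }
        center.pauseCommand.addTarget { [weak player] _ in
            player?.pause()
            return .success
        }
        center.togglePlayPauseCommand.addTarget { [weak player] _ in
            guard let player = player else { return .commandFailed }
            if player.isPlaying { player.pause() } else { player.play() }
            return .success
        }
        // Mapping "next track" to a 30-second skip is an app choice, not a rule.
        center.nextTrackCommand.addTarget { [weak player] _ in
            player?.skipForward(seconds: 30)
            return .success
        }
    }

    // What the head unit shows (title, progress) is whatever the app publishes here.
    func publishNowPlaying(title: String, duration: TimeInterval, elapsed: TimeInterval) {
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPMediaItemPropertyPlaybackDuration: duration,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: elapsed,
        ]
    }
}
```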

01:51:17   Can you change playlists and stuff? Because this is, like, the simplest possible scenario that you're describing, where you're in the middle of an audio file

01:51:23   and you know exactly what you're playing. But, like, I agree, that's a step up from what you would have had

01:51:28   before. But even a slightly more complicated scenario, like, I'm going on a trip and I want to hear this song or this playlist --

01:51:33   that's a place where CarPlay would have your back. Because, like, I know how difficult it is,

01:51:38   you know, using my stupid on-screen controls,

01:51:40   even if I have my iPod plugged in, to find the playlist I want, to select it, to, you know, enable or disable

01:51:47   shuffle, to do all that stuff with a touchscreen, with my stupid controls. It is terrible. And

01:51:54   Siri is probably also kind of terrible,

01:51:56   but it's better than -- I would rather

01:51:58   shout something out to my car and tell it what to play than do nothing.

01:52:01   But Bluetooth doesn't solve that at all, because Bluetooth doesn't give you, like,

01:52:05   here's a visual interface to all your playlists or all your songs, now go find the one you want, right?

01:52:10   It's merely like a wireless version of an audio cable.

01:52:13   And so CarPlay, I think, is a step up from that,

01:52:16   even if it lacks the seamlessness of you just being able

01:52:18   to get in and out of your car.

01:52:19   - Oh yeah, I mean, for podcasts,

01:52:21   I think the difference is a lot smaller,

01:52:22   because you tend to listen to one podcast for much longer

01:52:25   than you listen to one song,

01:52:26   and you tend not to have to do serious navigation

01:52:29   to find the podcast you wanna hear at this moment.

01:52:31   Like, generally, you put it on, and whatever's on,

01:52:33   you keep listening to,

01:52:34   'cause it's an hour and 50 minutes long.

01:52:37   So it's not that big of a problem for podcast listening.

01:52:39   I agree, you're right, and obviously,

01:52:41   a lot more people listen to music than podcasts.

01:52:44   But just pointing out that Bluetooth convenience

01:52:48   is pretty severe competition for this at the start

01:52:52   for audio functions.

01:52:53   And Bluetooth is also everywhere already.

01:52:56   Bluetooth is in so many cars now,

01:52:58   so many new models included, it's been around forever,

01:53:01   so there's so many cars already on the road with Bluetooth

01:53:03   that it really does solve a lot of these problems

01:53:05   in a pretty limited but very effective way.

01:53:07   So where CarPlay, I think, will really shine is navigation, in theory.

01:53:17   But I think in practice, this is still iOS Maps, iOS navigation, and it's either trying to replace

01:53:24   a car system, or trying to sit on top of a car system that's being ignored, or to be

01:53:32   present in cars that have screens but don't have nav, which is not a ton of them, but

01:53:36   certainly there's still some.

01:53:39   And is that really what you want?

01:53:43   I mean, usually a car's built in nav is not that bad,

01:53:47   and it's also all loaded, all offline,

01:53:50   all built in, ready to go,

01:53:51   it has the high precision GPS transmitter

01:53:54   or receiver on the car.

01:53:55   In my opinion, the GPS that's built in

01:54:00   to cars that have navigation is usually way better

01:54:04   and more consistent than the one in iOS 6.

01:54:07   Well, I mean, maybe in an M5, that's true.

01:54:09   But I think the big advantage it has is that as your car ages,

01:54:12   assuming you keep it for a while,

01:54:13   your phone will get better, and the navigation presumably

01:54:16   will get better.

01:54:17   You know what I mean?

01:54:17   Whereas whatever navigation your car came with,

01:54:19   that's the navigation it's going to have.

01:54:20   And that, in my experience, is what happens.

01:54:22   When I get into someone's car, and it's

01:54:24   like an eight-year-old car, their navigation

01:54:26   is gross looking.

01:54:27   And if they bought a modern car, it would be better.

01:54:29   But I think the key is that to do navigation well,

01:54:31   you need to integrate with the rest of the car.

01:54:33   Like Marco said, from the antennas that you're getting the signal from, instead of having a tiny little iOS device inside an armrest, plugged into

01:54:39   a cable, underneath a big steel roof -- all the way up to, like, you know, eventually we're gonna get,

01:54:44   yeah, the augmented reality stuff. You're gonna want to show stuff on the HUD.

01:54:46   You're gonna want the navigation to know -- maybe the navigation wants to know the speed from the speedometer and the angle of the wheels

01:54:51   instead of having to get that info from GPS and dead reckoning. Like, there's so much more you can do when you get navigation built

01:54:58   into the car than when all you get is, "I'm a phone,

01:55:01   I do navigation, and I can project my image onto the screen." Like, it's such a weak solution to a

01:55:06   common problem that cars have been solving for years on their own.
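
[Editor's note: since "dead reckoning" goes by quickly here: between GPS fixes, a navigation system can coast by advancing its position estimate from speed and heading alone, which is exactly the data John says the car could feed it directly. This is a generic textbook sketch in Swift, not code from any actual navigation system.]

```swift
import Foundation

// Minimal 2D dead reckoning: advance an estimated position using only
// speed and heading, the way a nav system coasts between GPS fixes.
struct PoseEstimate {
    var x: Double        // metres east of some chosen origin
    var y: Double        // metres north of that origin
    var heading: Double  // radians, 0 = due north, clockwise positive

    // speed in m/s, yawRate in rad/s (derivable from steering angle and wheelbase),
    // dt in seconds since the previous update.
    mutating func advance(speed: Double, yawRate: Double, dt: Double) {
        heading += yawRate * dt
        x += speed * sin(heading) * dt
        y += speed * cos(heading) * dt
    }
}

// Example: one second at 20 m/s (~45 mph) through a very gentle right-hand curve,
// integrated in ten 100 ms steps.
var pose = PoseEstimate(x: 0, y: 0, heading: 0)
for _ in 0..<10 {
    pose.advance(speed: 20, yawRate: 0.05, dt: 0.1)
}
// pose is now roughly 20 m north and about half a metre east of where it started.
// A built-in system gets speed and steering angle for free from the car's own sensors;
// a phone has to infer the same motion from GPS deltas, which is the weakness being described.
```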

01:55:09   Well, the other problem is, Marco, the navigation in your brand new M5 is excellent, and the navigation in my 2011 3 Series is pretty

01:55:18   good.

01:55:19   I'm actually very glad I got --

01:55:21   I got a car with navigation, because that wasn't the plan when I went looking for a used 3 Series. However,

01:55:26   even in the --

01:55:29   I think the build date on the car was December 2010.

01:55:32   In the three-ish years since the car has been built, the local area has changed quite significantly.

01:55:41   And so, if for no other reason, even the UI left alone, there's an extreme advantage

01:55:47   in having your phone do your navigation because the maps get updated.

01:55:53   And I'm not talking like the UI.

01:55:54   I'm talking the actual data gets updated.

01:55:58   I'm looking right now at how much it would cost to get my maps upgraded from BMW and

01:56:04   I need to pay for a $40.90 USB key and that just gets me the key.

01:56:11   On top of that I need to pay $204.11 for the actual maps themselves.

01:56:17   So I'm in $250 to get an update for the nav on my car and one of these days I'll

01:56:24   probably pony up for it because after three or four years, it gets to the point that you're

01:56:30   really in need of it. Because a lot of the streets I go to—I shouldn't say a lot—some

01:56:35   of the places I go to, they just don't—they plain don't exist on the navigation in my

01:56:42   car. So that's a bummer. Additionally, traffic lately, as mentioned by David T. Moof in the

01:56:49   the chat, traffic lately is way better coming off the phone than it does my car. And my

01:56:54   car does accept it, it does receive traffic, and it's good, but it's certainly not foolproof.

01:57:00   And with the advantages that Waze brings, among other things, with kind of crowdsourcing

01:57:05   traffic reports, it's almost always going to be better in navigation on the phone. So

01:57:10   yeah, right now your M5 is wonderful, but I'd be curious to see if you said that in

01:57:15   three or four years.

01:57:16   Well, that's another example of your car aging and not doing well with the tech.

01:57:20   But I think the root problem is right now it's unacceptable for someone's home to not

01:57:25   have an internet connection, but it's acceptable for cars to not have internet connection.

01:57:29   It's like, "I don't want to pay for two wireless bills.

01:57:31   I don't want my car to have a separate connection."

01:57:33   That's a symptom of the market, the shape of the market now.

01:57:36   What we would want is as time goes on, as cars get better, inevitably cars need to have

01:57:40   network connections, and cars need to auto update their maps, and they need to not charge

01:57:44   it.

01:57:45   So it doesn't do this, and modern cars still don't have internet connections,

01:57:48   but they can get one through your phone, maybe. Or actually, we're doing the reverse here and saying, "Nope, car,

01:57:52   you still can't have an internet connection, but my thing that I have that has an internet connection,

01:57:56   it's allowed to spray its image up on your screen,

01:57:58   and you're allowed to send it inputs down to it."

01:57:59   But, like, you know, not too far in the future, if we ever get the stupid carrier stuff sorted out, or, like, we have

01:58:05   better wireless, like, cars need to have internet connections just like homes need to. Well, mine does. Even my

01:58:13   2010-built car does. And if I subscribe to, like, the super baller

01:58:17   BMW -- what is it, BMW Assist, which is kind of like OnStar -- I can do basic Google location searches from the car. And

01:58:25   certainly up until Google freaking ruined Google Maps recently, I was able to send

01:58:32   addresses from Google to my car, which sounds silly, but oh my god,

01:58:37   it's the most wonderful thing in the world.

01:58:39   And so my car has some modicum of internet connectivity, but to your -- That doesn't really --

01:58:44   terrible software, right. And, to be honest, you're on a kind of different point,

01:58:49   which is it should be able to do a lot more with that internet connection than receive

01:58:53   extraordinarily small packets of search results and addresses.

01:58:58   Yeah, that's what I'm saying: we see all the pieces.

01:59:00   We know what iPads are like, we know what iPhones are like, and we know all the things that could be useful to cars.

01:59:04   And it's like, we have this tech. It's not that expensive compared to the cost of a car.

01:59:09   We know people know how to write software to do this,

01:59:12   but they're not at the car companies, and, like, we've just gotta get these guys together.

01:59:16   And that's the kind of thing, you know, we were hoping for in the Apple announcement -- like maybe they would get together,

01:59:20   maybe Apple would buy Tesla and be able to do the whole experience.

01:59:23   But they're still so far apart, and the things that we know today are technically possible are not financially feasible,

01:59:30   or just not organizationally feasible. Like, we can all envision it. You know, like, why is the car the separate

01:59:37   realm where we can't have the nice things that we know exist? Why do we have to choose between

01:59:41   navigation that is integrated with our car but is crappy in other ways, and navigation that is, you know, fast and responsive

01:59:49   and can respond to voice commands and gets its maps updated all the time, but is not integrated with the car in any way?

01:59:54   Yeah, I'm curious to see where it goes, though, for sure.

01:59:59   Now I'm gonna be very jealous when my car can't do it. I'll just have to get a new M3.

02:00:04   It's the only answer. I wonder, too, like, how much of CarPlay

02:00:09   will fail to be useful to people who aren't totally bought into the Apple ecosystem.

02:00:16   Like, there's a whole lot of features in

02:00:18   Mavericks and iOS 7 where like if you don't use Safari as your browser for instance

02:00:25   then it doesn't work right for you.

02:00:27   Or if you don't use iCloud's reminders or calendar

02:00:31   or whatever.

02:00:33   And so how much of that do you think will apply to CarPlay

02:00:37   where obviously it's very much based on Siri?

02:00:40   And actually, one topic is third party apps' integration

02:00:45   with it.

02:00:45   And Apple, as far as I know, they

02:00:48   have not clarified whether anybody

02:00:51   will be able to make an app that uses this

02:00:53   or whether this is partners only, like Siri.

02:00:56   And it sure looks like it's partners only

02:00:59   to start at least.

02:01:00   And so, you know, to what degree will this be limited

02:01:05   if you choose not to have, not to use Apple services

02:01:11   for everything, or Apple's apps for everything?

02:01:16   - Yeah, Google Maps, you know, is the obvious example.

02:01:18   Like, what if I don't wanna use Apple's Maps app?

02:01:21   I wanna use Google Maps.

02:01:22   Exactly. Yeah, I think they're gonna have to limit it, probably, you know, for, like, safety and legal reasons. Like,

02:01:30   we have all these rules about, like, you can't even enter the destination of where you're going unless your car is not

02:01:36   moving -- the car manufacturer puts that into their navigation, right?

02:01:39   At what speed is it safe to play Flappy Bird on your

02:01:43   AirPlay screen? You know, it's only one tap, and we can send that over CarPlay, right? That's really easy input.

02:01:49   Yeah, that'll work with the iDrive knob. Just hit the knob to flap.

02:01:52   And I think one of the conversations that Casey was having with Dave on the Twitters was,

02:01:57   what do you call it, the scrolling, like touch scrolling? Like, and how that's not quite as

02:02:04   natural as you might think it would be when you're in a moving car versus buttons. That was mostly

02:02:09   getting back to like the resistive touch screen that the Ferrari has. It has to have buttons

02:02:12   because it's terrible, right? But like, even without that constraint, you know, what's the

02:02:17   easiest way to find something in a long list

02:02:20   when you're in a moving car.

02:02:22   Because scrolling through a list is great

02:02:24   when you're looking at your phone in your hand

02:02:25   and flicking with your thumb, but your eyes

02:02:27   are on the phone the whole time.

02:02:28   When your eyes are supposed to be on the road,

02:02:31   if you flick and then look back at the road

02:02:33   and look back down and you've passed it,

02:02:34   maybe that's not the best interface.

02:02:35   That's why it keeps getting back to Siri.

02:02:36   That's why they keep demoing Siri,

02:02:38   because that's what you want.

02:02:38   You want eyes on the road, but still get

02:02:40   done what you want to get done.

02:02:42   And that's one of the strengths also of knobs and dials,

02:02:44   is that if they have little things,

02:02:46   you can feel them clicking, you know how far you're going,

02:02:48   or you can feel switches in the on or off position,

02:02:50   you don't have to take your eyes off the road.

02:02:52   It's a different environment,

02:02:53   and the things that work great on iPads and iPhones, UI-wise,

02:02:56   don't necessarily work the same way in the car.

02:02:58   That's why I think CarPlay looks so different

02:03:00   and it is so like sort of Spartan and limited,

02:03:02   and why they keep demoing Siri: it's like,

02:03:05   they don't want you to say, oh great,

02:03:06   now it's exactly like mounting my iPad to my dashboard.

02:03:08   Mounting your iPad to your dashboard is dangerous

02:03:11   in terms of like what you can do to distract yourself.

02:03:13   And CarPlay, I think, is trying very hard not

02:03:16   to be that dangerous.

02:03:17   So that's why I would imagine it's not

02:03:18   going to be open season.

02:03:20   Or if it is open season, the things that third party

02:03:22   apps are allowed to do through the interface

02:03:24   is going to be so limited that it's not

02:03:27   going to allow you to put Flappy Bird up there

02:03:29   or play a movie or something in the front seat of your car.

02:03:32   Right.

02:03:32   Now, actually, so the new stuff that's in the iOS 7.1 SDK

02:03:38   looks like it would be a way--

02:03:41   stupid NDAs, whatever I can say. It looks like it would be a way to have arbitrary apps

02:03:47   be usable from the screen, but only present a structured menu, kind of like the old iPod.

02:03:54   So that would be your entire interface could be the structured menu, and you have a few

02:03:58   limited things you can do, but you're not just given an arbitrary view that you can

02:04:04   do whatever you want with, like put a game in it.

02:04:06   And I bet those menu items would be speakable. You'd probably put in some metadata to say

02:04:10   if they say this, that's what they mean, if they had voice. They want it to be a simple interface,

02:04:15   so you can't do some crazy thing. And also, ideally, you wouldn't have to touch a screen

02:04:18   at all, but you could activate the app and say—like they were showing, play song name,

02:04:23   whatever—or if you had put a menu with five options, pick an option, which is send text,

02:04:27   and then Casey's app would be told to send a text, and it would send a text. That type of integration.
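
[Editor's note: the iOS 7.1 additions Marco is alluding to appear to be the MPPlayableContentManager and MPPlayableContentDataSource machinery in the Media Player framework; assuming that is the API in question, a bare-bones data source looks roughly like this. The Show and Episode types and the two-level hierarchy are made up for illustration, and the sketch is in Swift rather than the Objective-C of the era.]

```swift
import MediaPlayer

// Made-up model types for the sketch.
struct Episode { let id: String; let title: String }
struct Show { let id: String; let title: String; let episodes: [Episode] }

// The system walks this data source to build the iPod-style menu being described;
// the app never draws anything on the car's screen itself.
final class CarMenuDataSource: NSObject, MPPlayableContentDataSource {
    let shows: [Show]

    init(shows: [Show]) {
        self.shows = shows
        super.init()
    }

    func numberOfChildItems(at indexPath: IndexPath) -> Int {
        switch indexPath.count {
        case 0:  return shows.count                         // root level: the list of shows
        case 1:  return shows[indexPath[0]].episodes.count  // inside one show: its episodes
        default: return 0                                   // episodes have no children
        }
    }

    func contentItem(at indexPath: IndexPath) -> MPContentItem? {
        if indexPath.count == 1 {
            let show = shows[indexPath[0]]
            let item = MPContentItem(identifier: show.id)
            item.title = show.title
            item.isContainer = true       // drillable, not directly playable
            item.isPlayable = false
            return item
        }
        if indexPath.count == 2 {
            let episode = shows[indexPath[0]].episodes[indexPath[1]]
            let item = MPContentItem(identifier: episode.id)
            item.title = episode.title
            item.isContainer = false
            item.isPlayable = true        // selecting (or speaking) this asks the app to play it
            return item
        }
        return nil
    }
}

// Hooking it up: the shared manager owns the menu; a delegate (not shown) receives the
// "start playback of this item" callback when the user picks something.
let menuSource = CarMenuDataSource(shows: [])
MPPlayableContentManager.shared().dataSource = menuSource
```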

02:04:33   It's kind of boring because you look at it and you're like, "Oh, it looks kind of like a car

02:04:38   UI. Oh, it's a bunch of items." And the thing is, sometimes it's not even as responsive as you

02:04:43   expect it to be because of whatever the crappy host operating system or the host hardware is. So

02:04:48   it's still a far cry from the performance and experience of having an iPad in the dashboard,

02:04:54   but you probably wouldn't want to have that anyway in terms of the UI. The UI has to be

02:04:59   simpler and different because it's not the same as holding an iPad and using it.

02:05:02   By the way, I should mention that what was news to me anyway, and I believe news to everyone,

02:05:07   was that CarPlay will support things like the iDrive controller, which I consider to

02:05:14   be a tremendous win.

02:05:17   Despite Dave Nanian's love for touchscreens in cars, I'm still of the opinion that having

02:05:23   a physical control is a much, much, much better way of doing things.

02:05:26   And so having some amount of integration with things like the iDrive controller I think

02:05:33   is very important and a very welcome bit of news that I didn't know before.

02:05:37   Oh yeah, definitely. And that's, me too, like when I saw that, because that's the

02:05:43   thing you would expect knowing Apple and the way they usually do things, you would expect

02:05:46   that they would say, "Alright, everything that we support has to be a touchscreen, period."

02:05:50   Like they would just dictate that and then, you know, two manufacturers would sign up

02:05:54   and no one else would. Heck, maybe that's why we haven't heard about iOS in the car

02:05:57   for almost a year. Maybe their original plan was to dictate that and they couldn't get

02:06:01   enough support.

02:06:02   and they've been slowly backsliding: "Fine, resistive touch screens are in. Fine, you can use a jog dial.

02:06:06   Fine, we'll put the home button on the screen since you guys can't find a spot for it on

02:06:10   your dashboards. Not going to change your dashboards for our UI." And they were probably

02:06:14   all saying, "We also want to be able to integrate the same way with Android, so nothing you

02:06:18   do can preclude someone plugging in their Android phone to do whatever the hell they're

02:06:22   going to do for that." Right, exactly. But yeah, I think the new stuff in 7.1, it doesn't

02:06:29   say it's for CarPlay and it doesn't say much of anything actually. There's not a lot of

02:06:34   documentation for it, but the new stuff in 7.1 sure looks like this is what it's for

02:06:39   and it's a way to create a hierarchy of menus for things that could be music or it could

02:06:44   be anything. So that would mean that both Overcast and FastText would in theory work

02:06:50   over CarPlay. And whether it's integrated with Siri as a different story and they could

02:06:56   be, it might not be, it might be limited, who knows? But that's a different story entirely.

02:07:01   Yeah, I'm very interested to see whether they do something a la, you know,

02:07:08   BMW's connected apps where you have to be a blessed partner in order to get access. We'll see.

02:07:15   It'll be interesting. All right, it's getting long. Titles?

02:07:18   Man, this Mac Pro is good. Save it for the next show. Now you get to suffer. Now you get to hold it,

02:07:24   and we'll wait until next week, and you can tell us about that.

02:07:26   - Well, the funny thing is,

02:07:27   there's not that much to say about it, really.

02:07:29   - Don't, no, don't go there.

02:07:31   - This is gonna be a short topic.

02:07:33   - Don't go there.

02:07:34   - The biggest thing I noticed so far,

02:07:36   it is noticeably faster,

02:07:39   not by an earth-shaking amount,

02:07:41   'cause what I had before was already fast,

02:07:43   but it is noticeably faster, the IO is definitely faster.

02:07:46   What is most noticeable by far,

02:07:49   besides how cool it looks, is how much quieter it is.

02:07:52   I mean, that's like, it's shocking how quiet this thing is.

02:07:56   And I've always been of the opinion

02:07:58   the Mac Pros were always pretty quiet.

02:08:01   Compared to what they were doing,

02:08:02   compared to any other tower PCs that you would have,

02:08:05   they were always pretty quiet.

02:08:07   But now that I have this in the same room

02:08:09   and then across the room, I still have Tiff's old Mac Pro,

02:08:12   it's a big, big difference in noise.

02:08:14   And John, if you end up getting one of these things,

02:08:16   which you probably won't and probably shouldn't,

02:08:18   but if you do end up getting one of these things,

02:08:20   that I think is what you will notice first.

02:08:22   It's just such a big difference. I'll get one eventually. I guess, assuming this form factor

02:08:28   lives as long as the cheese grater did, inevitably I'll eventually get one. Yes, I'm looking forward to

02:08:31   that. That, and the size. Like, seeing it there on your desk, next to your water glass, practically

02:08:35   being the size of your water glass -- that's just going to be such... I mean, I'll have more leg

02:08:38   room, presumably there'll be fewer cables draped from the top to the bottom of my desk, you know,

02:08:44   it'll be nice. Yeah, it is really cool. So, titles. And I'm actually not touching the case,

02:08:50   'cause I've seen pictures online of people who have,

02:08:52   you know, they're covered in fingerprints.

02:08:54   So I pick it up by the lip,

02:08:56   how you're supposed to from the top,

02:08:58   and I set it all up.

02:08:59   I didn't touch any part of the case

02:09:01   except the very lip part,

02:09:02   and I wiped that off with the microfiber cloth

02:09:05   after I was done.

02:09:06   - Listen to that.

02:09:07   Do you hear that sustain?

02:09:09   Well, you would if it was playing.

02:09:11   You don't get that reference.

02:09:12   Nevermind, titles, titles!

02:09:13   I know this is one of my suggestions,

02:09:18   but I kind of like it because -- who suggested it?

02:09:20   It was "Dave, who stinks," re-suggested by _DavidSmith.

02:09:23   I didn't mean that to reference him,

02:09:26   but now that I think about it, it's funny.

02:09:28   And I think that's a title that will be difficult to guess.

02:09:31   Well, I guess if you know it's a software methodology episode,

02:09:33   you can guess.

02:09:34   But I find that one funny.

02:09:36   I didn't get it.

02:09:37   Someone suggested "hell is other people."

02:09:39   I always say hell is other people's code.

02:09:42   Like, that's why I was amazed when someone buys a software

02:09:45   product, like Daniel Jalkut had bought MarsEdit --

02:09:48   people buy other apps that companies don't want anymore. That's like my worst nightmare.

02:09:52   Like getting someone else's pile of code dumped onto me, right? Even if I got to see it beforehand,

02:09:58   it would be like, "Here you go. You paid a lot of money for this. Hope you can make a good go of it.

02:10:05   Hope it's not too crazy." If it came with the other person, and they were my slave for the rest of

02:10:09   their life, then I'd be like, "What did you mean by this? What are you doing? What is the

02:10:13   purpose of this class? Describe it to me." And then if they didn't know, I could just beat them.

02:10:17   It's also, you never thought about what the purpose of this class was, did you? It just does stuff.

02:10:20   Thank you.

02:10:21   [