Under the Radar

125: Privacy Policies and GDPR

00:00:00   Welcome to Under the Radar, a show about independent iOS app development.

00:00:04   I'm Marco Arment.

00:00:05   And I'm David Smith.

00:00:06   Under the Radar is never longer than 30 minutes, so let's get started.

00:00:10   So for today's topic and discussion, I think what we wanted to kind of just talk about generally is user privacy

00:00:17   and how we manage that, things to keep in mind around user privacy.

00:00:22   And also to talk a little bit about the GDPR big scary EU directive that is going to be coming into effect

00:00:31   within the next couple of months.

00:00:33   Like, as best I can tell, it's slightly amorphous as to when all of the parts of it take effect and who they affect,

00:00:37   but in general, before this summer, there were going to be big new legislative requirements around user data

00:00:48   and about user privacy.

00:00:49   And as a small developer, I kind of look at some of this stuff and I get very intimidated, very scared.

00:00:56   And in some ways, that's a good thing because it makes me just not want to ever touch user data ever.

00:01:02   But mostly, I just need to tell myself that most of this stuff, if you're doing things reasonably

00:01:10   and if you're doing things respectfully, you're probably going to be okay.

00:01:14   That in general, if you're being respectful of user privacy and you're not doing shady stuff,

00:01:19   these kinds of laws and the kind of things that they require are probably not going to be super onerous.

00:01:24   But this is probably also a good place for a fair disclaimer.

00:01:27   Neither of us are lawyers, and we're especially not lawyers versed in EU law.

00:01:33   And so if you have specific questions or issues about your situation, this is not the kind of advice that you want to base that on.

00:01:39   We're just hopefully going to give you some general advice and some general experiences that we've had as we have looked through this ourselves

00:01:46   and some things to think about going forward.

00:01:50   Because at the very least, and before we get into some of the specifics of this,

00:01:54   one of the things that I do like about this big, scary EU directive is that it has forced me to do an audit,

00:02:05   a kind of going through of what I'm doing, which is always just a useful thing to do.

00:02:11   While it turned out that, I think, in general there are very few changes that I have to make

00:02:17   for my apps to be respectful of GDPR, it's been a useful exercise to sit down and go through my app

00:02:27   and to think about, well, what data am I collecting about my users?

00:02:31   And do I need that data? Is there any way that I could avoid using that data or having that data kept?

00:02:37   How am I storing that data? Who has access to that data? Etc.

00:02:41   And it's a useful activity to go through; even if it weren't something that I had to do,

00:02:49   it would probably be a good idea to do on a semi-regular basis.

00:02:52   So at least in that sense, it's been useful.

00:02:54   But what is really interesting about this, what I have found, is it's trying to really think about what do I absolutely need to store?

00:03:04   And thankfully, in general, the approach that I've taken with most of my apps is I don't want to store or know anything about my users.

00:03:12   Which, from a business 101 perspective, is probably a terrible idea,

00:03:17   because I have no relationship with a lot of my users to the degree that they go to the App Store, they download my app, they install my app.

00:03:26   I know nothing about them, and they just use it, and everything's great.

00:03:30   At some point, maybe they'll give me some money from an in-app purchase, but even with an in-app purchase, I just know that someone bought it.

00:03:36   I know nothing about them. Which in some ways is terrible, but in a weird way, as a one-person developer kind of shop,

00:03:45   it's actually lovely and liberating to not have to manage and think about and deal with the implications of me having a user's credit card information,

00:03:55   or a user's address, or information about them.

00:03:59   Especially since a lot of my apps deal with health data, things actually about measuring a user's health.

00:04:05   I actually have no interest in knowing anything about your health, because suddenly then, there's a tremendous burden and legal liability

00:04:14   and things on me. I would much rather all of that data just stay on your phone. It never goes anywhere.

00:04:20   You can use the app, you can enjoy the app, it can be useful to you, but from my perspective, you're just this nameless number that is just like

00:04:29   somewhere in my daily downloads, someone downloaded it. And that's kind of a useful thing.

00:04:35   And I think it's a different perspective than, certainly I get the impression that a lot of big companies,

00:04:42   their goal is to sort of slurp up as much data as possible about their users and use that for whatever,

00:04:49   but it is possible, I think, to take the opposite approach. And in a situation like this, it really comes back to be like,

00:04:56   "This is great. I know nothing about any of my users. I don't have to worry about it."

00:05:00   For all of these requirements and data protection stuff, except for a few cases for me, there's nothing that I know about my users, so problem solved.

00:05:08   Yeah, I mean, the number one strategy for avoiding certain liabilities and requirements of privacy laws and regulations is

00:05:17   just don't keep the data or don't collect the data or don't store it. Just have as little data as possible on people.

00:05:23   And that's not something that some businesses can really do. A lot of businesses depend on having a lot of data on people

00:05:30   or need that data for basic functionality of their app. But independent developers like us are lucky in that

00:05:37   we have options. And as I've been looking over Overcast stuff, I had kind of a head start on a lot of the GDPR stuff

00:05:47   because Overcast has had a privacy policy since day one. Apple doesn't require a privacy policy for every app;

00:05:54   only for apps that have auto-renewing subscriptions.

00:05:57   Or health data.

00:05:58   Oh, or health data. I didn't know that.

00:06:00   If you ever access health data, you have to have a privacy policy.

00:06:03   That makes sense. Yeah, and I think the original reason for the auto-renewing thing was that

00:06:07   auto-renewing subscriptions used to give you access to like the zip codes and email addresses of everybody

00:06:13   because they were made for newsstand publications back in the day and the magazine business was like

00:06:18   we can't operate without all sorts of personal data to spam people with.

00:06:21   I don't know if that's even still available to anybody. I know it's no longer there by default.

00:06:27   But I've had a privacy policy since day one, and it's an intimidating thing to write,

00:06:34   but I highly suggest that everybody, for every app that you're responsible for, make a privacy policy.

00:06:41   And even if you use the same one for all your apps, make a privacy policy because it forces you not only to,

00:06:47   you know, to think about these things, but to actually codify like what data are you actually collecting

00:06:54   and what data are you not collecting and what are you doing with this data.

00:06:59   What are you allowed to do with this data and what are you explicitly saying you're not going to do with this data.

00:07:03   Who has access to it? Are you sharing this data with anybody else?

00:07:06   Like to write a privacy policy, even if no one ever reads it, like the value of having it is that it forces you

00:07:14   to basically audit what you're doing with information and what information you even have.

00:07:18   And this might give you an opportunity to say as you're writing this document, like, you know, actually,

00:07:23   I don't need this. Like one of the things I did recently was I turned off IP address logging on my web server logs

00:07:29   because I have literally never had to look at it. Like there has not been a single time in all of Overcast

00:07:35   where I've had to look at the IP address records of anything. So I'm slowly removing places

00:07:40   where I would record IP addresses for people, because it just has never come up.

00:07:43   If it ever comes up that I really wish I had that information, maybe I'll rethink this policy, but it hasn't yet.
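
As a minimal illustration of what getting out of the IP address business can look like in code, here is a sketch in Python; it is not how Overcast actually does it, and in practice you would more likely just change the web server's log format so the address is never written at all.

```python
import re

# Matches a bare IPv4 address; IPv6 would need its own pattern.
IPV4_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b")

def scrub_ip(log_line: str) -> str:
    """Replace any IPv4 address in a log line with a placeholder
    so stored logs no longer identify individual clients."""
    return IPV4_RE.sub("0.0.0.0", log_line)

# Example: "203.0.113.7 - GET /feed" becomes "0.0.0.0 - GET /feed"
```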

00:07:48   I mentioned in the past that one crazy plan I had, which I haven't done yet, but I'm still thinking about doing it,

00:07:56   is to hash email addresses in addition to just passwords. I've always hashed passwords, so I don't store plain text passwords.

00:08:04   But I wonder if I could also hash email addresses. And the main advantage there is like, well then I don't have emails.

00:08:13   And the problem is I would still have some derivative version of the email. Because to make that work,

00:08:21   you basically have to use the same salt for everything, so you can look up to see, when someone's logging in with an email address

00:08:27   and they type this in, whether that record exists or not. So the only way I could think of to do that

00:08:33   is to use the same salt for all the hashes. But I could still do something more secure than just storing them as plain text.

00:08:37   Like I could still have a salt and a hash, even though they would all be the same, that's still better than having the emails there

00:08:43   if I pick a strong enough hash. And that would literally just get me out of the business of having emails at all.
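
A minimal sketch of the hashing idea being described, assuming a single application-wide salt kept in server configuration; the names here are illustrative, not Overcast's actual code. One shared salt is exactly the trade-off mentioned above: it keeps the hash deterministic so it can be used as a lookup key at login, at the cost of being weaker than per-user salts (an HMAC keyed with a server secret gives the same lookup property).

```python
import hashlib

# Assumption: a single application-wide secret salt, loaded from server
# configuration in a real deployment. Sharing one salt is what makes the
# hash deterministic enough to use as a lookup key at login.
SITE_SALT = b"replace-with-a-long-random-secret"

def email_lookup_key(email: str) -> str:
    """Deterministic salted hash of an email address, stored in place
    of the address itself."""
    normalized = email.strip().lower().encode("utf-8")
    return hashlib.sha256(SITE_SALT + normalized).hexdigest()

# At signup, store email_lookup_key(address) instead of the address;
# at login, hash whatever the user typed and look that value up.
```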

00:08:49   The downside of that is that I could never send emails to users, which currently the only time I do that is for ad buyers

00:08:58   to notify them that their ad has expired or that availability is now open in a category that they requested to be notified about.

00:09:04   So I could like maybe only store their emails and no one else's, because they're a minority of the user base.

00:09:10   So basically I'm brainstorming ways in which I can avoid having people's emails at all.

00:09:17   Because that's the main piece of personal identification, or personal information, that I have for people.

00:09:24   You know, I have like what podcasts they're subscribed to, which counts, you know that's still personal information,

00:09:29   it's not as sensitive as an email address would be, and without that email address it's hard to tie it to an individual.

00:09:35   So anyway, so you know I've gone through a number of times when writing or editing the privacy policy to say like,

00:09:42   "What's the least information I can get away with?" And this is also part of the reason why,

00:09:48   and this becomes important with GDPR, I have dropped all third-party service integrations from my app.

00:09:54   I don't have third party ads or analytics or crash reporting or anything. I'm using Apple's crash reporter,

00:09:59   I have my own basic analytics on the server, and I do my own ads.

00:10:03   And this is again, like not a luxury that everybody could have, but if you can do it this is very important,

00:10:08   because not only does this let you control what information you're collecting and not collecting,

00:10:12   but also GDPR makes you responsible for any breaches or privacy problems that happen with like subcontractors

00:10:23   that do work for you, and I think that's going to be interpreted to mean like, if you embed Google Analytics

00:10:29   or something in your app and they get hacked, you have a problem. You can't just say, "Well that's their problem."

00:10:35   Like that liability could fall on you. So you become responsible for any third party stuff that you integrate in your app.

00:10:42   And honestly, you should be thinking this way already. I know the reality is that most people don't, and that's

00:10:47   a discussion we can have some other day, but if you can minimize your dependence on third-party code,

00:10:53   especially code that talks to third-party services, the less of that you can have in your app, the better,

00:11:00   because that just, it eliminates areas of liability for you under GDPR.

00:11:06   - Yeah, and I love this kind of exercise, though, because it is entirely about making conscious choices,

00:11:13   where there may be a default where you could just ignore this,

00:11:19   which is honestly what I did for the first several years, until I started doing health-related applications;

00:11:24   I just completely ignored the thought of this type of stuff. I didn't store much, I don't have much data

00:11:29   in my previous apps, but I probably had something, and I just never thought about it. But like, being forced to go through

00:11:35   and be like, even to the degree of, do I need these IP addresses? That was something I actually did recently;

00:11:40   I've been spending this morning sitting down and removing it, or working out how to remove it from all my apps,

00:11:45   'cause it's like, do I need this? No, I don't. Is there anything that is being stored that is tied to somebody in any way?

00:11:54   And how can I get rid of that? And unfortunately, I use a third party advertising framework in my apps,

00:12:01   because that's just the reality of the situation I find myself in, and it doesn't make me feel great in some ways,

00:12:08   but it makes me feel good about the fact that I've at least gotten rid of everything other than that framework,

00:12:14   which is somewhat essential for my business. I do a very basic, bare-minimum kind of analytics thing

00:12:22   that is entirely anonymized and has no tie to a user whatsoever, so other than that, there's no analytics,

00:12:29   and I got rid of what I used in the past, like Fabric, the more kind of heavy-handed analytics things,

00:12:35   where I'm feeding my data into their big data system, and I get a little bit of benefit, but honestly,

00:12:41   the reason those analytics packages are free for most developers to use is that that data is valuable to other people,

00:12:50   and is being used in ways that I lose control over, so at least separating myself as much as I can from things like that is great.
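
As a sketch of what bare-minimum, entirely anonymized analytics can look like (illustrative only, not the actual implementation being described): the server increments an aggregate counter per day and event name, and never records a user ID, IP address, or device identifier.

```python
from collections import Counter
from datetime import date

# Aggregate-only event counts keyed by (day, event name).
# Nothing about who triggered the event is ever recorded.
daily_counts: Counter = Counter()

def record_event(event_name: str) -> None:
    daily_counts[(date.today().isoformat(), event_name)] += 1

# record_event("session_started") just bumps a number; there is nothing
# in the stored data to export or delete on a per-user basis.
```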

00:13:00   But one other thing that I like about coming up with a privacy policy is that it's like

00:13:08   making promises that your future self has to follow, which is also something that's kind of nice,

00:13:15   that there's all kinds of situations that you can get into where people will reach out to you with various degrees of shadiness,

00:13:24   people wanting access to the data about your users. For example, I know there's a couple of companies who reach out to weather apps,

00:13:36   which is a common version of this, because most weather apps have access to a user's location, because that's really useful for a weather app.

00:13:44   But there's a lot of people who would love to embed a little framework into your application that exploits the fact that a user has given you access to their location,

00:13:57   to then report that and use that data for marketing reasons and all kinds of other stuff that having accurate access to a user's location could be very useful for.

00:14:08   But I've said publicly that that's not something that I do, and it's nice to make that promise to your future self;

00:14:19   not that I'm ever particularly tempted, but there is something even more reassuring about being able to just immediately discard a lot of those types of potential,

00:14:29   I wouldn't even really call them opportunities, but situations that you might find yourself in, where you can say,

00:14:35   "Well, I've said this is how I'm going to do it." And certainly a privacy policy, I'm sure, always includes some kind of language around the,

00:14:42   "This policy may change," and so on, but at least you're making a commitment to yourself that you've at least thought about this.

00:14:49   And I think that is a nice thing to have to put down in writing that this is something that I care about, this is something that I'm going to endeavor to uphold.

00:15:02   And I think that is just a good thing, at a personal level, to make you feel better about the work you're doing.

00:15:10   - Yeah, I mean, in kind of like a large overview summary way, I think one of the great benefits of GDPR is going to be that until now,

00:15:19   there has been very little reason not to collect data. It's like, "Well, what harm does it do? I have this data in case I ever need it."

00:15:26   Similar to when you were talking about adding third-party analytics services that are free, where you're really paying with your users' data,

00:15:32   but like, they're free to you. And the thinking a lot of times of that was, "Why not embed this? I might get useful data out of it."

00:15:39   But now, the thinking has to flip around. Now the thinking has to be, instead of, "Eh, why not collect this information?"

00:15:46   Now it has to be like, "I better have a really good reason to collect this information, because collecting this is going to bring on liability and work for me."

00:15:55   And so, the default now should start to become, for a lot of us, "Let me collect as little as possible," rather than,

00:16:02   "I'll collect as much as I need, just in case I need it someday." And that's a big change, and it's a little more work,

00:16:09   but I think overall, this will be significantly for the best. Anyway, we are brought to you this week by Linode.

00:16:16   With Linode, you'll have access to a suite of powerful hosting options with pricing starting at just $5 a month.

00:16:21   You can be up and running with your own virtual server in the Linode Cloud in under a minute.

00:16:26   Whether you're just getting started with your first server or deploying a complex system like what we have, Linode is the right choice for you.

00:16:32   It has the fastest hardware and network with fantastic customer support behind it all. It has never been easier to launch a Linode Cloud server.

00:16:39   Linode guarantees 99.9% uptime for server availability, and Linode offers additional storage now, too.

00:16:45   They now have block storage. It's out of beta. It's available in a couple of their data centers so far, and they're expanding it very quickly as they go.

00:16:52   Linode is really great for setting up your own server for things like hosting large databases, running a mail server, operating a VPN, running Docker containers.

00:17:00   I keep meaning to teach myself Docker, and that's how I'm going to do it, is on Linode.

00:17:03   You can host your own private Git server and so much more. So for instance, if you wanted to host your own private analytics thing on your own web server, you can do that with Linode.

00:17:11   And Linode is also hiring right now. If this interests you, go to linode.com/careers.

00:17:17   So anyway, Linode has fantastic pricing options available, starting at 1GB of RAM for just $5 a month.

00:17:23   They have all sorts of plans above that, too, including new high-memory plans.

00:17:26   Listeners of this show can get $20 towards any Linode plan by going to linode.com/radar.

00:17:32   That will support us and get you 20 bucks towards any Linode plan. That's four free months on that 1GB plan.

00:17:38   And with a seven-day money-back guarantee, there's nothing to lose. So go to linode.com/radar to learn more, sign up and take advantage of that $20 credit, or use promo code RADAR2018 at checkout.

00:17:48   Thank you so much to Linode for supporting this show.

00:17:50   So one interesting thing in my research into GDPR, and I think it's a really interesting filter for this discussion too, is one of the new abilities that GDPR grants to EU citizens: they are allowed to demand, from a company that has data about them, a full log of the data that that company has about them.

00:18:18   And the company is required to give this back to the customer within, I think, 30 days or something along those lines, which is both kind of lovely and also kind of terrifying. I just have this vision of, if 10% of my users made a request like that of me, even if I have no data on them, dealing with that would be a nightmare.

00:18:42   But what I love about that is it's a really interesting question, and a different way of thinking about it. You can look at it from the "what data am I collecting" side, and then the other interesting angle is, if you got one of those requests, the thought exercise of what would you do with it, how would you proceed; and I think that is also a helpful way to make sure that you are aware of all the data you're collecting.

00:19:04   If you're thinking about it from the perspective of, if somebody said, hey, I want to know what data you have stored about me, what tree would you walk down to gather that data? Then you can look at the tree from the top down and say, do I still need all of this, and would I want to be able to provide a user with all this information? It's sort of the reverse of the privacy policy version, which is almost the from-the-ground-up version of that, starting at the roots and working your way up.

00:19:32   And that's interesting, because I think what I've found is that in almost all cases, for most of my applications, there's nothing, other than a couple of cases like my recipe app, where I have a way to sort of back up your recipes to the cloud, and I have your email address for that.

00:19:46   But other than that, the nice thing I've found, as I think about all my apps, and I've gone through them all and it's been kind of tedious, is: is there anything that I collect that I would be able to tie back to a user? If a user said, hi, I'm John Doe, I want you to tell me what information you have about me, it's really encouraging to be able to look down and say, I don't know who you are, I have nothing for you.

00:20:11   That's problem solved. But it's also just a really interesting way to look at this same problem, and is certainly something that you have to keep in mind, because this is apparently something that people may ask us for now.

00:20:25   I believe something similar to this has been around before; I think it was called the Data Protection Act, which existed prior to this and didn't quite go as far. But I haven't had any Data Protection Act queries, so hopefully this isn't something that's going to open the floodgates and leave me spending all my time doing administrative busywork.

00:20:47   But nevertheless, it's an interesting way to think about the same problem: if someone says, here's my email address, tell me everything you know about me, then starting from there is another interesting way to walk through and say, what do I actually know about this person?

00:21:03   And you can audit it from that direction as well.

00:21:06   - Yeah, and first of all, I think this process should be automated. I think whenever possible, you should develop some kind of automated script that you can run that will generate these reports, even if no one's ever asked you for one.

00:21:18   And I think one thing that might be useful for that is to think about what data appears there that somebody might consider creepy.

00:21:25   Are you collecting data about people that if they knew about it, they would get freaked out, or that they would think you were being a little bit overreaching or creepy?

00:21:36   And you should really think about it. Do you need to be collecting that amount of data?

00:21:40   This is one of the reasons why, as I mentioned, I'm trying to get out of the IP address business, everywhere I have logs and things like that, because I just don't need that and it's kind of creepy.

00:21:50   With an IP address, you can get things like geography and stuff like that, and I don't need that. I have no use for that information.

00:21:57   So I strongly suggest that you actually make a script to dump out this report of what you have on somebody, and anything that can't be shown to somebody without being embarrassed, stop collecting it.

00:22:11   I know this is not so easy for a lot of people, but this really is the path forward for those of us who want to just run an honest business and don't want to deal in basically being creepy data brokers.
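
Here is one way such a report script might look, sketched in Python against a hypothetical SQLite schema; the table names and the assumption that every table carries a user_id column are illustrative, not a description of any real app's database.

```python
import json
import sqlite3

# Hypothetical tables that hold per-user rows; a real audit would list
# every table (and log source) that can contain a user identifier.
USER_TABLES = ("accounts", "subscriptions", "playback_progress")

def export_user_data(db_path: str, user_id: int) -> str:
    """Dump everything stored about one user as a JSON report."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row
    report = {}
    for table in USER_TABLES:
        rows = conn.execute(
            f"SELECT * FROM {table} WHERE user_id = ?", (user_id,)
        ).fetchall()
        report[table] = [dict(row) for row in rows]
    conn.close()
    return json.dumps(report, indent=2, default=str)
```

Running something like this against your own account now and then is the "would any of this look creepy?" check described above.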

00:22:25   And it's probably fair to say, too, that it's important to point out that you may have information in places that aren't structured that you would still be responsible for, like logging.

00:22:37   Yeah, like server logs.

00:22:38   Yeah, like if you have a log entry that has a user's user ID as well as their IP address, you're now, hopefully entirely unintentionally, creating this tremendously specific trail of potentially where they've been over the course of using your app, or how often they're shifting networks; information that is in there that you may not even be intentionally collecting.

00:23:03   That is just good to be aware of that as soon as you have some type of actual user identifier, any time that that identifier appears anywhere in any of your systems, you are potentially leaving behind this trail of sort of like privacy concern that you have to be aware of.

00:23:19   That may not be just like, well, what tables in my database have a user ID, you know, column?

00:23:26   Like that's like level one, but there's probably level two and three beyond that that you also need to be thinking about.

00:23:32   Yeah, because another part of this is, it's sometimes called the right to be forgotten.

00:23:37   Yeah.

00:23:38   And it's basically that people now have the right to request that you delete everything you know about them.

00:23:44   And it's kind of unclear so far whether this includes things like server backups; if you back up your database, do you have to somehow delete the data from all the backups immediately?

00:23:54   The answer to that is probably going to get worked out soonish, and it's probably going to land somewhere on the practical side, but basically, A, you're going to have to start thinking about deleting those backups after a short amount of time.

00:24:06   And B, you know, you should have a way, like similar to this generation of the report that I was talking about a minute ago, like you should have a way that you can delete everything you know about a user.

00:24:17   And again, if you start thinking about it, that could involve things like server logs, and then again, you might want to question: do you need to keep those logs, how detailed do they have to be, what information has to actually be in them, etc.

00:24:28   But you should have a way that you can delete everything you know about a user. Like this is something like, I've had this since day one in Overcast too, just because I didn't want to have to deal with stuff like this.

00:24:36   So right in the app, you can go to the account page and you can delete your entire account, and that deletes everything I know about you. It's great for me, it's great for users, they know they have a way out. And because this is also privacy policy stuff, right from the beginning I had to say something along the lines of, how do you access, edit, and delete your data, and so there it is.

00:24:59   Like you make a way for people to do it and make it automatic. And again, like, you know, this is going to be easier for some types of apps than others, but it's just one of those things, again, like, basically complying with GDPR is not that hard if you have simple data practices like this.

00:25:16   Like, when a user deletes their account from Overcast, I actually issue delete commands to the database. It's not like a soft delete where I'm setting like a deleted on date and then marking it as deleted. Like it's an actual delete from the database because that data is no longer needed. Like it's, I don't need that for any reason anymore.

00:25:32   So if you can do stuff like that, like avoiding complexities of like soft deletes and stuff like that, like it's just better now for this new environment.
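
A sketch of that hard-delete approach, against the same hypothetical schema as the export sketch earlier; the point is that the rows are actually removed, in one transaction, rather than flagged with a deleted-at column.

```python
import sqlite3

# Same hypothetical schema as the export sketch above.
USER_TABLES = ("playback_progress", "subscriptions", "accounts")

def delete_account(db_path: str, user_id: int) -> None:
    """Permanently remove every row tied to this user: a real DELETE,
    not a soft delete that just sets a deleted-at flag."""
    conn = sqlite3.connect(db_path)
    with conn:  # commits all deletes together, or rolls back on error
        for table in USER_TABLES:
            conn.execute(f"DELETE FROM {table} WHERE user_id = ?", (user_id,))
    conn.close()
```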

00:25:41   Yeah, and one thing I will say, though, is that you can use all these good practices, but in some ways they come at a cost. And I think it's probably fair to wrap up by talking a little bit about that.

00:25:53   That having a good privacy policy and being thoughtful, even just in the simple case of actually deleting the user rather than having a deleted-on flag, has a cost: if that user then comes back and says, "Oh, I didn't mean to," or "I changed my mind,"

00:26:10   you know, from a user experience perspective, it would be great to be like, "That's no problem. I can just turn your account back on."

00:26:16   Instead, you have to say, "Sorry, I, you said delete. You meant delete. It all went away."

00:26:22   And so there is certainly a cost that is going to happen with all of these types of choices.

00:26:27   And I think the reality is, while there is a cost, and while there are certainly choices and features that you won't build if you're acting in a very privacy-oriented way, I think in general, like this is a show for independent iOS developers,

00:26:44   I think the advice I would give anybody is to err on the side of collecting less, having less data, because even if there are features you could build with that data, you are creating the possibility of such a big overhead and nightmare down the road that it is unlikely to be worth it.

00:27:02   And for example, I have a lot of people ask me why I don't have the competitive health tracking, I guess you could call it, where me and three of my friends compete against each other to see who can get the most steps in a week, say.

00:27:19   And the answer I give is that I don't want to know how many steps you've taken, because in order to build a feature like that, either I have to go down the road of building some kind of crazy encryption scheme, where each one of your friends has a private key and you encrypt your step data and pass it around, which sounds like a nightmare I don't really want to take on, or I just need to store your health data in a database.

00:27:46   But as soon as I do that, suddenly I've gone from this world where I know nothing about you to I know a tremendous amount about you. And I know like, you can infer a lot of things from information like that.

00:28:00   And so for me, I've just decided that the cost of not building those features is one I'm willing to accept, so I just don't do it. And maybe that makes me slightly less competitive. There are certainly other companies that have that kind of feature, and I'm sure it's useful for them.

00:28:19   But I made the conscious choice that if I'm going to do a feature like that, I need to do it completely, and actually have all of the infrastructure that I would need to manage it and to be secure about it.

00:28:32   And if I'm not willing to go down that road, I don't want to do it in this kind of half-baked way that is just going to open me up to tremendous potential pitfalls down the road.

00:28:42   So that's just the approach I take, but it's certainly something worth saying that these choices are going to have a cost and you just have to be conscious about them and understand what you're getting into as a result.

00:28:53   And I think big picture, you can say like, well if you don't collect this data, you can't offer these convenient customer service features or things like that. If we do collect this data, we get the world we have today. We know the cost of that now. And you know what, it turns out all those customer service benefits for a lot of us are not worth the cost of the crazy data privacy leaking world we have today.

00:29:15   So let's try it the other way for a while and we'll see how that goes.

00:29:19   Yeah, no, it's been working fine for several years. I can say it's great. So it's recommended even though it isn't perfect.

00:29:29   Thanks for listening everybody. We'll talk to you next week.

00:29:31   Bye.
