PodSearch

Under the Radar

197: iOS 14 Privacy Disclosures

 

00:00:00   Welcome to Under the Radar, a show about independent iOS app development.

00:00:04   I'm Marco Arment.

00:00:06   And I'm David Smith. Under the Radar is never longer than 30 minutes, so let's get started.

00:00:10   So today we want to talk about privacy and specifically there's a whole host of changes that relate to how privacy is being managed in iOS 14

00:00:21   that seem like they're worth sort of unpacking and discussing. Some of them are just that

00:00:26   Apple has added a variety of indicators and additional things that provide information to the users

00:00:33   about how apps are actually working. There are some things that are a bit more sort of metadata-related

00:00:41   in terms of some new information we're going to have to start providing in App Store Connect

00:00:46   to sort of add to our store listing talking about privacy.

00:00:50   And then there were also three programmatic changes to privacy as they relate to location, photos, and tracking.

00:00:58   But to start off with, I think it's just interesting to talk through how Apple has added a new recording indicator.

00:01:06   So this is for both the microphone and video cameras. Whenever either of them is active now, there's a little indicator, an orange dot for the microphone or a green one for the camera, in the top corner of the phone.

00:01:18   This indicates when either of those is active, and similarly there's also now a paste indicator: any time an app reads from the pasteboard,

00:01:28   it gets shown to the user. And I think both of these are good, just straightforward wins.

00:01:35   I think that's because they eliminate a situation you never want: ideally, every time you see one of these indicators,

00:01:43   it should never be a surprise. If you see the recording indicator because you're recording something, it's not a problem.

00:01:50   The user isn't confused, nothing unexpected is happening. But if they're seeing it at a time when they weren't expecting it, that's problematic.

00:01:57   And I think that's a good kind of indicator for these things where otherwise they're transparent to the user.

00:02:02   And even though microphone and camera are sort of guarded behind other privacy controls in the app, or in the App Store,

00:02:09   you could have them active at any point. It's not like Apple necessarily is always saying,

00:02:16   "Once you've asked for the microphone recording, you could use it whenever you want without any indication."

00:02:23   And so this also kind of mirrors the experience we have on the Mac where when the FaceTime camera is active,

00:02:29   there's the little green light that comes on. And I think it's just a good general win that I think makes a lot of sense for Apple to have added to the OS.
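
A minimal Swift sketch of the pattern this implies: request camera and microphone access right before the feature that needs them, so the indicator only ever appears when the user expects it. The function name and flow are illustrative assumptions, not from the episode; the AVCaptureDevice calls are the standard system API, and the app also needs NSCameraUsageDescription and NSMicrophoneUsageDescription entries in Info.plist.

    import AVFoundation

    // Ask for camera and microphone access only when the user starts a capture
    // feature, so the orange/green indicator never shows up as a surprise.
    func startCaptureIfAuthorized(completion: @escaping (Bool) -> Void) {
        AVCaptureDevice.requestAccess(for: .video) { cameraGranted in
            guard cameraGranted else { return completion(false) }
            AVCaptureDevice.requestAccess(for: .audio) { micGranted in
                completion(micGranted)
            }
        }
    }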

00:02:38   Yeah, I mean people have all sorts of different thresholds for what they consider over the line with privacy and tracking stuff.

00:02:45   But one thing that almost everybody can tend to agree on is it's better to know than to not know when something about you is being recorded or tracked or something.

00:02:56   And oftentimes what makes something cross a line for people is when they are surprised to learn,

00:03:03   "Oh, ew, you were recording that or you were capturing this about me?"

00:03:08   And Apple walks a fine line here because they have to.

00:03:11   Because Apple as the platform owner can't actually be super aggressive about blocking a lot of things

00:03:18   because a lot of major companies and major businesses that are built on their platform and that their platform needs,

00:03:24   because otherwise people wouldn't want to buy an iPhone if it didn't run all these popular apps,

00:03:28   a lot of those things Apple can't just outright ban. So instead, Apple gives control to the user in a lot of these cases.

00:03:36   So certain types of creepiness are banned outright, like the most egregious things or the most hard-to-detect, underhanded, sinister things.

00:03:44   But a lot of stuff that can be used in a creepy way but also has legitimate uses, things like location information,

00:03:51   the pasteboard tracking you were mentioning, Apple walks a line there where they say,

00:03:56   "Okay, we're going to just either ask the user first before you are allowed, on a technical level, access to this information,

00:04:03   and/or we are going to let you use it maybe without approval, but we're going to display something to the user that says,

00:04:10   'This is being used right now.'" So that's just like the clipboard API now.

00:04:15   In a way, this allows Apple to hand responsibility off to the customers so that they don't have to be the super-aggressive bad guy with some of this stuff.

00:04:24   And I like that they're doing this because really, if you look around the industry, no one else seems to care nearly as much as Apple about this kind of stuff.

00:04:33   If Apple didn't care so much about protecting people's privacy on their platform, which happens to be very popular,

00:04:39   I think we would just be in a much worse, crazy, dystopian future. I mean, we are in some ways, but at least not this way.

00:04:49   At least this way, things are not as bad as they could be because Apple cares so much, and I really appreciate that.

00:04:54   And I think, too, what I love is that they're also holding themselves to the same standards.

00:04:58   It's sometimes a little funny, but when you activate Siri in iOS 14, you see the little microphone indicator pop up.

00:05:05   And even kind of interestingly, I've seen it pop up a few times when I imagine Siri thought it might have heard its trigger phrase, but it wasn't sure.

00:05:13   But there goes the little microphone indicator, and they're holding themselves to the same level of accountability as they are to all of us.

00:05:22   And clearly, I think, these indicators have been useful, because they've sort of shown two different categories of things.

00:05:28   There have been a few apps that have been found to have been accessing things at times when perhaps it was a bit unexpected,

00:05:33   and they've been sort of called out as a result.

00:05:36   And that could be intentional, or it could have been just accidental or poor coding, or things that...

00:05:44   It's one of those things where if there's no consequence to something, it's simpler as a developer.

00:05:51   It's like, if you ever expect to take anything from the pasteboard, maybe on launch you just always check and see whether whatever data you might sometimes expect is there.

00:06:02   It's lazy, but it's sometimes the case. There have been a couple of those, and these indicators help cut down on that,

00:06:09   because now there's a consequence and a cost. So from an engineering perspective, you have to be careful and thoughtful any time you access these protected resources.
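
To make that concrete, here is a small Swift sketch of the "careful and thoughtful" version: check the pasteboard's metadata first and only read its contents at the moment the user actually asks to paste. The helper name is made up for illustration; the hasStrings/hasURLs properties are the system API and, unlike reading .string, they are not supposed to trigger the iOS 14 paste banner.

    import UIKit

    // Hypothetical import helper: only touch pasteboard contents when the user
    // explicitly asks, and check the cheap metadata properties first.
    func importURLFromPasteboard() -> URL? {
        let pasteboard = UIPasteboard.general
        // hasURLs / hasStrings only report what kind of content is present.
        guard pasteboard.hasURLs || pasteboard.hasStrings else { return nil }
        // This call actually reads the pasteboard, which is what shows the banner.
        guard let text = pasteboard.string else { return nil }
        return URL(string: text)
    }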

00:06:17   Additionally, sort of related to this, it's interesting: they're taking the Safari privacy report and including that on iOS as well.

00:06:28   And so this is just a way, when you're browsing around in Safari, that you can now also see what trackers or cross-site cookies and things like that a website is using.

00:06:44   And I think this is just a very interesting tool in the same way of just, it's not limiting anything.

00:06:48   It isn't Apple coming in and saying you can't do this, necessarily, though they do a whole bunch of that with their anti-tracking technology; moreover, they're just making you aware.

00:06:58   And I think that awareness on websites is just as useful as it is inside of applications.

00:07:03   Oh, totally. I mean, websites have been historically the worst offenders because even though they didn't have a lot of access to the hardware resources on your devices, they still ransacked your privacy with whatever they could get their hands on, whatever information was available to websites they would exploit and be disgusting.

00:07:25   Apps have the potential to be even scarier because they do have more access to even more sensitive resources.

00:07:34   Things like your location, if they've gotten permission to do that, things like camera, microphone access, that's why Apple fights so hard for this stuff.

00:07:41   Because we saw what the tracking industry did with the limited resources they had at their disposal on the web.

00:07:48   Now that they have apps, they will do anything they possibly can, and that's where I think a lot of Apple's push for this stuff comes from.

00:07:55   And I think what's interesting is that the next kind of thing that they're doing is also just trying to make this part of the conversation.

00:08:02   And I think the next thing I want to talk about is their privacy report that they're now having.

00:08:06   It's slightly unclear as to exactly how we're going to be providing this as app developers, but as app developers we're going to have to sort of self-report all the various attributes of privacy that we might be collecting, what data is available, what data is collected by us, what data is collected and shared with third parties.

00:08:26   It's a little bit vague exactly how this is going to be, but as far as I can tell what Apple is going to do is there's going to be some section, probably it reminds me of the same way that the app rating system sort of works in App Store Connect, where there's a whole series of questions that you have to kind of check.

00:08:42   It's like, "Do you use this kind of data? Do you use this kind of data?" And it goes through this whole list, and it's going to be available in our App Store entry now.

00:08:49   And obviously this is self-reported, so it's going to be a little bit interesting to see how they handle that in terms of the accuracy of that report.

00:08:57   Like, are they going to be doing the kind of validations you could imagine where they scan your binary and they see you're linking to this analytics framework, but you're not reporting it in your thing?

00:09:07   Or is it entirely the honor system? But I think it's interesting that they're going down this road of just trying to make this a constant part of the conversation around applications.

00:09:17   That on your phone, anything that's collecting anything about you is very upfront, has to be very clear.

00:09:23   And I think, honestly, I'm kind of glad this is in here now, because they've had a requirement for a long time around having to have a privacy policy.

00:09:31   That's something that they've said for a long time. I can remember, at least two years ago I think it was, every app has had to have a privacy policy.

00:09:38   But what it is and what it says is kind of arbitrary and up to the individual developer. You just have to have something.

00:09:46   Whereas I feel like this is a nice standardization of that. And in many ways, it's like an easy, consistent view of a privacy policy that's not the same as a privacy policy.

00:09:57   Because there's things in the policy side where it's like, what do you do with business transfers? Or there's other more legalistic stuff.

00:10:04   But at a high level, the attributes that I think most people care about in a privacy policy are just knowing: what data of mine are you collecting?

00:10:12   And are you sharing it with anyone else? And it seems like we're going to have to start reporting that.

00:10:17   And then it'll just be shown as an entry in the App Store. So I don't know how much people will actually scroll down to wherever this might be.

00:10:24   But based on the screenshots that Apple has provided for this, it seems like it's going to be relatively prominent in our App Store entries.

00:10:31   So it's, again, sort of this extra incentive for developers to be thoughtful and not just throw in an analytics package just because.

00:10:39   Because if they do that, they may have to start checking a whole lot of additional check boxes.

00:10:43   Yeah, and this is, I think, it comes down entirely to consistency of enforcement.

00:10:49   Which unfortunately the App Store has not been very good at, especially things like this that are kind of hard to enforce consistently.

00:10:56   Because it's kind of hard to be able to know whether a developer is being truthful about some of these things.

00:11:02   I do think it will be significant for all of us out there; it will finally kind of make us check to see what all these SDKs are that we're including in our apps.

00:11:14   It's going to force us to actually make decisions and be like, "Okay, well, if I add this analytics SDK that makes my life easier in this one way,

00:11:24   then my App Store page for my app is going to get less appealing to people in this other way."

00:11:31   And so this thing that is free to me, and it seems, I mean, we have such a problem in this business with so many developers adding a bunch of SDKs

00:11:39   because they're free and they provide some useful function to us.

00:11:43   But then the result is our app is sharing all this data in possibly creepy ways that these developers didn't even necessarily think about or know about when they added the SDK.

00:11:55   Assuming they can at least automate the detection of popular SDKs, like you said, popular ad and tracking stuff,

00:12:03   like I'm sure Apple will find ways to do that, then that'll be great in the sense that it will make developers face these questions

00:12:10   and it'll cause a lot of developers to decide not to leak this data to somebody else because maybe the SDK doesn't provide enough value to make that worth it.

00:12:19   Ultimately this will come down significantly to enforcement because what I like about this is that I think I'm doing things the right way.

00:12:28   I don't include any third-party SDKs in my app. I don't share data with anybody else. I care a lot about privacy.

00:12:34   But for the most part, that's been kind of like a virtue that only you got to enjoy, you know, because as a developer, your users couldn't really tell,

00:12:44   unless they did a lot of research, they couldn't really tell when they were looking at whatever category your app is in,

00:12:50   like, "Oh, I can get this pedometer app that is very respectful of my privacy."

00:12:55   Now you can tell, like, "Oh, this other pedometer app is full of tracking and creepy stuff, and this wonderful one here is not."

00:13:04   And so it can be an advantage to us that if you're doing things the minimal tracking way or the minimal creepiness way,

00:13:13   the most private way possible, now you have a slight advantage in the app store, as opposed to there being no difference at all visible to customers,

00:13:21   and therefore it being a lot harder to make the right decision for a lot of people.

00:13:25   Yeah. And I like, too, that it creates a burden on the vendors of third-party libraries and SDKs to presumably have to publish, in some form, for their users what they're actually collecting and doing,

00:13:45   so that developers have the ability to accurately fill in this report.

00:13:49   And I think that is just a useful exercise that kind of forces the vendors to be thoughtful about what they're doing,

00:13:56   and then that trickles down to developers being thoughtful about what they're doing.

00:14:00   And I think that knock-on effect is just a useful thing, because it's not always clear, when you're including an analytics package,

00:14:08   what data is being collected. And I think Apple is clear about this even in some of their documentation:

00:14:15   even if you include an analytics package and you are using it for one particular purpose,

00:14:22   it may also collect additional data, even if you are not using that data. With many of these platforms,

00:14:30   they're collecting data in aggregate for additional purposes, and you may not be aware of that as a developer.

00:14:38   And so I think those vendors will have to be upfront about that, potentially and in theory. And obviously, yes, it's a massive enforcement question

00:14:46   exactly how that works, and I expect this is one of those things that will have a bumpy start and then hopefully settle down over time.

00:14:54   But I like that it will also mean that the vendors... you can kind of imagine their instructions saying,

00:15:02   here's how to install this into your app, here's the Swift package that you need to include, here's the couple of lines of code to

00:15:09   enable the package, here's how you use it for whatever metric you're collecting or whatever it is,

00:15:15   and then the last step is: you need to add these three check boxes to your privacy report,

00:15:21   because this is the data that we're collecting. Maybe that's naive to think that's the way this is going to go,

00:15:27   but ultimately, now that's a question that I have in the back of my mind if I was ever to think of using a third-party platform:

00:15:33   what check boxes am I going to have to check on the report, and can I believe and rely on this vendor that if that data is being collected,

00:15:43   it's being collected and used in a reasonable way?

00:15:46   You know, before, there was no downside to including pretty much anything. Whereas imagine if it were: well, you can add the

00:15:54   Google ad analytics thing or whatever, but then you'll have to make your app rated 17-plus.

00:15:59   You can imagine a lot of people would choose differently then. They'd be like, well, that will cost me some sales, maybe it's not worth it.

00:16:04   And so now, to have that kind of disclosure on the page, again, this is all assuming enforcement, which I think is a huge question mark,

00:16:12   but if it is reasonably consistently enforced, and if egregiously bad behavior is actually found and punished,

00:16:21   as opposed to just like, you know, all the unethical companies not disclosing what they're doing, and it looks like you're the same even though you're not,

00:16:29   that could be a pretty bad failure mode. But if this is actually enforced consistently,

00:16:35   and if Apple is able to detect, with any reasonable accuracy, when people are lying about what they are including or not including,

00:16:42   then this can be great, and it can really be a boon to those of us who try to do things in a more privacy respectful way,

00:16:50   and it gives a direct incentive for more apps to do that. And that I think is wonderful,

00:16:58   and I really, really hope that this does get effectively enforced for that reason.

00:17:03   This episode of Under the Radar is brought to you by RevenueCat.

00:17:06   So if you get to a point in developing your application where you want to add an in-app subscription,

00:17:10   you are immediately opening a giant box of complexity and in many ways pain,

00:17:15   dealing with receipt validation, user management, all the backend services and reconciliation that you have to do is just challenging,

00:17:22   and that's what RevenueCat handles for you as a developer.

00:17:25   That you instead focus on building features that matter to your customers rather than just worrying about accounting.

00:17:30   They support iOS, macOS, Android, and Stripe, and they just make it easy to verify subscription status across platforms

00:17:36   and deal with subscriptions in just the same rational way.

00:17:40   They have SDKs for iOS, iPadOS, watchOS, Android, React Native, Flutter, Cordova, Unity, even macOS and Catalyst,

00:17:46   and they just make it straightforward to add subscriptions to your application.

00:17:50   I use them in WatchSmith and I used them before they were a sponsor of the show,

00:17:56   and it really is as straightforward as you would imagine to just add subscriptions to your application.

00:18:00   It took me a couple of hours to get up and running, and it's been working great for me ever since.
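
For a sense of what "a couple of hours" looks like, here is a rough sketch of the basic RevenueCat flow as the iOS SDK worked around this time; the API key and the "pro" entitlement identifier are placeholders, and the exact method names may differ in newer versions of the SDK.

    import Purchases

    // Configure RevenueCat once at launch (the API key here is a placeholder).
    func configureRevenueCat() {
        Purchases.configure(withAPIKey: "public_sdk_key")
    }

    // Ask RevenueCat whether the user's subscription entitlement is currently active.
    func refreshSubscriptionStatus(completion: @escaping (Bool) -> Void) {
        Purchases.shared.purchaserInfo { info, _ in
            // "pro" is a placeholder entitlement identifier.
            completion(info?.entitlements["pro"]?.isActive == true)
        }
    }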

00:18:05   They even have automation and webhooks and things that you can add on the backend to have even more intelligence

00:18:11   and interactivity with if you have existing infrastructure or things that your revenue system needs to connect to.

00:18:17   And the great thing too is RevenueCat is free to start.

00:18:21   It's free until you ship, and then even beyond that their pricing is very reasonable,

00:18:25   and it kind of grows and scales with the success of your project, which I think seems really fair and lovely.

00:18:31   So you can relieve any of your subscription worries if you want to get started by going to RevenueCat.com.

00:18:36   You can start for free there. That's RevenueCat.com, and getting started is free.

00:18:41   Thank you to RevenueCat for sponsoring Under the Radar and Relay FM.

00:18:45   So related also to the privacy report we were just talking about,

00:18:50   I think the most direct and significant programmatic change that we are going to have to deal with and adapt

00:18:57   is the new sort of app tracking, or ad tracking, prompt, which is,

00:19:03   I suppose, the App Tracking Transparency framework, I think, at a technical level.

00:19:08   But essentially there is this new prompt where, from a policy perspective, Apple has said,

00:19:13   "Any time that you are doing any kind of sort of third-party tracking of your users,

00:19:19   or collecting data about that for," the classic example for this is something like targeted advertisements,

00:19:25   though this could also be potentially used in situations where data is being sent to a data broker,

00:19:30   so you think of all the weather apps that send your location data to third parties,

00:19:36   and they were sort of going off and being used for location tracking.

00:19:40   Any of those circumstances, there now is a privacy prompt, and it is the most scary,

00:19:46   "who would ever hit yes to this?" prompt I think I've ever seen in the history of iOS privacy.

00:19:54   And it's very kind of like, "This app would like your permission to track you across apps and websites

00:19:59   owned by other companies." It's like, "Would you like to allow this tracking,

00:20:04   or would you like to ask the app not to track you?"

00:20:08   And I think most significantly, it's like you won't have access to the advertising identifier framework

00:20:15   if you don't prompt this and the user doesn't say they allow you to track.

00:20:20   So I think almost all advertising frameworks currently use the advertising identifier, the IDFA,

00:20:25   which is a durable identifier that's shared across apps on the same device,

00:20:32   so that they--if you've ever had the situation where you go to Safari,

00:20:40   or you're in one app and you search for something, and then you see an advertisement for that same product

00:20:47   in another application, and it's like, "How do the two apps even know that I'm the same person?"

00:20:51   It's because of this identifier that allows advertising frameworks to link the same user together,

00:20:57   which allows for targeted advertising, which tends to increase advertising rates.

00:21:01   But now, if we want to include that, or if we want to do any kind of external tracking of the user,

00:21:07   we have to do this prompt. And I think this is going to have huge ramifications for advertising on apps.
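
For reference, the programmatic side of this prompt is small. A minimal Swift sketch, assuming the app has an NSUserTrackingUsageDescription string in its Info.plist (the function name is mine):

    import AppTrackingTransparency
    import AdSupport

    // Show the system tracking prompt; the IDFA is only meaningful if the user allows tracking.
    func requestTrackingPermission() {
        ATTrackingManager.requestTrackingAuthorization { status in
            if status == .authorized {
                // Only now does this return a real, non-zero identifier.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("IDFA: \(idfa)")
            } else {
                // Denied, restricted, or not determined: the identifier is all zeroes,
                // so fall back to non-targeted ads or no tracking at all.
                print("Tracking not authorized")
            }
        }
    }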

00:21:14   Certainly, as someone who uses--right now I use Google's AdMob advertising.

00:21:21   Anytime I use advertising, that's what I use, and I think I generally use it because

00:21:25   we've had episodes about this before, but I think even at a high level, I think Google has the most to--

00:21:34   they have such a large and high-profile position in the market that if they're doing things that are illegal or problematic,

00:21:43   they're going to be found out, and they're going to be held accountable for them in a way that, if I was using

00:21:47   a smaller, more startup-y kind of advertising system, then I have less confidence about that.

00:21:53   But with Google, at least, I expect that they're being held to a higher standard, and so that works out.

00:21:58   But as far as I know, they haven't even announced yet what they're going to be doing,

00:22:02   so I don't know what I'm going to be doing this fall with my advertising, because I definitely am never going to show this prompt.

00:22:09   And so it may just mean that my apps show non-targeted advertising, which is fine for me.

00:22:14   If that affects my rates, that affects my rates, but I think generally it's going to be very interesting to see what this effect has

00:22:22   on all kinds of applications that rely on advertising, because even if just across the board it reduces the effectiveness of advertising

00:22:31   and in turn then reduces the advertising rates, that's going to be a felt impact on a lot of developers, myself included.
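
On the non-targeted fallback mentioned here: AdMob documents an extra for requesting non-personalized ads, and a sketch of it would look roughly like this. Treat the exact classes and the "npa" flag as a recollection of Google's documentation at the time, an assumption to verify rather than a confirmed integration.

    import GoogleMobileAds

    // Build an ad request marked as non-personalized via AdMob's "npa" extra.
    func makeNonPersonalizedRequest() -> GADRequest {
        let request = GADRequest()
        let extras = GADExtras()
        extras.additionalParameters = ["npa": "1"]  // ask for non-personalized ads
        request.register(extras)
        return request
    }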

00:22:38   Yeah, I have a less optimistic take on this particular part of the privacy enhancements.

00:22:44   I think it's very unlikely that this is going to work the way people want it to.

00:22:49   The way that we want this to work, and the way Apple has hoped or promised it should work,

00:22:55   is you get control over whether your data is used in creepy ways, and the app has to ask you,

00:23:01   and then if you say no, the app has to then not be creepy. And I think that's very optimistic.

00:23:08   I think a much more likely set of outcomes here is apps maybe retaliate if you say no.

00:23:15   And maybe they say, "Well, then you can't use this new feature," or "You don't get the package gems,"

00:23:20   or "The app can't function without tracking for BS, reason, XYZ." It all comes down to really they'll make less money, as you said.

00:23:28   I think the rates between tracking and not tracking, if it was actually done the way Apple wants it to be done,

00:23:37   apps would make significantly less money, probably. A lot of them can't do that or won't do that.

00:23:42   So I think there's two failure modes here. Failure mode number one, well, I guess there's three.

00:23:47   Failure mode number one is they don't even consider what they're doing to be "tracking," and so they never even ask.

00:23:54   Failure mode number two is they retaliate if you say no in some way. They penalize the user in some way for saying no

00:24:03   so that you change your mind and go say yes. That would be the thing like withholding gems in a game

00:24:08   or withholding features if you say no. And then failure mode number three is they seem to obey your preference,

00:24:16   and then they track you anyway. Because they do something with the data that they, again,

00:24:21   they don't think they're tracking. We're just merely viewing you across multiple sites.

00:24:26   We are associating your purchase activity between multiple--

00:24:30   Well, we're anonymizing, right? I think the classic thing people say where it's like,

00:24:34   "Oh, we're anonymizing." It's fully anonymized location data, but the fact that it's like--

00:24:39   It seems like this one person is spending a lot of time at this address. I wonder who that could be.

00:24:43   Just because you anonymize data doesn't mean that you can't be tracked or be followed in that way.

00:24:48   Right. So failure mode number three is they seem to be obeying you, but then what they're really doing

00:24:54   is a thinly veiled, excuse-covered or euphemism-covered version of tracking, where any reasonable person

00:25:02   would consider the results of it tracking, but they have found some way to convince themselves

00:25:06   that it's not tracking. So those are three really big failure modes that I think all three will happen.

00:25:12   So I don't think this particular feature is going to work out the way anybody expects.

00:25:19   And that's a shame. I hope I'm wrong. I hope I'm totally wrong and that when Instagram puts up the thing

00:25:26   saying, "Hey, can we track the crap out of you?" and I say no, I hope then I don't see the creepy targeted ads.

00:25:32   But I just think that's incredibly unlikely in practice. And I think the much more likely scenario is

00:25:38   this is going to be largely ignored the same way the "do not track" header on the web was ignored.

00:25:43   Yeah, maybe. And I think it is very interesting to see. And this feels like something that is happening.

00:25:48   It's like, I think the biggest things, the biggest implications of this, are going to have to be

00:25:55   worked out between these huge companies, which is kind of weird as a small indie developer to think

00:26:00   that this is like Apple is essentially fighting Google and Facebook and Amazon.

00:26:05   These are the kinds of companies who are now in this fight, and it's going to be very interesting to see

00:26:10   how that all ends up shaking out as a result. Because to your point, the main enforcement mechanism

00:26:20   that Apple has is that they can restrict the identifier unless someone has said yes, which is useful.

00:26:28   But the reality is I'm sure there are plenty of fingerprinting techniques that applications can do

00:26:35   that make it so they can generally know who you are, even if they don't have the identifier.

00:26:40   The identifier just made it so much easier. So it's like they're certainly able to continue doing this

00:26:45   in the background or do this in ways that are not within the spirit of the rule, but within the letter of the law, maybe.

00:26:54   It's possible, but I've got to say though, at the end of all this, it makes me wish that Apple still had iAd.

00:27:00   I really appreciated back in the day being able to just have advertising in my applications that paid reasonably

00:27:08   and wasn't creepy. That was something that I really longed for, that they would bring that back now.

00:27:14   And even if it was, they have this whole search ads business, and even if all iAd was was search ads,

00:27:19   essentially recommendations for other applications rather than them going down the road of trying to be this massive,

00:27:25   "They want Coca-Cola, and they want Nike, and they want all these big brands,"

00:27:29   and this whole thing that they ended up trying to do. I would be happy with it just to be smaller and simpler

00:27:35   and to not have these things hanging over my applications and feel like I'm caught between Google and Apple

00:27:43   as they fight out this privacy war. That doesn't feel great to me,

00:27:47   and I do kind of wish I had an alternative. But as far as I know, with any kind of non-native advertising,

00:27:54   unlike the advertising you do in Overcast, where it's purely on you and you're not reliant on anyone else,

00:28:00   with any advertising network that you make yourself part of, I'm not aware of any that are super privacy-focused

00:28:08   and that you would be able to add to your application without any kind of concerns about this kind of tracking stuff.

00:28:15   So that's a little depressing to me as someone who, that is how I make money in some of my apps,

00:28:20   and so it's a reality that isn't going away anytime soon.

00:28:24   Well, I think the silver lining here is that either these protections won't work,

00:28:30   and therefore everything can keep going the way it's been going,

00:28:33   or they will work, and it will create a void in the market that maybe a more privacy-respectful ad network can fill,

00:28:40   or maybe the big ones like Google will start adding more privacy options and start selling more privacy inventory

00:28:46   if the demand is there. So we'll see what happens. Maybe that's a pipe dream, but I'm always an optimist in general.

00:28:53   The specifics I'm a little pessimistic on sometimes, but in general I'm an optimist.

00:28:57   I think things are going in a very good direction here.

00:29:00   Yeah, and I think maybe the general thing is just that they're heading in a good direction,

00:29:05   and no one of these things we've talked about today is going to solve privacy on iOS,

00:29:12   but I think each one of them is a step in the direction of better privacy,

00:29:17   and that's all: Apple is just continuing to apply pressure in this area.

00:29:22   It's like they're pushing against the wave of creepiness and abuse of privacy,

00:29:27   and as long as they keep pushing against it, then I feel good as a developer on their platform.

00:29:34   I like that they continue to apply this pressure, and exactly where that pressure manifests itself in the future is hard to know,

00:29:40   but I think it's definitely encouraging, and I expect we'll have another round of this on iOS 15.

00:29:45   Thanks for listening, everybody, and we'll talk to you in two weeks.

00:29:48   Bye.