85: iOS 11
00:00:00
◼
►
Welcome to Under the Radar, a show about independent iOS app development.
00:00:04
◼
►
I'm Marco Arment.
00:00:05
◼
►
>> And I'm David Smith.
00:00:07
◼
►
Under the Radar is never longer than 30 minutes, so let's get started.
00:00:11
◼
►
So today we are going to sort of continue unpacking WWDC and transition from watchOS
00:00:18
◼
►
that we talked about last week.
00:00:19
◼
►
This week we're going to start talking about iOS.
00:00:22
◼
►
We'll see how it goes in terms of if it's just today or this week and next week.
00:00:26
◼
►
But there's a lot of really cool stuff in iOS, and I think it's fun to kind of walk
00:00:30
◼
►
through it both in terms of things that we think are cool for ourselves and for our own
00:00:36
◼
►
apps, as well as to just unpack some of the cool stuff there.
00:00:39
◼
►
Because I think this year, with their emphasis on technologies and refinements, like on the
00:00:46
◼
►
technology side, they added a whole bunch of stuff that is really cool and interesting
00:00:50
◼
►
technically.
00:00:52
◼
►
But I'm not sure yet; it doesn't have much application, if any, to my own projects.
00:00:58
◼
►
But it's still interesting, and I'm trying my best to be aware of those technologies,
00:01:04
◼
►
just so I can have them in the back of my mind.
00:01:07
◼
►
But probably the best thing also just to mention, if you haven't gone through a lot of these
00:01:14
◼
►
beta cycles, is that there are usually two documents that are the best place to sort of get started
00:01:18
◼
►
with looking at what's new in iOS.
00:01:22
◼
►
And there's usually an actual document called "What's New in iOS."
00:01:25
◼
►
I'll have a link to it in the show notes, which is Apple's sort of like big, high-level
00:01:31
◼
►
structural change document, which is great.
00:01:35
◼
►
And then there used to be the old API diffs document, which Apple doesn't really do anymore,
00:01:40
◼
►
which I have actually filed a radar about, which I'll also have a link to in the show notes.
00:01:45
◼
►
And instead, somebody has independently generated a nice diff, which I'll have a link to as
00:01:51
◼
►
well, which is great.
00:01:52
◼
►
Like the API diffs are super low level, which is probably why Apple got rid of them.
00:01:57
◼
►
But I've always found them to be tremendously helpful to just kind of survey all of the changes.
00:02:03
◼
►
And it's the only place you'll ever find these little one-method changes where suddenly something
00:02:08
◼
►
that used to be really hard, now there's a convenience method for it.
00:02:13
◼
►
Or they've added -- I run into this a lot with HealthKit, where they add a very specific
00:02:18
◼
►
subtype somewhere.
00:02:20
◼
►
And it doesn't show up in the high-level documents.
00:02:22
◼
►
So anyway, I'd recommend just browsing through these things and just looking for things that
00:02:27
◼
►
might be relevant for you.
00:02:29
◼
►
And that's sort of the process that I take.
00:02:32
◼
►
And it seems to work.
00:02:33
◼
►
>> Yeah, I agree.
00:02:34
◼
►
Those API diffs, it's a little hard to filter through them to get things that are actually
00:02:41
◼
►
really relevant to you.
00:02:42
◼
►
Because in recent years, ever since Swift's announcement, many of the actual differences
00:02:49
◼
►
in the APIs have been things that aren't really substantive, things like nullability changes
00:02:53
◼
►
or changing id to instancetype.
00:02:57
◼
►
So there's been a lot of changes like that that most of the diffing tools, including
00:03:02
◼
►
the one that Apple used to publish, they keep in there.
00:03:05
◼
►
So you have to do quite a lot of scrolling sometimes and filtering through.
00:03:12
◼
►
And to get rid of that, to do a more intelligent version, you basically have to build a compiler.
00:03:17
◼
►
So most people don't do that, understandably.
00:03:20
◼
►
But I always find things in the API diffs that I would not have found if I was only
00:03:27
◼
►
watching sessions and seeing the videos and stuff like that.
00:03:30
◼
►
>> The first technology, and I don't know if there's a particular order that makes
00:03:33
◼
►
sense for this, but one that I'm kind of fascinated by but have absolutely no use for is CoreML,
00:03:41
◼
►
which as best I can tell is a way that Apple is now packaging up machine learning:
00:03:51
◼
►
you take a giant bucket of data and pour it through some kind of modeling system to end up with
00:03:58
◼
►
a feature identification or that kind of machine learning process, and they're making it ridiculously
00:04:07
◼
►
fast on their ridiculously custom hardware, which is really cool.
00:04:13
◼
►
I see some of the things that they're doing where it's like, here's a picture, tell me
00:04:16
◼
►
where it was taken, tell me the objects that were in it.
00:04:19
◼
►
And in many ways, this seems like an extension of all of the stuff that they've been doing
00:04:23
◼
►
themselves in the Photos app, for example, or the Camera app for a long time.
00:04:29
◼
►
They've been doing this kind of work previously, and now they've just turned it into a general
00:04:34
◼
►
purpose framework, which is great.
00:04:37
◼
►
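To give a concrete sense of what that framework buys you, here is a minimal Swift sketch of running an image through a bundled Core ML model via the Vision framework. FoodClassifier is a hypothetical model class, the kind Xcode generates from any .mlmodel file you drop into a project:

```swift
import CoreML
import Vision
import UIKit

// Hypothetical: "FoodClassifier" stands in for whatever class Xcode
// generated from your .mlmodel file.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: FoodClassifier().model) else { return }

    // Vision wraps the Core ML model and handles scaling/cropping the input.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let best = results.first else { return }
        print("\(best.identifier): \(best.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Everything underneath that print statement, the model format, the hardware dispatch, the image preprocessing, is Apple's problem, which is the appeal.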
I don't think I have much to do with it yet.
00:04:40
◼
►
I'm starting to play with it a little bit in some of the things that I do, for example,
00:04:43
◼
►
in sleep analysis, where I do some basic processing on motion data to identify different periods of sleep.
00:04:51
◼
►
Theoretically, that kind of thing might ultimately be possible to turn into a machine learning
00:04:55
◼
►
kind of a situation, but it is so outside of my expertise that that's a project for
00:05:04
◼
►
probably something like next spring when things tend to get slow and I'm just looking for
00:05:08
◼
►
a little fun thing to do for a couple weeks.
00:05:11
◼
►
And it may work, it may not, but in general, it's kind of cool.
00:05:15
◼
►
And it's the kind of technology that I always love it when Apple can bundle up something
00:05:20
◼
►
that they've developed internally and then publish it to us, because usually that means
00:05:25
◼
►
that it's fairly well baked, fairly well thought through.
00:05:28
◼
►
They've built this for themselves first and then are just exposing it to us.
00:05:32
◼
►
And so I would expect it to go reasonably well if you have an application that can take advantage of it.
00:05:38
◼
►
>> Yeah, I too really do not know very much about modern machine learning techniques and
00:05:46
◼
►
frameworks and I don't even have a good vocabulary on these technologies to be able
00:05:52
◼
►
to not only discuss them intelligently at all, but also I don't even know if I can
00:05:58
◼
►
use them or not.
00:05:59
◼
►
I really need to devote some time, and I think your suggestion of possibly the quiet spring
00:06:04
◼
►
time is a really good idea, to just familiarize myself with modern machine learning algorithms
00:06:11
◼
►
and types and just what these tools can even do before I have any idea whether I can use
00:06:16
◼
►
them or not.
00:06:17
◼
►
Yeah, and I don't know if there's anything more to say on CoreML other than it looks
00:06:22
◼
►
really cool, but way over my head.
00:06:27
◼
►
The next one, and this is one I understand a bit better, though I'm not really sure
00:06:32
◼
►
I have anything to do with it, is ARKit, which is the other sort of
00:06:37
◼
►
brand new technology that Apple seems to be pushing this year.
00:06:41
◼
►
And I think when I first heard about AR, and I mean, it feels like for the last couple
00:06:48
◼
►
of years, like Tim Cook has kept saying AR is the future when everyone asked him about
00:06:52
◼
►
VR and everyone's like, "Oh, no, no, no, it's AR."
00:06:54
◼
►
I'm not sure I really got what he was talking about or why it would be interesting.
00:07:00
◼
►
And now that this technology is out and it makes building AR or augmented reality applications
00:07:06
◼
►
reasonably straightforward, it's interesting to see.
00:07:11
◼
►
And I think the key thing that I've noticed in this is that AR doesn't allow you to do
00:07:17
◼
►
something that you couldn't do previously with just regular 3D graphics.
00:07:26
◼
►
The demo they had at the WWDC keynote of the little scene where spaceships are coming in
00:07:33
◼
►
and people are interacting, that's cool.
00:07:36
◼
►
That could just as easily not have had a table as the background, and we've been able to
00:07:41
◼
►
do that forever.
00:07:42
◼
►
But I think the thing that's been interesting now that this technology is out there, and
00:07:47
◼
►
that is making me start to understand why this is really potentially useful, is some
00:07:55
◼
►
examples that have started to crop up since the first beta came out.
00:07:59
◼
►
And specifically, there's a great website madewitharkit.com, which is just a simple
00:08:05
◼
►
site where somebody aggregates the videos people post of cool stuff they're doing with it.
00:08:12
◼
►
And I think where I finally had the cognitive switch is on why ARKit is powerful: because
00:08:20
◼
►
it makes something like that, that same kind of thing that we've had for years of a 3D
00:08:25
◼
►
model in space, it makes the interaction with that 3D space feel natural and feel less complicated.
00:08:37
◼
►
Like you or I might feel entirely comfortable navigating a 3D world with a keyboard and
00:08:43
◼
►
mouse in a 3D shooter or a flying game or something like that, where once you get used
00:08:50
◼
►
to that, I can fairly adeptly move myself through 3D space, which is probably not necessarily
00:08:59
◼
►
the case for everybody.
00:09:00
◼
►
It might be really confusing and complicated to navigate in a generated 3D world, but if
00:09:08
◼
►
you take that 3D world and move it into AR, suddenly it is immediately natural and
00:09:16
◼
►
understandable.
00:09:17
◼
►
You're looking through this little window on your phone, but it looks just like you're
00:09:23
◼
►
looking at your desk, you're looking at the floor.
00:09:26
◼
►
It's not this crazy thing and it behaves like you would expect.
00:09:30
◼
►
If you pick your phone up and move it up, you look higher on the thing, you look lower.
00:09:35
◼
►
And that I think is where things are getting interesting.
00:09:38
◼
►
I'm not sure if I have an app yet for this, but once I wrap my head around it, the key
00:09:44
◼
►
thing here is that it takes this 3D graphics modeling world and makes it immediately understandable.
00:09:52
◼
►
I imagine you could give an AR app to a young child and they would get it immediately.
00:09:58
◼
►
It's immediately obvious that as you move it around, you can see things around.
00:10:02
◼
►
As you move closer, things get bigger.
00:10:04
◼
►
As you move farther away, things get smaller.
00:10:06
◼
►
And that is really interesting.
00:10:09
◼
►
I don't know if I have anything to do with this yet.
00:10:13
◼
►
I saw one health and fitness app that was taking advantage of it, which was really cool,
00:10:18
◼
►
where they used mapping data to generate a 3D map of a run they went on.
00:10:26
◼
►
And you could kind of project that onto a table and you could look at what you did and
00:10:32
◼
►
watch the trace of it move around, which is really cool.
00:10:35
◼
►
I don't know if that really makes sense for adding to a workouts app.
00:10:40
◼
►
But there's something here and it's kind of cool to see.
00:10:44
◼
►
And if anything, probably ARKit is one that I'm most looking forward to this fall, just
00:10:48
◼
►
like downloading a ton of apps that are doing fun stuff with it and trying it out.
00:10:55
◼
►
>> It's important to see this now as a 1.0.
00:10:59
◼
►
AR right now seems, first of all, as you said, there's a lot of cool features and things
00:11:05
◼
►
we could do with this that we mostly just haven't really thought of yet.
00:11:09
◼
►
And it is awfully specialized.
00:11:11
◼
►
Most apps are going to have no use for this, as with many of these technologies.
00:11:15
◼
►
But that's not to say that there aren't good uses for it.
00:11:19
◼
►
And it is still also a very early version of AR.
00:11:23
◼
►
So for instance, this has to run on phones that only have one camera.
00:11:27
◼
►
And in a while, in a few years, maybe every iPhone that runs the current OS will have
00:11:32
◼
►
two cameras and it can do cooler things.
00:11:35
◼
►
Right now, it's basically table kit.
00:11:38
◼
►
You can do really cool stuff on tables.
00:11:41
◼
►
And that's great.
00:11:42
◼
►
I don't know a lot of people who have a bunch of empty tables lying around who want to fill
00:11:45
◼
►
them with 3D models.
00:11:46
◼
►
But we will find cool uses for it.
00:11:48
◼
►
But it's also going to get better over time.
00:11:50
◼
►
Like already, the current ARKit is limited only to horizontal surfaces.
00:11:54
◼
►
But in the docs, there's a couple of mentions of vertical surfaces.
00:11:57
◼
►
And the APIs for those are conveniently missing.
00:12:00
◼
►
So clearly, in the future, they're probably going to add vertical surfaces.
00:12:04
◼
►
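The horizontal-only limitation is visible right in the session setup. This is a rough sketch of the iOS 11 API as shipped; the delegate callback fires whenever ARKit decides it has found a usable plane:

```swift
import ARKit

// Minimal sketch: run a world-tracking session that detects horizontal
// planes, and get told when ARKit finds one.
class PlaneFinder: NSObject, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    func start() {
        sceneView.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal   // the only option in iOS 11
        sceneView.session.run(config)
    }

    // Called when ARKit adds an anchor; plane anchors carry an estimated extent.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("found a surface \(plane.extent.x) x \(plane.extent.z) meters")
    }
}
```

If vertical detection does ship later, the natural place for it is presumably another case on that same planeDetection option, which is why the docs mentioning it without a corresponding API reads like a hint.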
And then maybe it'll be better detecting irregular surfaces, surfaces that aren't perfectly flat.
00:12:08
◼
►
Like if you're outside and you're looking at a hill, how do you map that into AR space?
00:12:14
◼
►
Because it isn't a single plane or it isn't a flat plane.
00:12:17
◼
►
So over time, this is going to get better.
00:12:19
◼
►
And the potential uses for it are going to grow.
00:12:22
◼
►
So right now, while it seems like hyper-specialized, I think it actually, over time, will become
00:12:29
◼
►
much bigger than that.
00:12:30
◼
►
So I'm looking forward to it.
00:12:32
◼
►
And I guess it's just fun.
00:12:35
◼
►
It's cool to see something like this because you can just kind of make these silly but
00:12:40
◼
►
very fun apps.
00:12:44
◼
►
It's just a really cool technology.
00:12:45
◼
►
And I love it when a new API or SDK comes out with something
00:12:52
◼
►
like this, where it isn't so much like it's this brand new, totally new thing.
00:12:58
◼
►
But what it is, is that they're dramatically lowering the barrier to entry for trying something new.
00:13:05
◼
►
Especially, you know, like the actual math and graphical processing and everything going
00:13:10
◼
►
on to make this work, even in its 1.0 state, is crazy town.
00:13:15
◼
►
Like I don't know how they can do what they're doing, but they do.
00:13:20
◼
►
And so they're lowering the bar so that a developer like you or me, who is just
00:13:25
◼
►
a one-man team or a small team, can probably reasonably do something in this space and
00:13:31
◼
►
make something happen, which previously may have just been completely impossible.
00:13:36
◼
►
And so it's kind of cool to just see them just like, "Hey, you know what?
00:13:39
◼
►
This is now like a given.
00:13:41
◼
►
You can do basic augmented reality stuff just for free.
00:13:45
◼
►
Go and have fun."
00:13:46
◼
►
And it's just lovely to just see like this huge, you know, it's like they always love
00:13:49
◼
►
at WWDC, they have these graphs or charts where they have, you
00:13:53
◼
►
know, it's like the stack of technologies on top of each other.
00:13:58
◼
►
And now there's this massive stack below ARKit that we just don't have to think about,
00:14:02
◼
►
like that all of the crazy complicated, you know, camera stuff that they're doing.
00:14:07
◼
►
And I mean, some of the stuff you kind of hear about bits and pieces of where like the
00:14:12
◼
►
stuff they have to do on the camera to, you know, like they track during optical image
00:14:18
◼
►
stabilization, as best I could understand, is that during optical image stabilization,
00:14:22
◼
►
as the lens is moving from side to side, they're actually tracking that in code and adjusting
00:14:28
◼
►
for it in their models.
00:14:31
◼
►
Which is like, "Whoa!"
00:14:32
◼
►
Like that's crazy, but like fair enough, that's what you have to do because otherwise
00:14:36
◼
►
you have, you know, the camera lens is shaking.
00:14:39
◼
►
It's you know, it's shaking back and forth to try and stabilize things and so if you
00:14:42
◼
►
don't adapt for that, you'd be introducing error.
00:14:45
◼
►
And so like that's really cool and I'm so glad that I don't have to write the code
00:14:48
◼
►
that works that out because that sounds really, really hard.
00:14:52
◼
►
>> Yeah, exactly.
00:14:53
◼
►
And this is one thing that makes this exciting for me too is that, you know, many apps on
00:14:58
◼
►
our phones and iPads are things that we've been able to do on desktop computers and laptop
00:15:03
◼
►
computers forever, right?
00:15:04
◼
►
And it's just like moving those tasks to mobile devices.
00:15:08
◼
►
AR is something that is like truly mobile.
00:15:11
◼
►
This is something that you effectively can't do on computers.
00:15:15
◼
►
Like just because of like the ergonomics of how you are holding them, what kind of hardware
00:15:19
◼
►
they have, what kind of cameras they have.
00:15:21
◼
►
Like you're never going to, I mean not never, but no one's ever really going to en masse
00:15:26
◼
►
take their laptops and wave them around in the air and do AR applications that way or
00:15:32
◼
►
move their iMac screen on their desktop angling around to like look at a table.
00:15:36
◼
►
Like that's never going to happen.
00:15:37
◼
►
This is like a mobile first thing and for people like us who are primarily or only mobile
00:15:42
◼
►
developers that's exciting to have these kind of new, completely new categories of
00:15:46
◼
►
things open up to us.
00:15:48
◼
►
Even if we don't have any immediate ideas for using them, like that will probably change
00:15:51
◼
►
over time and that will be great for us.
00:15:53
◼
►
Anyway, we are sponsored this week by DICE.
00:15:57
◼
►
DICE has been helping tech professionals like us take the next steps in our careers for
00:16:01
◼
►
more than 20 years.
00:16:02
◼
►
They have the insights and the tools needed to give you an edge.
00:16:05
◼
►
So if you're wondering what's next in your career, DICE's new career pathing tool
00:16:09
◼
►
can tell you about new roles based on your job title and your skills.
00:16:13
◼
►
And they can even show you which skills you will need to make the move.
00:16:16
◼
►
The DICE careers mobile app is the premier tool to manage your tech career from anywhere.
00:16:21
◼
►
You can find exactly what you're looking for with thousands of positions from top companies.
00:16:25
◼
►
And the DICE careers market value calculator allows you to understand what your skills are worth.
00:16:29
◼
►
You can discover your market value based not only on your job title and location, but also
00:16:33
◼
►
your specific skill set.
00:16:35
◼
►
So don't just look for a job.
00:16:37
◼
►
Manage your career in technology with DICE.
00:16:40
◼
►
Download the DICE mobile app and learn more at dice.com/undertheradar.
00:16:45
◼
►
That's dice.com/undertheradar.
00:16:48
◼
►
Our thanks to DICE for sponsoring this show and all of Relay FM.
00:16:52
◼
►
So another technology that blows my mind when they get into the weeds of how it's actually
00:16:57
◼
►
done is the new depth API, which is if you have, I guess right now it's just the 7
00:17:04
◼
►
Plus, but presumably it'll be, or yeah, the 7 Plus is the only phone that has the dual
00:17:09
◼
►
camera system, but presumably in the future there may be more cameras that can do this.
00:17:15
◼
►
And if you have one of those, they can do this crazy bit of a comparison between the
00:17:21
◼
►
two camera pictures and work out a fairly accurate depth map of what they're seeing
00:17:27
◼
►
in the picture.
00:17:28
◼
►
And this is at this point only used for that kind of simulated large aperture photography
00:17:37
◼
►
effect, where it blurs out the background, but keeps the subject of your picture in focus.
00:17:43
◼
►
>> Yeah, well, that's exactly what I was doing.
00:17:47
◼
►
I'm just walking my way around it.
00:17:49
◼
►
I don't want to go there.
00:17:52
◼
►
But I love that it's now just a general purpose API.
00:17:57
◼
►
You can do all kinds of stuff with this potentially.
00:17:59
◼
►
It doesn't have to just be, "Oh, now that we know depth, we can blur the background
00:18:06
◼
►
and make it look like a large aperture picture."
00:18:10
◼
►
You can do all kinds of stuff.
00:18:12
◼
►
The toy examples they showed were really cool, where you can say, "Hey, let's make the
00:18:18
◼
►
foreground image in color and the background of the image, let's turn it black and white."
00:18:24
◼
►
Or similar kinds of basic effects, or just taking advantage of this information is really
00:18:32
◼
►
kind of cool.
00:18:33
◼
►
And it's kind of fun to see, too, how I imagine this is sort of similar to ARKit.
00:18:40
◼
►
Apple is doing this really low-level image processing in real time that allows them to
00:18:46
◼
►
pull a lot of extra information and metadata out of what the cameras are seeing.
00:18:52
◼
►
And if I'm honest, this kind of makes me want to go out and buy a 7 Plus, because right
00:18:57
◼
►
now I just have an iPhone 7, just so that I can play with this and see what you can
00:19:02
◼
►
do with this data, now that we have this really straightforward and like the API for it is
00:19:08
◼
►
really pretty understandable and easy.
00:19:11
◼
►
And because of the new image format, too, the HEIF, I believe is what it's called,
00:19:19
◼
►
this data is also now available to us retrospectively.
00:19:22
◼
►
So it's like people don't even necessarily have to use our apps to take the picture.
00:19:28
◼
►
As much as custom camera apps are cool, I've probably bought and downloaded dozens of them.
00:19:35
◼
►
I always just use the built-in one because it's available right from the lock screen.
00:19:40
◼
►
And so it's great, though, that this data is now baked into those pictures, and so third-party
00:19:46
◼
►
apps can retrospectively take advantage of doing it, which is really cool.
00:19:50
◼
►
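Reading that depth data back out of an existing photo is surprisingly compact. Here's a sketch of the retrospective path, using the ImageIO auxiliary-data hooks that arrived alongside the new format support, assuming a Portrait-mode photo at a known file URL:

```swift
import AVFoundation
import ImageIO

// Minimal sketch: pull the depth (disparity) map out of an already-taken
// Portrait-mode photo, no custom camera app required.
func depthData(fromPhotoAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    // AVDepthData reconstructs a typed depth map from the dictionary form
    // that ImageIO hands back.
    return try? AVDepthData(fromDictionaryRepresentation: info)
}
```

From there, the depth map is just a pixel buffer you can feed into whatever per-pixel effect you dream up, the selective black-and-white example included.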
Yeah, and editing plugins, too.
00:19:52
◼
►
I mean, there's all sorts of applications for that.
00:19:55
◼
►
And so it's like I'm not sure what there is to do with the data.
00:19:58
◼
►
That seems to be the theme of the show so far.
00:20:01
◼
►
It's like I don't know what to do with this, but it's really cool.
00:20:04
◼
►
And the talk on it, actually, was probably one of the best talks I went
00:20:11
◼
►
to just from a comedy perspective.
00:20:15
◼
►
The guy who gave the talk, Brad I believe, was just really funny and made lots of jokes
00:20:20
◼
►
about depth, which I don't know, it appealed to my sense of humor.
00:20:24
◼
►
So if you haven't watched it already, the talk about the depth API is really good.
00:20:28
◼
►
I haven't heard any jokes about depth recently.
00:20:29
◼
►
I'll have to check this out.
00:20:30
◼
►
Well, you've just got to dig deeper.
00:20:33
◼
►
I mean, you've just got to really--exactly.
00:20:35
◼
►
So moving on.
00:20:39
◼
►
We got two new Siri intents in iOS 11, which are all about lists and notes.
00:20:46
◼
►
So it's reminders and notes and allowing apps like OmniFocus or Things to now interact with
00:20:58
◼
►
Siri, which is cool.
00:20:59
◼
►
I mean, I sort of like the way that they're doing intents in the sense that they're kind
00:21:04
◼
►
of methodically expanding out the way we interact with Siri.
00:21:08
◼
►
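For the notes side, the shape of a SiriKit handler looks roughly like this. A minimal sketch: the storage call is a hypothetical stand-in for whatever the app actually does with the note, and only the handle method is required by the protocol:

```swift
import Intents

// Minimal sketch of the new iOS 11 notes domain: handles
// "Create a note in MyNotesApp saying ..."
class CreateNoteHandler: NSObject, INCreateNoteIntentHandling {
    func handle(intent: INCreateNoteIntent,
                completion: @escaping (INCreateNoteIntentResponse) -> Void) {
        guard let title = intent.title else {
            completion(INCreateNoteIntentResponse(code: .failure, userActivity: nil))
            return
        }
        // saveNote(...) would go here -- hypothetical stand-in for the
        // app's own storage layer.
        let response = INCreateNoteIntentResponse(code: .success, userActivity: nil)
        response.createdNote = INNote(title: title,
                                      contents: intent.content.map { [$0] } ?? [],
                                      groupName: nil,
                                      createdDateComponents: nil,
                                      modifiedDateComponents: nil,
                                      identifier: nil)
        completion(response)
    }
}
```

Because the intent arrives pre-parsed into title and content, the app never sees raw speech, which is exactly the reliability trade-off discussed here.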
I do kind of wish that they also had a fallback, like a general purpose thing, just to be a
00:21:15
◼
►
bit more, you know, sort of permissive in terms of what's possible with SiriKit.
00:21:22
◼
►
Like right now, it's kind of annoying that they have, you know, it's like if you're one
00:21:26
◼
►
of the apps that has the ability to interact with their structured format, like, great,
00:21:31
◼
►
you're golden.
00:21:32
◼
►
And you know, if you were a list app or a notes app, now you can do that.
00:21:36
◼
►
But if you're not, you're still stuck.
00:21:38
◼
►
And like, I get what they're doing with that, because their approach means that it's likely
00:21:43
◼
►
more reliable in general in the sense that it's fairly unambiguous as to what you're
00:21:48
◼
►
trying to accomplish versus having a completely freeform general purpose system.
00:21:54
◼
►
But I don't know, I kind of wish that you could interact with a few more apps.
00:21:58
◼
►
And I imagine, you know, it's like your app Overcast would be a great candidate for this
00:22:03
◼
►
kind of thing to be able to be like, you know, it's like, "Hey, can you play this episode?"
00:22:09
◼
►
Or "Pick up where I left off," or those kinds of interactions, which right now you just can't.
00:22:15
◼
►
And that's kind of annoying.
00:22:16
◼
►
Yeah, like I think that the way they are apparently doing this with this like gradual expansion
00:22:22
◼
►
to very limited domains but making it really easy to implement once they get to your domain,
00:22:27
◼
►
like that's nice.
00:22:28
◼
►
And if they accommodate what your app does or what you want to do, then by the time they
00:22:34
◼
►
get to you, you are better off doing it that way than if they had this kind of like general
00:22:39
◼
►
purpose like, "Send Overcast this phrase," and you just have Overcast try to figure out
00:22:44
◼
►
what you meant by that.
00:22:45
◼
►
You know, like that, it is nice the way they're doing it if it works for your app.
00:22:51
◼
►
But if it doesn't work for your app, you're just shut out completely of this entire system,
00:22:55
◼
►
and that's unfortunate.
00:22:57
◼
►
>> But speaking of things that you did get, you got AirPlay 2, right?
00:23:01
◼
►
That's a technology that's slightly helpful for you.
00:23:06
◼
►
Have you watched the session on AirPlay 2?
00:23:07
◼
►
Actually, I don't think I have gotten there yet.
00:23:10
◼
►
Watch the session on AirPlay 2.
00:23:12
◼
►
The short version basically is if you are using AVPlayer, you're pretty much okay.
00:23:18
◼
►
If you're using any other method to play audio, you have a lot of work to do now.
00:23:24
◼
►
And I use other methods to play audio because AVPlayer is incompatible with Smart Speed.
00:23:30
◼
►
There's no way to really implement Smart Speed well with AVPlayer, so I don't use it.
00:23:34
◼
►
And if you are writing your own audio graph or using any other method, you now have a
00:23:39
◼
►
lot of work to do if you want to support AirPlay 2.
00:23:41
◼
►
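For apps that do use AVPlayer, the opt-in itself is mostly session configuration. A minimal sketch of the session-level piece as of the iOS 11 API (the route sharing policy spelling here is of its era); the real work Marco describes lives in the audio engine, not in this part:

```swift
import AVFoundation

// Minimal sketch of the AirPlay 2 opt-in: the "long-form" route sharing
// policy tells the system this app plays long audio (podcasts, music)
// and should use the AirPlay 2 buffered route.
func configureForAirPlay2() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayback,
                            mode: AVAudioSessionModeDefault,
                            routeSharingPolicy: .longForm,
                            options: [])
    try session.setActive(true)
}
```

A custom audio graph has to go much further than this, feeding its own buffers through the AirPlay 2 path, which is where the "rewrite the middle half of my audio engine" summer comes from.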
So that's going to take up an as yet unknown part of my summer, but certainly not a short one.
00:23:48
◼
►
So it's nice.
00:23:49
◼
►
It's going to be really cool when I get it working, but that's going to be a lot of work.
00:23:53
◼
►
I have to basically lop off and rewrite the middle half of my audio engine.
00:23:59
◼
►
That sounds fine.
00:24:01
◼
►
I mean, what could possibly go wrong about cutting out the core part of what makes your
00:24:07
◼
►
app your app and just replacing it this summer?
00:24:12
◼
►
And the good thing is, another new technology, which is drag and drop, will actually allow
00:24:17
◼
►
me to also rip out the middle half and rewrite part of my UI.
00:24:22
◼
►
But it's a part of my UI that I would love to get rid of.
00:24:23
◼
►
It's the crazy hacks that I had to write to get full-time reordering in a UITableView
00:24:29
◼
►
with my little drag handles and my playlist view in Overcast 3.
00:24:34
◼
►
I love having full-time reordering, and before that was pretty much impossible to do with
00:24:39
◼
►
a UITableView without tons of hacks.
00:24:42
◼
►
The drag and drop API will not only allow me to get rid of a lot of those hacks, but
00:24:47
◼
►
imagine the drag and drop with the spring-loading folder navigation stuff with Overcast playlists.
00:24:54
◼
►
Imagine being able to pick up three episodes in a playlist, backing out to your playlist
00:24:59
◼
►
screen and then dropping them into another playlist and having that open itself up and
00:25:03
◼
►
you can put them in a certain spot.
00:25:05
◼
►
This could open up really cool APIs.
00:25:06
◼
►
This will open up really cool APIs.
00:25:08
◼
►
But it's going to be a lot of work to do all that.
00:25:10
◼
►
I do plan to do that to the best of my ability, but again, this is going to be a lot of work
00:25:16
◼
►
this summer.
00:25:17
◼
►
But I think if I can pull it off, it'll be worth it.
00:25:38
◼
►
Drag and drop could have been something more specific to the iPad, just moving data from one app to another.
00:25:44
◼
►
Like, okay, they could have done that.
00:25:47
◼
►
Instead, they've built this general purpose API that, like you said, on the iPhone is
00:25:53
◼
►
still there, works just fine, is built into collection views and table views, and is now
00:26:00
◼
►
instead this general purpose concept of publishing something as being draggable
00:26:10
◼
►
and a fairly comprehensive way to define what that means in terms of if it's a file or if
00:26:18
◼
►
it's a more straightforward type or an object type.
00:26:22
◼
►
You can just define what that is.
00:26:23
◼
►
You have a method that I imagine over time users will just get used to for how to access it.
00:26:31
◼
►
And then you have a method for putting that data somewhere else.
00:26:35
◼
►
And like I said, you can work inside your own app.
00:26:37
◼
►
You can do this.
00:26:38
◼
►
You can do it between apps.
00:26:39
◼
►
Like, I really like the way they structure this so that it's fairly general purpose and
00:26:44
◼
►
in general not crazy hard to implement either.
00:26:48
◼
►
Like, for the most part, if you're doing the basic kind of things that you would expect
00:26:52
◼
►
drag and drop to work with, it's mostly that you're implementing three or four delegate
00:26:57
◼
►
methods and you're off to the races.
00:27:01
◼
►
Unless you're trying to share some kind of really complicated data type that
00:27:07
◼
►
needs special handling, then if you're just trying to have a string value be available
00:27:13
◼
►
as drag and drop or an image or something like that, you can just do it.
00:27:16
◼
►
And it's super straightforward to do.
00:27:18
◼
►
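As a sketch of how little is involved for the simple case: making rows of a table draggable is essentially one delegate method. This assumes plain string content, since NSString already knows how to write itself to an item provider:

```swift
import UIKit

// Minimal sketch of the iOS 11 drag side for a table view.
class EpisodeListController: UITableViewController, UITableViewDragDelegate {
    var episodeTitles = ["Episode 84", "Episode 85"]

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.dragDelegate = self
        // Drag is on by default on iPad; iPhone apps opt in explicitly.
        tableView.dragInteractionEnabled = true
    }

    func tableView(_ tableView: UITableView, itemsForBeginning session: UIDragSession,
                   at indexPath: IndexPath) -> [UIDragItem] {
        // NSString conforms to NSItemProviderWriting, so a plain string
        // "just works" as a drag payload.
        let provider = NSItemProvider(object: episodeTitles[indexPath.row] as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}
```

The drop side is a mirror-image delegate, and more elaborate payloads (files, custom objects) plug in through the same NSItemProvider machinery.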
And I just love that they didn't just do the basic.
00:27:24
◼
►
They did the full comprehensive solution.
00:27:28
◼
►
And so now it can be used in so many ways that are beyond just it's like a basic thing
00:27:32
◼
►
of oh, we can just add drag and drop to the iPad where you can drag things from one side
00:27:36
◼
►
to the other.
00:27:37
◼
►
Like, the fact that you could use it for table view reordering.
00:27:40
◼
►
And it sounded actually like they are internally now using it for table view reordering.
00:27:45
◼
►
>> I believe so.
00:27:48
◼
►
Like, if you're using the built-in version.
00:27:49
◼
►
Like, that is a great example of how fully baked this is.
00:27:54
◼
►
And while I wish in some ways it's like it would have been great if we had had it before.
00:27:58
◼
►
Like, now that it's here, it's like this is awesome.
00:28:00
◼
►
And I love that it's just everything you would kind of expect it to be.
00:28:04
◼
►
And so that's kind of cool.
00:28:05
◼
►
>> Yeah.
00:28:06
◼
►
I'm a huge fan of the way they did this so far.
00:28:09
◼
►
And I have not actually coded with it yet.
00:28:11
◼
►
I've just been looking at all the APIs and watching the videos and everything.
00:28:14
◼
►
So you have more experience.
00:28:15
◼
►
But I too am very excited about the entire drag and drop API.
00:28:19
◼
►
Because what this does is, most of the advanced stuff like navigating to different screens
00:28:25
◼
►
while you are dragging things or navigating into different apps and with the other hand
00:28:29
◼
►
as you're doing things.
00:28:30
◼
►
Many of these things are always really going to be power user features.
00:28:33
◼
►
The same way, like on macOS, you've had spring-loaded folders where you can drag a
00:28:37
◼
►
bunch of files over a folder, wait a second, the folder will open up and you can keep dragging
00:28:42
◼
►
into different things there before you drop them.
00:28:45
◼
►
Mostly power users do that.
00:28:46
◼
►
I don't think most people know you can do that.
00:28:48
◼
►
And that's going to be the case with most of the stuff on iOS too.
00:28:51
◼
►
But that's really nice for power users.
00:28:53
◼
►
And it enables entirely new types of work to be efficiently done on iOS.
00:29:01
◼
►
These are all things that before you could usually do them but it was much more cumbersome.
00:29:06
◼
►
And so most people just wouldn't.
00:29:08
◼
►
Now you can.
00:29:09
◼
►
And that is better for the platform.
00:29:12
◼
►
That basically makes larger markets for things like professional apps and productivity apps
00:29:17
◼
►
where before it was more cumbersome because most people just wouldn't do it.
00:29:21
◼
►
And it's nice too.
00:29:22
◼
►
They have a standard interaction model now that users will grow accustomed to and then
00:29:28
◼
►
it's easier to adopt this new thing rather than having to teach users how to use some custom interaction.
00:29:34
◼
►
It's like in theory all the apps will do it the same way, and so adoption, in terms of users
00:29:38
◼
►
actually using it, will also go up.
00:29:40
◼
►
Well we're out of time this week.
00:29:42
◼
►
Thanks everybody for listening and we'll talk to you next week.
00:29:45
◼
►