PodSearch

Developing Perspective

#188: Thoughtful Accessibility

 

00:00:00   Hello and welcome to Developing Perspective. Developing Perspective is a podcast discussing

00:00:04   news of note in iOS development, Apple and the like. I'm your host, David Smith. I'm

00:00:08   an independent iOS developer based in Herndon, Virginia. This is show number 188, and today

00:00:14   is Friday, June 20th. Developing Perspective is never longer than 15 minutes, so let's

00:00:18   get started. So first thing I was just going to mention is an update to a site that I found

00:00:25   quite useful and I figured I'd mention it here. As we're all kind of going through all

00:00:30   the content that came out of WWDC, try to get through all the videos, listen to them,

00:00:34   watch them, whatever we can to get through. I think it's 106 videos or something like

00:00:39   that. It's quite a lot of content to get through. But something that Mattt Thompson has put together

00:00:46   that is a useful resource that I've used many times, and so I'll mention it

00:00:50   here, is a site called ASCIIwwdc. There's a link in the show notes to it. Basically,

00:00:56   he extracts the transcript tracks from the videos, I think, and then,

00:01:04   you know, provides them in a searchable interface. So it's just really useful, I find. If you're

00:01:08   curious what was said at WWDC about a specific technology, you can go in and search and say,

00:01:14   "Hey, show me all the WWDC videos that involved," whatever, "UISearchController," or something

00:01:19   like that. And so I just mentioned it here, I find it really helpful as I'm trying to

00:01:23   navigate what it is exactly I'm going to be doing over the next two months in my work

00:01:28   to be able to kind of make sure that I'm touching on all the places in the videos where that

00:01:33   particular topic was covered. All right, so the main topic I'm going to be diving into

00:01:38   today is talking a little bit about accessibility. This is coming out of an update that I did

00:01:44   for Pedometer++ before WWDC, I think, and it was a topic that I mentioned a little bit

00:01:50   at the time, but I didn't really have time to go into it in great detail. And so given

00:01:54   that this is a fairly slow week, I figured I'd dive into it now.

00:02:00   So accessibility, it's probably fair to start off talking about what I mean by that. And

00:02:06   specifically I'm going to be focused on the accessibility elements of things like

00:02:11   VoiceOver in iOS, focused mostly on visual impairments. Obviously there are a variety of

00:02:16   different accessibility aspects to the software you build; people with different

00:02:22   impairments require software built in specific ways. But the one that

00:02:26   I'm going to focus on mostly is VoiceOver, and in general building

00:02:34   applications for the visually impaired. And accessibility is a tricky topic sometimes

00:02:40   in the sense that I think it's an important part of software development, not necessarily

00:02:45   in so much as it is strictly a way to further your business or the financials of your application.

00:02:55   I think sometimes that can be the case. The visually impaired community is very vocal

00:03:01   and is very good about rewarding good apps, in my experience. But moreover, there's just

00:03:08   a certain amount of it being the right thing to do. And sometimes you do things just because

00:03:12   they're the right thing to do, not necessarily because you'll see a strict return on it.

00:03:18   But making an application that appeals to a broader audience, you know, certainly has

00:03:23   some return to it. So I would encourage everybody to do it. In my experience, it's relatively

00:03:28   easy. The amount of work it takes is certainly not overwhelming. And the impact that that

00:03:33   can have in someone's life is quite dramatic. On a personal level, there are few things

00:03:38   in my experience quite like getting a help desk request

00:03:43   from somebody who talks about how my application has allowed them to do something that they

00:03:47   couldn't otherwise do, and as a result, you know, has enriched their life and made it

00:03:53   better. That's quite dramatic. That is something that will sit with you in a way that a lot

00:03:59   of the other work that I do doesn't. You know, when you're really having an impact on people's

00:04:03   lives, that feels awesome. And maybe that's just as selfish as wanting it to pay off financially,

00:04:11   but either way, it's something that I've found to be worthwhile doing. So I'm going to kind

00:04:16   of dive into that a little bit. And what I'm going to focus on more is not so much the

00:04:20   basics of accessibility. I'll mention them briefly. I'm going to focus more on what I

00:04:23   would call thoughtful accessibility. And this is coming out of my own experiences recently.

00:04:29   If you're curious about general overviews of accessibility, there are tons of WWDC videos

00:04:34   about it.

00:04:35   Apple really emphasizes it and spends a lot of time on it.

00:04:38   And so there's a lot of places to go.

00:04:40   Just look for the WWDC videos about accessibility for the actual technical details of it.

00:04:45   But at a high level, accessibility is about providing an alternative way to interact with

00:04:50   your application.

00:04:51   You want to make it so that if someone has trouble seeing the screen, they can still

00:04:56   get useful value out of it.

00:04:59   And there's obviously various degrees of that on the spectrum,

00:05:03   all the way from someone who is completely blind

00:05:05   and is interacting with your application entirely

00:05:08   by touch and sound, versus somebody

00:05:10   for whom it is just difficult to read small text,

00:05:14   or is colorblind, or those types of things.

00:05:17   And so there's different things that you'll do along the way.

00:05:20   But there's a lot of technical things

00:05:22   that you'll do to address that.

00:05:23   So there's a system inside of iOS called VoiceOver.

00:05:27   And this is probably the core part of what

00:05:29   you're going to be looking at.

00:05:30   And VoiceOver is a system extension

00:05:34   that will read elements on the screen to the user

00:05:37   based on certain gestures and things that they do.

00:05:40   If you've never experienced this,

00:05:42   I would highly recommend that you do.

00:05:44   You can't really understand how people

00:05:46   are interacting with your application

00:05:48   unless you've turned this on.

00:05:49   So you go to General, Accessibility,

00:05:51   and you can turn on VoiceOver.

00:05:53   Or really what I do, and I think a lot of developers that I know do this, is you can

00:05:57   set that as a shortcut if you triple-click your home button.

00:06:02   And so then you just triple-click your home button to turn it on and off, which

00:06:04   is excellent for development, excellent for making sure that it works without having to

00:06:08   fuss in the settings app.

00:06:11   And so when you're in VoiceOver mode, the interaction model is very different.

00:06:16   It has sort of a state to it where there's a selected element that is active, and when

00:06:21   it becomes selected, its name is read aloud, and if you double tap, it'll perform an action

00:06:27   on that. If you swipe left or right, it'll select the next control, and so on. And like

00:06:32   I said, it's a bit hard to describe, but hopefully if you sit down with it and try it, it'll

00:06:37   make a lot of sense. And so that's how a lot of people interact with your application.
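
From the developer side, making an element selectable and readable like that mostly comes down to the standard UIAccessibility properties. Here's a minimal Swift sketch with made-up views and strings, rather than code from any particular app:

    import UIKit

    // A label exposed to VoiceOver: when the user swipes to it,
    // VoiceOver speaks its accessibilityLabel.
    let todayStepsLabel = UILabel()
    todayStepsLabel.text = "4,277"
    todayStepsLabel.isAccessibilityElement = true
    todayStepsLabel.accessibilityLabel = "4,277 steps today"
    todayStepsLabel.accessibilityTraits = .staticText

    // A button: VoiceOver announces it as a button, and a double tap activates it.
    let shareButton = UIButton(type: .system)
    shareButton.setTitle("Share", for: .normal)
    shareButton.accessibilityLabel = "Share today's steps"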

00:06:43   And so you want to be thoughtful about how that interaction is going to go. And some

00:06:47   things just aren't going to work. Like, this is something that I've struggled with a little

00:06:50   bit with an app like my weather app, Check the Weather, which has a lot of gestures in

00:06:55   it. And so gestures are often very difficult to translate completely into something like

00:07:01   VoiceOver because the system is taking over swipe gestures. The system is taking over

00:07:07   a lot of things. And so you just have to instead be thoughtful. You're not necessarily

00:07:11   trying to take the same interface and just narrate it to your user. What you're trying

00:07:17   to do is make sure that the information you're trying to present to that user is accessible

00:07:22   in a reasonable and easy-to-navigate way. And there are really, I think, a few

00:07:28   parts to this. And these are the things that I've more recently been able to

00:07:33   learn, especially honestly thanks to people in the visually impaired community who have

00:07:38   reached out to me saying, "I like your app. It has basic VoiceOver support, but it would

00:07:44   be much better if it did so and so." And they'll expand on this and talk to me about it. And

00:07:49   if you're a developer for any amount of time, you're almost certainly going to hear from

00:07:52   these people. And when you do, consider them tremendous resources, because

00:07:57   they're giving you insight into something that you can't really project yourself into. You

00:08:03   know, as much as I try and think about what it would be like if I couldn't see my screen

00:08:06   or I'll use my application with my eyes closed, I don't know what it's like to actually be

00:08:11   blind or to be visually impaired and to know what that experience is like. So if you have

00:08:17   a user of your application who is available and interested in providing that input to

00:08:21   you, make sure you take advantage of it. I've learned so much every time I've done this.

00:08:26   And so recently, the two things that I learned from one of my users of Pedometer++

00:08:31   who reached out to me: he was talking to me a lot about the way in which I was ordering

00:08:38   both the controls on the screen and the words in my descriptions.

00:08:45   So let's unpack those two.

00:08:47   So Pedometer++, if you're not familiar with it,

00:08:49   I mean, basically it's a pedometer using the M7 chip

00:08:52   in an iPhone 5S.

00:08:53   And it shows your daily step counts.

00:08:55   So it says how active you were.

00:08:57   And it has a horizontally scrolling bar graph

00:09:01   on the bottom and some stats for your current day at the top.

00:09:05   Fairly straightforward.
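
For context, reading that step data off the motion coprocessor is roughly this shape. A minimal Core Motion sketch in Swift using the CMPedometer API, and not necessarily how the app itself does it:

    import CoreMotion

    // Illustrative: query today's steps and distance from the motion coprocessor.
    let pedometer = CMPedometer()
    let startOfDay = Calendar.current.startOfDay(for: Date())

    if CMPedometer.isStepCountingAvailable() {
        pedometer.queryPedometerData(from: startOfDay, to: Date()) { data, error in
            guard let data = data else { return }
            print("Steps today: \(data.numberOfSteps)")
            if let meters = data.distance {
                print("Distance: \(meters.doubleValue / 1609.34) miles")
            }
        }
    }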

00:09:06   And when I'd originally shipped the app, I

00:09:08   had just kind of done the basic version of VoiceOver, where all the controls said their

00:09:12   text out loud. So there's, you know, there's a thing that says how many miles you've walked,

00:09:17   there's a thing that says how many steps you've taken, and if you tapped on them, it would

00:09:21   just say the name, which sort of works, but it didn't really address what the actual

00:09:29   user would want to do. And especially, there were some issues I had where, if you just

00:09:35   do that basic level of it, the order in which those controls are going to be navigated

00:09:40   in VoiceOver mode is somewhat arbitrary. I'm sure there's a basic rule

00:09:45   to it, but it's not something that is strictly predictable out of the box. And I was talking

00:09:51   to this user and he said, "What a lot of VoiceOver users do is they navigate an application

00:09:56   using the swipe gesture." So they'll swipe right to left and they'll move from control

00:09:59   to control to control. And so the order in which those controls are presented

00:10:06   to that user is important, because in order for them to find their place, they may

00:10:11   have to sit there swiping five or six times, and every time they do that they have to listen

00:10:16   to what is being said. And so in my initial implementation, you open the app, you turn

00:10:22   on VoiceOver, and I think it would start off by giving you access to the settings

00:10:27   button and to the share button. And then, if you swiped, it would move over

00:10:32   and start with the step count from a week ago in the bottom chart essentially, which

00:10:40   is actually probably not really what the user is interested in, right? The user is probably

00:10:44   not most interested in knowing how many steps they took a week ago, but it just so happened

00:10:49   that because it was the leftmost control on the screen, it was being shown first.

00:10:55   And so what I did, and this is relatively straightforward since you can set an

00:10:58   accessibility order for any control, is reorder that. And so now, on the first swipe,

00:11:04   it tells you your current day's data. And then it shows

00:11:08   you the other data after that. Because more often than not, what you're curious about

00:11:12   is what did I do today? And so it's just a thought exercise you need to go

00:11:17   through: what order would be most useful for displaying this information to my user?
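
One way to do that reordering in UIKit is to hand the container view an explicit accessibilityElements array; VoiceOver then walks that array in order as the user swipes. This is a sketch with hypothetical outlet names, not the app's actual code:

    import UIKit

    final class SummaryViewController: UIViewController {
        // Hypothetical outlets standing in for the real controls.
        @IBOutlet private var todayStepsLabel: UILabel!
        @IBOutlet private var todayDistanceLabel: UILabel!
        @IBOutlet private var historyChart: UIView!
        @IBOutlet private var settingsButton: UIButton!
        @IBOutlet private var shareButton: UIButton!

        override func viewDidLoad() {
            super.viewDidLoad()
            // Today's data comes first; the chart and the chrome
            // (settings, share) come after it.
            view.accessibilityElements = [
                todayStepsLabel!,
                todayDistanceLabel!,
                historyChart!,
                settingsButton!,
                shareButton!
            ]
        }
    }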

00:11:22   Keep in mind, especially, that whenever

00:11:25   they open the application, they're starting from scratch.

00:11:27   They don't necessarily keep their place

00:11:30   in your application.

00:11:31   And so make it so they can quickly

00:11:33   get to the most important information first.

00:11:36   It's like in a podcast client,

00:11:38   for example, the primary control may be something

00:11:41   like the Play/Pause button,

00:11:45   and then the Skip Forward, then the Skip Back, maybe,

00:11:48   something like that.

00:11:49   You can keep that in mind, because that's

00:11:51   how they're going to be navigating. Secondly, be thoughtful about the order of words

00:11:57   and what you're actually going to be telling them. If they're interacting with your application

00:12:01   entirely through sound, then what you actually say and how you say it is kind of important

00:12:06   because if it's a long-winded, overly verbose type of thing, it's going to be very frustrating

00:12:12   for them to hear. What they really want is to get the data out of your application.

00:12:17   And so you want to be thoughtful and terse about how you do that. And the order is important.

00:12:22   So for example, I'm going to be telling somebody that on a particular date, they took so many

00:12:29   steps and walked so many miles. Those three bits of information. Now there's a variety

00:12:34   of ways I could tell someone that. I could say, "On Friday, June 20th, 2014, you took

00:12:41   4,297 steps. This constituted walking

00:12:49   2.7 miles." Anyway, that's the point, right? That's a fairly long-winded way to

00:12:54   say it. Conversely, you could say, "4,297 steps today, 2.7 miles." Right? That's the same information

00:13:05   but presented in a very concise way. And starting with the most important thing, in this case

00:13:11   I would say step counts, and then moving on to the less important information as you go.

00:13:17   And this is something that I found in my own experience makes a big difference. Having

00:13:21   these big, long-winded descriptions where you're describing the interface to the user

00:13:25   isn't really helpful. Something like "Label showing current step count value, 4,627" isn't actually

00:13:34   useful. The user doesn't care that it's a label. They just want to get that information.

00:13:39   And so being thoughtful about what your labels are is important, and being appropriately

00:13:42   terse. Now obviously you can take that to an extreme, but in my experience generally,

00:13:47   people who are interacting with your application are very good at parsing short bits of information

00:13:53   if you're thoughtful about them. And so take that approach.
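
As a sketch of what that looks like in code, here's some illustrative Swift with made-up names, not Pedometer++'s actual implementation; the label leads with the value the user cares about most and stays terse:

    import UIKit

    // Compose a terse summary: steps first, then distance.
    func summaryLabel(steps: Int, miles: Double) -> String {
        let stepText = NumberFormatter.localizedString(from: NSNumber(value: steps), number: .decimal)
        return "\(stepText) steps, \(String(format: "%.1f", miles)) miles"
    }

    let todaySummaryView = UIView()
    todaySummaryView.isAccessibilityElement = true
    // VoiceOver reads "4,297 steps, 2.7 miles": the data itself,
    // not a description of the label that happens to display it.
    todaySummaryView.accessibilityLabel = summaryLabel(steps: 4297, miles: 2.7)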

00:13:57   One thing I'll also say is that if you put a comma between words, it does a nice

00:14:01   little brief pause as it reads it, which makes it even clearer to hear. So often do something

00:14:06   like, you know, "4,622 steps, 4.2 miles," right? You don't want to smush those two numbers

00:14:14   together, and if you just throw a comma in there, it'll actually say it pretty nicely.

00:14:18   And remember, you're trying to create an alternative experience for that user. You're trying to

00:14:25   do something that is a parallel experience. You're not narrating your interface. That's

00:14:31   the biggest thing I'd like for you to take away from this, hopefully, when you're building

00:14:34   accessibility: you're not narrating your interface; you're trying to create an alternative experience.

00:14:40   And I think if you do, you'll create something that is valuable to a class of customers who

00:14:46   are some of the most fanatical, devoted people I've interacted with. And so I'd encourage

00:14:51   you to do that. All right, that's it for today's show. If you have questions, comments, concerns,

00:14:55   or complaints, I'm on Twitter, underscore David Smith, David at developing perspective

00:14:58   dot com. Thank you. Bye.