Above is a Vimeo video recording of my talk, Mobile Accessibility: Past, Present, Future! at Mobile Era 2018 in Oslo, Norway.
The video is captioned, but I also posted a full text transcript below with visual description for those who cannot see captions or prefer to skim text. The transcript is a complement to, NOT a replacement for, video captions.
Want to hire me to speak at your event or to consult for your organization? Contact me. Looking forward to working with you!
Full text transcript with visual description
(Visual description: Starts with Mobile Era logo then showing logos of sponsors. Then showing a recording of two women – a host wearing a teal shirt and jeans and a speaker wearing a green top and a green jean skirt.)
Host: Hi there everybody, welcome back, and I hope that you enjoyed your lunch. Our speaker now, she’s a very special guest. She is going to have a talk on accessibility. And the special thing about this talk is that this entire talk is going to happen in sign language, and we have an interpreter right here who’s going to give voice to those signs. So, a very warm welcome to Svetlana Kouznetsova.
(Visual description: A video shows text in bottom saying: “Svetlana Kouznetsova: Mobile Accessibility: Past, Present, Future!”)
Svetlana: Hello everyone, thank you. Thank you so much for coming. My name is Sveta. I’m a user experience and accessibility consultant helping businesses make their products and services user friendly and accessible to a larger audience.
(Visual description: A video changes to a teal background with the Mobile Era logo on top right and 2 insets – a bigger one on the left showing slides and a smaller one on the right showing Svetlana presenting in sign language, wearing a jean skirt and a green top, standing in front of slides.)
Svetlana: I will share my experience as a deaf person before and after the mobile era and explain how people with disabilities use mobile devices and how to make mobile websites and apps accessible. Mobile means a wide range of devices like smartphones, tablets, and smartwatches.
(Visual description: A slide changes to a clip from Mashable showing an article about smartwatches.)
And speaking of smartwatches, here’s an article by Mashable about a smartwatch that was featured in Inspector Gadget, an American TV show for kids that I used to watch when I was younger. And I’ll show you a short clip from that. The girl is the Inspector’s niece, named Penny. Her dog was named Brain, and the three of them worked together to fight crime. When I was watching that series, I was so fascinated as a kid with the smartwatch, and also with the smart book that Penny used to communicate and access information. And I watched that series long before the internet. So it’s amazing how those things have become a reality and we now have smartwatches, smartphones, iPads, tablets, and so on. I don’t have a smartwatch myself. As you can see, I’m wearing an old-fashioned analog watch, right here. But I do use other Apple devices like an iPhone, an iPad, and a MacBook, and I can’t imagine my life without them. The internet and especially mobile devices have really benefited me a lot as a deaf person. As we all know, the mobile phone evolved from the old-fashioned landline rotary phone.
(Visual description: A slide with photo of a rotary phone and a stack of square notes and a pen.)
Mobile phones are definitely much easier for all of us than the rotary phone. As a deaf person, though, that phone was totally inaccessible to me when I was growing up. I could speak on the phone to somebody, and they could understand my speech, but I couldn’t hear what they would say back to me. So it really wasn’t accessible for me. I wanna mention some history about the phone. Many people believe that the phone was invented by Alexander Graham Bell, who was also an educator for the deaf. And there are some things I’d actually like to share about him.
First, Bell actually didn’t invent the telephone. He stole the invention from another person, named Antonio Meucci, and was simply the first to patent it. The U.S. Congress finally recognized Meucci as the true inventor in… 2002.
Secondly, Bell is not well-liked in the signing deaf community. As an educator for the deaf, he pushed oral education on all deaf people, and he was against sign language and created a stigma against it. I personally have nothing against speaking and lip reading, and I’m glad that I’m able to do it, but I also find sign language very important to me as a deaf person because lip reading gives only partial access to spoken information. He was also involved in eugenics, and he wanted to reduce the number of deaf people. So he wanted to ban marriage between deaf people for fear of producing more deaf babies. But his fear was actually unfounded, because 95% of deaf kids are born to hearing parents, and I’m one of them. I wasn’t even born deaf. I lost my hearing from an illness when I was two years old.
Bell also had a deaf mother and a deaf wife but ironically, his phone invention wasn’t accessible to deaf people for almost a century. So, when growing up, I couldn’t use the phone. And many other deaf and hard of hearing people also couldn’t use the phone. If I wanted to communicate with people over long distance the only way for me to do that was through paper letters.
(Visual description: A slide with a photo of paper letters and a pen.)
And do you remember how long it took to wait for those letters to come? And if I needed to call somebody, I often had to rely on a hearing person to make the call for me.
Things started to change when I was 18 years old, and I was about to start studies at university. The summer before that, I was on a vacation with my family in Florida in the south of the United States and was staying at a hotel. And there was a group of college kids playing ping pong, and I joined in. One of them wanted to keep in touch with me. And I was thinking, how is this possible because he was hearing and I was deaf and I couldn’t talk on the phone. And it takes forever to wait for the paper letters. So, he handed me a paper with some weird character on it.
(Visual description: A slide with an image of a paper with the @ sign.)
He suggested that I keep in touch via email. Now remember, I grew up without the internet. So the email, I didn’t know what that meant. It sounded so exotic. I asked him what it was, and he said that’s how you communicate through the internet. Well, I didn’t know what the internet was, and I asked him to explain. He explained that’s how computers communicate with each other. So, imagine trying to explain how the internet works to someone who has never used it. He told me that since I would soon be starting university, I would find out how it works.
And when I came to university, I learned more about the internet. And I was so blown away by how much it offered, especially for people like myself. I was amazed at how fast it was to send electronic mail compared to paper mail. When I was 18, I also got my first phone, called a TTY, which is a text phone for the deaf.
(Visual description: A slide with a picture of a TTY machine.)
That phone was invented by a team of deaf engineers in the 60s. Back then, it was huge, like the size of this lectern. It was so heavy it took two people to carry it. The picture I have here is from the 90s, so it had gotten a lot smaller by then. Now the TTY is really outdated, same as the rotary phone. Most people no longer use it, but back then it was a really exciting thing for me.
So, I used this with a regular phone: I would place the handset on those cups, and I would use it to communicate with other people who had the same machine. So it was mostly for deaf people, or offices that had the machine. If I wanted to communicate with a hearing person, I had to go through a third-party relay service operator. And that was very cumbersome. Hearing people would often hang up on us, thinking that we were telemarketers.
Around that time, there was also this very cool thing that popped up called Instant Messaging.
(Visual description: A tweet by AIM posted on October 6, 2017 saying: All good things come to an end. On Dec 15, we’ll bid farewell to AIM. Thank you to all our users! #AIMemories bit.ly/eolaim. Below is a picture of an AIM yellow character wearing a party hat with an AIM pop up message saying: AIM is signing off for the last time. Thanks to our buddies for making chat history with us!)
It was called AIM in the USA, AOL Instant Messenger. Here in Europe, there was something similar called ICQ. And it was much better than using an email or a phone. So, exciting times were about to start.
Around that time, mobile phones started to emerge. And do you remember those huge brick phones? They were still not accessible to people like myself, however, but we could start communicating via wireless pagers. And I’m gonna show you a picture of how mobile use progressed for us.
(Visual description: A slide with a picture of 4 mobile devices – Motorola, Sidekick, Blackberry, iPhone.)
On the left is my first ever mobile device. It was a small device that could fit in your palm, and it allowed you only to send texts to other text-based devices.
And some time later, a new device called the Sidekick emerged, and it became very popular in the deaf community in the USA. We could not only send texts, but we could email, browse the internet, use instant messaging, and even take pictures, I mean the pictures were grainy, but it was a novelty nonetheless.
Then the smartphones emerged: the BlackBerry, the iPhone, and so forth. I thought the iPhone was very cool, but I was not ready for a touchscreen without a physical keyboard. I wanted to, you know, be able to type without looking. So I got the BlackBerry. But eventually I started having problems with that, so I decided to just take the dive and start using the iPhone, and I’ve never regretted it. And eventually I got used to the touchscreen.
(Visual description: A slide with an image of a rotary phone on top left and a TTY on bottom left with two arrows directing to an iPhone on the right.)
As you can see in the picture, both the regular phone for hearing people and the text phone for deaf people have now evolved into a universal phone that can be used by anyone. And it’s for the same price, too, because special equipment for deaf people often cost more than equipment for hearing people, and we also had to buy the regular phone on top of that, which really put us at a disadvantage. Before, we had to communicate through hearing people on the phone via third-party relay. Now we can communicate with each other using smartphones for texting, email, instant messaging, and video chats. So it finally puts us deaf people on equal footing with hearing people.
The second important thing I would like to mention is captioning accessibility.
(Visual description: A slide with a photo of a captioning decoder on top of a bulky TV set.)
When I was growing up, I didn’t have access to TV until I was about 15 years old, when my father brought home a captioning decoder. Back then, TVs were very bulky, and a captioning decoder was another special device, like the TTY, that deaf people had to pay extra for. Now we can watch captions on TV, online, on computers, and on mobile devices by enabling the caption settings that are available to everyone there, without us having to pay anything extra.
(Visual description: A slide with a photo of a hand holding a smartphone with a clip from a TEDx talk video.)
Of course, captioning settings won’t work unless the media producers upload proper captions. Auto captions are not of acceptable quality. You can’t just turn them on; you need to clean them up, or create captions from scratch. It’s best to hire professional captioners for professional videos. The photo in the slide says that 80% of caption users are not deaf, and that’s true. This was from a TEDx talk I gave in New York City earlier this year that you can watch online. I can also read live captions on my mobile device.
(Visual description: A slide with a photo of a hand holding a smartphone showing live captions that say: “Video captions and live captions are not the same. Live captions are used for events, webinars, news broadcasts. Video captions are used for movies and video recordings and follow a different set of guidelines.”)
Uh, can we hold for a second? The captions aren’t showing up on here. He said “I just realized this.” I can see it on my iPad, but I can’t see it on this. Yeah, they’re on my iPad, but they’re not on here. I didn’t realize. The captions should have been on there from the start. Yeah, I saw that work. Yeah, okay. Okay. Yeah, I’m not sure because she’s actually, she’s going to show a video. So, if it’s only on presentation mode … Yeah, I can just continue forward.
Yeah, it’s a little bit disappointing because I wanted to show the captioning. I’m sorry, I thought they were showing up this whole time, like when we tested on the left. Yeah, so, I also read live captioning. So, that’s what it looked like when you saw that right here. Yeah, I won’t show it through the whole presentation, but if you caught a glimpse of it right before.
So, live captioning follows a different set of guidelines. It’s used for live events only, and after the event is recorded, the live captions need to be converted into proper video captions.
(Visual description: A slide with a photo of iPhone with 3 apps – iTunes, YouTube, Vimeo.)
When Apple launched iTunes, many movies originally were not captioned. Now most of them are, which is great. I can also watch captioned movies on YouTube and Vimeo, and different TV shows from different TV networks. Not all entertainment apps have captions, but it is improving.
(Visual description: A slide with a photo of a paper and pen.)
The third most important thing I would like to talk about is how I communicate with people in writing. When I was growing up, I always carried a notepad and pen with me in case there were communication problems between me and hearing people. Now…
(Visual description: A slide with a photo of an iPhone with the Notes app. Then opening Notes and showing how it works.)
Now, a mobile device offers a great function: the app called Notes. Anybody can type, or speak using voice-to-text, to communicate. So many people can communicate with me now, or with other deaf people, using text. And I suggest that if you want to communicate with a deaf person, you use your own phone and have it ready, ’cause it’s a lot faster than borrowing the deaf person’s phone to go back and forth. It’s also great that phones glow in the dark when you type, because that’s not something you could do with paper and pen. The only con is if the battery runs out; then you’re out of luck. So I always have pen and paper in my back pocket just in case, but I almost never have to use it.
(Visual description: A slide with a photo of a hand clicking different apps.)
There are so many other very cool apps and phone settings that benefit me and all the other people with disabilities. When Apple released the iPhone in 2007, the blind community panicked, because they were used to phones with a physical keyboard, and they wondered how touchscreen phones would be accessible to them. But Apple did a really good job of taking the lead in accessibility, incorporating a screen reader called VoiceOver in 2009 and later adding more accessibility features. I’m going to show you an example of some accessibility features right now.
This is the Apple accessibility page, and it explains different features that can help people with various disabilities, and also people without disabilities. So you see the picture here: this is FaceTime, which is great for people who use sign language or people who lip read. I use it to communicate with my grandmother. Before, I couldn’t; I couldn’t hear her. Now I can see her, she can hear my voice, and I can lip read her. So now we’re able to communicate directly with each other.
There are other accessibility features, briefly, for people with mobility, visual, and hearing disabilities. You can use VoiceOver, the screen reader for blind people, as I mentioned before. There’s hearing aid compatibility. There are speech features, so you can listen instead of reading. There are switch features for people with mobility disabilities. So, yeah, you should check this out: go to Settings, select General, then select Accessibility, and check out all the different features it has, and learn about what people with disabilities use.
Android also has something similar: a screen reader called TalkBack, and other features as well. People with disabilities make up the largest minority, 1.3 billion people worldwide, which is about the population of China. And they control a market of about $4 trillion. So it’s really important that products and services are accessible to them.
So, it’s important not only to take a mobile-first approach, but to take an accessibility-first approach as well. Many people think that web accessibility is only for blind people and only a matter of coding, so it’s supposedly the responsibility of the developers. Many people come to me after a project is done to perform an accessibility audit of their website, in hopes of an easy and quick fix. But making a product accessible is a huge amount of work that requires a lot of planning from the early stages of a project.
Accessibility is not a one-time thing to check off; it’s an ongoing process. It’s like building a house and not thinking about making it accessible until after it’s built. It’s much easier and cheaper to consider accessibility needs in the early planning and to implement those elements while building, instead of waiting until the end of the project. The same applies to websites and apps.
It is the responsibility of all team members to make websites and apps accessible to everyone. Each of them has certain skills to contribute: managers, designers, developers, content producers, and so on. And as a consultant, I help teams coordinate these accessibility efforts.
I also provide workshops on WCAG, the Web Content Accessibility Guidelines. WCAG is an extensive accessibility checklist developed by the W3C, the World Wide Web Consortium, and it has gone through several revisions to catch up with changes in technology and new accessibility needs.
The first version came out in 1999, when the web was just simple HTML. Then, when the web became more complex, adding CSS, JavaScript, ebooks, PDFs, and so forth, they did a major revision in 2008 based on those changes. And then, when mobile issues came up, they did a third version just this year, in 2018, to cover those and some other accessibility issues.
And the goal of WCAG is to ensure that a website or application is accessible by following four principles, POUR: Perceivable, Operable, Understandable, and Robust. These principles benefit not only people with disabilities, but everyone. For example, low color contrast affects not only people who have difficulties with color perception, but many of us. Captions benefit not only deaf people, but also those in noisy environments. Keyboard compatibility benefits not just those with limited mobility, but also people who don’t have a mouse, have a broken trackpad, or just prefer a keyboard.
It’s also best to test products early and often, to make the process more efficient. Testing is done with automated tools, code analysis, visual inspection, testing with a keyboard, a screen reader, and other assistive technologies. And of course, by involving people with disabilities in testing.
Now, don’t use div or span if there are HTML elements that provide more semantic meaning. For example, if you want to make a submit button, make sure you use the appropriate tag: a button tag, or an input with type submit. That provides the semantics for assistive technology. If you just use a div, there’s no semantic meaning in that. And don’t use the wrong element either: an a tag is a link, not a button. Also, add attributes for accessibility. If you want to make sure an image is accessible to blind people, you have to add the alt attribute, like alt with the text “flower”, so then they know what it is.
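To make the point above concrete, here is a minimal sketch in HTML (the element names and the alt attribute are standard HTML; the specific labels and the image file name are just placeholders):

```html
<!-- Native elements carry built-in semantics for assistive technology: -->
<button type="submit">Send</button>
<!-- or, equivalently: -->
<input type="submit" value="Send">

<!-- A div styled to look like a button exposes no role to a screen reader: -->
<div class="button">Send</div>  <!-- avoid -->

<!-- An <a> element is announced as a link, not a button: -->
<a href="/submit">Send</a>  <!-- wrong element for a button -->

<!-- The alt attribute describes an image for blind users: -->
<img src="flower.jpg" alt="flower">
```

With the native button or input, a screen reader announces “Send, button” and the element responds to keyboard and assistive-technology activation for free; the div version would need extra scripting and ARIA to approximate that.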
There are many websites with dynamic content, where HTML markup is not enough to make them accessible, and that’s where WAI-ARIA comes in. ARIA is a technical specification, also published by W3C, that explains how to increase accessibility of complex websites and applications. ARIA consists of markup that can be added to HTML, in order to clearly communicate the roles, states, and properties of the user interface elements. They provide additional semantics, and improve accessibility wherever it is lacking.
And here’s one example with WAI-ARIA: a tooltip. HTML doesn’t have a special tag for tooltips, so if you want to make a tooltip accessible, you have to include the role attribute, which is part of ARIA, on a div. That’s a case where you can actually use a div, because HTML doesn’t have an element for it.
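A minimal sketch of that tooltip pattern, assuming a simple case (the id, the button label, and the tooltip text are made up for illustration; role and aria-describedby are standard WAI-ARIA attributes):

```html
<!-- HTML has no native tooltip element, so ARIA supplies the semantics. -->
<!-- aria-describedby links the button to the text of its tooltip: -->
<button aria-describedby="save-tip">Save</button>

<!-- role="tooltip" tells assistive technology what this div is: -->
<div id="save-tip" role="tooltip" hidden>
  Saves your draft without publishing it.
</div>
```

In a real implementation, scripting would also remove the hidden attribute to show the tooltip on hover and focus, and hide it again on blur or Escape.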
Also, you should use WAI-ARIA only when you need it. You should always use native HTML features to provide the semantics as much as possible. But you don’t always have that possibility, because you have limited control over the code, or because you’ve created something very complex that has no easy HTML equivalent, and that’s where WAI-ARIA can be a really valuable enhancing tool.
WCAG does not guarantee full accessibility; it helps you solve problems and make things as accessible as possible. Technology keeps changing, and so does accessibility, so it’s important that you maintain accessibility the same way you maintain websites and apps.
Now, I’m going to talk about mobile apps on Apple and Android. They use different code, but WCAG applies to them too. Just like websites, you have to test them very early.
So, there’s Xcode, which Apple developed, and its Accessibility Inspector, which you can use to see if there are any errors. And I will show you how that works right here. You open Xcode, go down to Open Developer Tool, then go over to the Accessibility Inspector. Then you can either use a simulator or connect an actual device: an iPhone, a tablet, a smartwatch, any Apple product. This is a simulator that you’re seeing here. You open the simulator, go to the Accessibility Inspector, pick which target you want, either the simulator or the device, and then use the target to point at different elements to check whether they’re accessible or not. You can use the inspector on different areas of your phone, on different apps. And when you get to the part you want to check, you run an audit. In this situation, it’s saying that an element is too small: the new WCAG criteria from this year say that touch targets should be a minimum of 44 pixels, so this one is too small and needs to be fixed, and the audit also suggests how you can fix it. The inspector has accessibility settings, too. I mean, this is very limited; it’s actually better to test yourself using VoiceOver or other assistive technologies, and not rely only on this, but it’s one tool that you can use. Also, it’s important for an iPhone app to have clear labels. We can see the labels and the words, but blind people can’t, for example, so it’s important to have an appropriate label that says what an element is, and also the right trait: is it a button, is it a link? So those are two of the most important things to remember with the iPhone. There are many more, but that’s an example.
Apple has a website on accessibility for developers, and it’s a really great resource. It’s right here, the web address.
Android has something similar as well. You can test using several different tools: Android Studio’s lint, Espresso, or Robolectric. Then you can test with an actual device, using the Accessibility Scanner to see if there are any accessibility issues and how to solve them. You can also use TalkBack, the screen reader for Android. But most important is that you include people with disabilities in testing, because they use these technologies every day, so they’re the experts at finding the problems and letting you know.
Android also has a good developer resource, which is right here: developer.android.com/guide/topics/ui/accessibility.
Now, we have talked about the past and the present of mobile, so what happens in the future? AR, AI, VR, ML, PWA. Will PWAs take over native apps? I also imagine a future with holograms, where there are no handheld devices or wearables, and maybe all the content would be in the air. I would love to read captions in the air, so you don’t have to use a device.
Also, I wanted to share this tweet. It says: “There is no mobile web. There is only the web, which we view in different ways. There is no desktop web, or tablet web. Thank you.” I wonder what you think; maybe we can discuss this quote on social media. Because we don’t know what’s gonna happen in the future, what comes after mobile. Is it in the air? We have no idea. We could never have imagined this 20 years ago, so we don’t know what will happen in the next 20 years. But regardless of how future technologies change, it’s important to keep accessibility in mind for each new technology.
Accessibility is a huge and constantly changing field, so we have to stay on our toes with new changes in technology and in accessibility. If you want to learn more, I provide consulting on user experience and accessibility. I also wrote a book about captioning access, and I consult on this topic. I recently gave a TEDx talk about how high-quality captions increase audience and ROI for businesses.
And I want to thank the event organizers for having me and for making this conference accessible to me, based on my recommendations. I also want to thank Dataforeningen for sponsoring that. It’s really great that sponsors can cover accessibility for your events and conferences: just like sponsors can cover things like food, they can also sponsor accessibility, so it’s a great thing to keep in mind. I’d like to thank them, and I’d like to thank all of you for coming as well.