XR tech often has a visual focus, but we have to be careful with that line of thinking; otherwise, those who are visually impaired might be left out of the conversation. Yvonne Felix, an inclusion expert at CNIB, knows from experience that the visually impaired – or anyone with any impairment – don’t have to be excluded from XR; we just have to be forward-thinking and build in accessibility considerations to include users from all walks of life.
Julie: Hello, my name is Julie Smithson, and I am your XR for Learning podcast host. Today, I look forward to bringing you insight into changing the way we learn and teach using XR technologies to explore, enhance, and individualize learning for everyone. Today, my guest is Yvonne Felix, who works with the Canadian National Institute for the Blind in the Accessibility and Inclusion Division as an experienced community specialist, with a demonstrated history of working in the medical industry. Thank you so much for joining me today, Yvonne. I’m really, really excited to have you here.
Yvonne: Thank you, Julie. I’m very excited, too.
Julie: Let’s start with your role at CNIB. We’ve had several conversations over the last year just on– a little bit about immersive technologies and how it affects people with disabilities, such as being visually impaired. And maybe we can start off with what your role is and, of course, how you got involved.
Yvonne: Sure. First, I’ll start by just sort of explaining why I wanted to work at CNIB, and how I became interested and involved in assistive technology, and life-changing technology in general. I was born with an eye disease called Stargardt’s, which primarily affects your central vision. In the center of your eye, in your macula, you have your rods and cones, and basically the disease sends a message to say that I don’t need those photoreceptors for my central vision to work. So it’s considered sort of an autoimmune disease. But over the years — as I did lose my central vision fully by the time I was seven — I learned that I would just have the type of life that required me to use technology. It was really embedded in my day-to-day; I thought everybody’s life was like that. I think I came along in the timeframe of the world where technology was becoming very advanced very quickly. I have partial sight, as opposed to total blindness, and I identify with using the sight that I have. There are many people who still do have some sight left, but they don’t use it. I use the sight I do have left, rather than relying on devices that focus on sight substitution as opposed to sight enhancement. So that’s how I got involved with CNIB. My role really is about inclusion and accessibility. Accessibility is a component of inclusion: one building block among the components that make inclusion a universally accessible culture. Rather than focusing on accessibility as the one thing you need to embed, it’s just one small part of the bigger picture of universal design.
Julie: And this is where we start to talk about immersive technologies providing superpowers to those like yourself who are visually impaired: being able to substitute for sight, to assist you in day-to-day situations, and to present this technology in a way that helps you get through your day by being immersed in something. So it’s really interesting how all of this digital technology of the last couple of years compounded just recently with COVID-19. Maybe you can speak to how the organization is adjusting to becoming remote, what that baseline was like for the CNIB to be able to convert, and where you play a role in that.
Yvonne: Sure. Our organization jumped very quickly from being, one week, a completely in-person service with a small focus on virtual programming. The organization provides programming and services to people with sight loss in Canada. And again, sight loss covers that spectrum from partial sight to blindness. Within that first week of COVID, I’m actually very proud to say that we were able to go completely virtual with 127 different programs. And I think part of that success — I say this through celebration, because I think it’s very important to always look at that silver lining — comes from remembering that sight loss is something that is a part of a person, no matter where the inception is in their life, whether it’s from birth or later on. It becomes very fundamental to how you function. What’s happening in the world today is that people are experiencing what it is like to have a disability: social isolation, not having access to visual communication, not having access to in-person bodies to be able to do your daily tasks or to communicate. And that’s something that our participants — so we call them participants, not clients — our community, and our staff deal with on a daily basis. So we have the technology; we have our community, our staff, people with sight loss, people who work with people with sight loss. We already have that superpower: to utilize and engage ourselves in the technology and the platforms that we had. So for my role, it became more about that creativity: figuring out how do I build with something that’s not tangible? What are all the different features? What can we do to make the experience more engaging than just having a regular meeting? How do we encompass inclusion — how do we make sure everybody is able to participate? And there are these really amazing things that are happening.
One of the meetings I had last week, I was talking with someone who works in another part of the country — we have 55 locations across the country — and that individual is deaf-blind. And we had a conversation about the fact that we would never have talked this much prior to COVID happening. I’d like to acknowledge that it is a pandemic; there are things going on that are very frustrating, and very scary, and we’re all staying safe. But what’s really beautiful about technology and the way the organization is functioning is that we are able to stay safe, but we’re also able to do our jobs and still provide service to the community at large.
Julie: That’s an incredible story. Can I ask what– did you use a specific platform to communicate with those individuals?
Yvonne: We actually use several. The accessible platforms we’re using — the ones that have been able to give more than one way of communication or language — are Zoom and Microsoft Teams, which is a very similar platform. We provide our meetings and our sessions to our participants, internally and externally, through Zoom. And the reason we’re able to communicate is because there is a video option, so that an intervener or an interpreter can sign while the individual is watching, and we take pauses. So there’s a whole format that we use for meetings and different types of activities. On top of that, we also have closed captions. As an example, last week we were doing a session — we have “Lunch and Learn,” where basically everyone across the country gets together during lunch and we have really engaging discussions or activities. I was able to source a YouTube video that already had closed captioning. We had an intervener. We also have chat. So we kind of use all the different features that Zoom or Microsoft Teams offer; we use mainly those two platforms to communicate, because of the accessibility features.
Julie: Are you working with businesses right now, to help move their employees? Or is that something that CNIB does, support businesses and organizations, especially now because everybody had to work remotely? Is this something that you’re helping or your team — the CNIB team — is helping businesses adjust?
Yvonne: Yeah, absolutely. Our organization, we focus on advocacy. We never turn anyone away, whether it’s for profit, not for profit, whatever your need is. Some of the outreach has been because we do work with technology quite often and our organization, we’re acknowledged as an organization that supports users from all different levels, whether it’s entry level or a business where they have someone with sight loss. We have a program called Come to Work. And so that program advocates for not necessarily the person with sight loss, but helps teach a business, what are the things you need to know from an inclusive standpoint, to have someone with sight loss be able to apply and become a member of your team? So we’ve had outreach, I would say from the moment this started globally, just being able to have that conversation with businesses on how to be inclusive. And it’s been really incredible, because a lot of the conversation and a lot of this — not to be coy — but this sort of lifting of the blindfold. Or having this a-ha moment of, “Oh, we’re all in this situation now.” And being able to provide that support to people and have that extra layer of compassion, empathy, and understanding has been a really incredible opportunity to be able to advocate on behalf of the community, but also to be able to give businesses the tools that they need to be inclusive.
Julie: It’s so nice to hear that during such a troubled time, so many organizations and businesses have been able to move remotely and make sure that they stay inclusive. The other day we were in a call and started to talk a little bit about something I’d like to explore more. As we move towards further immersive technologies like VR and XR platforms, where we can still come and meet, I think we’re moving towards a more sophisticated emoji system or avatar system, where we can start to identify right off the bat who in a space is visually impaired — or hearing impaired, whatever their challenge is — in order to connect. For the rest of us who are able to hear and see, how do we identify these users in these spaces? What is that gracious introduction, and what does it look like? One that is not offensive, but is almost necessary, so that we can understand the types of communication that we can build on.
Julie: Is that something that you’ve talked about, moving into more of an avatar description in these experiences?
Yvonne: It’s really interesting, because it really is– I like to call it the Wild West. We are really in the middle of a disruption of how we perceive information and how we process information. We’re able to get this deeper understanding of who we are as humans, but also of how we are using technology to represent ourselves. And for the most part, it’s been through sound or written language. When you take a few seconds to read Twitter, that’s words. So we’re mainly using words. But when we start to think about who we are in a virtual space, we’re talking about touch. And right now, even thinking about touch is almost taboo; we’re trying to avoid touch. But for the sight loss community or the deaf-blind community, you can’t take that away. So what does that mean in a virtual space for the sight loss community? Something that I find really interesting right now is playing with sound: understanding proximity, and identifying oneself through sound and tone of voice. It means everything sometimes, being able to decipher: is that individual in this meeting? Do they have something else going on? Is that their voice that says, “I have kids in the background and I’m trying to get everybody quiet”? Or is there something else going on? Just having that safe space of having a visual to represent oneself, or a movement, or a sound; being able to use sound for proximity; being able to use sound to identify when someone comes in and out of a room. Those things exist outside of VR. But how do you incorporate the way information comes into your ears and you feel it through your whole body? What does that mean? So: being able to translate that type of information from day-to-day activities and how one functions through a space, being able to understand projection, being able to understand even anticipation. So a lot of times, someone is going to navigate– I’ll just call it the tangible world.
And you’re using orientation and mobility: there are all these key signifiers that tell you how to use a cane, how to landmark, and how to memorize where you’re going. So how do you take that information and move it into a digital space? This is sort of my personal experience in just trying to test different platforms to see if this is doable for virtual conferencing. I personally find that you actually are able to navigate a little quicker if you’re using the tools that are there in terms of sound proximity, and I’ve seen a couple of platforms like that. When we’re talking about, again, language and identifiers — you brought up emojis, and when I use my phone with screen readers or VoiceOver, let’s say a smiling face will say “smiling face.” It will read that to me, so I don’t necessarily see it. And it’ll give me the options for different colors of that smiling face. But when we’re looking at things like a thumbs up or someone walking, there are no emojis that identify what my life is like, to talk to somebody else who also has sight loss. I have siblings who have sight loss. There isn’t a visual language in these emojis that represents the things I want to talk about or converse about on a daily basis. And this is nothing new. It’s just that technology is advancing and we’re using more visuals. What I described when I talked about the emoji face is called “alternative text,” and alternative text is great, but I don’t see myself in alternative text. I don’t necessarily feel included, because there are sometimes parameters around how you actually identify a visual with alternative text — all these different standards and guidelines. So it would be great if I could just send an emoji of a cane, or — I’m just trying to think about things I have around my house that probably give other people a picture — magnifying glasses. There is no emoji for a screen reader.
Like, how do you even have a picture of a screen reader? It’s software. To me, it’s a language. It’s not a blindness language, but it’s a language that the community would use.
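Yvonne’s description of a screen reader speaking a smiling face as “smiling face” can be roughly illustrated with Unicode character names, which is the kind of metadata screen readers and alternative text draw on. A minimal sketch in Python — the `announce` helper is hypothetical, not any real screen-reader API:

```python
import unicodedata

def announce(text: str) -> str:
    """Approximate how a screen reader might speak a message:
    replace each pictographic symbol with its Unicode name."""
    parts = []
    for ch in text:
        # Category "So" (Symbol, other) covers emoji and pictographs
        if unicodedata.category(ch) == "So":
            parts.append(unicodedata.name(ch, "unknown symbol").lower())
        else:
            parts.append(ch)
    return "".join(parts)

print(announce("Lunch and Learn 😀"))  # the emoji is spoken as words
```

Her point also shows up here: a symbol can only be announced if someone defined it in the first place, so experiences with no cane or screen-reader symbol simply have no words to speak.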
Julie: Yeah. It’s almost like an accessibility language needs to be developed out of this new immersion and space of collaboration, because we need that transparency of who you are when you’re communicating with somebody. And this type of thing would certainly open that up, helping people identify who Yvonne Felix is compared to somebody else. It’s a really interesting time; as you said, it’s the Wild West of figuring out what these solutions are. But I have no doubt there’s probably a startup listening, thinking they’ve already started to work on this, and maybe they can develop something for you.
Yvonne: I’m always on the hunt. I’m always looking at technologies and thinking, “Oh, I bet that company isn’t even thinking about the sight loss community.” It’s always exciting to be able to reach out to an organization or a company and say, “Hey, did you ever think of this? This would be useful if you just tweaked it here or there.” But yeah, no doubt there are lots of very intelligent, ambitious people out there who want to make the world an equitable place.
Julie: Absolutely. I wanted to jump back to something you were talking about with spatial sound. A friend of ours that I introduced you to, Thomas Logan, is hosting accessibility community groups inside of Hubs, to help people with accessibility challenges understand the spatial world. And it comes back to learning about this three-dimensional space that is being presented to us. Right now it’s mostly on 2D screens, and this is where XR comes in: you’re going to be able to put your headset on and start to hear spatially, be in that community room with others, and understand the spatial sound. This is where we have to start teaching and immersing ourselves in this three-dimensional world, and where spatial computing comes in. That’s such an important component right now, for all of us to understand what the next steps are to become engaged and begin to collaborate. Right?
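The proximity cue Yvonne and Julie describe — telling how close someone is in a virtual room by how loud they are — usually comes down to distance-based attenuation. A rough sketch, assuming a simple linear falloff model rather than any particular platform’s audio engine:

```python
def proximity_gain(distance: float, max_distance: float = 10.0) -> float:
    """Volume multiplier for a voice `distance` meters away:
    full volume at 0, silent at max_distance, linear in between."""
    if distance <= 0:
        return 1.0
    if distance >= max_distance:
        return 0.0
    return 1.0 - distance / max_distance

# A listener can judge who is nearby purely from loudness:
for d in (1.0, 5.0, 9.0):
    print(f"{d} m away -> {proximity_gain(d):.1f}x volume")
```

Real spatial audio systems layer direction (panning, head-related filtering) on top of this, which is what makes it possible to hear someone enter or leave a room without seeing them.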
Julie: The other thing I wanted to note, in supporting accessibility for every community and introducing inclusivity: we recently launched an accessibility tab on the XR Collaboration website that lets you adjust the visuals. I wanted to see if you had a chance to try it, Yvonne, and what your thoughts were on that.
Yvonne: I did have a chance to try it. I thought it was awesome. And one of the reasons why — other than the fact that you can access someone’s website, which is always wonderful — is the language that is used to identify the need or the adjustment that you’re making. I apologize, I don’t remember the options off the top of my head, but it was very precise in identifying: what is it that you want to do? In most cases, when you go on a website, if there are options to change the font color, or any accessibility feature — and I’ll play into why this is also very useful — the user themselves, someone who has an accessibility requirement, will usually access features in the operating system. Now, some people don’t actually know how to do that when they’re turning on the device for the first time, or if they’re just learning how to use technology. So it’s actually very useful to have something as a plugin, or embedded in your site, that they can access on the site itself. Also, with operating-system accessibility features, you have to go into settings and change them depending on the site you’re using. So if a site has a color scheme — let’s say, blue and white — and that’s not the color scheme that works for you, then you would have to go into your accessibility features and change that. So the fact that you have that option with the features you’ve provided on the site, being able to do that in the moment and change things as you’re going from screen to screen, makes a huge difference, instead of always having to go in and change it from the… sorry.
Julie: The settings.
Yvonne: The settings! [chuckles] Yes.
Julie: So just to recap: at XRcollaboration.com, we recently released an accessibility tool. If you go to the website and take a look at the top left corner — this is what Yvonne’s talking about — you can scroll down to increase the text size, or switch to grayscale, high contrast, negative contrast with a light background, or a readable font. You can do this live on the website, providing that immediate accessibility for those who are visually impaired. I think even my mom enjoyed taking a look at it; she’s almost 80 years old, so an increased font size is a great tool for anybody who would like the font a bit bigger. You can take a look at it. As I said, it’s a free API called Pogo, and we embedded it into our website very easily. I think this is one of those steps where we need to start providing these digital tools to the accessibility community — to invite them, to make sure they’re included, and to provide the tools that will break down those barriers to entry for communication.
Yvonne: You’ve brought up a very good point, though: accessibility features are for everyone. Even within sight loss, roughly 400 million people are expected to have low vision within the next five to 10 years. That’s not necessarily on the spectrum of blindness in terms of acuity, but being able to access the types of features you’re talking about — contrast, inverting colors — that is universal design, far more inclusive than saying that if you have sight loss, here are features that will work for you. It’s more productive to be prepared in that way than to just rely on the technology or the accessibility features within whatever device someone is using.
Julie: So, Yvonne maybe just we’ll wrap it up with telling everybody, how can they reach you? And I guess if they have any questions with regards to what you’re working on.
Yvonne: Yes. So you can reach me in a few places. Yvonne Felix on LinkedIn, I will accept you, if you would like to connect with me. You can reach me at Twitter, @YvonneRFelix. And you can also reach me at CNIB at firstname.lastname@example.org.
Julie: Why don’t we have you close with a brief lesson for everybody about inclusivity, inspiring everybody to take the next steps to include people like you in their conversations?
Yvonne: Absolutely. One message that I’d love everybody to just take away is that you can work with the community and not just for a community. So embedding accessibility in your practices and just making it the way you do things means that everybody gets to participate. Whether it’s technology that you’re developing, having meetings, or just sitting around the dinner table with your family. Just do it with people, not for people.
Julie: And with that, that’s a great way to wrap up this XR for Learning podcast. Thank you so much for being with me today, Yvonne, I hope everybody enjoyed this session on accessibility. And thanks again.
Yvonne: Thank you. It was wonderful.
Looking for more insights on XR and the future of learning? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRLearningPod and connect with Julie on LinkedIn.