
EBU Access Cast


EBU Access Cast 27 transcript

Dec 13, 2020

Transcript of EBU Access Cast - episode 27

 

Introduction

 

(jingle)

This programme is financially supported by funding from the European Commission.

You are listening to the EBU Access Cast, the first official podcast from the European Blind Union about assistive technology for blind and partially sighted people.

And here are the hosts.

 

(Mario) Welcome, welcome to the 27th episode of the EBU Access Cast. My name is Mario Perčinić, I'm coming to you from Luxembourg and I have with me a bunch of my friends and some new guests in the show.

 

Yeah, yeah, yeah, we travelled away up to the North Pole because we have a new person in the show, so let's wish him a warm welcome from us, hello to Hlynur Þór… and what was your last name? Hahahaha.

 

(Hlynur) Hlynur Þór Agnarsson

 

(Mario) Yeah. Welcome.

 

(Hlynur) Thank you very much. Nice to be here.

 

(Mario) Yeah, it's the first time we have an Icelander on the podcast.

 

(Hlynur) I'm glad to be your first.

 

(Mario) Actually, we also had a contribution a long time ago from Birkir Gunnarsson.

 

(Hlynur) Yeah, a good friend of mine.

 

(Mario) Yeah, who is in the States, hello to Birkir.

 

(Hlynur) Busy man.

 

(Mario) Yeah, yeah, and besides Hlynur we have Tanja from Luxembourg, hello.

 

(Tanja) Hello Mario, hello everyone.

 

(Mario) Bart from Belgium. Hello Bart.

 

(Bart) Hello everyone.

 

(Mario) And Pawel from… where are you now…? Austria.

 

(Pawel) Hahaha. From Poland in Austria. Yes, hello.

 

(Mario) How is it going in these cold and contagious times?

 

(Tanja) Luckily we have Internet.

 

(Mario) Yeah, that’s true.

 

(Pawel) Just wanted to say, if it wasn't for Zoom, it would be quite lonely.

 

(Mario) Ooooh.

 

(Tanja) Yeah, and online shopping, you know, it would be really difficult. Now, it's quite easy with this. I mean, it helps.

 

(Mario) Absolutely.

 

(Hlynur) Have you finished buying all your Christmas presents?

 

(Mario) No, actually, we haven't started. But yesterday was Black Friday, by the way.

 

(Hlynur) Yeah, I finished even before Black Friday. Mostly I bought online.

 

(Mario) Yeah. Yeah. Anybody got some new gadgets this month?

 

(Tanja) Well, yes, I bought a new laptop.

 

(Mario) Really?!

 

(Tanja) Because my old laptop was already 6 years old. Even though it was still working more or less OK, as it was old some features were not well supported. And one of them was that with the old laptop I couldn't use the virtual background on Zoom, which is quite helpful if you're giving presentations on Zoom. So this was maybe the last reason where I decided, no, I need to buy a new laptop. And I decided to buy an HP, 15.6 inches, from the Omen 15 series. It is quite powerful: it has an i9 processor and 32 gigabytes of DDR RAM.

 

(Mario) What?!

 

(Pawel) Wow.

 

(Bart) Hehe, well done.

 

 

(Tanja) I wanted a really powerful laptop. I wanted it to fly, and now I have a plane.

 

(Pawel) Are you going to be a gamer now with such a powerful equipment?

 

(Tanja) Almost.

 

(Mario) I think it is a gaming machine.

 

(Tanja) It is a gaming machine.

 

(Pawel) That’s what I thought.

 

(Tanja) But I wanted a powerful laptop, so now I have it. It has a 512 GB SSD plus 1 terabyte of SATA disk, which is much more compared to what I had in the previous laptop, because that one had only an SSD with much, much less capacity.

 

(Tanja) But there is a small issue that I encountered; I hope that it is fixed now. The laptop was restarting from time to time on its own, and it restarted at moments when it didn't have to. So I hope it is fixed. We found out that it may happen on the Omen models from HP, maybe because of the graphics card, but at the moment I'm really not sure what the cause is. So I hope it will not restart at some point during the recording. But besides this, it's amazing. It's really amazing. It weighs 2.6 kg, which for aluminium and 15 inches is fine. It's larger compared to my old laptop; I had a 12-inch laptop before. So it's bigger, but it's convenient because it has a numpad as part of the keyboard.

 

(Mario) Oh yeah. That’s good when you’re using a screen reader.

 

(Tanja) It is convenient for us, for the Insert key, for the numbers. OK, it depends what you want on the keyboard, but it's very convenient for us.

 

(Bart) How about the arrow keys? Because I see the tendency that the up and down arrow keys use the same space as left and right. And I think I would have a hard time getting used to that. How about yours?

 

(Tanja) Mine is standard.

 

(Bart) Also something I would look for in it.

 

(Mario) But it's a good question. I mean, Bart raised a good question, because for example I have an HP machine from 2017 which also belongs to the gaming laptops, but it's the Pavilion series, and its up and down arrow keys are, let's say, a little bit squashed. I got used to it in 2 days; maybe for the first day it seemed like it could be problematic, but after a day you get used to it.

 

(Bart) Even if you used a different machine at work or...

 

(Mario) Yeah, yeah, yeah.

 

(Bart) Ah, OK. I think I would have a hard time switching.

 

(Mario) But let's say, for example… Actually it is like that: when you put the up and down keys together, together they are the size of one other key. So it's like half a key for up and half a key for down. But this is on my machine; I think on Tanja's machine…

 

(Tanja) No, mine is standard. And I don't have another key close to the arrow keys, which is also convenient because it is like on the standard keyboard. I don't have a right control key.

 

(Mario) Which is kind of becoming standard today.

 

(Tanja) Yeah, well, I have less and greater on that key.

 

(Hlynur) I think I've never, ever used the right control in my life.

 

(Bart) Control + a I think I would use it then.

 

(Mario) Hahaha, yeah.

 

 

(Bart) Control + a, control + c, I think I use the right one.

 

(Mario) Yeah. But you can do it with your left hand.

 

(Bart) I'm sure you can. Yeah.

 

(Hlynur) But there's a beautiful thing in having the numpad on your laptop.

 

(Mario) Sure. Yeah.

 

(Hlynur) I usually have it when I buy laptops for myself. Since I am visually impaired, I'm usually looking at around 17-inch laptops, which makes picking one much easier for me since there are not that many around. But my work laptop is, I think, a Latitude, something like that, 15.6 inch. And of course I use an external keyboard and mouse and two 23-inch screens while working. But I sure do miss the numpad when I'm working at home.

 

(Tanja) Yeah, well, for me now it's a luxury to also have the numpad, because I did not have it for the previous 6 years. So now I suddenly have a bigger keyboard. But, well, I don't have the right Windows key, which it is also becoming standard not to have.

 

 

(Bart) The application key.

 

(Tanja) Yeah, the application key. Well, for the rest, Home and Page Up, Page Down are used in combination with Fn, and the keys are well separated like on a standard keyboard. So the keyboard is very pleasant, the keys are well separated, the keys are not attached so that it's difficult to distinguish, you know, the blocks of F keys or the numpad.

I think it's a good choice. I really like the laptop. Most important, it is very, very fast. And I hope it will not crash again.

 

(Pawel) Can you actually tell? Because it's a gaming laptop and it's so powerful, does your assistive technology run faster with it?

 

(Tanja) Well, you know, everything runs faster.

 

(Pawel) OK, fair enough.

 

(Tanja) Compared to my old laptop it has a much more powerful processor and RAM and everything. I'm not sure specifically regarding screen readers, because everything runs faster, including screen readers. So even when I'm starting, for example, JAWS, which on the previous laptop would need like 15 seconds or so, here it starts much faster. If I'm entering a heavy page full of videos and pictures, now it just opens like Notepad.

 

(Pawel) Oh. That’s great.

 

(Mario) Hahaha. That's a cool thing when you have enough resources absolutely.

 

(Mario) Yeah. I noticed, for example, that when I switched to my current machine from the laptop I used to have before, which didn't have an SSD and had just 4 gigs of RAM, which was OK for that time — when I went to the SSD machine… pfff, yeah, things started flying around, so I can just imagine. I was also reading that those HPs have quite a cool design in the cooling system, so the air is blowing from different sides.

 

(Tanja) Yeah, exactly. It has fans on the right of the laptop and on the back. Well, it's much larger compared to what I had before, probably because it uses so much…

 

(Mario) Yeah, it's much more powerful machine.

 

(Hlynur) Is it less noisy, because of how large it is?

 

(Tanja) Well, it is more noisy compared to my previous one… But normally it is not running the fan at full speed all the time, only when you are loading something that takes more resources. Normally it's not noisy.

 

(Mario) All right, so congratulations on your new purchase.

 

 

(Tanja) Yeah, well, I think I will not have any Christmas purchases. Yeah, it was quite expensive, but I'm really happy.

 

(Mario) Nice. That's cool. Anybody else to report?

 

(Bart) So far I could resist, despite all the Black Friday emails trying to make you buy things. I think I should get a new iPhone, because I'm still on my beloved iPhone 6, which only goes up to iOS version 12. Now, on one hand, it's still running fine. Even though I never changed the battery, I can still get through the day on it.

 

(Tanja) Wow.

 

(Bart) So on the other hand, yeah, there are of course interesting improvements in iOS 14 that I don't have. But can I live without them? I think yes. So I'm in doubt, but there is no compelling reason at the moment to replace it.

 

(Hlynur) Of course, even though you can live without them, it doesn't mean you have to.

 

(Mario) Yeah, of course.

Actually, I ordered a new headset for myself, but it's not here yet, so I cannot say anything about it. It will arrive next week, and if everything runs fine, then I will have something to report for the next podcast. But yeah, for Black Friday I ordered the new Aftershokz Aeropex, which are the newest bone conduction headphones from Aftershokz, and one of the reasons why I ordered them was because people said that finally the issue with using them in VoIP services such as Skype, or Teams which we are using now, is gone — these headsets work fine because they are Bluetooth 5 compatible. So it should run fine, but I'm waiting to see how they run for real. And yeah, they're also saying that it now has magnetic charging instead of the USB connector, and some things relating to the design and the reproduction of the bass and so on are much better than on the previous ones. So, yeah, I'm looking forward to seeing, but more about it in the next episode, probably.

 

OK, so after we did our gadgets corner at the beginning, let's jump into the news section and see what we have for this episode.

 

 

Accessibility in the news.

 

Android Talkback Update (2020): Multi Finger Gestures

 

Mario: And for this episode, the first thing comes from Google — or actually from one small company called Live Accessible. They are based in the States, and they compiled a list of all the multi-finger gestures for TalkBack, which were released just 2 months ago. I'm really happy to see that somebody finally came up with a list of those gestures and a very cool YouTube video, because unfortunately Google didn't come up with an official explanation for the whole thing related to the multi-finger gestures in Android 11, which is really, really a shame.

 

Hlynur: I cannot imagine why they wouldn't brag about this one.

 

Bart: You would think they'd be proud of having them now?

Hlynur: Yeah, you would think so.

 

Mario: The thing is, I'm an Android user and I honestly think this is something we were really waiting for for a long, long time. This is one of the best accessibility improvements lately; it has been a long time since we got such a revolution.

 

Pawel: Yeah.

 

Hlynur: And we're also catching up with iOS on a lot of things, that's for sure.

 

Mario: And right now, when you look at this list of those gestures and the functions that you can do, you can really see that Android is now very close to iOS in terms of usability. However, for some unknown reason, in the official Google help pages for the Android Accessibility Suite you cannot find anything about it, which is a total shame for Google. Guys, come on.

Pawel: They take their sweet time to catch up quite often. If you look at how often the code base of TalkBack on GitHub is updated — even though they made some kind of declaration that it's going to be open source — it's not like they abandoned it. It's more like every once in a while they come up with: OK, we have this GitHub repository, we should probably update it, and then you have it. It's not like they develop on this repo and you see all the changes come in. It's just that every once in a while, once the production cycle is over for a certain version,

Mario: Yeah.

Pawel: they will update it. And I have a sad feeling there might be a similar thing going on with the help pages.

 

Mario: It might be, but still, that does not do it justice.

 

Pawel: Of course, knowing that doesn't excuse it; it just points to the source of the problem. Yeah.

 

Mario: I mean, what's really interesting is that, you know, somebody who is totally unaffiliated with Google took their time, did a proper video and a really good HTML presentation of all the gestures — what they do, how they work and stuff like that. And Google didn't find the time to do that. I mean, come on, guys, that's not OK. Yeah, that's true. Yeah, absolutely.

Hlynur: But nice work,

Mario: But nice work from the guys from Live Accessible, absolutely. So, yeah, if you're up for getting Android 11 and you were wondering what to do, check out the link in our show notes for the Live Accessible website and all the gestures.

 

Pawel: There is also — because we mentioned that there are some additional options coming, like the stopping and starting of multimedia content — now it turns out, I saw a thread on the Audiogames forum, one of the users got hold of Android 11, and there are some more undocumented features too.

Mario: No way, what?

Pawel: For example, there is a gesture that lets you pass the next gesture through to the app, past the accessibility service.

 

So if you want to perform that, just…

Mario: Yeah, that's the 4-finger thing. Yeah, yeah, yeah.

Pawel: And also the selector was redesigned, and now if you use the selector, you can set every granularity option as a separate thing. It's not like one granularity option; there are different granularity options that you can switch between, whether it's characters, lines, language — because language is a new option as well. And…

Mario: speech rate.

Pawel: Yeah, exactly. So it's not like one granularity setting and gestures that just move through the different levels of it; you can put the granularity options individually in the selector as you want.

 

And also the language is finally there so you can change languages on the fly and you can also switch the screen curtain on the fly, the dim screen.

 

Mario: Yeah, so I mean, this is what I was saying earlier, this is really becoming very, very similar to iOS. Essentially this is very similar to the rotor options.

Pawel: Indeed.

Mario: So, yeah, well, actually, I cannot wait for my Samsung to be updated to Android 11, which should be soon. But yeah, I don't know, maybe I'll get it for Christmas.

 

Pawel: My Motorola will not be, I'm pretty certain. Well, that's something to look out for for the next phone I get.

 

Yeah.

 

Hlynur: So May I ask?

Mario: Yeah sure.

Hlynur: OK, so at work I've been using a screen reader, JAWS, for probably a year now, but I don't use them regularly for myself — and here it is TalkBack that we're talking about. Now I have a Samsung and I've been using Voice Assistant instead of TalkBack, just because I had to start somewhere.

 

It was just that Voice Assistant was listed above TalkBack, so I turned on Voice Assistant. Would you recommend — for me who is, as I said, not a regular user, but I'm doing accessibility research on certain apps and certain pages through the mobile phone — would you recommend keeping Voice Assistant or just switching over to TalkBack with those new multi-finger gestures?

 

Mario: Yeah. May I ask which phone exactly you have?

 

Hlynur: I have the Samsung Galaxy A70.

 

Mario: The A70, yeah, the A70 is one of the phones that will probably get the update to Android 11. And in the previous episode we were talking about how all Samsung devices which get updated to Android 11 will also be updated with the new TalkBack, which will replace Voice Assistant, so there will be no more Voice Assistant.

 

It's going into retirement.

 

Hlynur: So, yeah, you don't need to choose anymore.

 

Mario: No, you will not need to choose anymore, and basically all the things that we just talked about will also apply to the older Samsung devices which get Android 11, so yeah.

 

Bart: It's good to know. It's unfortunate for the existing users, but for your testing it will be easier. Yeah.

“Everyone laughs”.

 

Hlynur: So I might just switch now.

 

Mario: You will be forced anyway, so, yeah, so if you want to just get some practising, yeah, it would be wise to switch earlier and get familiar. So, you know, when the time comes that you don't get lost.

 

Hlynur: I guess that you now know what I be doing for Christmas. Yeah. Yeah.

 

Zoom has a nasty bug for screen readers

 

(Mario) All right, so after we talked about some new things for Android, Tanja discovered something really interesting and unfortunate about Zoom — basically a bug.

It's a Zoom bug that is not even documented yet. I also tried searching around the Zoom tickets and I couldn't find it. Googling doesn't bring it up either, even though the bug is happening to almost everybody, and it's connected to the ...

 

(Tanja) ... to the screen reader.

 

(Mario) To the screen reader, and the thing is that Zoom is producing an error when you're trying to use the virtual background.

 

[Music]

(Tanja) This is the demo for the Zoom bug.

It happens when you are using a screen reader, JAWS or NVDA, and the virtual background, when the video is turned on. I will open Zoom now.

 

(Jaws) Start window, Zoom app, Zoom window, starting a new meeting with video on button.

 

(Tanja) And go in the settings to see where you can change the virtual background.

 

(Jaws) Setting button.

 

(Tanja) You go in settings from the main window of Zoom.

 

(Jaws) Settings window, list, general selected.

 

(Tanja) I tabbed, and the first option is General.

 

(Jaws) Video selected, audio selected, share screen selected, chat selected, background and filters selected.

 

(Tanja) Background and filter is the option that we select with the space bar.

 

(Jaws) Background and filter selected. Cancel button, Ok button.

 

(Tanja) And immediately, already there, because I have the virtual background already selected, I have this error popping up that doesn't read anything. It's only OK and Cancel.

 

(Jaws) Cancel button, OK button.

 

(Tanja) So I will press insert+B to try to read more.

 

(Jaws) This dialog. Ok button, Zoom window, system menu bar contains commands to manipulate the window, alt plus ...

 

(Tanja) And it says AOMHost — actually it is NVDA that says AOMHost; when JAWS is enabled it says something different. Again, a message that doesn't make any sense. So I will press OK.

 

(Jaws) Cancel button, ok button.

List, this dialog, ok button.

 

(Tanja) It pops up again. I will press again Ok.

 

(Jaws) List, this dialog, Ok button

 

(Tanja) And again Ok.

 

(Jaws) List, background and filter selected

 

(Tanja) And it disappears. So it always appears 3 times. I will tab to go through the settings of the virtual background.

 

(Jaws) Rotate 90 degrees button. Virtual backgrounds tab selected checked.

 

(Tanja) So it is selected.

 

(Jaws) Add image or video button.

 

(Tanja) I have an option "add image or video". I selected image.

 

(Jaws) List, background IDPC annual plenary meeting selected.

 

(Tanja) And here I have the name of the image that is selected. It happens the same with any image, even with the default images from Zoom. It is absolutely the same.

Ok, so I will go back

 

(Jaws) Setting button.

 

(Tanja) and start a meeting

 

(Jaws) Starting a new meeting with video on button. Ok button, cancel button.

 

(Tanja) And then again.

 

(Jaws) You are using the computer audio alert. Ok button.

 

(Tanja) We have this annoying pop-up. This time I will press Cancel.

 

(Jaws) You are in a meeting, ok button. Cancel Button. Return on this dialog. Ok button, cancel button

 

(Tanja) And cancel.

 

(Jaws) Zoom window, return to meeting button. Schedule button. This dialog. Ok button,

 

(Tanja) Ok, here it is again.

 

(Jaws) Cancel button. Schedule button.

 

(Tanja) If you Alt+Tab and come back into Zoom, it will disappear. But again, while this error message appears, the video is unstable and the virtual background unfortunately appears and disappears. And if you select a virtual background and expect that it works, it is somehow unexpected that the others suddenly see you without the background. So it makes the video unstable, unfortunately. And the way to avoid it is to turn off the screen reader, learn the steps to activate the video without the screen reader, and then turn the screen reader back on again. This is the workaround that I used and it worked, but it is quite tricky. So for the moment, we can complain to Zoom and hope that this will be fixed as soon as possible.

 

[Music]

 

(Tanja) Yeah. So if you want, I can explain?

 

(Mario) Yeah.

 

(Tanja) Yeah. So I discovered the bug at the same moment my laptop arrived, so at first I did not suspect that there is a bug in Zoom that gives an error when you're using a virtual background and a screen reader at the same time. If you're not using the virtual background, this error will not pop up. But if you have a screen reader with a virtual background and the video on, the moment you activate the video, or when it is activated by itself, there is an error that is not even visible on the screen. So it was really difficult for me to discover what was causing this issue. I was thinking, is it the graphics card of my new laptop, is it some drivers, is it whatever. But in the end it was Zoom. We discovered the same on Mario's laptop. And it happens with the last update of Zoom, the last version, and there is no way you can avoid it; it happens with JAWS and with NVDA. A workaround is to turn off the screen reader, activate the video and turn on the screen reader again. The problem with this error is not only that it pops up and gives you an OK and a Cancel button, and when you press either of them, OK or Cancel, it pops up again, and again — so it's really annoying. It also makes your video unstable, so on the other side people will see you with the background, then without, then with, then without. It will change on its own without any

 

(Mario) with no explanation actually. Yeah.

 

(Tanja) Without any changes on your side. So it's pretty annoying, and it's inconvenient for us when you are delivering a presentation and you're not aware whether you have a background or not. And also this annoying dialogue. So we have to report this to Zoom, because it can't be that it happens just like this with the screen reader and the virtual background.

 

(Mario) Yeah. As Tanja mentioned, I also tested it on my machine, which is about 3.5 years old, and it does happen on my machine as well. The reason it is difficult to spot is that if you have virtual backgrounds turned off, your video will start and stop without any errors and there will be no problem. However, once you decide to use the virtual background, then you start to have a problem. At the beginning, when we noticed it, I thought maybe it's a local thing, and we asked a few friends who connected remotely, but they couldn't see it on the screen. And we were like, OK, maybe it's a remote thing? I had a friend of mine here and he was also not able to see it, and it was like, OK, this is really bizarre, the screen readers are detecting something. But, yeah, eventually.

It's a kind of tricky thing, and it's really sad that it happens only if you're running a screen reader. And as I was saying, the bug is not documented yet.

So we'll have to raise the issue.

 

(Bart) It is surprising that... it is good that you tested it with both JAWS and NVDA. We know they use different methods of accessing the screen information. So, um.

 

(Tanja) Yeah, yeah, well, and it's really inconvenient, because you think, OK, once you access a meeting, you activate the video once and it's fine. But if you're in a meeting where they use breakout rooms, your video stops and starts automatically, and again you will have this error message. And when you are going back from the breakout room to the main session, again this error. So it's pretty annoying.

 

(Bart) I've never tried to set a virtual background, but I understand why you would want it to work. Yeah.

 

(Tanja) Well, yes, it's pretty convenient in the professional environment. You don't want others to see your place. We are all working from home. So, I mean, why not?

But then of course, it shouldn't give you these errors, and invisible errors, even worse. So it's difficult then to discover what is the issue. But actually it is a bug.

 

(Mario) Yeah, Pawel, did you notice that as well on your side?

 

(Pawel) I have a feeling I tried the virtual background some time ago, like a month or 2. I'm not sure if that really worked.

 

(Tanja) No, no, at that time it was not the case, because actually one of the reasons why I decided to buy a new laptop was the virtual background, and I tested it on another laptop and it worked perfectly. So I said, yeah, this works fine, you know, it's really time to get a new laptop. So that's 2 weeks ago now.

 

(Pawel) Ok, that must be pretty new. I haven't checked it yet, but I will need to do that. Maybe we'll find a workaround.

 

(Mario) Yeah. Just one thing came to my mind right now, but that's just a question of internal testing to see whether it works. Both screen readers, JAWS and NVDA, have a sleep mode. So I'm wondering if this error will come up if you just enable sleep mode, so you don't need to toggle the screen reader completely off.

 

(Tanja) Oh, it's almost, it's almost the same. I mean, you don't hear it.

 

(Mario) Sure, you don't hear it, but the screen reader stays kind of active in the system, so I'm wondering whether this error will appear or not. This is just one idea.

 

(Tanja) Well, it's still a work around that we have to do to avoid this error.

 

(Bart) It shouldn't be necessary to do this.

 

(Mario) Yeah. Yeah,

 

(Tanja) Yeah. Yeah. Well, we have to report it.

 

(Mario) We need to report it

 

(Tanja) Absolutely. And if you notice this too on your side, report it as well so that Zoom makes the change.

 

(Mario) Zoom knows what to do.
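
For anyone who wants to experiment with Mario's sleep mode idea in NVDA without toggling it by hand every time, a minimal sketch of an NVDA app module could look like the one below. This is only an illustration of NVDA's general app module mechanism, not something the hosts tested against the Zoom bug, and it assumes the Zoom desktop client runs as Zoom.exe (which is what determines the module's file name).

# zoom.py — place in NVDA's user appModules folder,
# typically %APPDATA%\nvda\appModules\zoom.py
# Assumption: the Zoom desktop client's process is Zoom.exe,
# so the module file must be named zoom.py.
import appModuleHandler

class AppModule(appModuleHandler.AppModule):
    # Keep NVDA loaded but silent whenever Zoom has focus,
    # the same effect as toggling sleep mode manually.
    sleepMode = True

With something like this in place, NVDA goes quiet automatically inside Zoom and speaks again as soon as you switch to another application, so the screen reader itself never has to be shut down while you start the video.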

 

Microsoft Teams should have Slack and Zoom on alert with its latest update

 

Mario: OK, so since we are on virtual chat platforms, another news item says that Microsoft Teams should have Slack and Zoom on alert with its latest updates.

 

Now, why is that? Well, because Microsoft is really pushing forward with the development of a lot of their software, and Teams is one of them. It has been used very much since the beginning of the pandemic. I started, for example, using it professionally at work as well, so I've pretty much been using it on a daily basis for the last 6 to 7 months, or even longer. And I remember at the beginning of 2020, Microsoft said that later in the year they would also allow use of Teams with personal Microsoft accounts, because before you needed to have an Office 365 account associated with Teams to be able to use it. And about 2 weeks ago they deployed new updates, and the new updates also allow you to use your personal Microsoft account. So if you have a Skype account that you're using, or whatever, that's also your Microsoft account; you can safely log into Teams and chat with other Teams users, similar to what we were basically doing on Skype. And from what I know — this is just my initial guess — I think Microsoft is going to replace Skype at some point; at least Skype for Business is going to disappear for sure. It is no longer going to be in development, because Teams is replacing it. However, what I have to say is that both on mobile phones, on the Android side, and on the Windows side, the accessibility of Teams is amazing. I don't know if some of you are using it too, but I have to say it's really nice to see that the programme really responds with a screen reader, and I don't see a reason why I should use extra scripts for Teams: everything is read out loud and you can get to the information that you need.

 

Bart: Are you using the client or the web-based version?

Mario: I am using the clients.

I've used the Web based version as well, but I'm using the clients on the phone and on the machines.

 

Bart: I use it occasionally. And as you say, I almost never hear anyone using Skype anymore — or even "to Skype", like a verb.

 

"Let's Skype" — which is a bit harder with "let's Teams", maybe.

“Everyone laughs”.

 

Bart: But yeah, I think that Skype might disappear.

 

Pawel: There is still a Skype for Business somewhere.

 

Yeah, but as I said, Skype for Business will go away. It is being replaced by Teams, that's definite.

Bart: You can see they really want to have it die out.

 

Mario: Yeah.

 

Pawel: They just need to implement the telephony part, where you can actually integrate Teams with your PBX or with your telephone lines, the traditional telephone lines.

 

Hlynur: Yeah, it's been some time now since they said they were switching from Skype for Business to Teams. I was working for the phone company here in Iceland for 7 years.

 

And I think it was maybe about 2 years ago when we started switching from Skype for Business to Teams and using Teams more and more.

 

Mario: Ok,

Tanja: Teams also has many more functionalities compared to Skype.

Mario: Absolutely. Yeah. Yeah.

Tanja: So I see why it replaces Skype.

Hlynur: It's a logical step.

Tanja: Yeah. Yeah. And for us it's convenient because we can also share audio along with the screen, which we couldn't do with Skype — there we could just share the screen. For showing our screen reader it's very convenient. And you can also give remote assistance to someone over Teams.

 

Mario: Yeah. Which is a really nice feature, especially now when you cannot get physical help from somebody. Let's be honest, if you're blind or have any kind of disability, you will, you know, sooner or later encounter some problem where you need help from somebody, and the fact that you can use Teams to do that now — it's really simple how you can do it, and it's effective.

 

Bart: So I'm using this in Zoom, and I'm happy to hear that it works now with Teams as well. Yeah, very useful.

 

Pawel: I tried the private part of Teams, where you log in with your private account. I actually discovered it by accident, because I was giving a workshop on Teams, and at some point my participants and I couldn't really find each other to add to the organisation I had founded for the workshop. We were wondering what the problem was, and then it turned out that somebody was logged in with a private account.

 

And they weren't receiving any requests whatsoever, and they had to somehow look in the email or re-log in, and then I discovered that you can actually switch between your organisation side of things and the private one, which is sort of a nice metaphor for switching off from your work life and switching over to your private life and vice versa, so they don't get mixed up. But I really regret that there are so many features missing from the private version of Teams. For instance, I really like the way the messages are threaded in conversations on Teams for actual teams. You just navigate through the messages up and down, up and down, until you find a new thread, then you press Enter and the thread expands in its own sort of interface and you just watch the replies to that one thread. And when you think of all the Facebook groups or WhatsApp groups that, for example, in different countries are formed by blind people to provide technical support or exchange ideas and advice on technology — nowadays a lot of people use WhatsApp for that, or Facebook, and Facebook is still better than WhatsApp — it's a lot of mess, with people discussing everything all over the place, and it's difficult to keep the structure and keep things in separate threads. So this could make the good old mailing list a bit more modern and instant-message-like, and on Teams this is perfect, except it only works in the organisational setting, not in the private one, which I think is a pity, because the interface is there. It works great with screen readers, you can expand and collapse the threads, but it's not there in the private version.

 

Mario: But I think this is something that could be reported to Microsoft. And, you know, there are people over there who are reading this stuff.

 

Pawel: Well, if they find it a priority to add — if they think this is relevant for private communication, because maybe they have a different view. But definitely we can report it, and maybe they'll listen.

Hlynur: I can definitely see them adding this to the personal profiles.

 

Pawel and Mario: Hmm. Yeah. Yeah.

 

Hlynur: I cannot see why not. Yeah,

Mario: No, no — thumbs up for Teams, absolutely. This is one of the pieces of software that was on my list of new things I started using during 2020. And when you find a piece of software that is at the same time accessible for us, that's, you know, a double plus. So, yeah, congrats to Microsoft.

 

Pawel: It may just look a bit scary for new users, because it's a bit big. You can navigate in 2 ways, either with Tab and Shift+Tab like a normal app, or like a web page. And for me, at least at the beginning, it was quite scary. Also if you don't use the English version — I use the Polish translation, for example — I couldn't figure out what all the buttons and options were about. Maybe that's just the translation's fault, but it seemed pretty huge to me. And this was a feeling a lot of my friends shared when they called me asking for advice or support: this is so big. With Zoom, if you have good guidance on where things are, you can quickly figure it out; here there are a lot of toolbars, tabs, lists, headings, edit fields. And I think a good place to start is pressing Control+dot, Control+period, and reading through the shortcut keys, because at the beginning they really help you reach the most strategic areas of the app quicker.

 

Mario: Yeah, and there are also really, really good accessibility guides on the Microsoft page

Pawel: that too.

Mario: which explain how to use it. And I agree that when you start using Teams, at the beginning it can get kind of confusing, and there is some transition time — that's what I'd say.

 

Hlynur: I totally agree, and I think that they probably know about that as well, that people are a bit overwhelmed when first using Teams. I know I was too, so it took me a while to get into it, but the pandemic surely pushed me to use Teams much more, because it was necessary.

 

But we held our General Assembly last October at the Icelandic Association for the Visually Impaired, and we were choosing which programme we would use since we couldn't come together.

 

 

Mario: Yeah,

Hlynur: We decided on Zoom instead of Teams because of, actually, how simple it was; it wasn't overwhelming at all.

 

We had a really good experience with it.

 

It was just like 4 or 5 shortcuts that people needed to know to, like, raise their hand.

Mario: Yeah, sure, yeah.

Hlynur: And mute or unmute. So we chose Zoom for that.

 

Mario: No, no.

Bart: I think the trouble is not that applications today are totally inaccessible — I'm sure that Google Meet can be used, Teams can be used too — but we need some time to familiarise ourselves with a new interface, and we need to learn shortcuts. And it's just annoying if you have a meeting with Teams in the morning, then one with Google Meet in the afternoon, and then maybe one with Zoom for your leisure time in the evening; for me it's a bit heavy to memorise all the different interfaces and

Hlynur: and you have GoToWebinar,

Bart: go back, connect.

“Everyone laughs”.

Bart: So I'm a big fan of Zoom, but that's mostly because we chose to use it on a daily basis. And of course, what you use a lot is what you like and what you know well. But I'm absolutely happy that all the other platforms can be accessible as well. This switching, though — I think for a sighted person it is not so difficult; they recognise quite easily where things are on the screen and it goes relatively fast. But for us, having to memorise all the shortcuts — and they also update quite frequently nowadays, of course, with so many feature requests — keeping up with the different platforms is not so easy.

 

Hlynur: Also, what I love about Teams is that when you have a meeting — let's say a staff meeting or a larger meeting between a few different organisations — there is the automatic creation of a conversation in the chat with all the participants of the meeting.

 

So you can then discuss with all those who took part in the meeting if you forgot something, or if you have any links or slides that you wanted to share afterwards; you have a ready-made conversation with those people, and I very much like that feature.

Also how simple it is — OK, yeah, it is like you were saying about changing between accounts, because I have my account with my association and I'm also a guest in the Teams of the national institution. And it's very easy for me to switch between those 2 accounts, even though it's the same email address.

 

Pawel: Can you actually already — because I think they had a plan about this — invite external people to your Teams meetings? So, for example, having an external expert at a conference who you would just like to pop in for an hour to give a lecture, but then you don't want them in the team anymore?

 

I think that this should be available.

Tanja: Yes, yes.

 

Hlynur: Yes — listen, we have been having external contacts at Teams meetings. I did, however, have a problem that I couldn't call anyone except within my organisation through Teams.

 

Pawel: Mm hmm.

 

Pawel: But then you need to invite them as part of the organisation. Or you can just invite them for one meeting in the organisation.

 

Hlynur: I think you can invite them as a guest, but does that expire at some point? I'm not sure. So what I actually did was that instead of just calling, I created a meeting with them, and that worked. But it's kind of a weird workaround: if I can have a meeting with whoever I want, why can't I also call someone?

 

Pawel: Mm hmm.

 

I think the mobile apps are quite a nice starter as well, if you feel a bit overwhelmed by the desktop app and you're in a panic situation — like suddenly, out of the blue, you were asked to come to a job interview on Teams, which you have never used before, or your professor is expecting you at a lecture at 8:00 in the morning sharp on Teams and you have never used Teams before. I think the mobile apps have a bit simpler interface if you really need to get to grips with this fast.

 

Mario: Absolutely, yes,

Hlynur: And I think it's definitely also the case that they will probably add some features to the personal accounts on Teams later on. I think they probably don't want to overwhelm users in the beginning, like we all probably felt when we first started using Teams within organisations.

 

Mario: Yeah, yeah. And for example, my phone is officially licensed to use Teams — Samsung did that with Microsoft. So what's really nifty is that on my phone I have push to talk for Teams. Yeeeeahh!!!

Pawel: that's cool.

Mario: Yeah, it's interesting. So yeah, if I'm on the phone and I need to drop into a meeting, it's cool to have.

 

Project Guideline from Google

(Mario) So after we did our virtual chit-chat, it's time to go out into nature for a run if we want to. But if you have a disability connected to visual impairment, then you might have some obstacles on the way, and that might be a problem. However, it looks like that situation might change and some things could improve on that front, because Google is coming up with Project Guideline. It's a very new thing. Project Guideline is a piece of technology which works on your smartphone, and basically what it does, from my understanding, is that it's able to detect a guideline on the ground while you're running and then give you audio indications of whether you are on the track or off the track. There will be a video in our show notes about it, but I think, Hlynur, you saw the video and you said that you are a runner as well, so you could tell us what's going on there.

 

(Hlynur) Yeah, I do run, but I wouldn't call myself a professional runner.

 

(Mario) But you ran a marathon and I didn't so.

 

(Hlynur) Yeah, I ran a full marathon last year. My first, but hopefully my last. What I found very interesting about this and very exciting is that you had a blind runner who hadn't been able to run by himself for over 20 years. And this technology made it possible for him to do so, and it was a very emotional moment, also for me, because I love in the evening to go out for a run. I live in the, like the suburbs of Reykjavik, so I have to run for like 5 minutes and then I‘m out of the city and I can run past lakes and through woods and just being alone, exercising, it feels good. And so this opens up a possibility for, I think, many people to really enjoy running, because sometimes you would like to do it by yourself.

 

(Pawel) Yeah.

 

(Bart) This guideline is then something which is pre-recorded and you can follow it?

 

(Hlynur) No, I think you use your mobile phone, you strap it with a belt on your stomach. And I think it's using the camera to detect.

 

(Mario) Yeah, that must use the camera for sure.

 

(Hlynur) Yeah, yeah. It's using the camera. They always have those painted guidelines. But I think what they're trying to do is they're trying to detect the pavement or it might not even be concrete.

 

It might be just a path in the woods. I think they're trying to make it detect as many different pathway surfaces as possible, so you wouldn't need any officials painting guidelines all over the place — I think that wouldn't be logical. And I also see that this could probably be done when using a race track, identifying the space between the lines there, so you would also know when to turn. And there were many people trying it out, a lot of blind people running by themselves in a group; it was very interesting. And I'm very much looking forward to seeing where this leads.

 

(Mario) Yeah, actually, on the project page they are basically inviting any organisations or individuals who are into sports and running to get in touch with Google to see how this technology can improve further — and maybe you would get the chance to test it out. Absolutely.

 

(Hlynur) We also have here in Iceland one professional runner who is blind and trying to make it to the Olympic Games so I will definitely be showcasing this to him as well.

 

(Tanja) It says that at the moment the technology supports only 2 locations.

 

(Mario) Yeah. Which I don't understand. What does that basically mean?

 

(Bart) That's what makes me think that it has to be kind of prepared or that someone has to run the tour first and record the images and that you can kind of download this virtual tour.

 

(Mario) Yeah, that's for sure, because they're saying that the app works offline.

 

(Hlynur) Yeah, it works offline.

 

(Mario) Yeah. So you don't need to be connected to the Internet so there must be some data in the phone.

 

(Pawel) They might be drawing the line on the floor, which I think happens for official marathons. Correct me if I'm wrong, Hlynur, but I think you need to have some visual guidance of where you need to run, so that you don't go off track and say, OK, I was first, when actually you went off the track and took a shortcut, which wasn't official.

 

(Hlynur) Yeah, actually, when I ran the Reykjavik marathon last year, there were no lines. We were just running on the streets, which were closed. I almost took the turn back and ran a half marathon, but there was an employee who directed me that I was supposed to turn left, so I did. But there were no lines there, just a lot of the staff — like when we were crossing some roads that weren't closed, they would close them for us so we could run without stopping. But there were no clear markings that I could see, except that you just follow the people in front of you most of the time, and of course some road closings with railings and things like that. But no line, never.

 

(Pawel) They also mentioned a virtual line. Maybe the line will be just existing in the app.

 

(Hlynur) Yeah, maybe it's I mean, it might be using GPS even though it's offline.

 

(Mario) Yeah, but if it's relying on GPS, then it would have to use very, very precise positioning — it would have to be relying on Galileo — because otherwise regular GPS, for example, doesn't have that narrow a precision.

 

(Hlynur) And that's why they must be using the camera, because they're telling you, if you're on a race track, whether you are in your lane between the lines, or if you're going a little bit too far to the right it will direct you to the left. So it's far more precise than GPS. So it must be using the camera — just how it is doing it, I'm not quite sure, but I would very much like to know; maybe we should dig into that.

 

(Pawel) And something Google is doing already — and I think it's really a shame it doesn't work for blind people at all, because I tried it. They have a feature for sighted people where, if you get lost and you don't know which way to go — as in, should I turn left or right, should I go ahead or turn around and go back — it will turn on the camera for you. It will analyse the picture of the buildings around you and anything it can see, compare it with the imagery they already have, and paint arrows indicating the direction for you. And this actually works. I tested it with a sighted person on Google Maps, and I know for sure that many sighted people don't even know where to go sometimes; they have Google Maps and they just take a random turn and see if it's getting better or not really, and then they turn around if they need to. In order to eliminate that, there is this feature — I think it's called Live View, or Live something. And what happens is that, as I said, the image from your camera is analysed against the imagery they have, and based on this the direction is given to you.

 

(Bart) Why do you say it's not working for us? Because they only indicate visually where you need to go?

 

(Pawel) Yes. So, yes, there is no indication, nothing for the screen reader, nothing written in text, nothing like that. There is just this graphical overlay, because it's AR and they just paint the AR arrow onto the image you see from the camera.

 

(Mario) It could be done with audio for sure.

 

(Pawel) It definitely could.

 

(Hlynur) Yeah, it could be.

 

(Bart) And I was doing this. There is an app where, especially when you have to start a guided tour, you don't know in which direction you have to start your route. And I think it works like a compass: you turn around and, I don't know exactly, it gives a certain beep or stops beeping, and that's the way you should start. So it shouldn't be too difficult to make it accessible.

 

(Pawel) I think it's doable, but they just need to think about it.

 

(Mario) Yeah, yeah, but I think this is the thing they're doing in this Project Guideline, because at a certain point of the video you can actually hear this audio from the smartphone, how it sounds. And you can definitely hear — at least I think this is what's happening on the screen — that as the audio is changing, the person is going left or right, I mean, that he's not right on the line of the track. So, yeah.

 

(Hlynur) And also, it looked like a nice touch that when he was on the right track it was mostly silent, and it only seemed to give him some sound once he was too far to the right or too far to the left.

 

(Pawel) I think it did say in one of the articles I read that the louder the sound is, the further off the track you are.

 

(Mario) Oh, yeah, so definitely something we should look forward to, and we can see how it will develop further, but I see huge potential, which is great.

 

(Hlynur) Yeah, we should definitely keep an eye on this one.

 

(Mario) Yeah.
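
To make the feedback idea a bit more concrete, here is a small toy sketch of the kind of mapping described above: silence while the runner is roughly on the line, and a tone that gets louder the further they drift to one side. The distances, the dead zone and the choice of which ear gets the tone are all invented for illustration; they are not taken from Project Guideline itself.

# A toy model of "louder means further off the track".
# Offsets are in metres; negative means the runner drifted left.

def guidance_gains(offset_m, dead_zone_m=0.15, max_offset_m=1.0):
    """Return (left_volume, right_volume), each between 0.0 and 1.0."""
    drift = abs(offset_m)
    if drift <= dead_zone_m:
        # Close enough to the line: stay silent.
        return 0.0, 0.0
    # Grow the volume linearly up to the largest offset we care about.
    level = min((drift - dead_zone_m) / (max_offset_m - dead_zone_m), 1.0)
    # Play the tone on the side the runner has drifted towards,
    # so the correction is to move away from the sound.
    return (level, 0.0) if offset_m < 0 else (0.0, level)

if __name__ == "__main__":
    for offset in (-0.8, -0.2, 0.0, 0.3, 1.2):
        print(f"offset {offset:+.1f} m -> gains {guidance_gains(offset)}")

Whether the real app pans a tone, changes its pitch or does something else entirely is not clear from the video; the sketch only captures the "quiet on track, louder off track" behaviour the hosts describe.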

 

Accessible pregnancy test for the blind

(Mario) OK, so the next topic is very interesting for blind women and girls, because it looks like some people are coming up with the idea of developing a pregnancy test for the blind, which I absolutely support. I think that when a woman who is visually impaired or blind cannot see a normal pregnancy test, it's definitely our right to have a test which is accessible in some form. So it's really cool to see that people at RNIB are trying to take care of it. From the article that we have in the show notes, you can see that they developed a prototype together with a developer — a prototype of a test which is tactile based. From my understanding, you have 2 different indicators on the test: one showing that the test is positive and one that the test is negative.

 

(Hlynur) I think it's one showing that it works. And if there's a second one, then it's positive. If not, then it's negative. Like the 2 lines on the regular tests, then 2 lines would mean pregnant and 1 line would mean that it's working but you're not pregnant.

 

(Mario) OK, well, in any case, the thing seems to be tactile, and currently it's in the prototype stage. My understanding is that people at RNIB want to draw the attention of health organisations to start implementing something like that on a regular basis, which would be great, because nobody is supposed to be left behind. And I can only guess, because I'm not a girl and I'm not a woman, but I can guess that when you come to the point where you have to do a pregnancy test, you are at least excited because you are waiting for something, or you are wondering, oh, am I pregnant or not. And for sure, this kind of information brings lots of emotions to the person taking the test. And if you're relying on somebody else to tell you the result, and maybe also make some comments about it which you don't want to hear at that moment, that's really not nice.

 

(Bart) Let's just hope that we can make this universally designed, because if you want to know the answer, you don't want to order a specific test and wait for it to come by post and then by then… you want to have access right away. So if we can find a solution which will be implemented by the regular producers of these test kits, it would be great.

 

(Hlynur) This is an important human rights issue. This is just something that needs to be there. I have 2 kids myself, so I know that the moment when the mother has the positive result, this is her moment and hers only. So this is very, very important, and we should see to it that this becomes a universal possibility.

(Mario) Yeah, yeah.

(Tanja) Yeah, I think it's also important for couples. I mean, for the woman for sure, but for couples too — I mean, if the partner is blind too, because then you have to share the information again with someone else who is not your partner. So yeah, for the woman for sure, but for couples also.

 

(Hlynur) Yeah. It's an amazing feeling to see a positive pregnancy test. I have done that before and I can assure you that looking at a positive pregnancy test is a great experience, and if I weren't able to see it, I would definitely want to be able to feel it. It's an experience.

 

(Tanja) Mm. Yeah. Well, for the moment the solution that we have is calling, in Be My Eyes, the specialised assistance from Clearblue. But again, we have to ask somebody — even though, OK, this person is not somebody we know, it's a professional who is there to assist us — it would still be nice to have an accessible pregnancy test.

 

(Mario) Yeah, let's give a thumbs up for that, and this is also going to be in our show notes. So, yeah, people who want to see and support that, definitely use Twitter. Don't just read the tweets from Donald Trump; you can use it for other tweets too.

 

(Tanja) Hopefully not anymore.

 

(Mario) Yeah.

 

(Pawel) And also let the companies you trust for this kind of device know, because again, it's a design project. So you can just tell them: look, there is somebody who designed it for you, you don't even have to do your own R&D for this, just implement it. And maybe it will be like medicine at some point, where you just expect that the medication you buy has braille labels on it.

 

(Mario) Yeah, the only thing, from the technical point of view of this whole thing, what I understand is that there is a specialised part which raises these dots or lines. So you have to, like, insert the test strip inside the thing, and then it uses some vibrating mechanism that raises those lines — basically this is how it works.

 

(Hlynur) Yeah, it's like silicone bubbles, I think.

 

(Mario) Yeah, something like that. What I think — I mean, maybe somebody else comes up with something else — but from the way the thing is functioning, you would need to order that kit to have it, and then you could maybe use whatever tests with it, I don't know. But that's how it works. OK, but I mean, this is just a prototype; that's why prototypes exist. Because eventually, when you look at the final product of whatever you have, the first prototypes were something completely different. I remember even seeing the first prototype of Google Glass — at least that's what they said it was — when I was at Google for the first time, in 2013. They were showing us Google Glass, which had some of the same things as the later versions of the glasses, but they were also different. So, yeah.

PlayStation 5 accessibility news

 

Mario: OK, so after we have done all the things relating to human rights and the right to accessible pregnancy tests, we can go to the last topic for today, and that's connected to the PlayStation 5, which just came out last month, I guess, and there has been accessibility news about its new screen reader and what it can do for the moment. Pawel, is that correct?

 

Pawel: Yes. We already discussed PlayStation 4 a little bit…

 

Mario: yeah

 

Pawel: By talking about The Last of Us Part II in one of the previous episodes. And meanwhile, we sort of expected, but until they announce it you can never be sure, that the PlayStation 5 would be better than the 4, because on the 4 there was a TTS facility, as it was called back then, that was sort of a screen reader. But first of all, it was just available in the US, so not even in other English-speaking countries, just the US, so it was firmware dependent; and secondly, it didn't read everything, as it turned out. So now the PS5 is out and it has a lot of interesting features in store. It has a screen reader, yeah, it's now actually called a screen reader; it's not just in English and it's now firmware independent, so you can also get it in a couple of other languages, I believe German, French, Spanish, probably some more, but these are the European ones.

 

Mario: mhm

 

Pawel: And the best thing about it is that the start-up process is completely accessible, so when you run your console for the first time, you will hear some music, and if you are blind, you just have to wait around a minute and it will start giving you instructions on what you should do: connect your controller, then choose this, go here, and go through the whole wizard, and the whole experience is accessible.

 

Mario: MHm [admiration]

 

Pawel: Apparently, it also gives you the instructions in a couple of languages, so in case you don't speak English, you can still hear them in the other supported languages; you know from the start what to do no matter where you come from, and then you can just choose the language that you want. And everything about it is accessible, so almost wherever you go in the settings (there are a couple of exceptions), everything is read out to you: the achievements that you have already gone through in the games, all the trophies you have earned, the missions you have to do, the activities (as they call them now), so like these mini quests, and trophies also count as activities now, at least for the older games; and all the multimedia apps: you have Netflix installed, there is Twitch, there is… Disney Plus is not really accessible, but YouTube, Apple TV even, Apple TV+ I mean; and there is also, I think, some American-centric stuff like WWE. So most of this, or almost all of it, is accessible. You can watch your favourite multimedia. You can play the games as long as they're made accessible, of course…

 

Mario: mhm

 

Pawel: And all the settings are accessible.

 

The only inaccessible part is anything to do with web browsing, for some reason, so if there is a web view embedded in any experience, this will not be read, I believe. At least on the start-up screen, what is not read is the place where you have to enable two-factor authentication on your PSN, or PlayStation Network, account and add your mobile number to it. But what happens instead is that you are told: “OK, this screen is not accessible on the console with the screen reader, but if you want to do it anyway, here are the URLs you can go to from your computer or any other device to do these steps; you can skip them for now and do them later from another device that is accessible with the web…”

 

Mario: Ach, OK

 

Pawel: So it's good they really thought about it. It's not ideal, but it's really the best solution for what could be achieved at the moment. Also, from what I heard, as soon as you turn your preferences on or off in your accessibility settings, so, for example, I need audio description, I need subtitles, I need this, I need that, this should carry through to all your experiences, so if a game offers AD…

 

Bart: [affirmative noise]

 

Pawel: You'll get AD in the game. If you watch something on Netflix and the series or movie that you watch has AD, you'll get AD. If you need subtitles, you'll get the subtitles. If you need some specific contrast settings, you get them in the games as well, if they're supported, so you don't need to turn everything on separately everywhere.

 

Mario: mhm

 

Pawel: Also, there is the option to dictate text, so you can use dictation and you don't have to use the on-screen keyboard or hook up an external keyboard; you can just have the controller. And yeah, a lot of what you want is actually read out to you: as you go through the settings, through the menu, through the screen to start playing a game and check your statistics, you can see what your friends are up to on your PlayStation Network, whether they're playing or not and how you compare against them in the statistics of certain games. So all of the experience is accessible, and now here's hoping that there will be more accessible games from now on…

 

Bart: Mhm

 

Pawel: That it will not just be The Last of Us Part II, but that there will be some new games from more companies that will be accessible too, and yeah, it's really great to hear that

 

Bart: Mhm

 

Pawel: the gaming industry on the mainstream side is also pushing forward, because it's a difficult accessibility issue to tackle. With websites or apps you sort of think in patterns: “OK, here I have to put a heading, here I have to label a form field, here I have to put text on the link.” But every game is different, because it's actually designing 3D worlds and whole imaginary scenarios, and you have to really think creatively about how to replicate it, and whether it's at all possible to replicate the whole experience for a person who cannot see, because, well, games were basically built in a visual way. So this is really new, this has been explored only a bit, and there are still a lot of things unknown, for instance whether blind people are able, in all possible games, to compete with sighted people in multiplayer

 

Bart: MHm

 

Pawel: or whether this is not feasible and maybe we should talk about something like computer game Paralympics.

 

[everyone laughs]

 

Yeah, I mean, when we think of gaming as an e-sport, where you actually train, get some awards, get better and better at it, and it's treated like a normal sport…

 

Bart: mhm

 

Pawel: at some point you start having these thoughts and… Yeah, this is a whole new chapter to be explored, but I'm really happy to see that this is being done, that there are events on accessible gaming, and that there are people who want to think about how to do this better. I recently read, for example, about somebody doing a Minecraft mod where apparently blind people will be able to play, and it's also quite a challenge, because the whole game revolves around building completely new creations out of preset blocks, and the ways in which you can combine these blocks are countless and you can be really creative, but it's a super visual exercise, so how do you replicate this for people who cannot see the blocks? And, yeah, this is really a whole new spectrum.

 

Mario: Mhm, OK, I see what I will do when I retire.

 

Pawel: Yeah, sure.

 

[everyone laughs]

 

Pawel: By then, hopefully, the whole spectrum will be somehow more accessible, and there will be more knowledge as to what we should do about this, how games should be made accessible and to what extent they can still be competitive.

 

Mario: Mhm, OK, OK. No, no, no. But it's really cool to see that the stuff is evolving. As a matter of fact, we just talked about the PlayStation 4; it was like half a year ago.

 

Pawel: Mm hmm.

 

Mario: And right now, it's really cool to see that with the PlayStation 5 they basically sorted out many, I will not say all, but many issues which were present in the 4, especially the ones connected to the firmware thing, where you were depending on US firmware etc., which is now fixed, so that's really good.

 

Bart: It's really nice that there is now a screen reader even in this kind of device…

 

Mario: yeah

 

Bart: I never thought I would play video games with my child, but even if you're not playing the games, you can still have an idea of what they are doing on the device.

 

Pawel: That too, for sure.

 

Bart: When you cannot follow the screen, you can at least, if you have any doubts, turn on the screen reader and check what they are doing, where they are, or, if they get stuck, you can help with it… That's really empowering.

 

Mario: MHm, Yeah, yeah, yeah.

 

Hlynur: Exciting stuff

 

 

Pawel: Yeah, the Xbox Series X is also out, but Xbox already had a reputation for having good accessibility on board; Narrator was there.

 

Bart: Yeah, Microsoft knows about that; they have a screen reader in Windows, and for Sony I think it's quite new to do all this development.

 

Pawel: Yeah, yeah. And you have like a… They actually developed brand new TTS voices, so you can hear…

 

Mario and Bart: oh, wow!

 

Pawel: They have something custom. They don't use anything ready-made.

 

Bart: [admiration noise]

 

Pawel: They made their own voices so…

 

Bart: [admiration noise again]

 

So they also think about this. Funnily enough, from what I heard, Xbox doesn't really offer that many accessible games, and with the PS5 you already see, for example, The Last of Us, and there might be some more. And well, it's quite ironic that, you know, the console that gave the better accessibility impression at first doesn't offer that many games that are accessible themselves… It's still spinning up, I guess, this whole idea of making really complex titles accessible… Yeah, that's still sort of developing… And I'm really curious. Maybe one day I'll be able to play FIFA. That would be interesting.

 

Bart: MHm

 

Mario: [laughter]

 

Yeah, yeah, but who knows? There are some… As a matter of fact, I know that there are some game developers from Croatia who just recently got purchased by huge production studios from the States, so… Yeah, maybe I should send them some tweets also, like: “Hey, guys! Pay attention to accessibility.”

 

Pawel: Definitely.

 

Mario: yeah, yeah.

 

Hlynur: This is relatively new at the scale that we have been seeing with The Last of Us, so I think we need to stay very much on our toes and regularly let people know of the importance of good accessibility in video games, and that it is even possible.

 

Pawel: Yeah, the problem is you cannot really give people ready-made solutions, like with WCAG: “Look, here are the guidelines, just follow them”, because in gaming the problem is you have to innovate. [laughter]

 

Hlynur: Yeah

 

Pawel: And not everyone has the resources and the drive to innovate in this direction, so that's where it might be a bit slower than it is with websites and apps.

 

Mario: Mhm

 

Hlynur: Yeah, just letting people know that others are doing it and that it is possible.

 

Pawel: Mm hmm. For sure.

 

Hlynur: Just an eye-opener, I think. I think we aren't much further than that in this field.

 

Pawel: Mhm

 

Mario: Yeah, yeah, let's see. If we start bombarding them with many new ideas, and if somebody also has enough time to join the beta testing teams for whatever, then you are on the right track, because then you can give enough input to developers about what they are supposed to do. Of course, only if they want to listen to it, but yeah.

 

Hlynur: Mhm

(Jingle)

(Voiceover) Demo time.

Time for a demonstration.

 

(Mario) This time we have for you a contribution from Mr. Benjamin Hofer from Germany, who sent us a demo of the Envision glasses. This is a new product which just appeared on the market, and you will be able to hear the demo at the end of our podcast.

 

 

 

(Benjamin) Hi, everyone. Welcome to my first demonstration on EBU Access Cast. I hope you're doing well. My name is Benjamin Hofer. I'm from Germany. And today I'd like to talk about a brand new piece of technology which not only has a lot of great features, but is also usable by both totally blind and visually impaired people. I'm talking about the Envision glasses today.

Some of you might have heard of Envision, the company making the Envision AI app. The app is available for both iOS and Android devices. It can do many things like reading text, recognising people, objects and products, importing pictures from different sources and so on. Envision has now brought these features to a pair of smart glasses, more exactly to the Google Glass Enterprise Edition 2, which is based on Android and thus is an open system. So hopefully we can expect more features, and also third-party apps, on those glasses.

The Envision glasses are available as of now. They were officially launched on November 26 and can be looked up and ordered on Envision's website, letsenvision.com/glasses, and of course through local distributors.

I have had my own pair of glasses since the end of October. I got them during the pre-order shipments, and now I'd like to tell you a bit about what the glasses look like, what features we have on them right now and how I'm using them. When you order the glasses, you can choose between a standard frame and the Smith Optics frame. The standard frame is without lenses, whereas the Smith Optics frame is for those who wear glasses and want to put their own lenses in. I have the standard frame; it's basically a small and lightweight titanium frame which is attached to the device itself when setting up the glasses.

For the setup of the glasses, you need the Envision app. They ask you to go to envisionglasses.com, then download and install the app if you haven't done so yet, and change to the glasses section in the app.

 

And there you can pair the glasses with your app, and you get explanations on how to attach the frame. The frame is basically the left part of the glasses. It goes from your left ear to the right hinge, where it's connected to the device itself. The device itself is the right part of the glasses, so this is the part which is on the right side of your head when you are wearing them, and it contains the battery, the processing unit and the touchpad. You control the glasses using the touchpad, with gestures very similar to those you are used to on iPhone and Android devices with VoiceOver or TalkBack. There are swipe gestures, double tap, single tap, tap and hold, with 1 or 2 fingers. If you're used to an iPhone or an Android phone, you will be very familiar with operating the glasses.

The touchpad is located right on the temple, where you can access it very easily when wearing the glasses, and it goes from the right ear to the front, up to the hinge. The camera is located at the front on the right-hand side, but is angled to the left, so if you're holding something in front of you, it is recognised as if the camera were in the middle.

 

Let's do a short demo of the features of the glasses. If I do a 2 finger swipe down, I always come back to home.

 

 

 

(Envision glasses) Home.

 

(Benjamin) If I double tap here:

 

(Envision glasses) It is 14:20.

Today is Sunday, 29th of November. Battery level is at 72 percent. You are connected to Fritzbox 7490.

 

 

(Benjamin)

It tells me basic information about the time

 

(Envision glasses) Two finger swipe down gesture to set your device to sleep mode

 

(Benjamin) and the date, Wi-Fi status and battery life. And I can do a two finger swipe down to put the glasses into sleep mode. The sleep mode is now active. Let's wake it up. Yes.

 

(Envision glasses) Home.

 

(Benjamin) And then go through the different options in the menu.

 

(Envision glasses) Read.

Identify. Find. Call. Device settings. Feature preferences. Help.

 

(Benjamin) That's it for the categories. Let's go back.

 

(Envision glasses) Home. Read.

 

(Benjamin) And if I double tap on Read:

 

(Envision glasses) Read.

Instant text.

 

(Benjamin) I have the features in this category shown. If I do a tap and hold, every feature is briefly described.

 

(Envision glasses) Instant text. Instantly read text around you. Single tap to play or pause. Two finger tap for more options. In more options, you can choose to switch between online and offline Instant text.

 

(Benjamin) Double tapping on that will read every text in front of me.

[takes a piece of paper]

(Envision glasses) To kill a Mockingbird. Harper Lee. In instant text.

 

(Benjamin) So that was the cover of a book. It reads every text that's in front of the camera, so you can use it very well for identifying groceries, but also outside for reading signs or displays.

 

For example, I'm using it to control my coffee machine, which is not accessible by default, but I can read and control the menu using the Instant text feature. That's really cool.

In the description, it told us that you can choose between online and offline mode. The offline mode is usable without a Wi-Fi connection, whereas the online mode requires internet connectivity; the online mode is more accurate and, in my opinion, a lot better at recognising the different kinds of text.

 

(Envision glasses) Instant text.

 

(Benjamin) That was Instant text.

 

(Envision glasses) Scan text.

 

(Benjamin) Scan text is for longer and denser text like in documents and things in the mail, screens and so on.

 

 

(Envision glasses) Scan text. Scan a document or a long piece of text. Two finger tap for more options. You can enable or disable text detection which detects how many words are appearing in front of you. Double tap again to take a scan of the text.

 

 

(Benjamin)

Text detection is very handy in my opinion. It detects how many words appear in front of you.

 

[fast clicking sounds, the sound of a camera taking a picture, a few tones indicating that something is processing]

 

And you can take the picture by double tapping.

 

(Envision glasses) Reader. To Kill a Mockingbird, the unforgettable novel of a childhood in a sleepy southern town, in the crisis of conscience that rocked it. To Kill a Mockingbird became both an instant bestseller and a critical success when it was first published in.

 

 

(Benjamin) I can stop the reading with a single tap and…

 

(Envision glasses) [some tones] Text exported successfully. Compassionate, dramatic and deeply moving.

 

(Benjamin) And we can export the text to the app, so that it can be opened in the Envision app, and then we can do something with it, like saving it to a cloud or sending it via email.

 

 

(Envision glasses)

Swipe down. Scan text. Batch scan.

 

(Benjamin) The last feature in the Read category is Batch scan. It basically works like Scan text, but is for scanning multiple pages at once.

 

 

 

(Envision glasses) Read.

 

(Benjamin) If I do a swipe down, I can leave the Read category and

 

(Envision glasses) Identify.

 

(Benjamin) Go to Identify. Let's double tap.

 

(Envision glasses) Identify. Describe scene.

[two beeps, the sound of a camera taking a picture, some tones indicating that something is processing]

 

A laptop computer sitting on top of a desk.

 

 

(Benjamin)

Ok, I think that's kind of clear what this feature does.

 

It snaps a picture and tries to describe what it sees.

 

(Envision glasses) Describe scene.

 

Detect colours.

 

(Benjamin) The second and last one here is Detect colours. It works like any other colour detector; it tries to recognise the colours, for example on clothing. I have had quite good experiences with it, but be aware of light conditions. The lighting has to be really good to make this work well.

 

(Envision glasses) Identify. Find.

 

(Benjamin) The next category is Find.

 

(Envision glasses) Find object.

 

Back tap.

 

(Benjamin) And here you can select between different objects and find them.

 

(Envision glasses) Bicycle.

Chair. Cup. Bottle. Keyboard. Cup. Laptop. [ping sound]

 

(Benjamin)

Here we have got a keyboard, cup, laptop.

 

OK, now it detected my laptop, so it pings when it detects the object you've selected before.

 

(Envision glasses) Find object. Find people.

(Benjamin) The next one is Find people. OK.

 

I pointed it in my direction and it recognised me, because I have saved my face in the Envision system, which can be done within the app. You have to go to the glasses tab in the app and then save your faces, and then it will recognise those faces. If there's a person that is not saved, it will ping as it did with object detection before. So you can tell if there are persons in the room or outside.

(Envision glasses) Explore.

 

(Benjamin) The Explore feature is kind of both: it detects objects and people.

 

(Envision glasses) TV. Person. DVD.

 

(Benjamin) But it's more… OK, it recognised a person here in a picture on the cover, it recognised the DVD, and the laptop was recognised as a TV. So we can use it to look around with the glasses and scan the environment. That was the last feature here.

 

(Envision glasses) Find.

 

(Benjamin) Oh, and now there is the Call category. That's quite a cool feature. You can add an ally in the Envision app, who can be a friend or family member and who can see, in a video call, everything you have in front of you with the glasses and help you out in some situations. I think this is very good.

(Envision glasses) Device settings.

 

(Benjamin) Device settings contain settings such as speech rate, volume, Wi-Fi connectivity and Bluetooth connections. You can connect the glasses to Bluetooth speakers or headphones, you can do software updates there if some are available, and you can set the glasses to sleep mode or power them off. Feature preferences contain preferences for some features of the glasses, for example the online/offline mode of the Instant text feature, the text detection of the Scan text feature, and so on. And the Help section finally contains tutorials for every feature and a playground to practise the gestures on the touchpad. So that's it for the demo of the functions. I think the product is a very good one, especially because it's an open system. And as I said, it can also be extended by third-party apps; think of things like Be My Eyes or Aira, where available, or navigation apps, so that you can really control lots of things on the glasses themselves. I personally use the glasses especially for reading things on packages, on screens and on displays, and for reading my mail, so all kinds of things. But I also manage to read things outside, like signs or text on the street or something like that.

 

As I said before, it's really important that you have good light conditions when using the glasses. So if you are in a dimly lit or dark room, it's likely that most of the features will not work properly. That's what I have to keep in mind as a totally blind person. But Envision is aware of that, and they will probably try to add some features that tell you how good the light conditions are and how well the glasses can work. So maybe that will get a bit easier in the future. For me, this is not a big problem; you just have to be aware of it. That's it for my demonstration. For more information, go to letsenvision.com/glasses.

 

I hope you enjoyed the demo. Thank you for listening.

 

Mario: All right. So I think we covered everything for this show. This is the last episode for 2020, believe it or not.

 

So, on behalf of our whole team, we wish you a merry Christmas and a happy New Year.

 

Pawel: Oh yeah, and happy New Year.

 

Mario: And if you have some interesting stuff and some interesting demos of products, send them to us and we will be happy to publish them in the next episode. We do have some indications from some of our listeners that we will get some demos, but until we get them, we can't publish them, so you'd better send them. Yeah, of course. As always, we are reachable by email at Ebuaccesscast@euroblind.org and via our Ebuaccesscast Twitter, so you can reach us with whatever you want, your comments and suggestions. And we hope we are not too boring for you, and yeah, as always, the episode will be transcribed. And yeah, we are looking forward to seeing you in 2021. Take care, and thank you for listening.

 

This has been EBU Access Cast.