Virtual Assistants – One of the most important accessibility trends

Virtual Assistants have become part of our everyday lives. AI assistants like Alexa and Siri are revolutionizing the way we communicate with our smart devices. However, what many might not realize is the impact this technology has when it comes to accessibility. This blog post will offer another perspective on these emerging AI assistants.

I was eating breakfast one morning while listening to Amazon’s “Alexa” giving me the latest news brief. I was in the process of deciding what to talk about in an upcoming presentation on accessibility and usability, and as I was too lazy to reach across the room for my computer to do some research, I decided to ask Alexa to tell me something about accessibility in user design. That is when I realized that the Alexa device itself is a great example of a revolutionary invention in accessibility.

So what is it? Most people have probably heard of virtual assistants by now, and many also have this technology in their phones, in the form of e.g. Apple iPhone’s “Siri”. They are AI-driven assistants that can provide information and help with a wide variety of tasks, responding to commands like “what time is it?”, “solve this mathematical problem”, “play me some music”, “give me the latest news”, and so forth. These are all tasks that previously would have required a computer screen to perform. Given the possibility of retrieving information simply by speaking to the computer, there is no doubt about the accessibility and usability benefits this invention brings.

Before, I would often think of virtual assistants as an excess technology product that would make life a bit more comfortable for lazy people like myself. However, this is quite a selfish way of reasoning, considering the great use these AI assistants have for many people: for example, people with refractive errors, who are unable to read the news on a desktop, or elderly people who have a hard time adapting to an advanced smartphone. Instead, they can simply ask their virtual assistant for the information or task they are seeking, thereby eliminating usability barriers.

Not only do these virtual assistants facilitate information seeking for disabled or elderly people. As Forbes has pointed out, they could also enhance internet connectivity for people living in developing countries. Virtual assistants are much cheaper than desktop computers and smartphones, and the user does not need to be able to read or write. This shows once again how virtual assistants can eliminate barriers when it comes to accessibility.

While most well-known commercial virtual assistants are not primarily marketed as accessibility tools, one company that has targeted people with physical disabilities is Tecla. One of their recent devices, the Tecla-e, can be built into a wheelchair, allowing people to operate their other smart devices, or retrieve information, simply by speaking to their wheelchair. Furthermore, Tecla states that virtual assistants are one of the four most important trends within assistive technology.

In conclusion, virtual assistants are a great example of technology that serves a purpose for which it might not initially have been intended, and of how general technology can improve our society. Of course, there are concerns about this technology, as with much new technology, for example when it comes to privacy. Setting those concerns aside, however, these AI assistants serve a great purpose in the mission to provide inclusive design and accessibility for all people.


McCormick, M. (2019, March 11). Hey Siri! Could Virtual Assistants Be The Missing Link In Internet Accessibility? Retrieved:

Medeiros, J. (2018, April 19). Voice Assistants are Changing How Users with Disabilities Get Things Done [Blog Post]. Retrieved:

Robert, J.D. (2017, November 17). Voice Assistants for Accessibility: Siri, Google Assistant, Cortana, Alexa, and Bixby [Blog Post]. Retrieved:

Design Critique: Logic Pro X


Logic Pro X is an advanced interface for editing audio. Due to its complexity, it requires a fair amount of knowledge in the head for users to reap the full benefits of the program. The attributes critiqued in this post are, however, rather basic and easy to grasp even without previous experience.

Attribute 1: Volume Control

The volume slider in Logic (red box) emulates a physical volume level control. It is used to lower or raise the volume of the track that is playing by clicking and dragging the slider up or down.

The volume control is accompanied by a descending numerical scale, signifying that something is either increased or decreased when the slider is moved up or down. Physical constraints are represented by the upper and lower ends of the scale.

Mapping (yellow box) is made so that when you hover the mouse pointer over the control, the words “Volume Fader” appear. This signifies that it is, in fact, the volume that is being adjusted. Sliding the volume lever up or down while playing a track provides both auditory and visual feedback, as a green bar (blue box) responds to the volume level. The feedforward design attributes and the immediacy of the feedback bridge both the gulf of execution and the gulf of evaluation.

Slip errors occasionally happen when the mouse pointer is moved too abruptly, causing a dramatic volume increase that may hurt the user’s ears. Requiring a confirmation message before the volume can surpass a certain harmful decibel level would prevent this error.

Attribute 2: Pan Knobs

The twisting knobs, pictured in red boxes to the left, let the user pan a certain sound to the left or right side of a stereo image. Twisting to either side will raise the volume of that side while simultaneously lowering the volume of the opposite side.

At first glance, the pan knob does not signify either its functionality or how to use it. To provide some discoverability, mapping is made so that the word “pan” (yellow box) appears when hovering over the knob with the mouse pointer. Once the user learns how the controls work, visual feedback is given as a bar lights up in green (red boxes), in addition to the auditory feedback given as the characteristics of the sound change.

The design pictured below provides an improved conceptual model, since the knob actually looks like a physical knob, and humans have learned through cultural conventions that physical knobs are for twisting. Labeled with L and R, the design signifies that something will move to the left or right when the knob is twisted. Unlike the present design, this mockup puts knowledge in the world.

Attribute 3: Equalizer

Logic’s built-in equalizer is used to lower or raise the volume of a certain frequency area. For example, the equalizer can reduce the brightness of a sound by lowering the volume of a high-frequency area (red box).

The interface appears rather discoverable. The waveforms, the dots on the frequency line, and an information box are mapped with matching colors (yellow boxes) to make sure the user understands which area is being adjusted and by how much.

Auditory feedback is given as you can hear the properties of the sound change while adjusting the settings. Visual feedback is also available; however, to view the visual audio simulation, the user needs to push the “Analyzer” button (red arrow), which is not clearly signified. This signifier could be improved to maintain optimal discoverability of the interface’s functionality.

Don Norman’s seven stages of action are easily followed within the equalizer: the goal is to decrease the brightness of a sound; you plan to reduce frequencies using the equalizer; you specify the procedure with the help of its mappings; you perform a volume decrease in the high-frequency area; you perceive the change in the sound and interpret its characteristics. Finally, you compare the result with your goal by toggling the power button in the top left corner (green box). The appearance of the power button provides feedforward, as humans have learned that such a button will turn something on or off.