Ease of Access
Implementing keyboard accessibility
Improving Screen Reading
Implementing Screen Reading
Localizing automation properties
Accessibility is about making your app usable by people who have limitations that prevent the use of conventional user interfaces. There are many possible disabilities, but we can address most requirements by following the MSDN guidelines. This means providing:
- Support for keyboard interactions and screen readers.
- Support for user customization, such as font, zoom setting (magnification), color, and high-contrast settings.
- Alternatives or supplements for parts of your UI.
XAML apps have a number of accessibility features built in, designed to help users with disabilities. To make your app usable by the broadest set of customers, including people with disabilities, you should take steps to ensure it works even better with these assistive technologies.
Windows offers several programs and settings that can make the computer easier and more comfortable to use, and additional assistive technology products can be installed if you need other accessibility features. The Ease of Access Center is the central location in Windows for setting up these accessibility settings and programs.
You can configure Narrator, a screen reader, and test how well your app works with it. You can choose a high-contrast theme and watch the controls used by your app automatically change to match it. You can turn off standard animations. And so on.
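The built-in controls adapt to high-contrast themes because they take their colors from theme resources. If your own markup hard-codes colors, it opts out of that behavior. A minimal sketch (ApplicationForegroundThemeBrush is one of the standard XAML theme resource keys):

```xml
<!-- Hard-coded color: ignores the user's high-contrast theme -->
<TextBlock Text="Status" Foreground="Black"/>

<!-- Theme resource: the brush is swapped automatically when the theme changes -->
<TextBlock Text="Status"
           Foreground="{ThemeResource ApplicationForegroundThemeBrush}"/>
```

Prefer theme resources over literal colors anywhere your UI draws text or backgrounds.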
Many users rely on the keyboard as the sole means of navigating your app UI and accessing its functionality. If your app does not provide good keyboard access, these users can have difficulty using your app or may not be able to use it at all.
To use the keyboard with a control, the control must have focus, and to receive focus (without using a pointer) the control must be accessible in a UI design via tab navigation. By default, the tab order of controls is the same as the order in which they are added to a design surface, listed in XAML, or programmatically added to a container.
However, the default order does not necessarily correspond to the visual order. To be sure your app has a good tab order, test this behavior yourself. You can make the tab order match the visual order by adjusting the XAML, or you can override the default tab order by setting the TabIndex property:
<TextBox x:Name="FirstName" TabIndex="1"/>
<TextBox x:Name="LastName" TabIndex="2"/>
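Conversely, you can take non-interactive controls out of the tab sequence entirely by setting IsTabStop to False, so keyboard users are not forced to tab through elements they cannot act on. A small sketch:

```xml
<StackPanel>
    <TextBox x:Name="FirstName" TabIndex="1"/>
    <TextBox x:Name="LastName" TabIndex="2"/>
    <!-- Read-only status field: keyboard users never need to focus it -->
    <TextBox Text="Form version 1.0" IsReadOnly="True" IsTabStop="False"/>
</StackPanel>
```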
Narrator is a screen reader that reads text on the screen aloud and describes events such as error messages, so you can use your PC without a display. Narrator is available in English (United States, United Kingdom, and India), French, Italian, German, Japanese, Korean, Mandarin (Chinese Simplified and Chinese Traditional), Cantonese (Chinese Traditional), Spanish (Spain and Mexico), Polish, Russian, and Portuguese (Brazil).
You can turn on Narrator at any time by pressing Windows + Enter, or from the Ease of Access Center.
On Windows Phone, you can turn on Narrator from Settings -> Ease of Access -> Narrator.
Let us create a Universal Windows App with a very simple UI. I am going to share the UI between the Windows Store and Windows Phone applications. If you are new to Universal Windows Apps, or to sharing views and code across both platforms, please see my other article, Conditional Compilation in Universal Apps.
Create a Universal Windows App in C#. I am selecting the Blank App template and naming the project AccessibiltyExample.
To share the view, I dragged the MainPage.xaml file into the shared project and then deleted it from the other two projects. Now there is a single MainPage.xaml file common to both projects. The project looks like this in Solution Explorer.
Now let us create a simple UI with a TextBox, a Button, and a TextBlock.
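The markup for this UI is not shown here, but a minimal version might look like the following (the names userName, display, and Button_Click match the ones used later in this article):

```xml
<StackPanel Margin="20">
    <!-- The field the user types a name into -->
    <TextBox x:Name="userName"/>
    <!-- Copies the entered name into the TextBlock below -->
    <Button Content="Enter" Click="Button_Click"/>
    <!-- The output area that will later become a live region -->
    <TextBlock x:Name="display" TextWrapping="Wrap" Text=""/>
</StackPanel>
```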
If you turn on Narrator and launch the AccessibiltyExample app on Windows with English as the default language, you hear two utterances: the first is triggered by the app's window getting focus, and the second by the TextBox getting focus (which happens automatically).
This experience isn’t good enough, because Narrator doesn’t report the purpose of the TextBox. To fix this, we need to leverage the UI Automation framework, which is as simple as setting the following automation property on the TextBox.
<Viewbox HorizontalAlignment="Stretch" VerticalAlignment="Stretch" Grid.Column="1" Grid.Row="1">
<TextBox x:Name="userName" AutomationProperties.Name="Enter your name"/>
</Viewbox>
If you add this property and then rerun AccessibiltyExample with Narrator on, you will hear the following:
“Enter your name”
Note that when you give the Enter Button focus, such as by pressing Tab, Narrator reads the Button's content. This works automatically because of built-in Button behavior that reports its content to the UI Automation framework.
But if text is added to the screen, Narrator gives no indication of the change. To fix this problem, we can add the following automation property to the display TextBlock to identify it as a live region:
<TextBlock x:Name="display" TextWrapping="Wrap" Text="" AutomationProperties.LiveSetting="Polite"/>
The AutomationProperties.LiveSetting property can be set to one of the following values:
- Off: This is the default value.
- Polite: Changes should be communicated, but they should not interrupt the screen reader.
- Assertive: Changes should be communicated immediately, even if the screen reader is in the midst of speaking.
A live region is an area of the UI whose content changes independently of the element that currently has keyboard focus.
Live region changes are not detected automatically, however; you must raise them yourself in C#. In our example, we just need to add an extra line of code to the existing Button_Click event handler:
private void Button_Click(object sender, RoutedEventArgs e)
{
    this.display.Text = this.userName.Text;
    // The extra line: tell the UI Automation framework that the live region changed
    FrameworkElementAutomationPeer.FromElement(this.display)
        .RaiseAutomationEvent(AutomationEvents.LiveRegionChanged);
}
You also have to add a using directive for the Windows.UI.Xaml.Automation.Peers namespace. Its classes are named with the pattern ElementNameAutomationPeer (for example, FrameworkElementAutomationPeer), and they have several members designed for accessibility as well as automated testing.
Localization helps us reach a larger audience, and we should always localize our app when implementing accessibility. If you want to learn how, please see my other article, Localization in Universal Apps.
Fortunately, automation properties can be localized just like any other property. To do this, remove the explicit setting and give the element an x:Uid:
<TextBox x:Name="userName" x:Uid="userName"/>
Now, in the resource file, add an entry named userName.AutomationProperties.Name; its value for English should be "Enter your name". You can add entries for many other languages in the same way.
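In a .resw file this is just a name/value pair; for example, the English and French resource files could contain entries like these (the file paths follow the conventional per-language Strings folder layout):

```xml
<!-- Strings/en-US/Resources.resw -->
<data name="userName.AutomationProperties.Name" xml:space="preserve">
  <value>Enter your name</value>
</data>

<!-- Strings/fr-FR/Resources.resw -->
<data name="userName.AutomationProperties.Name" xml:space="preserve">
  <value>Entrez votre nom</value>
</data>
```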
Now let us test our app in the Windows Phone emulator with the default language set to French. If you turn on Narrator by pressing and holding the Volume Up key and then pressing the Start button, you will hear:
“Entrez votre nom”