Monitoring Systems for Checking Websites on Accessibility
Web accessibility monitoring systems support users in checking entire websites for accessibility issues. Although these tools can check compliance with only some of the many success criteria of the Web Content Accessibility Guidelines, they can assist quality assurance personnel, web administrators, and web authors in discovering hotspots of barriers and overlooked accessibility issues on a continuous basis. These tools should be effective in identifying accessibility issues. Furthermore, they should motivate users, as this promotes employee productivity and increases interest in accessibility in general. In a comparative study, we applied four commercial monitoring systems to two of Stuttgart Media University’s websites. The tools are: 1) the Accessibility module of Siteimprove from Siteimprove, 2) Pope Tech from Pope Tech, 3) WorldSpace Comply (now called axe Monitor) from Deque, and 4) ARC Monitoring from The Paciello Group. The criteria catalogue consists of functional criteria gleaned from the literature and user experience criteria based on the User Experience Questionnaire. Based on a focus group of experts from Stuttgart Media University, we derived individual weights for the criteria. The functional evaluation criteria are: coverage of the website and the guidelines, completeness, correctness, support in locating errors, support for manual checks, degree of implementing gamification patterns, support for various input and report formats, and methodological support for the Website Accessibility Conformance Evaluation Methodology 1.0 and for the German accessibility ordinance for public authorities, Barrierefreie Informationstechnik-Verordnung 2.0. To assess the user experience criteria, we conducted exploratory think-aloud user tests (n = 15) using a coaching approach. Each participant tested every tool for 15 min (within-subject design). The participants completed post-test questionnaires, including the User Experience Questionnaire. According to our results, Siteimprove turned out to be the best tool for our purposes.

Main Authors: Andreas Burkard, Gottfried Zimmermann, Bettina Schwarzer
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-02-01
Series: Frontiers in Computer Science
ISSN: 2624-9898
DOI: 10.3389/fcomp.2021.628770
Subjects: accessibility (for disabled); monitoring; gamification; user experience (UX) evaluation; user testing and evaluation; comparative study
DOAJ Record ID: doaj-9c8fc587c2f74e5babb3d385bbda93da
Online Access: https://www.frontiersin.org/articles/10.3389/fcomp.2021.628770/full
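The monitoring systems compared in the article all build on automated rule engines that crawl a site and test rendered pages against WCAG-related rules. As a rough illustration only (not taken from the article or its methodology), the sketch below runs Deque's open-source axe-core engine, the rule set underlying the axe product family that includes axe Monitor, against a single page via Puppeteer and prints the violations found. The target URL is a placeholder; a full monitoring system would additionally crawl the whole site, schedule recurring scans, and aggregate results over time.

```typescript
// Minimal single-page accessibility scan with axe-core + Puppeteer.
// Illustrative sketch only; not the study's method and not a full monitoring system.
import puppeteer from 'puppeteer';
import { AxePuppeteer } from '@axe-core/puppeteer';

async function scanPage(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });

  // Run the axe-core rule set against the loaded page.
  const results = await new AxePuppeteer(page).analyze();

  // Each violation corresponds to an axe rule (usually mapped to a WCAG
  // success criterion) and lists the DOM nodes that failed it.
  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact ?? 'n/a'}): ${violation.help}`);
    console.log(`  affected elements: ${violation.nodes.length}`);
  }

  await browser.close();
}

// Placeholder URL; a monitoring system would scan many pages continuously.
scanPage('https://example.com').catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Note that such automated checks cover only a subset of WCAG success criteria, which is why the article also weighs support for manual checks and for evaluation methodologies such as WCAG-EM 1.0.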