- Saroj: Summarize the email discussion with Ella and her team about their level of understanding of accessibility and a suggested modular approach for scripted testing
- Discussion: 1) the BB team's understanding of the difference between functional usability and accessibility guidelines such as Section 508; 2) what they would like from us to enhance their understanding, e.g., user scenarios, best practices, checklists; 3) their preferred testing methodology
- Action items based on these discussions
Blackboard QA Teleconference Minutes for Wednesday November 28, 2007
Scribes: Saroj and Ella
- Date: Wednesday January 30, 2008
- Time: 2:30 PM EST; 1:30 PM CST; 12:30 PM MST; 11:30 AM PST
- Phone: TBD
- Attendees:
- Saroj Primlani – NC State
- Ella - Blackboard / QA Manager
- Thomas Lin – Blackboard / QA Automation Manager
- Alex Ku – Blackboard / QA analyst
- Hadi Rangan -- UIUC
- Mike Grace – NC State
Saroj summarized the email discussion with Ella about the level of Blackboard's understanding of accessibility guidelines and how they plan to incorporate them into their design and testing process, and outlined some strategies being used at NC State.
Ella explained that BB is committed to addressing accessibility and is planning to integrate accessibility criteria within their development framework. However, she and her team are overwhelmed by the large volume of pages in the product that would need to be tested, and she wanted input on strategies for approaching this.
Saroj suggested a modular approach: evaluate each functional unit both for standard elements common to all modules, such as navigation and images, and for module-specific elements. For example, a chat module would focus on keyboard access to the module via a menu link or keyboard shortcut, navigation between the edit and display panes, the pointer focus point, the reading order of the content, and so on. This could be done programmatically using scripts or a third-party tool. Automated tools set to test for elements common to all modules could crawl the entire site to check for problems. This would cover a large portion of issues; the balance would be addressed by manual user testing.
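The common-element portion of such scripted checks might be sketched as follows. This is a minimal illustration in Python's standard library, not any tool discussed in the meeting; the two checks shown (missing `alt` attributes on images, links with no text content) are hypothetical examples of rules that would apply to every module's pages:

```python
from html.parser import HTMLParser


class CommonChecks(HTMLParser):
    """Collects accessibility issues common to all modules:
    images with no alt attribute and links with no text content."""

    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_link = False
        self._link_has_text = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # An image with no alt attribute at all is unreadable to screen readers.
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt attribute")
        if tag == "a":
            self._in_link = True
            self._link_has_text = False

    def handle_data(self, data):
        if self._in_link and data.strip():
            self._link_has_text = True

    def handle_endtag(self, tag):
        if tag == "a":
            # A link with no text gives a screen reader nothing to announce.
            if not self._link_has_text:
                self.issues.append("link with no text")
            self._in_link = False


def check_page(html):
    """Run the common checks over one page's HTML; return a list of issues."""
    parser = CommonChecks()
    parser.feed(html)
    return parser.issues
```

A crawler would feed each page of the site through `check_page`, leaving only the module-specific behaviors (keyboard access, focus handling, reading order) for scripted or manual per-module testing:

```python
check_page('<img src="x.gif"><a href="/chat">Chat</a>')
# one issue: the image has no alt attribute
```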
Hadi suggested incorporating FAE reports into the QA testing scripts for each module. Until fully functioning scripts are in place, he suggested identifying all critical pages for each module and performing FAE testing manually; he believes manual testing would take no more than two days.
The discussion focused on some of the tools and the types of reports they generate. Saroj mentioned that some developers at NC State use AccVerify, as it can be integrated with test scripts and used to crawl the site.
Someone asked about accessibility frameworks for Java; Mike mentioned that he uses Eclipse with an accessibility module, which helps capture issues during the development process.
Ella mentioned that her team was exploring some tools and considering Watchfire's WebXM; they would probably have access to the tool in December and would try to give us access so that we could provide input. Saroj said that, if possible, she would download a demo and evaluate its reports and functionality.
Saroj asked what we could provide the QA team to help them develop testing criteria, and suggested creating some usability scenarios.