Tuesday, January 27, 2015

Quality Assurance reviews for Design, Functionality and Communications


This week a few of the features I have been writing about will ship on webmaker.org: the work for Privacy Day and the new on-boarding experience. You might be wondering what we've been up to in the stretch between when a project gets coded and when it goes live. Two magical words: quality assurance (QA). We are still refining the process, and I am very open to suggestions on how to improve and streamline it. For now, let me walk you through this round of QA on the Privacy Day content.


It all starts out with a GitHub issue
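For illustration, a launch-tracking issue for a campaign like this might look something like the sketch below. The title and checklist items here are hypothetical, not the actual issue:

    Title: [QA] Privacy Day content: ready-to-ship review

    We want to ship the Privacy Day content this week. Before we flip
    the switch, we need sign-off from three parallel reviews:

    [ ] Functionality review (Aki)
    [ ] Marketing/communications review (Paul)
    [ ] Design review (Jess)
    [ ] All blocking bugs triaged and closed (Bobby)
    [ ] Final quality thumbs-up (Cassie)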




... and a kickoff meeting

The same team that worked on the prototyping phase of the Privacy Day campaign was responsible for the quality assurance. We met to kick off and map out our plan for going live. This project required three kinds of reviews, which more or less had to happen simultaneously. We broke down the responsibilities like this:



Aki (lead engineer) - responsible for preparing the docs and leading the functionality review
Paul (communication/marketing) - responsible for preparing the docs and leading the marketing review
Jess (lead designer) - responsible for preparing the docs and leading the design review
Bobby (product manager) - responsible for recruiting participants to do the reviews and for wrangling bug triage
Cassie (quality) - responsible for the final look and the thumbs-up on whether the feature is acceptable to ship

Each of us who was responsible for docs wrote up instructions for the QA reviewers to follow:
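As a rough sketch (this is not the actual doc), a set of design review instructions might read:

    Design review instructions (illustrative example)

    1. Open webmaker.org on your device and walk through the Privacy
       Day pages and the new on-boarding flow.
    2. Compare what you see against the attached mockups.
    3. If something looks off, file a GitHub issue with a screenshot,
       the device and browser you used, and a suggested fix.
    4. Say whether you think the problem should block shipping.
    5. You can file issues at any time during the review period; you
       don't need to wait for a scheduled session.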




We recruited staff and community members to user test on a variety of devices:
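One lightweight way to track that coverage is a checklist in the tracking issue itself. The exact device list below is made up for illustration:

    [ ] Firefox on desktop
    [ ] Chrome on desktop
    [ ] Internet Explorer on Windows
    [ ] Firefox for Android
    [ ] Safari on iOS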


This was done in a few different ways. I ran both one-on-one and asynchronous review sessions with my colleagues and the community. It helps to have both kinds of user tests so that you get honest feedback. Allowing for asynchronous or independent testing is particularly beneficial because it signals to the reviewer that this is an ongoing process and that bugs can be filed at any point during the specified review period.


The process is completely open to the community. At any given point the GitHub issues are public, the calls for help are public, and the iteration is done in the open.


and if there were any problems, they were logged as GitHub issues:



The most effective issues include a screenshot of the problem and a recommended solution. It's also important to note whether the problem blocks the feature from shipping.
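Put together, a useful bug report might follow a skeleton like this (the bug and field names are ours, invented for illustration):

    Title: Privacy Day banner text overflows on small screens

    What I saw: (screenshot attached)
    Where: Firefox for Android, Nexus 5
    Expected: the banner text wraps to two lines
    Suggested fix: reduce the font size at the small breakpoint
    Blocking? Yes - this is on the main landing page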

We acknowledged when user testers found something useful:


and identified when a problem was out of scope to fix before shipping: 






We quickly iterated on fixing bugs and closing issues as a team:
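One trick that keeps this loop fast: GitHub closes an issue automatically when a commit that references it with a keyword like "fixes" lands on the default branch. A commit message like this one (issue number hypothetical) does the bookkeeping for you:

    Shrink the banner font at the small breakpoint

    The Privacy Day banner text was overflowing on narrow screens.
    Fixes #123.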








and gave each other some indication when we thought a problem was sufficiently fixed:






When we were all happy and had the final thumbs-up on quality, we then...


closed the GitHub issue and celebrated:





Then we started making preparations to push the feature live (and snoopy danced a little):