How might we improve the online chequing account opening process for low-tech BMO users?
Ensure the user path to open an account through BMO is clear and usable for low-tech users.
Provide BMO with usable suggestions to improve their online user experience.
Evaluate how BMO’s chequing account opening experience compares to their competitors’.
Conduct user testing with an emphasis on the ease of use and clarity of information provided by bmo.com.
We defined a low-tech user as someone who had never sent or received an e-Transfer.
1. What are low-tech users’ pain-points related to the online BMO account opening experience?
2. Is it clear to brand new users of the BMO website where they begin the process of opening a chequing account?
3. What information from banks do users most rely on when deciding which chequing account best fits their needs?
4. How do the usability metrics of effectiveness, efficiency, and satisfaction for BMO’s account opening experience compare to the metrics for the experiences offered by Scotiabank and TD Canada Trust?
We used a counterbalancing technique when testing the three scenarios, varying the order in which participants encountered them to avoid introducing order effects as confounding variables.
This helped us gather quantitative data that was easily comparable in all scenarios.
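As a minimal sketch of how counterbalanced orders for three scenarios can be assigned, the snippet below rotates participants through all six possible presentation orders. The scenario labels and the assignment rule are illustrative assumptions, not the study's actual protocol.

```python
from itertools import permutations

# Illustrative labels for the three bank scenarios tested
scenarios = ["BMO", "Scotiabank", "TD"]

# Full counterbalancing: every possible presentation order (3! = 6)
orders = list(permutations(scenarios))

def order_for(participant_index: int) -> tuple:
    """Rotate through the six orders so each scenario appears in
    each position equally often across participants."""
    return orders[participant_index % len(orders)]

for p in range(6):
    print(p, order_for(p))
```

With six (or a multiple of six) participants, every scenario is seen first, second, and third equally often, which is what cancels out practice and fatigue effects.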
We kept our notes from each test in individual Excel sheets, organized by participant, scenario, and task, and then combined the sheets into a master sheet of notes. This made the notes easy to compare and helped us gain insights.
We collected quantitative data through three means:
1. A Post-Scenario Questionnaire
2. A Post-Test Questionnaire
3. Task Success Rate
We used these to compare each scenario and extract insights. We also selected specific information to graph to better show the comparison or to emphasize a point. For task success, we converted all the results into binary data to determine success rates.
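The binary conversion above can be sketched as follows. The task names and outcome values are hypothetical examples, not the study's actual results; the point is only that each task outcome becomes pass/fail, from which a percentage success rate is computed.

```python
# Hypothetical raw task outcomes per participant (True = task completed)
results = {
    "find_account_page": [True, True, False, True, False],
    "compare_accounts":  [True, False, False, True, True],
}

def success_rate(outcomes):
    """Convert binary task outcomes to a percentage success rate."""
    return 100 * sum(outcomes) / len(outcomes)

for task, outcomes in results.items():
    print(f"{task}: {success_rate(outcomes):.0f}%")
```

Reducing each observation to pass/fail makes the metric directly comparable across the three scenarios, at the cost of discarding partial-success detail, which the session notes capture instead.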
We transferred our session notes from our Excel document to Miro, a collaborative digital whiteboard tool.
The data put into the Miro board were raw notes from our participant sessions. This allowed us to organize the notes by category so that we could more easily see where patterns emerged. Before doing the card sorting, we organized them by participant for ease of reference.
We sorted the collected data into:
1. Primary Themes
2. Sub-Themes
Step 1: Go through each sub-category and develop statements that represent the core meaning of the data.
Step 2: Compare all insights and highlight the most important points, to be further supported by data when presenting findings and design recommendations.
After pulling out the key insights, we all re-watched the user sessions to gather more quotes to support them. This second round helped us look at the interviews through a different lens, ensuring that we had more than one perspective on the data.
We supported each finding with a screenshot highlighting an example on the website, a relevant quote from a user, and a design recommendation to address the problem. This ensured that each finding was backed by substantial data and that we outlined each opportunity for improvement.