The Digital Transformation Office (DTO) usually runs “short” 3-month optimization projects. When the pandemic was declared in mid-March, we had to rethink our approach so we could help departments respond quickly and effectively to the rapidly changing situation. We began doing rapid-response design research on COVID-19 content and aspects of the Canada.ca information architecture. Our priority continued to be delivering clear and usable mobile-first content on Canada.ca. The crisis just meant we had to do it faster. A lot faster!
Here’s how we’ve approached design research as part of the COVID-19 digital response.
1. Working together while being physically apart
We tweaked our existing ways of working with colleagues across government. More than ever, we needed a whole-of-government approach to web content. Daily interdepartmental video calls to coordinate the effort replaced on-site meetings, and we set up cross-GC collaboration channels. Within the DTO, we found that many of our pre-COVID processes and ways of working still worked well with a distributed team. We continued to work collaboratively using shared documents, and we prototyped in the open.
Working closely (but distantly!) with departmental communications and web teams helped us align the right research, with the right questions, at the right time.
2. Prioritizing research that will deliver the biggest impact
We quickly realized that every corner of the Canada.ca landscape would be affected. Our research focussed on improving both the content and information architecture related to pandemic information and services.
We prioritized our work based on which areas would deliver the biggest improvement to digital service delivery. In the very early days, we worked with Health Canada and the Public Health Agency of Canada to test a revised information architecture for COVID-19 topics. We also improved existing content to reduce the demand on call centres.
In the following weeks, we collaborated with other departments on larger research studies that have helped shape the communications and delivery of some of the biggest economic relief programs we’ve seen in our lifetime.
3. COVID-time: Adapting our timelines to days, not weeks
Light and fast research helped inform daily recommendations. Where possible, we combined previous research from DTO optimization projects with new research. This allowed us to deliver evidence-based recommendations to web teams and senior management.
We used easy-to-access data like web analytics, search queries, social media monitoring and call centre questions to identify areas where we could focus our research. We often worked questions from social media or call centres directly into task scenarios.
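For teams looking to do something similar, here is a minimal sketch of what that kind of data-driven triage could look like. The file names, column names and weighting below are hypothetical assumptions, not the tools or data sources we actually used; the sketch simply illustrates ranking topics by combining search demand with call centre demand.

```python
# Hypothetical sketch: rank candidate research topics by combining
# top site-search queries with call centre question counts.
# File and column names are illustrative assumptions only.
import csv
from collections import Counter

def load_counts(path, key_col, count_col):
    """Read a CSV of (topic, count) rows into a Counter keyed by topic."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row[key_col].strip().lower()] += int(row[count_col])
    return counts

search = load_counts("search_queries.csv", "topic", "searches")
calls = load_counts("call_centre_questions.csv", "topic", "calls")

# Weight call centre demand more heavily (the factor here is arbitrary):
# every avoidable call is costly, so topics driving calls are strong
# candidates for content testing.
combined = Counter()
for topic in set(search) | set(calls):
    combined[topic] = search[topic] + 3 * calls[topic]

for topic, score in combined.most_common(10):
    print(f"{score:>8}  {topic}")
```

The topics that float to the top of a list like this are also a natural source of realistic task scenarios, since they reflect the questions people are actually asking.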
4. Unmoderated testing: Rapid recommendations in 48 hours
The team’s existing access to online testing services allowed us to rapidly test and deliver recommendations within 48 hours. Unmoderated tests were conducted every 2 to 3 days with 6 to 8 Canadians per test. This allowed us to identify big issues and analyze the video recordings in a few hours.
We developed a rhythm for our activities. We packaged prototypes as ready-to-use solutions. This helped make implementation as smooth as possible.
48-hour testing schedule
- Identify priorities in the morning
- Prototype during the day
- Develop scenarios and build the test in the afternoon
- Run test overnight
- Analyze videos and compile recommendations in the morning
- Deliver or communicate results in the afternoon
Repeat and retest!
Unmoderated testing allowed us to get some rest while the tests ran overnight. It was the fastest way to test prototypes or live content, and it was flexible enough to recruit some specialized audiences. We also found it worth spending a little time trying task scenarios out on friends or family first: people often interpret a task in unexpected ways. Even when a task is interpreted differently, you can still learn a lot about the participant’s mental model.
In this unmoderated test video, the participant draws their own conclusions when interpreting a task. When testing a page about economic benefits, we found that many participants identified being self-employed as a business, not as an individual.
As the situation stabilized, research activities became larger and aligned with upcoming announcements. Larger projects, like the Canada Emergency Response Benefit (CERB), underwent many rounds of testing and prototyping.
Iterations of the Canada Emergency Response Benefit (CERB)
- March 27: First test of CERB prototype with eligibility checkboxes
- March 30: Page purpose changed to a triage tool splitting applications between the CRA and ESDC
- March 31 and April 1: Retests of triage page
- April 6: CERB goes live
- April 6 and 8: Retests of live content
- April 9: Test of CRA multi-page prototype with checkboxes
- April 16 to 24: Retests of CRA prototype
- May 2: CRA multi-page design goes live
- May 22 and 25: Test of ESDC prototype with revised CERB content
- June 3: Revised ESDC page goes live
5. Using moderated testing strategically
We reserved moderated testing for when we needed specialized audiences like business owners, students, or people using assistive technologies. Using a moderator gave us time to schedule the testing over several days while we worked on complex prototypes. It also gave us more flexibility to ask follow-up questions: when we saw interesting behaviours, we could probe for more details. This let us get the most out of each session.
A moderator can encourage participants to continue on a task where they may have otherwise given up in an unmoderated test.
In this moderated testing video, a participant is being reassured that selecting an option will not reveal any personal banking information.
Key lessons learned
- Test early, test often
- Small tests are better than no tests
- Focus your efforts where you can make the biggest impact
- Connect early and often with your call centres
- Include videos and quotations to support your recommendations
- Make your recommendations as easy to implement as possible
- Keep a record of your work (screenshots and summaries)
- Clear language is the best investment for your website
Challenges when doing rapid research
Even though we learned a lot, the journey wasn’t without challenges. One of the hardest transitions for the team in the first few weeks of pandemic design research was the speed and volume of new content being produced. Testing and iterating on live content was the new normal.
Final word
In a crisis like COVID-19, poor content or a poor user experience undermines the credibility of the government. You can identify many of the usability problems with a page by testing it with even just a few users. Identifying and correcting usability problems early helped divert unnecessary calls away from call centres and reduced ineligible or incomplete applications for benefits.
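The “even just a few users” point is often illustrated with the Nielsen–Landauer model of problem discovery. The short sketch below uses the commonly cited average probability (about 0.31) that a single participant encounters a given problem; the numbers come from that general model, not from our COVID-19 studies.

```python
# Nielsen-Landauer model (general industry figures, not data from these
# studies): proportion of usability problems found = 1 - (1 - L)**n,
# where L is the chance a single participant hits a given problem.
L = 0.31  # commonly cited average
for n in (1, 3, 5, 8):
    found = 1 - (1 - L) ** n
    print(f"{n} participant(s): ~{found:.0%} of problems observed")
```

With 6 to 8 participants per round, as in the testing schedule above, the model suggests most of the big problems will surface, and retesting catches much of the rest.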
We encourage you to follow the advice of web content design specialists on established user experience best practices. Rely on the standard templates and content design recommendations from the Canada.ca design system. These are based on best practices that have been tested with users to ensure they work and are accessible.