The Status Evaluation Tool

Click here or on the image above to access a demo version of the final dashboard.


Client: Fisheries and Oceans Canada

My role: UI design, usability testing, data visualization

Tools: R-Shiny, Tableau, Photoshop, GitHub

Problem

Federal scientists studying Pacific salmon need to know how healthy salmon populations are, but formal assessments happen only every few years and are expensive and time-consuming. Our team developed an application that lets scientists assess these populations quickly and inexpensively.

Process

Background:

The State of the Salmon program is an organization within the federal government that is responsible for understanding trends in Pacific salmon. Based on our Mitacs-funded user research project, our team knew that about 20% of the questions scientists regularly needed to answer related to how healthy salmon populations are, but these scientists often did not have the data or time to answer those questions as they occurred.

Key quotes:

“The number one question I get from managers is ‘How are the stocks doing’, and I need a way to quickly answer that.”

-Senior Research Scientist

“I’m not looking for something to do the analysis for me, but I need a way to check the broader context of salmon status and be able to do comparisons between stocks.”

-Senior Analyst

Proposed solution:

Because users needed to explore a large, complex dataset on demand to answer these questions, we decided to build a browser-based visualization tool in R-Shiny. A major challenge was that the interface needed to be flexible enough to accommodate different datasets with different variables, depending on the scientists' questions.
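The flexibility requirement can be sketched in miniature: instead of hard-coding which variables the interface offers, the controls are generated from whatever columns the loaded dataset happens to contain. The production tool is an R-Shiny app; the function names and data below are hypothetical illustrations, not code from the actual tool.

```python
# Illustrative sketch: derive visualization controls from whatever
# variables a loaded dataset contains, rather than hard-coding them.
# (The real tool is an R-Shiny app; all names and data here are
# hypothetical.)

def numeric_columns(dataset):
    """Return the names of columns whose values are all numeric."""
    return [
        name for name, values in dataset.items()
        if all(isinstance(v, (int, float)) for v in values)
    ]

def build_axis_choices(dataset):
    """Dropdown choices for plot axes, derived from the data itself."""
    cols = numeric_columns(dataset)
    if not cols:
        raise ValueError("dataset has no numeric variables to plot")
    return {"x_axis": cols, "y_axis": cols}

# A hypothetical stock-status dataset with arbitrary variables:
stocks = {
    "stock": ["Adams", "Chilko", "Quesnel"],
    "spawners": [12000, 48000, 9000],
    "trend_pct": [-3.2, 1.5, -7.8],
}
print(build_axis_choices(stocks))
# → {'x_axis': ['spawners', 'trend_pct'], 'y_axis': ['spawners', 'trend_pct']}
```

Because the choices are recomputed whenever a new dataset is loaded, the same interface works for any set of variables a scientist brings to it.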

Visualization design:

Scientists wanted to be able to see the physical relationships between salmon populations (i.e., how the streams they spawned in were linked). This quasi-geographic network showed salmon populations as nodes that were colour-coded according to their health status.

This iteration still showed salmon populations as nodes in a network, but was able to display more data using a heatmap idiom. However, users indicated that coding KPI values with colour scales was not precise enough for their use.

We decided to use a parallel coordinates idiom to represent the data because it allowed users to see the relationships between stocks, could represent complex multidimensional datasets, and allowed users to make rapid selections within the dataset. We also incorporated a stream network map that showed both the physical location of each stock and its connectivity within the river system, because users wanted to be able to reference both of these while doing their analysis, not just connectivity.
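The core of the parallel coordinates idiom is simple enough to sketch: each stock becomes a polyline across one vertical axis per variable, with values rescaled to [0, 1] on each axis, and "brushing" a range on an axis selects the stocks that pass through it. The sketch below is a hypothetical illustration of that idea in plain Python; the production tool implements it in R-Shiny.

```python
# Illustrative sketch of the parallel-coordinates idiom:
# one polyline per record, values rescaled per axis, and a
# range "brush" that selects records on a single axis.
# (Hypothetical data; the production tool is an R-Shiny app.)

def rescale(values):
    """Map a list of numbers onto [0, 1]; constant lists map to 0.5."""
    lo, hi = min(values), max(values)
    if lo == hi:
        return [0.5] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def polylines(data, axes):
    """One normalized polyline (list of axis heights) per record."""
    scaled = {a: rescale(data[a]) for a in axes}
    n = len(next(iter(scaled.values())))
    return [[scaled[a][i] for a in axes] for i in range(n)]

def brush(data, axis, low, high):
    """Indices of records whose value on `axis` lies in [low, high]."""
    return [i for i, v in enumerate(data[axis]) if low <= v <= high]

# Hypothetical per-stock indicators:
stocks = {
    "spawners": [12000, 48000, 9000],
    "trend_pct": [-3.2, 1.5, -7.8],
}
lines = polylines(stocks, ["spawners", "trend_pct"])
selected = brush(stocks, "trend_pct", -5.0, 5.0)  # stocks 0 and 1
```

Rapid selection is what made the idiom a good fit: brushing is a single range comparison per record, so the displayed subset can update as the user drags.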

Usability testing:

Once we had developed a prototype, we conducted three sessions of discount usability testing based on Steve Krug's work. For each session, we recruited three participants from our target user group and had a moderator give them a guided walkthrough of the Tool before asking them to complete 3-4 tasks. These tasks were based on questions that target users described in interviews as being difficult to answer using existing tools, but critical to their job requirements. During each testing session, at least three team members watched via remote video link and took notes on usability issues. Immediately afterwards, we held a group debriefing where we compared notes and together created a prioritized list of issues to address before the next testing session. Once we had redesigned the Tool to address the known issues, we conducted another testing session.

Results

Following its release, the Status Evaluation Tool has been widely adopted by members of our target audience as part of their workflow. We have integrated additional datasets at the request of researchers who wanted to use the tool for work outside our original scope, and we are writing a whitepaper to support the tool's use as it expands beyond our target audience. We have also been recruited by another group to develop a derivative of the tool focused on environmental data.

User feedback:

“In thirty seconds of using this tool I’m able to answer questions about the stocks that used to take me hours of programming to figure out.”

-Senior Research Scientist

“The tool has an amazing amount of potential to improve our decision making.”

-Division manager