Validation Tool

5.15.20 - 7.15.20
Improved Data Scientists' lives through research and collaborative design efforts.

Quick Summary

Problem
Data Scientists had to
  • Manually seek out sources of truth
  • Wait up to 3 weeks for validation of their work
My Actions
  • Used journey maps to identify areas for improvement
  • Collaborated with Engineering to ensure validation time was reduced
Outcome
  • Task Efficiency: 75%
  • Backlog Items: 45%

Team

  • Product Manager
  • UX Lead
  • Dev Lead
  • Data Science team
  • Product Owner

Cleaning up 'dirty' data

This martech company builds a digital ad platform for banks, analytics tools for advertisers, and service design tools for its employees. The foundation rests upon accurate consumer transaction data, referred to as transaction “strings”. The Validation Tool ("Val") is a way to manually check the output of an automated string cleaner.

In this role I needed to own the entire UX process, including research, interaction design, visual design, and delivery.

Landscape

A challenge...

One challenge presented to me was to convince my new team that a "rewind" was necessary. To address this, I quickly set up a usability test with the actual users and presented my findings and a (time-bound) action plan.

Identifying all of the frustrations

I needed to build an understanding of the end users and the problem space. I structured user interviews so that half the time was spent understanding the daily lives of data scientists (their goals, needs, and frustrations) and half was spent in contextual interviews. The highlights included:

Journey map generated from user interviews

Setting clear boundaries

From the interviews and discussions with the larger team, it became clear that we had several new service design projects underway with functional overlap between them. We needed to decide how to split the functionality across the tools. Here's how I solved this quandary:

I set up a card sort with a group of end users to bucket the functionalities. It was great to see the look of relief on their faces when these functions were assigned and organized!

Next, I organized a wireframing activity where teams mocked up the tools and discussed how they might work. I supplied wireframe components and acted as an assistant in case they got stuck.

As a result, we were able to successfully align on the distinctions and prioritize the user need statements for our respective products.

Example card sort

Did we miss anything?

During the Ideation phase, I first wanted to facilitate ideation sessions around Val design concepts that might have been overlooked. The previous designer had delivered a single design but had not explored alternatives on the way to that solution. We used InVision Freehand for our ideation sessions so everyone could easily contribute.

Initial concept that contained a list of strings (left) and a comparison window (right)

A little motivation goes a long way

We arrived at the final design by iterating on feedback from the Data Science team. These were the key rationale points:

Early concepts for string counter

Automation (and good design) brings great results

From the outset of the research, it was clear that this tool would bring joy to data scientists because it would handle many of their manual tasks. But how much? A post-launch check on the product revealed the following:

These were great results, but we planned to take the product even further. Having our users under the same roof really made this project fun, fast, and gratifying.


In retrospect...

I learned that it's important not to simply take a handoff from a fellow designer and run forward. You must ask, "Is there anything missing from our evidence that would inform a better design?"