Customer feedback surveys indicated that formatting embedded tables in Workiva documents was a frustrating and time-consuming experience. After surfacing this issue, our team partnered with the documents team to plan a usability test to pinpoint the pain points within embedded tables.

Role: UX Research Intern 

Depth of involvement: Collaborated on the test plan with a Senior UX Researcher, took notes during research sessions and synthesized them into findings, built the study presentation, and presented findings to the entire R&D team (40+ people) alongside the Senior UX Researcher. 

Image by Nick Fewings



This was one of the first full usability tests we launched on UserTesting. Given the time constraint on launching this study and the need to get acclimated to the platform, there was some trial and error in the pilot and the first couple of sessions.


The customer satisfaction surveys were limited in detail. While the survey data did point to some broad issues, we had to work with our internal stakeholders to narrow down what to test our participants on.




  • Get context on survey data.

  • Identify main research questions.

  • Come up with tasks. 

  • Agree on test document. 

  • Finalize recruiting criteria.

  • Finalize research plan with all of the above. 


  • Prepare test documents for each participant in Workiva's test space. 

  • Prepare participant-facing instructions and double-check log-ins for each participant.

  • Write out the happy path for each task, both for evaluation purposes and to clarify any gaps. 


  • Once our Senior UX Researcher programmed all of the tasks and selected the panel in UserTesting, we ran a pilot session to work out any issues. 

  • I analyzed the notes from the pilot session for any issues related to the instructions, the test document, or UserTesting, since the tool was new to us. 


  • Once errors from the pilot session were fixed, we ran 5 more sessions in UserTesting over the course of 2 days. 


  • I grouped the qualitative findings into an affinity diagram to capture common themes among the different evaluators. 

  • I used the data captured by UserTesting to measure time on task, clicks per task, and the pass/fail rate per task. 

  • I gathered all of these findings into a presentation to share out with our broader R&D team. 

Image by Markus Winkler


Hidden to protect confidentiality.



This is my favorite project I've done at Workiva because of its impact. The study was conducted in Q1 2021, and the spreadsheets team rolled out a host of updates to spreadsheets at the end of Q2 2021. So far the response from customers has been favorable, and I'm proud of the effect this research has had on their day-to-day work.



There were a number of lessons learned, specifically around UserTesting. We had to adjust a few of the screening questions several times to prevent "bad actors" from trying to qualify for the test despite not having the background we were looking for. 

This study was my first time sharing research findings with a broader audience. I learned a lot from this experience (and have continued to with each share-out). Mainly, it's up to you as the researcher to make the findings as engaging as possible. Get creative with order, format, and structure, and, probably most importantly, remember that if you are enthusiastic about the findings, everyone else will be, too. 

One thing I think about a lot is how realistic our usability studies are. For example, most of our customers use external monitors when they use Workiva, but we never set this up during our usability tests. This has a huge impact on what they're seeing on their screen as they go through the various tasks. This is something I'd love to tackle in the future.