No one asked the end users

Building a user research framework from scratch for mission-critical hardware

Amazon

2023-2024

tldr

Product

OpsHub

Role

Research and design lead

Problem

A critical app for managing AWS edge hardware devices had no dedicated UX resources, budget, or research pipeline to validate a complete redesign.

Outcome

Built a four-phase research program from scratch, uncovering 80% of usability issues and establishing the team's first scalable testing practice.

Overview

A redesign and no way to test it

I was the lead designer of a large-scale initiative to redesign OpsHub from the ground up.

OpsHub is a desktop app for managing AWS Snow devices in harsh, disconnected environments. Think military bases, aircraft, oil rigs, and even outer space (there are a few Snow devices aboard the International Space Station).

The original version of OpsHub was built without UX input and suffered from chronic usability issues and persistent customer complaints.

Problem

No budget, no participants, no process

By the end of 2023, I had delivered foundational UX work (user research, site maps, customer journeys, user stories) and created a mid-fi prototype that addressed the top customer pain points around installation, monitoring, and management.

The prototype was done. But there was one huge problem: there was no way to test it.

Our team had no budget, no participant pool, and no method for conducting user testing. :/

But rather than accept these blockers, I worked around them.

Discovery

Get out of the building and talk to people

I'm a huge Steve Blank fan.

One of Steve's core beliefs is that great products don’t come from conference rooms. They come from talking to the people actually using them. So that’s where I started.

I recruited real OpsHub users through relationships I built with the AWS Field Team, including systems engineers, network engineers, and solution architects who relied on the product in production.

I then designed a four-phase usability program using a scavenger hunt method, structured across 20 participants and four rounds.

Each session lasted 60 minutes and was run live with screen sharing, video, and recording. I measured task success against predefined click thresholds, then analyzed the results using a five-part thematic framework.

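To make that scoring concrete, here's a minimal sketch (in Python) of how task success against a click threshold can be computed. The task names, threshold values, and data shapes are hypothetical illustrations, not the actual OpsHub test plan.

```python
# Illustrative sketch only: scoring scavenger-hunt sessions against
# per-task click thresholds. Task names and thresholds are
# hypothetical, not the real OpsHub test plan.
from dataclasses import dataclass

@dataclass
class TaskResult:
    participant: str
    task: str
    clicks: int      # clicks the participant actually used
    completed: bool  # did they reach the target state?

# Hypothetical rule: a task "passes" if completed within N clicks.
CLICK_THRESHOLDS = {"unlock_device": 4, "switch_profile": 3}

def task_success_rate(results: list[TaskResult], task: str) -> float:
    """Share of attempts that completed the task within its click threshold."""
    attempts = [r for r in results if r.task == task]
    if not attempts:
        return 0.0
    passes = sum(1 for r in attempts
                 if r.completed and r.clicks <= CLICK_THRESHOLDS[task])
    return passes / len(attempts)

results = [
    TaskResult("P1", "unlock_device", clicks=3, completed=True),
    TaskResult("P2", "unlock_device", clicks=6, completed=True),  # over threshold
    TaskResult("P3", "unlock_device", clicks=4, completed=True),
    TaskResult("P4", "unlock_device", clicks=2, completed=True),
]
print(f"unlock_device: {task_success_rate(results, 'unlock_device'):.0%}")  # 75%
```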

Solution

Research infrastructure from zero

I built an end-to-end research workflow on my own, including recruiting, session design, test scripts, compensation, recording, transcription, synthesis, and issue tracking.

To make it repeatable, I created intake forms, a compensation guide, testing prototypes, shared synthesis docs, and an Asana workflow to track task success and usability issues against predefined goals.

That gave the team a way to make decisions based on evidence instead of instinct, and helped us prioritize product changes before launch with much more confidence.

I also pushed beyond the immediate project.

I found a procurement shortcut for UserTesting.com that could reduce acquisition time from 12 months to a few weeks, then wrote the business case to help scale research across the broader org.

Impact

Data-backed confidence

The research program didn't just fix buttons. It provided the statistical evidence needed to greenlight private preview.

• 80% of usability issues were identified and documented through basic scavenger hunt sessions

• 98% task success rate achieved for critical "Unlock Device" and "Profile Switch" workflows after V2 iterations

• Established the Snow team's first structured research program, later adopted by 12 adjacent teams

Reflection

Scrappiness as a design tool

I learned that a lack of resources is a design problem in its own right. By treating the research bottleneck like a UX challenge, I was able to build a scrappy but rigorous system the team didn’t have before.

I also learned that in a highly technical, engineer-first culture, few things create alignment faster than watching a real user struggle with something everyone assumed was obvious.

If I were doing it again, I'd push earlier for a standing participant pool and dedicated testing infrastructure. But at the end of the day, a scrappy solution beats no solution every time.

Working on this project is a reminder that sometimes the most important design decisions happen below the surface.
