Beyond Usability Testing: Building a UX Research Practice via a Learning Agenda

A pixelated wizard character coming out of a computer monitor, waving a wand under text that reads, "Build the right thing. Build the thing right."

A mature design research practice dives deeper than usability testing to ensure teams meet user needs.

When most people think of UX research, usability testing is often the first thing that comes to mind. But the most impactful design research goes beyond “building the thing right” to focus on “building the right thing”—identifying and addressing real user needs. Usability is essential, but a mature design research practice dives deeper, using varied methods to uncover unmet needs, validate Jobs to be Done (JTBD), and prioritize the most critical functionality. This structured approach helps product teams avoid the trap of creating a usable yet ultimately useless product.

Utilizing a Learning Agenda

The foundation of a dynamic research practice is establishing an ongoing Learning Agenda. This agenda, curated in tandem with product management and business stakeholders, identifies and prioritizes the most important things the team wants to learn about users, competitors, and the product itself. By syncing once or twice per month (depending on your team’s research capacity), team members can agree on research priorities and focus on what matters most. This agenda-driven method fosters continuous learning and collaboration, aligning research efforts with business goals.

Expanding Research Methods Beyond Usability Testing

Each research method brings unique value, especially when aligned with the Learning Agenda. In addition to usability testing, consider the following methods to deepen the team’s understanding of user needs:

📋 Contextual Inquiries

Purpose: Build empathy and validate assumptions by observing users within their actual environments to uncover hidden needs.

Type: Formative

Practical Steps:

  1. Conduct site visits or virtual observations of users in their natural environment.

  2. Document observations to reveal gaps between assumptions/expectations and real-world behavior.

  3. Share actionable insights to guide the development of current or future functionality.

📋 Customer Interviews & Persona Development

Purpose: Gain foundational insights about user goals, challenges, and motivations to create accurate personas.

Type: Formative

Practical Steps:

  1. Identify and interview members of your target audience from varied backgrounds.

  2. Synthesize themes to create personas that reflect real user challenges and goals.

  3. Regularly update personas, aligning team priorities with evolving user insights.

📋 Affinity Mapping & Customer Journey Mapping

Purpose: Visualize user pain points and map their experiences to uncover key challenges along the user journey.

Type: Formative

Practical Steps:

  1. Use affinity mapping to group feedback from user interviews and contextual inquiries around common themes.

  2. Create journey maps that pinpoint moments of user frustration or delight along a process or continuum.

  3. Continuously revise maps as new findings emerge, enriching team empathy.

📋 Concept Testing & Validation

Purpose: Refine ideas by prototyping and testing potential solutions before heavy development investment.

Type: Formative or Summative (depending on the fidelity of the interaction being tested)

Practical Steps:

  1. Present early sketches, concepts, and prototypes to gather user feedback.

  2. Prioritize feedback to assess alignment with core needs.

  3. Iterate on designs to ensure solutions are both desirable and functional.

📋 Usability Testing

Purpose: Evaluate how easily users can navigate a product and complete defined tasks, ensuring the product is both intuitive and efficient.

Type: Summative 

Practical Steps:

  1. Create realistic tasks for users to complete within the product, focusing on essential functions.

  2. Watch how participants interact with the product, noting any points of confusion or friction.

  3. Identify patterns in user challenges and successes, then make design adjustments to improve the experience.
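To move beyond anecdotal observations, summative usability sessions are often paired with a standardized questionnaire. One illustrative option (not prescribed by the steps above, but widely used) is the System Usability Scale, a ten-item Likert survey whose scoring can be sketched in a few lines:

```python
# Minimal sketch of scoring the System Usability Scale (SUS).
# Odd-numbered items are positively worded, even-numbered negatively worded;
# the raw 0-40 total is scaled to a 0-100 score.

def sus_score(responses: list[int]) -> float:
    """Convert ten 1-5 Likert responses into a 0-100 SUS score."""
    if len(responses) != 10 or any(r < 1 or r > 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Positive items contribute (response - 1); negative items (5 - response).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale raw 0-40 total to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best-case responses → 100.0
```

Tracking a score like this across releases gives the team a trend line to discuss alongside the qualitative patterns noted in step 3.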

📋 Quantitative Surveys

Purpose: Gather statistical data to validate hypotheses and measure trends across larger user groups, providing a broad view of preferences, satisfaction, and usability metrics.

Type: Formative or Summative (depending on the timing of testing and information being gathered)

Practical Steps:

  1. Develop targeted survey questions based on the Learning Agenda.

  2. Distribute surveys to capture representative user insights.

  3. Analyze results to inform design decisions, identifying any areas needing further qualitative exploration.
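As a sketch of step 3, a simple summary of Likert-scale responses (mean plus a confidence interval) helps separate real trends from noise before deciding where to dig deeper qualitatively. The ratings below are hypothetical placeholders, and this assumes a normal approximation is acceptable for your sample size:

```python
# Illustrative sketch: summarizing 1-5 satisfaction ratings with a mean and a
# 95% confidence interval, using only the Python standard library.
from statistics import NormalDist, mean, stdev

def summarize(ratings: list[float], confidence: float = 0.95):
    """Return the mean rating and its margin of error at the given confidence."""
    m = mean(ratings)
    # z-score for the two-sided confidence level (about 1.96 for 95%)
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    margin = z * stdev(ratings) / len(ratings) ** 0.5
    return m, margin

ratings = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]  # hypothetical survey responses
avg, moe = summarize(ratings)
print(f"Mean satisfaction: {avg:.2f} ± {moe:.2f}")
```

A wide margin of error is itself a finding: it signals that the survey needs more responses, or that follow-up interviews should explore why opinions diverge.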

Implementing a Research Routine

Incorporating the Learning Agenda and a variety of research methods (formative and summative) into a recurring research routine reinforces the habit of consistent learning and feedback. Folding user analytics into these discussions can add valuable insights and opportunities to dig deeper into users’ actions and perceived intentions. Monthly or bi-weekly syncs between research leads and product teams align efforts with prioritized goals, ensuring the team’s understanding grows in tandem with product releases.

By adopting a structured yet flexible approach, teams can continuously validate assumptions, answer essential questions, and deepen their connection with users. With a Learning Agenda and a varied research toolkit, your team can advance beyond basic usability testing to a full-spectrum, research-driven practice—where every decision is rooted in real, actionable insights.
