In the last post, we covered how to transform insights into a minimum viable design. Now that I have an idea (or several ideas) in mind, the next step is to validate with users that I’m heading in the correct direction. This phase is only relevant for designs with user-facing components. User-facing includes anything with a user interface, workflow, or customer interaction (for example, a JSON configuration file or admin command line interface), even if it’s a small part of the overall design.
PHASE THREE: VALIDATING YOUR DESIGN
Although this phase is equally applicable to engineers and product owners, the definition of a user and the extent of validation may vary. As a product owner, I am more likely to start from a “blank slate”, which means I need multiple rounds of validation to get to a good design. In each round, I usually want to validate my ideas with around three to five users (both customers and analytics engineers), chosen intentionally for diversity of use cases, customer size, and customer experience. I carefully pre-plan testing sessions, take diligent notes, and often record conversations.
As an engineer, I’m more likely to design within a space that’s already been explored and represented by a product owner. So a single round of quick validation with one or two users (usually a product owner plus an AE, or maybe a few members of another internal team) is enough, and I don’t need to be as meticulous with planning and recording insights. If there are still significant user-facing unknowns, it’s a good opportunity to team up with the PO and the design team to conduct user-testing sessions with customers. Even for a standard design where no customer sessions are necessary, this phase is still important for a few reasons:
- This is the right time to make sure that you and the PO are in alignment – before a design doc is published! In this early design validation stage, a PO should be able to redirect you fairly quickly if you’re presenting something that isn’t going to work. No one wants to redo or throw away work, so make sure you’re on the same page early on!
- POs often design and validate with the aim of filling in the medium-to-big picture, leaving small (but important) details open-ended or unfilled. This works well to ensure there’s enough flexibility to adapt to architectural or timing constraints (and to ensure that their groundwork is still useful even if there’s a significant delay before a story gets picked up). So spending even a little bit of time exploring unvalidated areas (such as label names) can still make a big difference in ease of use for a customer. Of course, PO styles vary, so adjust accordingly.
- Reading a list of requirements and understanding a problem is different from experiencing the problem yourself. Validating your design with a user in some way makes the problem more real, and seeing someone struggle to use your interface or figure out what a parameter refers to can meaningfully shape your solution. You may notice an opportunity to clean up documentation in the KB, or realize that an “obvious” term is not so obvious. Maybe you realize that a small modification in a nearby area could save time for users. And in the end – seeing a user love your solution will make your day!
So now that I’ve covered why user validation is important, let’s talk more about the how – a user research session. The goal is to present a possible solution with limited bias and collect honest feedback. If you’re conducting user interface or workflow research, then you’ll already be working with the design team (and they’re good at what they do!). However, if you’re getting internal feedback from a PO or stakeholder, you will still want to roughly follow the same best practices as UX researchers. Here are the considerations that are important to me:
- Make sure your design is easy to parse. If you’re presenting a mockup (even internally), it should be clearly readable. It doesn’t have to look amazing (it’s okay, you’re not a professional designer!), but it’s frustrating and a waste of time for others to interpret scribbles. When presenting a technical (non-UI) design or early idea, I like to come prepared with succinct bullet points.
- Take a moment to distance yourself from your ideas. It’s easy to become attached to your work and what feels like a great idea. This can seep into a user research session, when it’s tempting to explain what something does or defend why something is the way that it is. But this isn’t the time for that, and it can really curb the constructive feedback you receive.
- Clarify up front that you’re presenting an idea that’s still in the design phase, and you’re looking for feedback in the early stages. (This avoids panic if you’re presenting something that does not resonate with your interviewee at all).
- If you haven’t met with this person before, collect relevant background information. For example:
- What’s your role?
- What’s your comfort level with {relevant technology}?
- How many hours per week do you currently spend on this problem?
- Start with a quick walkthrough of the problem, not your solution. If you can, skip talking about your solution altogether (this is fairly easy to do if you have a mockup).
In my datasource management design, I led with: “I’m designing an admin-only page for support engineers like you to effectively and remotely manage datasources and troubleshoot connectivity problems. I’m screen-sharing a mockup now; I’ll give you a few minutes of quiet to take a look and absorb what’s here. Feel free to talk out loud with any thoughts you have as you explore it.”
If I’m not presenting a mockup, I focus on keeping my explanation of the design I’m validating short and neutral. Here’s a (slightly contrived) example for datasource management:
“Right now, our architecture doesn’t support multiple connections serving a single datasource. Let’s say that we made connection into a first-class item, with relationships to both the datasource and the connector. What’s your initial reaction?”
{answer}
“This new architecture would also have limitations for logging, in that logs would be at the connection level instead of at the connector level. Any concerns with that?” … and so on. (A rough sketch of what this connection-centric model might look like follows this list.)
- Resist the urge to (immediately) answer questions. Instead, follow up with your own question. For example:
- “What does this do?” -> “What do you expect it to do?”
- “How do I accomplish {x}?” -> “How would you expect to be able to accomplish {x}?”
- “Why is {x} the way it is?” -> “What would you expect instead?”
- Ask questions that encourage constructive criticism. This is particularly important when I’m collecting feedback in an area where any design (no matter how bad) is a huge step change from the status quo (especially if there is no current solution). I find that interviewees are so excited about the prospect of a potential solution that they don’t bother to voice their concerns. Some questions to ask include:
- Is there functionality you expect to see in this workflow that’s missing?
- What (if anything) do you find confusing about this design?
- Is there anything you see here that you wouldn’t use, or would use only rarely?
- Don’t schedule or conduct meetings with users back-to-back. The time between sessions is valuable for brainstorming questions for next time and iterating on your mockup or design. (I don’t always update my mockups, but sometimes it’s clear that a suggestion is an improvement worth incorporating.)
- If a design includes a user interface, get feedback from the design team! Check in with a UX lead early on – this means as soon as you have a direction in mind. They will work with you to iterate on the design, conduct research sessions with customers, and ensure the styling is consistent with guidelines.
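If you want to picture the architecture change from the datasource example above (the sketch referenced after that script), here is a minimal, hypothetical model of what making “connection” a first-class item might look like. All entity and field names below are illustrative assumptions, not the real schema; the only point is that the connection owns the relationships to both the datasource and the connector, and that logs attach to the connection rather than the connector.

```typescript
// A hypothetical sketch only: these entity and field names are illustrative
// assumptions, not the actual datasource management schema.

interface Connector {
  id: string;
  name: string; // e.g. the installed agent or driver
}

interface Datasource {
  id: string;
  name: string;
}

// "Connection" promoted to a first-class item: it owns the relationships to
// both the datasource and the connector, so a single datasource can be
// served by more than one connection.
interface Connection {
  id: string;
  datasourceId: Datasource["id"];
  connectorId: Connector["id"];
  status: "active" | "degraded" | "disconnected";
}

// The logging limitation mentioned in the script: log entries hang off the
// connection, not the connector.
interface ConnectionLogEntry {
  connectionId: Connection["id"];
  timestamp: string;
  message: string;
}
```

Framed this way, the interview question about logging becomes concrete: a support engineer who wants connector-wide logs would have to aggregate them across connections, which is exactly the kind of concern a quick validation session can surface.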
Now what? Find out in Phase Four: Moderating Stakeholder Conflicts.