
Tools & Tips #004

Recently, I read an interesting article about how there's no longer room for traditional UX Design in our post-pandemic world. There's a higher demand for Product Designers to establish their return on investment by measuring the impact of their efforts and solutions.



What's in here?


Chances are that in your performance reviews, you've been asked to create SMART goals or personal OKRs. If so, you've probably also broken out in a sweat wondering how on Earth you're supposed to measure your efforts and connect them to product performance.


This is especially difficult in product environments like startups that don't even measure user engagement or have analytics set up. Most companies that want to achieve growth measure success by how fast they move from concept to execution. Other companies with a more mature structure measure success through strategy-based key results (KRs) that are relevant to their business model and adopt a "test and learn fast" mentality.



The biggest question for businesses this year is how to deliver business value with fewer people, less time, less confidence, and higher uncertainty.


And, of course, it's also difficult for designers to prove their worth when their ideas or suggestions are quickly dismissed due to internal stakeholder bias or a lack of data to back them up. I like to call this scenario the impact paradox (it's a work-in-progress title).


Design Managers are expected to measure their team's impact in the same way product success is measured: through SMART goals. Though this is a great framework for product solutions, collaboration and culture fare better when performance is based on task completion and team adoption. Here are some tools Product Designers can use to support their teams and demonstrate their impact.



COMPETITOR ANALYSIS

One of the best ways to become a domain expert is to know who the competitors are, compare the value propositions they advertise against the features they actually offer, and quantify the differentiators.


This method has earned me some praise in the past from Heads of Product and CPOs. I also found this to be a useful tool for identifying the top competitor in relation to usability heuristics.


Because Product Owners are very feature-focused, I'd suggest creating a separate analysis for usability vs. features (aka Unique Selling Propositions). It's also good practice to consolidate any qualitative feedback from customers in the same spreadsheet so you can quickly grab user quotes to validate some of your findings.



  1. Create a spreadsheet listing all known competitors.

  2. Copy/paste the value propositions from the competitor websites to get an overview of how they're marketing themselves.

  3. On the top row, list each feature or USP that your company is offering.

  4. With a checkbox, mark each competitor that's offering something similar. These marks will automatically quantify and reveal which competitor has the most similar product to yours (see the sketch after this list).

  5. In the bottom row, identify the features that your company does not have but that competitors mention most often on their subscription or upgrade pages. These will reveal the differentiators in the competitor space.
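If you keep the spreadsheet as a CSV export, step 4's tally can also be done with a few lines of code. Here's a minimal sketch in Python; the file name and the "Competitor" column header are assumptions for illustration, and a COUNTIF formula in your spreadsheet tool gets you the same totals.

```python
import csv
from collections import Counter

# Minimal sketch: tally feature overlap from a hypothetical CSV export of the
# competitor spreadsheet. Assumes the first column is named "Competitor" and
# every other column is a feature/USP, marked "x" when the competitor offers it.
def feature_overlap(path: str) -> Counter:
    overlap = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            name = row.pop("Competitor")
            # Count how many of your features this competitor also offers.
            overlap[name] = sum(1 for value in row.values() if value.strip().lower() == "x")
    return overlap

if __name__ == "__main__":
    # Prints competitors ranked by how similar their offering is to yours.
    for competitor, shared in feature_overlap("competitors.csv").most_common():
        print(f"{competitor}: shares {shared} features")
```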


To get the best results, collaborate with your PM or Product Marketing Manager on this task. This way, you're also building positive relationships with your colleagues – something that will show up in your 360 feedback during performance review cycles.



ESTABLISH METRICS TOGETHER

Design thinking frameworks used to scope out opportunities, like the HEART framework or the Lean UX canvas, are in fact the same practices that some PMs use to identify "product metrics".



Product success is measured through the user's experience, also known as UX metrics. However, not all the signals and metrics listed in the HEART framework and Sematext's list of KPIs are usable. Depending on the business model (B2B vs. D2C), some metrics are more relevant than others.


One thing I learned while working with a PM who had a business background is that there are lagging metrics, like NPS, and leading metrics, like average session length. Though lagging metrics make great key results for a business objective, they require a lot of time or traffic before you can determine significance. As a product designer working on B2B software, it can be difficult to prove the impact of UX changes using only lagging metrics, so collaborate with your team to define jobs-to-be-done (JTBDs) for each user type so you and the PM can measure leading metrics, like task completion, to determine product success.
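If you're wondering what measuring a leading metric like task completion looks like in practice, here's a minimal sketch. The session shape and field names are hypothetical; in reality the data would come from whatever analytics your team already has.

```python
# Minimal sketch: task completion rate as a leading metric.
# Each session records whether the user started and finished the JTBD you
# defined together (e.g. "export a report"). The event shape is hypothetical.
from dataclasses import dataclass

@dataclass
class Session:
    user_id: str
    started_task: bool
    completed_task: bool

def task_completion_rate(sessions: list[Session]) -> float:
    started = [s for s in sessions if s.started_task]
    if not started:
        return 0.0
    return sum(s.completed_task for s in started) / len(started)

# Example: 2 of 3 users who started the task finished it -> 67%
sessions = [
    Session("a", started_task=True, completed_task=True),
    Session("b", started_task=True, completed_task=False),
    Session("c", started_task=True, completed_task=True),
]
print(f"Task completion rate: {task_completion_rate(sessions):.0%}")
```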



By measuring a person's performance by how well they collaborated on identifying metrics and creating impactful usability improvements, you have fewer stakeholders fighting to get their individual ideas implemented.

PRIORITIZE QUICK WINS

Every designer or UX researcher has a story about how their low-effort idea was backlogged or dismissed for being too small to create a big impact, until one day a bored engineer picked up the ticket, implemented the small solution, and it led to a significant increase in revenue.



This has happened to me and to two other designers I know who worked at two different, extremely well-known companies. Within product teams, there's always this belief that small = less important. And yet, sometimes an easy implementation like adding a contact button to the top navigation does increase a leading metric like Lead-to-Sales and a lagging metric like Conversion.


Who knew? As seen in this case study, UX Research and Product Design did, and thankfully, the team listened.




QUESTIONNAIRES & SYNTHESIS

What do you do when you need qual feedback to understand the "why" behind the quant performance but gathering research participants takes too long?



Here are three resources for running usability tests on your audience:



Collaborate with your Research team, PM, or Customer Success stakeholders to collect this user data via the existing tools in your product feedback loop, like Intercom, Zendesk, Hotjar, or Survicate.
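If you end up synthesizing exported feedback yourself, a rough first pass can be as simple as tagging comments against a few themes and counting the mentions. The sketch below assumes a hypothetical CSV export with a "comment" column, and the theme keywords are placeholders; the real export format depends on the tool you're pulling from.

```python
import csv
from collections import Counter

# Minimal sketch: a first-pass synthesis of exported qualitative feedback.
# Assumes a hypothetical CSV export with a "comment" column; the theme
# keywords are placeholders you'd replace with what you actually hear.
THEMES = {
    "navigation": ["menu", "find", "navigate", "search"],
    "pricing": ["price", "plan", "upgrade", "subscription"],
    "support": ["help", "contact", "support", "response"],
}

def tag_feedback(path: str) -> Counter:
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            comment = row["comment"].lower()
            for theme, keywords in THEMES.items():
                if any(keyword in comment for keyword in keywords):
                    counts[theme] += 1
    return counts

if __name__ == "__main__":
    for theme, count in tag_feedback("feedback_export.csv").most_common():
        print(f"{theme}: {count} mentions")
```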



As a Product Designer, you do not need permission to collect this data or synthesize it yourself. Any company that empowers this level of bias toward action is lucky to have you.
