6x6: How PMs Spot and Prioritise Opportunities

Part 5 of the 6x6 series, where we share perspectives from six PMs on six questions about product management. See Part 1, Part 2, Part 3, Part 4, and Part 6.

If a PM enables the team to define its what and how, then spotting and prioritising opportunities is one of their most important responsibilities. With thousands of competing inputs and dozens of decisions a day, how do product managers actually sort the signal from the noise?

Photo by Mel Poole on Unsplash
  • One of my biggest fears when starting at a new company was: what if I can’t figure out what to build next? Now I realise that’s never the problem, because there are so many places to get ideas. The hard part is narrowing them down.
  • I take every opportunity to hear firsthand from my users. I watch them use our tools in research sessions and pay attention to what they’re struggling with. I read support tickets and understand what their biggest complaints are. While we only hear from a handful of users, there’s a richness to this qualitative feedback that we won’t get with data, which naturally triggers ideas that could work for the whole audience. We understand why people do the things they do through research.
  • Data produces insights about how users behave at scale. Where are people dropping out of our funnel? Where are our metrics fluctuating, and what does that mean? The data we look at shapes the changes we make, which determines how successful our iterations will be, so we set up our information pathways intentionally. Most products track metrics like daily active users, session frequency, and time spent, but what matters is how we slice those metrics, identify abnormalities, and interpret what they say about what’s actually happening. Successful products are underpinned by the art of knowing what to look at and what the data actually means.

Data just reflects real-life phenomena, and those phenomena are what we care about. People can forget that. They get so wrapped up in the numbers that they lose track of what they’re actually trying to make happen.
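As a toy sketch of what “slicing a metric and flagging abnormalities” can mean in practice — the platforms, daily numbers, and z-score threshold below are all invented for illustration, not from any real product:

```python
from statistics import mean, stdev

# Hypothetical daily-active-user counts, sliced by platform.
# In practice these would come from an analytics pipeline.
dau_by_platform = {
    "ios":     [1200, 1180, 1210, 1195, 1205, 1190, 950],
    "android": [2100, 2080, 2120, 2110, 2095, 2105, 2090],
}

def flag_anomalies(series, z_threshold=2.0):
    """Return indices of days whose value deviates more than
    z_threshold standard deviations from the series mean."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, v in enumerate(series)
            if sigma > 0 and abs(v - mu) / sigma > z_threshold]

for platform, series in dau_by_platform.items():
    days = flag_anomalies(series)
    if days:
        print(f"{platform}: unusual DAU on day(s) {days}")
```

Here the aggregate numbers look stable, but slicing by platform surfaces the iOS drop on the last day — the kind of abnormality that prompts the “what’s actually happening?” question the quote is about.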

  • Cross-functional (XFN) partners outside the team often have specialist expertise (e.g. localisation managers uniquely understand the nuance of language) or are closer to the user (e.g. operations teams directly respond to user queries). We ask them what they’re noticing, and they often have unique insights into areas of the product or processes where users aren’t getting what they need.
  • Our team will naturally see places where usability could be improved or where we could make it easier to build on the product in the future. We know where the skeletons are buried: where we deprioritised items in the past that could make sense to address now.
  • An obvious one, but so many PMs don’t do it — no matter how specialised it is, I use the product myself regularly. I use our competitors’ products. I have a stack of phones of different sizes and models so I can use the product on all the devices our users do. Our PM team turned on the accessibility features and tried to work through the product with eyes closed. I switched the app to a foreign language and tried to complete some tasks. Build an intuition by thinking like your users and experiencing as many of the painful parts firsthand as you can.
  • Sometimes, requests will come top-down from executives for the team to focus on a particular opportunity. Executives didn’t get to where they are by accident, so they probably have good insights. These may also be part of a bigger strategy or unblocking a key initiative for the company. As a PM, I need to understand what is driving the request, how it aligns with my team and the company goals, and adjust team goals where needed. In many companies, these are truly requests, and it remains up to the team to decide whether it makes sense to pick up this work.
  • I use information aggregated from external parties to understand what’s happening in the market, like industry trend reports, App Annie, or likability scores. I see opportunities, particularly for the longer term, based on the combination of the data these parties are aggregating for everyone and the data I uniquely have from my product and users.
  • Creativity is cross-pollination. Use the artist’s paradigm — steal. Most great ideas are based on the evolution of existing things or just crossing two things together. If I’m narrowly focused on just my area of expertise, I’m missing out on a lot of innovation. I try to apply ideas from one industry to another and see what happens. I read and go to museums.
  • Collect interesting fundamentals. From a young age, I was enthralled with the analog volume dial on the stereo; I dissected why and figured out that I loved being able to turn it up and down in one motion, rather than repeatedly tapping a digital button. It feels good to lift a laptop lid with one finger; why? I pay attention when my spidey senses tingle on these observations. Years later, when we’re talking about controls in the HoloLens UI and I realise that one interaction feels better than another, I’m drawing on all those experiences I’ve collected.
  • Find people with very different points of view whom it feels safe to be wrong around. Sometimes, I find that it’s just about talking through the space without an agenda. What does each person like about an existing experience, and why? What are each of us curious about or baffled by? What frustrates us?
  • Consider whether we’re looking at the right altitude of the problem: is there a better problem above and/or below this one? If your problem statement becomes solving world hunger, or debating between radio buttons, you’ve gone too far in one direction or the other. The right altitude is a bit of a gut intuition. Zeroing in on it early is like adjusting the rudder of a giant oil tanker up front: it’s a lot easier to steer towards success before the ship gains momentum and you realise you’ve gone off course.

There’s a massive, beautiful ancient device called the Chinese south-pointing chariot: you point it where you’re going and pull it forward. If you veer off course, one wheel spins faster than the other and the statue keeps pointing in your original direction. The scientists who built it had asked: if we’re going that way, how do we know we’re travelling in a straight line? And they came up with a great answer. But they knew about metal and magnetic forces back then, too. If they had instead asked the higher-altitude question (how do I orient myself anywhere on Earth?), they might have invented the compass.

  • The most important element of an opportunity is whether it solves a real problem. How many people is it impacting and how important is that impact to those people? All too often we’re so smitten by an idea that we try to make a square peg fit in a round hole. Stop and ask, does this thing deserve my attention? If in doubt, put it away in a drawer and see how often you find yourself coming back to the same idea; if it keeps coming up, there’s probably a real problem worth exploring there.
  • The goals for our team are key in deciding what to prioritise. We set them in the first place to know where to focus, so we start by evaluating each opportunity based on its impact to the goals.
  • Cost of the opportunity is an important factor because we’re deciding where to put our finite resources. We want to create the most value for the lowest cost, but timescales will vary, so this is where strategy helps create a good balance of bets. We consider not just the technical cost, but also research, design time, and operations efforts.
  • Create a balanced portfolio with regard to the lifecycle of the product. We spend some of our time discovering new audiences and problem spaces, while continuing to improve what we already have. Mature products tend to be more stable and focus on capturing existing opportunity, while earlier-stage products tend to explore various problems to solve. Some orgs like to use a 70/20/10 split: 70% of resources go to optimising existing products, 20% to new areas that we already know are growing, and 10% to wild and crazy things that could change the entire landscape.
  • Consider level of confidence. It’s not scientific; for us, it’s just a boolean: are we confident enough in the impact of this to go execute, or does it need more research and input? For areas where we’re capturing known value, we’ve got hard data by which to assess impact, but for new explorations, we’re just making guesses based on past data. Sometimes, we even ship a quick bare-bones MVP in one city to test a specific hypothesis and get a sense of the likely magnitude of impact of something bigger.

If a turkey had to predict what would happen next based on her past, she would never see Thanksgiving coming. Past data is useful, but you still need to turn it into a prediction.

Prioritisation is not a science, or else you wouldn’t need PMs. You’d write up a formula, ask people to input the data, and get your answer. I wish I could tell you there was something more formal to deciding what to pursue, that I evaluate it against a matrix, but I don’t.
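For contrast, here is what prioritisation-as-a-formula would look like: a sketch of the commonly cited RICE score (reach × impact × confidence ÷ effort), with entirely made-up opportunities and estimates. It produces a tidy ranking on demand, which is exactly what the passage argues is not enough on its own:

```python
def rice_score(reach, impact, confidence, effort):
    """Classic RICE prioritisation score:
    (reach * impact * confidence) / effort."""
    return reach * impact * confidence / effort

# Hypothetical opportunities with invented estimates.
opportunities = {
    "fix checkout drop-off": rice_score(reach=5000, impact=2.0, confidence=0.8, effort=3),
    "new onboarding flow":   rice_score(reach=8000, impact=1.0, confidence=0.5, effort=5),
    "dark mode":             rice_score(reach=2000, impact=0.5, confidence=1.0, effort=2),
}

# Rank highest score first.
for name, score in sorted(opportunities.items(), key=lambda kv: -kv[1]):
    print(f"{score:8.0f}  {name}")
```

The formula only ranks the numbers you feed it; the PM judgment lives in estimating reach, impact, and confidence in the first place, and in knowing when the ranking is wrong.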

PMs develop intuition over time on which signals matter and what past experiences suggest about the likelihood this project will have the impact we intend it to have. The best way to hone this is by listening and reflecting on what worked and what didn’t.
