Best practices for balancing AI-driven analysis with product instinct and expert validation
I have this instinct that when I feel really good about data, I should probably talk to a real data person. At every company I’ve worked at, I’ve had that one data person I trusted to call me out on my bullshit.

When I called Mara Church (my go-to data person from Patreon), her advice was to definitely keep a data person in the loop when you’re facing questions like “What analysis makes sense?” You can consult with AI about that, but you should have a functionally excellent person checking that logic. Anytime you encounter uncertainty in statistics, have an expert in the loop - and, of course, before you share your results at an all-hands meeting.

Hilary Gridley adds crucial guidance about balancing AI-driven analysis with good judgment. She recommends asking yourself a few questions when you’re deciding whether you’re good to go with an analysis from AI:

First, does this actually make sense based on what you know about the product? Balance it with your own product instinct.

Second, consider the cost of being wrong. This is maybe the most important one. The cost of being wrong when you’re standing in front of the entire company at an all-hands making proclamations is high. The cost of being wrong when the recommendations would still be legitimate even with small mistakes in the analysis is low, especially if you’re just trying to test things.

Third, check yourself by running the analysis multiple times. An LLM might do this automatically, but it’s helpful to check whether you’re getting the same types of results. You can even ask the AI for help: “I want to validate this. I don’t trust you. What do you think I should do?” and see what it says.

And of course, always check with analysts and others on the team, share results, and get a sense check from them.

Hilary emphasizes the importance of being transparent. She tells folks on her team that they’re encouraged to use AI for basically anything they can think of, but they have to be transparent about it - how they’re using it, what inputs they’re feeding in, how they’re prompting it. A lot of people think it’s binary - either AI or not AI - but there’s a ton of nuance in the methods you use and how you get to an answer.

➡️ Trust but verify. Use that trusted data person who’ll call you out, balance AI analysis with product instinct, and always consider the cost of being wrong before acting on AI-generated insights.
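To make the “run the analysis multiple times” check concrete, here’s a minimal sketch of what it can look like. It assumes you’re hitting a model through the OpenAI Python client; the model name, prompt, and data placeholder are illustrative, not anyone’s actual workflow.

```python
# Minimal sketch: run the same analysis prompt several times and compare the answers.
# Assumes the OpenAI Python client (pip install openai) with OPENAI_API_KEY set;
# the model name and prompt are placeholders, not a prescribed setup.
from openai import OpenAI

client = OpenAI()

analysis_prompt = (
    "Here is our weekly retention data: <paste data>. "
    "Which change best explains the drop in week 3, and how confident are you?"
)

answers = []
for run in range(3):
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": analysis_prompt}],
    )
    answers.append(response.choices[0].message.content)
    print(f"--- Run {run + 1} ---\n{answers[-1]}\n")

# If the runs disagree on the headline conclusion, that's your cue to loop in
# a real data person before the result goes anywhere near an all-hands.
```

The point isn’t the code - it’s that agreement across runs is a cheap check to do before you put a conclusion in front of the company.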
Hilary: I say to the LLM, “Hey, I’m worried about bias. How can I get rid of it?” These things are so use-case- and context-dependent that the answer is usually unsatisfactory. It’s some combination of using my judgment, which is probably inherently biased but I think is valuable, plus acknowledging my blind spots. I often ask it things like, “How am I most likely to embarrass myself by running with this information?” or “I’m a product person - what logical fallacies am I probably falling into?” It’s usually pretty good at guessing those things about me.

Tal: In general, across many things I do, I like to ask it to play devil’s advocate - “Please poke holes and be brutal.”

Follow-up from Hilary: It’s funny - I often ask it how to communicate with someone on another team in a way that will resonate with them, what I need to understand about how they think. It gives a somewhat cartoonish but often accurate portrayal of the pitfalls of that function.
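If you want to wire Hilary’s and Tal’s checks into the same kind of loop, a rough sketch (again assuming the OpenAI Python client, with placeholder prompt and answer text) might look like this:

```python
# Minimal sketch: append bias / devil's-advocate checks to an existing analysis thread.
# Assumes the OpenAI Python client; the prompt and draft answer are placeholders, and
# the wording of the checks comes from Hilary's and Tal's questions above.
from openai import OpenAI

client = OpenAI()

analysis_prompt = "Here is our funnel data: <paste data>. What explains the conversion drop?"
draft_answer = "<the analysis the model already gave you>"

checks = [
    "How am I most likely to embarrass myself by running with this information?",
    "I'm a product person - what logical fallacies am I probably falling into?",
    "Play devil's advocate: please poke holes in this analysis and be brutal.",
]

for check in checks:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "user", "content": analysis_prompt},
            {"role": "assistant", "content": draft_answer},
            {"role": "user", "content": check},
        ],
    )
    print(f"### {check}\n{response.choices[0].message.content}\n")
```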
Given the analysis capabilities you’ve shown, how will this change the relationship between product managers and analysts? Do you even need analysts anymore?
Hilary: We need them more than ever.

Tal: I think it’s going to make their lives a lot better, just like Mixpanel and Amplitude did when they appeared.

Hilary: I think about work like highways - they keep adding lanes, but traffic never improves. The more capacity there is, the more work exists to fill it. I think we have a limitless appetite to keep learning, doing new things, and pushing harder. This just allows us to do more, but I don’t see a world where we’re all put out of business by it.

Follow-up from Tal: I showed up to AI knowing what analysis I wanted, or at least what question I should be asking. That’s kind of cheating. As a PM, showing up alone like that doesn’t happen as much in the real world. Both examples benefited from having experienced analysts on my team, having conversations with them, thinking about it together. It’s not just the analysis part - it’s the whole process: data strategy, instrumentation, context across the business, what questions to ask, designing the analysis and visualization, checking the math, and taking it over the finish line. So the analyst job is going to be around for a long time, only elevated and made more fun for product analysts now.

Check out Hilary’s course (not sponsored, she’s just awesome).