
A Conversation with Quantitative UX Researcher Randy Au

Quantitative UX researcher Randy Au "analyzes all the things." He is part researcher, part educator, part philosopher, and part therapist. And he loves cats.
Insights

Sep 19, 2019

14 min read

Vince Kosek

Contributing Writer

When Randy Au, Quantitative UX Researcher at Google, was a kid, it wasn’t like he went around saying, “When I grow up, I’m gonna be a UX Researcher!” Because, who says that? And what exactly is a Quantitative UX Researcher, anyway?

“I work with a lot of UX Designers and Qualitative Researchers now who come from straight up industrial design backgrounds,” says Au, “but they’re some of the top consumers of analytic content, along with Product Managers. Which makes sense. They’re building an interface, they need to know, ‘Where should the buttons go, and is anyone using the buttons I already made?’, right?”

Adding a quant element to UX Research takes a range of skills, which Au has honed in past engineering and analyst roles at Meetup, Bitly, and Primary.com. “I’ve been doing this kind of stuff for over 10 years now, but I’m fairly new to UX,” Au shares. “I came to analytics essentially from the engineering side.”

Today, in his work with Google, Au plays the roles of researcher, educator, engineer, analyst, philosopher, and seemingly sometimes even therapist.

Amplitude’s John Cutler sat down with Randy to ask him about changes he’s seen in analytics, how he copes with uncertainty, and his take on successfully collaborating with product development teams.

John Cutler: Talk to me about changes you’ve seen in your career. Have there been any perceptible shifts in the landscape that you’ve been occupying?

Randy Au: Well the big one was when the world went to mobile. That was a big one, right? A lot of places went from having one development team to suddenly needing three because you need web, Android and iPhone. Period.

Another one is the shift to self-service. That one comes up quite a bit, over and over. It’s becoming more and more prominent with advances in visualization tools and so forth. I see a lot of people trying to push the boundaries of self-service.

That said, there’s still a lot of ad-hoc type stuff that analysts need to provide expertise on, which is a dynamic that doesn’t really change. The difference is in the role of the analyst. You’ve gone from a data analyst person, telling the team what the results are, to now having this situation where the data person has to somehow put up guardrails so that the self-service environment can function. It creates this really weird tension of, ‘I can let people have all the data that they want, but then what guarantees that their conclusions are valid?’.

JC: Right. Sometimes I see this tension between having guardrails and enabling self-service as central to defining the role of analytics. Has that been your experience? Have you had to personally transition in your career or did you latch on to certain models early on?

RA: Usually, quantitative research teams are resource constrained by the fact that there are only so many quants around. I’m usually on small teams of one or two quantitative researchers. So from that perspective, we are usually an insights kind of team.

The issue is that we just can’t handhold everyone all the time. If things get really constrained and we don’t have a lot of time, we have to rely on a lot of self-service. Then, we become more like your tutors. We teach kids who are willing to learn. Every team always has one or two people that are very interested in data and want to use it.

We train those willing learners as much as they’re willing to take. It’s always been amazing that you can always find one or two people willing to do that. What would happen if there were teams that just don’t have that kind of person around? That kind of makes me worry sometimes. But we’ve been lucky so far. So, yeah. That’s how we do it.

The ideal would be to have a quant embedded within each product team and they just live and breathe that one thing. But you know, that’s hard to do.

JC: It’s interesting because part of educating people is breaking down preconceived notions of what analytics is, and what it’s capable of. A lot of people think that it’s all A/B tests that cough up a definitive answer.

RA: Exactly! That’s the nice thing about that part of analytics (the A/B testing part). It gives you a theoretically right and wrong answer.

But there’s also this blurry part where, the more stats you know, the more you’re like, “Oh, that’s actually some hand-wavy bullshit!” Because at 95% confidence, it just means there is a one in 20 chance you’re actually getting something totally wrong. I’m sure we’ve done all sorts of tests that totally yielded false positives. We’ve run so many experiments, it almost has to be the case.
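As a rough back-of-the-envelope illustration of Randy’s point (a minimal sketch on our part, not from the interview): if every tested change truly had no effect and the tests were independent, the chance of at least one false positive across N tests at a 5% significance level is 1 − 0.95^N, which climbs fast.

```python
# Back-of-the-envelope sketch (ours, not Randy's): how quickly false positives
# accumulate when many independent A/B tests are run at a 5% significance level.

def prob_at_least_one_false_positive(num_tests: int, alpha: float = 0.05) -> float:
    """Chance that at least one test 'wins' purely by luck, assuming every
    tested change truly has no effect and the tests are independent."""
    return 1 - (1 - alpha) ** num_tests

for n in (1, 10, 20, 50):
    print(f"{n:>3} experiments -> {prob_at_least_one_false_positive(n):.0%} chance of a false positive")
    # 1 -> 5%, 10 -> 40%, 20 -> 64%, 50 -> 92%
```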

At least in theory, there’s a correct answer. We can just say, “Okay, we’re satisfied and we have confidence.” But then when you get away from that, you start going into the area of, “Here’s the stuff that’s just indicative.”

There’s conditions that lead to increasing uncertainty. Sometimes we don’t have enough data points and we’re working on a very hard problem, or the causality isn’t particularly clear. But the reality is, we have to launch next quarter and this is the closest thing we’ve got. We’re going to just have to take it on faith because that’s all we have.

And when you lay it out on the table like that, it sounds really sketchy. I mean, at the end of the day, I think we all do it to varying degrees. But it’s a scary thing. It’s like, “Yes, all the science and knowledge and analytics we have and we still have something where we’re kind of taking a bet! We’re not left 100% in the dark, but we’re still making a bet.” And that’s the nature of business; you take the risk because that’s where the rewards are.

JC: Some people seem more open to that degree of uncertainty than others. When you find people that you’re working with that just have that sort of extremely utopian view of yes/no things, how do you talk them off the ledge? Do you have any go-to strategies?

RA: Sometimes…but it depends. Sometimes I frame it as like, “This is a bet, given all the things we know. It’s a bet, but it’s the most likely bet.”

Plus, we have some qualitative research and our own personal experiences of what products and humans and things are. It seems reasonable. It’s not totally outlandish.

This appeals to humanity because we’re not particularly good at placing the most likely bets. If you ran down the numbers, you would actually prove that we’re wrong more than we’re not. That’s why we need to use data.

But at the same time, it’s not like you pick between two things and it’s 50/50. The ways to fail probably outnumber the ways to succeed. And so you’re just shaving off those chances to make your odds a little bit better.

And then some people are just not temperamentally down with it. In that situation, if that person is running the show, then the product will move slower. It’s going to be a little bit more methodical, and it might actually be the right call! It depends on what the circumstances are. But you can’t fight human nature in that way.

JC: When you work with less experienced analysts, how do you calibrate their business sense to help them focus on techniques or analyses that are going to have the highest impact?

RA: That’s really tricky because you build business sense from working with business people. You’ll get it when you’re with the PM who’s under the gun to increase revenue this quarter, not next quarter. I learned that, essentially, I have to ask myself, “Is this insight good enough to help them make this decision, because they have to make a decision today?”

With or without me it’s happening, so I better do what I can in the next three hours and get it in. Without that pressure, you can very easily just go wander off.

And that sense of “good enough” is just something you have to develop over time.

JC: It seems like you’d need to have some familiarity with the decisions that are being made if the idea is balancing reasonable decision quality with reasonable decision velocity.

RA: I always recommend that people work closely with the teams. When they’re too far removed and you lose the squishy context, it gets tough. People start latching onto artifacts that may or may not actually matter at the end of the day. It’s especially true for the business as you go higher up.

I always recommend that people work as closely with the PM as possible. Make them your best friend. Because some of the best things I’ve ever done came about when we had a skilled analyst who was familiar with the tooling in a paired situation. We’d have the PM, or tech lead, or whoever sitting next to them. And they’re just pair driving, looking through the product.

JC: Let’s say you had a team that’s been newly unleashed. They have a self-service tool, all their events are instrumented, they’ve developed a bunch of properties. If you could do a three hour class with them to try to make sure they don’t fall into some common pitfalls, what would that look like?

RA: Yeah, if I had a three hour session with a new team, almost 95% of it would be “get to know your world.”

You get in there and it’s like, “Okay, first let’s understand.” It’s kind of like we just dropped into this island full of goodies, right? So now let’s figure out what the heck is going on here. Okay, where are our users? What can we do with them?

But then we peel it back one more level and ask, “Where’s this data coming from? What is this? What does this user mean? What does unique user mean here?” Making sure we understand those foundations is critical because everything else rests on that.

If that’s screwed up in the beginning, then everything you think about later on will just fall apart. So I would probably spend almost all my time on the fundamentals of just understanding.

I’d explain, “You’re allowed to say this because of these reasons.” And then hopefully they kind of pick up on that. So later, as they build more familiarity, they can form those statements on their own. It’s like, “Okay, very good…but there’s that one caveat up there!”

Teaching them that process of figuring out what they’re able to do would probably be what I’d focus on. I mean, I’m sure it would take me weeks of plinking around. But in a couple of hours I’d want to, as best as I could, impress upon them the fact that all this stuff looks magical (and it is to some degree), but you can do some very, very incorrect things with it very easily.

JC: Right. So it’s not just like, oh, here’s the end day retention chart. Okay, let me make our next quarter’s strategy off of that chart that I just saw.

RA: Right. Because a retention chart can be misleading very easily. You can have a dull, “Oh, this happens all the time.” But then you have a one week sale or you do a giveaway and suddenly that entire diagonal band of your retention chart is just all screwed up. But people wouldn’t necessarily know to look for that.

JC: In an ideal world, what would you regard as bread and butter questions, that if you could set teams up in a self-service environment, they could be really good at answering?

RA: I mean, the bread and butter things usually are simple in concept. They’re usually just, “How many people do a thing?” They usually fit this pattern. The problem is that the “do a thing” part is very fickle and it becomes all about the instrumentation.

That bread and butter thing can easily turn into some multi-step funnel or some bizarre combination of things. The actual counting part is very tricky and finicky. So the easier that gets, the more people can self-serve.

If they’re confident that they’re counting the correct thing, then usually the ratios that they come up with are fairly accurate. They make sense in general because they’re not obviously crazy. So, you can kind of trust them.
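For what it’s worth, the “how many people do a thing” pattern Randy describes boils down to counting unique users per event and then taking ratios between those counts. Here is a minimal sketch over a made-up event log; the user IDs, event names, and two-step “funnel” are invented for illustration and are not Amplitude’s data model.

```python
from collections import defaultdict

# Hypothetical event log as (user_id, event_name) pairs. The IDs, event names,
# and two-step "funnel" below are invented for illustration only.
events = [
    ("u1", "signup"), ("u1", "create_project"),
    ("u2", "signup"),
    ("u3", "signup"), ("u3", "create_project"),
]

# "How many people do a thing": count unique users per event.
users_by_event = defaultdict(set)
for user_id, event_name in events:
    users_by_event[event_name].add(user_id)

signed_up = users_by_event["signup"]
created_project = users_by_event["create_project"]

# Simple ratio: of the users who signed up, how many went on to create a project?
conversion = len(created_project & signed_up) / len(signed_up)
print(f"{conversion:.0%} of signups created a project")  # 67%
```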

People in general are kind of hesitant to do very fancy things because they’re like, “Oh, I’m not the expert. I’m just a user of this tool.” So they’re not going to go for anything outlandish. But they do have to be familiar enough that they’re willing to take the chance.

If I could teach them those things, it would open up a lot of the more difficult cases, like the causality ones. Causality is extremely hard.

About the author
Vince Kosek

Contributing Writer

Vince Kosek is a product analytics consultant based in Santa Barbara, CA. Vince's career spans financial services, manufacturing, medical devices, and most recently vertical B2B SaaS products in construction and property management. He is passionate about helping cross-functional product teams make better product decisions.