A Maturity Model for Data Management (& Where Your Team Fits In)

There are three key strategies for managing incoming event data in product analytics. The one you use depends on how far along your organization is in the maturity model.
Insights

Sep 2, 2020

11 min read

Dana Levine

Former Product Manager, Amplitude

Amplitude has processed more than 31 trillion user events for more than 40,000 digital products.

So, you could say we’ve seen our fair share of strategies for managing incoming event data.

Over years of working with customers large and small, it has become clear that there are three primary strategies for managing data in product analytics, and that they align with a company’s maturity level:

  • Strategy #1: Cleanup Workflow
    – Beginner Maturity in Data Management
  • Strategy #2: Approval Workflow
    – Intermediate Maturity in Data Management
  • Strategy #3: Event Planning Workflow
    – Advanced Maturity in Data Management

Although companies generally move from one step to the next as they advance in maturity, these data management strategies are not mutually exclusive, and companies may use them with some overlap.

Generally, however, we’ve found that teams newer to product analytics typically opt for retroactive correction of data already in the system, while more mature organizations lean toward up-front planning that keeps data usable and reduces the amount of correction required.

Data Management Strategy #1: Cleanup Workflow

Maturity Level: Beginner

In many cases, organizations that are new to product analytics are eager to get any events—any events at all—into their systems. The organization doesn’t have much experience deciding which events to track, and therefore may let employees add whichever events they choose. Event data is seen as inherently valuable, and every event an engineer is willing to instrument is seen as a win. Teams at this level typically don’t have a dedicated analytics function or a formal data governor. Some product managers may be carefully thinking about how to instrument their new features, but it is just as likely that engineers are simply told to send in events, or that the team implements a system that automatically tracks clicks and injects relevant properties (similar to auto-tracking, which creates more problems than it purports to solve).

The main characteristic of this approach is that most of the data management occurs after the events are already live within a data analytics platform like Amplitude. Events often need to be renamed or combined, as they aren’t easily usable in the format they are ingested in. For example, “click header menu” might turn into “open settings,” or something that better describes the in-product action it represents. This makes it easier for someone unfamiliar with the data to understand which event to choose.
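In practice, this retroactive renaming often amounts to maintaining a mapping from raw names to a readable taxonomy. A minimal sketch of the idea, with invented event names and no connection to a real Amplitude project:

```python
# Hypothetical cleanup pass: rename raw event names to a readable taxonomy.
# The raw names and the mapping are illustrative assumptions, not real data.
RENAMES = {
    "click header menu": "open settings",
    "btnSignupClick": "sign up",
    "PageView_home": "view home page",
}

def clean_event(event: dict) -> dict:
    """Return a copy of the event with its type renamed if a cleaner name exists."""
    cleaned = dict(event)
    cleaned["event_type"] = RENAMES.get(event["event_type"], event["event_type"])
    return cleaned

raw = {"event_type": "click header menu", "user_id": "u42"}
print(clean_event(raw))  # {'event_type': 'open settings', 'user_id': 'u42'}
```

The key property is that unknown names pass through untouched, so cleanup can proceed incrementally as the team learns which events matter.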

In addition, this sort of system typically has a lot of events that aren’t explicitly useful. For example, an engineer might instrument every action they can think of, only to discover they have created duplicate events. In some cases a company may hit its monthly event quota, only to find that a few events providing little value are costing a lot of money, since analytics platforms often charge by the amount of data ingested.
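One simple way to spot those low-value, high-volume events is to rank event types by ingested volume. A sketch with made-up event names and counts:

```python
from collections import Counter

# Hypothetical audit: rank event types by ingested volume to find candidates
# for blocking or culling. Names and counts are invented for illustration.
events = (
    [{"event_type": "heartbeat"}] * 5000
    + [{"event_type": "open settings"}] * 300
    + [{"event_type": "complete purchase"}] * 40
)

volume = Counter(e["event_type"] for e in events)
for name, count in volume.most_common():
    print(f"{name}: {count}")
# A 'heartbeat' event firing 5,000 times likely consumes far more quota
# than the insight it provides.
```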

Cleanup is also often triggered by a lack of clarity. Although an organization at the beginner level of maturity may not be sending many events, it can be difficult to determine the right event to instrument with the analytics platform. Over time, this necessitates renaming events or adding descriptions as the organization learns more about which events they want to track.

In most cases, the cleanup work is performed by one or more people who have a passion for analytics or data-driven culture, often as an after-hours function. They typically don’t have much time for this work, but the event volume is low enough to clean on an occasional basis.

Data Management Strategy #2: Approval Workflow

Maturity Level: Intermediate

At some point, the volume of inbound events becomes too great, and it is no longer possible to manage events only retroactively. In many cases, there is a motivating incident; maybe a bug creates thousands of unnecessary event types and takes hours to clean up, or the company starts to realize the value of analytics while also seeing that its implementation is deeply flawed. The organization realizes that cleanup and correction, while still necessary, probably won’t get it to the desired state. It becomes willing to invest more time into doing things “the right way,” although resources are often still fairly limited. This is the point where an organization typically begins to transition into an “event approval” workflow.

At this point, someone decides to take on the mantle of data governor, or maybe is even drafted into the role by leadership. The data governor is tasked with keeping track of incoming events, and approving them on a case-by-case basis. At this stage, the organization often has some sort of data management style guide, which gives some ground rules for how to appropriately instrument events.

In some cases, the data governor approves events as-is, but they also have an opportunity to transform events or even to reject them outright. As mentioned earlier, data management strategies overlap. So while the organization may have advanced in maturity, they may still need to do a significant amount of cleanup/correction work (strategy #1). The big difference between this workflow and the cleanup/correction workflow is that the cleanup/correction is often done when the event first comes in, and not at some later point when it becomes obvious that there is a problem.

When we talk to data governors at organizations in this stage, they often tell us that they are hesitant to reject events outright, because it is unlikely that the engineers will take the time to re-instrument. In general, events are still seen as “precious,” and the organization does whatever it can to salvage an event. While it is likely that the team has set some ground rules—such as a style guide for events—adherence to these rules may be inconsistent.
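A style guide becomes much easier to enforce when its rules are mechanical. As an illustration, suppose the (assumed) convention is lowercase “verb noun” names like “open settings” or “complete purchase”; a governor could then screen inbound names automatically:

```python
import re

# Hypothetical style-guide check. The convention enforced here—lowercase
# words separated by single spaces, at least two words—is an assumption
# for illustration, not a rule from any real style guide.
NAME_PATTERN = re.compile(r"^[a-z]+( [a-z]+)+$")

def follows_style_guide(event_name: str) -> bool:
    """True if the event name matches the assumed 'verb noun' convention."""
    return bool(NAME_PATTERN.match(event_name))

print(follows_style_guide("open settings"))    # True
print(follows_style_guide("BtnClick_Header"))  # False
```

A check like this turns approval from a judgment call into a review of the exceptions, which is most of what a part-time governor has time for.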

The hallmark of this intermediate level is that there is typically a single data governor, versus no data governor in the earlier stage or multiple data governors in later stages. This data governor is often part-time, and their existing role typically benefits from effective data management. For example, they could be the head of data science or a product manager who has worked with well-run analytics systems in the past. No matter the title, though, their time is limited. Each week the data governor does a little data maintenance based on inbound requests, but does not have time to plan and execute cross-team instrumentation.

Data Management Strategy #3: Event Planning Workflow

Maturity Level: Advanced

Effective data governance requires a fairly intimate knowledge of the product, and as a company’s product offerings broaden, it is likely that a single data governor cannot be familiar with every aspect. At this point, the organization has advanced to the final maturity level for data management. The company invests in bringing a team together—often in the data science and/or analytics function—that is dedicated to thinking about product analytics and related problems.

This is the point when a company is ready to transition to an event planning workflow. Approval has gotten the team far enough, but it is obvious that for product analytics to really yield its maximum value, it must become a discipline. As such, the analytics team needs to carefully consider every event and property that is sent in. Each member of the analytics team is responsible for working with one or more product teams, and collaborates carefully with that team throughout the event lifecycle. When product managers and designers spec out product features, the analytics team works with them to determine exactly what to measure. They also carefully document the process. Then, analysts work closely with the engineering team during implementation, even going so far as to read pull requests to make sure that instrumentation is performed correctly. They then plan the events in a system such as Amplitude Govern, which ensures that only correct events are ingested. If an event does not match these specifications, it is typically rejected, and engineers will need to make changes. Far from seeing events as precious, these organizations view events as a usable tool that helps them learn more about user behavior.
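The enforcement step described above—ingesting only events that match a pre-agreed plan—is what Amplitude Govern provides; the standalone sketch below only illustrates the idea, with an invented plan and property names:

```python
# Hypothetical planning gate: a spec of allowed events and their required
# properties, applied at ingest. Event names and properties are assumptions
# for illustration; this is not Amplitude Govern's actual interface.
PLAN = {
    "complete purchase": {"required": {"order_id", "revenue"}},
    "open settings": {"required": set()},
}

def validate(event: dict):
    """Accept an event only if it is planned and carries its required properties."""
    spec = PLAN.get(event["event_type"])
    if spec is None:
        return False, "event not in plan"
    missing = spec["required"] - set(event.get("properties", {}))
    if missing:
        return False, f"missing properties: {sorted(missing)}"
    return True, "ok"

ok, reason = validate({"event_type": "complete purchase",
                       "properties": {"order_id": "A1"}})
print(ok, reason)  # False missing properties: ['revenue']
```

Rejected events go back to engineering for changes, which is exactly the feedback loop the planning workflow depends on.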

These more mature organizations still do some level of cleanup/correction, although as with the approval stage, it serves a somewhat different purpose. Since these organizations have a lot of events, they typically need to cull them. Events that were useful in the past may no longer be needed, or in the interest of reducing costs, the team may block events that turned out to be less useful in the long term. They may still merge or rename events, but this is typically done as taxonomy standards evolve rather than in response to specific mistakes. Likewise, the analytics team may still do a limited amount of event approval, although it is likely that all events will be planned.


Overall, these three data management workflows are appropriate for teams at different levels of maturity. Notably, the data management dysfunctions (anarchy and dictatorship) don’t come purely from using a particular strategy, but from using that strategy at the wrong time. Anarchy typically results from using a cleanup/correction strategy (or even an approval strategy) too late in the process, while dictatorship can come from using planning too early (typically before an organization has seen the value of product analytics).

In the prior posts of this three-part series, we discussed the four truths of data management and three keys to data functionality. Now, with this maturity model, you have a fuller understanding of how to plan events and make data more usable, accurate, and comprehensive for your entire organization.

The next step is to implement these strategies, and start getting more value from your data.

About the author
Dana Levine

Former Product Manager, Amplitude

Dana is a former Product Manager at Amplitude. He enjoys building products that make people’s lives easier and using data to answer interesting questions. He has started two companies, worked at a wide variety of startups and larger corporations, and studied Neuroscience and Bioengineering at MIT.
