Analytics Strategy
What Is a "Meaningful Event" — And Why Spotify, Medium, and Shopify Each Have a Different Answer

Your product generates thousands of events every day. Most of them don't tell you whether your product is actually working.
Every time a user opens your app, something gets logged. A session starts. Pages load. Buttons render. Taps register. By the end of the day, your database has thousands of rows — a precise record of everything every user did inside your product.
Most of it is noise.
Not because the data is wrong. The data is accurate. But accuracy and meaning are different things. A page loading tells you a page loaded. A button rendering tells you a button rendered. These events are technically correct, and analytically nearly useless.
The question every product team should be asking isn't "what did users do?" It's "what did users accomplish?" The difference is the difference between a vanity metric and a meaningful one.
What Makes an Event "Meaningful"?
A meaningful event is one that signals genuine value exchange — something that happened because your product delivered on its promise to the user.
It's usually correlated with the outcomes that matter most to your business: retention, LTV, NPS, or whatever your core value proposition resolves to. Users who complete a meaningful event are more likely to come back. They're more likely to refer others. They're more likely to upgrade, renew, or expand.
The key phrase is your product. There is no universal meaningful event. What constitutes genuine engagement on a social platform looks nothing like genuine engagement on a project management tool, which looks nothing like genuine engagement on a marketplace. The right meaningful event is specific to your product, your user, and your business model.
Finding it requires case study thinking. Let's look at how three very different companies define theirs.
Case Study 1: Shopify — Transactions, Not Visits
When a seller builds a store on Shopify, traffic is visible and easy to measure. Every visit gets logged. Page views accumulate. Session data flows in continuously.
But here's the question Shopify has to answer: is this store healthy?
Traffic doesn't answer that question. A store can get ten thousand visits and zero revenue. The event that actually signals value for a Shopify merchant is a completed transaction — a customer who visited, added a product to their cart, and checked out. This event represents a genuine exchange of value: a product changed hands, money moved, the merchant's business function was fulfilled.
Visits represent intent, at best. Transactions represent outcome.
The implication is practical: if Shopify wants to know whether a merchant's store is performing well — or whether a change to the platform helped or hurt merchant outcomes — the right metric is not traffic. It's conversion. Measuring sessions without measuring transactions is building a dashboard that can tell you how busy the store looks but not whether it's working.
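The conversion idea is simple to compute. Here is a minimal sketch over a hypothetical event log (event names, session IDs, and the log shape are illustrative, not Shopify's actual schema):

```python
# Hypothetical event log for one store: each row is (event_name, session_id).
events = [
    ("page_view", "s1"),
    ("page_view", "s2"), ("add_to_cart", "s2"),
    ("page_view", "s3"), ("add_to_cart", "s3"), ("checkout_completed", "s3"),
    ("page_view", "s4"),
]

sessions = {sid for _, sid in events}
converted = {sid for name, sid in events if name == "checkout_completed"}

# Conversion rate: sessions that ended in a completed transaction / all sessions.
conversion_rate = len(converted) / len(sessions)
print(f"{conversion_rate:.0%}")  # 1 of 4 sessions converted -> 25%
```

Raw session counts alone could never surface this number; only the transaction event makes it computable.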
Case Study 2: Medium — Reads, Not Views
Medium operates as a reading and writing platform. Its value proposition is getting the right articles in front of the right readers. The success of that proposition would seem simple to measure: views.
But a view on Medium just means a page loaded. It means someone followed a link from Twitter or Google, the article rendered in their browser, and their attention might have lasted anywhere from two seconds (immediate back button) to fifteen minutes (genuine reading). These are not the same event. Treating them as equivalent is the analytics equivalent of measuring a restaurant's success by counting everyone who walked past the door.
Medium's meaningful event is a read — defined by a user scrolling through a sufficient percentage of the article. A read indicates that the content actually delivered value. The user spent time with it. They engaged with the ideas.
A writer with 500 reads outperforms a writer with 5,000 views in terms of actual impact on Medium's mission. A recommendation algorithm optimized for reads surfaces genuinely good content. One optimized for views surfaces clickbait.
The distinction isn't academic. The entire editorial and algorithmic direction of the platform depends on which event it treats as meaningful.
Case Study 3: Spotify — The 30-Second Stream
Spotify's core value proposition is music discovery and listening. Every time someone presses play, a stream event fires. At scale, this produces billions of events per day.
But not all play events are equal. An accidental two-second play — a user's thumb slipped — fires the same event as a three-minute full listen. A bot streaming an artist's catalog in the background fires the same event as a genuine fan playing their favorite song.
Spotify's solution is the 30-second stream: a play event that lasts at least 30 seconds. Their data team validated that 30-second streams closely track full song listens — almost no one stops a song they're genuinely enjoying at the 30-second mark, and almost every accidental play ends well before it. The threshold is empirically derived, not arbitrary.
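Applying such a threshold is trivial once it's defined. A minimal sketch, with illustrative field names rather than Spotify's real schema:

```python
# Classify raw play events into meaningful streams with a 30-second cutoff.
THRESHOLD_SECONDS = 30

plays = [
    {"user": "a", "track": "t1", "seconds_played": 2},    # accidental tap
    {"user": "a", "track": "t2", "seconds_played": 187},  # full listen
    {"user": "b", "track": "t1", "seconds_played": 31},   # just past the cutoff
]

meaningful_streams = [p for p in plays if p["seconds_played"] >= THRESHOLD_SECONDS]
print(len(plays), len(meaningful_streams))  # 3 raw plays, 2 meaningful streams
```

The hard part is not the filter; it's validating that the cutoff actually separates genuine listening from noise.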
The 30-second stream is the event that powers some of Spotify's most consequential decisions:
Artist royalties are calculated from 30-second streams, not total plays
Recommendation algorithms weight 30-second streams as signals of genuine taste
Editorial playlists use streaming patterns to surface tracks that users actually finish
Bot detection flags listening patterns where streams cluster but 30-second thresholds aren't met
This single event definition is load-bearing infrastructure for the entire platform.
How to Find Your Meaningful Event
The process for identifying your product's meaningful event runs through four steps.
Step 1: Ask what value actually looks like
Not "what did users click?" but "what did they accomplish?" The answer depends on your product category:
Productivity tools: task or project completion
Content platforms: content consumption completion (read, watch, listen)
Marketplaces: transaction between buyer and seller
SaaS tools: successful execution of a core workflow
Social platforms: content creation or meaningful social interaction
If you can describe your product's value proposition in a sentence, the meaningful event is usually the moment that sentence becomes true for a specific user.
Step 2: Validate correlation with business outcomes
The meaningful event hypothesis should be testable. Do users who complete this event retain at a higher rate? Do they have higher LTV? Do they have better NPS scores?
The simplest validation is cohort analysis: compare 90-day retention for users who completed the candidate meaningful event versus those who didn't. If the event is genuinely meaningful, the retention curves will diverge significantly.
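The cohort check can be sketched in a few lines. The user records and field names below are toy data standing in for whatever your warehouse actually holds:

```python
# Compare 90-day retention for users who did vs. didn't complete the
# candidate meaningful event (toy data, illustrative fields).
users = [
    {"id": 1, "completed_event": True,  "retained_90d": True},
    {"id": 2, "completed_event": True,  "retained_90d": True},
    {"id": 3, "completed_event": True,  "retained_90d": False},
    {"id": 4, "completed_event": False, "retained_90d": False},
    {"id": 5, "completed_event": False, "retained_90d": True},
    {"id": 6, "completed_event": False, "retained_90d": False},
]

def retention(cohort):
    return sum(u["retained_90d"] for u in cohort) / len(cohort)

completers = [u for u in users if u["completed_event"]]
others = [u for u in users if not u["completed_event"]]

print(f"completed: {retention(completers):.0%}, did not: {retention(others):.0%}")
```

If the two retention figures are close, the candidate event is probably not the meaningful one; a wide gap is the signal you're looking for (though correlation alone doesn't prove the event causes retention).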
Step 3: Define the threshold empirically
Spotify didn't guess 30 seconds. They looked at the distribution of play lengths and found where genuine listening clustered versus accidental plays. Medium didn't guess a scroll depth percentage. They tested which thresholds correlated with re-sharing behavior and return visits.
Your threshold should come from your data. If you're defining a "completed onboarding" event, look at where users who converted to retained users were in the onboarding flow versus those who didn't. Let the signal tell you where the meaningful threshold is.
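One simple way to let the data pick the cutoff: score every candidate threshold by how well it separates plays you trust as genuine from the rest. The data and labels below are toy stand-ins for whatever ground truth you have (e.g. full listens):

```python
# Empirical threshold search: find the cutoff that best separates
# genuine listens from accidental plays in labeled data.
plays = [  # (seconds_played, genuine_listen)
    (1, False), (2, False), (3, False), (5, False), (8, False),
    (45, True), (60, True), (90, True), (120, True), (180, True),
]

def accuracy(cutoff):
    # Fraction of plays where "played >= cutoff" agrees with the label.
    return sum((sec >= cutoff) == genuine for sec, genuine in plays) / len(plays)

best = max(range(1, 121), key=accuracy)  # first cutoff with the highest accuracy
print(best, f"{accuracy(best):.0%}")
```

On real data the separation is never this clean, and you'd weigh false positives and false negatives according to what the threshold feeds (royalties vs. recommendations, say), but the principle is the same: the number comes out of the distribution, not out of a meeting.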
Step 4: Confirm it's instrumentable
A meaningful event is only useful if you can track it. Check with engineering whether the start and end events you need are already captured or need to be added. A scroll depth event, for example, requires deliberate front-end instrumentation — it won't appear automatically in most standard setups.
From Meaningful Event to Engagement Metrics
Once you've identified your meaningful event, you can calculate it several ways depending on what question you're asking:
Total meaningful events in a time window measures overall engagement volume
Meaningful events per user measures engagement intensity — how deeply is each user engaging?
Percent of users who complete the meaningful event measures adoption and breadth of engagement
Percent of sessions that include the meaningful event measures session quality
The right time window depends on your product's natural rhythm: 7 days for weekly products, 28 or 30 days for monthly products, longer for low-frequency products like travel or tax software.
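All four views can be computed from the same event log. A minimal sketch with illustrative event names:

```python
# Toy event log: each row is (user, session, event_name).
events = [
    ("u1", "s1", "article_read"), ("u1", "s2", "page_view"),
    ("u2", "s3", "article_read"), ("u2", "s3", "article_read"),
    ("u3", "s4", "page_view"),
]
MEANINGFUL = "article_read"  # the candidate meaningful event

users = {u for u, _, _ in events}
sessions = {s for _, s, _ in events}
meaningful = [(u, s) for u, s, e in events if e == MEANINGFUL]

total = len(meaningful)                                         # volume
per_user = total / len(users)                                   # intensity
pct_users = len({u for u, _ in meaningful}) / len(users)        # breadth
pct_sessions = len({s for _, s in meaningful}) / len(sessions)  # session quality

print(total, per_user, f"{pct_users:.0%}", f"{pct_sessions:.0%}")
```

Note that the four numbers can move independently: volume can rise while breadth falls (a few power users doing more), which is exactly why it helps to track more than one view.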
Meaningful event metrics outperform generic active user metrics for two specific use cases. They're ideal as A/B test success metrics because they're specific enough to reflect whether a deliberate product change actually moved the needle. And they're excellent OKR KPIs because they connect team effort to the user value the team is supposed to be creating.
Every Product Has a 30-Second Stream
The Spotify 30-second stream is a memorable example because the stakes are high — it drives royalty calculations affecting millions of artists. But the same principle applies at every scale.
Every product has a version of this: a single event that, when it happens, means your product did its job. For some products it's a transaction. For others it's a completed article, a sent message, a booked appointment, a finished workout, a filed tax return.
Finding yours requires a combination of business judgment, data validation, and honest reflection on what your product's promise actually is. The challenge is resisting the temptation to measure what's easy to measure rather than what matters.
The companies that get this right build engagement metrics that are genuinely diagnostic — that can tell the difference between a product that's working and one that just looks busy.
The full engagement metrics framework — meaningful events, DAU/WAU/MAU ratios, and HEART metrics — is covered in The Data Strategist course.