- [Cohorts or segmentation](/docs/data/cohorts) (for slicing data by user type)
- [Feature flags](/feature-flags) (for safely rolling out fixes)
- [Experiments](/experiments) (for measuring your changes)
- [Surveys](/surveys) (for getting direct feedback from users)

You'll also need:

- **A clear definition of what "successful onboarding" looks like**. This is [your activation event](/newsletter/wtf-is-activation) – the moment a user has gotten enough value to stick around (more on this below).
- **At least a few hundred users going through your flow**. You need enough data to spot patterns. If you're at an earlier stage, you can still follow this recipe; just watch more replays and lean harder on qualitative signals until your numbers catch up.

<details>
<summary><strong>If using PostHog...</strong></summary>

You'll need it installed in your product and receiving data. If you haven't set this up yet, [start here](/docs/getting-started/install). Make sure you're capturing the key events in your onboarding flow (signups, form completions, button clicks, etc.). If you have autocapture enabled, you're probably already covered.

We highly recommend calling [`posthog.identify()`](/docs/product-analytics/identify) when users sign up or log in; you'll be able to track them across sessions and devices, which makes your funnel data much more reliable.
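
A minimal sketch of what that looks like with [posthog-js](/docs/libraries/js) – the API key, host, and user fields here are placeholders:

```ts
import posthog from 'posthog-js'

// Initialize once, early in your app. Autocapture is on by default,
// so clicks and pageviews in your onboarding flow get captured automatically.
posthog.init('<your_project_api_key>', { api_host: 'https://us.i.posthog.com' })

// Call identify on signup or login so anonymous events and later sessions
// are tied to the same person – this is what keeps your funnel data reliable.
function onSignup(user: { id: string; email: string; plan: string }) {
  posthog.identify(user.id, { email: user.email, plan: user.plan })
}
```
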
</details>

### Substitutions

**No session replay tool?** You can interview users directly instead, but you'll be relying on their memory of what happened rather than what actually happened. It works, it's just slower and less reliable. [Here's our guide to running effective user interviews](/blog/10x-engineers-do-user-interviews) if you go this route.

**No surveys?** It's okay, replays will get you 80% of the way there; the other 20% is context you'll have to infer.

**No feature flags?** You can ship straight to production. We won't judge. (...we will judge a little.) If something breaks, you'll just have to roll back manually. [Here's why we think feature flags are worth it](/blog/feature-flag-benefits-use-cases).

**No cohorts or segmentation?** You can still run this recipe; you'll just be looking at all users as one group. If your drop-off is consistent across everyone, that's fine. If it's not, you'll have a harder time figuring out who's actually struggling.

---
## Step 1: Prep the finish line

Before you can fix your onboarding, you need to [define what a successful onboarding flow actually means](/docs/new-to-posthog/activation). This is your activation event, the thing you'll measure everything against.

Ask yourself: What's the moment when a user has gotten enough value that they're likely to stick around? What you're looking for is a **value-producing action**.

Some examples:

- **For an analytics product**: Sent their first event and created an insight
- **For a CRM**: Added their first contact and sent an email

Not sure what yours is? Try looking at your retention data – [what do retained users do that churned users don't?](/product-engineers/customer-retention-metrics) That should give you a starting point.

Here's [how we figured out our activation metric at PostHog](/product-engineers/activation-metrics) (spoiler: it took a few iterations).

Pick one. Be opinionated. You can always adjust later.
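
Whatever you pick, make sure that moment is captured as an explicit event you can build insights against. A rough sketch with posthog-js – the event name and properties here are invented for illustration:

```ts
import posthog from 'posthog-js'

// Fire a dedicated event the moment the user completes the
// value-producing action, with properties you can segment on later.
function onFirstInsightCreated(insightId: string, startedFromTemplate: boolean) {
  posthog.capture('activation: created first insight', {
    insight_id: insightId,
    started_from_template: startedFromTemplate,
  })
}
```
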

A few tips:

- Use sequential order (the default) so users must complete steps in the order you've defined.
- Set a reasonable conversion window (7 to 14 days is a good starting point).

**If using PostHog...**

Head to [Product Analytics](https://app.posthog.com/insights) → **New insight** → [**Funnel**](/docs/product-analytics/funnels). If you have [autocapture](/docs/product-analytics/autocapture) enabled, many of these events may already be tracked for you; check your [activity](https://app.posthog.com/events) to see what's coming in.
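
If you'd rather build the funnel on explicitly named events than on autocaptured clicks, here's a sketch of what that instrumentation might look like – the event and property names are examples, not a prescribed schema:

```ts
import posthog from 'posthog-js'

// One clearly named event per onboarding step keeps the funnel easy to read.
// Each of these becomes a step in the funnel insight, and shared properties
// let you break the funnel down by segment later (Step 4).
posthog.capture('onboarding signup completed', { signup_method: 'email' })
posthog.capture('onboarding profile completed', { role: 'engineer' })
posthog.capture('onboarding first project created', { used_template: true })
```
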
> 👨🍳 *Chef's tip: Start with your core flow. Once it's optimized, create separate funnels for specific segments. Don't forget to name your funnel something specific (e.g., "Self-serve onboarding Q1 2025") so future-you knows what it's measuring when you have 47 funnels.*

Save your funnel and let data collect for at least a week. You need enough users to spot patterns.

As a rough guideline:

- A few hundred users entering onboarding is usually enough to start
- More is better if you plan to [segment by user type or device later](/blog/how-to-do-user-segmentation)

While you wait, you can:

- [Set up session replay, if you haven't already](/docs/session-replay)
- [Prep an exit survey for later](/docs/surveys/creating-surveys) (we'll use it in Step 6)

> 👨🍳 *Chef's tip: Resist the urge to peek daily. Set a calendar reminder for one week out – watching the pot won't make it boil faster.*

If it's a context problem, try segmenting your funnel to find a clearer diagnosis.

**If using PostHog...**

Click on the drop-off number in your funnel to see the actual users who didn't make it. You can save these as a [cohort](/docs/data/cohorts) for further analysis – useful for targeting with surveys later, or watching their replays in Step 5.

Use [breakdowns](/docs/product-analytics/funnels#breakdowns) to slice your funnel by user properties, device, or any event property.

> 👨🍳 *Chef's tip: Export your cohort of dropped-off users. They're useful for more than just replays – you can target them with win-back marketing campaigns or surveys later.*

You'll be able to say: "Users drop off most at *[this step]*."

## Step 5: Watch the replays

You know where users drop off, [now you need to find out *why*](/tutorials/explore-insights-session-recordings).

Watch 10–15 recordings. You're looking for patterns:
- Are users getting confused at a specific UI element?
- Are they rage-clicking something that doesn't work?
- Are they abandoning after seeing a specific screen (pricing, permissions request, etc.)?
- Are they hitting errors? (Check the console logs in the replay; PostHog [captures these](/docs/session-replay/console-log-recording) too – see the config sketch after this list.)
- Is something failing quietly? Look at network requests if you have [network recording](/docs/session-replay/network-recording) enabled.
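
If console logs aren't showing up in your replays, they likely need to be switched on in your posthog-js config. A sketch, assuming session replay is already enabled for your project – double-check the option name against the current session replay docs:

```ts
import posthog from 'posthog-js'

// Capture console output alongside recordings, so the errors users hit
// show up in the replay timeline. (Session replay itself is toggled in
// your project settings; this only adds console logs to it.)
posthog.init('<your_project_api_key>', {
  api_host: 'https://us.i.posthog.com',
  enable_recording_console_log: true,
})
```
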
If recordings look wildly different from one user to the next, go back to Step 4 and segment further; you're probably mixing multiple problems together.

**If using PostHog...**

- Fastest way: Click directly on the drop-off in your funnel – PostHog will pull up recordings for those users automatically. (This is one of the nice things about having replay and analytics in one tool!)
- Manual way: In [Session Replay](https://app.posthog.com/replay), click **Show filters** → **Filter for events or actions** → select the last event users completed before dropping off. ([Session replay filtering guide](/tutorials/filter-session-recordings))

If you saved a cohort in Step 4, you can filter replays by that cohort directly.

Trigger this right after your activation event fires; they'll remember while it's fresh.

**If using PostHog...**

Go to [Surveys](https://app.posthog.com/surveys) → [**New survey**](/docs/surveys/creating-surveys).

You can set display conditions based on URL, user properties, or events. For the completion survey, trigger it when your activation event fires.

You've got one fix in the oven, designed to address your hypothesis.

## Step 8: Taste before serving *(optional, but recommended)*

Whatever the change, don't dump it straight into production; roll it out gradually if you can, [**using feature flags**](/blog/best-open-source-feature-flag-tools). (A gating sketch follows the list below.)

Feature flags let you:
- Release to 10–20% of users first, then ramp up
- Target specific user segments (e.g., new users only)
- Kill the change instantly if something goes wrong
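
For example, a minimal sketch of gating the revised flow behind a posthog-js flag – the flag key `new-onboarding-flow` and the render functions are hypothetical:

```ts
import posthog from 'posthog-js'

// Stand-ins for your real UI code.
const renderNewOnboarding = () => { /* the revised flow */ }
const renderCurrentOnboarding = () => { /* the existing flow */ }

// Flags load asynchronously, so wait for them before branching.
posthog.onFeatureFlags(() => {
  if (posthog.isFeatureEnabled('new-onboarding-flow')) {
    renderNewOnboarding() // served to whatever % you set in the flag's release conditions
  } else {
    renderCurrentOnboarding() // everyone else keeps what they have today
  }
})
```

If the rollout goes sideways, flipping the flag off in PostHog puts everyone back on the old flow without a deploy.
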

Want statistical proof it worked? [Run an A/B experiment](/product-engineers/how-to-do-ab-testing). This is optional; [not every fix needs one](https://newsletter.posthog.com/p/ab-testing-mistakes-i-learned-the). But it's worth it when:

- The change is significant (like a full flow redesign)
- You're debating between multiple solutions
- You need to convince stakeholders with data

While this step isn't mandatory, [it helps avoid "we think this worked" decisions](/newsletter/what-we've-learned-about-ab-testing).

**If using PostHog...**

Use [feature flags](/docs/feature-flags) to roll out your fix to a percentage of users first.

To run an experiment, go to [Experiments](https://app.posthog.com/experiments) → [**New experiment**](/docs/experiments/creating-an-experiment). Use your feature flag as the basis – PostHog will split users into control and test groups and track your funnel as the goal metric.
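
On the client, you branch on the experiment's feature flag and the variant it returns. A sketch assuming a flag named `onboarding-experiment` with `control` and `test` variants (both names are made up):

```ts
import posthog from 'posthog-js'

// Stand-ins for your real flow components.
const showRevisedFlow = () => { /* the fix you're testing */ }
const showExistingFlow = () => { /* the current onboarding */ }

posthog.onFeatureFlags(() => {
  // For an experiment, the flag returns the variant this user was assigned to.
  const variant = posthog.getFeatureFlag('onboarding-experiment')

  if (variant === 'test') {
    showRevisedFlow() // PostHog compares funnel conversion for this group vs. control
  } else {
    showExistingFlow() // 'control', or flag not loaded / user excluded
  }
})
```
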
> 👨🍳 *Chef's tip: If you're nervous about a big change, start at 5% rollout. You can always ramp up, but you can't un-serve a burnt dish. Also, not every fix needs a full experiment – but if it's a big change, or you need to convince stakeholders, statistical proof is worth the extra time.*

If you used a feature flag or experiment, check the results in [Experiments](https://app.posthog.com/experiments).

> 👨🍳 *Chef's tip: Screenshot your before/after funnels. They make great artifacts for retros, stakeholder updates, and convincing your team that this stuff actually works.*

Did your fix work? If so, **congrats, you've improved activation!**

If it didn't, that's okay. Go back to Step 4 with a new hypothesis.

**Onboarding drop-off** is when users start your onboarding flow but leave before completing it. They signed up and showed intent, but never reached the point where they experienced real value from your product (your activation event).

Onboarding drop-off is [not the same as churn](/tutorials/churn-rate) – these users never really started using your product in the first place.

</details>
<details>
<summary><strong>What tools do I need to track and fix onboarding drop-offs?</strong></summary>

At a minimum, you need:

- [**Product analytics**](/blog/best-open-source-analytics-tools) to build funnels and see where users drop off
- [**Session replay**](/blog/best-session-replay-tools) to understand what users were doing before they left

You can assemble this with separate tools (for example, analytics, replay, feature flags, and experiments), but that usually means more setup, more context switching, and slower iteration.

With PostHog, [everything's connected in one place](/products).