
Commit e41612b

URL fixes
1 parent 3b9cedd commit e41612b

File tree

1 file changed (+39, -43 lines)

contents/blog/how-to-find-and-fix-onboarding-drop-off.mdx

Lines changed: 39 additions & 43 deletions
@@ -52,44 +52,44 @@ Onboarding drop-offs are usually an indication of friction, not rejection. Users
 You don't need a massive stack for this recipe.

 **Required:**
-- [A product analytics tool](https://posthog.com/blog/best-open-source-analytics-tools) (for building your onboarding funnel)
-- [A session replay tool](https://posthog.com/blog/best-session-replay-tools) (for watching what users actually do)
+- [A product analytics tool](/blog/best-open-source-analytics-tools) (for building your onboarding funnel)
+- [A session replay tool](/blog/best-session-replay-tools) (for watching what users actually do)

 *Optional, but recommended:*
-- [Autocapture](https://posthog.com/docs/product-analytics/autocapture) (for skipping manual event setup)
-- [Cohorts or segmentation](https://posthog.com/docs/data/cohorts) (for slicing data by user type)
-- [Feature flags](https://posthog.com/feature-flags) (for safely rolling out fixes)
-- [Experiments](https://posthog.com/experiments) (for measuring your changes)
-- [Surveys](https://posthog.com/surveys) (for getting direct feedback from users)
+- [Autocapture](/docs/product-analytics/autocapture) (for skipping manual event setup)
+- [Cohorts or segmentation](/docs/data/cohorts) (for slicing data by user type)
+- [Feature flags](/feature-flags) (for safely rolling out fixes)
+- [Experiments](/experiments) (for measuring your changes)
+- [Surveys](/surveys) (for getting direct feedback from users)

 You'll also need:
-- **A clear definition of what "successful onboarding" looks like**. This is [your activation event](https://posthog.com/newsletter/wtf-is-activation) – the moment a user has gotten enough value to stick around (more on this below).
+- **A clear definition of what "successful onboarding" looks like**. This is [your activation event](/newsletter/wtf-is-activation) – the moment a user has gotten enough value to stick around (more on this below).
 - **At least a few hundred users going through your flow**. You need enough data to spot patterns. If you're at an earlier stage, you can still follow this recipe; just watch more replays and lean harder on qualitative signals until your numbers catch up.

 <details>
 <summary><strong>If using PostHog...</strong></summary>

-You'll need it installed in your product and receiving data. If you haven't set this up yet, [start here](https://posthog.com/docs/getting-started/install). Make sure you're capturing the key events in your onboarding flow (signups, form completions, button clicks, etc.). If you have autocapture enabled, you're probably already covered.
+You'll need it installed in your product and receiving data. If you haven't set this up yet, [start here](/docs/getting-started/install). Make sure you're capturing the key events in your onboarding flow (signups, form completions, button clicks, etc.). If you have autocapture enabled, you're probably already covered.

-We highly recommend calling [`posthog.identify()`](https://posthog.com/docs/product-analytics/identify) when users sign up or log in; you'll be able to track them across sessions and devices, which makes your funnel data much more reliable.
+We highly recommend calling [`posthog.identify()`](/docs/product-analytics/identify) when users sign up or log in; you'll be able to track them across sessions and devices, which makes your funnel data much more reliable.

 </details>

 ### Substitutions

-**No session replay tool?** You can interview users directly instead, but you'll be relying on their memory of what happened rather than what actually happened. It works, it's just slower and less reliable. [Here's our guide to running effective user interviews](https://posthog.com/blog/10x-engineers-do-user-interviews) if you go this route.
+**No session replay tool?** You can interview users directly instead, but you'll be relying on their memory of what happened rather than what actually happened. It works, it's just slower and less reliable. [Here's our guide to running effective user interviews](/blog/10x-engineers-do-user-interviews) if you go this route.

 **No surveys?** It's okay, replays will get you 80% of the way there; the other 20% is context you'll have to infer.

-**No feature flags?** You can ship straight to production. We won't judge. (...we will judge a little.) If something breaks, you'll just have to roll back manually. [Here's why we think feature flags are worth it](https://posthog.com/blog/feature-flag-benefits-use-cases).
+**No feature flags?** You can ship straight to production. We won't judge. (...we will judge a little.) If something breaks, you'll just have to roll back manually. [Here's why we think feature flags are worth it](/blog/feature-flag-benefits-use-cases).

 **No cohorts or segmentation?** You can still run this recipe, you'll just be looking at all users as one group. If your drop-off is consistent across everyone, that's fine. If it's not, you'll have a harder time figuring out who's actually struggling.

 ---

 ## Step 1: Prep the finish line

-Before you can fix your onboarding, you need to [define what a successful onboarding flow actually means](https://posthog.com/docs/new-to-posthog/activation). This is your activation event, the thing you'll measure everything against.
+Before you can fix your onboarding, you need to [define what a successful onboarding flow actually means](/docs/new-to-posthog/activation). This is your activation event, the thing you'll measure everything against.

 Ask yourself: What's the moment when a user has gotten enough value that they're likely to stick around? What you're looking for is a **value-producing action**.
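For context on the `posthog.identify()` recommendation in the hunk above, here's a minimal sketch of what that call might look like in a sign-up handler, assuming the `posthog-js` browser SDK; the handler name, event name, and user properties are illustrative and not part of the post:

```ts
import posthog from 'posthog-js'

// Hypothetical sign-up handler – wire this into your own auth flow.
// Assumes posthog.init(...) has already run elsewhere.
function onSignupComplete(user: { id: string; email: string }) {
  // Links pre-signup anonymous events to the real user, so the same person
  // is tracked across sessions and devices in your funnel.
  posthog.identify(user.id, { email: user.email })

  // A named event for the first step of the onboarding funnel.
  posthog.capture('signup_completed')
}
```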

@@ -102,9 +102,9 @@ Some examples:
 - **For an analytics product**: Sent their first event and created an insight
 - **For a CRM**: Added their first contact and sent an email

-Not sure what yours is? Try looking at your retention data – [what do retained users do that churned users don't?](product-engineers/customer-retention-metrics) That should give you a starting point.
+Not sure what yours is? Try looking at your retention data – [what do retained users do that churned users don't?](/product-engineers/customer-retention-metrics) That should give you a starting point.

-Here's [how we figured out our activation metric at PostHog](https://posthog.com/product-engineers/activation-metrics) (spoiler: it took a few iterations).
+Here's [how we figured out our activation metric at PostHog](/product-engineers/activation-metrics) (spoiler: it took a few iterations).

 Pick one. Be opinionated. You can always adjust later.
@@ -135,12 +135,9 @@ A few tips:
 - Use sequential order (the default) so users must complete steps in the order you've defined.
 - Set a reasonable conversion window (7 to 14 days is a good starting point).

-<details>
-<summary><strong>If using PostHog...</strong></summary>
-
-Head to [Product Analytics](https://app.posthog.com/insights) → **New insight** → [**Funnel**](https://posthog.com/docs/product-analytics/funnels). If you have [autocapture](https://posthog.com/docs/product-analytics/autocapture) enabled, many of these events may already be tracked for you; check your [activity](https://app.posthog.com/events) to see what's coming in.
+If using PostHog...

-</details>
+Head to [Product Analytics](https://app.posthog.com/insights) → **New insight** → [**Funnel**](/docs/product-analytics/funnels). If you have [autocapture](/docs/product-analytics/autocapture) enabled, many of these events may already be tracked for you; check your [activity](https://app.posthog.com/events) to see what's coming in.

 > 👨‍🍳 *Chef's tip: Start with your core flow. Once it's optimized, create separate funnels for specific segments. Don't forget to name your funnel something specific (e.g., "Self-serve onboarding Q1 2025") so future-you knows what it's measuring when you have 47 funnels.*
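If autocapture doesn't already cover every step, a hedged sketch of manually instrumenting the events that become funnel steps, assuming `posthog-js`; the event names are placeholders:

```ts
import posthog from 'posthog-js'

// One named event per onboarding step – call each at the point that step
// completes, then use them as the steps when building the funnel.
posthog.capture('onboarding_signup_completed')
posthog.capture('onboarding_project_created')
posthog.capture('onboarding_first_insight_created')
```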
@@ -158,11 +155,11 @@ Save your funnel and let data collect for at least a week. You need enough users

 As a rough guideline:
 - A few hundred users entering onboarding is usually enough to start
-- More is better if you plan to [segment by user type or device later](https://posthog.com/blog/how-to-do-user-segmentation)
+- More is better if you plan to [segment by user type or device later](/blog/how-to-do-user-segmentation)

 While you wait, you can:
-- [Set up session replay, if you haven't already](https://posthog.com/docs/session-replay)
-- [Prep an exit survey for later](https://posthog.com/docs/surveys/creating-surveys) (we'll use it in Step 6)
+- [Set up session replay, if you haven't already](/docs/session-replay)
+- [Prep an exit survey for later](/docs/surveys/creating-surveys) (we'll use it in Step 6)

 > 👨‍🍳 *Chef's tip: Resist the urge to peek daily. Set a calendar reminder for one week out – watching the pot won't make it boil faster.*
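A minimal initialization sketch for getting session replay collecting while you wait, assuming `posthog-js`; the API key is a placeholder and the `api_host` assumes a US Cloud project:

```ts
import posthog from 'posthog-js'

// '<ph_project_api_key>' is a placeholder – use your project's key.
posthog.init('<ph_project_api_key>', {
  api_host: 'https://us.i.posthog.com', // or your EU / self-hosted host
  // Replay itself is toggled in the project settings; this just makes sure
  // the SDK isn't opted out of recording on its side.
  disable_session_recording: false,
})
```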
@@ -192,9 +189,9 @@ If it's a context problem, try segmenting your funnel to find a clearer diagnosi

 **If using PostHog:**

-Click on the drop-off number in your funnel to see the actual users who didn't make it. You can save these as a [cohort](https://posthog.com/docs/data/cohorts) for further analysis – useful for targeting with surveys later, or watching their replays in Step 5.
+Click on the drop-off number in your funnel to see the actual users who didn't make it. You can save these as a [cohort](/docs/data/cohorts) for further analysis – useful for targeting with surveys later, or watching their replays in Step 5.

-Use [breakdowns](https://posthog.com/docs/product-analytics/funnels#breakdowns) to slice your funnel by user properties, device, or any event property.
+Use [breakdowns](/docs/product-analytics/funnels#breakdowns) to slice your funnel by user properties, device, or any event property.

 > 👨‍🍳 *Chef's tip: Export your cohort of dropped-off users. They're useful for more than just replays – you can target them with win-back marketing campaigns or surveys later.*
@@ -208,22 +205,22 @@ You'll be able to say: "Users drop off most at *[this step]*" and "It affects *[

 ## Step 5: Watch the replays

-You know where users drop off, [now you need to find out *why*](https://posthog.com/tutorials/explore-insights-session-recordings).
+You know where users drop off, [now you need to find out *why*](/tutorials/explore-insights-session-recordings).

 Watch 10–15 recordings. You're looking for patterns:
 - Are users getting confused at a specific UI element?
 - Are they rage-clicking something that doesn't work?
 - Are they abandoning after seeing a specific screen (pricing, permissions request, etc.)?
-- Are they hitting errors? (Check the console logs in the replay; PostHog [captures these](https://posthog.com/docs/session-replay/console-log-recording) too.)
-- Is something failing quietly? Look at network requests if you have [network recording](https://posthog.com/docs/session-replay/network-recording) enabled.
+- Are they hitting errors? (Check the console logs in the replay; PostHog [captures these](/docs/session-replay/console-log-recording) too.)
+- Is something failing quietly? Look at network requests if you have [network recording](/docs/session-replay/network-recording) enabled.

 If recordings are looking wildly different from one user to the next, go back to Step 4 and segment further; you're probably mixing multiple problems together.

-**If using PostHog**
+**If using PostHog...**

 - Fastest way: Click directly on the drop-off in your funnel – PostHog will pull up recordings for those users automatically. (This is one of the nice things about having replay and analytics in one tool!)

-- Manual way: In [Session Replay](https://app.posthog.com/replay), click **Show filters** → **Filter for events or actions** → select the last event users completed before dropping off. ([Session replay filtering guide](https://posthog.com/tutorials/filter-session-recordings))
+- Manual way: In [Session Replay](https://app.posthog.com/replay), click **Show filters** → **Filter for events or actions** → select the last event users completed before dropping off. ([Session replay filtering guide](/tutorials/filter-session-recordings))

 If you saved a cohort in Step 4, you can filter replays by that cohort directly.
@@ -259,7 +256,7 @@ Trigger this right after your activation event fires; they'll remember while it'

 **If using PostHog...**

-Go to [Surveys](https://app.posthog.com/surveys) → [**New survey**](https://posthog.com/docs/surveys/creating-surveys).
+Go to [Surveys](https://app.posthog.com/surveys) → [**New survey**](/docs/surveys/creating-surveys).

 You can set display conditions based on URL, user properties, or events. For the completion survey, trigger it when your activation event fires.
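For the event-based display condition to fire, the activation event has to actually be captured from your app. A small sketch, assuming `posthog-js` and a placeholder event name:

```ts
import posthog from 'posthog-js'

// Placeholder event name – fire this at the moment a user produces real value,
// then point the survey's display condition at the same event.
posthog.capture('first_insight_created', { source: 'onboarding' })
```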

@@ -300,25 +297,25 @@ You've got one fix in the oven, designed to address your hypothesis.

 ## Step 8: Taste before serving *(optional, but recommended)*

-Whatever the change, don't dump it straight into production; roll it out gradually if you can [**using feature flags**](https://posthog.com/blog/best-open-source-feature-flag-tools).
+Whatever the change, don't dump it straight into production; roll it out gradually if you can [**using feature flags**](/blog/best-open-source-feature-flag-tools).

 Feature flags let you:
 - Release to 10–20% of users first, then ramp up
 - Target specific user segments (e.g., new users only)
 - Kill the change instantly if something goes wrong

-Want statistical proof it worked? [Run an A/B experiment](https://posthog.com/product-engineers/how-to-do-ab-testing). This is optional, [not every fix needs one](https://newsletter.posthog.com/p/ab-testing-mistakes-i-learned-the). But it's worth it when:
+Want statistical proof it worked? [Run an A/B experiment](/product-engineers/how-to-do-ab-testing). This is optional, [not every fix needs one](https://newsletter.posthog.com/p/ab-testing-mistakes-i-learned-the). But it's worth it when:
 - The change is significant (like a full flow redesign)
 - You're debating between multiple solutions
 - You need to convince stakeholders with data

-While this step isn't mandatory, [it helps avoid "we think this worked" decisions](https://posthog.com/newsletter/what-we've-learned-about-ab-testing).
+While this step isn't mandatory, [it helps avoid "we think this worked" decisions](/newsletter/what-we've-learned-about-ab-testing).

 **If using PostHog...**

-Use [feature flags](https://posthog.com/docs/feature-flags) to roll out your fix to a percentage of users first.
+Use [feature flags](/docs/feature-flags) to roll out your fix to a percentage of users first.

-To run an experiment, go to [Experiments](https://app.posthog.com/experiments) → [**New experiment**](https://posthog.com/docs/experiments/creating-an-experiment). Use your feature flag as the basis – PostHog will split users into control and test groups and track your funnel as the goal metric.
+To run an experiment, go to [Experiments](https://app.posthog.com/experiments) → [**New experiment**](/docs/experiments/creating-an-experiment). Use your feature flag as the basis – PostHog will split users into control and test groups and track your funnel as the goal metric.

 > 👨‍🍳 *Chef's tip: If you're nervous about a big change, start at 5% rollout. You can always ramp up, but you can't un-serve a burnt dish. Also, not every fix needs a full experiment – but if it's a big change, or you need to convince stakeholders, statistical proof is worth the extra time.*
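As a sketch of the gradual rollout described above, gating the fix behind a flag with `posthog-js`; the `new-onboarding-flow` flag key and the render functions are hypothetical:

```ts
import posthog from 'posthog-js'

// Hypothetical render functions standing in for your real onboarding UIs.
declare function renderNewOnboarding(): void
declare function renderCurrentOnboarding(): void

// Flags load asynchronously, so wait for them before branching.
posthog.onFeatureFlags(() => {
  // Gate the fix so you can ramp from 10–20% of users to 100%,
  // or kill it instantly if something breaks.
  if (posthog.isFeatureEnabled('new-onboarding-flow')) {
    renderNewOnboarding()
  } else {
    renderCurrentOnboarding()
  }

  // For a multivariate experiment, read the assigned variant instead:
  // const variant = posthog.getFeatureFlag('new-onboarding-flow') // e.g. 'control' | 'test'
})
```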
@@ -350,7 +347,6 @@ If you used a feature flag or experiment, check the results in [Experiments](htt

 > 👨‍🍳 *Chef's tip: Screenshot your before/after funnels. They make great artifacts for retros, stakeholder updates, and convincing your team that this stuff actually works.*

-
 <CalloutBox icon="IconInfo" title="You're done when..." type="fyi">

 Your fix worked. If so, **congrats, you've improved activation!**
@@ -368,7 +364,7 @@ If it didn't, that's okay. Go back to Step 4 with a new hypothesis. Onboarding o

 **Onboarding drop-off** is when users start your onboarding flow but leave before completing it. They signed up and showed intent, but never reached the point where they experienced real value from your product (your activation event).

-Onboarding drop-off is [not the same as churn](tutorials/churn-rate) – these users never really started using your product in the first place.
+Onboarding drop-off is [not the same as churn](/tutorials/churn-rate) – these users never really started using your product in the first place.
 </details>

 <details>
@@ -426,8 +422,8 @@ Roll out changes gradually so you can measure impact and roll back if needed.
 <summary><strong>What tools do I need to track and fix onboarding drop-offs?</strong></summary>

 At a minimum, you need:
-- [**Product analytics**](blog/best-open-source-analytics-tools) to build funnels and see where users drop off
-- [**Session replay**](blog/best-session-replay-tools) to understand what users were doing before they left
+- [**Product analytics**](/blog/best-open-source-analytics-tools) to build funnels and see where users drop off
+- [**Session replay**](/blog/best-session-replay-tools) to understand what users were doing before they left

 You can assemble this with separate tools (for example, analytics, replay, feature flags, and experiments), but that usually means more setup, more context switching, and slower iteration.
@@ -466,10 +462,10 @@ With PostHog, [everything's connected in one place](/products):

 ### Pairs well with

-- [The AARRR pirate funnel explained](https://posthog.com/product-engineers/aarrr-pirate-funnel)
+- [The AARRR pirate funnel explained](/product-engineers/aarrr-pirate-funnel)
 - [10 things we've learned about A/B testing](https://newsletter.posthog.com/p/10-things-weve-learned-about-ab-testing)
-- [How to think like a growth engineer](https://posthog.com/newsletter/think-like-a-growth-engineer)
-- [What we've learned about talking to users](https://posthog.com/blog/product-for-engineers-1)
+- [How to think like a growth engineer](/newsletter/think-like-a-growth-engineer)
+- [What we've learned about talking to users](/blog/product-for-engineers-1)
 - [50 things we've learned about building successful products](https://newsletter.posthog.com/p/50-things-weve-learned-about-building)

 ---

0 commit comments
