32 changes: 29 additions & 3 deletions contents/handbook/engineering/development-process.md
@@ -264,11 +264,31 @@ Engineers should apply the following best practices for _all_ new releases:
- Ensure docs are updated to reflect the new release.
- Ensure all new features include at least one pre-made template (or equivalent) for users.

### Self-hosted and hobby versions
### When to A/B test

We have [sunset support for our kubernetes and helm chart managed self-hosted offering](/blog/sunsetting-helm-support-posthog). This means we no longer offer support for fixing to specific versions of PostHog. A [docker image is pushed for each commit to master](https://hub.docker.com/r/posthog/posthog). Each of those versions is immediately deployed to PostHog Cloud.
There are two broad categories of things we A/B test:

The [deploy-hobby script](https://github.com/PostHog/posthog/blob/master/bin/deploy-hobby) allows you to set a `POSTHOG_APP_TAG` environment variable and fix your docker-compose deployed version of PostHog. Or you can edit your docker-compose file to replace each instance of `image: posthog/posthog:$POSTHOG_APP_TAG` with a specific tag e.g. `image: posthog/posthog:9c68581779c78489cfe737cfa965b73f7fc5503c`
- Changes intended to move a metric (e.g. changing CTAs to see if that improves click-through)
- Changes that could impact large swaths of users and their behavior, to make sure there is no negative impact (e.g. moving all items in the left nav into a drawer)

The former is an optimization scheme; the latter is _required_ to make sure we don't break things. Just like we create tests in our codebase to make sure new code doesn't disrupt existing features, we also need to do behavioral testing to make sure our new features aren't disrupting existing user behaviors.

A/B tests make sense when:

- There is sufficient traffic to give results in 1-2 weeks
- The change isn't simply adding a new feature (e.g. adding a totally new feature and A/B testing whether people use it isn't exactly informative, though you _should_ be looking at metrics for features you ship to see if anyone uses them)
  - If the feature is designed to improve some other metric like retention or stickiness, then test away!
- The change impacts user behavior (e.g. most backend changes should have code tests, not behavioral A/B tests)

If you're not sure whether something should be A/B tested, run one anyway. Feature flags (which experiments run on top of) are a great kill-switch for rolling back features in case something goes sideways. And it's always nice to know how your changes move the numbers!
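
A minimal sketch of what that looks like in practice with posthog-js, assuming a made-up flag key (`new-cta-experiment`), event name (`cta_clicked`), and render helpers:

```js
import posthog from 'posthog-js'

// Placeholder helpers: stand-ins for whatever your change actually renders.
function renderNewCta() { /* variant UI */ }
function renderOldCta() { /* existing UI */ }

posthog.init('<project_api_key>', { api_host: 'https://us.i.posthog.com' })

// Flags load asynchronously, so wait for them before branching.
posthog.onFeatureFlags(() => {
  // For a multivariate experiment, getFeatureFlag returns the variant key.
  if (posthog.getFeatureFlag('new-cta-experiment') === 'test') {
    renderNewCta()
  } else {
    // Control, and also what everyone falls back to if the flag is switched off.
    renderOldCta()
  }
})

// Goal metrics are usually built on events you capture yourself.
document.querySelector('#signup-cta')?.addEventListener('click', () => {
  posthog.capture('cta_clicked')
})
```

Rolling the flag out to 0% in PostHog sends everyone down the control branch, which is exactly the kill-switch behavior described above.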

It's easy to think "this makes more sense, let's just roll it out." Sometimes that's okay, sometimes that's being arrogant. We obviously can't and shouldn't test everything, but running A/B tests frequently gets you comfortable with being wrong, which is a _very_ handy skill to have.

#### A/B test metrics

Experiment design is a bit of an art. There are different types of [metrics](/docs/experiments/metrics) you can use in PostHog experiments. Another benefit of running experiments is forcing yourself to think through what other things your change might impact, which often doesn't happen in the regular release cycle!
@danisaza commented on Dec 19, 2025: Heads up: there's a typo here (experimeints -> experiments). Really digging this documentation, by the way!

Generally, a good pattern is to set up 1-2 primary metrics that you anticipate might be impacted by the A/B test, as well as 3+ secondary metrics that are worth keeping an eye on, just in case. If you aren't sure which metrics to test, just ask! Lots of people are excited to help think this through (especially #team-growth and Raquel!).

### Releasing as a beta

@@ -285,3 +305,9 @@ Announcements, whether for beta or final updates, are a Marketing responsibility
In order to ensure a smooth launch [the owner](/handbook/engineering/development-process#assign-an-owner) should tell Marketing about upcoming updates as soon as possible, or include them in an All-Hands update.

It's _never_ too early to give Marketing a heads-up about something by tagging them in an issue or via the Marketing Slack channel.

### Self-hosted and hobby versions

We have [sunset support for our Kubernetes and Helm chart managed self-hosted offering](/blog/sunsetting-helm-support-posthog). This means we no longer offer support for pinning to specific versions of PostHog. A [Docker image is pushed for each commit to master](https://hub.docker.com/r/posthog/posthog). Each of those versions is immediately deployed to PostHog Cloud.

The [deploy-hobby script](https://github.com/PostHog/posthog/blob/master/bin/deploy-hobby) allows you to set a `POSTHOG_APP_TAG` environment variable and pin your docker-compose deployed version of PostHog. Or you can edit your docker-compose file to replace each instance of `image: posthog/posthog:$POSTHOG_APP_TAG` with a specific tag, e.g. `image: posthog/posthog:9c68581779c78489cfe737cfa965b73f7fc5503c`.
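
As a rough illustration (the `web` service name is an assumption; the hobby compose file uses the same image across several services), a pinned service would look something like:

```yaml
services:
  web:
    # Pinned to a specific commit's image instead of $POSTHOG_APP_TAG
    image: posthog/posthog:9c68581779c78489cfe737cfa965b73f7fc5503c
```
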
Comment on lines +309 to +313 (Member Author): this section is just moved down from above