Commit 98e1200

Update configuring-experiments-v1.md (#1020)
Add a section about pausing experiments and the differences between pausing and stopping an experiment.
1 parent 2162e13 commit 98e1200

1 file changed: 44 additions, 1 deletion

docs/tools/experiments-v1/configuring-experiments-v1.md

@@ -117,6 +117,44 @@ Test users will be placed into the experiment Offering variants, but sandbox pur
 If you want to test your paywall to make sure it can handle displaying the Offerings in your experiment, you can use the [Offering Override](/dashboard-and-metrics/customer-history/offering-override) feature to choose a specific Offering to display to a user.
 :::
 
+## Pausing an experiment
+
+Once an experiment is running, you can pause it to stop enrolling new customers while continuing to collect data from existing participants. This is useful when you want to:
+
+- Evaluate the long-term impact of experiment exposure on already enrolled customers
+- Stop exposing new customers to test variants while maintaining consistent behavior for existing participants
+- Temporarily halt enrollment while analyzing preliminary results
+
+When paused:
+- New customers will no longer be enrolled in the experiment
+- Customers already enrolled will continue to see their assigned variant
+- Data collection continues for all enrolled customers
+- Results will continue to update with data from existing participants for up to 400 days
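+
+Pausing needs no app-side change, because the Offering a customer sees is resolved by RevenueCat when the app fetches offerings. As a minimal sketch (assuming the RevenueCat Swift SDK is already configured; `presentPaywall` is a hypothetical helper, not part of the SDK), the app simply keeps rendering whatever Offering is current for the customer:
+
+```swift
+import RevenueCat
+
+// Assumes Purchases.configure(withAPIKey:) was called at app launch.
+func presentPaywall() {
+    Purchases.shared.getOfferings { offerings, error in
+        guard let offering = offerings?.current else {
+            // Handle the error or fall back to a default paywall.
+            return
+        }
+        // While the experiment is running or paused, enrolled customers
+        // get their assigned variant here; after the experiment is stopped,
+        // this resolves to the Default Offering on the next paywall view.
+        for package in offering.availablePackages {
+            print(package.identifier, package.storeProduct.localizedPriceString)
+        }
+    }
+}
+```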
+
+## Resuming a paused experiment
+
+You can resume a paused experiment to start enrolling new customers again by clicking the **Resume** button.
+
+Before resuming an experiment, we'll check if it conflicts with any currently active experiments. If resuming would cause audience overlap, you'll see an error message and will need to either pause or stop the conflicting active experiment(s) to avoid overlap.
+
+## Stopping an experiment
+
+Once an experiment is running or paused, you can permanently stop it by clicking the **Stop** button. This is appropriate when you want to:
+
+- End the experiment after a clear winner has emerged and you're ready to implement the results
+- Stop a poorly performing experiment to prevent further negative impact
+- Clear the way for a new experiment with an overlapping audience
+- Conclude an experiment that has run its full planned duration
+
+:::warning Stopping vs. Pausing
+**Pausing** is reversible - you can resume the experiment later. **Stopping** is permanent - the experiment cannot be restarted. Consider pausing instead of stopping if you might want to resume enrollment in the future.
+:::
+
+When an experiment is stopped:
+- New customers will no longer be enrolled
+- Customers who were enrolled will begin receiving the Default Offering on their next paywall view
+- Results will continue to refresh for 400 days after the experiment has ended
+
 ## Running multiple tests simultaneously
 
 You can use Experiments to run multiple tests simultaneously as long as:
@@ -168,7 +206,12 @@ When an experiment is running, only the percent of new customers to enroll can b
 | Can I run an experiment targeting different app versions for each app in my project? | No, at this time we don't support setting up an experiment in this way. However, you can certainly create unique experiments for each app and target them by app version to achieve the same result in independent tests. |
 | Can I add multiple Treatment groups to a single test? | No, you cannot add multiple Treatment groups to a single test. However, you can achieve the same result by running multiple tests on the same audience, one for each desired variant. |
 | Can I edit the enrollment criteria of a started experiment? | Before an experiment has been started, all aspects of enrollment criteria can be edited. However, once an experiment has been started, only the percent of new customers to enroll can be edited, since editing the audience that an experiment is exposed to would alter the nature of the test. |
-| Can I restart an experiment after it's been stopped? | After you choose to stop an experiment, new customers will no longer be enrolled in it, and it cannot be restarted. If you want to continue a test, create a new experiment and choose the same Offerings as the stopped experiment. You can use the **duplicate** feature to quickly recreate the same experiment configuration. _(NOTE: Results for stopped experiments will continue to refresh for 400 days after the experiment has ended)_ |
+| What's the difference between pausing and stopping an experiment? | Pausing temporarily stops new customer enrollment while existing participants continue to see their assigned variant. The experiment can be resumed later. Stopping permanently ends the experiment: new customers won't be enrolled and existing participants will see the Default Offering on their next paywall view. A stopped experiment cannot be restarted. Both paused and stopped experiments continue collecting data for up to 400 days. |
+| Can I pause an experiment multiple times? | Yes, you can pause and resume an experiment as many times as needed. This allows you to control enrollment based on your testing needs and timeline. |
+| Will pausing affect the data collection for already enrolled customers? | No, pausing only affects new enrollments. Customers already in the experiment will continue to see their assigned variant and their behavior will continue to be tracked in the experiment results for up to 400 days. |
+| Can I edit a paused experiment? | When an experiment is paused, you cannot edit its configuration. You can only resume it to continue enrollment or stop it permanently. To make changes, you would need to stop the experiment and create a new one. |
+| How do paused experiments affect the audience overlap checks? | Paused experiments don't count toward audience overlap since they're not actively enrolling new customers. However, when you try to resume a paused experiment, we'll check if it conflicts with any currently active experiments and prevent resuming if there's an overlap. |
+| Can I restart an experiment after it's been stopped? | After you choose to stop an experiment, new customers will no longer be enrolled in it, and it cannot be restarted. However, if you need to temporarily halt new enrollments with the option to resume later, consider using the pause feature instead. Paused experiments can be resumed at any time. If you've already stopped an experiment and want to continue testing, create a new experiment and choose the same Offerings as the stopped experiment. You can use the duplicate feature to quickly recreate the same experiment configuration. *(NOTE: Results for stopped experiments will continue to refresh for 400 days after the experiment has ended)* |
 | Can I duplicate an experiment? | Yes, you can duplicate any existing experiment from the experiments list using the context menu. This creates a new experiment with the same configuration as the original, which you can then modify as needed before starting. This is useful for running similar tests or follow-up experiments. |
 | What happens to customers that were enrolled in an experiment after it's been stopped? | New customers will no longer be enrolled in an experiment after it's been stopped, and customers who were already enrolled in the experiment will begin receiving the Default Offering if they reach a paywall again. Since we continually refresh results for 400 days after an experiment has ended, you may see renewals from these customers in your results, since they were enrolled as part of the test while it was running; but new subscriptions started by these customers after the experiment ended and one-time purchases made after the experiment ended will not be included in the results. |
 