# Scene A/B tests

Use an A/B test to determine which version of a Scene performs best based on your selected metric.

## About A/B tests for Scenes

Create variations of [Scene](https://www.airship.com/docs/reference/glossary/#scene) content by duplicating an existing Scene or creating screens from scratch. You can make a single change, such as changing a button label in a screen, or provide entirely different content.

Members of the targeted audience are randomly selected and split equally between your control Scene (Variant A) and your variant Scene (Variant B).

Related events and conversions are recorded for both audiences, providing data you can use to evaluate Scene performance based on your selected metric.
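Airship performs variant assignment internally. Conceptually, a random equal split can be sketched as deterministic hashing of a user and experiment identifier, so each user always sees the same variant. The function and identifiers below are illustrative only and are not part of any Airship API:

```python
import hashlib

def assign_variant(user_id: str, experiment_id: str) -> str:
    """Deterministically assign a user to variant A or B with a 50/50 split.

    Hashing the (experiment, user) pair gives a stable, roughly uniform
    assignment: the same user always lands in the same variant for a
    given experiment, and the audience splits evenly in aggregate.
    """
    digest = hashlib.sha256(f"{experiment_id}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("user-123", "scene-test-1"))
```

Because the assignment is a pure function of the identifiers, re-entering the Scene never flips a user between variants mid-test.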

When you run A/B tests and a [Holdout Experiment](https://www.airship.com/docs/reference/glossary/#holdout_experiment) simultaneously, Airship prevents holdout group users from being included in the A/B tests. This eliminates potentially skewed data when experiment audiences overlap and ensures that your most critical experiments maintain integrity.
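Airship applies this exclusion for you. In effect it amounts to a set difference between the targeted audience and the holdout group; a minimal sketch, with illustrative names that are not Airship code:

```python
def ab_test_audience(targeted: set, holdout: set) -> set:
    """Return the targeted audience with holdout-group users excluded,
    so holdout users never enter the A/B test."""
    return targeted - holdout

# Hypothetical user IDs: u2 is in the holdout group, so only u1 and u3
# are eligible for the A/B test.
eligible = ab_test_audience({"u1", "u2", "u3"}, {"u2"})
```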

To prepare for your tests, see [About A/B testing](https://www.airship.com/docs/guides/experimentation/a-b-tests/about/).

Scene A/B test metrics:

| Metric | Description |
| --- | --- |
| **Scene completion** | The user viewed all screens in the Scene. |
| **Push Opt-in** | The user tapped a button, text, image, or screen configured with the [Push Opt-in action](https://www.airship.com/docs/guides/messaging/in-app-experiences/configuration/button-actions/#push-opt-in). |
| **Adaptive Link** | The user followed an [Adaptive Link](https://www.airship.com/docs/reference/glossary/#adaptive_link) in the Scene. |
| **App Rating** | The user tapped a button, text, image, or screen configured with the [App Rating action](https://www.airship.com/docs/guides/messaging/in-app-experiences/configuration/button-actions/#app-rating). |
| **Deep Link** | The user followed a deep link in the Scene. |
| **Preference Center** | The user opened the [Preference Center](https://www.airship.com/docs/reference/glossary/#preference_center) in your app. |
| **App Settings** | The user opened their device's settings page for your app. |
| **Share** | The user tapped a button, text, image, or screen configured with the [Share action](https://www.airship.com/docs/guides/messaging/in-app-experiences/configuration/button-actions/#share). |
| **Web Page** | The user tapped a button, text, image, or screen configured with the [Web Page action](https://www.airship.com/docs/guides/messaging/in-app-experiences/configuration/button-actions/#web-page). |
| **Submit Responses** | The user tapped a button, text, image, or screen configured with the [Submit Responses action](https://www.airship.com/docs/guides/messaging/in-app-experiences/configuration/button-actions/#submit-responses). |

## Creating a Scene A/B test

1. Go to **Messages**, then **Messages Overview**, and select the pencil icon for a Scene.
1. Go to the **Content** step, select **Experiments** in the left sidebar, and then select **Create experiment**. A Scene must have at least one screen configured before the Experiments option is available.
1. Enter a name and description, and then choose the metric to use for reporting experiment performance.
1. Check the box for **Copy content from existing Scene** to duplicate the current Scene's content and edit it. Leave the box unchecked to create variant content from scratch.
1. Select **Save**.
1. Configure screens for variant B as you would for a new Scene. See [Configuring Scene content](https://www.airship.com/docs/guides/messaging/in-app-experiences/configuration/content/).
   > **Important:** Both variants must include the same action/event associated with the experiment's primary metric. For example, if you want to use Submit Responses as your primary metric, you must configure that action for a button in both variants.

   > **Tip:**
   > * A test with a single variable is measurable. If you make multiple changes in the variant, you will not know which change had an effect.
   > * If your primary metric is Push Opt-in, consider testing the order of your screens so that users don't dismiss the Scene before the opt-in request.
   > * If your primary metric is Scene Completion, focus on the number of screens and their content value. For example, a long Scene (more than 5 screens) often gets a lower completion rate than a shorter one.

1. Select **Done**.
1. Go to the **Review** step to review the device preview and Scene summary.
1. Select **Finish** or **Update** to start the test. You cannot start an A/B test for a Scene that has unpublished changes.

You cannot edit a Scene's content while an A/B test is active.

## Selecting the winning variant

After starting an A/B test, compare the performance of the variants in the Scene's Content step or in its message report to determine which variant (if either) is having the expected impact.

You may want to end an A/B test early if you see a significant drop in conversions or engagement. If the drop is not significant, or if it is observed early in the test period, you may want to let the test continue, as the rate may correct itself. Another reason to end a test early is if you notice an error in your content. To end a test early, select a winner. This effectively cancels the test.

See also [Implementing A/B tests, outcomes, and compliance](https://www.airship.com/docs/guides/experimentation/a-b-tests/about/#implementing-ab-tests-outcomes-and-compliance) in *About A/B testing*.

After selecting a winning variant, the Scene is republished with the winner, and the A/B test ends.

1. Go to **Messages**, then **Messages Overview**, and select the report icon for your Scene.
1. Select **Scene Detail** and compare the metrics of variants A and B.
    * The default view is based on the metric you selected when creating the experiment. If other applicable metrics are available, you can choose one from the dropdown menu to update the displayed data. If a metric is not relevant to both variants, N/A appears instead of a value.
    * Conversions are calculated as the number of users who performed the action defined in the primary metric divided by the number of users who entered the Scene. See [Scene Reports](https://www.airship.com/docs/guides/messaging/in-app-experiences/scenes/create/scene-reports/) for more information about individual statistics.
1. Select **Select as winner** and confirm your choice.
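The conversion calculation described above can be sketched in a few lines. The counts here are hypothetical, and this is only an illustration of the formula, not how the dashboard computes its report:

```python
def conversion_rate(conversions: int, entries: int) -> float:
    """Conversions = users who performed the primary-metric action
    divided by users who entered the Scene."""
    return conversions / entries if entries else 0.0

# Hypothetical counts: 45 of 500 users converted in Variant A,
# 60 of 500 in Variant B.
rates = {"A": conversion_rate(45, 500), "B": conversion_rate(60, 500)}
winner = max(rates, key=rates.get)  # variant with the higher conversion rate
```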
