Run A/B tests

Not all experiments are tied to specific features that you've already flagged in your code. Sometimes, you'll want to run a standalone test to answer a specific question—which of two (or more) variations performs best? For example, is it more effective to sort the products on a category page by price or category?

These one-off experiments are called A/B tests, as opposed to feature tests that run on features you've already flagged. With A/B tests, you define two or more variation keys and then implement a different code path for each variation. From the Optimizely interface, you can determine which users are eligible for the experiment and how to split traffic between the variations, as well as the metrics you'll use to measure each variation's performance.

1. Select A/B Test in your project

In the Experiments tab, click Create New Experiment and select A/B Test.

2. Set an experiment key

Specify an experiment key.
Your experiment key must contain only alphanumeric characters, hyphens, and underscores. The key must also be unique for your Optimizely project so you can correctly disambiguate experiments in your application.

Don’t change the experiment key without making the corresponding change in your code.
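
If you generate experiment keys programmatically, you can check them against the documented character rules before creating the experiment. This is an illustrative sketch, not an Optimizely API; the function name and pattern are our own.

```python
import re

# Pattern for the documented key rules: alphanumerics, hyphens, and underscores only.
EXPERIMENT_KEY_PATTERN = re.compile(r'^[A-Za-z0-9_-]+$')

def is_valid_experiment_key(key):
    """Return True if the key uses only the characters allowed for experiment keys."""
    return bool(EXPERIMENT_KEY_PATTERN.match(key))

print(is_valid_experiment_key('app_redesign'))   # True
print(is_valid_experiment_key('app redesign!'))  # False
```

Uniqueness within the project still has to be checked against your existing experiments; the regex only covers the character rules.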

3. Set experiment traffic allocation

The traffic allocation is the fraction of your total traffic to include in the experiment, specified as a percentage. For example, suppose you set an allocation of 50% for an experiment that is triggered when a user performs a search. This means:

  • The experiment is triggered when a visitor performs a search, but not for every such visitor: 50% of users who search are included in the experiment, and the other 50% are not.
  • Users who never perform a search aren't in the experiment at all. In other words, the traffic allocation percentage may not apply to all traffic in your application.

For more information, see our KB article Traffic allocation and distribution.

Optimizely determines the traffic allocation at the point where you call the Activate method in the SDK.

You can also add your experiment to an exclusion group at this point.
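
To build intuition for how an allocation decision can be both random across users and stable for any one user, here is a simplified stdlib sketch of deterministic hash bucketing. This is not Optimizely's actual algorithm (the SDKs use their own bucketing scheme); the hash choice and bucket count here are illustrative assumptions.

```python
import hashlib

TRAFFIC_ALLOCATION = 0.5  # 50% of triggered users enter the experiment

def in_experiment(user_id, experiment_key, allocation=TRAFFIC_ALLOCATION):
    """Deterministically bucket a user: the same inputs always give the same answer."""
    digest = hashlib.md5(f'{experiment_key}:{user_id}'.encode()).hexdigest()
    bucket = int(digest, 16) % 10000  # bucket in [0, 9999]
    return bucket < allocation * 10000

# The same user always gets the same decision for a given experiment,
# while the user population as a whole splits roughly per the allocation.
assert in_experiment('user-42', 'search_experiment') == in_experiment('user-42', 'search_experiment')
```

Because the decision is a pure function of the user ID and experiment key, it can be recomputed at every Activate call without storing any per-user state.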

4. Set variation keys and traffic distribution

Variations are the different code paths you want to experiment on. Enter a unique variation key to identify the variation in the experiment and optionally a short, human-readable description for reporting purposes.

You must specify at least one variation. There’s no limit to how many variations you can create.

By default, variations are given equal traffic distribution. Customize this value for your experiment's requirements.
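
Conceptually, a traffic distribution maps each user's bucket value onto variations via cumulative weights. The sketch below illustrates that idea only; it is not the SDK's implementation, and the function and bucket range are our own assumptions.

```python
def choose_variation(bucket, distribution):
    """Map a bucket in [0, 9999] to a variation using cumulative traffic weights.

    distribution: list of (variation_key, percentage) pairs summing to 100.
    """
    cumulative = 0
    for key, pct in distribution:
        cumulative += pct * 100  # percent -> share of the 10,000 buckets
        if bucket < cumulative:
            return key
    return None  # bucket fell outside the distributed range

dist = [('control', 50), ('treatment', 50)]
print(choose_variation(0, dist))     # control
print(choose_variation(9999, dist))  # treatment
```

With an equal 50/50 distribution, buckets 0-4999 land in the first variation and 5000-9999 in the second; changing the percentages simply moves the boundary.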

5. (Optional) Add an audience

You can opt to define audiences if you want to show your experiment only to certain groups of users. See Define audiences and attributes and Set audiences for experiments.

6. Add a metric

Add events that you’re tracking with the Optimizely SDKs as metrics to measure impact. Whether you use existing events or create new ones, you must add at least one metric to an experiment. To reorder the metrics, click and drag them into place.

Important

The top metric in an experiment is the primary metric. Stats Engine uses the primary metric to determine whether an A/B test wins or loses, overall. Learn about the strategy behind primary and secondary metrics.

7. Complete your experiment setup

Click Create Experiment to complete your experiment setup.

8. Implement the code sample into your application

Once you've defined an A/B test, you'll see a code sample for implementing it in your application.

For each A/B test, you use the Activate method to decide which variation a user falls into, then use an if statement to apply the code for that variation. See the example below.

Android

import com.optimizely.ab.android.sdk.OptimizelyClient;
import com.optimizely.ab.config.Variation;

// Activate an A/B test
Variation variation = optimizelyClient.activate("app_redesign", userId);
if (variation != null) {
  if (variation.is("control")) {
    // Execute code for "control" variation
  } else if (variation.is("treatment")) {
    // Execute code for "treatment" variation
  }
} else {
  // Execute code for users who don't qualify for the experiment
}

C#

using OptimizelySDK;

// Activate an A/B test
string userId = "";
var variation = optimizelyClient.Activate("app_redesign", userId);
if (variation != null && !string.IsNullOrEmpty(variation.Key))
{
  if (variation.Key == "control")
  {
    // Execute code for "control" variation
  }
  else if (variation.Key == "treatment")
  {
    // Execute code for "treatment" variation
  }
}
else
{
  // Execute code for users who don't qualify for the experiment
}

Java

import com.optimizely.ab.Optimizely;
import com.optimizely.ab.config.Variation;

// Activate an A/B test
Variation variation = optimizelyClient.activate("app_redesign", userId);
if (variation != null) {
  if (variation.is("control")) {
    // Execute code for "control" variation
  } else if (variation.is("treatment")) {
    // Execute code for "treatment" variation
  }
} else {
  // Execute code for users who don't qualify for the experiment
}

JavaScript

// Activate an A/B test
var variation = optimizelyClientInstance.activate('app_redesign', userId);
if (variation === 'control') {
  // Execute code for "control" variation
} else if (variation === 'treatment') {
  // Execute code for "treatment" variation
} else {
  // Execute code for users who don't qualify for the experiment
}

Node

// Activate an A/B test
var variation = optimizelyClient.activate("app_redesign", userId);
console.log(`User ${userId} is in variation: ${variation}`);
if (variation === "control") {
  // Execute code for "control" variation
} else if (variation === "treatment") {
  // Execute code for "treatment" variation
} else {
  // Execute code for users who don't qualify for the experiment
}

Objective-C

#import "OPTLYVariation.h"

// Activate an A/B test
OPTLYVariation *variation = [client activate:@"app_redesign" userId:userId];
if ([variation.variationKey isEqualToString:@"control"]) {
    // Execute code for "control" variation
} else if ([variation.variationKey isEqualToString:@"treatment"]) {
    // Execute code for "treatment" variation
} else {
    // Execute code for users who don't qualify for the experiment
}

PHP

// Activate an A/B test
$variation = $optimizelyClient->activate('app_redesign', $userId);
if ($variation == 'control') {
  // Execute code for "control" variation
} elseif ($variation == 'treatment') {
  // Execute code for "treatment" variation
} else {
  // Execute code for users who don't qualify for the experiment
}

Python

# Activate an A/B test
variation = optimizely_client.activate('app_redesign', user_id)
if variation == 'control':
  # Execute code for "control" variation
  pass
elif variation == 'treatment':
  # Execute code for "treatment" variation
  pass
else:
  # Execute code for users who don't qualify for the experiment
  pass

Ruby

# Activate an A/B test
variation = optimizely_client.activate('app_redesign', user_id)
if variation == 'control'
  # Execute code for "control" variation
elsif variation == 'treatment'
  # Execute code for "treatment" variation
else
  # Execute code for users who don't qualify for the experiment
end

Swift

// Activate an A/B test
let variation = client?.activate("app_redesign", userId: "12122")
if variation?.variationKey == "control" {
  // Execute code for "control" variation
} else if variation?.variationKey == "treatment" {
  // Execute code for "treatment" variation
} else {
  // Execute code for users who don't qualify for the experiment
}

The Activate method:

  • Evaluates whether the user is eligible for the experiment and returns a
    variation key if so. For more on how the variation is chosen, see User bucketing and the API reference for Activate.
  • Sends an event to Optimizely to record that the current user has been exposed to the A/B test. You should call Activate at the point you want to record an A/B test exposure to Optimizely. If you don't want to record an A/B test exposure, use the Get Variation method instead.

Note

If any of the conditions for the experiment aren't met, the response is null. Make sure that your code adequately handles this default case. In general, you'll want to run the baseline experience.
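
One way to keep the null handling in one place is to route the Activate result through a dispatch table that falls back to the baseline. The handler names below are hypothetical, purely for illustration.

```python
# Hypothetical handlers for each code path.
def run_control():
    return 'control experience'

def run_treatment():
    return 'treatment experience'

def run_baseline():
    # Default experience for users who don't qualify for the experiment.
    return 'baseline experience'

def handle_variation(variation_key):
    """Route an Activate result, falling back to the baseline when it is None."""
    handlers = {'control': run_control, 'treatment': run_treatment}
    return handlers.get(variation_key, run_baseline)()

print(handle_variation('control'))  # control experience
print(handle_variation(None))       # baseline experience
```

A dict lookup with a default also covers unexpected variation keys (for example, a key added in the Optimizely interface before the corresponding code ships), not just the null case.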