
Preview experiment variations

Before you publish a test or roll out a feature live to your visitors, it's important to make sure it works the way you expect.

Full Stack provides tools to help you QA your tests. Use them to verify each test end-to-end, checking that every variation of your app looks, feels, and operates correctly. It's also important to confirm that any metrics being tracked report correctly on the Results page.

When activating a test, you must provide a userId. Normally, we use the userId to randomize which bucket to respond with. When QA testing, you should check the behavior of each variation. There are two options you can use to tell Optimizely to skip the randomized bucketing and return a specific bucket: forced bucketing and whitelisting. Both of these methods require the test to be running, so start the test in a staging environment. If you can't use Environments, you can start your test with 0% of traffic allocated to it and use these methods for testing.
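
For example, with the Python SDK you can use set_forced_variation to pin a known QA user to a specific variation before activating the experiment. This is a minimal sketch; my_experiment, treatment, qa_user_1, and the datafile path are placeholders, not values from your project.

```python
from optimizely import optimizely

# Load the project datafile (path is a placeholder).
datafile = open('datafile.json').read()
optimizely_client = optimizely.Optimizely(datafile)

# Pin the QA user to the 'treatment' variation, skipping the normal
# hash-based bucketing. Returns False if either key is invalid.
optimizely_client.set_forced_variation('my_experiment', 'qa_user_1', 'treatment')

# activate() now returns the forced variation for this user.
variation = optimizely_client.activate('my_experiment', 'qa_user_1')
print(variation)  # 'treatment'
```

Whitelisting, by contrast, is configured per experiment in the Optimizely app rather than in code.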

QA checklist

Here's a QA checklist:

  1. Start your test in a non-production environment.
  2. Force yourself into each experience using either forced bucketing or whitelisting.
  3. Trigger any conversion events that are tracked by the test AND are on the same page as a variation change (see the tracking sketch after this list). Testing pre-existing or downfunnel events may not be necessary, unless those events also change.
  4. If you're unable to see your testing activity affect the Results page, this may be due to how Optimizely counts conversions: conversions are attributed to the first variation assignment, so it may be necessary to reset results between testing your variations.
  5. If step 4 doesn't produce the results you expected, troubleshoot by increasing the log level and consulting your error handler (see the logging sketch after this list).
  6. Repeat steps 2 through 5 for each variation of your test.
  7. Confirm your results. The Results page updates on a 5-minute interval, so you may have to wait up to 5 minutes for your activity to appear; the time of the last update is shown on the page.
  8. Launch in your production environment.
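
For step 3, conversion events can be fired directly through the SDK. A minimal sketch using the Python SDK's track call, reusing the client and QA user from the forced-bucketing sketch above; the event key 'purchase' is a placeholder:

```python
# Fire a conversion event for the QA user. The event key must match an
# event configured in your project ('purchase' is a placeholder).
optimizely_client.track('purchase', 'qa_user_1')
```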
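
For step 5, the Python SDK accepts a logger and an error handler when you construct the client, which makes debug output and otherwise swallowed errors visible during QA. A sketch, assuming the SDK's bundled SimpleLogger and RaiseExceptionErrorHandler:

```python
import logging

from optimizely import optimizely
from optimizely.error_handler import RaiseExceptionErrorHandler
from optimizely.logger import SimpleLogger

datafile = open('datafile.json').read()  # path is a placeholder

# DEBUG-level logging surfaces bucketing decisions and event dispatches;
# RaiseExceptionErrorHandler re-raises errors the SDK would otherwise swallow.
optimizely_client = optimizely.Optimizely(
    datafile,
    logger=SimpleLogger(min_level=logging.DEBUG),
    error_handler=RaiseExceptionErrorHandler(),
)
```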

Re-testing core metrics on every pass is unnecessary. However, you should always test any new metrics you've added, especially if they are on the same page as a variation change.

Rather than whitelisting, consider pausing all variations except the one you'd like to test.

