
Implementation checklist

This topic covers important configuration details and best practices to employ while using the Optimizely Feature Experimentation SDKs.

When preparing to implement Optimizely Feature Experimentation in a production environment, it is a good idea to familiarize yourself thoroughly with the configuration details and best practices that will streamline the entire process.

📘

Note

If you use SDK versions released in April 2018 or earlier, see a previous version of this topic.

Architectural diagrams

These diagrams give high-level context for several items in this checklist.

The system context diagram shows a top-level view of Optimizely and your app.

The SDK and app container diagrams show more detail.

Datafile management

The datafile is a JSON representation of OptimizelyConfig. It contains all the data needed to deliver and track your experiments and flag deliveries for an environment in your Optimizely Feature Experimentation project.

You have the following options for synchronizing the datafile between your Optimizely Feature Experimentation project and your application:

  • Pull method (recommended) – The SDKs automatically fetch the datafile, polling for the latest version at the interval you configure when you instantiate the SDK (see the sketch after this list).
  • Push method – Use webhooks to notify your application when the datafile changes so it can fetch the new version immediately. Use this method alone or in combination with polling if you need faster updates.
  • Customized method – If you want to customize or extend how you access the datafile, you can fetch the datafile using the Optimizely CDN datafile URL.
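For example, with the pull method, datafile polling might be configured as in the following sketch, which assumes the Python SDK's PollingConfigManager; the SDK key and 60-second interval are illustrative placeholders.

```python
# A minimal sketch of datafile polling with the Python SDK.
# The SDK key and update interval are illustrative placeholders.
from optimizely import optimizely
from optimizely.config_manager import PollingConfigManager

config_manager = PollingConfigManager(
    sdk_key="YOUR_SDK_KEY",  # placeholder: your environment's SDK key
    update_interval=60,      # poll for a fresh datafile every 60 seconds
)

optimizely_client = optimizely.Optimizely(config_manager=config_manager)
```

A shorter interval picks up changes faster at the cost of more frequent requests; choose a value that fits your traffic and your tolerance for stale configuration.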

Other important considerations for datafile management include:

  • Caching and persistence.
  • Synchronization between SDK instances.
  • Network availability.

📘

Note

To ensure webhook requests originate from Optimizely, secure your webhook using a token in the request header.

SDK configuration

The Optimizely Feature Experimentation SDKs are highly configurable and can support any production environment, but scaling effectively may require overriding some default behavior to fit the needs of your application.

Logger

Verbose logs are critical in production. The SDK ships with a no-operation logger that provides only the scaffolding for a custom implementation; it is intentionally non-functional out of the box. Create a logger that suits your needs, for example one that writes logs to an internal logging service or a third-party vendor, and pass it to the Optimizely client.

For more information, see the documentation for the logger and the SimpleLogger reference implementation.
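For instance, a custom logger for the Python SDK might look roughly like the following sketch; the use of the standard library's logging module as the backend is an assumption for illustration.

```python
# A sketch of a custom logger for the Python SDK that forwards messages to a
# standard-library logger; swap in your own logging service as needed.
import logging

from optimizely import optimizely

class ServiceLogger:
    """Receives log calls from the SDK and forwards them to an app logger."""

    def __init__(self):
        logging.basicConfig(level=logging.DEBUG)
        self._logger = logging.getLogger("optimizely")

    def log(self, log_level, message):
        # The SDK supplies a standard logging level and a message string.
        self._logger.log(log_level, message)

optimizely_client = optimizely.Optimizely(
    sdk_key="YOUR_SDK_KEY",  # placeholder
    logger=ServiceLogger(),
)
```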

Error handler

In a production environment, errors must be handled consistently across the application. The Optimizely Feature Experimentation SDKs allow you to provide a custom error handler to catch configuration issues like an unknown experiment key or unknown event key. This handler should let the application fail gracefully and deliver a normal user experience. It should also notify an external service, like Sentry, to alert your team of the issue.

🚧

Important

If you do not provide a handler, errors will not surface in your application.
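As a sketch, a production error handler for the Python SDK could follow the handle_error convention shown below; the alerting hook is an illustrative stub rather than a real integration.

```python
# A sketch of a custom error handler; the external alerting call is a
# hypothetical stub (for example, sentry_sdk.capture_exception).
from optimizely import optimizely

class ProductionErrorHandler:
    def handle_error(self, error):
        # Fail gracefully: record the problem instead of surfacing it to users.
        print(f"Optimizely SDK error: {error!r}")
        # Notify your monitoring or alerting service here.

optimizely_client = optimizely.Optimizely(
    sdk_key="YOUR_SDK_KEY",  # placeholder
    error_handler=ProductionErrorHandler(),
)
```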

User profile service

Building a User Profile Service (UPS) helps maintain consistent variation assignments for users when test configuration settings change.

The Optimizely Feature Experimentation SDKs bucket users with a deterministic hashing function, so as long as the datafile and user ID stay consistent, a given user always evaluates to the same variation. However, test configuration changes such as adding a new variation or changing the traffic allocation can move a user into a different variation and alter their experience.

Learn more about bucketing behavior in Optimizely Feature Experimentation.

A UPS solves this by persisting information about the user in a datastore. At a minimum, it should create a mapping of user ID to variation assignment. Implementing a UPS requires exposing a lookup and save function that either returns or persists a user profile dictionary. Our documentation includes the JSON schema for this dictionary. This service also assumes all user IDs are consistent across all use cases and sessions.

📘

Note

We recommend caching user information after first lookup to speed future lookups.

Let us walk through an example. Using Redis or Cassandra as the cache, you can store user profiles as key-value pairs, for example, a hashed email address mapped to a variation assignment. To keep bucketing sticky for six hours at a time, set a time to live (TTL) on each record. As Optimizely buckets each user, the UPS reads from and writes to this cache to check for an existing assignment before bucketing normally, as sketched below.
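A minimal sketch of that pattern with the Python SDK and Redis follows; the connection details, key prefix, and six-hour TTL are illustrative assumptions.

```python
# A sketch of a user profile service backed by Redis. The SDK calls lookup()
# before bucketing and save() after assigning a variation.
import json

import redis
from optimizely import optimizely

SIX_HOURS = 6 * 60 * 60  # TTL to keep bucketing sticky for six hours

class RedisUserProfileService:
    def __init__(self):
        self._redis = redis.Redis(host="localhost", port=6379)  # placeholder connection

    def lookup(self, user_id):
        # Return the stored profile dictionary, or None if no record exists.
        raw = self._redis.get(f"optly:profile:{user_id}")
        return json.loads(raw) if raw else None

    def save(self, user_profile):
        # user_profile maps the user ID to its experiment/variation assignments.
        key = f"optly:profile:{user_profile['user_id']}"
        self._redis.set(key, json.dumps(user_profile), ex=SIX_HOURS)

optimizely_client = optimizely.Optimizely(
    sdk_key="YOUR_SDK_KEY",  # placeholder
    user_profile_service=RedisUserProfileService(),
)
```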

Build an SDK wrapper

Many developers prefer to use wrappers to both encapsulate the functionality of an SDK and simplify maintenance. This can be done for all the configuration options described above. Our GitHub repository includes a few examples; see Demo apps and SDK wrappers.
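As one possible shape for such a wrapper, the sketch below (using the Python SDK's user-context API) centralizes client configuration behind a single class; the class name, flag key, and attribute handling are illustrative choices, not a prescribed pattern.

```python
# A sketch of a thin SDK wrapper that hides configuration details from callers.
from optimizely import optimizely

class FeatureFlags:
    """Creates one configured client and exposes a simple entry point."""

    def __init__(self, sdk_key):
        # Pass your production logger, error handler, and user profile
        # service here so every caller gets the same configuration.
        self._client = optimizely.Optimizely(sdk_key=sdk_key)

    def is_enabled(self, flag_key, user_id, attributes=None):
        user = self._client.create_user_context(user_id, attributes or {})
        return user.decide(flag_key).enabled

# Usage (illustrative):
# flags = FeatureFlags("YOUR_SDK_KEY")
# if flags.is_enabled("my_flag", "user_123"):
#     ...
```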

Environments

Optimizely Feature Experimentation's environments feature lets you confirm behavior and run tests in isolated environments, like development or staging. This makes it easier to deploy tests in production safely. Environments are customizable and should mimic your team's workflow. Most customers use two environments: development and production. This allows engineering and QA teams to inspect tests safely in an isolated setting while site visitors are exposed to tests running in the production environment.

View production as your real-world workload. A staging environment should mimic all aspects of production so you can test before deployment. In these environments, all aspects of the SDK—including the dispatcher and logger—should be production-grade. In local environments like test or development, it is okay to use the out-of-the-box implementations instead.

Environments are kept separate and isolated from each other with their own datafiles. For additional security, Optimizely Feature Experimentation lets you create secure environments, which require authentication for datafile requests. Our server-side SDKs support initialization with these authenticated datafiles. We recommend using this feature only in projects that use server-side SDKs exclusively; if you fetch authenticated datafiles in a client-side environment, they may become accessible to end users.
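If you use secure environments with a server-side SDK, initialization might look like the following sketch; it assumes the Python SDK's AuthDatafilePollingConfigManager, and the SDK key and access token values are placeholders that should stay server-side.

```python
# A sketch of initializing from an authenticated datafile in a secure environment.
from optimizely import optimizely
from optimizely.config_manager import AuthDatafilePollingConfigManager

config_manager = AuthDatafilePollingConfigManager(
    sdk_key="YOUR_SDK_KEY",                       # placeholder
    datafile_access_token="YOUR_DATAFILE_TOKEN",  # placeholder: never ship client-side
)

optimizely_client = optimizely.Optimizely(config_manager=config_manager)
```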

User IDs and attributes

User IDs identify the unique users in your tests. In a production setting, it is especially important both to choose the type of user ID carefully and to set a broader strategy for maintaining consistent IDs across channels. Our documentation explores different approaches and best practices for choosing a user ID.

Attributes allow you to target users based on specific properties. In Optimizely, you define which attributes to include in a test. Then, in the code itself, you pass an attribute dictionary for each user to the SDK, which uses those attributes to determine which variation the user sees (as sketched after the note below).

📘

Note

Attribute fields and user IDs are always sent to Optimizely’s backend through impression and conversion events. It is up to you to responsibly handle fields (for example, email addresses) that may contain personally identifiable information (PII). Many customers use standard hash functions to obfuscate PII.
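Putting these points together, the sketch below (assuming the Python SDK's user-context API) hashes an email address into a user ID and passes an attributes dictionary; the attribute names and flag key are illustrative.

```python
# A sketch of hashing a PII-bearing identifier and passing user attributes.
import hashlib

from optimizely import optimizely

optimizely_client = optimizely.Optimizely(sdk_key="YOUR_SDK_KEY")  # placeholder

email = "someone@example.com"
user_id = hashlib.sha256(email.encode("utf-8")).hexdigest()  # obfuscate PII

attributes = {"plan": "enterprise", "is_beta_user": True}  # illustrative attributes
user = optimizely_client.create_user_context(user_id, attributes)

decision = user.decide("my_flag_key")  # placeholder flag key
print(decision.variation_key)
```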

Integrations

Build custom integrations with Optimizely Feature Experimentation using notification listeners. Notification listeners let you programmatically observe and act on events that occur within the SDK and enable integrations by passing data to external services.

Here are a few examples:

  • Send data to an analytics service and report that user_123 was assigned to variation A.
  • Send alerts to data monitoring tools like New Relic and Datadog with SDK events to better visualize and understand how A/B tests can affect service-level metrics.
  • Pass all events to an external data tier, like a data warehouse, for additional processing and to leverage business intelligence tools.
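For instance, a decision listener that reports assignments to an analytics service (as in the first example above) might look like this sketch with the Python SDK; the print call stands in for the analytics integration.

```python
# A sketch of a decision notification listener registered with the SDK's
# notification center; replace the print with your analytics call.
from optimizely import optimizely
from optimizely.helpers import enums

optimizely_client = optimizely.Optimizely(sdk_key="YOUR_SDK_KEY")  # placeholder

def on_decision(decision_type, user_id, attributes, decision_info):
    # Forward the assignment to your analytics or monitoring service.
    print(f"{user_id} received a {decision_type} decision: {decision_info}")

optimizely_client.notification_center.add_notification_listener(
    enums.NotificationTypes.DECISION, on_decision
)
```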

QA and testing

Before you go live with your test, we have a few final tips:

  • Consider your QA options. To manually test different experiences, force yourself into a variation using allowlisting (a programmatic alternative is sketched after this list).
  • Ensure everything works smoothly in a development or staging environment paired with the corresponding datafile generated from a test environment within Optimizely. This confirms the datafile is accurate; you can verify it by checking your SDK logs.
  • Run an A/A test to double-check that data is being captured correctly. This helps ensure there are no differences in conversions between the control and variation treatments. Read more about A/A testing.
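Allowlisting is configured in the Optimizely app; as a programmatic alternative during QA, you can force a variation from code, as in the sketch below, which assumes the Python SDK's set_forced_variation method and uses placeholder experiment, user, and variation keys.

```python
# A sketch of forcing a QA user into a specific variation from code.
from optimizely import optimizely

optimizely_client = optimizely.Optimizely(sdk_key="YOUR_SDK_KEY")  # placeholder

# Force the test user into variation_b so you can verify that experience.
optimizely_client.set_forced_variation("my_experiment", "qa_user_1", "variation_b")
```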

If you have questions, contact support. If you think you have found a bug, file an issue in the SDK’s GitHub repository, and Optimizely will investigate as soon as possible.