
Apps maintained by third parties

This topic describes how to integrate several third-party applications, gain insights into how visitors behave in specific experiments and variations, and identify experiments and variations based on the analytics strings listed in third-party integration reports and data.

Several useful apps have been developed by other companies to enable their products to work seamlessly with Optimizely Web Experimentation. In most cases, the documentation for those integrations is maintained by the developers. This article provides you with a list of these integrations, as well as handy links to the relevant documentation and support resources.

Integrate with ContentSquare

If you use ContentSquare and Optimizely Web Experimentation, you can enable the ContentSquare integration to better understand how and why users are interacting with your digital platforms, based on experiments and visitors.

The integration between ContentSquare and Optimizely Web Experimentation lets you analyze user interactions for each variation, helping you understand why your experiments win and enabling you to see and quantify the change in user behavior.

To get the ContentSquare integration running, follow ContentSquare's integration guide.

Integrate with Decibel Insight

If you use Decibel Insight and Optimizely Web Experimentation, you can enable the Decibel Insight integration to collect variation information on pages where you're using both Optimizely Web Experimentation and Decibel Insight.

With this integration, you can use Decibel Insight features to uncover variation-specific visitor behavior information. This will help you understand exactly how visitors behave for different Optimizely Web Experimentation variations.

Before you integrate Decibel Insight, make sure to disable the Mask descriptive names in the project and third-party integrations option in your Optimizely Web Experimentation project Privacy Settings.

To get the integration up and running, follow the Decibel Insight integration guide.

Since Decibel Insight built this integration, contact them for support.

Integrate with Econda

If you use Econda and Optimizely Web Experimentation, you can enable the Econda integration to merge your experiment and variation data in Econda Analytics with the other dimensions and metrics from your web analytics. See the Econda documentation for details.

The integration between Econda and Optimizely Web Experimentation helps you understand visitor metrics such as visit duration and bounce rate.

To get the Econda integration up and running, follow Econda's integration guide in their documentation. On the Optimizely Web Experimentation side, we provide the integration code and the steps for setting this integration up. You can also contact Optimizely support or your Customer Success Manager for assistance.

Since Econda built this integration, contact them through their website for further support.

Integrate with Fivetran

You can turn on the Fivetran and Optimizely Web Experimentation integration in a matter of minutes. Fivetran's zero-maintenance connector then extracts, transforms, and loads (ETL) all of your raw Optimizely Web Experimentation data into your data warehouse, letting you blend your Optimizely data with data from your other business applications to gain deeper insights.

To get the Fivetran integration up and running, follow Fivetran's setup guide.

Since Fivetran built this integration, reach out to them for support. Find their contact information on their support website.

Integrate with Mouseflow

If you use Mouseflow and Optimizely Web Experimentation, you can enable the Mouseflow integration to filter recording lists, heatmap lists, funnels, and form analytics reports based on your experiments and variations.

The integration between Mouseflow and Optimizely Web Experimentation will help you understand how visitors to your site click, move, scroll, and browse in different experiments and variations.

To get the Mouseflow integration up and running, follow Mouseflow's integration guide. Since Mouseflow built this integration, contact them for support through their contact page.

Integrate with SessionCam

If you use SessionCam and Optimizely, you can enable the SessionCam integration to see replays of visitors to your site in Optimizely experiments and variations.

The integration between SessionCam and Optimizely will help you understand how visitors to your site click, move, scroll, and browse in different experiments and variations, and provides automatic machine-learning-powered suggestions for the areas of your site where visitors struggle most.

To get the SessionCam integration up and running, follow SessionCam's integration guide.

Since SessionCam built this integration, please reach out to them for support.

Integrate with Segment

To integrate Optimizely Web Experimentation with Segment, see their documentation.

Name conventions for third-party integrations

Analytics strings are tags that identify the experiment and variation IDs associated with the data you're tracking in a third-party integration like Google Analytics or Adobe Analytics. When you create a custom report within the integrated tool, the experiments and variations you're tracking are listed by their assigned analytics strings.

For example, if you integrate Google Analytics (GA) with Optimizely Web Experimentation, GA places a custom dimension on your website to track the data you specify for visitors. You can target this GA custom dimension in Optimizely Web Experimentation to track the actions you have set up in GA for visitors in your Optimizely experiment. To identify these visitors, Optimizely automatically tags the GA custom dimension with the experiment and variation IDs (in the form of an analytics string). When you filter your data or create custom reports in GA, you see the analytics string, so you can identify which Optimizely experiment and variation each visitor was allocated to.

Analytics string format

Optimizely Web Experimentation uses a consistent format for analytics strings, so understanding the format will help you identify experiments and variations when you're reading reports in third-party integrations. Experiments and campaigns created after 3:00 p.m. Pacific Time (PT) on May 23, 2017, use the updated analytics string format described below.

Experiments and campaigns created before 3:00 p.m. PT on May 23, 2017, use the old analytics string format, but they can be converted to use the new string format. For more information, contact your Customer Success Manager.

The general analytics string format for A/B test experiments is:
experiment_name(experimentID):variation_name(variationID)

The general analytics string format for Personalization campaigns is:
campaign_name(campaignID):experiment_audience_string(experimentID):variation_name(variationID)

In both cases, all IDs are integers.
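The updated formats above can be split mechanically: each colon-separated segment is a name followed by an integer ID in parentheses. As a minimal sketch (not an official Optimizely utility, and assuming names themselves contain no colons), you could parse the segments like this:

```python
import re

# Each segment of an updated-format analytics string looks like
# "name(integerID)"; with masked descriptive names, "name" is empty.
SEGMENT = re.compile(r"^(?P<name>.*)\((?P<id>\d+)\)$")

def parse_analytics_string(s):
    """Split an analytics string into (name, id) pairs, one per segment."""
    parts = []
    for segment in s.split(":"):
        m = SEGMENT.match(segment)
        if not m:
            raise ValueError(f"unexpected segment: {segment!r}")
        parts.append((m.group("name"), int(m.group("id"))))
    return parts
```

An A/B test string yields two pairs (experiment, variation), while a Personalization string yields three (campaign, experiment/audience, variation).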

Experimentation format examples

Experimentation analytics strings when your project settings are not set to mask descriptive names

Scenario: the visitor is bucketed in a variation.

  • Old format: experiment_name(1234):variation_name(4321):treatment
  • Updated format: experiment_name(1234):variation_name(4321)

Scenario: the visitor is not bucketed in a variation because the visitor is excluded by traffic allocation.

  • Old format: experiment_name(1234):variation_name(4321):holdback
  • Updated format: no string is sent because the visitor is not in any experiment.

Experimentation analytics strings when your project settings are set to mask descriptive names

Scenario: the visitor is bucketed in a variation.

  • Old format: everyone_else(1234):(4321):treatment
  • Updated format: (1234):(4321)

Personalization format examples

Personalization analytics strings when your project settings are not set to mask descriptive names

Scenario: The visitor is bucketed in a variation in an experiment with no audience.

  • Old format: campaign_name(1234):everyone_else(1234):variation_name(4321):treatment
  • Updated format: campaign_name(1234):everyone(1234):variation_name(4321)

Scenario: The visitor is bucketed in a variation in an unnamed experiment with audiences.

  • Old format: campaign_name(1234):experiment_audience_string(1234):variation_name(4321):treatment
  • Updated format: campaign_name(1234):experiment_audience_string(1234):variation_name(4321)

Scenario: The visitor is bucketed in a variation in a named experiment.

  • Old format: campaign_name(1234):experiment_name(1234):variation_name(4321):treatment
  • Updated format: campaign_name(1234):experiment_name(1234):variation_name(4321)

Scenario: The visitor is placed in the campaign holdback.

  • Old format: campaign_name(1234):experiment_audience_string(1234):variation_name(4321):holdback
  • Updated format: campaign_name(1234):experiment_audience_string(1234):variation_name(4321):holdback

📘

Note

experiment_audience_string is an automatically generated name based on the audiences used in the Personalization campaign.

Personalization analytics strings when your project settings are set to mask descriptive names

Scenario: The visitor is bucketed in a variation in an experiment with no audience.

  • Old format: (1234):everyone_else(1234):(4321):treatment
  • Updated format: (1234):(1234):(4321)

Scenario: The visitor is bucketed in a variation in an experiment with audiences.

  • Old format: (1234):aud_id1,aud_id2(1234):(4321):treatment
  • Updated format: (1234):(1234):(4321)

Scenario: The visitor is placed in the campaign holdback.

  • Old format: (1234):aud_id1,aud_id2(1234):(4321):holdback
  • Updated format: (1234):(1234):(4321):holdback
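With masked descriptive names, only the integer IDs and the optional trailing :holdback marker remain meaningful. As a hedged sketch (not part of any Optimizely SDK), you could extract both from a masked string like this:

```python
import re

def extract_ids(s):
    """Return ([integer IDs], holdback_flag) from an analytics string.

    The holdback flag is True when the string carries the campaign
    holdback suffix, e.g. "(1234):(1234):(4321):holdback".
    """
    holdback = s.endswith(":holdback")
    if holdback:
        s = s[: -len(":holdback")]
    ids = [int(i) for i in re.findall(r"\((\d+)\)", s)]
    return ids, holdback
```

The same function also works on unmasked strings, since the IDs are always parenthesized.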