
Delivery assurance concepts

This topic shows Optimizely partners how to achieve peak performance and address business and implementation bottlenecks. It also describes tradeoffs that improve the overall performance of your site and tips for a smooth project.

Optimizely DXP is a set of modules that offers tremendous depth and capabilities.

Optimizely Content Management System (CMS) and Optimizely Customized Commerce are tightly bound, so both are included in this topic.

Types of performance bottlenecks

Many factors can cause performance issues, but they tend to fall into one of the following categories:

  • Content structure and presentation – Content structure affects performance for both the end user and the content editor. As sites grow over time, monitoring content structure becomes an important, ongoing task: watch how much data is loaded, large images, thousands of pages under one node in the CMS page tree or the Customized Commerce catalog, and so on.
  • Back-end performance – Coding issues in the APIs, inefficient querying or loading, and database issues can affect performance. Issues can affect a single user or get worse for everyone as the load increases. Problems may occur in the base code or custom code.
  • Third-party interactions – Real-time calls from an external provider or importing product data to the platform can cause performance issues. While these issues are sometimes the most difficult to manage because they are out of the developer's control, there are some strategies you can employ.

Design considerations and approach

The approach to customizations and implementation options should consider performance impact. To track your progress, start testing early. For example, if you have a product import with hundreds of thousands of SKUs, start testing with large volumes early to give yourself the chance to improve it.

The following sections describe design considerations for your application.

Content Modeling

Content modeling is part of every Optimizely project. It is a complex task that often has changing requirements. Done properly, it helps you in every way.

  • Keep the editor in mind – Content matters, and editors create content. What seems logical to the developer may not be easy for content creators, such as having too many page or block templates, too many pages under one node, or content containing many nesting levels, complicating the editor's tasks. Categorization and taxonomy can help with structuring content.
  • Display names and descriptions – Use names and full descriptions that consider how a non-developer approaches content and avoid developer shortcuts, code names, and abbreviations.
  • Structure properties in tabs – Find a logical way to structure properties into tabs. What is logical is subjective. Some like to separate tabs for content and page settings; others prefer to use them by frequency.
  • Limit the number of available content types – Too many content types are time-consuming for developers and editors to maintain. Editors do not need every page type, such as a structural branch in the page tree, a settings page or block, or a 404 page. You can hide those in code or control them with access rights, making them available only to a few administrators.
  • Restrict block usage – You can restrict block types so that specific blocks can be used only in specific content areas. This makes it easier for editors to keep a consistent design, has less risk for editors to make mistakes, and keeps a cleaner structure throughout the site.
  • Avoid layers of nested blocks – Multiple layers of nested blocks can be useful in some cases. However, it can be difficult for an editor to understand and maintain multiple layers of nesting. Loading many blocks in a nested structure can also cause performance issues. Flat structures and other solutions (such as lists) may accomplish the same goal.
  • Catalog structure – Optimizely's catalog system provides great flexibility for designing and implementing catalogs. You should consider the following design factors.
    • You need to plan how to support and maintain ecommerce solutions that require integrations into other systems to manage products in the catalog. Products are likely imported from an enterprise resource planning (ERP) or product information management (PIM) system. A personalization engine might be added later.
    • Define what you want to accomplish with the catalog.
      • How will it be managed?
      • Are there marketing, promotion, and merchandising requirements that need support?
      • Does the catalog structure make sense to the business team?
    • Plan the data you need in the catalog. In an ERP, there is a lot of product data; probably not all of it is needed in the ecommerce catalog. Usually, "less is more," and only data with a purpose – such as search, display purposes, or administrative tasks – should be in the catalog.
    • Avoid too many variants directly belonging to one product. From a performance perspective, keep the number of variants well below a hundred; around 30 variants per product is a good target for both performance and manageability. How the data is used affects performance, but beyond that range the risk of degradation is higher, and hundreds of variants is also not editor-friendly.
    • A product-product-variant hierarchy – a parent product containing child products, each with its own variants – is usually a good pattern to reduce the number of variants per product. You can also use categorization to reduce the number of variants per product.
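The grouping idea behind a product-product-variant hierarchy can be sketched in a few lines. This is a hypothetical, platform-neutral Python illustration (the function name, attribute names, and sample data are assumptions, not an Optimizely API): splitting a flat set of variants by one attribute, such as color, keeps each child product within the recommended variant count.

```python
from collections import defaultdict

def group_variants_by(variants, attribute):
    """Split a flat variant list into child products keyed by one
    attribute, so each child product stays near the ~30 variant target."""
    groups = defaultdict(list)
    for variant in variants:
        groups[variant[attribute]].append(variant)
    return dict(groups)

# 300 flat variants under one product: 10 colors x 30 sizes.
variants = [{"sku": f"SHIRT-{c}-{s}", "color": c, "size": s}
            for c in range(10) for s in range(30)]

children = group_variants_by(variants, "color")
# 10 child products with 30 variants each, instead of 300 under one product.
assert len(children) == 10
assert all(len(v) == 30 for v in children.values())
```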


Import products

Importing products to the catalog is part of almost every ecommerce project and requires a lot of mapping and test runs to ensure properties are stored correctly. If you need to import hundreds of thousands of SKUs, test and optimize the import for the best performance. If you update the catalog frequently, create a delta import so that you do not re-import thousands of unchanged products.
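The core of a delta import is deciding cheaply which products actually changed. A minimal Python sketch of that idea, assuming you can persist a fingerprint per SKU between runs (the hashing scheme and field names are illustrative, not an Optimizely API):

```python
import hashlib
import json

def content_hash(product):
    """Stable fingerprint of a product's importable fields."""
    payload = json.dumps(product, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def delta(incoming, stored_hashes):
    """Return only products whose data changed since the last import.
    `stored_hashes` maps SKU -> fingerprint persisted from the previous run."""
    changed = []
    for product in incoming:
        h = content_hash(product)
        if stored_hashes.get(product["sku"]) != h:
            changed.append(product)
            stored_hashes[product["sku"]] = h
    return changed

stored = {}
batch = [{"sku": "A", "price": 10}, {"sku": "B", "price": 20}]
assert len(delta(batch, stored)) == 2              # first run: everything is new
batch[0]["price"] = 12
assert [p["sku"] for p in delta(batch, stored)] == ["A"]  # only A changed
```

With hundreds of thousands of SKUs, skipping the unchanged majority is usually the single biggest saving in import time and database load.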


Personalized recommendations

Visitor groups let you personalize content based on criteria for each group. However, too many visitor group segments on a page can be difficult to maintain and might affect performance. If you use visitor groups, be aware of caching rules for these pages because personalization is not applied to areas of a page that are cached.

See also Recommendations.

Log configuration

The Episerver.Logging.LogManager logging abstraction directs DXP cloud service logs into .NET Diagnostic Trace and stores them in BLOB storage; logs can be accessed from the portal as a live stream or downloaded for offline analysis. Never set up logging into the database, because it will affect performance and can cause the database to bloat.

Optimizely keeps an activity log of changes made to content items, which is available in the Admin view. You can change the default values to extend the activities with your own types. By default, activities are persisted for 12 months. If you set a longer time, monitor the database size.


Schedule jobs

Optimizely provides default scheduled jobs for various tasks, and you can also build your own jobs. Every job uses resources in your solution, which might affect performance.

  • Consider how often a job needs to be executed.
  • Run jobs during off-peak hours. For example, a product import might take a couple of hours or more to complete and involves a lot of database interaction.
  • Many default scheduled jobs are related to the solution, such as Maintain DB indexes, Change log auto truncate, and Automatic emptying of trash. Consider these default jobs and the intervals and times that suit your project.
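The interval-and-off-peak advice above can be expressed as a small scheduling rule. A hedged Python sketch, where the peak window is an assumption you would tune to your own traffic profile (this is illustrative logic, not Optimizely's scheduler):

```python
from datetime import datetime, timedelta

# Assumed peak window, 07:00-22:00 local time; tune per project.
PEAK_START, PEAK_END = 7, 22

def next_run(after, interval_hours):
    """Advance by the job interval, then push the run time out of the
    peak window so heavy jobs execute off-peak."""
    candidate = after + timedelta(hours=interval_hours)
    while PEAK_START <= candidate.hour < PEAK_END:
        candidate += timedelta(hours=1)
    return candidate

run = next_run(datetime(2024, 1, 1, 6, 0), interval_hours=4)
# 10:00 falls inside the peak window, so the run is pushed to 22:00.
assert run == datetime(2024, 1, 1, 22, 0)
```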


Database access

Optimizely is designed with a code-first approach, so there is no need to access the database directly.

  • APIs – You should never access the database or stored procedures directly or modify the Optimizely data schema; doing so can leave you unable to upgrade and is bad for performance. Use the provided APIs to access, modify, and create data.
  • Custom DB tables – You can add custom database tables if you need to store custom data in the Optimizely database. However, ensure you do not interfere with the standard schema and, if possible, do not query the standard tables directly. The built-in cache layer is not available for custom tables, so if you add them, ensure they are optimized for performance, for example with suitable table indexes, and use clean-up jobs where needed.
  • Caching – Try to minimize calls to the database and use the caching layers described in the next section.



Caching

Caching is the single biggest tool you can leverage to ensure good performance.

  • Object cache – Optimizely automatically caches a read-only version of content objects requested from the internal APIs for better performance. This is called the object cache, which you can configure if needed. For Commerce (PaaS), you can have different expiration times for each subsystem, such as the catalogs and orders.
  • Output cache – You can cache the entire HTML response of a page, which saves processing time.


Content delivery network (CDN)

Every customer on Optimizely DXP has a content delivery network (CDN) included in the solution, which can improve performance considerably, especially in global scenarios. Traffic to your site first goes to the CDN; if there is no cached response, the request goes on to the web apps.

When you build your application, consider the CDN from the start, not something you add right before going live.

Determine which of your frequently requested objects can be cached. Typical objects are images, videos, CSS, and static scripts, but what to cache, and to what extent, is up to each implementation.

Cloudflare relies on file extensions rather than MIME types, so ensure files are named appropriately. Some tools provide insight into what you are caching in the CDN, such as Dr Flare, a Chrome add-on that shows what is and is not cached in Cloudflare on your site.

  • Configuring cache headers – The CDN is controlled by setting cache headers in the application to decide what to cache, for how long, and how to determine whether content has changed. Each project determines how to configure the headers and which strategy to use. The cache hit rate describes the percentage of requests the CDN could answer from its cache and is a good number to monitor.
  • Cache-control – You decide if the content should be cached. If set to public, the content is cached based on your rules. The following example caches an object for a maximum of 1200 seconds:
Cache-Control: public, max-age=1200, must-revalidate

For information about headers and configuration, see Demystify Edge TTL in Cloudflare.

  • ETags – ETags are best used for static content such as files, assets, and pages without an output cache. They should not be used for dynamic content such as a landing page. You can use ETags alone or in combination with other cache settings if you want to optimize performance. In addition to files and assets, you can use ETags to help custom API controllers respond faster. However, ensure the API does not serve dynamic content or include personalization.
  • Version identifiers – In most cases, you can include a version identifier in the path to a resource. The best way is to use unique file names for each deployment. For example, use site-1.0.css for the first deployment and site-1.1.css for the second. This way, the CDN serves the correct CSS right after deployment. If you cannot use version identifiers, set a maximum lifespan for objects. A rule of thumb is: version identifier plus indefinite caching, or no identifier plus maximum lifespan.
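The ETag revalidation flow described above can be sketched as a conditional GET: the server fingerprints the response body, and when the client presents the same fingerprint back, it answers 304 without resending the body. A minimal, framework-free Python illustration (the function names and max-age value are illustrative; the 1200-second value matches the Cache-Control example earlier in this section):

```python
import hashlib

def make_etag(body):
    """Derive a weak content fingerprint; real servers may use mtime/size."""
    return '"' + hashlib.md5(body).hexdigest() + '"'

def respond(body, if_none_match=None):
    """Return (status, headers, payload) for a conditional GET."""
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, {"ETag": etag}, b""  # client's cached copy is still fresh
    return 200, {"ETag": etag,
                 "Cache-Control": "public, max-age=1200, must-revalidate"}, body

body = b"<html>static asset</html>"
status, headers, _ = respond(body)
assert status == 200
status2, _, payload = respond(body, if_none_match=headers["ETag"])
assert status2 == 304 and payload == b""  # revalidated without resending the body
```

The saving is the payload: a 304 response carries headers only, which is why ETags pay off for large static assets but add little for highly dynamic pages.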


Subscription health and monitoring

  • DXP Management Portal – The DXP management portal gives you an overview of your subscription and operations in your environments. If there are issues with your site, a good place to start is the Troubleshooting tab, where you can open a log stream to see errors in real time. Further, you can restart environments and purge the CDN if needed.
  • Application Insights – By default, Application Insights is configured for every customer. Optimizely Support uses it, and partners can get access to troubleshoot exceptions and performance issues in the web apps.


Search & Navigation

Search & Navigation is included when you run solutions on Optimizely DXP. You can make your content searchable and also build listings and landing pages.

When you need to request and filter a batch of content, such as when you might request pages based on a certain property, use Search & Navigation instead of getting the content from the database.

Also, use Search & Navigation when building content or commerce listings.

You also can create faceting and filtering through the API to improve the end-user experience.

Index Search & Navigation

By default, IContent on your site is indexed and available for search and listings. When a content item is published, it is instantly added to or updated in the index. This is called event indexing.

A scheduled job is included to index the entire site. You should rely on event indexing when you implement your solution.

Indexing takes time and resources in your web app and your Search & Navigation index, so you should not run the full indexing job too often. Instead of re-indexing more frequently, investigate and fix the root cause when event indexing fails to add or update documents in the index.

Multiple use cases exist where you do not want all content or certain properties indexed. For example, the start page or a "My pages" page might not belong in the index or be available from a search. Be aware of the following issues when constructing the index:

  • Think through levels of nested documents because nesting decreases performance when retrieved. This relates to content or catalog structure; another area is to limit the depth of content areas to index.
  • If you use dictionaries, limit the number of unique keys because too many will hurt performance.
  • There might be cases where you do not want prices and inventory in your index. Complex pricing structures with hundreds of thousands of prices will take longer to index and be queried.

Query Search & Navigation

Almost all sites have a search page. You should consider the design features you want to leverage from the start. For example, what type of documents should be searchable? Do you want to leverage features like best bets?

You can implement Search & Navigation as a typed or unified search.

  • Use typed search for most querying scenarios, such as retrieval, navigation, and listings that do not involve free-text search, and for fine-grained, type-specific filtering.
  • Use unified search when you build standard search pages that do not require filtering on type-specific properties and generic functionality without knowing the content types for which the index will be used.

In many cases, especially if you have a lot of content or products, you should implement pagination. If you return all results for a search, it often results in a lot of data. Load smaller batches as the user interacts with the results. Also, when you learn about your visitor's behavior, you might realize that 90% of searches are for products, so you can load products initially and load content items and documents later when the user interacts with those types of results.
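The pagination advice above amounts to translating a page number into the skip/take values a search query would use, so each request loads one small batch. A hedged, generic Python sketch (the function and parameter names are illustrative, not the Search & Navigation API):

```python
def paginate(total_hits, page, page_size=20):
    """Translate a 1-based page number into skip/take values for a
    search query, plus the total page count for the UI."""
    skip = (page - 1) * page_size
    take = min(page_size, max(total_hits - skip, 0))
    pages = -(-total_hits // page_size)  # ceiling division
    return skip, take, pages

# 187 hits, 20 per page: page 2 skips the first 20 and takes 20 more...
assert paginate(187, page=2) == (20, 20, 10)
# ...and the last page takes only the 7 remaining hits.
assert paginate(187, page=10) == (180, 7, 10)
```

The same skip/take pair also supports the staged loading described above: request the product batch first, then fetch content and document batches only when the user switches to those result types.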

Monitor your usage of wildcard searches because, although useful, excessive use may seriously impact performance. Follow the recommendations in the documentation to avoid performance degradation.

Optimize Search & Navigation

Search & Navigation contains multiple optimization options, in code and for the editor, such as boosting, best bets, auto-complete, and related queries.

For the editor, there are also search statistics to evaluate user behavior. To get statistics, and to enable features such as autoboosting and auto-complete, you need to implement click tracking.

Tracking stores clicks from a user after a search to improve results based on behavior. You do not need tracking for a completely functional search, but some features need tracking to let you optimize.

Build a resilient solution

Search & Navigation is useful for building landing pages, listings, and navigation, so you should build resilience into your site in case the service becomes unavailable. See Building a resilient solution for how to cope with downtime.

See also the following topics:

  • Querying Search & Navigation
  • Optimizing Search & Navigation
  • Build a resilient solution

Go live


You can deploy through the DXP portal or by using the deployment API. You can connect your deployment tool, such as Azure DevOps or Octopus Deploy, to the APIs.

  • Deployment API – The deployment API is a REST API that lets you automate your CI/CD pipeline. You can also export your databases and sync content from production to preproduction and integration. Customers that use the deployment API deploy four times as often as other customers.
  • Deploy database changes – Deployments sometimes include breaking changes where the old site version does not work with the new database schema. To address this, display a maintenance page while the database is taken offline and the schema is updated.
  • Smooth deploy (Content Cloud only) – Smooth deploy offers zero-downtime deployments that apply database changes without taking the site offline or using a maintenance page. If you use smooth deployment, the database must support read-only mode.
  • Warmup – When a web app starts, whether from scaling out or deploying, you should warm it up before it receives user requests. By default, Optimizely's automation engine issues a request to the start page. You can add more pages and customize the warmup in the web config.
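The warmup step boils down to requesting a list of representative paths and checking that each responds successfully before the instance takes live traffic. A minimal, platform-neutral Python sketch (the function names are hypothetical, and the HTTP client is injected so the logic stays testable; this is not Optimizely's automation engine):

```python
def warm_up(base_url, paths, fetch):
    """Request each warmup path and collect failures before the app
    is put into rotation. `fetch` takes a URL and returns an HTTP
    status code; inject a real HTTP client in actual use."""
    failures = []
    for path in paths:
        status = fetch(base_url + path)
        if status >= 400:
            failures.append((path, status))
    return failures

# Stubbed fetcher for illustration; swap in urllib/requests in real use.
responses = {"https://example.com/": 200,
             "https://example.com/products": 200,
             "https://example.com/broken": 500}
fails = warm_up("https://example.com", ["/", "/products", "/broken"],
                fetch=lambda url: responses[url])
assert fails == [("/broken", 500)]
```

Warming the start page alone primes only part of the caches; adding a few heavy, representative pages (search, category listings) gives the instance a realistic warm start.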


Load test

You should load test your site before going live, preferably more than two weeks before, so you have time to evaluate the results. It will help you learn how the site behaves under your expected traffic load and what happens when the load increases. As a partner, you run the load test using a tool of your choice.


Upgrade strategy

Optimizely has a continuous release strategy, meaning software is released when it is done. Optimizely releases updates every week. The releases follow semantic versioning: Major.Minor.Patch. Majors contain breaking changes, minors add functionality, and patches are bug fixes. How often you upgrade your solution is up to you. You do not need to upgrade every week, but the more often you upgrade, the smoother each upgrade is, and the sooner you get new features and fixes.
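The semantic versioning rule above gives you a quick triage when reviewing weekly package updates: only a major bump can contain breaking changes. A tiny Python sketch of that check (illustrative only; it ignores pre-release tags and assumes well-formed Major.Minor.Patch strings):

```python
def may_break(current, target):
    """Under semantic versioning, only a major-version bump may
    contain breaking changes that need migration review."""
    return int(target.split(".")[0]) > int(current.split(".")[0])

assert may_break("12.4.1", "13.0.0") is True    # major bump: review breaking changes
assert may_break("12.4.1", "12.5.0") is False   # minor: added functionality only
assert may_break("12.4.1", "12.4.2") is False   # patch: bug fixes only
```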


Optimizely provides developers and editors with extensive documentation about its products. The Optimizely World site has a forum and blogs where you can ask questions, write about something you created, learn from others, and find inspiration. The Optimizely community is highly active and a great source of information on leveraging the Optimizely platform.

OMVPs – The Optimizely Most Valuable Professionals (OMVP) program is Optimizely's way of saying "thank you" to its most active and knowledgeable contributors. OMVPs help Optimizely by providing feedback on products, and they are active mentors who lead the community, providing an enormous amount of help, knowledge, and inspiration in an open, courteous, and professional manner. Anyone building solutions on Optimizely benefits from reading their blogs and forum posts.