
Delivery Assurance Concepts

This topic shows Optimizely partners how to achieve peak performance and address business and implementation bottlenecks. It also describes some tradeoffs to improve overall performance of your site and tips for a smooth project.

Introduction

Optimizely DXP is a set of modules that offers tremendous depth and capabilities.

Content Cloud and Commerce Cloud are tightly bound together, so both are included in this topic.

Types of performance bottlenecks

Performance issues can be caused by many factors, but they tend to fall into one of the following categories:

  • Content structure and presentation. Content structure affects performance for both the end user and the content editor. As sites grow over time, monitoring content structure becomes an important, ongoing task: watch for how much data is loaded per request, very large images, thousands of pages under one node in the Content Cloud page tree or Commerce Cloud catalog, and so on.
  • Back-end performance. Coding issues in the APIs, inefficient querying/loading, and database issues can affect performance. Issues can affect a single user, or get worse for everyone as the load increases. Problems may occur in the base code or in custom code.
  • Third-party interactions. Real-time calls to an external provider, or importing a lot of product data to the platform, can cause performance issues. While these issues are sometimes the most difficult to manage because they are out of the developer’s control, there are strategies you can employ.

Design considerations and approach

The approach to customizations and implementation options should consider performance impact. To track your progress, start testing early. For example, if you have a product import with hundreds of thousands of SKUs, start testing with big volumes early to give yourself time to improve it.

The following sections describe design considerations for your application.

Content Modeling

Content modeling is part of every Optimizely project. It is a complex task, often with changing requirements. Done properly, it benefits editors, developers, and site performance.

  • Keep the editor in mind. Content matters, and content is created by editors. What seems logical to the developer may not be easy for content creators, such as having too many page/block templates, too many pages under one node, or content that contains many levels of nesting, which complicates the editor's tasks. Categorization and/or taxonomy can help with structuring content.
  • Display names and descriptions. Use names and full descriptions that consider how a non-developer approaches content, and avoid developer short-cuts, code names, and abbreviations.
  • Structure properties in tabs. Find a logical way to structure properties into tabs. What is logical is subjective. Some like to separate tabs for content and page settings; others by frequency of use.
  • Limit the number of available content types. Having too many content types is time-consuming for both developers and editors to maintain. Editors do not need every page type, such as a branch container in the page tree, a settings page/block, or a 404 page. You can hide those in code or control them with access rights, making them available to a small number of administrators.
  • Restrict block usage. You can restrict block types so that specific blocks can be used only in specific content areas. This makes it easier for editors to keep a consistent design, less risk for editors to make mistakes, and keeps a cleaner structure throughout the site.
  • Avoid layers of nested blocks. Multiple layers of nested blocks can be useful in some cases. However, they can be difficult for an editor to understand and maintain, and loading many blocks in a nested structure can also cause performance issues. Flat structures and other solutions (such as lists) may accomplish the same goal.
  • Catalog structure. Optimizely’s catalog system provides great flexibility for designing and implementing catalogs. You should consider the following design factors.
    • You need to plan for how to support and maintain e-commerce solutions that require integrations with other systems to manage products in the catalog. Products are likely imported from an enterprise resource planning (ERP) system or product information management (PIM) system. A personalization engine might be added later.
    • Define what you want to accomplish with the catalog.
      • How will it be managed?
      • Are there marketing, promotion, and merchandising requirements that need to be supported?
      • Does the catalog structure make sense to the business team?
    • Plan the data you need in the catalog. An ERP contains a lot of product data, and probably not all of it is needed in the e-commerce catalog. Usually, "less is more": only data with a purpose - such as search, display, or administrative tasks - should be in the catalog.
    • Do not have too many variants directly belonging to one product. From a performance perspective, keep the number of variants below a hundred. How the data is used impacts performance, but beyond that number the risk of degradation is higher. Having hundreds of variants is also not user-friendly. Aim for about 30 variants beneath a product for performance and manageability.
    • To reduce the number of variants per product, a product-product-variant hierarchy is usually a good pattern. Also, you can use categorization to reduce the number of variants per product.
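
The content type restrictions described above can be expressed in code. The following is a minimal sketch using the CMS content type attributes; the type names and GUIDs are placeholders, not types from any real solution:

```csharp
using EPiServer.Core;
using EPiServer.DataAnnotations;

// Hypothetical block type, used to restrict a content area below.
[ContentType(DisplayName = "Teaser Block", GUID = "c2f1a6d0-0000-0000-0000-000000000001")]
public class TeaserBlock : BlockData
{
}

// A settings page that editors never create themselves: hide it from edit mode
// so only administrators (or code) work with it.
[ContentType(DisplayName = "Settings Page",
    GUID = "c2f1a6d0-0000-0000-0000-000000000002",
    AvailableInEditMode = false)]
public class SettingsPage : PageData
{
}

[ContentType(DisplayName = "Standard Page", GUID = "c2f1a6d0-0000-0000-0000-000000000003")]
public class StandardPage : PageData
{
    // Restrict this content area so editors can only drop TeaserBlock instances,
    // which keeps the design consistent and reduces editor mistakes.
    [AllowedTypes(typeof(TeaserBlock))]
    public virtual ContentArea MainContentArea { get; set; }
}
```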


Importing products

Importing products to the catalog is part of almost every e-commerce project, and it requires lots of mapping and test runs to ensure properties are correctly stored. If you need to import hundreds of thousands of SKUs, manage it with testing and optimizing for best performance. If you update the catalog frequently, create a delta import so that you do not have to re-import thousands of unchanged products.
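
A delta import boils down to detecting which SKUs actually changed since the last run. A minimal, generic sketch (the field list, and where you persist the previous run's fingerprints, are up to your integration):

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

// Compute a fingerprint per SKU from the fields you import; compare with the
// fingerprints stored from the previous run and only import changed products.
static class DeltaImport
{
    public static string Fingerprint(string sku, string name, decimal price) =>
        Convert.ToBase64String(
            SHA256.HashData(Encoding.UTF8.GetBytes($"{sku}|{name}|{price}")));

    public static IEnumerable<string> ChangedSkus(
        IDictionary<string, string> previous,   // sku -> fingerprint from last run
        IDictionary<string, string> current)    // sku -> fingerprint from this feed
    {
        foreach (var (sku, fp) in current)
            if (!previous.TryGetValue(sku, out var old) || old != fp)
                yield return sku; // new or changed: only these are imported
    }
}
```

With this approach, a nightly feed of hundreds of thousands of rows typically shrinks to a small set of actual changes to write to the catalog.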


Personalization

Visitor groups let you personalize content based on criteria for each group. However, too many visitor group segments on a page can be difficult to maintain and might affect performance. If you use visitor groups, be aware of caching rules for these pages, because personalization is not applied to areas of a page that are cached.

See also Personalization.

Log configuration

The Episerver.Logging.LogManager logging abstraction directs DXP cloud service logs into .NET Diagnostic Trace and stores them in BLOB storage, where they can be accessed from the portal as a live stream or downloaded for offline analysis. Never set up logging to the database, because it affects performance and can bloat the database.
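
A hedged example of resolving a logger through that abstraction; the class and the messages are hypothetical, but the pattern keeps log output in the DXP log stream and BLOB storage rather than a database table:

```csharp
using System;
using EPiServer.Logging;

public class OrderExporter
{
    // Resolve a logger through the logging abstraction once per class.
    private static readonly ILogger _log = LogManager.GetLogger(typeof(OrderExporter));

    public void Export(int orderId)
    {
        _log.Information($"Exporting order {orderId}");
        try
        {
            // ... export work ...
        }
        catch (Exception ex)
        {
            // Log the exception with context instead of swallowing it.
            _log.Error($"Export failed for order {orderId}", ex);
        }
    }
}
```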

Optimizely keeps an activity log of when changes are made to content items. This is available in the Admin view. You can change the default values and extend the activities with your own types. By default, activities are persisted for 12 months. If you set a longer time, monitor the database size.


Timing of scheduled jobs

Optimizely provides default scheduled jobs for different tasks, and you can also build your own. Any job uses resources in your solution, which might affect performance.

  • Consider how often a job needs to execute.
  • Run a job during non-peak hours. For example, a product import might take a couple of hours or more to complete and includes a lot of interaction with the database.
  • Many of the default scheduled jobs are related to maintaining the solution, such as Maintain DB indexes, Change log auto truncate, and Automatic emptying of trash. Consider these default jobs and the intervals and times that suit your project.
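
A custom job can be sketched roughly as follows; the job name, GUID, and the work it does are placeholders. Schedule it for off-peak hours in admin view, and keep Execute idempotent so a missed run can safely be re-run:

```csharp
using EPiServer.PlugIn;
using EPiServer.Scheduler;

// Hypothetical nightly maintenance job; the GUID is a placeholder.
[ScheduledPlugIn(DisplayName = "Nightly product cleanup",
    GUID = "b3d9c0de-0000-0000-0000-000000000003")]
public class NightlyCleanupJob : ScheduledJobBase
{
    private bool _stopSignaled;

    public NightlyCleanupJob() => IsStoppable = true;

    public override void Stop() => _stopSignaled = true;

    public override string Execute()
    {
        OnStatusChanged("Cleaning up...");
        // ... do the work in batches, checking _stopSignaled between batches
        // so the job can be stopped from admin view without leaving half-done work ...
        return "Cleanup finished";
    }
}
```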


Database access

Optimizely is designed with a code-first approach, so there is no need to access the database directly.

  • APIs. Never access the database or stored procedures directly, or modify the Optimizely data schema; doing so can prevent you from upgrading and is bad for performance. Always use the APIs provided for accessing, modifying, and creating data.
  • Custom DB tables. If you need to store custom data in the Optimizely database, you can add custom database tables, but make sure you do not interfere with the standard schema and, if possible, do not access it directly. The standard cache layer is not available for custom tables. Also, if you add custom tables, make sure they are optimized for performance, for example with suitable table indexes. Use clean-up jobs where needed.
  • Caching. Try to minimize calls to the database and use the caching layers described in the next section.


Caching

Caching is the single biggest tool you can leverage to ensure good performance.

  • Object cache. By default, Optimizely automatically caches a read-only version of content objects requested from the internal APIs for better performance. This is called the object cache, which you can configure if needed. For Commerce Cloud, you can have different expiration times for each subsystem, such as catalogs and orders.
  • Output cache. You can cache the entire HTML response of a page, which saves processing time.
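
For data you load yourself, the same idea can be applied with the framework's synchronized object cache, which also invalidates entries across instances in a load-balanced setup. A sketch, assuming a hypothetical expensive price-list lookup; the key convention and expiration are choices you make per project:

```csharp
using System;
using EPiServer.Framework.Cache;
using EPiServer.ServiceLocation;

public class PriceList { /* ... */ }

public class PriceListCache
{
    private readonly ISynchronizedObjectInstanceCache _cache =
        ServiceLocator.Current.GetInstance<ISynchronizedObjectInstanceCache>();

    public PriceList GetPriceList(string market)
    {
        var key = "pricelist:" + market; // cache key convention is up to you

        if (_cache.Get(key) is PriceList cached)
            return cached; // served from cache: no database round trip

        var loaded = LoadFromDatabase(market); // the expensive call
        _cache.Insert(key, loaded,
            new CacheEvictionPolicy(TimeSpan.FromMinutes(10), CacheTimeoutType.Absolute));
        return loaded;
    }

    private PriceList LoadFromDatabase(string market) => new PriceList();
}
```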

The CDN

Every customer on Optimizely DXP has a content delivery network (CDN) included in the solution, which can improve performance considerably, especially in global scenarios. All traffic to your site first comes to the CDN, and if there is no cached response, the request goes to the web apps.

When you build your application, consider the CDN from the start rather than treating it as something you add right before going live.

Determine which of your frequently requested objects can be cached. Typical objects are images, videos, CSS, and static scripts, but what to cache, and to what extent, is up to each implementation.

Cloudflare relies on file extension names rather than MIME types, so make sure files are named appropriately. There are tools that provide insight into what you are caching in the CDN, such as Dr. Flare, a Chrome add-on that shows what is and is not cached in Cloudflare on your site.

  • Configuring cache headers. The CDN is controlled by setting cache headers in the application to decide what to cache, for how long, and how to determine whether content has changed. Each project determines how to configure the headers and the strategy it uses. The cache hit rate describes the percentage of requests the CDN could answer from its own cache and is a good number to monitor.
  • Cache-control. You decide if the content should be cached. If set to public, the content is cached based on the rules you set. The following example caches an object for a maximum of 1200 seconds:
Cache-Control: public, max-age=1200, must-revalidate
For information about headers and configuration, see [Demystifying Edge TTL in Cloudflare](https://world.optimizely.com/blogs/elias-lundmark/dates/2020/4/demystifying-edge-ttl-in-cloudflare/).
  • ETags. ETags are best used for static content such as files, assets, and pages without output cache. They should not be used for dynamic content such as a landing page. You can use stand-alone ETags in combination with other cache settings if you want to optimize performance. In addition to files and assets, you can use ETags for custom API controllers to get faster responses, but make sure the API does not serve dynamic content or include personalization.
  • Version identifiers. In most cases, you can include a version identifier in the path to a resource. The best way is to use unique file names for each deployment. For example, use site-1.0.css for the first deploy and site-1.1.css for the second. This way the CDN uses the correct CSS right after deployment. If you cannot use version identifiers, make sure to set a maximum lifespan for objects. A rule of thumb: version identifier plus indefinite caching, or no identifier plus a maximum lifespan.
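
In an ASP.NET (.NET Framework) controller, the Cache-Control header from the example above could be produced roughly like this; the controller name and file path are hypothetical:

```csharp
using System;
using System.Web;
using System.Web.Mvc;

public class AssetController : Controller
{
    // Produces: Cache-Control: public, max-age=1200, must-revalidate
    public ActionResult Styles()
    {
        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetMaxAge(TimeSpan.FromSeconds(1200));
        Response.Cache.AppendCacheExtension("must-revalidate");
        return File(Server.MapPath("~/static/site.css"), "text/css");
    }
}
```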


Subscription health and monitoring

  • DXP Management Portal. The DXP Management Portal gives you an overview of your subscription and lets you perform operations on your environments. If there are issues with your site, a good place to start is the Troubleshooting tab, where you can open a log stream to see errors in real time. You can also restart environments and purge the CDN if needed.
  • Application Insights. By default, Application Insights is configured for every customer. It is used by Optimizely support, and partners can get access to troubleshoot exceptions and performance issues in the web apps.


Search & Navigation

Search & Navigation is included when you run solutions on Optimizely DXP. You can make your content searchable and also build listings and landing pages.

General recommendations

When you need to request and filter a batch of content, such as when you might request pages based on a certain property, use Search & Navigation instead of getting the content from the database.

Also use Search & Navigation when you build listings for content or commerce.

You also can create faceting and filtering through the API to improve the end user experience.

Indexing

By default, IContent on your site is indexed and made available to use in search and listings. When a content item is published, it is instantly added to or updated in the index. This is called event indexing.

A scheduled job is included for indexing the entire site, but you should rely on event indexing when you implement your solution.

Indexing takes time and resources on your web app and the Search & Navigation index, so do not run indexing jobs too often. Instead of re-indexing more frequently, investigate and fix issues with event indexing when documents fail to be added to the index.

There are multiple use cases where you do not want all content or certain properties indexed. For example, the start page or a “My pages” page might not be suitable to have in the index and available from a search. Be aware of the following issues when constructing the index.

  • Think through levels of nested documents because nesting decreases performance when retrieved. This relates to content/catalog structure; another area is to limit the depth of content areas to index.
  • If you use dictionaries, limit the number of unique keys because too many will have a negative impact on performance.
  • There might be cases where you do not want prices and inventory in your index. Complex pricing structures with hundreds of thousands of prices take longer both to index and to query.
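
Excluding content or properties from the index is typically done with indexing conventions. A sketch, where StartPage, ProductPage, and its RawSpecificationData property are hypothetical types from the site:

```csharp
using EPiServer.Find;
using EPiServer.Find.Cms;
using EPiServer.Find.ClientConventions;

public static class FindIndexingConventions
{
    public static void Apply(IClient client)
    {
        // Keep the start page (and anything else unsuitable for search)
        // out of the index entirely.
        ContentIndexer.Instance.Conventions
            .ForInstancesOf<StartPage>()
            .ShouldIndex(x => false);

        // Skip a heavy property that is never searched or shown in results,
        // which keeps documents small and indexing fast.
        client.Conventions.ForInstancesOf<ProductPage>()
            .ExcludeField(x => x.RawSpecificationData);
    }
}
```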

Querying Search & Navigation

Almost all sites have a search page. Think through the design features you want to leverage from the start. For example: What types of documents should be searchable? Do you want to leverage features like best bets?

You can implement Search & Navigation as a typed search or a unified search.

  • Use typed search for most querying scenarios, such as retrieval, navigation, and listings that do not involve free-text search, and for fine-grained, type-specific filtering.
  • Use unified search when you build standard search pages that do not require filtering on type-specific properties, or generic functionality where you do not know in advance which content types the index will be used for.

In many cases, especially if you have a lot of content or products, you should implement pagination. If you return all results for a search, it often results in a lot of data. Load smaller batches as the user interacts with the results. Also, when you learn about your visitor’s behavior you might realize that 90% of searches are for products, so you can load products initially and load content items and documents later when the user interacts with those types of results.
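
A typed search with a type-specific filter and pagination might look roughly like this; ProductPage and its InStock field are hypothetical:

```csharp
using EPiServer.Find;
using EPiServer.Find.Cms;

public class ProductSearchService
{
    // Load one page of results at a time instead of returning everything.
    public IContentResult<ProductPage> Search(string query, int page, int pageSize = 20)
    {
        return SearchClient.Instance.Search<ProductPage>()
            .For(query)
            .Filter(x => x.InStock.Match(true))  // fine-grained, type-specific filter
            .Skip((page - 1) * pageSize)
            .Take(pageSize)
            .GetContentResult();
    }
}
```

Subsequent pages are loaded as the user interacts with the results, keeping the initial response small.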

Monitor your usage of wildcard searches because, although useful, excessive use may seriously impact performance. Follow the recommendations in the documentation to avoid performance degradation.

Optimizing Search & Navigation

Search & Navigation contains multiple optimization options, both in code and for the editor, such as boosting, best bets, auto complete, and related queries.

For the editor, there are also search statistics to evaluate user behavior. To get statistics, or to use auto boosting or autocomplete, an implementation of click tracking is needed.

Tracking stores clicks from a user after a search to improve results based on behavior. You do not need tracking for a completely functional search, but some features need tracking to let you optimize.

Build a resilient solution

If you build landing pages, listings, and navigation on Search & Navigation, you should build resilience into your site in case the service becomes unavailable. See Building a resilient solution for how to cope with downtime.

See also the following topics:

Indexing

Querying Search & Navigation

Optimizing Search & Navigation

Build a resilient solution

Go live

Deploying

You can deploy through the DXP portal or by using the deployment API. With the API, you can connect your own deployment tools, such as Azure DevOps or Octopus Deploy.

  • Deployment API. The deployment API is a REST API that lets you automate your CI/CD pipeline. You can also export your databases and synchronize content from production down to pre-production and integration. Customers that use the deployment API deploy four times as often as other customers.
  • Deploying database changes. At times, deployments include breaking changes where the old version of the site cannot work with the new version of the database. To address this, create a maintenance page which is shown when the database is taken offline, and the schema is updated.
  • Smooth deploy (Content Cloud only). Smooth deploy offers zero-downtime deployments that applies database changes without taking the site offline or using a maintenance page. If you use smooth deploy, the database must support read-only mode.
  • Warmup. When a new web app starts, whether for scaling out or deploying, you should warm it up before it starts receiving requests from users. By default, Optimizely’s automation engine attempts to issue a request to the start page. You can add more pages and customize the warmup by using the <applicationInitialization> section in web.config.
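
A warmup configuration might look like the following; the extra paths are placeholders for your own key entry pages:

```xml
<!-- web.config: warm up additional pages before the app receives traffic. -->
<system.webServer>
  <applicationInitialization>
    <add initializationPage="/" />
    <add initializationPage="/en/products/" />
    <add initializationPage="/api/navigation" />
  </applicationInitialization>
</system.webServer>
```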


Load testing

You should load test your site before going live, preferably more than two weeks before, so that you have time to evaluate the results. Load testing helps you learn how the site performs under your expected traffic load and what happens when load increases. The load test is run by you, the partner, using the tool of your choice.


Upgrade strategy

Optimizely has a continuous release strategy, which means software is released as soon as it is done; Optimizely releases updates every week. The releases follow semantic versioning: Major.Minor.Patch. Major versions contain breaking changes, minor versions add functionality, and patches are bug fixes. How often you upgrade your solution is up to you. You do not need to upgrade every week, but the more often you upgrade, the smoother each upgrade is, and you get new features and fixes sooner.

Community

Optimizely provides a lot of documentation around our products both for developers and editors. On the Optimizely World site there is a forum and blogs where you can ask questions, blog about something you created, and learn from others and find inspiration. Optimizely has a highly active community that is a great source of information on how to leverage the Optimizely platform.

OMVPs. The Optimizely Most Valuable Professionals program is our way of saying “thank you” to our most active and knowledgeable contributors. They provide feedback on products and are active mentors, leading the community with an enormous amount of help, knowledge, and inspiration in a manner that is open, courteous, and professional. Anyone building solutions on Optimizely benefits from reading their blogs and forum posts.
