
Crawl product pages

Describes how search engines crawl product pages.

Search engines use automated bots to crawl the content of your website. These bots keep the search engine's records of your web content, and its search indices for your site, up to date.

Optimizely Configured Commerce has a Search Engine Optimization (SEO) feature that serves bots server-side rendered content instead of the dynamically rendered Angular pages.

The following table lists the crawlers that trigger the SEO Catalog:

| Crawler              | Description                                                    |
|----------------------|----------------------------------------------------------------|
| bot                  | This substring catches all crawlers with "bot" in the UserAgent |
| crawler              | This substring catches all crawlers with "crawler" in the UserAgent |
| baiduspider          | Baidu's web crawling spider                                    |
| 80legs               | 80legs web crawling and screen scraping platform               |
| ia_archiver          | Alexa's web and site audit crawler                             |
| voyager              | Cosmix Corporation's web crawling bot                          |
| curl                 | Command-line tool for transferring data                        |
| wget                 | Command-line tool for retrieving files                         |
| yahoo! slurp         | Yahoo!'s web-indexing robot                                    |
| mediapartners-google | Google's web-indexing robot                                    |

From a coding standpoint, the logic is handled in the SearchCrawlerRouteConstraint class located in the InSite.Mvc.Infrastructure assembly. The Match method returns a Boolean indicating whether the incoming request's User Agent value contains any of the crawler substrings listed above.
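The Match check amounts to a case-insensitive substring test against the crawler list. The actual implementation is C# inside InSite.Mvc.Infrastructure; the sketch below is an illustrative Python equivalent, with the function name chosen here for clarity:

```python
# Substrings from the crawler table above; matching is case-insensitive.
CRAWLER_SUBSTRINGS = [
    "bot", "crawler", "baiduspider", "80legs", "ia_archiver",
    "voyager", "curl", "wget", "yahoo! slurp", "mediapartners-google",
]

def is_search_crawler(user_agent: str) -> bool:
    """Return True if the UserAgent contains any known crawler substring."""
    ua = user_agent.lower()
    return any(substring in ua for substring in CRAWLER_SUBSTRINGS)

# A Googlebot request matches on "bot"; an ordinary Chrome UA matches nothing.
print(is_search_crawler("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # True
print(is_search_crawler("Mozilla/5.0 Chrome/120.0 Safari/537.36"))    # False
```

Note that the broad "bot" and "crawler" substrings make the other entries partly redundant, but they also keep the list resilient to crawlers it does not name explicitly.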

This approach is required because the Configured Commerce front end uses AngularJS to render pages on the client, which crawlers cannot reliably execute.

When deploying and testing your website, we recommend using a browser with developer tools, such as Chrome, so you can spoof the User Agent value to test your SEO settings. Press F12, click Configure throttling, and select a Custom user agent of your choosing.
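You can also spoof the UserAgent outside the browser. For example, curl's -A/--user-agent flag sends an arbitrary UserAgent string (and curl's default UserAgent is itself on the crawler list above). The snippet below sketches the same idea with Python's urllib, using a placeholder URL; calling urlopen on the request would fetch the page as a crawler sees it:

```python
from urllib.request import Request

# Placeholder URL; replace with a product page on your own site.
url = "https://www.example.com/product/widget"

# Any UserAgent containing "bot" is routed to the server-rendered SEO catalog.
req = Request(url, headers={"User-Agent": "Googlebot/2.1"})

# urllib capitalizes stored header names, hence "User-agent" here.
print(req.get_header("User-agent"))  # Googlebot/2.1
```

The shell equivalent would be along the lines of `curl -A "Googlebot/2.1" <your-product-url>`.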

For information about Device Emulation with Google Chrome, see the Google developers site.