Optimizely and SEO
Describes search engine optimization in Optimizely.
SEO (Search Engine Optimization) is more than just getting your website to rank high in search engines such as Google or Bing. It is the art of understanding your website visitors and optimizing your site for their needs, which includes things like site speed, image optimization, and well-structured navigation. If you succeed with that, your website will naturally rank higher in search engines such as Google.
SEO is an enormous area, employing lots of people, and its details are beyond the scope of this developer guide. However, this topic contains a checklist of the most important things you, as a developer, can do in Optimizely to improve your site's attractiveness and ranking in search engines. For SEO tips oriented toward editors, see the Optimizely User Guide.
Customize your templates
One of the most important site optimization tasks for developers is to set up the site templates so that they allow the editors to configure SEO metadata, such as title, keywords, and page description, consistently on pages. In the Alloy sample site, for example, all SEO-related properties are gathered on a dedicated SEO tab.
Metadata
The following metadata tags can be useful to have:
- Title – The title is important for the Search Engine Results Page (SERP).
- Description – The description is also important for the SERP. Google does not use this tag for page ranking, but it is very important for the click-through rate from the search results.
- No indexing – This is used to stop robots from indexing pages that should not be indexed.
- No follow – Used by robots. A link's rel attribute can be set to nofollow, which tells search engines not to take the link into consideration when indexing your site; the link then does not affect the target page's SEO ranking. You can also add a nofollow directive in the robots meta tag or the X-Robots-Tag HTTP header, which stops all links on the page from being followed by search engines (see the sketch after this list).
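For illustration, here is a minimal sketch of what these tags can look like; all titles, descriptions, and URLs are placeholder values:

<head>
  <!-- The title is shown on the search engine results page (SERP) -->
  <title>Alloy Plan - Project management software</title>

  <!-- The description drives click-through rate from the search results -->
  <meta name="description" content="Plan, execute, and follow up on projects with Alloy Plan.">

  <!-- Ask robots not to index this page and not to follow any of its links -->
  <meta name="robots" content="noindex, nofollow">
</head>

<!-- A single link that search engines should not take into consideration -->
<a href="https://example.com/some-page" rel="nofollow">Example link</a>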
You can also extend the metadata with:
- Open Graph tags – The Open Graph meta tags are code snippets that control how URLs are displayed when they are shared on social media platforms like Facebook, LinkedIn, and Twitter. If you do not use Open Graph tags, there is a risk that incorrect images or descriptions are displayed when someone shares one of your URLs. Read more on The Open Graph protocol.
- Schema.org information – Schema.org is a semantic vocabulary of tags that you can add to your HTML code to improve the way search engines read and represent your page in the search results. One way of using the schema markup is to create so-called "rich snippets". Rich snippets enhance the descriptions in the search results by displaying, for example, contact information, event information, product ratings, or videos. A recipe page can, for instance, use schema markup for recipe ratings. Rich snippets do not affect Google's ranking of your site directly, but they can entice more visitors to click your links in the search results. Schema.org differs from Open Graph in purpose: Open Graph describes the relation and typing of content, while schema.org provides a structured representation of content. Schema.org information (as JSON-LD or data attributes) drives, for example, the sitelinks search box and accurate breadcrumbs in Google's search results. Read more on Schema.org Markup.
- Image 'alt' attributes – The main purpose of the alt attribute on images is web accessibility. Alt texts are used by screen readers and help visually impaired site visitors to better understand the images used on the site. They are also displayed on the website if the actual image cannot be rendered. Search engines also use alt attributes when crawling a site: they cannot see the image, so they rely on the alt text to understand what it shows and how it should be indexed. Editors can add image alt attributes in the TinyMCE editor.
- Alternative language URLs – If you have multiple versions of a page for different languages or regions, you can use hreflang annotations to tell Google which version is the most appropriate for each language or region. Add alternative language URLs, for example, as link elements with the hreflang attribute in the head element (see the combined sketch after this list). See Tell Google about localized versions of your page for more information.
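To illustrate, the following sketch combines Open Graph tags, hreflang annotations, schema.org markup as JSON-LD, and an image alt attribute in one place; all names, URLs, and values are placeholders:

<head>
  <!-- Open Graph tags control how the URL is presented when shared -->
  <meta property="og:title" content="Alloy Plan">
  <meta property="og:description" content="Project management software.">
  <meta property="og:image" content="https://example.com/globalassets/pictures/plan.jpg">
  <meta property="og:url" content="https://example.com/en/alloy-plan/">

  <!-- Alternative language versions of the same page -->
  <link rel="alternate" hreflang="en" href="https://example.com/en/alloy-plan/">
  <link rel="alternate" hreflang="sv" href="https://example.com/sv/alloy-plan/">
  <link rel="alternate" hreflang="x-default" href="https://example.com/alloy-plan/">

  <!-- Schema.org markup as JSON-LD, here describing a product -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Alloy Plan",
    "description": "Project management software."
  }
  </script>
</head>

<!-- The alt attribute helps both screen readers and search engines -->
<img src="/globalassets/pictures/plan.jpg" alt="Screenshot of the Alloy Plan dashboard">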
Responsive design
Search engines, such as Google, favor websites that have a responsive design and are optimized for mobile phones and tablets; it is therefore very important to take a mobile-first approach to the design of your website. (Assuming that your site is not one of the very few sites that are only accessed from desktops.) Responsive sites also load faster, which is another reason to use responsive design.
For responsive designs, you can use the viewport meta element, which lets you control the layout of web pages in mobile browsers. It is included in the head section of your web page. Syntax:
<meta name="viewport" content="width=device-width,initial-scale=1">
Optimize website loading times
Another important aspect of good SEO is optimizing the site's loading times. A faster website is rewarded in SEO rankings compared to a slower one. Some things to take into consideration:
- If possible, use a highly available redundant network with auto scale-up and scale-out.
- Use a CDN cache for fast loading of site assets. Optimizely DXP uses Cloudflare's CDN cache for this.
- Use responsive design and image size optimizations to decrease loading times.
Ideally, you want to serve your visitors top-notch images in high resolution, but unfortunately, high-resolution images can increase your website's loading time, especially for users on smaller devices, which affects their user experience negatively and can also decrease your SEO ranking. Therefore, set up your site to use different preconfigured image sizes, so that visitors are served an image appropriate for their screen size while still getting fast loading times. Automate this so your editors do not have to spend time thinking about image sizes.
One way of achieving different image sizes easily on an Optimizely website is by using the ImageResizer add-on, which allows you to define a srcset in the image tags for breakpoint optimization. See this blog post for more information.
<img class="img-fluid"
src="/globalassets/pictures/image.jpg?w=800"
srcset="/globalassets/pictures/image.jpg?w=640 640w,
/globalassets/pictures/image.jpg?w=750 750w,
/globalassets/pictures/image.jpg?w=800 800w,
/globalassets/pictures/image.jpg?w=1080 1080w,
/globalassets/pictures/image.jpg?w=1440 1440w,
/globalassets/pictures/image.jpg?w=1600 1600w,
/globalassets/pictures/image.jpg?w=2400 2400w"
sizes="(min-width: 800px) 800px, 100vw"
alt="Image description">
Another way of optimizing your image handling is by using the free-for-use ImageProcessor add-on. See this blog post for more information on that add-on.
Caching of images
Images, just like other assets on your website, need to be cached to give good performance. Generally, the expiry timeout on assets can be longer than on content, as assets are probably not updated as often. It is important to strike a balance between cache times that are too short, where cached content expires quickly and performance may suffer, and cache times that are too long, where cached content is not refreshed and site visitors are served old content.
To avoid this problem where visitors are served outdated content from their local cache, you can add a version number or file hash to the URL and set cache lifetime to a minimum of a month; this is known as cache busting. See How To Easily Add Cache Busting On Your Optimizely Website by Jon D Jones for more information.
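A minimal cache-busting sketch (the file names and version values are placeholders): because the URL changes whenever the file changes, cached copies can safely be kept for a long time:

<!-- A version number or file hash in the URL invalidates old cached copies -->
<link rel="stylesheet" href="/css/site.css?v=1.4.2">
<script src="/js/site.js?v=1.4.2"></script>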
URLs
A URL is human-readable text that replaces the IP addresses that computers use to communicate with servers. URLs also tell both users and search engines about the structure of the website.
There are clear benefits to using good, SEO-optimized URLs. A good URL improves the user experience, and good user experience is rewarded by search engines as well as visitors. A clear and concise URL also increases the click-through rate.
Things to consider when optimizing URLs:
- Keep them short and concise. Avoid unnecessary words in the URLs while still making sense.
- Make sure the URLs contain your keywords (that is, the terms people are using when trying to find your content).
- Use absolute links and not relative links, if possible.
- Use simple addresses for campaigns.
- Try to avoid dynamic URL parameters. (If you have to use them, ask Google to disregard the URL parameters when indexing the site.)
- Do not allow special characters in your links (like &%*<> or blank spaces). These characters make the URL less legible, and search engines may not be able to crawl them.
- Use only lower-case, as search engines might see example.com/products and example.com/Products as two different pages.
- Make sure your links are valid. See the Link Validation scheduled job.
- Make sure obsolete content is removed or expired, so as not to confuse search engines.
- For duplicated content that you do not want to remove, create a 301 redirect.
- If you have two URLs that serve very similar content, you should use a canonical tag.
The canonical tag is an HTML link tag with a rel="canonical" attribute. By adding it to the HTML code, you tell search engines which URL is the main page, so that they do not index the other, duplicated pages.
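A minimal sketch (the URL is a placeholder); the tag is placed in the head of each duplicate page and points to the main version:

<link rel="canonical" href="https://example.com/products/">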
For more technical information on how to rewrite URLs to a friendlier format, see Routing in the Content Cloud Developer Guide.
Sitemaps and robots files
Sitemaps and robots.txt are two types of text files placed in the root folder of the website. They are important to ensure that the proper content is indexed and that duplicated content is excluded.
Sitemaps
A sitemap is an XML file following a defined format for sitemaps. The sitemap lists the URLs on your site and their internal priority, and is used by search engines to correctly crawl and index your site. The sitemap does not affect the ranking of your site in search engines, but a properly organized sitemap can indirectly influence your site ranking. This is especially true for websites containing a large number of pages in a very deep structure below the start page.
The sitemap should not be a static file, but dynamically generated, preferably at least once a day. To automate this, see the add-ons found under SEO automation add-ons: Sitemaps and robots files.
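For reference, a minimal sitemap following the sitemaps.org protocol can look like this (URLs, dates, and priorities are placeholder values):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/products/</loc>
    <lastmod>2024-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>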
Robots.txt
The robots.txt is a simple text file at the root of your domain. Search engines check this file for instructions on how to crawl your site. It is recommended to have a robots.txt file, even if it is empty, because most search engines will look for it, and if you do not have one, all search engines can crawl and index your entire site.
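As a minimal sketch (the folder path is a placeholder), a robots.txt can allow crawling of everything except one folder and point crawlers to the sitemap:

User-agent: *
Disallow: /internal-folder/
Sitemap: https://example.com/sitemap.xml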
Crawling and indexing are not the same thing; URLs reached through internal or external links may be indexed even though they are not crawled. Crawling is when search engines look through the content of each URL they find. Indexing is when search engines store and organize the information they find when crawling the internet.
You should not use the robots.txt to stop indexing, as some search engines and malicious bots, such as harvesting bots, may disregard the robots.txt.
To stop indexing, you should instead consider:
- Meta noindex; see Google's Robots meta tag, data-nosnippet, and X-Robots-Tag specifications for more information.
- X-Robots-Tag HTTP header; see the same Google specifications for more information. (Both variants are sketched after this list.)
- Protect sensitive information behind password login.
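As an illustration, the noindex directive can be delivered either in the page markup or as an HTTP response header (a minimal sketch; where you set the header depends on your web server or application configuration):

<!-- Variant 1: a meta tag in the page's head section -->
<meta name="robots" content="noindex">

Variant 2: an HTTP response header on the page
X-Robots-Tag: noindex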
Automatic landing pages
An "automatic landing page" is content (for example, a page type instance) that renders differently depending on user input. It is a form of personalized content. A Google hit can, for example, point to an automatic landing page that shows content related to the Google search.
Optimizely Content Recommendations
Optimizely Content Recommendations is part of the Optimizely Digital Experience Platform (DXP) and a great feature for analyzing website visitors' behavior and personalizing content to match their interests. By personalizing content, you can increase the site's attractiveness and improve its SEO ranking. It can also be used to fine-tune your Google Ads campaigns by providing AI insights into specific KPIs.
SEO automation add-ons
SEO automation tools can save time and increase your ROI. Here are some useful add-ons for analytics and for automating SEO tasks.
Google Analytics for Optimizely
Install the Optimizely Google Analytics add-on to get a Google Analytics dashboard directly in the Optimizely user interface.
ContentKing
The third-party tool ContentKing uses real-time auditing and content tracking to monitor your website and define an SEO score for each page in a website.
Using the ContentKing CMS API, you can trigger priority auditing of a page through an API, which is basically telling ContentKing that a page has changed, and requesting re-evaluation of the SEO score. See this blog post by Mark Prins on how to integrate ContentKing with Optimizely.
Site Attention
Site Attention is a free add-on found on the Optimizely Marketplace. Its features include:
- Organize key phrase categories in one place, for everyone to use.
- Add new key phrases and share them between editors.
- Monitor the likely SEO impact of your copy as you type.
- Get suggestions for more powerful keywords and phrases to use.
- Measure your content’s SEO performance against your KPIs – and make changes if needed.
Siteimprove
The Siteimprove Website Optimizer is another free add-on found on the Optimizely Marketplace.
It streamlines workflows for your web teams, and lets them fix errors and optimize content directly within the Optimizely editing environment. The Siteimprove add-on also has features like:
- Alerts editors to misspellings and broken links.
- Provides information on readability levels and accessibility issues (A, AA, AAA conformance level).
- Provides insights on different SEO aspects, such as technical, content, UX, and mobile.
- Displays information on page visits and page views, as well as feedback rating and comments.
404 handler
The 404 handler is a free add-on and is found on the Optimizely NuGet feed. It gives you better control over your 404 pages, in addition to allowing redirects for old URLs that no longer work.
See the GitHub repository documentation.
Sitemaps and robots files
- Sitemaps is a free add-on found on the Optimizely Marketplace.
- POSSIBLE.RobotsTxtHandler is a free add-on found on the Optimizely NuGet feed. It handles the delivery and modification of the robots.txt file.
- The SIRO Sitemap and Robots Generator is a free add-on found on the Optimizely Marketplace. It lets you generate configurable and scalable dynamic sitemap.xml and robots.txt files with support for multi-site, multilingual Optimizely solutions.
Related topics
- Search Engine Optimization (SEO) Starter Guide by Google
- How Search Engines Work: Crawling, Indexing, and Ranking by Moz
- Search Quality Evaluators Guidelines by Google
- The Web Developer’s SEO Cheat Sheet by Moz
- SEO Cheat Sheet: Anatomy of a URL by Moz
- Episerver Images got Responsive Resize by Wałdis Iljuczonok
- Responsive images with Episerver and ImageResizer by Creuna
- What You Need To Know About Rel Nofollow Links, Google & The Law