Handle large results

Use pagination to efficiently handle large datasets in Optimizely Graph.

Handling large datasets efficiently is crucial for maintaining performance, enhancing user experience, and optimizing resource management in Optimizely Graph. A common solution to this challenge is pagination, which splits results into smaller chunks so data can be delivered and consumed more smoothly.

Pagination is a technique used to divide large datasets into manageable chunks, letting you access data incrementally. It improves performance by reducing the amount of data transferred at once and enhances user experience by presenting information in a more digestible format.

Importance of pagination in large datasets

Large datasets introduce both technical and user experience challenges. Without pagination, requests may attempt to load thousands of items at once, leading to delays or even timeouts. Proper pagination helps with the following:

  • Performance – Loading large amounts of data at once can strain server resources and lead to slow response times. Pagination mitigates this by fetching only the necessary data.
  • User experience – Presenting data in smaller, organized chunks makes it easier for users to navigate and interact with content.
  • Resource management – Efficient management of server and network resources is critical, especially when dealing with datasets that may exceed tens of thousands of items.

Pagination methods in Optimizely Graph

Optimizely Graph supports two distinct pagination approaches, each optimized for different use cases and dataset sizes.

Skip/limit pagination

Skip/limit pagination is a straightforward method in which you specify how many items to skip and how many to fetch. It is best suited for traditional page navigation, smaller datasets, and administrative interfaces. See Skip/limit pagination for detailed information.

Best suited for

  • Traditional page navigation (for example, page 1, page 2, and so on).
  • Small to medium datasets (under 10,000 items).
  • Admin dashboards and back-office tools.

The following is an example query for skip/limit pagination:

query BasicSkipLimitPagination($skip: Int = 0, $limit: Int = 20) {
  Content(
    skip: $skip
    limit: $limit
    orderBy: { StartPublish: DESC }
  ) {
    total
    items {
      Name
      _concreteType
      RelativePath
      StartPublish
    }
  }
}

Skip/limit pagination becomes less efficient with very large datasets. As the skip value increases, queries may slow down because the system must process and count past many records.
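
In practice, the skip value is usually derived from a page number on the client. The following is a minimal client-side sketch in TypeScript that posts the query above with computed skip and limit variables. The endpoint URL and authentication key are placeholders for your own Optimizely Graph account, and the helper name fetchPage is illustrative rather than part of any SDK.

// Minimal sketch of page-based navigation over the skip/limit query above.
// The endpoint and key are placeholders; substitute your own Optimizely Graph
// endpoint and authentication.
const GRAPH_ENDPOINT = "https://cg.optimizely.com/content/v2?auth=YOUR_SINGLE_KEY";

const SKIP_LIMIT_QUERY = /* GraphQL */ `
  query BasicSkipLimitPagination($skip: Int = 0, $limit: Int = 20) {
    Content(skip: $skip, limit: $limit, orderBy: { StartPublish: DESC }) {
      total
      items {
        Name
        RelativePath
        StartPublish
      }
    }
  }
`;

// Translate a 1-based page number into skip/limit variables and run the query.
async function fetchPage(page: number, pageSize = 20) {
  const variables = { skip: (page - 1) * pageSize, limit: pageSize };

  const response = await fetch(GRAPH_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: SKIP_LIMIT_QUERY, variables }),
  });

  const { data } = await response.json();
  return {
    items: data.Content.items,
    totalPages: Math.ceil(data.Content.total / pageSize),
  };
}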

Cursor-based pagination

Cursor-based pagination uses a cursor (a unique pointer) to mark your position in the dataset. Instead of skipping records, you fetch results relative to a cursor. This approach is more efficient and consistent for large or frequently changing datasets. See Cursor-based pagination for detailed information.

Best suited for

  • Large datasets (10,000+ items).
  • Infinite scrolling (for example, social feeds or content lists).
  • Real-time or dynamic data where items may be added or removed frequently.

The following is an example query for cursor-based pagination:

query BasicCursorPagination($cursor: String, $limit: Int = 20) {
  Content(
    cursor: $cursor
    limit: $limit
    orderBy: { StartPublish: DESC }
  ) {
    total
    cursor
    items {
      Name
      _concreteType
      RelativePath
      StartPublish
    }
  }
}

With cursor-based pagination, performance remains stable regardless of dataset size, because the query does not need to scan past thousands of items.
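
To page through a full result set, you pass the cursor returned by one response into the next request until no more results remain. The following is a minimal client-side sketch in TypeScript that loops over the query above in this way. The endpoint and key are placeholders, and the termination check (a page smaller than the limit, or a missing cursor) is an assumption about end-of-results behavior rather than a guaranteed contract.

// Minimal sketch of paging through all results with the cursor query above.
// Endpoint and key are placeholders; adjust for your Optimizely Graph account.
const GRAPH_ENDPOINT = "https://cg.optimizely.com/content/v2?auth=YOUR_SINGLE_KEY";

const CURSOR_QUERY = /* GraphQL */ `
  query BasicCursorPagination($cursor: String, $limit: Int = 20) {
    Content(cursor: $cursor, limit: $limit, orderBy: { StartPublish: DESC }) {
      total
      cursor
      items {
        Name
        RelativePath
      }
    }
  }
`;

async function fetchAllItems(pageSize = 100): Promise<unknown[]> {
  const items: unknown[] = [];
  let cursor: string | undefined;
  let lastBatchSize = pageSize;

  // Keep requesting slices until a page comes back smaller than the limit
  // or no cursor is returned, which we treat as the end of the result set.
  while (lastBatchSize === pageSize) {
    const response = await fetch(GRAPH_ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        query: CURSOR_QUERY,
        variables: { cursor, limit: pageSize },
      }),
    });

    const { data } = await response.json();
    const batch = data.Content.items as unknown[];

    items.push(...batch);
    lastBatchSize = batch.length;
    cursor = data.Content.cursor;
    if (!cursor) break;
  }

  return items;
}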

Best practices

Keep the following practices in mind when choosing and implementing a pagination strategy in Optimizely Graph:

  • Use skip/limit pagination for traditional page navigation with datasets under 10,000 items.
  • Use cursor-based pagination for large datasets and infinite scrolling scenarios.
  • Include stable sort orders with unique fields to ensure consistent results.
  • Implement proper error handling for cursor expiration and invalid page numbers (see the sketch after this list).
  • Optimize field selection to minimize data transfer and improve performance.
  • Monitor performance metrics to make informed decisions about pagination strategies.
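
The following is a hypothetical sketch of the error-handling practice above: if a stored cursor is rejected (for example, because it has expired), the request is retried without the cursor so the scan restarts from the beginning. The function name and the error check are illustrative; inspect the actual error payload your Optimizely Graph endpoint returns before relying on this pattern.

// Hypothetical recovery sketch for a stale or expired cursor: retry the
// request without the cursor and restart from the beginning instead of
// failing outright.
async function fetchSliceWithRecovery(
  endpoint: string,            // your Optimizely Graph endpoint (placeholder)
  query: string,               // e.g., the BasicCursorPagination query above
  cursor: string | undefined,
  limit = 20
) {
  const runQuery = async (variables: { cursor?: string; limit: number }) => {
    const response = await fetch(endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query, variables }),
    });
    return response.json();
  };

  let result = await runQuery({ cursor, limit });

  // GraphQL surfaces problems in an `errors` array. Treat any error on a
  // cursor-bearing request as a stale cursor and retry from the start of the
  // result set; inspect the real error payload to narrow this check.
  if (result.errors?.length && cursor) {
    result = await runQuery({ limit });
  }

  return result.data?.Content;
}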

By selecting the appropriate pagination strategy, you can ensure that your applications remain fast, user-friendly, and capable of scaling as data grows.