Handle large results
Use pagination to efficiently handle large datasets in Optimizely Graph.
Handling large datasets efficiently is crucial for maintaining performance, delivering a good user experience, and managing resources in Optimizely Graph. The standard solution is pagination: dividing results into manageable chunks so clients can access data incrementally. This reduces the amount of data transferred per request and presents information in a more digestible format.
Importance of pagination in large datasets
Large datasets bring both technical and user experience challenges. Without pagination, requests may try to load thousands of items at once, leading to delays or even timeouts. Proper pagination helps with:
- Performance – Loading large amounts of data at once can strain server resources and lead to slow response times. Pagination helps mitigate this by fetching only the necessary data.
- User experience – Presenting data in smaller, organized chunks makes it easier for users to navigate and interact with content.
- Resource management – Efficiently managing server and network resources is critical, especially when dealing with datasets that could exceed tens of thousands of items.
Pagination methods in Optimizely Graph
Optimizely Graph supports two distinct pagination approaches, each optimized for different use cases and dataset sizes.
Skip/limit pagination
Skip/limit pagination is a straightforward method where you specify how many items to skip and how many to fetch. See Skip/limit pagination for detailed information.
Best for:
- Traditional page navigation (for example, page 1, page 2, and so on)
- Small to medium datasets (under 10,000 items)
- Admin dashboards and back-office tools
The following is an example query for skip/limit pagination:
query BasicSkipLimitPagination($skip: Int = 0, $limit: Int = 20) {
  Content(
    skip: $skip
    limit: $limit
    orderBy: { StartPublish: DESC }
  ) {
    total
    items {
      Name
      _concreteType
      RelativePath
      StartPublish
    }
  }
}

Skip/limit becomes less efficient with very large datasets. As the skip value grows, queries may slow down because the system has to count past many records.
Cursor-based pagination
Cursor-based pagination uses a cursor (a unique pointer) to mark your place in the dataset. Instead of skipping through records, you fetch results relative to a cursor. This makes it more efficient and consistent for large or changing datasets. See Cursor-based pagination for detailed information.
Best for:
- Large datasets (10,000+ items)
- Infinite scrolling (for example, social feeds, content lists)
- Real-time or dynamic data where items may be added or removed frequently
The following is an example query for cursor-based pagination:
query BasicCursorPagination($cursor: String, $limit: Int = 20) {
  Content(
    cursor: $cursor
    limit: $limit
    orderBy: { StartPublish: DESC }
  ) {
    total
    cursor
    items {
      Name
      _concreteType
      RelativePath
      StartPublish
    }
  }
}

With cursor-based pagination, performance remains stable regardless of dataset size, because the query does not need to scan past thousands of items.
Best practices
Effective pagination is key to handling large datasets in Optimizely Graph. Keep the following guidelines in mind:
- Use skip/limit for traditional page navigation with datasets under 10,000 items.
- Use cursor pagination for large datasets and infinite scroll.
- Include stable sort orders with unique fields to ensure consistent results.
- Implement proper error handling for cursor expiration and invalid page numbers (see the sketch after this list).
- Optimize field selection to minimize data transfer and improve performance.
- Monitor performance metrics to make informed decisions about pagination strategy.
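As an illustration of the error-handling guideline, the following TypeScript sketch wraps a single request with defensive checks. It reuses the GRAPH_ENDPOINT and CURSOR_QUERY placeholders from the cursor sketch above, and the way an expired or invalid cursor is detected here is an assumption, so adapt the check to the error payload Optimizely Graph actually returns.

// Defensive sketch: clamp requested page numbers for skip/limit queries and
// restart a cursor walk when the server rejects a stale cursor.
// Reuses the GRAPH_ENDPOINT and CURSOR_QUERY placeholders defined above.
// The error inspection is an assumption; match it to the real error payload.

function clampPage(page: number, total: number, pageSize: number): number {
  const lastPage = Math.max(1, Math.ceil(total / pageSize));
  return Math.min(Math.max(1, Math.floor(page)), lastPage);
}

async function fetchContentPage(cursor?: string, limit = 20): Promise<any> {
  const response = await fetch(GRAPH_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: CURSOR_QUERY, variables: { cursor, limit } }),
  });
  const payload = await response.json();

  // GraphQL servers often report failures in an `errors` array alongside a
  // 200 status, so inspect the payload rather than relying on response.ok alone.
  if (payload.errors?.length) {
    const cursorRejected = payload.errors.some((e: { message: string }) =>
      /cursor/i.test(e.message)
    );
    if (cursorRejected && cursor !== undefined) {
      // Assumed recovery strategy: drop the stale cursor and start over.
      return fetchContentPage(undefined, limit);
    }
    throw new Error(payload.errors.map((e: { message: string }) => e.message).join("; "));
  }
  return payload.data.Content;
}

A caller can clamp a user-supplied page number with clampPage before converting it to a skip value, and treat any GraphQL error other than a rejected cursor as fatal.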
By choosing the right strategy, you can ensure your applications stay fast, user-friendly, and capable of handling growth.
