If an e-commerce database has 50,000 products, returning all of them in a single GET /api/products request will hammer the database, balloon Kestrel's memory usage, and freeze the user's browser. You must implement Pagination (breaking data into chunks) and Filtering (narrowing the data).
Instead of placing 10 consecutive [FromQuery] properties in a controller method signature, we encapsulate them into a strongly-typed class.
public class ProductParameters
{
    private const int MaxPageSize = 50; // Hard limit to prevent abuse

    public int PageNumber { get; set; } = 1;

    private int _pageSize = 10;
    public int PageSize
    {
        get { return _pageSize; }
        set { _pageSize = (value > MaxPageSize) ? MaxPageSize : value; }
    }

    // Filtering fields (nullable = optional in the query string)
    public decimal? MinPrice { get; set; }
    public decimal? MaxPrice { get; set; }
    public string? Category { get; set; }
}
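The `PageSize` setter silently clamps oversized requests instead of throwing. Here is a minimal standalone sketch of that behavior (no web host required; the class is a trimmed copy of the one above):

```csharp
using System;

// Top-level program (dotnet run on .NET 6+)
var p = new ProductParameters { PageSize = 500 };
Console.WriteLine(p.PageSize); // prints 50: clamped to the hard limit

p.PageSize = 20;
Console.WriteLine(p.PageSize); // prints 20: within the limit, passes through

// Trimmed copy of the parameter class
public class ProductParameters
{
    private const int MaxPageSize = 50;
    private int _pageSize = 10;
    public int PageSize
    {
        get => _pageSize;
        set => _pageSize = value > MaxPageSize ? MaxPageSize : value;
    }
}
```

Clamping (rather than returning 400 Bad Request) keeps the API forgiving: a greedy client still gets data, just never more than the server is willing to serve per page.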
We bind this class to the action method with [FromQuery]; ASP.NET Core's model binder populates each property from the query string. Later, when the repository applies Skip/Take, EF Core translates them into an efficient OFFSET ... FETCH NEXT query in SQL Server.
[HttpGet]
// The [FromQuery] attribute binds "GET /api/products?pageNumber=2&pageSize=20&minPrice=5&category=Shoes"
public async Task<IActionResult> GetProducts([FromQuery] ProductParameters filter)
{
    var products = await _repo.GetProductsAsync(filter);
    return Ok(products);
}
public async Task<List<Product>> GetProductsAsync(ProductParameters filter)
{
    // Use deferred execution! Do NOT call .ToListAsync() until the very end.
    IQueryable<Product> collection = _context.Products;

    // 1. APPLY FILTERS
    if (!string.IsNullOrWhiteSpace(filter.Category))
    {
        collection = collection.Where(p => p.Category == filter.Category);
    }
    if (filter.MinPrice.HasValue)
    {
        collection = collection.Where(p => p.Price >= filter.MinPrice.Value);
    }
    if (filter.MaxPrice.HasValue)
    {
        collection = collection.Where(p => p.Price <= filter.MaxPrice.Value);
    }

    // 2. APPLY PAGINATION
    // Skip bypasses previous pages. Take grabs the exact chunk requested.
    return await collection
        .Skip((filter.PageNumber - 1) * filter.PageSize)
        .Take(filter.PageSize)
        .ToListAsync(); // NOW the SQL finally executes!
}
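Because the repository only composes the query, the same filter-then-page pipeline works against any LINQ source. A runnable sketch against an in-memory list (hypothetical sample data, no database or EF Core needed) shows the composition order:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// 100 fake products: even ids are "Shoes", price equals the id
var products = Enumerable.Range(1, 100)
    .Select(i => new Product { Id = i, Category = i % 2 == 0 ? "Shoes" : "Hats", Price = i })
    .ToList();

// Same shape as the repository: stack filters first, then page
IEnumerable<Product> query = products;
query = query.Where(p => p.Category == "Shoes"); // category filter
query = query.Where(p => p.Price >= 10);         // min-price filter

var page2 = query
    .Skip((2 - 1) * 10) // PageNumber = 2, PageSize = 10
    .Take(10)
    .ToList();

Console.WriteLine(page2.First().Id); // prints 30: page 1 held ids 10..28

public class Product
{
    public int Id;
    public string Category = "";
    public decimal Price;
}
```

Against EF Core the identical chain runs as a single SQL statement with WHERE, OFFSET, and FETCH clauses; nothing is filtered in application memory.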
A frontend React developer needs to know if they are on the "Last Page" to disable the "Next" button. We convey this using a custom HTTP Header.
// totalRecordsInDatabase comes from a separate CountAsync() over the same filtered query
var metadata = new
{
    TotalCount = totalRecordsInDatabase,
    PageSize = filter.PageSize,
    CurrentPage = filter.PageNumber,
    TotalPages = (int)Math.Ceiling(totalRecordsInDatabase / (double)filter.PageSize)
};

// Attach it to the response as a custom header
Response.Headers.Append("X-Pagination", JsonSerializer.Serialize(metadata));
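The TotalPages formula depends on ceiling division, which rounds a partial final page up rather than dropping it. A quick standalone check of the arithmetic (hypothetical counts, no HTTP context):

```csharp
using System;

int totalCount = 50_000;
int pageSize = 20;
int totalPages = (int)Math.Ceiling(totalCount / (double)pageSize);
Console.WriteLine(totalPages); // prints 2500: divides evenly

// A count that does not divide evenly still rounds UP,
// so the trailing partial page is reachable
Console.WriteLine((int)Math.Ceiling(50_001 / 20.0)); // prints 2501
```

The cast to double matters: integer division (`50001 / 20`) would truncate to 2500 before `Math.Ceiling` ever ran. One deployment note: if the React app runs on a different origin, `X-Pagination` must also be listed in the CORS policy's exposed headers (Access-Control-Expose-Headers), or the browser will hide it from JavaScript.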
Q: "Why is Keyset Pagination (Cursor Pagination) considered vastly superior to Offset Pagination (Skip/Take) for massive databases like Twitter or Facebook feeds?"
Architect Answer: "Offset pagination (Skip/Take) gets slower linearly with how deep you page. If a user requests `Skip(5_000_000).Take(50)`, the SQL engine cannot jump straight to record index 5,000,000. It must read and discard the first five million rows before grabbing the 50 we want. Cursor pagination avoids `Skip` entirely. Instead, it passes the ID of the last item seen (the cursor): `SELECT TOP 50 ... WHERE Id > @LastCursorId ORDER BY Id`. Because SQL Server leverages the B-Tree index on the primary key, it seeks directly to the cursor ID in O(log n) time, drastically outperforming offset pagination on deep pages."
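The two strategies return the same page; only the work done to reach it differs. A runnable comparison over an in-memory "feed" (simulated with plain integers; in EF Core the keyset query's Where/OrderBy/Take chain translates to an index seek rather than a scan):

```csharp
using System;
using System.Linq;

// Simulate a feed table with ids 1..1000
var rows = Enumerable.Range(1, 1000).ToList();

// Offset pagination: walks past every row before the page
var offsetPage = rows.Skip(950).Take(50).ToList();

// Keyset pagination: seek past the last id the client saw.
// The client sends back the cursor (last id of the previous page)
// instead of a page number.
int lastCursorId = 950;
var keysetPage = rows
    .Where(id => id > lastCursorId)
    .OrderBy(id => id)
    .Take(50)
    .ToList();

Console.WriteLine(offsetPage.SequenceEqual(keysetPage)); // prints True: same page, cheaper plan
```

The trade-off: keyset pagination cannot jump to an arbitrary page number ("go to page 73"), which is why it fits infinite-scroll feeds rather than numbered page grids.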