Best practices for API gateway design when handling large datasets

Hi everyone,

I am working on a REST-based architecture where Apigee is used as an API gateway in front of backend services built with Node.js.

Currently, the backend owns business logic such as pagination, filtering, and sorting, while Apigee acts mainly as a proxy and security layer. As the system scales to larger datasets and higher traffic, we are reviewing best practices for how responsibilities should be divided between the gateway and the backend, and for the overall design.
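To make the current split concrete, here is a simplified sketch of the kind of logic our backend handles today (names and data shapes are illustrative, not our real code). It is plain offset-based pagination applied after filtering and sorting:

```javascript
// Illustrative backend query logic: filter, sort, then offset-paginate.
// In our real service this runs against a database, not an in-memory array.
function queryItems(items, { filter, sortBy, offset = 0, limit = 20 }) {
  let result = items;
  if (filter) {
    result = result.filter((item) => item.category === filter);
  }
  if (sortBy) {
    result = [...result].sort((a, b) => (a[sortBy] < b[sortBy] ? -1 : 1));
  }
  // Offset-based pagination: simple to implement, but the data store
  // still has to skip past `offset` rows on every page request.
  return result.slice(offset, offset + limit);
}
```

Apigee sits in front of this and does not touch the paging parameters at all; it only authenticates and forwards the request.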

I would appreciate insights on:

  • Which responsibilities are best handled at the backend versus the API gateway
  • Best practices for response shaping and performance optimization at the gateway level
  • Trade-offs between offset-based and cursor-based pagination in scalable API architectures
  • Any real-world experiences using Apigee in similar scenarios
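For context on the pagination question, this is roughly the cursor-based alternative we are weighing against the offset approach (again a hypothetical sketch, assuming `id` is the sort key the cursor encodes):

```javascript
// Hypothetical cursor-based pagination: the cursor is the last-seen id,
// so each page becomes a range lookup ("id > cursor") rather than a
// scan-and-skip over `offset` rows.
function queryAfterCursor(items, { cursor = null, limit = 20 }) {
  // items are assumed to be pre-sorted by id.
  const start =
    cursor == null ? 0 : items.findIndex((item) => item.id > cursor);
  const page = start === -1 ? [] : items.slice(start, start + limit);
  // Return the cursor for the next page, or null when this is the last page.
  const nextCursor = page.length === limit ? page[page.length - 1].id : null;
  return { page, nextCursor };
}
```

My understanding is that this stays fast on deep pages and is stable when rows are inserted mid-scan, at the cost of losing "jump to page N" semantics, which is part of what I'd like the community to confirm or correct.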

Looking forward to learning from the community’s experiences.

Thanks!