Best practices for API gateway design when handling large datasets

Hi everyone,

I am working on a REST-based architecture where Apigee is used as an API gateway in front of backend services built with Node.js.

Currently, the backend is responsible for business logic such as pagination, filtering, and sorting, while Apigee mainly acts as a proxy and security layer. As the system scales and starts handling larger datasets and higher traffic, we are reviewing best practices around API gateway responsibilities and overall design.

I would appreciate insights on:

  • Which responsibilities are best handled at the backend versus the API gateway
  • Best practices for response shaping and performance optimization at the gateway level
  • Trade-offs between offset-based and cursor-based pagination in scalable API architectures
  • Any real-world experiences using Apigee in similar scenarios

Looking forward to learning from the community’s experiences.

Thanks!

Hi @Shahzaib_Aziz, happy new year!

We saw your question and wanted to let you know we’re keeping it on our radar. We’ll also invite others in the community to pitch in and share their thoughts.

Hi @Shahzaib_Aziz, thanks for posting. In general, your separation of concerns makes absolute sense regardless of the data size: security and authorization at the gateway level, business logic in the backend services. So your backends should offer the pagination, and Apigee generally doesn’t touch it. There are exceptions, such as older backends that don’t handle features like pagination well, where Apigee can help, but normally the backends should take care of it.
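On the offset-versus-cursor question from the original post, here is a minimal backend-side sketch of the two approaches, using a hypothetical in-memory array in place of a real data store (in practice these correspond to an `OFFSET`/`LIMIT` query versus a keyset `WHERE id > ?` query):

```javascript
// Hypothetical in-memory dataset standing in for a database table.
const rows = Array.from({ length: 10 }, (_, i) => ({ id: i + 1, name: `item-${i + 1}` }));

// Offset-based: trivial to implement and supports jumping to any page,
// but the store must skip past `offset` rows on every request, and pages
// can drift when rows are inserted or deleted between requests.
function offsetPage(data, offset, limit) {
  return data.slice(offset, offset + limit);
}

// Cursor-based (keyset): the client sends the last id it saw, and the
// backend seeks directly to the rows after it. Cost stays flat at any
// depth and pages are stable under concurrent writes, but clients lose
// random access to arbitrary page numbers.
function cursorPage(data, afterId, limit) {
  const items = data.filter((r) => r.id > afterId).slice(0, limit);
  const nextCursor = items.length === limit ? items[items.length - 1].id : null;
  return { items, nextCursor };
}
```

Either way, this logic lives in the Node.js backend; the gateway just proxies the paging parameters through.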

I personally like how the BigQuery API handles pagination and filters; see the API docs for `Method: tabledata.list` in the Google Cloud BigQuery documentation. It’s simple and effective, but there are many established patterns out there.
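The tabledata.list response shape can be approximated like this. The `pageToken`/`maxResults` names follow the BigQuery API; the base64url-encoded index used as the token here is a hypothetical stand-in, since BigQuery’s real tokens are opaque server-side values:

```javascript
// Encode/decode a continuation token. A base64url-encoded start index is
// an assumption made for illustration only; real tokens should be opaque.
function encodeToken(index) {
  return Buffer.from(String(index)).toString('base64url');
}

function decodeToken(token) {
  return token ? Number(Buffer.from(token, 'base64url').toString()) : 0;
}

// Build a tabledata.list-style page: the client passes maxResults plus the
// pageToken from the previous response; the final page omits pageToken.
function listPage(allRows, { maxResults = 50, pageToken } = {}) {
  const start = decodeToken(pageToken);
  const page = allRows.slice(start, start + maxResults);
  const next = start + maxResults < allRows.length
    ? encodeToken(start + maxResults)
    : undefined;
  return { totalRows: allRows.length, rows: page, ...(next && { pageToken: next }) };
}
```

The nice property of this shape is that clients only ever echo the token back, so the backend is free to change how it paginates internally without breaking anyone.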

Hi Tyayers,

Thank you for the clear explanation and for validating our approach. We agree that pagination and filtering should primarily be handled by backend services, with Apigee focusing on gateway-level responsibilities such as security and authorization.

The BigQuery pagination and filtering pattern you mentioned is a great reference — simple, clean, and scalable. We’ll review it closely and consider adopting similar patterns as our APIs continue to evolve. For now, we’ll keep Apigee in mind mainly for exceptional or legacy scenarios where backend capabilities may be limited.

Really appreciate you sharing your experience and guidance.


Hi Alexet,

Happy New Year to you as well! Thanks for the update and for keeping the question on your radar. I appreciate you looping in others from the community — looking forward to hearing additional perspectives.
