Logging Large Payloads: Understanding the 10 MB Data Persistence Limit in Application Integration

Application Integration has a 30 MB limit on the total data size of any single execution. However, a lower 10 MB limit governs how variable data is persisted in the logs (for details on both limits, see Application Integration Quotas).

When the total size of an integration’s variables (input, output, and intermediate data) exceeds 10 MB, the system takes a specific action:

  • Integration Flow Continues: The integration steps (e.g., API Trigger, Data Mapping, Send Email) are still recorded, so you can see which tasks ran.
  • Variable Persistence Disabled: The system stops saving the full content of the large variables in the local log database.
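The behavior above can be sketched as a simple size check. This is an illustrative approximation only: the function name and the JSON-serialization heuristic are our own, not the service's internal logic.

```python
import json

# Assumed threshold from the docs: above 10 MB of total variable data,
# variable persistence in the local logs is disabled (steps still run).
PERSISTENCE_LIMIT_BYTES = 10 * 1024 * 1024  # 10 MB

def persistence_enabled(variables: dict) -> bool:
    """Return True if the total serialized size of all integration
    variables (input, output, and intermediate) stays under the limit."""
    total = sum(len(json.dumps(v).encode("utf-8")) for v in variables.values())
    return total <= PERSISTENCE_LIMIT_BYTES

# A ~5 MB payload keeps persistence enabled; a ~15 MB payload disables it.
small = {"fileContent": "x" * (5 * 1024 * 1024)}
large = {"fileContent": "x" * (15 * 1024 * 1024)}
print(persistence_enabled(small))  # True
print(persistence_enabled(large))  # False
```

This mirrors the two example scenarios discussed below: the small payload stays inspectable, while the large one trips the limit.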

Within the Application Integration local logging UI (see Application Integration Local Logging), this leads to:

  1. A Warning Message stating that “Database persistence is disabled as the total integration data size… exceeds the allowed limit of 10 MB.”
  2. The Variable Content for the large payload is not visible when inspecting the task details (input/output).

Example Integration Flow

Consider a simple integration designed to process files from Google Cloud Storage (GCS) and send an email.

Example 1: Small Payload (< 10 MB)

In this scenario, the file is approximately 5 MB, which is below the 10 MB logging threshold.

  • Result: The integration runs, and variable persistence remains enabled.
  • Observation in the UI: The execution log for the Data Mapping task (and other tasks) shows the full content of the file variable. You can inspect the data directly within the log details.

Example 2: Large Payload (> 10 MB)

In this scenario, the file is approximately 15 MB, which is above the 10 MB logging threshold.

  • Result: The integration runs, but variable persistence is disabled.
  • Observation in the UI: The execution log still displays the full sequence of integration steps, but the log details for the Data Mapping task show a Warning Message. The large variable’s content is not visible for inspection.

Cloud Logging

This constraint on large payloads also applies to Cloud Logging, which has its own size limits for log entries and API requests (refer to Cloud Logging API Limits).
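If you still want a partial record of a large payload in a log sink, one common pattern is to truncate it before logging. The sketch below assumes a per-entry budget of roughly 256 KB (the documented Cloud Logging LogEntry size limit at the time of writing); the helper name and truncation marker are our own.

```python
# Per-entry budget; Cloud Logging documents a LogEntry size limit of
# approximately 256 KB, so we stay at or under that.
ENTRY_LIMIT_BYTES = 256 * 1024

def truncate_for_logging(payload: str, limit: int = ENTRY_LIMIT_BYTES) -> str:
    """Cut an oversized payload down to the entry budget, appending a
    marker so readers know content was removed."""
    data = payload.encode("utf-8")
    if len(data) <= limit:
        return payload
    marker = "...[truncated]"
    keep = limit - len(marker.encode("utf-8"))
    # errors="ignore" drops a partial multi-byte character at the cut point.
    return data[:keep].decode("utf-8", errors="ignore") + marker

short = truncate_for_logging("small payload")
long = truncate_for_logging("y" * 300_000)
```

Truncation preserves a glimpse of the data for quick triage, but the full content should still live in external storage, as recommended below.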

Recommended Practice for Large Payloads

For integrations whose payloads exceed 10 MB, the best approach is to store variable payload content in external storage and use that copy for debugging and observability.

Reasoning for External Storage:

It is generally not practical to display large payloads in a real-time log user interface. A file management platform, such as the GCS console, is a more appropriate and optimized tool for handling large data. Using a file platform allows you to:

  • View file metadata (size, creation date, type).
  • Download the content for detailed inspection.
  • Manage access and lifecycle policies for the large data object.

Since the variable payload is not directly accessible from the logs, we suggest writing the variable content to a platform like GCS each time the variable is updated in the integration flow. This lets your operations team view its content at each step if needed.
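One way to make those per-step snapshots easy to find is a deterministic object naming scheme. The path layout, bucket name, and function below are purely illustrative assumptions; the commented-out upload uses the official `google-cloud-storage` client, which needs credentials and is therefore not executed in this sketch.

```python
from datetime import datetime, timezone

def payload_object_name(integration: str, execution_id: str,
                        task: str, variable: str) -> str:
    """Build a GCS object name that groups snapshots by integration,
    execution, and task, with a timestamp per variable update."""
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"{integration}/{execution_id}/{task}/{variable}-{ts}.json"

# Upload with the official client (requires the google-cloud-storage
# package and credentials; shown as comments to keep this self-contained):
#   from google.cloud import storage
#   blob = storage.Client().bucket("my-debug-bucket").blob(name)
#   blob.upload_from_string(payload, content_type="application/json")

name = payload_object_name("order-sync", "exec-123", "data-mapping", "fileContent")
print(name)
```

With this layout, the operations team can browse one execution's folder in the GCS console and see every snapshot of the variable in order.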
