The Cloud Function in use is built to push logs to GCP. The maximum request size is 156KB.
Under load, invocations climb to around 100/sec and memory usage appears to hit 1GB. Many requests are then rejected with the error below:
Function invocation was interrupted. Error: memory limit exceeded.
Currently, 1 GB of memory is allocated to the Cloud Function.
Note: No secret or BQ reads are being done. The function just reads the request data and pushes it to the logs.
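For context, here is a stripped-down sketch of the pattern (not the actual function; names and chunk size are illustrative, and it assumes Python with structured log lines written to stdout, which Cloud Logging captures). The point is that reading the body in bounded chunks should keep peak memory small, nowhere near 1 GB:

```python
import io
import json
import sys

CHUNK_SIZE = 64 * 1024  # read the body in 64 KB chunks instead of buffering it all


def push_logs(stream, out=sys.stdout):
    """Read a request-body stream chunk by chunk and emit each chunk as a
    structured log line on stdout. Peak memory stays bounded by CHUNK_SIZE
    regardless of the total body size."""
    total = 0
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        total += len(chunk)
        out.write(json.dumps({"severity": "INFO", "chunk_bytes": len(chunk)}) + "\n")
    return total


# Simulate the maximum 156 KB request body mentioned above.
body = io.BytesIO(b"x" * 156 * 1024)
processed = push_logs(body)
```

With a 156 KB body and a 64 KB chunk size, each invocation holds at most one chunk in memory at a time, so even 100 concurrent invocations should stay well under the limit.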
Is anyone else facing the same issue? Increasing memory might reduce the errors, but just forwarding requests to the logs should not consume this much memory.