Hi there,
I got an error when the Cloud Function executed.
run_scheduled_query_ga4_daily_master TypeError: Cannot read properties of undefined (reading 'jobCompletedEvent')
at exports.runScheduledQuery
Here’s the cloud function I’m using (from Simo’s blog: https://www.teamsimmer.com/2022/12/07/how-do-i-trigger-a-scheduled-query-when-the-ga4-daily-export-happens/):
const bigqueryDataTransfer = require('@google-cloud/bigquery-data-transfer');

exports.runScheduledQuery = async (event, context) => {
  // Update configuration options
  const projectId = 'my_project_id';
  const configId = 'id from scheduled query';
  const region = 'eu';

  // Load the log data from the buffer
  const eventData = JSON.parse(Buffer.from(event.data, 'base64').toString());
  const destinationTableId = eventData.protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId;

  // Grab the table date and turn it into the run time for the scheduled query
  const tableTime = destinationTableId.replace('events_', '');
  const year = tableTime.substring(0, 4),
        month = tableTime.substring(4, 6),
        day = tableTime.substring(6, 8);

  // Set the run time for the day after the table date so that the scheduled query works with "yesterday's" data
  const runTime = new Date(Date.UTC(year, month - 1, parseInt(day) + 1, 12));

  // Create a proto-buffer Timestamp object from this
  const requestedRunTime = bigqueryDataTransfer.protos.google.protobuf.Timestamp.fromObject({
    seconds: runTime / 1000,
    nanos: (runTime % 1000) * 1e6
  });

  const client = new bigqueryDataTransfer.v1.DataTransferServiceClient();
  const parent = client.projectLocationTransferConfigPath(projectId, region, configId);
  const request = {
    parent,
    requestedRunTime
  };

  const response = await client.startManualTransferRuns(request);
  return response;
};
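From what I can tell, the `serviceData.jobCompletedEvent` path only exists in the older AuditData log format, while entries matched via `google.cloud.bigquery.v2.JobService.InsertJob` carry the newer BigQueryAuditMetadata payload under `protoPayload.metadata` instead — which would explain the `undefined` error. Here is a minimal sketch of reading the table ID from either shape (the `metadata.tableCreation.table.tableName` field name is my assumption based on the audit log docs, so please verify against an actual log entry):

```javascript
// Sketch: extract the GA4 export table ID from either audit log format.
// Assumes the newer BigQueryAuditMetadata shape exposes the created table at
// protoPayload.metadata.tableCreation.table.tableName (a full resource name
// like "projects/p/datasets/d/tables/events_20240101").
const getDestinationTableId = (eventData) => {
  // Legacy AuditData format (matched by methodName="jobservice.jobcompleted")
  const legacy = eventData?.protoPayload?.serviceData?.jobCompletedEvent
    ?.job?.jobConfiguration?.load?.destinationTable?.tableId;
  if (legacy) return legacy;

  // Newer BigQueryAuditMetadata format (matched by
  // methodName="google.cloud.bigquery.v2.JobService.InsertJob")
  const tableName = eventData?.protoPayload?.metadata?.tableCreation?.table?.tableName;
  if (tableName) return tableName.split('/').pop();

  throw new Error('No destination table found in log payload');
};
```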
I’m using this log query in my log sink:
resource.type="bigquery_dataset"
resource.labels.dataset_id="analytics_123456789"
resource.labels.project_id="my_project_name"
protoPayload.metadata.tableCreation.reason="JOB"
protoPayload.serviceName="bigquery.googleapis.com"
protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"
protoPayload.authenticationInfo.principalEmail="firebase-measurement@system.gserviceaccount.com"
protoPayload.resourceName:"tables/events_"
NOT
protoPayload.resourceName:"tables/events_intraday"
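(For reference, an entry matched by this filter should look schematically like the object below — field names are my assumption from the BigQueryAuditMetadata format, and notably there is no `serviceData.jobCompletedEvent` anywhere in it:)

```javascript
// Schematic shape of a log entry matched by the InsertJob filter above.
// Field names are assumptions based on the BigQueryAuditMetadata format.
const sampleEntry = {
  protoPayload: {
    methodName: 'google.cloud.bigquery.v2.JobService.InsertJob',
    metadata: {
      tableCreation: {
        reason: 'JOB',
        table: {
          // Full resource name; the trailing segment is the table ID
          tableName: 'projects/my_project/datasets/analytics_123456789/tables/events_20240101'
        }
      }
    }
  }
};
```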
Simo’s blog uses this filter instead:
protoPayload.methodName="jobservice.jobcompleted"
protoPayload.authenticationInfo.principalEmail="firebase-measurement@system.gserviceaccount.com"
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.datasetId="my_project_id.dataset_id"
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId=~"^events_\d+"
I’m not an expert in this area, so I asked Duet AI for help, and it suggested adding:
const protoPayload = {
  methodName: "google.cloud.bigquery.v2.JobService.InsertJob",
  jobCompletedEvent: {
    job: {
      jobConfiguration: {
        load: {
          destinationTable: {
            datasetId: "my_project_id.dataset_id",
            tableId: "~^events_\d+"
          }
        }
      }
    }
  }
};
but I’m not sure where or how to use it in the cloud function. Any help would be appreciated!