Update (write-back) data on BigQuery from Looks?

@hwimoon Very weird! If you add a print(r) below the r = ... line, you’ll be able to see what Looker is sending over in the logs. For example, my function is parsing a dictionary that looks like this:

{'type': 'cell', 'scheduled_plan': None, 'attachment': None, 'data': {'value': 'Emma', 'rendered': 'Emma', 'name': 'Emma'}, 'form_params': {'annotation': 'Yes'}}
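For reference, the top of the function looks roughly like this – a trimmed-down sketch, not the exact code from the article, and the handler name is just a placeholder:

from flask import Request

def action_handler(request: Request):  # placeholder name for the Cloud Function entry point
    # Looker POSTs the action payload as JSON
    r = request.get_json(silent=True) or {}
    print(r)  # shows up in the Cloud Function logs

    value = r.get('data', {}).get('value')                   # the clicked cell
    annotation = r.get('form_params', {}).get('annotation')  # what the user typed in the form
    # ... write value/annotation to BigQuery here ...
    return 'OK', 200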

Give that a spin and let me know what you see. Also, go ahead and share the LookML for your action with the endpoint redacted.

@jesse_carah

It seems that the variable “r” is empty (i.e., {}). Very weird.

Here is my LookML code:

dimension: managers {
  label: "Manager or BM"
  type: string
  sql: ${TABLE}.ManagerName ;;

  action: {
    label: "Test"
    url: "https://us-west2-XXXXX.cloudfunctions.net/manager_mtd_input_test"
    param: {
      name: "manager"
      value: "{{ value }}"
    }
    form_param: {
      name: "performance"
      type: string
      label: "Enter Performance"
    }
  }
}

@hwimoon I don’t know what to say – I tested your LookML out and am seeing data flow through. You can use this tool to inspect the contents of your payload by swapping out your endpoint with one generated by the service.

@jesse_carah

I just checked out the link, and it does seem like the data is flowing through. At this point, I suspect the request is being rejected by the Cloud Function (GCP), which is why the payload arrives empty. Do you think it could be a permission-related problem?

Here’s one possibility: between the time the original article was written and now, GCP has changed some defaults for newly deployed Cloud Functions. You’ll want to make sure that you have opted in to making the function publicly accessible.

See here for more details: https://cloud.google.com/functions/docs/securing/managing-access-iam#allowing_unauthenticated_function_invocation

@jesse_carah
@fabio1

Hey Fabio, thank you so much for your reply! I should have continued to share my progress. I was able to get my Cloud Function working by checking the checkbox “Allow unauthenticated invocations” as shown below when first creating the function.

Very excited that it is working now - however, the word “unauthenticated” concerns me a little bit. It seems that anybody with the trigger URL is able to write to our data, is that correct? If so, how do people safely manage and protect their database while using this “write-back” workflow? Please let me know. Thank you very much!!

Well, the situation is that GCP is not doing the authentication for you; your code is expected to handle any authentication. The Action framework includes a secret that is configured when you add the Action to Looker, and you should use it to verify that the request came from your Looker instance.

This article talks about setting up the secret when deploying the official ActionHub code, but the process is similar for a custom-written Action.
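If it helps, here’s a rough sketch of what that check could look like inside a Python Cloud Function. I’m assuming the secret arrives in the Authorization header (print the incoming headers once to confirm where your instance actually puts it) and that the secret itself is set as an environment variable on the function:

import hmac
import os

from flask import Request, abort

EXPECTED_SECRET = os.environ['LOOKER_ACTION_SECRET']  # configured when deploying the function

def action_handler(request: Request):  # placeholder name
    # Assumption: Looker forwards the shared secret in the Authorization header.
    supplied = request.headers.get('Authorization', '')
    if not hmac.compare_digest(supplied, EXPECTED_SECRET):
        abort(403)  # reject anything that doesn't present the shared secret

    r = request.get_json(silent=True) or {}
    # ... proceed with the BigQuery write ...
    return 'OK', 200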


Hi, I have a need which seems similar to the one Dimitri described 2 years ago. Unfortunately the link provided by @jesse_carah is broken.

Is there any way to see Jesse’s post or should I start a new thread?

Hey Arthur, here ya go:

https://community.looker.com/open-source-projects-78/export-the-results-of-a-looker-query-to-bigquery-9720


Hi Jesse,

Do you have any code snippet to move System Activity data into BigQuery using Cloud Functions?

Thanks

Srinivasarao

Hi @jesse_carah

I have tried the same method that you posted, but when I try to update the field, the update is appended as another row: the name column gets the old value and the annotation column gets the new value.

In the payload below, the value being updated is in data and the new value is in the annotation form_param. Can you help me figure out which part I have made a mistake in?

{'type': 'cell', 'scheduled_plan': None, 'attachment': None, 'data': {'value': '1000', 'rendered': '1000', 'name': '1000'}, 'form_params': {'annotation': '"1800"'}}
  dimension: annotation {
    type: string
    sql: ${TABLE}.annotation ;;
    action: {
      label: "Update the Field"
      url: "https://us-central1-decisive-triode-363515.cloudfunctions.net/function-1"
      param: {
        name: "name"
        value: "{{ value }}"
      }

      form_param: {
        name: "annotation"
        type: string
        label: "Update the Field"
        # default: "No"
        description: "Update the name field if there is any error while initial entry"

      }
    }
  }

  dimension: name {
    type: string
    sql: ${TABLE}.name ;;
    action: {
      label: "Update the Field"
      url: "https://us-central1-decisive-triode-363515.cloudfunctions.net/function-1"
      param: {
        name: "name"
        value: "{{ value }}"
      }

      form_param: {
        name: "annotation"
        type: string
        label: "Update the Field"
        # default: "No"
        description: "Update the name field if there is any error while initial entry"

      }
    }
  }

@vr27

Good question – my example uses the streaming inserts API, which is append-only. The code snippet that I shared publishes an ingestion timestamp with each annotation, so you can use SQL to filter down to the most recent annotation – e.g.: https://stackoverflow.com/questions/25269410/google-big-query-sql-get-most-recent-column-value
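For example, something along these lines from the Cloud Function side – the table and column names are just placeholders, and I’m assuming an ingested_at timestamp column written with each insert:

from google.cloud import bigquery

client = bigquery.Client()

# Keep only the most recent annotation per name, using the ingestion
# timestamp written with each streaming insert.
query = """
    SELECT name, annotation
    FROM (
      SELECT
        name,
        annotation,
        ROW_NUMBER() OVER (PARTITION BY name ORDER BY ingested_at DESC) AS rn
      FROM `my-project.my_dataset.annotations`
    )
    WHERE rn = 1
"""

for row in client.query(query).result():
    print(row.name, row.annotation)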

Hope that helps.

Do we get all the columns that are present in the Look once we update the value? If so, we can get the primary key and, based on that, update the existing BigQuery table by writing the logic in the Cloud Function.
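Something like the following is what I have in mind for the Cloud Function – an UPDATE keyed on the primary key instead of a streaming insert (the table and column names here are just placeholders):

from google.cloud import bigquery

client = bigquery.Client()

def update_row(primary_key: str, new_value: str) -> None:
    # Overwrite the existing row instead of appending a new one.
    query = """
        UPDATE `my-project.my_dataset.my_table`
        SET annotation = @new_value
        WHERE id = @primary_key
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter('new_value', 'STRING', new_value),
            bigquery.ScalarQueryParameter('primary_key', 'STRING', primary_key),
        ]
    )
    client.query(query, job_config=job_config).result()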