Is there a way to let users update data on DB right from looks?
I read about actions here but I found it very abstract.
Can someone please explain the process for this?
Thanks,
Krish.
Hey Krishna,
I've been able to write data from Looker to BigQuery using both Data Actions and the Looker Action Hub. In either case, you'll need to push data from Looker to some middleware that interprets the webhook from Looker and performs the necessary operations to stream the data to BigQuery.
Luckily, Google has a great service called Google Cloud Functions that makes this really easy. Like AWS's Lambda, Cloud Functions let you deploy code that gets executed in response to some event. With a data action, you can push JSON containing data from Looker, along with user-defined form parameters, to a Cloud Function endpoint. The Cloud Function then parses the JSON, extracts the relevant values, and calls the BigQuery SDK to stream the results to BigQuery.
Here's a quick overview of how to use Cloud Functions to stream data from Looker to BigQuery. In this example, we'll create a data action and Cloud Function that let an end user persist an annotation to BigQuery.
We're going to attach a data action to a field, allowing end users to mark whether or not a name is a cool name.
dimension: name {
  type: string
  sql: ${TABLE}.name ;;
  action: {
    label: "Cool Name?"
    url: ""
    param: {
      name: "name"
      value: "{{ value }}"
    }
    form_param: {
      name: "annotation"
      type: select
      label: "Cool name?"
      default: "No"
      description: "Do you think that this name is a cool name?"
      option: {
        name: "No"
      }
      option: {
        name: "Yes"
      }
    }
  }
}
Note: We're going to leave the url blank for now. Once we've spun up the Cloud Function, we'll paste the endpoint in.
We're now going to write a simple Python function that writes the user-selected annotation to BigQuery, and place it in main.py:
from google.cloud import bigquery
import datetime
import time

def annotation(request):
    r = request.get_json()  # Fetch the data action JSON
    client = bigquery.Client()
    dataset_id = ''  # Replace with the name of your BQ dataset
    table_id = ''  # Replace with your table ID
    table_ref = client.dataset(dataset_id).table(table_id)
    table = client.get_table(table_ref)  # API request

    # Request variables
    name = r['data']['name']
    annotation = r['form_params']['annotation']

    # System variables
    sys_time = int(time.time())

    rows_to_insert = [
        (
            name,
            annotation,
            datetime.datetime.fromtimestamp(sys_time).strftime('%Y-%m-%d %H:%M:%S')
        )
    ]
    errors = client.insert_rows(table, rows_to_insert)  # API request to insert rows
    return '{"looker": {"success": true, "refresh_query": true}}'  # Success response to Looker
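For reference, here's a rough sketch of the payload shape the function above expects. The field names follow the LookML action, but the exact envelope Looker sends may contain additional keys, so treat this as an assumption rather than a spec:

```python
# Hypothetical example of the JSON a Looker data action posts to the endpoint.
# "data" carries the param values; "form_params" carries the user's form selections.
sample_payload = {
    "data": {"name": "Ada"},
    "form_params": {"annotation": "Yes"},
}

# The same extraction the Cloud Function performs:
name = sample_payload["data"]["name"]
annotation = sample_payload["form_params"]["annotation"]
print(name, annotation)
```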
Additional things to configure: add
google-cloud-bigquery==1.5.0
to requirements.txt.
Cheers!
Jesse
Thank you so much Jesse!
Going to try this now!
Yeah, let me know how it goes! Happy to discuss further.
Is it possible to push all the data this way into BigQuery? This would be super useful to create custom segments of users for example.
Hello Jesse!
I followed the exact process, but I am seeing an error on Cloud Functions:
NameError: name 'bigquery' is not defined
I included google-cloud-bigquery==1.5.0 in requirements.txt but am still seeing the same error.
EDIT:
Fixed this by adding "from google.cloud import bigquery".
Hey Krish, good catch. It looks like I didn't copy over the first line of the cloud function when I pasted it in. Did you get the action to work?
Dimitri, yes, that is totally possible, but we'll need to leverage the Action Hub framework to push all results to BigQuery. I'll take that as a challenge and try to get you a working example sometime next week.
Hey Jesse,
Yes I finally made it work. But there is one issue (which I am trying to fix).
Data is being duplicated whenever I try to update. Any idea why that's happening?
I tried updating rows with ids 8 and 10, and you can see the data being duplicated.
Amazing, thank you @jesse_carah ! Looking forward to it.
Hey Krish,
Check out this discussion about the append-only nature of BigQuery.
I think the best move here is to have your Cloud Function insert a timestamp when each record is created, and create a view that selects only the most recent record. One approach to doing that is explained here.
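To illustrate the "latest record wins" idea, here is a plain-Python sketch of the logic that such a BigQuery view would express (for example, with ROW_NUMBER() partitioned by name and ordered by the timestamp descending). The data is made up:

```python
# Each insert appends a new row; the "current" annotation for a name is the
# one with the latest created_at timestamp.
rows = [
    ("8",  "No",  "2019-01-01 10:00:00"),
    ("8",  "Yes", "2019-01-02 10:00:00"),  # later update wins
    ("10", "Yes", "2019-01-01 10:00:00"),
]

def latest_per_name(rows):
    latest = {}
    for name, annotation, created_at in rows:
        # ISO-formatted timestamps compare correctly as strings.
        if name not in latest or created_at > latest[name][2]:
            latest[name] = (name, annotation, created_at)
    return sorted(latest.values())

print(latest_per_name(rows))
```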
Cheers!
Jesse
Thank you, Jesse!
I now have a much better picture of this!
Hey Dimitri,
I just wanted to update you that I'm making progress, but my goal of getting a working POC up and running this week was perhaps too bold.
In the meantime, could you elaborate on the specific use-case? What sort of data are you trying to push back to BigQuery?
Cheers,
Jesse
@jesse_carah Whenever people construct a view which represents a segment, e.g. "all user_ids who did xyz and live in London", I would like to enable them to save this list of user_ids as a segment so that it can be used in other dashboards/analyses. The only missing part is being able to save the output of an explore into an existing BigQuery table. This way everyone would be able to create custom (arbitrarily complicated) segments. Hope that makes sense?
I think this is a very common analytical pattern in general!
@Dimitri_Masin I got this working! I'm going to make a post this afternoon describing how to get this up and running.
@Dimitri_Masin check out the new post here. Let me know if you have any questions.
Hey @hwimoon, let's take your question here:
> Hi Jesse,
> I have been trying to follow your instructions on using Data Actions literally step by step, and Looker keeps telling me that "the form was not accepted". What does the error mean and how can I get around this?
> Please let me know! Solving this would be phenomenal for me!
> Thank you,
> Hwi Moon
I'm curious whether you can see anything in the Cloud Function logs indicating whether the action completed correctly. My code snippet doesn't do error handling all that well. My guess is that the function is failing and returning some exit condition that Looker doesn't understand.
Take a look at the logs and let me know what you see; you can access them directly in the Cloud Function console.
Thanks for your reply! I just tried to re-run it to see if I get anything in the log, but there are no new entries. I think I messed up somewhere in the LookML (URL) or in configuration/permissions, but I followed your instructions 100% and am very confused!
The error message is as shown.
@hwimoon please share your LookML. You can redact anything sensitive if you'd like.
@jesse_carah Sorry for the late post - missed the notification for some reason.
I think there is an error in the cloud function (Python 3.7). It seems to me that my r = request.get_json() does not have any keys or values in it, causing errors when I do anything like r['data'] or r['form_params'].
I could not figure out why - please help!
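One possible cause (an assumption, since the logs show nothing): Flask's request.get_json() returns None when the request body isn't sent with an application/json Content-Type, and any r['data'] lookup then fails. A defensive parsing step might look like the sketch below; parse_action_payload is a hypothetical helper name, not part of the original function:

```python
import json

def parse_action_payload(body):
    # Hypothetical helper: parse the webhook body defensively so a missing or
    # malformed payload yields None instead of a KeyError/TypeError.
    try:
        r = json.loads(body)
    except (TypeError, ValueError):
        return None
    if not isinstance(r, dict) or "data" not in r or "form_params" not in r:
        return None
    return r["data"].get("name"), r["form_params"].get("annotation")

print(parse_action_payload('{"data": {"name": "Ada"}, "form_params": {"annotation": "Yes"}}'))
print(parse_action_payload("not json"))
```

If the parse comes back as None, the function can return a failure response such as '{"looker": {"success": false}}' instead of crashing, which should surface a clearer error in Looker.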