Hi, I have a Gen 2 Python Cloud Function.
It's triggered by uploads to GCS.
It reads the QR codes in the uploaded file and appends the results to a Google Sheets worksheet.
When my code fails to read a QR code, the output fields should be empty, but instead they get filled with values from a request that finished just before.
It looks like the new request is inheriting the variables.
This stopped happening when I limited concurrency to 1 (it was previously 10).
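To show what I mean, here is a minimal sketch of the failure mode (the names are illustrative, not my real code): a module-level global lives as long as the function instance, so a request that fails to set it sees whatever the previous request on that instance left behind.

```python
# Minimal sketch of the failure mode (illustrative names, not my real code):
# a module-level global, like mine, survives between invocations handled by
# the same function instance.
plate = ""

def handle_request(qr_plate, results):
    """Simulates one invocation: sets the global only when the QR read succeeds."""
    global plate
    if qr_plate:
        plate = qr_plate
    results.append(plate)  # a failed read still reports the stale value

results = []
handle_request("ABC-123", results)  # request 1: QR read succeeds
handle_request("", results)         # request 2: QR read fails...
print(results)                      # ...but still reports the old plate
# ['ABC-123', 'ABC-123']
```

With concurrency > 1 it gets worse, because two in-flight requests can also overwrite each other's globals mid-processing, not just between invocations.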
signed_url = ref = plate = chassis = engineCode = modelNum = categoryNum = firstReg = modelCode = weight = engineType = kind = kind_for = ragicID = yard = originalJpyear = ""
...
def qr_value_process(value: str, source_file_name):
    global ref, plate, chassis, engineCode, modelNum, categoryNum, firstReg, modelCode, weight, engineType, kind, kind_for, ragicID, yard, originalJpyear
    # QR value reading
    # Sample => Go to settings.py
    if value is None or value == '':
        logger.warning('No QR data')
        return "", "", "", "", "", "", "", "", "", "", "", "", "", "", ""
    lines = value.split("\n")
    logger.info("Split lines = ['" + "','".join(lines) + "']")
    related_lines = list(filter(lambda x: x.startswith("QR-Code:2/") or x.startswith(
        "2/") or x.startswith("QR-Code:K/") or x.startswith("K/"), lines))  # Sometimes the 'QR-Code:' prefix is missing
    logger.info("Filtered lines = ['" + "','".join(related_lines) + "']")
    for line in related_lines:
        parts = line.split("/")
        head = parts[0]
        # print("string length: " + str(len("".join(parts[1:]))))
        # Passenger-car handling
        if head.startswith("QR-Code:2") or head.startswith("2"):
            # print("passenger car")
            if re.search(pattern_for_qr2, line):
                logger.info("Passenger-car QR2 found on " + source_file_name)
                plate = parts[1]
                chassis = parts[3]
                modelCode = parts[4]
                kind = parts[-1]
    ...
    return ref, plate, chassis, engineCode, modelNum, categoryNum, firstReg, modelCode, weight, engineType, kind, kind_for, ragicID, yard, originalJpyear
...
sheet.append_row([ref, plate, chassis, engineCode, modelNum, categoryNum, firstReg, modelCode, weight, engineType, kind, kind_for, ragicID, signed_url, value, triggered_file_name, yard, originalJpyear], value_input_option=ValueInputOption.user_entered)
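For reference, this is the direction I'm considering to make the state request-local: build and return a dict instead of mutating globals. This is only a sketch under my own assumptions; `FIELDS` and the placeholder regex are illustrative, not my production code.

```python
import re
import logging

logger = logging.getLogger(__name__)

# Illustrative sketch (not my production code): the field list mirrors my
# globals, and pattern_for_qr2 is a placeholder for my real regex.
FIELDS = ["ref", "plate", "chassis", "engineCode", "modelNum", "categoryNum",
          "firstReg", "modelCode", "weight", "engineType", "kind", "kind_for",
          "ragicID", "yard", "originalJpyear"]
pattern_for_qr2 = re.compile(r".")  # placeholder

def qr_value_process(value, source_file_name):
    result = {f: "" for f in FIELDS}  # fresh, request-local state
    if not value:
        logger.warning("No QR data")
        return result
    related_lines = [l for l in value.split("\n")
                     if l.startswith(("QR-Code:2/", "2/", "QR-Code:K/", "K/"))]
    for line in related_lines:
        parts = line.split("/")
        head = parts[0]
        if head.startswith(("QR-Code:2", "2")):  # passenger-car handling
            if pattern_for_qr2.search(line):
                logger.info("Passenger-car QR2 found on " + source_file_name)
                result["plate"] = parts[1]
                result["chassis"] = parts[3]
                result["modelCode"] = parts[4]
                result["kind"] = parts[-1]
    return result
```

The caller would then do `row = qr_value_process(value, triggered_file_name)` and build the `append_row` list from `row`, so nothing is shared between concurrent requests. Is this the right fix, or is there a simpler way?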
We upload around 50 files at once on a daily basis. I want each file to be processed separately to avoid timeout issues, but I thought I'd use concurrency to save some invocation cost.
Is there any way to keep using concurrency while preventing this from happening?
Does this also happen on Cloud Run if I set concurrency to more than 1?