Back in 2021 I created a simple, fragile, but functional AppSheet CRM on top of our Google Sheets “database”.
The sync times have always been too long, and I’m getting tired of the frequent waiting, so I’m trying to get this sorted during off-peak January.
The spreadsheet has three main tables: one with around 10,000 entries (12 virtual columns and 20 format rules), one with around 5,000 (13 virtual columns and 25 format rules), and another smaller but frequently used sheet with around 50 entries (24 virtual columns and 58 format rules).
Does that info raise any alarms in itself? I get that Virtual Columns are resource-expensive. I’ve had a quick look at the Performance Analyzer. Here’s some data from that:
Sync times for my primary apps are 10–40 seconds, with 50–100k rows across about 20 tables and 40 users.
I moved away from virtual columns for computations specifically. E.g., if I need a row updated based on input values, I run a grouped action sequence that performs the update logic.
I don’t know much about your use case, but in mine:
I run an invoicing / CRM / inventory, ERP-like app.
I separated the ‘invoice creation table and associated order rows’ from the ‘final database table of all completed invoices’.
This way I can run quick virtual columns on small 200–400-row tables; then, when the user completes the data input and other actions, the rows are copied to the ‘main database’ tables and deleted from the entry table.
E.g.:
TABLE A/B/C → most of the temporary quick-entry functions that virtual columns speed up
TABLE D/E/F → copies of ‘completed work’ from tables A/B/C, specifically for record keeping, graphs, and whatever else
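The copy-then-delete pattern above can be sketched in plain JavaScript (in AppSheet this would be a grouped action sequence or an Apps Script trigger; the table shapes, field names, and “completed” status value here are hypothetical, just to show the flow):

```javascript
// Sketch of the copy-then-delete pattern: completed rows move from a small
// "entry" table (where quick virtual columns stay cheap) to a large
// "archive" table used only for record keeping and reporting.
// Rows are modeled as plain objects; all names are illustrative.
function archiveCompleted(entryTable, archiveTable) {
  // Copy completed rows to the archive table.
  const completed = entryTable.filter(row => row.status === "completed");
  archiveTable.push(...completed);

  // Remove them from the entry table so it stays small (200-400 rows).
  const remaining = entryTable.filter(row => row.status !== "completed");
  entryTable.length = 0;
  entryTable.push(...remaining);

  return completed.length; // number of rows archived
}
```

The key design point is that the expensive virtual columns only ever run over the small entry table; the archive table carries no live computation.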
Thanks for the reply. I get the gist of your strategy but don’t have the know-how to dig deeper.
To all: Is there an appropriate place on this forum to request paid assistance? Or should I go to a gig website? We’re just a husband & wife business so we have a limited budget but would really like to have a smoother experience than we’ve been getting. We use a fair bit of Appscript scripting in the backend responding to Appsheet Actions.
Otherwise, my suggestion is to go through each virtual column one by one, table by table. Ask yourself how that data gets entered in the first place. A simple example of removing a virtual column:
Total Amount (probably a virtual column) = Number of Units * Price
Your users can input the units manually via a form or an input action. Your form-save action (or whatever action the user uses to edit the row or create a new row) can simply hard-write the computed value into a static (non-virtual) column.
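The save-time computation above can be sketched as follows (field names like `units`, `price`, and `totalAmount` are hypothetical; in AppSheet this would be the column’s initial value or an action’s “set column value”):

```javascript
// Sketch: compute the total once at save time and store it in a static
// column, instead of recomputing a virtual column on every sync.
function onFormSave(row) {
  // Equivalent of the virtual-column formula [Number of Units] * [Price],
  // but written into a real column when the row is saved.
  row.totalAmount = row.units * row.price;
  return row;
}
```

The trade-off is the usual one: the value is stale if `units` or `price` change outside the form, so every edit path must run the same write.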
But that is a pretty simple change. Depending on your app’s needs, it can also be a structural problem/challenge in your tables or interface. E.g., for each invoice (row) in my table, I don’t need computed statistics for that specific customer; instead, I’d have the customer table summarize that for me through related tables, as REF_ROWS or a “Related…” column such as SELECT(Customer[ID], AND([Inquiry Date] <= TODAY(), [Status] = "not served")). Something like that helps with general information processing.
—– bottom line
Removing most virtual columns has helped me scale up. E.g., I used to have a virtual column displaying each customer’s statistics as a rich-text LongText type. I simply changed it to a static column that writes the CONCATENATE(//table//stuff//table//more stuff) result as plain text on the Google Sheet. Yes, they don’t update in real time, but you don’t need them updated most of the time, only when you actually look at them; or run an automation every hour or day.
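The scheduled-rebuild idea in the “bottom line” can be sketched like this (data shapes and the stats format are hypothetical; in practice this would run as an hourly or daily automation, e.g. an Apps Script time trigger, writing the text back to the sheet):

```javascript
// Sketch: rebuild a static "customer stats" text column on a schedule,
// instead of keeping it as a rich-text virtual column recomputed per sync.
function buildCustomerStats(customer, invoices) {
  const mine = invoices.filter(inv => inv.customerId === customer.id);
  const total = mine.reduce((sum, inv) => sum + inv.amount, 0);
  // Plain-text stand-in for the CONCATENATE(...) the post describes.
  return `Invoices: ${mine.length} | Total: ${total.toFixed(2)}`;
}
```

Because the result is stored as plain text, syncs only download a string per customer rather than re-evaluating an expression over the whole invoice table.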
Why do you feel the sync times are too long? The logs show between 5 and 10 seconds each. For an app of the size you’re indicating, that seems perfectly acceptable.
Your combined virtual-column time is less than 1 second, which is phenomenal!
The only ways I see of getting times any faster are:
Implement a very strict archiving/purging paradigm to keep your main data as small as possible.
Insert security filters to reduce the data downloaded to each user to only what they need to see.
Use a SQL database to decrease read times. If you couple this with security filters, you can get the fastest possible data-download times.
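For reference, a security filter is just a per-table expression evaluated for each user at sync time; rows where it returns false are never downloaded. A minimal sketch combining archiving with per-user filtering might look like this (the column names are hypothetical; USEREMAIL() is AppSheet’s built-in current-user function):

```
AND([Status] <> "Archived", [Owner] = USEREMAIL())
```

With this in place, each user’s sync only pulls their own active rows, which compounds nicely with the SQL-backend suggestion above.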
Thank you for the feedback - it’s got me thinking more specifically about the particular situations that are causing the long delays. I’ll continue with that and maybe post a video in the coming days.