By Raj
Estimated reading time: 12 minutes
Automate Inventory Management in Google Sheets with Apps Script
Many SMEs run inventory in spreadsheets because full ERP is overkill and costly. The pain is manual updates: stock levels go stale, reorder points are missed, and multi-location or multi-channel data lives in separate files. Automating inventory management in Google Sheets with Apps Script gives you a single source of truth, reorder alerts, and optional sync from e‑commerce or APIs—without buying new software. This guide covers data model design, implementation with batching and chunking, concurrency and failure recovery, when to choose Sheets over ERP, and how to scale when SKU count or locations grow.
Common mistakes when automating inventory
- One giant sheet with no tabs or structure (hard to maintain and trigger).
- Manual-only updates with no time-driven or event-driven triggers.
- Per-row setValue() loops instead of setValues() for bulk writes.
- No reorder logic (min/max or reorder point), so alerts never fire.
- Pulling external data (e.g. Shopify) in a single long run that hits execution time or URL Fetch quota.
- No lock around stock deduction so two runs can double-deduct the same SKU.
Architecture for inventory automation
Keep a clear separation: a Stock sheet (SKU, current qty, min/max or reorder point, location), a Transactions or log sheet (in/out, date, source), and optionally a Settings sheet (reorder thresholds, notification emails). Scripts read and write in bulk; triggers run checks on a schedule or after edits. For data coming from outside (e.g. Shopify, WooCommerce), use API integrations with batching and chunking so you stay within execution time and URL Fetch quota.
Data model design
SKU normalization and multi-location modeling
Use one canonical identifier per product (SKU or internal ID) across all sheets. Strip whitespace and enforce case (e.g. trim and uppercase) when reading from forms or APIs so lookups don’t fail. If you ingest from multiple channels, map channel-specific IDs to your canonical SKU in a small mapping table or a dedicated column so Stock and Transactions stay consistent. For locations, either one row per SKU per location (columns: SKU, LocationId, Qty, Min, Max) or a single Stock sheet with location as a column and composite key (SKU + Location). The former is easier for range-based reads; the latter can work with a filter. Choose one and stick to it so reorder logic and sync scripts don’t branch on two different shapes.
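The normalization and channel mapping described above can be sketched as two small helpers. This is a minimal illustration, not a fixed API: the channelMap object stands in for the mapping table you maintain in your sheet, and the function names are hypothetical.

```javascript
// Canonicalize a raw SKU: strip whitespace and enforce uppercase so
// lookups don't fail on "  ab-100 " vs "AB-100".
function normalizeSku(raw) {
  return String(raw).trim().toUpperCase();
}

// Map a channel-specific ID to the canonical SKU via a mapping table
// (loaded from a sheet); fall back to the normalized input when no
// mapping exists.
function toCanonicalSku(channelId, channelMap) {
  var key = normalizeSku(channelId);
  return channelMap[key] || key;
}
```

Run every incoming SKU from forms or API payloads through these helpers before touching Stock or Transactions so both sheets always agree on the key.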
Transaction log integrity and stock reconciliation
Every stock change should append a row to Transactions: SKU, location (if used), delta (+/-), timestamp, source (e.g. “Sync”, “Manual”, “Order 123”). That gives you an audit trail and a basis for reconciliation. Write transactions in the same run that updates Stock so you never have a Stock update without a corresponding log entry. Periodically (e.g. weekly) sum transaction deltas per SKU/location and compare to current Stock. If they diverge, flag the row for review or write the reconciled value. Store the last reconciliation timestamp so you only process new transactions since then. This catches double deductions, missed updates, or manual edits that bypassed the script.
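The weekly reconciliation step can be sketched as a pure function over the arrays returned by getValues(). The column positions (SKU in column 0, quantity or delta in column 1) are assumptions for illustration, and the sketch assumes opening balances were recorded as transactions; adapt both to your layout.

```javascript
// Sum transaction deltas per SKU and compare against current Stock.
// Returns the rows that diverge so they can be flagged for review.
function reconcile(stockRows, txRows) {
  // stockRows: [sku, qty]; txRows: [sku, delta]
  var sums = {};
  txRows.forEach(function (t) {
    var sku = t[0];
    sums[sku] = (sums[sku] || 0) + Number(t[1]);
  });
  var discrepancies = [];
  stockRows.forEach(function (s) {
    var expected = sums[s[0]] || 0;
    if (expected !== Number(s[1])) {
      discrepancies.push({ sku: s[0], stock: Number(s[1]), fromTx: expected });
    }
  });
  return discrepancies;
}
```

In the real script you would read both sheets in one getValues() call each, run this in memory, and write any flags back in a single setValues().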
Implementation strategy
1. Stock sheet and reorder check
One row per SKU (and per location if needed). Columns: SKU/id, name, current quantity, min quantity, reorder quantity (or max), status. Read the entire Stock range in one getValues() call, compute low-stock rows in memory, then write the status column back with one setValues() call. Full-range reads and writes let one run update many rows without timing out; avoid per-cell reads or writes inside a loop.
function checkReorderPoints() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const sheet = ss.getSheetByName("Stock");
  const lastRow = sheet.getLastRow();
  if (lastRow < 2) return;
  const numRows = lastRow - 1;
  // Columns A-F: SKU, name, qty, min, reorder qty, status.
  const data = sheet.getRange(2, 1, numRows, 6).getValues();
  const qtyCol = 2, minCol = 3, statusCol = 5; // zero-based array indices
  for (var i = 0; i < data.length; i++) {
    data[i][statusCol] = data[i][qtyCol] <= data[i][minCol] ? "LOW" : "";
  }
  // Write only the status column (F) back in a single call.
  sheet.getRange(2, 6, data.length, 1).setValues(data.map(function (r) { return [r[statusCol]]; }));
}
2. Chunked sync with PropertiesService cursor
When syncing from an API (e.g. Shopify inventory), process in chunks. Store the last processed cursor (e.g. order id or page offset) in PropertiesService.getScriptProperties(). Each run fetches one chunk, updates Stock and Transactions, then writes the new cursor. If more data remains, create a one-off time-driven trigger to run again in a few minutes; when done, delete the trigger.
function syncInventoryChunk() {
  const lock = LockService.getScriptLock();
  try {
    lock.waitLock(30000);
    const props = PropertiesService.getScriptProperties();
    const cursor = props.getProperty("inventorySyncCursor") || "0";
    const CHUNK = 200;
    var nextCursor = cursor;
    var updates = fetchInventoryPage(cursor, CHUNK); // your API wrapper
    if (updates.length === 0) {
      props.deleteProperty("inventorySyncCursor");
      deleteTriggersFor("syncInventoryChunk");
      return;
    }
    const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Stock");
    const lastRow = sheet.getLastRow();
    if (lastRow < 2) return;
    const numRows = lastRow - 1;
    const data = sheet.getRange(2, 1, numRows, 4).getValues();
    var skuMap = {};
    data.forEach(function (r, i) { skuMap[r[0]] = i; });
    for (var u = 0; u < updates.length; u++) {
      var idx = skuMap[updates[u].sku];
      if (idx !== undefined) data[idx][2] = updates[u].qty;
      nextCursor = updates[u].id;
    }
    sheet.getRange(2, 1, data.length, 4).setValues(data);
    props.setProperty("inventorySyncCursor", nextCursor);
    if (updates.length >= CHUNK) {
      ScriptApp.newTrigger("syncInventoryChunk").timeBased().after(2 * 60 * 1000).create();
    } else {
      deleteTriggersFor("syncInventoryChunk");
    }
  } finally {
    if (lock.hasLock()) lock.releaseLock();
  }
}

// Helper: remove any triggers pointing at the given handler function.
function deleteTriggersFor(handlerName) {
  ScriptApp.getProjectTriggers().forEach(function (t) {
    if (t.getHandlerFunction() === handlerName) ScriptApp.deleteTrigger(t);
  });
}
3. Inbound and outbound updates
When stock moves, update the Stock sheet and append to Transactions in one logical flow. Batch: getValues() for the affected rows, modify the array, setValues() back, then appendRow() (or a batch append) for the transaction log. For high volume, process in chunks and persist cursor in PropertiesService so a time-driven trigger can continue.
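The batch flow above can be sketched as one in-memory pass that updates quantities and builds the transaction rows together, so Stock and Transactions never diverge. Column positions (SKU in 0, quantity in 2) and the movement shape are illustrative assumptions.

```javascript
// Apply stock movements to rows read via getValues(), producing both
// the updated Stock rows (for one setValues() write-back) and the
// Transactions rows (for one batch append) in a single pass.
function applyMovements(stockRows, movements, source) {
  var index = {};
  stockRows.forEach(function (r, i) { index[r[0]] = i; });
  var txRows = [];
  movements.forEach(function (m) {
    var i = index[m.sku];
    if (i === undefined) return; // unknown SKU: skip (or log for review)
    stockRows[i][2] = Number(stockRows[i][2]) + m.delta;
    txRows.push([m.sku, m.delta, new Date().toISOString(), source]);
  });
  return { stockRows: stockRows, txRows: txRows };
}
```

Because the transaction rows are produced in the same pass as the quantity changes, a single run writes both or neither, which keeps the audit trail consistent.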
Reorder alerts and triggers
Use an installable time-driven trigger (e.g. daily at 6 AM) to run the reorder check. Optionally send email via GmailApp or MailApp with a summary of low-stock items. Stay under the 6-minute execution limit and daily email quota; for many SKUs, run the check in one batch and send one digest email. For custom UIs or internal tools that need forms and dashboards, see Apps Script web apps.
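Building the digest as one string before a single send keeps you under the daily email quota. A minimal sketch, assuming rows shaped [sku, name, qty, min]; the resulting body can be passed to MailApp.sendEmail in the trigger function.

```javascript
// Build one digest body from the Stock rows instead of sending one
// email per low-stock SKU. Returns null when nothing is low, so the
// caller can skip sending entirely.
function buildLowStockDigest(rows) {
  var low = rows.filter(function (r) { return Number(r[2]) <= Number(r[3]); });
  if (low.length === 0) return null;
  var lines = low.map(function (r) {
    return r[0] + " (" + r[1] + "): " + r[2] + " on hand, min " + r[3];
  });
  return "Low stock items (" + low.length + "):\n" + lines.join("\n");
}
```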
Example: A retail SME with 12,000 SKUs needed reorder alerts without ERP. We built a Stock sheet with min/max columns, a daily trigger that ran a reorder check, and a digest email listing items below min. Stock data was updated nightly from their sales channel via a chunked UrlFetchApp sync so the script never hit execution time or URL Fetch limits.
Second example: A distributor selling on Shopify, Amazon, and a B2B portal needed one master stock view. We synced inventory from each channel in separate chunked jobs (cursor per channel in script properties), wrote into a single Stock sheet with a “source” column, and ran a nightly reconciliation that flagged discrepancies. Reorder alerts used the consolidated view so purchasing didn’t rely on a single channel’s numbers.
In most SME deployments we have seen, batching reads and writes for inventory checks reduces execution time substantially, typically 50–70%, compared to per-cell updates.
Need inventory automation designed for your SKU count and data sources? We build Sheets + Apps Script solutions that stay within quotas and triggers.
Concurrency and data integrity
Simultaneous edits, LockService, and double deduction
If a time-driven sync and a manual edit (or two triggers) run at once, both can read the same Stock row, compute a new quantity, and write back. The last write wins and the other update is lost—e.g. one run deducts 5, another deducts 3, but the final value only reflects one deduction (double deduction or lost deduction). Same risk when multiple script runs append to Transactions: interleaved rows and no single source of truth for “current” state.
Use LockService.getScriptLock() around any block that reads Stock (or state), modifies it, and writes back. Acquire the lock at the start of the critical section, release in a finally block so errors don’t leave the lock held. The second run waits (e.g. waitLock(30000)) and then runs with fresh data. Use for: sync chunks, reorder checks that write status, and any script that updates the same rows another trigger might touch.
Deduct only after you hold the lock. Read current qty, validate (qty ≥ deduction), then set new qty and append the transaction row in the same run. If you use a “pending” or “reservation” table, consume reservations inside the same lock so two runs can’t both consume the same reservation.
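The validation step inside the lock can be sketched as a pure function: it runs after the lock is held and the fresh quantity has been read, and it returns either the new quantity plus the transaction row to append, or a refusal. The function and field names are illustrative.

```javascript
// Validate a deduction against the freshly read quantity. Only called
// while the script lock is held, so no other run can change the
// quantity between the read and the write.
function deductStock(currentQty, deduction, sku, source) {
  if (deduction <= 0) return { ok: false, reason: "deduction must be positive" };
  if (currentQty < deduction) return { ok: false, reason: "insufficient stock" };
  return {
    ok: true,
    newQty: currentQty - deduction,
    txRow: [sku, -deduction, new Date().toISOString(), source]
  };
}
```

On an ok result, write newQty to Stock and append txRow to Transactions in the same run, then release the lock.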
Failure recovery and logging
Status sheet and error tracking
Maintain a small Status sheet or range: last run timestamp, script name, rows processed, outcome (OK / ERROR), and optional error message. Write to it at the end of each run (or in a finally block). You can then see at a glance whether the nightly sync or reorder check succeeded without opening Apps Script execution log. In a try/catch, log the error to the Status sheet and optionally email a digest to ops. Include script name, timestamp, and a short message so someone can open the execution log and filter by time. Avoid rethrowing if you’ve already logged—otherwise the trigger may retry and duplicate work.
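The Status row described above can be built by a small helper called from the finally block; the column order here is an assumption to adapt to your sheet, and the resulting array would be passed to appendRow.

```javascript
// Build one Status sheet row: timestamp, script name, rows processed,
// outcome, and a truncated error message (empty on success).
function buildStatusRow(scriptName, rowsProcessed, error) {
  return [
    new Date().toISOString(),
    scriptName,
    rowsProcessed,
    error ? "ERROR" : "OK",
    error ? String(error).slice(0, 500) : "" // keep messages short
  ];
}
```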
Retry strategy and idempotent sync design
For transient failures (e.g. URL Fetch timeout), retry once or twice with a short delay inside the same run. For “run exceeded execution time”, don’t retry in the same run—persist the cursor and let the next scheduled run continue. That way each run is bounded and you don’t burn quota on repeated timeouts. Design syncs so re-running the same chunk doesn’t double-apply. Use “last updated” or “version” from the source and overwrite Stock only when the incoming data is newer. For append-only transaction feeds, store the last processed id and skip already-seen records so a retry doesn’t duplicate transactions.
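Both patterns above, bounded in-run retries and idempotent re-runs, can be sketched as small utilities. These are hypothetical helper names, not Apps Script built-ins; withRetry would wrap a UrlFetchApp call, and onlyNewRecords would filter a fetched chunk against the last processed id stored in script properties.

```javascript
// Retry a function a bounded number of times for transient failures,
// rethrowing the last error if all attempts fail.
function withRetry(fn, attempts) {
  var lastError;
  for (var i = 0; i < attempts; i++) {
    try { return fn(); } catch (e) { lastError = e; }
  }
  throw lastError;
}

// Skip records at or before the last processed id so re-running the
// same chunk doesn't double-apply transactions.
function onlyNewRecords(records, lastProcessedId) {
  return records.filter(function (r) { return r.id > lastProcessedId; });
}
```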
When ERP is overkill vs Sheets
When ERP is overkill, cost, and tradeoffs
Full ERP makes sense when you have multi-entity ledgers, complex manufacturing BOMs, regulated lot/serial tracking, or heavy procurement workflows with approvals. For a single-entity SME with a few thousand SKUs, one or two locations, and straightforward in/out inventory, Sheets plus Apps Script often covers reorder alerts, transaction logging, and channel sync without the cost and implementation time of ERP.
ERP typically has per-seat licensing, implementation fees, and ongoing support. Sheets and Apps Script are free within Google Workspace and quota; cost is build time (yours or a developer) and optional integrations. The break-even is where manual errors or stockouts from spreadsheet-only processes cost more than the ERP plus implementation—or where compliance (e.g. lot traceability) demands a dedicated system.
Sheets gives flexibility and visibility in one place; you own the data and can change logic without vendor lock-in. Tradeoffs: execution time and quota limits, no native multi-tenant or role-based access at the field level (use sheet protection and sharing), and you must design concurrency and failure recovery yourself. ERP offloads that but adds cost and rigidity. Choosing Sheets is a conscious trade for SMEs that prioritize speed to value and low fixed cost.
Scalability and quotas
Apps Script has a 6-minute execution limit per run and daily caps on URL Fetch, emails, and spreadsheet writes. Design for these from the start so you don’t hit a wall at 5k or 10k SKUs. When syncs or reorder checks run long, see the execution time limit guide for chunking, LockService, and trigger patterns that keep runs within quota.
Quota breakdown
- Runtime: 6 minutes per execution. Chunk work and use a cursor so each run does a bounded batch.
- URL Fetch: daily limit (on the order of 20k calls). Batch API requests where possible; use UrlFetchApp.fetchAll() for parallel reads; cache responses if data doesn't change every run.
- Spreadsheet: read/write limits per day. Prefer getValues()/setValues() over many small calls; one range read per sheet per run when feasible.
- Email: daily send limit. One digest per run instead of one email per low-stock SKU.
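Staying under the URL Fetch quota usually comes down to bounding how many requests each run issues. A minimal sketch: split the pending request objects into batches, then pass one batch at a time to UrlFetchApp.fetchAll(). The batch size is an assumption to tune against your daily limits.

```javascript
// Split a list of request objects into bounded batches so each
// fetchAll call (and each run) stays within quota.
function toBatches(items, batchSize) {
  var batches = [];
  for (var i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```

Combined with the PropertiesService cursor pattern above, each run processes one or two batches and leaves the rest for the next trigger.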
Deployment and regional sheet separation
One script per workbook is the default. For very large orgs, split by business unit or region: separate workbooks (and thus separate quota pools) with a coordinator sheet, or a single "master" script that calls into the other projects via the Apps Script API (the scripts.run method) or a published web app endpoint if you need cross-sheet aggregation. Each deployment has its own triggers and quota. When locations or channels have different owners or refresh cycles, use one sheet per region (or per channel) and a summary sheet that aggregates via formulas or a script. That keeps sync jobs small and avoids one 50k-row sheet that every run must read. Reorder logic can run per sheet or on the summary depending on who needs to act.
Scaling this automation for production
For multi-location or high-SKU production use, structure scaling around quotas, triggers, monitoring, and governance. To automate inventory management in Google Sheets with Apps Script at scale, keep chunk sizes bounded, use a status sheet for visibility, and document which script owns which trigger.
Quotas and infrastructure
- Split by workbook or script: Separate sheets or deployments per region/channel so each has its own quota pool.
- Chunk external syncs: Use a queue or cursor so nightly syncs don’t exceed URL Fetch or runtime in one run.
Triggers and runtime
- Time-driven triggers: Schedule reorder checks and syncs at off-peak times.
- One trigger per job: Avoid duplicate triggers; delete and recreate only when needed.
Monitoring and governance
- Status or log sheet: Record last run time, rows updated, and errors so you can debug without opening the execution log.
- Document ownership: Note which script owns which sheet and trigger so teams don’t duplicate or conflict.
If inventory logic currently lives in Excel with VBA, moving it to Sheets and Apps Script centralizes data and avoids per-machine macros. VBA to Apps Script migration is an option when consolidating legacy workbooks.
FAQ
How much does it cost to automate inventory in Google Sheets?
Google Sheets and Apps Script are free within quota. Cost is your time or a developer to build triggers, formulas, and optional API links. No per-seat SaaS fees; you own the sheet and script.
What are the limitations of inventory automation in Sheets?
Execution time (6 min per run), daily URL Fetch and write quotas, and sheet size (10M cells). For 10k+ SKUs or real-time sync from many sources, use chunk processing and optional external systems.
Is inventory data in Google Sheets secure?
Data lives in your Google Workspace. Use sharing and sheet protection to limit who edits what. Scripts run with the owner’s authority; avoid storing secrets in code—use Script Properties or OAuth where needed.
What are alternatives to Sheets for inventory?
Dedicated inventory SaaS, ERP modules, or databases with a front end. Sheets + Apps Script suits SMEs that already use Google Workspace and want one place for stock, orders, and reports without new software.
About the author
Raj is an Apps Script and Google Workspace automation specialist. He builds and audits production scripts for Sheets, Gmail, Calendar, and API integrations—including inventory syncs, triggers, and quota-aware design—for teams and SMEs.