
Using Google Sheets as a database


(feature image by Vincent Botta on Unsplash)


Sometimes you just need a simple database, plus a quick and straightforward way to edit the data, without compromising on integrity or accountability. This is where Google Sheets offers a totally valid solution!

Google Sheets provides us with an already familiar interface to create, edit, and view all our data in columns. We can search, order, and even do bulk operations. Also, Google provides an API to use these sheets in a programmatic way, which we are going to use for this article.

Create the database

First, you need to create a new spreadsheet. I trust you know how to do that. Then create your first table (aka worksheet) and a couple of columns. In my example, I created a recipes table with columns for a unique identifier (id), the name, a description (desc), and two date fields (createdAt, updatedAt).


Sheet setup

The two date fields should be auto-filled by our “database”. So how are we going to do that? Simple: we attach a script to our spreadsheet. Go to Tools -> Script Editor, and a new window with an empty script will open.

function onEdit(e) {
  const sheet = e.range.getSheet();
  const row = e.range.getRow();
  // Look up the date columns in the header row
  const columns = sheet.getRange(1, 1, 1, sheet.getLastColumn()).getValues()[0];
  const createdIdx = columns.indexOf('createdAt');
  const updatedIdx = columns.indexOf('updatedAt');
  if (createdIdx >= 0) {
    const createdCell = sheet.getRange(row, createdIdx + 1);
    // Only fill createdAt if it is still empty, i.e. the row is new
    if (!createdCell.getValue()) {
      createdCell.setValue(new Date().toISOString());
    }
  }
  if (updatedIdx >= 0) {
    sheet.getRange(row, updatedIdx + 1).setValue(new Date().toISOString());
  }
}

onEdit is a callback that is invoked every time a cell is edited by a user. You can read about onEdit and its event argument in Google's API documentation.

In the script above, we first check whether the respective date column exists. For createdAt we also check whether it is already filled; if not, we populate the field with the current date. updatedAt is set to the current timestamp after every edit a user makes to the row.

Now we have our very basic database table set up and ready to be accessed by an application.

Creating the API

We are going to implement the database API in Node.js. Google provides a library to communicate with its APIs called googleapis, which is available as an npm module. Go and create a new project and install this library. Also, we are going to use TypeScript because we are not cavemen, and LokiJS as an in-memory database so we don’t have to constantly call the spreadsheet API.

$> mkdir sheet-api
$> cd sheet-api
$> yarn init
question name (sheet-api):
$> yarn add googleapis google-auth-library \
    typescript lokijs \
    @types/node @types/lokijs
[1/4] 🔍  Resolving packages...
✨  Done in 13.37s.

Service account

Now, to connect to our database, we need to authenticate with the Google API. To do this we will create a service account. Navigate to the Google Cloud Platform Console, IAM & admin -> Service accounts. You should already have a GCP project from the sheet script we created earlier; if not, create a new project. Create a new service account (I named mine sheet-bot). You don’t need to assign any roles, but you do need to create a key. Choose JSON as the key type, and a credentials file will be downloaded containing all the information needed to authenticate as this service account in the next steps. Save this file to your project folder as sheet-api/credentials.json. Also note down the email address of the service account, because we are going to need it later.


GCP service account


To make use of our newly created service account, we are going to use google-auth-library's utils to create a JWT and authorize our script. We will encapsulate this as a function in api/auth.ts:

import { readFile } from 'fs';
import { resolve } from 'path';
import { promisify } from 'util';
import { JWT } from 'google-auth-library';

const promisedFile = promisify(readFile);

export async function auth() {
  const credentials = JSON.parse(
    await promisedFile(resolve(__dirname, '../credentials.json'), 'utf-8'),
  );
  const client = new JWT({
    email: credentials.client_email,
    key: credentials.private_key,
    scopes: [''],
  });
  await client.authorize();
  return client;
}

Accessing the sheet

We will wrap the googleapis methods that operate on sheets with a helper function: first, because we don’t need all the functionality the Sheets API provides, and second, because we can slap some typings onto it and gain some level of confidence when we later deal with the data flowing to and from the sheets.

import { google } from 'googleapis';
import { auth } from './auth';

const sheetsApi = google.sheets({ version: 'v4' });

export async function readSheet<T>(
  spreadsheetId: string,
  range: string,
  firstRowAsKeys?: true,
): Promise<T[]>;
export async function readSheet(
  spreadsheetId: string,
  range: string,
  firstRowAsKeys: false,
): Promise<string[][]>;
export async function readSheet<T>(
  spreadsheetId: string,
  range: string,
  firstRowAsKeys: boolean = true,
): Promise<T[] | string[][]> {
  const {
    data: {
      values: [keys, ...values],
    },
  } = await sheetsApi.spreadsheets.values.get({
    auth: await auth(),
    spreadsheetId,
    range,
    valueRenderOption: 'UNFORMATTED_VALUE',
  });
  // Either map each row to an object keyed by the header row,
  // or return the raw two-dimensional array
  return firstRowAsKeys
    ? => =>
          (acc, key, idx) => ({
  ,
            [key]: columns[idx],
          }),
          {} as T,
        ),
      )
    : [keys, ...values];
}

The code itself is pretty straightforward: we await the data with the help of our auth-helper detailed above. Then we parse the sheet data, which comes in a two-dimensional array. If we want to treat the first row as keys, we create an array of objects whose properties are the mapped columns of each row. Take note of valueRenderOption: 'UNFORMATTED_VALUE'; this ensures that data formatted in a special way (e.g. currencies) arrives as a raw value in our application (e.g. without the currency sign).

One last thing to do is to add the service account to our database sheet as a collaborator so it can access the sheet. Go to the sheet and click the Share button in the top-right corner. There, paste the email you saved earlier and add it with read/write access.


Google sheet share settings

Creating a consumer

Now that we have our basic API wrapper finished, we can start creating our business model.

export type Recipe = {
  id: string;
  name: string;
  desc: string;
  createdAt: string;
  updatedAt: string;
};

This is just our DTO with all columns/properties we created in our sheet.

import * as Loki from 'lokijs';
import { readSheet } from '../api/sheets';
import { Recipe } from './Recipe';

const sheetId = '1o0VAQ4f2QafBjLUd53yCWEtdwKonu5wPM33CttxBTXI';
const sheetRange = 'Recipes!A:E';

const db = new Loki('recipes.json');
const collection = db.addCollection<Recipe>('recipes', { indices: ['id'] });

export async function setup() {
  const data = await readSheet<Recipe>(sheetId, sheetRange);
  collection.insert(data);
}

export async function refresh() {
  const data = await readSheet<Recipe>(sheetId, sheetRange);
  const ids = => );
  // Overwrite cached entities whose sheet row has a newer updatedAt
    obj =>
      ids.includes( &&
      new Date(data.find(d => ===!.updatedAt).getTime() >
        new Date(obj.updatedAt).getTime(),
    obj => Object.assign(obj, data.find(d => === )),
  );
  // Remove entities whose rows were deleted from the sheet
  collection.findAndRemove({ id: { $not: { $in: ids } } });
  // Insert rows that are not yet in the collection
  collection.insert(data.filter(d => !collection.findOne({ id: })));
}

export { collection };

As mentioned above, we are going to utilize LokiJS as an in-memory database to provide some basics that come in handy (querying, sorting, indices, etc.), so we don’t have to make that many API calls to query our sheet.

We are going to load all data from our sheet into memory on setup. This should be fine for every use case where Google Sheets is a viable option (hint: if you have more data than memory can reasonably hold, use a real database), and you could even modify the setup to load the database in a build step and never query the Google API for data in production setups. Incidentally, this is what I did at a company I worked for.

Next, we have a refresh function that will update our in-memory database to the current state of the sheet (insert, update, and remove entities).

And lastly, we export the collection for further use in our application:

import { createServer } from 'http';
import { collection, setup, refresh } from './store/recipes';

const port = Number(process.env.PORT) || 8000;

setup().then(() => {
  createServer(async (req, res) => {
    try {
      const data = collection.find();
      res.statusCode = 200;
      res.end(JSON.stringify(data), 'utf8');
    } catch (err) {
      res.statusCode = 500;
      res.end();
    }
  }).listen(port, () => {
    console.log(`🚀  Server listening on port ${port}!`);
    setInterval(refresh, 30000);
  });
});

This code starts a basic HTTP server that will serve all our recipes as a JSON array, and update with new data every 30 seconds.


Google Sheets provides an easy-to-set-up and -manage way to fill some very basic database needs. You can automate scripts in the sheet to emulate calculated fields or validity checks when edits are made. Combined with an in-memory database, we can achieve reasonable performance, especially if we look into the costs (basically zero).

As I mentioned above, I did implement this database at a company I worked for. We had a catalog of cities, airports, and countries, plus some relations (an airport belongs to a city, etc.), which I validated on change with scripts and fancy drop-down menus. The benefit was that even non-technical personnel could easily access this data and make modifications (tag airports or cities, check boxes for cities that should or should not appear in certain features, etc.) without us coding a complex UI or buying third-party software. We then bundled all data in a build step and ran some basic verification. All in all, it worked seamlessly.


The complete project can be found on GitHub:

Jonathan Burke is a senior software engineer working at codecentric in Hamburg, Germany. He is a web application developer with experience in JavaScript / TypeScript and Node.js. He loves experimenting on the bleeding edge of technology.

