Getting started with Spanner in Node.js


Objectives

This tutorial walks you through the following steps using the Spanner client library for Node.js:

  • Create a Spanner instance and database.
  • Write, read, and execute SQL queries on data in the database.
  • Update the database schema.
  • Update data using a read-write transaction.
  • Add a secondary index to the database.
  • Use the index to read and execute SQL queries on data.
  • Retrieve data using a read-only transaction.

Costs

This tutorial uses Spanner, which is a billable component of Google Cloud. For information on the cost of using Spanner, see Pricing.

Before you begin

Complete the steps described in Set up, which cover creating and setting a default Google Cloud project, enabling billing, enabling the Cloud Spanner API, and setting up OAuth 2.0 to get authentication credentials to use the Cloud Spanner API.

In particular, make sure that you run gcloud auth application-default login to set up your local development environment with authentication credentials.

Prepare your local Node.js environment

  1. Follow the steps to Set Up a Node.js Development Environment

  2. Clone the sample app repository to your local machine:

    git clone https://github.com/googleapis/nodejs-spanner

    Alternatively, you can download the sample as a zip file and extract it.

  3. Change to the directory that contains the Spanner sample code:

    cd samples/
  4. Install dependencies using npm:

    npm install

Create an instance

When you first use Spanner, you must create an instance, which is an allocation of resources that are used by Spanner databases. When you create an instance, you choose an instance configuration, which determines where your data is stored, and also the number of nodes to use, which determines the amount of serving and storage resources in your instance.

Execute the following command to create a Spanner instance in the region us-central1 with 1 node:

gcloud spanner instances create test-instance --config=regional-us-central1 \
  --description="Test Instance" --nodes=1

Note that this creates an instance with the following characteristics:

  • Instance ID test-instance
  • Display name Test Instance
  • Instance configuration regional-us-central1 (Regional configurations store data in one region, while multi-region configurations distribute data across multiple regions. For more information, see About instances.)
  • Node count of 1 (node_count corresponds to the amount of serving and storage resources available to databases in the instance. Learn more in Nodes and processing units.)

You should see:

Creating instance...done.

Look through sample files

The samples repository contains a sample that shows how to use Spanner with Node.js.

Take a look through the samples/schema.js file, which shows how to create a database and modify a database schema. The data uses the example schema shown in the Schema and data model page.

Create a database

GoogleSQL

node schema.js createDatabase test-instance example-db MY_PROJECT_ID

PostgreSQL

node schema.js createPgDatabase test-instance example-db MY_PROJECT_ID

You should see:

Created database example-db on instance test-instance.

The following code creates a database and two tables in the database.

GoogleSQL

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const createSingersTableStatement = `
  CREATE TABLE Singers (
    SingerId   INT64 NOT NULL,
    FirstName  STRING(1024),
    LastName   STRING(1024),
    SingerInfo BYTES(MAX),
    FullName   STRING(2048) AS (ARRAY_TO_STRING([FirstName, LastName], " ")) STORED,
  ) PRIMARY KEY (SingerId)`;
const createAlbumsTableStatement = `
  CREATE TABLE Albums (
    SingerId   INT64 NOT NULL,
    AlbumId    INT64 NOT NULL,
    AlbumTitle STRING(MAX)
  ) PRIMARY KEY (SingerId, AlbumId),
    INTERLEAVE IN PARENT Singers ON DELETE CASCADE`;

// Creates a new database
try {
  const [operation] = await databaseAdminClient.createDatabase({
    createStatement: 'CREATE DATABASE `' + databaseId + '`',
    extraStatements: [
      createSingersTableStatement,
      createAlbumsTableStatement,
    ],
    parent: databaseAdminClient.instancePath(projectId, instanceId),
  });

  console.log(`Waiting for creation of ${databaseId} to complete...`);
  await operation.promise();

  console.log(`Created database ${databaseId} on instance ${instanceId}.`);
} catch (err) {
  console.error('ERROR:', err);
}

PostgreSQL

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const instanceId = 'my-instance';
// const databaseId = 'my-database';
// const projectId = 'my-project-id';

// Imports the Google Cloud client library
const {Spanner, protos} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

async function createPgDatabase() {
  // Creates a PostgreSQL database. PostgreSQL create requests may not contain
  // any additional DDL statements. We need to execute these separately after
  // the database has been created.
  const [operationCreate] = await databaseAdminClient.createDatabase({
    createStatement: 'CREATE DATABASE "' + databaseId + '"',
    parent: databaseAdminClient.instancePath(projectId, instanceId),
    databaseDialect:
      protos.google.spanner.admin.database.v1.DatabaseDialect.POSTGRESQL,
  });

  console.log(`Waiting for operation on ${databaseId} to complete...`);
  await operationCreate.promise();

  const [metadata] = await databaseAdminClient.getDatabase({
    name: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
  });
  console.log(
    `Created database ${databaseId} on instance ${instanceId} with dialect ${metadata.databaseDialect}.`
  );

  // Create a couple of tables using a separate request. We must use
  // PostgreSQL style DDL as the database has been created with the
  // PostgreSQL dialect.
  const statements = [
    `CREATE TABLE Singers (
       SingerId   bigint NOT NULL,
       FirstName  varchar(1024),
       LastName   varchar(1024),
       SingerInfo bytea,
       FullName   character varying(2048)
         GENERATED ALWAYS AS (FirstName || ' ' || LastName) STORED,
       PRIMARY KEY (SingerId)
     );
     CREATE TABLE Albums (
       AlbumId    bigint NOT NULL,
       SingerId   bigint NOT NULL REFERENCES Singers (SingerId),
       AlbumTitle text,
       PRIMARY KEY (AlbumId)
     );`,
  ];
  const [operationUpdateDDL] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
    statements: statements,
  });
  await operationUpdateDDL.promise();
  console.log('Updated schema');
}
createPgDatabase();

The next step is to write data to your database.

Create a database client

Before you can do reads or writes, you must create a Database:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({projectId});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// The query to execute
const query = {
  sql: 'SELECT 1',
};

// Execute a simple SQL statement
const [rows] = await database.run(query);
console.log(`Query: ${rows.length} found.`);
rows.forEach(row => console.log(row));

You can think of a Database as a database connection: all of your interactions with Spanner must go through a Database. Typically you create a Database when your application starts up, then you re-use that Database to read, write, and execute transactions. Each client uses resources in Spanner.

If you create multiple clients in the same app, you should call Database.close() to clean up the client's resources, including network connections, as soon as it is no longer needed.

Read more in the Database reference.

Write data with DML

You can insert data using Data Manipulation Language (DML) in a read-write transaction.

You use the runUpdate() method to execute a DML statement.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

database.runTransaction(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  try {
    const [rowCount] = await transaction.runUpdate({
      sql: `INSERT Singers (SingerId, FirstName, LastName) VALUES
        (12, 'Melissa', 'Garcia'),
        (13, 'Russell', 'Morales'),
        (14, 'Jacqueline', 'Long'),
        (15, 'Dylan', 'Shaw')`,
    });
    console.log(`${rowCount} records inserted.`);

    await transaction.commit();
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    // Close the database when finished.
    database.close();
  }
});

Run the sample using the writeUsingDml argument.

node dml.js writeUsingDml test-instance example-db MY_PROJECT_ID

You should see:

4 records inserted.

Write data with mutations

You can also insert data using mutations.

You write data using a Table object. The Table.insert() method adds new rows to the table. All inserts in a single batch are applied atomically.

This code shows how to write the data using mutations:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Instantiate Spanner table objects
const singersTable = database.table('Singers');
const albumsTable = database.table('Albums');

// Inserts rows into the Singers table
// Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so
// they must be converted to strings before being inserted as INT64s
try {
  await singersTable.insert([
    {SingerId: '1', FirstName: 'Marc', LastName: 'Richards'},
    {SingerId: '2', FirstName: 'Catalina', LastName: 'Smith'},
    {SingerId: '3', FirstName: 'Alice', LastName: 'Trentor'},
    {SingerId: '4', FirstName: 'Lea', LastName: 'Martin'},
    {SingerId: '5', FirstName: 'David', LastName: 'Lomond'},
  ]);

  await albumsTable.insert([
    {SingerId: '1', AlbumId: '1', AlbumTitle: 'Total Junk'},
    {SingerId: '1', AlbumId: '2', AlbumTitle: 'Go, Go, Go'},
    {SingerId: '2', AlbumId: '1', AlbumTitle: 'Green'},
    {SingerId: '2', AlbumId: '2', AlbumTitle: 'Forever Hold your Peace'},
    {SingerId: '2', AlbumId: '3', AlbumTitle: 'Terrified'},
  ]);

  console.log('Inserted data.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  await database.close();
}

Run the sample using the insert argument.

node crud.js insert test-instance example-db MY_PROJECT_ID

You should see:

Inserted data.

Query data using SQL

Spanner supports a SQL interface for reading data, which you can access on the command line using the Google Cloud CLI or programmatically using the Spanner client library for Node.js.

On the command line

Execute the following SQL statement to read the values of all columns from the Albums table:

gcloud spanner databases execute-sql example-db --instance=test-instance \
  --sql='SELECT SingerId, AlbumId, AlbumTitle FROM Albums'

The result should be:

SingerId  AlbumId  AlbumTitle
1         1        Total Junk
1         2        Go, Go, Go
2         1        Green
2         2        Forever Hold your Peace
2         3        Terrified

Use the Spanner client library for Node.js

In addition to executing a SQL statement on the command line, you can issue the same SQL statement programmatically using the Spanner client library for Node.js.

Use Database.run() to run the SQL query.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: 'SELECT SingerId, AlbumId, AlbumTitle FROM Albums',
};

// Queries rows from the Albums table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  await database.close();
}

Here's how to issue the query and access the data:

node crud.js query test-instance example-db MY_PROJECT_ID

You should see the following result:

SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified

Query using a SQL parameter

If your application has a frequently executed query, you can improve its performance by parameterizing it. The resulting parametric query can be cached and reused, which reduces compilation costs. For more information, see Use query parameters to speed up frequently executed queries.

Here is an example of using a parameter in the WHERE clause to query records containing a specific value for LastName.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: `SELECT SingerId, FirstName, LastName FROM Singers
        WHERE LastName = @lastName`,
  params: {
    lastName: 'Garcia',
  },
};

// Queries rows from the Singers table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, FirstName: ${json.FirstName}, LastName: ${json.LastName}`
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Here's how to issue the query and access the data:

node dml.js queryWithParameter test-instance example-db MY_PROJECT_ID

You should see the following result:

SingerId: 12, FirstName: Melissa, LastName: Garcia
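When the client can't infer a parameter's Spanner type from its JavaScript value (most commonly when the value is null), you can state the type explicitly. This is a minimal sketch, assuming the query object's `types` field that the Node.js client accepts alongside `params`:

```javascript
// Sketch: the same parameterized query with an explicit parameter type.
// The `types` map tells Spanner how to interpret @lastName even when the
// JavaScript value (here a hypothetical null) carries no type information.
const query = {
  sql: `SELECT SingerId, FirstName, LastName FROM Singers
        WHERE LastName = @lastName`,
  params: {lastName: null}, // hypothetical null search value
  types: {lastName: 'string'},
};

// Run it exactly as before:
// const [rows] = await database.run(query);
```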

Read data using the read API

In addition to Spanner's SQL interface, Spanner also supports a read interface.

Use Table.read() to read rows from the database. Use a KeySet object to define a collection of keys and key ranges to read.

Here's how to read the data:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Reads rows from the Albums table
const albumsTable = database.table('Albums');

const query = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  keySet: {
    all: true,
  },
};

try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  await database.close();
}

Run the sample using the read argument.

node crud.js read test-instance example-db MY_PROJECT_ID

You should see output similar to:

SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
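The sample above reads every row with `keySet: {all: true}`, but a read request can also target specific keys and key ranges. A minimal sketch, assuming the `keys` and `ranges` fields of the read request (key values are from this tutorial's sample data):

```javascript
// Sketch: a read request that names exact primary keys and a key range
// instead of scanning the whole table.
const query = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  // Exact (SingerId, AlbumId) primary keys to fetch.
  keys: [
    ['1', '1'],
    ['2', '3'],
  ],
  // A range over the first key column: every album whose SingerId is 2.
  ranges: [
    {startClosed: ['2'], endClosed: ['2']},
  ],
};

// Pass it to the table as in the sample above:
// const [rows] = await albumsTable.read(query);
```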

Update the database schema

Assume you need to add a new column called MarketingBudget to the Albums table. Adding a new column to an existing table requires an update to your database schema. Spanner supports schema updates to a database while the database continues to serve traffic. Schema updates don't require taking the database offline and they don't lock entire tables or columns; you can continue writing data to the database during the schema update. Read more about supported schema updates and schema change performance in Make schema updates.

Add a column

You can add a column on the command line using the Google Cloud CLI or programmatically using the Spanner client library for Node.js.

On the command line

Use the following ALTER TABLE command to add the new column to the table:

GoogleSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='ALTER TABLE Albums ADD COLUMN MarketingBudget INT64'

PostgreSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='ALTER TABLE Albums ADD COLUMN MarketingBudget BIGINT'

You should see:

Schema updating...done.

Use the Spanner client library for Node.js

Use Database.updateSchema to modify the schema:

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

// Adds the new column to the Albums table
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
    statements: ['ALTER TABLE Albums ADD COLUMN MarketingBudget INT64'],
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the MarketingBudget column.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the spanner client when finished.
  // The databaseAdminClient does not require explicit closure. The closure of
  // the Spanner client will automatically close the databaseAdminClient.
  spanner.close();
}

Run the sample using the addColumn argument.

node schema.js addColumn test-instance example-db MY_PROJECT_ID

You should see:

Added the MarketingBudget column.

Write data to the new column

The following code writes data to the new column. It sets MarketingBudget to 100000 for the row keyed by Albums(1, 1) and to 500000 for the row keyed by Albums(2, 2).

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Update a row in the Albums table
// Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so they
// must be converted to strings before being inserted as INT64s
const albumsTable = database.table('Albums');

try {
  await albumsTable.update([
    {SingerId: '1', AlbumId: '1', MarketingBudget: '100000'},
    {SingerId: '2', AlbumId: '2', MarketingBudget: '500000'},
  ]);
  console.log('Updated data.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the update argument.

node crud.js update test-instance example-db MY_PROJECT_ID

You should see:

Updated data.

You can also execute a SQL query or a read call to fetch the values that you just wrote.

Here's the code to execute the query:

// This sample uses the `MarketingBudget` column. You can add the column
// by running the `add_column` sample or by running this DDL statement against
// your database:
//   ALTER TABLE Albums ADD COLUMN MarketingBudget INT64

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: 'SELECT SingerId, AlbumId, MarketingBudget FROM Albums',
};

// Queries rows from the Albums table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, MarketingBudget: ${
        json.MarketingBudget ? json.MarketingBudget : null
      }`
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

To execute this query, run the sample using the queryNewColumn argument.

node schema.js queryNewColumn test-instance example-db MY_PROJECT_ID

You should see:

SingerId: 1, AlbumId: 1, MarketingBudget: 100000
SingerId: 1, AlbumId: 2, MarketingBudget: null
SingerId: 2, AlbumId: 1, MarketingBudget: null
SingerId: 2, AlbumId: 2, MarketingBudget: 500000
SingerId: 2, AlbumId: 3, MarketingBudget: null

Update data

You can update data using DML in a read-write transaction.

You use the runUpdate() method to execute a DML statement.

// This sample transfers 200,000 from the MarketingBudget field
// of the second Album to the first Album, as long as the second
// Album has enough money in its budget. Make sure to run the
// addColumn and updateData samples first (in that order).

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const transferAmount = 200000;

database.runTransaction((err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  let firstBudget, secondBudget;
  const queryOne = `SELECT MarketingBudget FROM Albums
    WHERE SingerId = 2 AND AlbumId = 2`;
  const queryTwo = `SELECT MarketingBudget FROM Albums
    WHERE SingerId = 1 AND AlbumId = 1`;

  Promise.all([
    // Reads the second album's budget
    transaction.run(queryOne).then(results => {
      // Gets second album's budget
      const rows = results[0].map(row => row.toJSON());
      secondBudget = rows[0].MarketingBudget;
      console.log(`The second album's marketing budget: ${secondBudget}`);

      // Makes sure the second album's budget is large enough
      if (secondBudget < transferAmount) {
        throw new Error(
          `The second album's budget (${secondBudget}) is less than the transfer amount (${transferAmount}).`
        );
      }
    }),
    // Reads the first album's budget
    transaction.run(queryTwo).then(results => {
      // Gets first album's budget
      const rows = results[0].map(row => row.toJSON());
      firstBudget = rows[0].MarketingBudget;
      console.log(`The first album's marketing budget: ${firstBudget}`);
    }),
  ])
    .then(() => {
      // Transfers the budgets between the albums
      firstBudget += transferAmount;
      secondBudget -= transferAmount;

      // Updates the database
      // Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so they
      // must be converted (back) to strings before being inserted as INT64s.
      return transaction
        .runUpdate({
          sql: `UPDATE Albums SET MarketingBudget = @Budget
            WHERE SingerId = 1 and AlbumId = 1`,
          params: {
            Budget: firstBudget,
          },
        })
        .then(() =>
          transaction.runUpdate({
            sql: `UPDATE Albums SET MarketingBudget = @Budget
              WHERE SingerId = 2 and AlbumId = 2`,
            params: {
              Budget: secondBudget,
            },
          })
        );
    })
    .then(() => {
      // Commits the transaction and sends the changes to the database
      return transaction.commit();
    })
    .then(() => {
      console.log(
        `Successfully executed read-write transaction using DML to transfer $${transferAmount} from Album 2 to Album 1.`
      );
    })
    .then(() => {
      // Closes the database when finished
      database.close();
    });
});

Run the sample using the writeWithTransactionUsingDml argument.

node dml.js writeWithTransactionUsingDml test-instance example-db MY_PROJECT_ID

You should see:

Successfully executed read-write transaction using DML to transfer $200000 from Album 2 to Album 1.

Use a secondary index

Suppose you want to fetch all rows of Albums that have AlbumTitle values in a certain range. You could read all values from the AlbumTitle column using a SQL statement or a read call, and then discard the rows that don't meet the criteria, but doing this full table scan is expensive, especially for tables with a lot of rows. Instead, you can speed up the retrieval of rows when searching by non-primary-key columns by creating a secondary index on the table.

Adding a secondary index to an existing table requires a schema update. Like other schema updates, Spanner supports adding an index while the database continues to serve traffic. Spanner automatically backfills the index with your existing data. Backfills might take a few minutes to complete, but you don't need to take the database offline or avoid writing to the indexed table during this process. For more details, see Add a secondary index.

After you add a secondary index, Spanner automatically uses it for SQL queries that are likely to run faster with the index. If you use the read interface, you must specify the index that you want to use.

Add a secondary index

You can add an index on the command line using the gcloud CLI or programmatically using the Spanner client library for Node.js.

On the command line

Use the following CREATE INDEX command to add an index to the database:

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='CREATE INDEX AlbumsByAlbumTitle ON Albums(AlbumTitle)'

You should see:

Schema updating...done.

Using the Spanner client library for Node.js

Use Database.updateSchema() to add an index:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const request = ['CREATE INDEX AlbumsByAlbumTitle ON Albums(AlbumTitle)'];

// Creates a new index in the database
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
    statements: request,
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the AlbumsByAlbumTitle index.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the spanner client when finished.
  // The databaseAdminClient does not require explicit closure. The closure of
  // the Spanner client will automatically close the databaseAdminClient.
  spanner.close();
}

Run the sample using the createIndex argument.

node indexing.js createIndex test-instance example-db MY_PROJECT_ID

Adding an index can take a few minutes. After the index is added, you should see:

Added the AlbumsByAlbumTitle index.

Read using the index

For SQL queries, Spanner automatically uses an appropriate index. In the read interface, you must specify the index in your request.

To use the index in the read interface, use the Table.read() method.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const albumsTable = database.table('Albums');

const query = {
  columns: ['AlbumId', 'AlbumTitle'],
  keySet: {
    all: true,
  },
  index: 'AlbumsByAlbumTitle',
};

// Reads the Albums table using an index
try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(`AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`);
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the readIndex argument.

node indexing.js readIndex test-instance example-db MY_PROJECT_ID

You should see:

AlbumId: 2, AlbumTitle: Forever Hold your Peace
AlbumId: 2, AlbumTitle: Go, Go, Go
AlbumId: 1, AlbumTitle: Green
AlbumId: 3, AlbumTitle: Terrified
AlbumId: 1, AlbumTitle: Total Junk
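For SQL queries, you normally let Spanner choose the index, but GoogleSQL also lets you pin a query to a specific one with a FORCE_INDEX table hint. A minimal sketch using the AlbumsByAlbumTitle index created above (the title range values are hypothetical):

```javascript
// Sketch: pinning a SQL query to the AlbumsByAlbumTitle index with a
// GoogleSQL FORCE_INDEX table hint, instead of relying on the optimizer.
const query = {
  sql: `SELECT AlbumId, AlbumTitle
        FROM Albums@{FORCE_INDEX=AlbumsByAlbumTitle}
        WHERE AlbumTitle >= @startTitle AND AlbumTitle < @endTitle`,
  params: {startTitle: 'Aardvark', endTitle: 'Goo'}, // hypothetical range
};

// Run with the database client as in the earlier query samples:
// const [rows] = await database.run(query);
```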

Add an index for index-only reads

You might have noticed that the previous read example doesn't include the MarketingBudget column. This is because Spanner's read interface doesn't support joining an index with a data table to look up values that are not stored in the index.

Create an alternate definition of AlbumsByAlbumTitle that stores a copy of MarketingBudget in the index.

On the command line

GoogleSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) STORING (MarketingBudget)'

PostgreSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) INCLUDE (MarketingBudget)'

Adding an index can take a few minutes. After the index is added, you should see:

Schema updating...done.

Using the Spanner client library for Node.js

Use Database.updateSchema() to add an index with a STORING clause:

// "Storing" indexes store copies of the columns they index
// This speeds up queries, but takes more space compared to normal indexes
// See the link below for more information:
// https://cloud.google.com/spanner/docs/secondary-indexes#storing_clause

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const request = [
  'CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) STORING (MarketingBudget)',
];

// Creates a new index in the database
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
    statements: request,
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the AlbumsByAlbumTitle2 index.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the spanner client when finished.
  // The databaseAdminClient does not require explicit closure. The closure of
  // the Spanner client will automatically close the databaseAdminClient.
  spanner.close();
}

Run the sample using the createStoringIndex argument.

node indexing.js createStoringIndex test-instance example-db MY_PROJECT_ID

You should see:

Added the AlbumsByAlbumTitle2 index.

Now you can execute a read that fetches all AlbumId, AlbumTitle, and MarketingBudget columns from the AlbumsByAlbumTitle2 index:

// "Storing" indexes store copies of the columns they index
// This speeds up queries, but takes more space compared to normal indexes
// See the link below for more information:
// https://cloud.google.com/spanner/docs/secondary-indexes#storing_clause

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const albumsTable = database.table('Albums');

const query = {
  columns: ['AlbumId', 'AlbumTitle', 'MarketingBudget'],
  keySet: {
    all: true,
  },
  index: 'AlbumsByAlbumTitle2',
};

// Reads the Albums table using a storing index
try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    let rowString = `AlbumId: ${json.AlbumId}`;
    rowString += `, AlbumTitle: ${json.AlbumTitle}`;
    if (json.MarketingBudget) {
      rowString += `, MarketingBudget: ${json.MarketingBudget}`;
    }
    console.log(rowString);
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the readStoringIndex argument.

node indexing.js readStoringIndex test-instance example-db MY_PROJECT_ID

You should see output similar to:

AlbumId: 2, AlbumTitle: Forever Hold your Peace, MarketingBudget: 300000
AlbumId: 2, AlbumTitle: Go, Go, Go, MarketingBudget: null
AlbumId: 1, AlbumTitle: Green, MarketingBudget: null
AlbumId: 3, AlbumTitle: Terrified, MarketingBudget: null
AlbumId: 1, AlbumTitle: Total Junk, MarketingBudget: 300000
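You can also fetch the same columns with SQL instead of the read API by using the `FORCE_INDEX` table hint, which directs Spanner to the storing index. The following is a minimal sketch, not part of the sample app: the `runStoringIndexQuery` helper is illustrative, and `database` is assumed to be a `Database` handle obtained as in the samples above. Because the index stores `MarketingBudget`, Spanner can answer this query from the index alone.

```javascript
// SQL query with a table hint directing Spanner to the storing index.
// Spanner answers it from AlbumsByAlbumTitle2 without reading the base table,
// since the index stores MarketingBudget alongside the indexed column.
const storingIndexQuery = {
  sql: `SELECT AlbumId, AlbumTitle, MarketingBudget
        FROM Albums@{FORCE_INDEX=AlbumsByAlbumTitle2}
        ORDER BY AlbumTitle`,
};

// `database` is assumed to be a Database handle from @google-cloud/spanner,
// e.g. spanner.instance(instanceId).database(databaseId).
async function runStoringIndexQuery(database) {
  const [rows] = await database.run(storingIndexQuery);
  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}, ` +
        `MarketingBudget: ${json.MarketingBudget}`
    );
  });
}
```

The output should list the same rows as the read-API sample, ordered by album title.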

Retrieve data using read-only transactions

Suppose you want to execute more than one read at the same timestamp. Read-only transactions observe a consistent prefix of the transaction commit history, so your application always gets consistent data. Use Database.getSnapshot() to execute read-only transactions.

The following shows how to run a query and perform a read in the same read-only transaction:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Gets a transaction object that captures the database state
// at a specific point in time
database.getSnapshot(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  const queryOne = 'SELECT SingerId, AlbumId, AlbumTitle FROM Albums';

  try {
    // Read #1, using SQL
    const [qOneRows] = await transaction.run(queryOne);

    qOneRows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`
      );
    });

    const queryTwo = {
      columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
    };

    // Read #2, using the `read` method. Even if changes occur
    // in-between the reads, the transaction ensures that both
    // return the same data.
    const [qTwoRows] = await transaction.read('Albums', queryTwo);

    qTwoRows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`
      );
    });

    console.log('Successfully executed read-only transaction.');
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    transaction.end();
    // Close the database when finished.
    await database.close();
  }
});

Run the sample using the readOnly argument.

node transaction.js readOnly test-instance example-db MY_PROJECT_ID

You should see output similar to:

SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
Successfully executed read-only transaction.

Clean up

To avoid incurring additional charges to your Cloud Billing account for the resources used in this tutorial, drop the database and delete the instance that you created.

Delete the database

If you delete an instance, all databases within it are automatically deleted. This step shows how to delete a database without deleting an instance (you would still incur charges for the instance).

On the command line

gcloud spanner databases delete example-db --instance=test-instance
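If you prefer to stay in Node.js, the client library can drop the database too. This is a minimal sketch rather than part of the sample app: the `dropExampleDatabase` helper is illustrative, and `spanner` is assumed to be a `Spanner` client constructed as in the samples above.

```javascript
// `spanner` is assumed to be a Spanner client from @google-cloud/spanner.
// Database#delete drops the named database; the instance itself is untouched,
// so instance charges continue until the instance is deleted.
async function dropExampleDatabase(spanner, instanceId, databaseId) {
  const database = spanner.instance(instanceId).database(databaseId);
  await database.delete();
  console.log(`Dropped database ${databaseId}.`);
}
```

For this tutorial you would call it as `dropExampleDatabase(spanner, 'test-instance', 'example-db')`.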

Using the Google Cloud console

  1. Go to the Spanner Instances page in the Google Cloud console.

    Go to the Instances page

  2. Click the instance.

  3. Click the database that you want to delete.

  4. In the Database details page, click Delete.

  5. Confirm that you want to delete the database and click Delete.

Delete the instance

Deleting an instance automatically drops all databases created in that instance.

On the command line

gcloud spanner instances delete test-instance
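The same step can be done from Node.js. As with the database sketch above, the `deleteTestInstance` helper below is illustrative, and `spanner` is assumed to be a `Spanner` client constructed as in the samples.

```javascript
// `spanner` is assumed to be a Spanner client from @google-cloud/spanner.
// Instance#delete removes the instance and, with it, every database it
// contains, which stops all billing for the instance's resources.
async function deleteTestInstance(spanner, instanceId) {
  const instance = spanner.instance(instanceId);
  await instance.delete();
  console.log(`Deleted instance ${instanceId}.`);
}
```

For this tutorial you would call it as `deleteTestInstance(spanner, 'test-instance')`.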

Using the Google Cloud console

  1. Go to the Spanner Instances page in the Google Cloud console.

    Go to the Instances page

  2. Click your instance.

  3. Click Delete.

  4. Confirm that you want to delete the instance and click Delete.

What's next