Scheduled queries support two runtime parameters: run_time, a timestamp that represents the scheduled run time (TransferRun.runTime) in UTC, and run_date, which represents the date of the run_time parameter in the following format: %Y-%m-%d; for example, 2018-01-01. This format is compatible with ingestion-time partitioned tables.

Scheduled queries support runtime parameters in the destination table name with a templating syntax.
The templating syntax supports basic string templating and time offsetting. Parameters are referenced in the following formats:
{run_date}
{run_time[+\-offset]|"time_format"}
Parameter | Purpose |
---|---|
run_date | This parameter is replaced by the date in format YYYYMMDD. |
run_time | This parameter supports the following properties: an optional time offset (expressed in hours h, minutes m, or seconds s, as in the examples below) and a time_format formatting string, such as "%Y%m%d". |

To use non-templating curly braces in your query, you can escape them as '\{' and '\}'. To use characters such as '"' or '|' in the time_format, for example "YYYY|MM|DD", you can escape them in the format string as: '\"' or '\|'.
run_time (UTC) | Templated parameter | Output destination table name |
---|---|---|
2018-02-15 00:00:00 | mytable | mytable |
2018-02-15 00:00:00 | mytable_{run_time|"%Y%m%d"} | mytable_20180215 |
2018-02-15 00:00:00 | mytable_{run_time+25h|"%Y%m%d"} | mytable_20180216 |
2018-02-15 00:00:00 | mytable_{run_time-1h|"%Y%m%d"} | mytable_20180214 |
2018-02-15 00:00:00 | mytable_{run_time+1.5h|"%Y%m%d%H"} or mytable_{run_time+90m|"%Y%m%d%H"} | mytable_2018021501 |
2018-02-15 00:00:00 | {run_time+97s|"%Y%m%d"}_mytable_{run_time+97s|"%H%M%S"} | 20180215_mytable_000137 |
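To make the offset arithmetic above concrete, here is a minimal Python sketch that expands run_time templates the way these examples describe. It is an illustration only, not part of any BigQuery tooling; the regex and helper names are invented for this example.

```python
import re
from datetime import datetime, timedelta

# Matches {run_time[+/-offset]|"time_format"}, e.g. {run_time+25h|"%Y%m%d"}.
TEMPLATE = re.compile(r'\{run_time([+-]\d+(?:\.\d+)?[hms])?\|"([^"]+)"\}')

UNIT_SECONDS = {"h": 3600, "m": 60, "s": 1}

def expand(template: str, run_time: datetime) -> str:
    """Replace each run_time template in `template` with a formatted time."""
    def repl(match: re.Match) -> str:
        offset, time_format = match.groups()
        shifted = run_time
        if offset:
            seconds = float(offset[1:-1]) * UNIT_SECONDS[offset[-1]]
            shifted += timedelta(seconds=seconds if offset[0] == "+" else -seconds)
        return shifted.strftime(time_format)
    return TEMPLATE.sub(repl, template)

run_time = datetime(2018, 2, 15, 0, 0, 0)  # UTC, as in the table above
print(expand('mytable_{run_time+25h|"%Y%m%d"}', run_time))  # mytable_20180216
print(expand('mytable_{run_time-1h|"%Y%m%d"}', run_time))   # mytable_20180214
```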
You can set up a scheduled query to authenticate as a service account. A service account is a special account associated with your Google Cloud project. The service account can run jobs, such as scheduled queries or batch processing pipelines, with its own service credentials rather than an end user's credentials.
Read more about authenticating with service accounts in Introduction to authentication.
You can set up the scheduled query with a service account. If you signed in with a federated identity, then a service account is required to create a transfer. If you signed in with a Google Account, then a service account for the transfer is optional.
You can update an existing scheduled query with the credentials of a service account with the bq command-line tool or Google Cloud console. For more information, see Update scheduled query credentials.
When you specify a CMEK with a transfer, the BigQuery Data Transfer Service applies the CMEK to any intermediate on-disk cache of ingested data so that the entire data transfer workflow is CMEK compliant.
You cannot update an existing transfer to add a CMEK if the transfer was not originally created with a CMEK. For example, you cannot change a destination table that was originally default encrypted to now be encrypted with CMEK. Conversely, you also cannot change a CMEK-encrypted destination table to have a different type of encryption.
You can update a CMEK for a transfer if the transfer configuration was originally created with a CMEK encryption. When you update a CMEK for a transfer configuration, the BigQuery Data Transfer Service propagates the CMEK to the destination tables at the next run of the transfer, where the BigQuery Data Transfer Service replaces any outdated CMEKs with the new CMEK during the transfer run. For more information, see Update a transfer.
You can also use project default keys. When you specify a project default key with a transfer, the BigQuery Data Transfer Service uses the project default key as the default key for any new transfer configurations.
For a description of the schedule syntax, see Formatting the schedule. For additional details, see the schedule field under Resource: TransferConfig.
Open the BigQuery page in the Google Cloud console.
Run the query that you're interested in. When you are satisfied with your results, click Schedule.
The scheduled query options open in the New scheduled query pane.
On the New scheduled query pane:
For Name for the scheduled query, enter a name, for example My scheduled query. The scheduled query name can be any value that you can identify later if you need to modify the query.
Optional: By default, the query is scheduled to run Daily. You can change the default schedule by selecting an option from the Repeats drop-down menu:
To specify a custom frequency, select Custom, then enter a Cron-like time specification in the Custom schedule field, for example every mon 23:30 or every 6 hours. For details about valid schedules, including custom intervals, see the schedule field under Resource: TransferConfig.
To change the start time, select the Start at set time option, then enter the start date and time.
To specify an end time, select the Schedule end time option, then enter the end date and time.
To save the query without a schedule, so you can run it on demand later, select On-demand in the Repeats menu.
For a GoogleSQL SELECT query, select the Set a destination table for query results option and provide the following information about the destination dataset.
For Destination table write preference, choose either Append to table to append data to the table or Overwrite table to overwrite the destination table.
Choose the Location Type.
If you have enabled the destination table for query results, you can select Automatic location selection to automatically select the location where the destination table resides.
Otherwise, choose the location where the data being queried is located.
Advanced options:
Optional: CMEK. If you use customer-managed encryption keys, you can select Customer-managed key under Advanced options. A list of your available CMEKs appears for you to choose from. For information about how customer-managed encryption keys (CMEKs) work with the BigQuery Data Transfer Service, see Specify encryption key with scheduled queries.
Authenticate as a service account. If you have one or more service accounts associated with your Google Cloud project, you can associate a service account with your scheduled query instead of using your user credentials. Under Scheduled query credential, click the menu to see a list of your available service accounts. A service account is required if you are signed in as a federated identity.
Additional configurations:
Optional: Check Send email notifications to allow email notifications of transfer run failures.
Optional: For Pub/Sub topic, enter your Pub/Sub topic name, for example: projects/myproject/topics/mytopic
.
Click Save.
Option 1: Use the bq query command.
To create a scheduled query, add the options --destination_table (or --target_dataset), --schedule, and --display_name to your bq query command.
bq query \
--display_name=name \
--destination_table=table \
--schedule=interval
Replace the following:
name. The display name for the scheduled query. The display name can be any value that you can identify later if you need to modify the query.
table. The destination table for the query results. --target_dataset is an alternative way to name the target dataset for the query results when used with DDL and DML queries. Use either --destination_table or --target_dataset, but not both.
interval. When used with bq query, makes the query a recurring scheduled query. A schedule for how often the query should run is required. For details about valid schedules, including custom intervals, see the schedule field under Resource: TransferConfig. Examples:
--schedule='every 24 hours'
--schedule='every 3 hours'
--schedule='every monday 09:00'
--schedule='1st sunday of sep,oct,nov 00:00'
Optional flags:
--project_id is your project ID. If --project_id isn't specified, the default project is used.
--replace overwrites the destination table with the query results after every run of the scheduled query. Any existing data is erased. For non-partitioned tables, the schema is also erased.
--append_table appends results to the destination table.
For DDL and DML queries, you can also supply the --location flag to specify a particular region for processing. If --location isn't specified, the nearest Google Cloud location is used.
For example, the following command creates a scheduled query named My Scheduled Query using the query SELECT 1 from mydataset.test. The destination table is mytable in the dataset mydataset. The scheduled query is created in the default project:
bq query \
--use_legacy_sql=false \
--destination_table=mydataset.mytable \
--display_name='My Scheduled Query' \
--schedule='every 24 hours' \
--replace=true \
'SELECT 1 FROM mydataset.test'
Option 2: Use the bq mk command.
Scheduled queries are a kind of transfer. To schedule a query, you can use the bq command-line tool to make a transfer configuration.
Queries must be in GoogleSQL dialect to be scheduled.
Enter the bq mk command and supply the following required flags:
--transfer_config
--data_source
--target_dataset (optional for DDL and DML queries)
--display_name
--params
Optional flags:
--project_id is your project ID. If --project_id isn't specified, the default project is used.
--schedule is how often you want the query to run. If --schedule isn't specified, the default is 'every 24 hours' based on creation time.
For DDL and DML queries, you can also supply the --location flag to specify a particular region for processing. If --location isn't specified, the nearest Google Cloud location is used.
--service_account_name is for authenticating your scheduled query with a service account instead of your individual user account.
--destination_kms_key specifies the key resource ID for the key if you use a customer-managed encryption key (CMEK) for this transfer. For information about how CMEKs work with the BigQuery Data Transfer Service, see Specify encryption key with scheduled queries.
bq mk \
--transfer_config \
--target_dataset=dataset \
--display_name=name \
--params='parameters' \
--data_source=data_source
Replace the following:
dataset. The target dataset for the transfer configuration.
name. The display name for the transfer configuration. The display name can be any value that you can identify later if you need to modify the query.
parameters. Contains the parameters for the created transfer configuration in JSON format, for example: --params='{"param":"param_value"}'. For a scheduled query, you must supply the query parameter. The destination_table_name_template parameter is the name of your destination table. For the write_disposition parameter, you can choose WRITE_TRUNCATE to truncate (overwrite) the destination table or WRITE_APPEND to append the query results to the destination table.
data_source. The data source: scheduled_query.
The --service_account_name flag is for authenticating with a service account instead of an individual user account.
The --destination_kms_key flag specifies the key resource ID for the Cloud KMS key, for example projects/project_name/locations/us/keyRings/key_ring_name/cryptoKeys/key_name.

For example, the following command creates a scheduled query transfer configuration named My Scheduled Query using the query SELECT 1 from mydataset.test. The destination table mytable is truncated for every write, and the target dataset is mydataset. The scheduled query is created in the default project, and authenticates as a service account:
bq mk \
--transfer_config \
--target_dataset=mydataset \
--display_name='My Scheduled Query' \
--params='{"query":"SELECT 1 from mydataset.test","destination_table_name_template":"mytable","write_disposition":"WRITE_TRUNCATE"}' \
--data_source=scheduled_query \
--service_account_name=abcdef-test-sa@abcdef-test.iam.gserviceaccount.com
The first time you run the command, you receive a message like the following:
[URL omitted] Please copy and paste the above URL into your web browser and follow the instructions to retrieve an authentication code.
Follow the instructions in the message and paste the authentication code on the command line.
Use the projects.locations.transferConfigs.create method and supply an instance of the TransferConfig resource.
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
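The Python sample itself is not reproduced on this page. As a rough sketch of what it does, the following code creates a scheduled query with the google-cloud-bigquery-datatransfer client; the project ID, dataset, query, and schedule are placeholder values to adapt.

```python
from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder values; replace with your own project, dataset, and query.
project_id = "my-project"
dataset_id = "mydataset"

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id=dataset_id,
    display_name="My Scheduled Query",
    data_source_id="scheduled_query",  # scheduled queries use this data source
    params={
        "query": "SELECT 1 FROM mydataset.test",
        "destination_table_name_template": "mytable",
        "write_disposition": "WRITE_TRUNCATE",
    },
    schedule="every 24 hours",
)

transfer_config = transfer_client.create_transfer_config(
    parent=transfer_client.common_project_path(project_id),
    transfer_config=transfer_config,
)
print(f"Created scheduled query: {transfer_config.name}")
```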
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
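Similarly, here is a sketch of creating the scheduled query so that it authenticates as a service account; the service account email below is the placeholder used elsewhere on this page.

```python
from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder values.
project_id = "my-project"
service_account_name = "abcdef-test-sa@abcdef-test.iam.gserviceaccount.com"

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="mydataset",
    display_name="My Scheduled Query",
    data_source_id="scheduled_query",
    params={
        "query": "SELECT 1 FROM mydataset.test",
        "destination_table_name_template": "mytable",
        "write_disposition": "WRITE_TRUNCATE",
    },
    schedule="every 24 hours",
)

# Passing service_account_name runs the transfer with the service account's
# credentials instead of the caller's.
transfer_config = transfer_client.create_transfer_config(
    bigquery_datatransfer.CreateTransferConfigRequest(
        parent=transfer_client.common_project_path(project_id),
        transfer_config=transfer_config,
        service_account_name=service_account_name,
    )
)
print(f"Created scheduled query: {transfer_config.name}")
```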
To view the status of your scheduled queries, in the navigation menu, click Scheduling and filter for Scheduled Query. Click a scheduled query to get more details about it.
Scheduled queries are a kind of transfer. To show the details of a scheduled query, you can first use the bq command-line tool to list your transfer configurations.
Enter the bq ls command and supply the transfer flag --transfer_config. The following flag is also required:
--transfer_location
For example:
bq ls \
--transfer_config \
--transfer_location=us
To show the details of a single scheduled query, enter the bq show command and supply the transfer_path for that scheduled query or transfer config.
For example:
bq show \
--transfer_config \
projects/862514376110/locations/us/transferConfigs/5dd12f26-0000-262f-bc38-089e0820fe38
Use the projects.locations.transferConfigs.list method and supply an instance of the TransferConfig resource.
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
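As a sketch of the Python version, the following lists transfer configurations and filters them to scheduled queries; the project ID is a placeholder.

```python
from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

project_id = "my-project"  # placeholder

# Restricting data_source_ids to scheduled_query returns only scheduled queries.
configs = transfer_client.list_transfer_configs(
    request=bigquery_datatransfer.ListTransferConfigsRequest(
        parent=transfer_client.common_project_path(project_id),
        data_source_ids=["scheduled_query"],
    )
)
for config in configs:
    print(f"{config.name}: {config.display_name}")
```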
To update a scheduled query, follow these steps:
Scheduled queries are a kind of transfer. To update a scheduled query, you can use the bq command-line tool to update the transfer configuration.
Enter the bq update command with the required --transfer_config flag.
Optional flags:
--project_id is your project ID. If --project_id isn't specified, the default project is used.
--schedule is how often you want the query to run. If --schedule isn't specified, the default is 'every 24 hours' based on creation time.
--service_account_name only takes effect if --update_credentials is also set. For more information, see Update scheduled query credentials.
--target_dataset (optional for DDL and DML queries) is an alternative way to name the target dataset for the query results.
--display_name is the name for the scheduled query.
--params contains the parameters for the created transfer configuration in JSON format. For example: --params='{"param":"param_value"}'.
--destination_kms_key specifies the key resource ID for the Cloud KMS key if you use a customer-managed encryption key (CMEK) for this transfer. For information about how customer-managed encryption keys (CMEKs) work with the BigQuery Data Transfer Service, see Specify encryption key with scheduled queries.
bq update \
--target_dataset=dataset \
--display_name=name \
--params='parameters' \
--transfer_config \
RESOURCE_NAME
Replace the following:
dataset. The target dataset for the transfer configuration. This parameter is optional for DDL and DML queries. It is required for all other queries.
name. The display name for the transfer configuration. The display name can be any value that you can identify later if you need to modify the query.
parameters. Contains the parameters for the created transfer configuration in JSON format, for example: --params='{"param":"param_value"}'. For a scheduled query, you must supply the query parameter. The destination_table_name_template parameter is the name of your destination table; this parameter is optional for DDL and DML queries and required for all other queries. For the write_disposition parameter, you can choose WRITE_TRUNCATE to truncate (overwrite) the destination table or WRITE_APPEND to append the query results to the destination table; this parameter is optional for DDL and DML queries and required for all other queries.
--destination_kms_key specifies the key resource ID for the Cloud KMS key, for example projects/project_name/locations/us/keyRings/key_ring_name/cryptoKeys/key_name.
RESOURCE_NAME. The transfer's resource name (also referred to as the transfer configuration). If you don't know the transfer's resource name, find it with: bq ls --transfer_config --transfer_location=location.

For example, the following command updates a scheduled query transfer configuration named My Scheduled Query using the query SELECT 1 from mydataset.test. The destination table mytable is truncated for every write, and the target dataset is mydataset:
bq update \
--target_dataset=mydataset \
--display_name='My Scheduled Query' \
--params='{"query":"SELECT 1 from mydataset.test","destination_table_name_template":"mytable","write_disposition":"WRITE_TRUNCATE"}' \
--transfer_config \
projects/myproject/locations/us/transferConfigs/1234a123-1234-1a23-1be9-12ab3c456de7
Use the projects.transferConfigs.patch method and supply the transfer's resource name using the transferConfig.name parameter. If you don't know the transfer's resource name, use the bq ls --transfer_config --transfer_location=location command to list all transfers, or call the projects.locations.transferConfigs.list method and supply the project ID using the parent parameter.
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
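A sketch of the Python version, updating only the display name of an existing configuration; the resource name is a placeholder, and the update mask limits which fields the call changes.

```python
from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder resource name of an existing transfer configuration.
transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd"

transfer_config = bigquery_datatransfer.TransferConfig(name=transfer_config_name)
transfer_config.display_name = "My Scheduled Query"

# Only the fields listed in the update mask are modified.
transfer_config = transfer_client.update_transfer_config(
    {
        "transfer_config": transfer_config,
        "update_mask": field_mask_pb2.FieldMask(paths=["display_name"]),
    }
)
print(f"Updated config: {transfer_config.name}")
```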
If you try to update a scheduled query you don't own, the update might fail with the following error message:
Cannot modify restricted parameters without taking ownership of the transfer configuration.
The owner of the scheduled query is the user associated with the scheduled query or the user who has access to the service account associated with the scheduled query. The associated user can be seen in the configuration details of the scheduled query. For information on how to update the scheduled query to take ownership, see Update scheduled query credentials. To grant users access to a service account, you must have the Service Account user role.
Any user who is not the owner of the scheduled query, but has access to all resources mentioned by the query, might still be allowed to update the query. This scenario is only supported if the query can be validated within a minute or two. Otherwise, you get the same error message mentioned previously. If the query is too complex, you can instead update the scheduled query credentials to take direct ownership of the scheduled query or to use a service account.
If you're scheduling an existing query, you might need to update the user credentials on the query. Credentials are automatically up to date for new scheduled queries.
Some other situations that could require updating credentials include the following:
You receive an INVALID_USER error when you attempt to schedule the query:
Error code 5 : Authentication failure: User Id not found. Error code: INVALID_USERID
You receive the following restricted parameters error when you attempt to update the query:
Cannot modify restricted parameters without taking ownership of the transfer configuration.
To refresh the existing credentials on a scheduled query:
Find and view the status of a scheduled query.
Click the MORE button and select Update credentials.
Allow 10 to 20 minutes for the change to take effect. You might need to clear your browser's cache.
Scheduled queries are a kind of transfer. To update the credentials of a scheduled query, you can use the bq command-line tool to update the transfer configuration.
Enter the bq update command and supply the transfer flag --transfer_config. The following flag is also required:
--update_credentials
Optional flag:
--service_account_name is for authenticating your scheduled query with a service account instead of your individual user account.

For example, the following command updates a scheduled query transfer configuration to authenticate as a service account:
bq update \
--update_credentials \
--service_account_name=abcdef-test-sa@abcdef-test.iam.gserviceaccount.com \
--transfer_config \
projects/myproject/locations/us/transferConfigs/1234a123-1234-1a23-1be9-12ab3c456de7
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
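A sketch of the Python version, pointing an existing scheduled query at a service account; the resource name and service account email are placeholders.

```python
from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder values.
transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd"
service_account_name = "abcdef-test-sa@abcdef-test.iam.gserviceaccount.com"

transfer_config = bigquery_datatransfer.TransferConfig(name=transfer_config_name)

# The update mask plus service_account_name swaps the transfer's credentials.
transfer_config = transfer_client.update_transfer_config(
    bigquery_datatransfer.UpdateTransferConfigRequest(
        transfer_config=transfer_config,
        update_mask=field_mask_pb2.FieldMask(paths=["service_account_name"]),
        service_account_name=service_account_name,
    )
)
print(f"Updated credentials for: {transfer_config.name}")
```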
In addition to scheduling a query to run in the future, you can also trigger immediate runs manually. Triggering an immediate run would be necessary if your query uses the run_date parameter and there were issues during a prior run.
For example, every day at 09:00 you query a source table for rows that match the current date. However, you find that data wasn't added to the source table for the last three days. In this situation, you can set the query to run on historical data within a date range that you specify. Your query runs using combinations of run_date and run_time parameters that correspond to the dates you configured in your scheduled query.
After setting up a scheduled query, here's how you can run the query by using a historical date range:
After clicking Schedule to save your scheduled query, you can click the Scheduled queries button to see the list of scheduled queries. Click any display name to see the query schedule's details. At the top right of the page, click Schedule backfill to specify a historical date range.
The chosen run times are all within your selected range, including the first date and excluding the last date.
Example 1
Your scheduled query is set to run every day at 09:00 Pacific Time. You're missing data from January 1, January 2, and January 3. Choose the following historic date range:
Start Time = 1/1/19
End Time = 1/4/19
Your query runs using run_date and run_time parameters that correspond to the following times (09:00 Pacific Standard Time is 17:00 UTC):
run_date = 20190101, run_time = 2019-01-01 17:00:00 UTC
run_date = 20190102, run_time = 2019-01-02 17:00:00 UTC
run_date = 20190103, run_time = 2019-01-03 17:00:00 UTC
Example 2
Your scheduled query is set to run every day at 23:00 Pacific Time. You're missing data from January 1, January 2, and January 3. Choose the following historic date range (later dates are chosen because UTC has a different date at 23:00 Pacific Time):
Start Time = 1/2/19
End Time = 1/5/19
Your query runs using run_date and run_time parameters that correspond to the following times (23:00 Pacific Standard Time is 07:00 UTC on the following day):
run_date = 20190102, run_time = 2019-01-02 07:00:00 UTC
run_date = 20190103, run_time = 2019-01-03 07:00:00 UTC
run_date = 20190104, run_time = 2019-01-04 07:00:00 UTC
After setting up manual runs, refresh the page to see them in the list of runs.
To manually run the query on a historical date range:
Enter the bq mk command and supply the transfer run flag --transfer_run. The following flags are also required:
--start_time
--end_time
bq mk \
--transfer_run \
--start_time='start_time' \
--end_time='end_time' \
resource_name
Replace the following:
start_time and end_time. Timestamps that end in Z or contain a valid time zone offset, for example 2017-05-25T00:00:00Z.
resource_name. The scheduled query's (or transfer's) resource name. The resource name is also known as the transfer configuration.

For example, the following command schedules a backfill for the scheduled query resource (or transfer configuration) projects/myproject/locations/us/transferConfigs/1234a123-1234-1a23-1be9-12ab3c456de7:
bq mk \
--transfer_run \
--start_time 2017-05-25T00:00:00Z \
--end_time 2017-05-25T00:00:00Z \
projects/myproject/locations/us/transferConfigs/1234a123-1234-1a23-1be9-12ab3c456de7
For more information, see bq mk --transfer_run.
Use the projects.locations.transferConfigs.scheduleRun method and supply a path of the TransferConfig resource.
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
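A sketch of the Python version, requesting backfill runs for the last few days; the resource name is a placeholder, and the client accepts Python datetime objects for the timestamps.

```python
import datetime

from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder resource name of the transfer configuration to backfill.
transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd"

now = datetime.datetime.now(datetime.timezone.utc)
start_time = now - datetime.timedelta(days=5)
end_time = now - datetime.timedelta(days=2)

# Requests one run for each scheduled interval in [start_time, end_time).
response = transfer_client.schedule_transfer_runs(
    parent=transfer_config_name,
    start_time=start_time,
    end_time=end_time,
)
for run in response.runs:
    print(f"Scheduled run for: {run.run_time}")
```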
To delete a scheduled query on the Scheduled queries page of the Google Cloud console, do the following:
On the Scheduled query details page, click Delete.
Alternatively, you can delete a scheduled query on the Scheduling page of the Google Cloud console:
Select Delete.
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
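A sketch of the Python version, deleting a transfer configuration by resource name; the name is a placeholder.

```python
from google.api_core import exceptions
from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder resource name of the transfer configuration to delete.
transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd"

try:
    transfer_client.delete_transfer_config(name=transfer_config_name)
except exceptions.NotFound:
    print("Transfer config not found.")
else:
    print(f"Deleted transfer config: {transfer_config_name}")
```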
To pause the scheduled runs of a selected query without deleting the schedule, you can disable the schedule.
To disable a schedule for a selected query, follow these steps:
Select Disable.
To enable a disabled scheduled query, click the Actions menu for the scheduled query that you want to enable, and then select Enable.

Scheduled queries are executed with the creator's credentials and project, as if you were executing the query yourself. Scheduled queries are always run as batch query jobs.
Although scheduled queries use features of BigQuery Data Transfer Service, they are not transfers and are not subject to the load jobs quota. Scheduled queries are subject to the same BigQuery quotas and limits as manual queries.
Scheduled queries are priced the same as manual BigQuery queries.
Scheduled queries are supported in the following locations.
Region description | Region name | Details |
---|---|---|
Columbus, Ohio | us-east5 | |
Dallas | us-south1 | |
Iowa | us-central1 | |
Las Vegas | us-west4 | |
Los Angeles | us-west2 | |
Mexico | northamerica-south1 | |
Montréal | northamerica-northeast1 | |
Northern Virginia | us-east4 | |
Oregon | us-west1 | |
Salt Lake City | us-west3 | |
São Paulo | southamerica-east1 | |
Santiago | southamerica-west1 | |
South Carolina | us-east1 | |
Toronto | northamerica-northeast2 | |
Region description | Region name | Details |
---|---|---|
Delhi | asia-south2 | |
Hong Kong | asia-east2 | |
Jakarta | asia-southeast2 | |
Melbourne | australia-southeast2 | |
Mumbai | asia-south1 | |
Osaka | asia-northeast2 | |
Seoul | asia-northeast3 | |
Singapore | asia-southeast1 | |
Sydney | australia-southeast1 | |
Taiwan | asia-east1 | |
Tokyo | asia-northeast1 | |
Region description | Region name | Details |
---|---|---|
Belgium | europe-west1 | |
Berlin | europe-west10 | |
Finland | europe-north1 | |
Frankfurt | europe-west3 | |
London | europe-west2 | |
Madrid | europe-southwest1 | |
Milan | europe-west8 | |
Netherlands | europe-west4 | |
Paris | europe-west9 | |
Stockholm | europe-north2 | |
Turin | europe-west12 | |
Warsaw | europe-central2 | |
Zürich | europe-west6 | |
Region description | Region name | Details |
---|---|---|
Dammam | me-central2 | |
Doha | me-central1 | |
Tel Aviv | me-west1 |
Region description | Region name | Details |
---|---|---|
Johannesburg | africa-south1 |
Multi-region description | Multi-region name |
---|---|
Data centers within member states of the European Union1 | EU |
Data centers in the United States2 | US |
1 Data located in the EU multi-region is only stored in one of the following locations: europe-west1 (Belgium) or europe-west4 (Netherlands). The exact location in which the data is stored and processed is determined automatically by BigQuery.
2 Data located in the US multi-region is only stored in one of the following locations: us-central1 (Iowa), us-west1 (Oregon), or us-central2 (Oklahoma). The exact location in which the data is stored and processed is determined automatically by BigQuery.
For an example of how to use the @run_date and @run_time parameters, see Creating table snapshots with a scheduled query.