# BigQuery - Inserting rows into BigQuery

Inserting rows into BigQuery is useful when your data source produces a large volume of data that must be streamed into BigQuery frequently. Workato provides two actions to do so: **Insert row** and **Insert rows in batches**.

For data sources that allow you to bulk export data, and if you only require the data at longer intervals (hours to days), consider our load file actions instead; they guarantee faster ingestion speeds on a per-row basis and lower task counts.
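For context, the load file route corresponds to a BigQuery load job rather than a streaming insert. Below is a minimal sketch of an equivalent load job using the google-cloud-bigquery Python client; the project, table, and file names are hypothetical, and Workato's load file actions issue this kind of call for you.

```python
from google.cloud import bigquery

# Hypothetical project and table names for illustration.
client = bigquery.Client(project="my-project")
table_id = "my-project.my_dataset.my_table"

# A load job ingests a bulk export in one operation, which is why
# it is cheaper per row than streaming inserts.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # infer the schema from the file
)

with open("export.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file, table_id, job_config=job_config
    )

load_job.result()  # blocks until the load job completes
```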

## Insert row

This action inserts a single row into a table in BigQuery via streaming. There is no limit to the number of rows you can stream per day. Streamed data can take up to 90 minutes to become available for copy and export operations.


### Input fields

| Field | Description |
| --- | --- |
| Project | The project available in the connection to be billed for the query. |
| Dataset | The dataset from which the action or trigger pulls the possible tables. |
| Table | The table inside the dataset. |
| Table Fields | Required only if **Table** is a datapill. Declares the columns of the table; the schema must be the same across all possible values of the datapill. |
| Ignore schema mismatch | If set to **No**, streamed values that do not match the expected data type raise an error. Set to **Yes** to ignore such rows. |
| Fields | The columns of the table you have selected. |
| Insert ID | Used to deduplicate rows when streaming. BigQuery does not stream a row again if its insert ID matches one already sent. |

### Output fields

| Field | Description |
| --- | --- |
| Errors | Contains all errors that occurred while streaming this row. Use this to check whether the insert failed so you can stream the row again. |
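This action maps onto BigQuery's streaming insert API. A minimal sketch of the same operation with the google-cloud-bigquery Python client follows; the table name, row fields, and insert ID are hypothetical, and Workato issues this call for you.

```python
from google.cloud import bigquery

# Hypothetical project, table, and row values for illustration.
client = bigquery.Client(project="my-project")
table_id = "my-project.my_dataset.my_table"

row = {"id": 1, "name": "Ada"}

# row_ids carries the insert ID; BigQuery uses it to deduplicate
# if the same row is streamed more than once.
errors = client.insert_rows_json(table_id, [row], row_ids=["order-1"])

if errors:
    # Mirrors the action's "Errors" output: inspect it and
    # stream the row again if the insert failed.
    print(f"Insert failed: {errors}")
```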

## Insert rows in batches

This action inserts a batch of rows into a table in BigQuery via streaming. There is no limit to the number of rows you can stream per day. Streamed data can take up to 90 minutes to become available for copy and export operations.


### Input fields

| Field | Description |
| --- | --- |
| Project | The project available in the connection to be billed for the query. |
| Dataset | The dataset from which the action or trigger pulls the possible tables. |
| Table | The table inside the dataset. |
| Table Fields | Required only if **Table** is a datapill. Declares the columns of the table; the schema must be the same across all possible values of the datapill. |
| Ignore schema mismatch | If set to **No**, streamed values that do not match the expected data type raise an error. Set to **Yes** to ignore such rows. |
| Fields | The columns of the table you have selected. |
| Insert ID | Used to deduplicate rows when streaming. BigQuery does not stream a row again if its insert ID matches one already sent. |

### Output fields

| Field | Description |
| --- | --- |
| Insert Errors | Contains all errors that occurred while streaming each row. Use this to check which rows failed so you can stream them again. |
| Failed rows | Contains the data of each failed row. Use this to retry streaming those rows. |
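As with the single-row action, the batch action corresponds to a streaming insert of multiple rows. A sketch with the google-cloud-bigquery Python client follows; the table, rows, and insert IDs are hypothetical. It also shows how the per-row error list can drive a retry of only the failed rows, which is what the **Insert Errors** and **Failed rows** outputs enable in a recipe.

```python
from google.cloud import bigquery

# Hypothetical project, table, and rows for illustration.
client = bigquery.Client(project="my-project")
table_id = "my-project.my_dataset.my_table"

rows = [
    {"id": 1, "name": "Ada"},
    {"id": 2, "name": "Grace"},
]
# One insert ID per row, so retried rows are deduplicated.
row_ids = [f"order-{r['id']}" for r in rows]

insert_errors = client.insert_rows_json(table_id, rows, row_ids=row_ids)

# Each error entry carries the index of the failed row, which is
# how "Failed rows" can be reconstructed from "Insert Errors".
failed_rows = [rows[e["index"]] for e in insert_errors]
failed_ids = [row_ids[e["index"]] for e in insert_errors]

if failed_rows:
    # Retry only the failed rows, reusing their insert IDs.
    client.insert_rows_json(table_id, failed_rows, row_ids=failed_ids)
```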

