Amazon S3 trigger - New/updated CSV file

Triggers when a CSV file is added or updated in a selected bucket/folder in Amazon S3.

Checks the selected folder for new or updated CSV files once every poll interval. The poll interval is 10 mins or 5 mins, depending on your plan; check the Pricing and Plans page to find out more. The output includes the file's metadata and its contents, delivered as CSV rows in batches.
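The sketch below illustrates the general polling pattern described above (not Workato's actual implementation), using Python and boto3. The bucket name, prefix, and poll interval are placeholder assumptions.

```python
# Illustrative only: a minimal sketch of polling an S3 prefix for new/updated
# CSV files. Bucket name, prefix, and poll interval are placeholder assumptions.
import time
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3", region_name="us-west-2")  # assumed bucket region

BUCKET = "my-bucket"    # assumed bucket name
PREFIX = "exports/"     # assumed folder to monitor
POLL_INTERVAL = 600     # 10 minutes, in seconds

last_poll = datetime.now(timezone.utc)

while True:
    cutoff = datetime.now(timezone.utc)
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            # A renamed file shows up here as a brand-new key; an overwrite
            # shows up as an existing key with a newer LastModified timestamp.
            if obj["LastModified"] > last_poll and obj["Key"].endswith(".csv"):
                print(f"New/updated CSV: {obj['Key']} ({obj['Size']} bytes)")
    last_poll = cutoff
    time.sleep(POLL_INTERVAL)
```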

Note that in Amazon S3, a renamed file is treated as a new file. A file that is uploaded and overwrites an existing file with the same name is treated as an updated file, not a new file.

Input fields

| Field name | Description |
| --- | --- |
| When first started, this recipe should pick up events from | When the recipe starts for the first time, it picks up CSV files created or updated from this specified time. Once the recipe has been run or tested, this value cannot be changed. Learn more about this field here. |
| Bucket region | The region of the bucket to monitor for new/updated files, e.g. us-west-2. In Amazon S3, go to Bucket > Properties > Static website hosting to find your region in the Endpoint URL. |
| Bucket | The bucket to monitor for new/updated CSV files. Select a bucket from the picklist or enter the bucket name directly. |
| Folder | The folder to monitor for new/updated CSV files. Select a folder from the picklist or enter the folder path directly. |
| Include sub-folders | Select Yes to monitor sub-folders for new/updated CSV files as well. |
| Include files not ending with .csv? | Handles cases where CSV files exported from other systems do not have a .csv extension. |
| Column names | The column names of the CSV file. Upload a sample CSV file to generate column names automatically, or add column names manually. |
| Column delimiter | The delimiter separating the columns in the CSV file. |
| Contains header row? | Select Yes if the CSV file contains a header row. Workato will not process that row as data. |
| Batch size | Workato divides the CSV file into smaller batches to process it more efficiently. This field defines the number of CSV rows to process in each batch (maximum of 1000 rows per batch). Use a larger batch size to increase data throughput. In some cases, Workato will automatically reduce the batch size to avoid exceeding API limits (see the sketch after this table). Learn more about Batch Processing. |
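As a rough illustration of how the column delimiter, header row, and batch size settings interact, the hypothetical helper below parses a CSV file and yields batches of at most 1000 rows. Workato performs this processing internally; the function and its parameters are assumptions for illustration only.

```python
# Illustrative only: split a CSV file into row batches, honoring a configurable
# delimiter, an optional header row, and a maximum batch size.
import csv
import io

def csv_to_batches(file_bytes, delimiter=",", has_header=True, batch_size=1000):
    """Yield lists of CSV rows, each list holding at most batch_size rows."""
    reader = csv.reader(io.StringIO(file_bytes.decode("utf-8")), delimiter=delimiter)
    if has_header:
        next(reader, None)  # skip the header row; it is not processed as data
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final, possibly smaller, batch

# Example: a 2,500-row file with batch size 1000 yields batches of 1000, 1000, and 500 rows.
```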

This trigger supports trigger conditions, which allow you to filter trigger events.

Output fields

| Object | Field name | Description |
| --- | --- | --- |
| File | File name | Full name of the file. |
| File | Last modified | Last modified timestamp of the file. |
| File | E tag | The hash of the file object, generated by Amazon S3. |
| File | Size | The file size in bytes. |
| File | Storage class | Storage class of this file object. Usually S3 Standard. |
| CSV rows | Row number | The position of this row in the CSV file. |
| CSV rows | CSV columns | Contains all column values in this CSV row. You can use the nested datapills to map each column's value. |
| | List size | Number of rows in this CSV rows list. |
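For orientation, the sketch below shows the approximate shape of a single trigger event, using the field names from the table above. The sample values are invented, and the exact datapill structure in Workato may differ.

```python
# Illustrative only: approximate shape of one trigger event, with assumed sample values.
example_output = {
    "file": {
        "file_name": "exports/orders.csv",
        "last_modified": "2023-01-15T08:30:00Z",
        "e_tag": "9b2cf535f27731c974343645a3985328",
        "size": 52430,            # bytes
        "storage_class": "STANDARD",
    },
    "csv_rows": [
        {"row_number": 1, "csv_columns": {"order_id": "1001", "amount": "25.00"}},
        {"row_number": 2, "csv_columns": {"order_id": "1002", "amount": "13.50"}},
    ],
    "list_size": 2,
}
```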
