Oracle database is a multi-model database management system by Oracle Corporation. It can be hosted on-premise or in a private cloud.
All releases of Oracle database are supported.
How to connect to Oracle on Workato
The Oracle connector uses basic authentication to authenticate with Oracle.
|Field|Description|
|---|---|
|Connection name|Give this Oracle connection a unique name that identifies which Oracle instance it is connected to.|
|On-prem secure agent|Choose an on-premise agent if your database is running in a network that does not allow direct connection. Before attempting to connect, make sure you have an active on-premise agent. Refer to the On-premise agent guide for more information.|
|Username|Username to connect to Oracle.|
|Password|Password to connect to Oracle.|
|Host|URL of your hosted server.|
|Port|Port number that your server is running on, typically 1521.|
|SID/Service name|SID or service name of the Oracle database instance you wish to connect to.|
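The Host, Port, and SID/Service name fields together identify the instance, much like Oracle's Easy Connect naming (`host:port/service_name`). A minimal sketch of how the pieces combine, using placeholder values (the host and service name below are illustrative, not real endpoints):

```python
def easy_connect_dsn(host: str, port: int, service_name: str) -> str:
    """Combine host, port and service name into an Easy Connect descriptor."""
    return f"{host}:{port}/{service_name}"

# Placeholder values -- substitute your own instance details.
print(easy_connect_dsn("db.example.com", 1521, "ORCLPDB1"))
# db.example.com:1521/ORCLPDB1
```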
Permissions required to connect
At minimum, the database user account must be granted SELECT permission on the database specified in the connection.

If we are trying to connect to a named schema (HR_PROD) in an Oracle instance using a new database user WORKATO, the following example queries can be used.
First, create a new user dedicated to integration use cases with Workato.
CREATE USER WORKATO IDENTIFIED BY password;
Next, grant the CONNECT role to this user.
GRANT CONNECT TO WORKATO;
This allows the user to have login access to the Oracle instance. However, this user will not have access to any tables.
The next step is to grant access to the SUPPLIER table in the HR_PROD schema. In this example, we only wish to grant SELECT and INSERT permissions.
GRANT SELECT, INSERT ON HR_PROD.SUPPLIER TO WORKATO;
Finally, check that this user has the necessary permissions. Run a query to see all grants.
SELECT * FROM DBA_ROLE_PRIVS WHERE GRANTEE = 'WORKATO';
SELECT * FROM DBA_TAB_PRIVS WHERE GRANTEE = 'WORKATO';
This should return the following minimum permissions required to create an Oracle connection on Workato.
+---------+--------------+--------------+--------------+
| GRANTEE | GRANTED_ROLE | ADMIN_OPTION | DEFAULT_ROLE |
+---------+--------------+--------------+--------------+
| WORKATO | CONNECT      | NO           | YES          |
+---------+--------------+--------------+--------------+

+---------+---------+------------+---------+-----------+-----------+-----------+
| GRANTEE | OWNER   | TABLE_NAME | GRANTOR | PRIVILEGE | GRANTABLE | HIERARCHY |
+---------+---------+------------+---------+-----------+-----------+-----------+
| WORKATO | HR_PROD | SUPPLIER   | ROOT    | SELECT    | NO        | NO        |
| WORKATO | HR_PROD | SUPPLIER   | ROOT    | INSERT    | NO        | NO        |
+---------+---------+------------+---------+-----------+-----------+-----------+
Working with the Oracle connector
Table, view and stored procedure
The Oracle connector works with all tables, views and stored procedures. These are available in pick lists in each trigger/action or you can provide the exact name.
Select a table/view from pick list
Provide exact table/view name in a text field
Single row vs batch of rows
The Oracle connector can read or write to your database either one row at a time or in batches. When using batch triggers/actions, you have to provide the batch size you wish to work with. The batch size can be any number between 1 and 100, with 100 being the maximum.
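The grouping behavior can be sketched as follows. This is a simplified illustration of how rows are split into batches, not Workato's actual implementation:

```python
def batched(rows, batch_size=100):
    """Yield successive groups of at most batch_size rows (1 to 100)."""
    if not 1 <= batch_size <= 100:
        raise ValueError("batch size must be between 1 and 100")
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# 250 rows with the maximum batch size of 100 arrive as three batches.
batch_sizes = [len(batch) for batch in batched(list(range(250)))]
print(batch_sizes)  # [100, 100, 50]
```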
Batch trigger inputs
Besides the difference in input fields, there is also a difference between the outputs of these two types of operations. A trigger that processes rows one at a time will have an output datatree that allows you to map data from that single row.
Single row output
However, a trigger that processes rows in batches will output them as an array of rows. The Rows datapill indicates that the output is a list containing data for each row in that batch.
Batch trigger output
As a result, the output of batch triggers/actions needs to be handled differently. This recipe uses a batch trigger for new rows in the
users table. The output of the trigger is used in a Salesforce bulk upsert action that requires mapping the Rows datapill into the source list.
Using batch trigger output
This input field is used to filter and identify rows to perform an action on. It is used in multiple triggers and actions in the following ways:
- filter rows to be picked up in triggers
- filter rows in Select rows action
- filter rows to be deleted in Delete rows action
This clause will be used as a WHERE clause in each request and should follow basic SQL syntax. Refer to the Oracle documentation for a full list of rules for writing Oracle SQL statements.
|Operator|Description|
|---|---|
|>=|Greater than or equal to|
|<=|Less than or equal to|
|IN (...)|List of values|
|LIKE|Pattern matching with wildcard characters (% and _)|
|BETWEEN|Retrieve values within a range|
|IS NULL|NULL values check|
|IS NOT NULL|Non-NULL values check|
String values must be enclosed in single quotes ('') and column identifiers used must exist in the table. A WHERE condition that filters rows based on values in a single column looks like this.

CURRENCY = 'USD'
If used in a Select rows action, this WHERE condition will return all rows that have the value 'USD' in the CURRENCY column. Just remember to wrap datapills with single quotes in your inputs.
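The single-quote rule can be mimicked in code. Below is a hypothetical helper (the function name and inputs are illustrative only) that wraps a value in single quotes and doubles any embedded quotes, which is standard SQL string escaping:

```python
def where_equals(column: str, value: str) -> str:
    """Build a simple equality WHERE condition, wrapping the value in
    single quotes and doubling embedded quotes (standard SQL escaping)."""
    escaped = value.replace("'", "''")
    return f"{column} = '{escaped}'"

print(where_equals("CURRENCY", "USD"))  # CURRENCY = 'USD'
print(where_equals("NAME", "O'Brien"))  # NAME = 'O''Brien'
```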
Using datapills in WHERE condition
Column names that do not conform to standard rules (for example, names that include spaces, lower-case letters or special characters) must be enclosed in double quotes (""). For example, PUBLISHER NAME must be enclosed in double quotes to be used as a valid identifier.

"PUBLISHER NAME" = 'USD'
WHERE condition with enclosed identifier
A WHERE condition can also contain subqueries. The following can be used on the users table.

ID IN (SELECT "USER ID" FROM TICKETS WHERE PRIORITY >= 2)
When used in a Delete rows action, this will delete all rows in the users table where at least one associated row in the tickets table has a value of 2 or greater in the PRIORITY column.

Using datapills in WHERE condition with subquery
In all triggers and some actions, this is a required input. Values from this selected column are used to uniquely identify rows in the selected table.
As such, the values in the selected column must be unique. Typically, this column is the primary key of the table (e.g. ID).
When used in a trigger, this column must be incremental. This constraint is required because the trigger uses values from this column to look for new rows. In each poll, the trigger queries for rows with a unique key value greater than the previous greatest value.
Let's use a simple example to illustrate this behavior. We have a New row trigger that processes rows from a table. The unique key configured for this trigger is ID. The last row processed has 100 as its ID value. In the next poll, the trigger will use ID >= 101 as the condition to look for new rows.
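The polling logic described above can be simulated. This is an illustrative sketch rather than connector code; the row data and the ID column are placeholders:

```python
def poll_new_rows(rows, last_seen_id):
    """One poll of a New row trigger: return rows whose unique key exceeds
    the highest value seen so far, plus the new checkpoint."""
    new_rows = sorted((r for r in rows if r["ID"] > last_seen_id),
                      key=lambda r: r["ID"])
    checkpoint = new_rows[-1]["ID"] if new_rows else last_seen_id
    return new_rows, checkpoint

table = [{"ID": 99}, {"ID": 100}, {"ID": 101}, {"ID": 102}]
picked, checkpoint = poll_new_rows(table, 100)
print([r["ID"] for r in picked], checkpoint)  # [101, 102] 102
```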
Performance of a trigger can be improved if the column selected to be used as the unique key is indexed.
This is required for New/updated row triggers. Values in this selected column are used to identify updated rows.
When a row is updated, the Unique key value remains the same. However, it should have its Sort column updated to reflect the last updated time. Following this logic, Workato keeps track of values in this column together with values in the selected Unique key column. When a change in the Sort column value is observed, an updated row event will be recorded and processed by the trigger.
Let's use a simple example to illustrate this behavior. We have a New/updated row trigger that processes rows from a table. The Unique key and Sort column configured for this trigger are ID and UPDATED_AT respectively. The last row processed by the trigger has an ID value of 100 and an UPDATED_AT value of 2018-05-09 16:00:00.000000. In the next poll, the trigger will query for new rows that satisfy either of the two conditions:
UPDATED_AT > '2018-05-09 16:00:00.000000'
ID > 100 AND UPDATED_AT = '2018-05-09 16:00:00.000000'
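The two conditions combine into a single qualification test per row. A sketch of that test, using the checkpoint values from the example above (this is an illustration, not connector code):

```python
from datetime import datetime

def is_new_or_updated(row, last_id, last_updated_at):
    """A row qualifies if its sort column moved past the checkpoint, or it
    shares the checkpoint timestamp but has a higher unique key value."""
    return (row["UPDATED_AT"] > last_updated_at
            or (row["UPDATED_AT"] == last_updated_at and row["ID"] > last_id))

checkpoint_id, checkpoint_ts = 100, datetime(2018, 5, 9, 16, 0)
updated_row = {"ID": 50, "UPDATED_AT": datetime(2018, 5, 9, 17, 0)}
unchanged_row = {"ID": 50, "UPDATED_AT": datetime(2018, 5, 9, 16, 0)}
print(is_new_or_updated(updated_row, checkpoint_id, checkpoint_ts))    # True
print(is_new_or_updated(unchanged_row, checkpoint_id, checkpoint_ts))  # False
```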
For Oracle databases, only DATE, TIMESTAMP, TIMESTAMP WITH TIME ZONE and TIMESTAMP WITH LOCAL TIME ZONE column types can be used.
Smart boolean conversion
Oracle does not have a built-in boolean column type. A popular workaround is to use a NUMBER(1,0) column with a CHECK (COLUMN_NAME IN (1,0)) constraint. Because of this, standard boolean values from other applications will not map well to this column and may cause unexpected values or errors.
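A sketch of such a conversion, assuming common boolean representations map to 1 and 0 (the exact set of accepted inputs here is an assumption for illustration, not the connector's documented behavior):

```python
def to_number_boolean(value):
    """Map common boolean representations onto the NUMBER(1,0) convention.
    The accepted inputs here are an assumption for illustration."""
    if isinstance(value, str):
        value = value.strip().lower()
    if value in (True, 1, "true", "1", "yes"):
        return 1
    if value in (False, 0, "false", "0", "no"):
        return 0
    raise ValueError(f"cannot convert {value!r} to NUMBER(1,0)")

print(to_number_boolean("True"))  # 1
print(to_number_boolean(0))       # 0
```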
This checkbox allows you to enable automatic smart conversion. If this is set to Yes, the conversion will be applied to all columns with
NUMBER type and precision of
1. This reduces the amount of configuration needed to transform datapills in a recipe. The following table describes the logic for the boolean conversion.
|Input value||Converted value|