RDS Bulk Output

Overview

Note

This feature is only available for instances hosted on AWS.

RDS Bulk Output lets users load the contents of a table (or view) into a table in an Amazon RDS database.


Properties

Name = string

A human-readable name for the component.


RDS Type = drop-down

Select the type of database to output to.


RDS Endpoint = string

The RDS database endpoint. If the IAM role attached to the instance (or the manually entered credentials associated with the current environment) has permission to query the RDS endpoints, you may select the RDS endpoint from a list. Otherwise, you must enter it manually. It can be found in the RDS console as a long dotted hostname and a port number, separated by a colon.
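The hostname:port shape of the endpoint can be split apart and combined with the database name to form a JDBC connection URL. The following is a minimal sketch; the endpoint and database name are hypothetical values for illustration only:

```python
# Hypothetical RDS endpoint as shown in the RDS console: hostname, a colon,
# then the port number.
endpoint = "mydb.abc123xyz.eu-west-1.rds.amazonaws.com:5432"

# Split on the last colon so dots in the hostname are untouched.
host, port = endpoint.rsplit(":", 1)

# Combine with the Database Name property to form a JDBC URL
# (PostgreSQL shown here as an example RDS Type).
database_name = "analytics"
jdbc_url = f"jdbc:postgresql://{host}:{port}/{database_name}"
```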


Database Name = string

The name of the database within your RDS instance.


Username = string

Your RDS connection username.


Password = string

The corresponding password. Store the password in the component, or create a managed entry for the password using Manage Passwords (recommended).


JDBC Options = column editor

  • Parameter: A JDBC parameter supported by the database driver. The available parameters are determined automatically from the driver, and may change from version to version. They are usually not required as sensible defaults are assumed.
  • Value: A value for the given Parameter.
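Parameter/value pairs of this kind are typically appended to the JDBC connection URL as a query string. A minimal sketch, assuming a PostgreSQL-style driver; the parameter names here are illustrative, since the real set is determined by the driver:

```python
from urllib.parse import urlencode

# Hypothetical JDBC options; actual supported parameters come from the driver.
jdbc_options = {"connectTimeout": "10", "ssl": "true"}

base_url = "jdbc:postgresql://myhost:5432/analytics"
full_url = base_url + "?" + urlencode(jdbc_options)
```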

Database = drop-down

The Snowflake database. The special value, [Environment Default], will use the database defined in the environment. Read Databases, Tables and Views - Overview to learn more.


Schema = drop-down

The Snowflake schema. The special value, [Environment Default], will use the schema defined in the environment. Read Database, Schema, and Share DDL to learn more.


Source Table = drop-down

Select the source table containing the data to be output to the RDS database. The selected source schema determines which tables will be available in the dropdown list.


Target Table = string

A name for the new table.


Target Schema = string

A schema in the target database. Required if the RDS Type is PostgreSQL or SQL Server.


Load Columns = dual listbox

The columns from the source table to include in the output job. Use the arrow buttons to include or exclude columns. Columns on the right-hand side will be included. Columns on the left-hand side will be excluded.


Table Maintenance = drop-down

Define how the target table is treated.

  • Create If Not Exists: If the named target table doesn't yet exist, it will be created.
  • None: Assume the RDS database already has the table defined with the correct structure.
  • Replace: If the named target table already exists, it will be dropped and replaced by a newly created table. Use this setting with care.

Primary Key = dual listbox

Specify a set of columns to be used as the primary key for the target table. This is optional, but if specified allows you to UPSERT existing data in the target table.


Update Strategy = drop-down

Select how the output will handle replacing rows with matching primary keys. Options are:

  • Ignore: Any existing row in the target that matches the primary key is not replaced and the matching row from the source table is not uploaded.
  • Replace: Rows in the target table that match the primary key are replaced with the matching rows from the source table.
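
The two strategies can be modeled as follows. This is a sketch of the semantics only, using plain Python dictionaries keyed by primary key rather than real database tables:

```python
def load(target, source, strategy):
    """Merge source rows into target rows, keyed by primary key."""
    result = dict(target)
    for key, row in source.items():
        if key in result and strategy == "Ignore":
            continue          # keep the existing target row; skip the source row
        result[key] = row     # insert new rows, or Replace existing ones
    return result

target = {1: "old-a", 2: "old-b"}   # existing target table
source = {2: "new-b", 3: "new-c"}   # incoming source rows

ignored = load(target, source, "Ignore")    # row 2 keeps its old value
replaced = load(target, source, "Replace")  # row 2 takes the new value
```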

Truncate Target Table = drop-down

Whether or not to truncate the target table before loading data.


On Warnings = drop-down

Choose whether to Continue with the load if an error is raised, or to Fail the run.


Additional Copy Options = string

Any additional options that you want to apply to the copy. Some of these may conflict with the options the component already sets. In particular, the component escapes the data to ensure it loads into the target database even if the data contains row and/or column delimiters, so you should never override the escape or delimiter options.

Options are documented here:


Batch Size = integer

This is optional, and specifies the number of rows to load into the target between each COMMIT. On a very large export, this can keep the Amazon RDS log files from growing very large before the data is committed.
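Committing in fixed-size batches can be sketched as follows. This is an illustration of the batching pattern only, not the component's actual implementation; the commit is represented by yielding each batch:

```python
def batched(rows, batch_size):
    """Yield lists of up to batch_size rows; each yield marks a COMMIT point."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch       # commit this batch, then start a fresh one
            batch = []
    if batch:
        yield batch           # commit any final partial batch

# Seven rows with a batch size of 3 commit in groups of 3, 3, and 1.
commits = list(batched(range(7), 3))
```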

Strategy

A select query is issued against the source table. The output is formatted in an appropriate way to load into the target database, and data is streamed in.
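The select-then-stream pattern can be sketched with an in-memory SQLite database standing in for the real source and target. The table and column names here are hypothetical, and SQLite is used only so the sketch is self-contained:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO source_t VALUES (?, ?)", [(1, "a"), (2, "b")])
conn.execute("CREATE TABLE target_t (id INTEGER, name TEXT)")

# Issue a select query against the source table...
rows = conn.execute("SELECT id, name FROM source_t").fetchall()

# ...then load the rows into the target and commit.
conn.executemany("INSERT INTO target_t VALUES (?, ?)", rows)
conn.commit()

loaded = conn.execute("SELECT COUNT(*) FROM target_t").fetchone()[0]
```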

