
Azure SQL Bulk Output



This feature is only available for instances hosted on Azure.

Load the contents of a table (or view) into a table in a Microsoft Azure database.


Name = string

A human-readable name for the component.

Azure SQL Server Name = drop-down

The server IP or DNS address of the Azure SQL server.

Database Name = drop-down

This is the name of the database within your Azure instance.

Username = string

This is your Azure connection username.

Password = string

The corresponding password. Store the password in the component, or create a managed entry for the password using Manage Passwords (recommended).

JDBC Options = column editor

  • Parameter: A JDBC parameter supported by the Database Driver. The available parameters are determined automatically from the driver, and may change from version to version. They are usually not required, since sensible defaults are assumed.
  • Value: A value for the given Parameter.
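As a loose illustration of how such parameters typically take effect, the sketch below composes a SQL Server JDBC connection URL with extra driver options appended. The parameter names shown (`encrypt`, `loginTimeout`) are common Microsoft JDBC driver options used purely as examples; the exact set supported depends on the driver version in use.

```python
# Illustrative sketch (not the component's internal code): extra JDBC
# parameters are typically appended to the connection URL as key=value pairs.

def build_jdbc_url(server: str, database: str, options: dict) -> str:
    """Compose a jdbc:sqlserver URL with optional driver parameters."""
    url = f"jdbc:sqlserver://{server}:1433;databaseName={database}"
    for param, value in options.items():
        url += f";{param}={value}"
    return url

url = build_jdbc_url("myserver.database.windows.net", "mydb",
                     {"encrypt": "true", "loginTimeout": "30"})
print(url)
```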

Schema = drop-down

Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, read Schemas.

Source Table = drop-down

Select the source table to copy to Azure. The chosen source schema determines what tables will be available for selection.

Target Table = string

A name for the output table.

Target Schema = string

The name of the schema used to create the target table.

Load Columns = dual listbox

The columns from the source table to include in the output job. Use the arrow buttons to include or exclude columns. Columns on the right-hand side will be included. Columns on the left-hand side will be excluded.

Table Maintenance = drop-down

Define how the target table is treated.

  • Create If Not Exists: If the named target table doesn't yet exist, it will be created.
  • None: Assume the Azure database already has the table defined with the correct structure.
  • Replace: If the named target table already exists, it is dropped and replaced by a newly created table. Use this setting with care.
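The three settings roughly correspond to different DDL issued against the target. The sketch below is a hypothetical illustration for a SQL Server target; the table name and column definition are placeholders, not the component's actual generated SQL.

```python
# Hypothetical sketch of the DDL each Table Maintenance setting implies
# for a SQL Server target. Names and column definitions are placeholders.

def maintenance_ddl(mode: str, table: str, columns_sql: str) -> list:
    """Return the DDL statements a given maintenance mode would run."""
    if mode == "Create If Not Exists":
        return [f"IF OBJECT_ID(N'{table}') IS NULL "
                f"CREATE TABLE {table} ({columns_sql})"]
    if mode == "Replace":
        return [f"IF OBJECT_ID(N'{table}') IS NOT NULL DROP TABLE {table}",
                f"CREATE TABLE {table} ({columns_sql})"]
    return []  # "None": the table is assumed to exist with the right structure
```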

Primary Key = dual listbox

Select one or more columns to be designated as the table's primary key.

Truncate Target Table = drop-down

Whether or not to truncate the target table before loading data.

On Warnings = drop-down

Choose whether to Continue with the load if an error is raised, or to Fail the run.

Additional Copy Options = string

Any additional options to apply to the copy. Some of these may conflict with options the component already sets. In particular, the component escapes the data so that it loads into the target database correctly even when values contain row and/or column delimiters, so you should never override the escape or delimiter options.
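To illustrate why the escape settings should be left alone, the sketch below shows the kind of escaping involved: any delimiter characters occurring inside a value must be prefixed with an escape character, or the load would mis-split the record. The delimiter and escape characters shown are assumptions for illustration, not the component's actual choices.

```python
# Illustrative only: escaping a field so embedded delimiters survive the copy.
# The delimiters here are assumed, not the component's real settings.

def escape_field(value, col_delim="|", row_delim="\n", esc="\\"):
    """Escape the escape character first, then each delimiter."""
    for ch in (esc, col_delim, row_delim):
        value = value.replace(ch, esc + ch)
    return value
```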

Batch Size = integer

The number of rows to load into the target between each COMMIT. On a very large export, batching keeps the Azure transaction log from growing very large before the data is committed.

This parameter is optional; the default value is 5000.
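The batching behaviour described above can be sketched as follows. This is a conceptual illustration, not the component's actual code; `FakeCursor` stands in for a real database cursor so the commit pattern is visible.

```python
# Sketch of Batch Size behaviour: rows load in groups, with a COMMIT after
# each group, so the log never holds more than batch_size uncommitted rows.

class FakeCursor:
    """Stand-in for a real database cursor; counts writes and commits."""
    def __init__(self):
        self.commits = 0
        self.rows_written = 0
    def executemany(self, sql, rows):
        self.rows_written += len(rows)
    def commit(self):
        self.commits += 1

def load_in_batches(cursor, rows, insert_sql, batch_size=5000):
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            cursor.executemany(insert_sql, batch)
            cursor.commit()
            batch = []
    if batch:  # commit any final partial batch
        cursor.executemany(insert_sql, batch)
        cursor.commit()

cur = FakeCursor()
load_in_batches(cur, range(12_500), "INSERT INTO t VALUES (?)")
# 12,500 rows at the default batch size of 5000 -> 3 commits (5000+5000+2500)
```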


A SELECT query is issued against the source table. The output is formatted appropriately for the target database and streamed in, meaning that no temporary files are used and the load into the target table begins almost immediately.
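The streamed copy described above can be sketched as a fetch-and-forward loop: rows are pulled from the source in chunks and written straight to the target, so nothing is spooled to disk and the insert starts as soon as the first chunk arrives. The cursor classes below are stand-ins for real database cursors, used purely to make the flow concrete.

```python
# Conceptual sketch of a streamed table copy: no temporary file, the target
# insert begins with the first fetched chunk. Cursors are stand-ins.

class FakeSource:
    def __init__(self, rows):
        self.rows = list(rows)
    def execute(self, sql):
        pass  # a real cursor would run the SELECT here
    def fetchmany(self, n):
        chunk, self.rows = self.rows[:n], self.rows[n:]
        return chunk

class FakeTarget:
    def __init__(self):
        self.written = []
    def executemany(self, sql, rows):
        self.written.extend(rows)

def stream_copy(source, target, select_sql, insert_sql, fetch_size=1000):
    """Pull chunks from source and forward each one straight to target."""
    source.execute(select_sql)
    while chunk := source.fetchmany(fetch_size):
        target.executemany(insert_sql, chunk)

src = FakeSource(range(2500))
tgt = FakeTarget()
stream_copy(src, tgt, "SELECT * FROM src", "INSERT INTO tgt VALUES (?)")
```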
