Azure Blob Storage Load (Synapse)
The Azure Blob Storage Load component lets users load data into an existing table from objects stored in Azure Blob Storage.
Note
When performing a COPY command to load files with a Unix-style line ending via this component, users must specify the line ending as 0x0a in the Record Delimiter property. The escape sequence "\n" is not supported by the Synapse COPY command (it is interpreted as "\r\n").
Example characters are provided inside [] for readability.
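Behind the scenes, the component issues a Synapse COPY INTO statement. A minimal sketch of such a statement for Unix-terminated CSV files, using a hypothetical table, storage account, container, and SAS token placeholder, might look like this (the SQL the component actually generates may differ):

```sql
-- Hypothetical example: load CSV files that use LF (0x0A) line endings.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    ROWTERMINATOR = '0x0A'   -- Unix-style line ending specified in hex
);
```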
Properties
Name
= string
A human-readable name for the component.
Is Public Container
= drop-down
Specify whether the data is being loaded from a publicly accessible Azure Blob Storage container.
Azure Storage Location
= string
Select the Azure storage location (including the file path) of the data to be loaded.
When a user enters a forward slash character (/) after a folder name, validation of the file path is triggered. This works in the same manner as the Go button.
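The location entered here corresponds to the FROM clause of the generated COPY statement. A hedged illustration, with a hypothetical account, container, and folder path:

```sql
-- Hypothetical example: the Azure Storage Location maps to the FROM clause.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/2024/01/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```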
Schema Name
= drop-down
Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on schemas, read the Azure Synapse documentation.
Table Name
= drop-down
Select the table into which data will be loaded.
Column List
= column editor
- Target Column: Select the target column.
- Default Value: Specify the default value.
- Field Number: Set the field number for the column value.
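These settings correspond to the optional column list of the COPY statement, where each target column can be given a default value and a source field number. A minimal sketch with hypothetical columns and names:

```sql
-- Hypothetical example: map source fields to target columns,
-- with a default value for 'region' and explicit field numbers.
COPY INTO dbo.sales_staging (region DEFAULT 'unknown' 1, amount 2, sale_date 3)
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```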
File Format
= drop-down
Select a file format from the dropdown menu.
File Type
= drop-down
Select a file type. Available file types are CSV, ORC, and PARQUET.
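The selection maps to the FILE_TYPE option of the COPY statement. For example, a hypothetical Parquet load (non-CSV file types ignore the CSV-only properties below):

```sql
-- Hypothetical example: load Parquet files; CSV-only options do not apply.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/parquet/'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);
```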
Field Delimiter
= string
(CSV only) Specify a delimiter to separate columns. The default is a comma [,].
A [TAB] character can be specified as "\t".
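This property maps to the FIELDTERMINATOR option. A sketch for tab-delimited files, assuming the "\t" escape is accepted and using hypothetical names:

```sql
-- Hypothetical example: tab-delimited CSV files.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/tsv/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    FIELDTERMINATOR = '\t'   -- tab character as the column separator
);
```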
Field Quote
= string
(CSV only) Specify a single character to be used as a quote character. The default character is a double quotation mark ["]. Extended ASCII characters are not supported with UTF-8 for this property.
A doubled field quote character within a string column is treated as an escaped quote character.
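This property maps to the FIELDQUOTE option. For example, if values are wrapped in single quotes rather than the default double quote (hypothetical names):

```sql
-- Hypothetical example: values quoted with single quotes instead of ["].
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    FIELDQUOTE = ''''        -- a single quote, doubled to escape it in T-SQL
);
```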
Record Delimiter
= string
(CSV only) Set a delimiter to be used to separate records (rows) in the file.
When performing a COPY command to load files with a Unix-style line ending via this component, users must specify the line ending as 0x0a in the Record Delimiter property. The escape sequence "\n" is not supported by the Synapse COPY command (it is interpreted as "\r\n").
Date Format
= string
(CSV only) Specify the date format of the date values being loaded, mapping to SQL Server date formats. To learn more about Transact-SQL date and time data types and functions, please read Microsoft's documentation.
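This property maps to the DATEFORMAT option, which accepts the SQL Server date order codes such as 'mdy', 'dmy', and 'ymd'. A sketch with hypothetical names:

```sql
-- Hypothetical example: dates in the files are in year-month-day order.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    DATEFORMAT = 'ymd'
);
```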
First Row
= integer
(CSV only) Specify the first row that is read in all files for the COPY command. The default value is 1.
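This property maps to the FIRSTROW option; setting it to 2 is a common way to skip a single header row. A sketch with hypothetical names:

```sql
-- Hypothetical example: skip one header row in every file.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    FIRSTROW = 2
);
```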
Error File Location
= string
(CSV only) Specify the directory where the rejected rows and the corresponding error file should be written. Users can specify the full path from the storage account, or the path relative to the container. If the specified container does not exist, one is created on the user's behalf. A child directory is created with the name "rejectedrows". For more information, please refer to Microsoft's documentation.
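This property maps to the ERRORFILE option. With a path relative to the container, rejected rows and the corresponding error files are written under a "rejectedrows" child directory. A sketch with hypothetical names:

```sql
-- Hypothetical example: write rejected rows under mycontainer/errors/rejectedrows/.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    ERRORFILE = '/errors/',
    MAXERRORS = 10           -- allow some rejected rows so an error file is produced
);
```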
Encoding Type
= drop-down
(CSV only) Select the encoding type. The default is UTF-8. This setting specifies the data encoding standard for the files loaded by the COPY command.
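This property maps to the ENCODING option; UTF8 is the default and UTF16 is the other supported value. A sketch for UTF-16 encoded files with hypothetical names:

```sql
-- Hypothetical example: load UTF-16 encoded CSV files.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/utf16/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    ENCODING = 'UTF16'
);
```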
Compress
= drop-down
Specify the data compression method for the external data. The default setting is "None".
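This property maps to the COMPRESSION option. For example, gzip-compressed CSV files (hypothetical names):

```sql
-- Hypothetical example: load gzip-compressed CSV files.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/gzip/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    COMPRESSION = 'GZIP'
);
```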
Max Errors
= integer
Specify the maximum number of rejected rows allowed in the load before the COPY operation is cancelled. Each row that the COPY operation cannot import is ignored and counted as one error. The default value for this property is 0.
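This property maps to the MAXERRORS option. With the default of 0, the first rejected row cancels the load; raising the value tolerates a limited number of bad rows. A sketch with hypothetical names:

```sql
-- Hypothetical example: tolerate up to 100 rejected rows before cancelling.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    MAXERRORS = 100
);
```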
Identity Insert
= drop-down
Specify whether the identity value or values in the imported data file are used for the identity column. The default setting is Off. When off, the identity values for a given column are verified, but not imported. Unique values will be assigned based on the seed and increment values specified during table creation. For more information, please refer to Microsoft's documentation.
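This property maps to the IDENTITY_INSERT option. Turning it on keeps the identity values present in the data file instead of generating new ones. A sketch with hypothetical names:

```sql
-- Hypothetical example: keep the identity values supplied in the files.
COPY INTO dbo.sales_staging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/data/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
    IDENTITY_INSERT = 'ON'
);
```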
| Snowflake | Delta Lake on Databricks | Amazon Redshift | Google BigQuery | Azure Synapse Analytics |
| --- | --- | --- | --- | --- |
| ❌ | ❌ | ❌ | ❌ | ✅ |