Rewrite External Table

The Rewrite External Table component uses SQL provided by the input connection and writes the results out to a new external table.

Note

Running this component will overwrite any existing data at the chosen S3 location, so writing to the same location that the source data is read from is not recommended. The Matillion ETL instance must have access to the chosen S3 bucket and location.

External tables are part of Amazon Redshift Spectrum and may not be available in all regions. For a list of supported regions, read Querying external data using Amazon Redshift Spectrum.

For information about working with external tables, read [Creating external tables for Redshift Spectrum](http://docs.aws.amazon.com/redshift/latest/dg/c-spectrum-external-tables.html).
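The component's behavior can be approximated by a Redshift `CREATE EXTERNAL TABLE ... AS` (CETAS) statement that runs the input SQL and writes the result set to S3. The sketch below is illustrative only; the schema, table, and bucket names are placeholders, not Matillion defaults:

```sql
-- Approximate sketch of what the component effectively issues.
-- spectrum_schema, sales_summary, and the bucket path are hypothetical.
CREATE EXTERNAL TABLE spectrum_schema.sales_summary
STORED AS TEXTFILE
LOCATION 's3://my-bucket/exports/sales_summary/'
AS SELECT region, SUM(amount) AS total_amount
   FROM sales
   GROUP BY region;
```

The component's properties (schema, target table, location, partition, and file format) map onto the corresponding clauses of this statement.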


Properties

Name = string

A human-readable name for the component.


Schema = drop-down

Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, read Schemas.


Target Table = drop-down

The name of the newly created external table.


Location = drop-down

Select the file target location, including the S3 bucket path. The Matillion ETL instance must have access to this data (typically, access is granted via the instance's AWS credentials, or the bucket is public).

A directory named after the target table will be created at this location and then populated with files.


Partition = dual listbox

(Optional) Select source columns to use as partitions when writing data. Each chosen column is queried for its distinct values, and a partition directory is created (if it doesn't already exist) for each value.
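To illustrate how partition columns map to S3 directories, here is a hedged sketch of the equivalent partitioned statement (all names are placeholders). In Redshift CETAS syntax, partition columns must appear last in the SELECT list:

```sql
-- Hypothetical example: partitioning output by "region".
-- For each distinct value of region, a directory such as
--   s3://my-bucket/exports/sales_summary/region=EU/
-- is created (if absent) and populated with files.
CREATE EXTERNAL TABLE spectrum_schema.sales_summary
PARTITIONED BY (region)
STORED AS TEXTFILE
LOCATION 's3://my-bucket/exports/sales_summary/'
AS SELECT amount, order_date, region
   FROM sales;
```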


File Format = drop-down

Select the file format. The default setting is Delimited.

