The Workday Extract component uses the Workday API to retrieve data—such as employee, financial, and business data—from Workday and either stage it in cloud storage to be referenced by an external table or load it into a table, depending on your cloud data warehouse. You can then use transformation components to enrich and manage the data in permanent tables.
To extract data via the Workday Reporting as a Service API, use the Workday Custom Reports component instead.
Using this component may return structured data that requires flattening. For help with flattening such data, we recommend using the Extract Nested Data component for Snowflake.
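Outside Matillion, the flattening step can be sketched in plain Python. The record shape below is a hypothetical example of nested Workday data, not an actual API response; the function simply collapses nested objects and arrays into dotted column names, which is the general shape of what a flattening component produces.

```python
def flatten(record, parent_key="", sep="."):
    """Recursively flatten nested dicts and lists into dotted column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        elif isinstance(value, list):
            for i, elem in enumerate(value):
                if isinstance(elem, dict):
                    items.update(flatten(elem, f"{new_key}{sep}{i}", sep))
                else:
                    items[f"{new_key}{sep}{i}"] = elem
        else:
            items[new_key] = value
    return items

# Hypothetical nested worker record for illustration only
worker = {
    "worker_id": "W123",
    "personal": {"name": {"first": "Ada", "last": "Lovelace"}},
    "roles": ["engineer", "manager"],
}
flat = flatten(worker)
# flat["personal.name.first"] == "Ada"; flat["roles.0"] == "engineer"
```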
Name = string
A human-readable name for the component.
Authentication = drop-down
Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorize an OAuth entry, read Workday Extract Authentication Guide.
Version = string
The version of the Workday Web Services directory you want to use. The default is v39.0, but you can replace this with any valid version. Matillion ETL supports any version listed in the Workday Web Services directory.
Web Service = drop-down
Select the Workday Web Service you want to query. The drop-down list will include all services available to the selected Version.
Operation = drop-down
Select the operation you want to perform on the selected Workday Web Service. The drop-down list will include all operations available to the selected Web Service.
Location = filepath
Provide an Amazon S3 bucket path, Google Cloud Storage (GCS) bucket path, or Azure Blob Storage path that will be used to store the data. The data can then be referenced by an external table. A folder will be created at this location with the same name as the target table.
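A minimal sketch of the resulting layout, assuming an S3 bucket path and a target table name (both hypothetical): the staging folder is simply the storage path with the table name appended.

```python
import posixpath

def staging_prefix(location, target_table):
    """Append the target table name as a folder under the storage path."""
    return posixpath.join(location.rstrip("/"), target_table)

# Hypothetical bucket path and table name
prefix = staging_prefix("s3://my-bucket/workday/extracts", "wd_workers")
# prefix == "s3://my-bucket/workday/extracts/wd_workers"
```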
Integration = drop-down
(GCP only) Choose your Google Cloud Storage Integration. Integrations are required to permit Snowflake to read data from and write to a Google Cloud Storage bucket. Integrations must be set up in advance of selecting them in Matillion ETL. To learn more about setting up a storage integration, read our Storage Integration setup guide.
Warehouse = drop-down
The Snowflake warehouse used to run the queries. The special value, [Environment Default], will use the warehouse defined in the environment. Read Overview of Warehouses to learn more.
Database = drop-down
The Snowflake database. The special value, [Environment Default], will use the database defined in the environment. Read Databases, Tables and Views - Overview to learn more.
Schema = drop-down
The Snowflake schema. The special value, [Environment Default], will use the schema defined in the environment. Read Database, Schema, and Share DDL to learn more.
Target Table = string
A name for the new table. Warning: upon running the job, this table is recreated, dropping any existing table of the same name.
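On Snowflake, this drop-and-recreate behaviour corresponds to a CREATE OR REPLACE TABLE statement. A minimal sketch (database, schema, table, and column names below are hypothetical):

```python
def recreate_table_ddl(database, schema, table, columns):
    """Build a CREATE OR REPLACE TABLE statement, which replaces
    (i.e. drops) any existing table of the same name."""
    cols = ", ".join(f"{name} {dtype}" for name, dtype in columns)
    return f"CREATE OR REPLACE TABLE {database}.{schema}.{table} ({cols})"

# Hypothetical target table definition
ddl = recreate_table_ddl(
    "ANALYTICS", "PUBLIC", "WD_WORKERS",
    [("WORKER_ID", "VARCHAR"), ("HIRE_DATE", "DATE")],
)
```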
Connect to a Workday Web Service and select an operation that lets you query data from that service. The results of the query operation are streamed into cloud storage and loaded into a target table.
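Under the hood, Workday Web Services operations are SOAP requests. The sketch below builds a request envelope for an operation such as Get_Workers; it only constructs the XML and does not send it, and the endpoint pattern mentioned in the comment is an assumption about a typical tenant URL, not something this component requires you to supply.

```python
# Minimal sketch of a Workday Web Services (SOAP) request envelope.
WWS_NS = "urn:com.workday/bsvc"

def build_request(operation, version, body_xml=""):
    """Wrap a WWS operation element in a SOAP 1.1 envelope,
    tagging it with the requested Workday Web Services version."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<soapenv:Envelope '
        'xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" '
        f'xmlns:bsvc="{WWS_NS}">'
        "<soapenv:Body>"
        f'<bsvc:{operation}_Request bsvc:version="{version}">{body_xml}'
        f"</bsvc:{operation}_Request>"
        "</soapenv:Body></soapenv:Envelope>"
    )

envelope = build_request("Get_Workers", "v39.0")
# The envelope would then be POSTed to the tenant's service endpoint,
# e.g. https://<host>/ccx/service/<tenant>/Human_Resources/v39.0
# (hypothetical host and tenant).
```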
Supported cloud data warehouses: Snowflake, Delta Lake on Databricks, Amazon Redshift, Google BigQuery, Azure Synapse Analytics.