Designer overview
Matillion Designer lets you create your own complex ETL pipeline workflows that can be scheduled and run in a Fully-Managed or Hybrid Cloud architecture.
- Fully-Managed: Use a Matillion-hosted agent and secure secrets manager to execute your ETL tasks. This is the recommended starting solution.
- Hybrid Cloud: Host your own agent and secrets in your AWS cloud. This requires additional setup. See Agent Setup for details.
You can access Designer through Design data pipelines on the Hub or from the Designer button in the menu.
Getting started
Getting started with Designer is simple. With Matillion's Fully-Managed solution, an agent and secret store will be provided so you can skip setting up infrastructure and get right into designing pipelines.
Designer currently supports Snowflake as a data warehouse, so you'll need your Snowflake account name, username, password, and warehouse connection details.
Note
PrivateLink to Snowflake is not currently supported.
Note
Some components may have limitations when using a Fully-Managed solution. These limitations are noted in the relevant component documentation.
- Log in to the Hub.
- Choose an account.
- Click Design data pipelines from the Hub or Designer from the menu.
- Provide some information about yourself as the administrator of this new account.
- You will be notified that your agent is being created, and a preset name will be assigned to the project created for you.
- Provide a name for your account and a region for the account to reside in.
- After a short wait for the setup to complete, enter your Snowflake credentials. Your Snowflake account name should include the region of the account, for example `myaccount.eu-central-1`.
- Select a default role, warehouse, database, and schema.
- Click Finish and wait for your project to be created. You will be taken straight to the Designer UI.
First-time users will be allocated credits as part of the trial onboarding process. For more information, read Free trial.
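Before entering your credentials, you can optionally sanity-check them outside Designer. Below is a minimal sketch using the snowflake-connector-python package (not part of Designer, shown purely for illustration; every identifier is a placeholder for your own details):

```python
# Sanity-check Snowflake credentials before entering them in Designer.
# Assumes `pip install snowflake-connector-python`; all values below
# are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount.eu-central-1",  # account name including region
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
)

cur = conn.cursor()
# Confirm the session resolves the role, warehouse, database, and
# schema you plan to pick as defaults in Designer.
cur.execute(
    "SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), "
    "CURRENT_DATABASE(), CURRENT_SCHEMA()"
)
print(cur.fetchone())
conn.close()
```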
Designer projects
Projects are groups of resources that work together to allow the creation and execution of pipelines. These resources are defined independently of one another so they can be mixed and matched later. For example, a defined secret can be used in multiple components, pipelines, and environments.
- Environments: A connection configuration for a target cloud data warehouse, including the default database and schema to use. Currently only Snowflake is supported.
- Schedules: A set of rules dictating when pipelines are automatically run. Schedules are configured to work with a chosen environment, agent, and pipeline, so these can be used in multiple combinations. For example, a single pipeline can be included in multiple schedules to run in different environments.
- Secret definitions: Secrets are used throughout the Designer to securely store sensitive credentials. For example, once defined, these can be used for authentication in connectors and environments. Secrets are stored in your cloud provider's secret manager (AWS Secrets Manager) as configured in your agent, and these are the secrets that the Designer has access to. A sketch of creating one appears after this list.
- OAuths: Authorized connections to third-party services using OAuth 2.0 can be stored and used with connectors in your Designer pipelines.
- Branches: Designer is built with a focus on Git source control. Users can create multiple branches within a project to store and track pipelines. Pipeline changes should be committed and merged to make them available to other team members. Clicking a branch's name opens the Designer for that branch.
- Designer: The Designer itself is the UI where pipelines are created, incorporating many of the resources described above.
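For the Hybrid Cloud architecture, where secrets live in AWS Secrets Manager in your own account, a secret might be created with boto3 along these lines. This is an illustrative sketch only, not Designer's own API; the region, secret name, and value are all hypothetical:

```python
# Hedged sketch: storing a credential in AWS Secrets Manager with boto3,
# as you might when managing your own secrets in a Hybrid Cloud setup.
# The region, secret name, and value are hypothetical placeholders.
import json
import boto3

client = boto3.client("secretsmanager", region_name="eu-central-1")

client.create_secret(
    Name="designer/snowflake-password",  # hypothetical name
    SecretString=json.dumps({"password": "my_snowflake_password"}),
)
```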
Designer pipelines
A pipeline is a logical sequence of components: task-specific building blocks that can be dragged into pipelines and connected to one another to make a complex workflow. This workflow is shown on the canvas. There are two types of pipelines:
- Orchestration pipelines: Orchestration pipelines deal with loading data from source systems into the target data warehouse. Typical orchestration components are connectors, flow logic components, and scripting components.
- Transformation pipelines: Transformation pipelines transform table data that already exists in your target data warehouse, typically loaded there by an orchestration pipeline. Transformation components are often analogs of SQL operations such as creating and deleting tables, joining data, or performing calculations; the sketch below illustrates the kind of SQL involved.
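To make the analogy concrete, here is a hedged sketch of the kind of SQL a transformation step corresponds to, run through the same Snowflake connection as the earlier sketch. The table and column names are made up for illustration:

```python
# Illustrative only: the kind of SQL a transformation pipeline is an
# analog of. `conn` is a snowflake.connector connection as in the
# earlier sketch; all table and column names are hypothetical.
cur = conn.cursor()

# Join freshly loaded orders to a customers table and materialize the
# result, roughly what a join-style transformation component produces.
cur.execute("""
    CREATE OR REPLACE TABLE enriched_orders AS
    SELECT o.order_id,
           o.order_total,
           c.customer_name
    FROM   raw_orders o
    JOIN   customers  c ON c.customer_id = o.customer_id
""")
conn.close()
```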