Environments
An environment defines the connection between a project and your chosen cloud data warehouse. Environments include useful defaults, such as a default warehouse, database, and schema, which can be used to pre-populate component configurations in the Designer. Not added a project yet? Read Add project.
Best practice
We recommend using environments to separate your development work from production:
- Use development environments for building, testing, and iterating on pipelines before they are deployed.
- Use production environments to run pipelines that are fully deployed to work on live data. Only stable and thoroughly tested pipelines should be deployed here.
- You can also use intermediate environments, such as staging, test, or preprod, to validate pipelines before they are deployed to production. These can also be used for performance testing.
For more information, read Matillion's Unlocking Data Productivity DataOps guide.
Add an environment
- In the left navigation, click the Projects icon.
- Select your project.
- Click the Environments tab.
- Click Add new environment.
Parameter | Description |
---|---|
Environment name | A unique name for the environment. Max 255 characters. |
Agent | A working agent. This is only required if you are using a Hybrid SaaS solution. To learn how to create an agent, read Create an agent. |
Default environment access | Use the drop-down menu to select the default access for all new and existing users added to the project. For more information, read Environment access. |
- Click Continue.
Depending on the data platform that you selected when creating your project, follow the corresponding instructions below to specify your cloud data warehouse credentials and select your data warehouse defaults for this environment.
Snowflake
Prerequisites
Before configuring a Snowflake connection, you will need:
- A Snowflake role with the privileges required to set up this connection. For more information, read Snowflake role privileges.
- For key-pair authentication, the private key of a key pair. For more information, read Using Snowflake key-pair authentication.
- For Hybrid SaaS solutions, permission to create and edit secrets in AWS Secrets Manager or Azure Key Vault. For more information, read Using Snowflake key-pair authentication.
For details about Snowflake key-pair authentication, read the Snowflake guide to Configuring key-pair authentication.
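If you haven't generated a key pair yet, the commands below sketch the standard OpenSSL workflow; the file names and user name (rsa_key.p8, rsa_key.pub, my_user) are illustrative, and the Snowflake guide above is the authoritative reference.

```bash
# Generate an encrypted private key in PKCS#8 format, which Snowflake expects.
# You will be prompted for a passphrase; keep it for the Passphrase field below.
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc -inform PEM -out rsa_key.p8

# Derive the matching public key.
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

# In Snowflake, register the public key against your user (paste the contents
# of rsa_key.pub without the BEGIN/END header and footer lines):
#   ALTER USER my_user SET RSA_PUBLIC_KEY='MIIBIjANBgkq...';
```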
Specify credentials
Use the reference tables below to set up your environment connection to your cloud data platform. If you're using a Full SaaS deployment, credentials such as passwords and private keys are stored directly as strings. However, if you're using a Hybrid SaaS deployment with your own AWS or Azure agent, credentials such as passwords and private keys are only retrieved via references to secrets created in either AWS Secrets Manager or Azure Key Vault.
Key-pair
We recommend using key-pair authentication to set up your connection to Snowflake, because Snowflake has announced plans to block single-factor password authentication by November 2025. For more information, read our Tech note.
Refer to this table if you're using Snowflake key-pair authentication.
Parameter | Description |
---|---|
Account | Enter your Snowflake account name and region. In the URL you use to log in to Snowflake, this is the part between https:// and .snowflakecomputing.com. |
Credentials type | Select Key pair. |
Username | Your Snowflake username. |
Private key | Your Snowflake private key. To generate a key, read the Snowflake documentation for Generate the private key. The full content of the generated Snowflake private key file must be copied into this field, including the header and footer lines. Field only available if Credentials type is Key pair and you are using a Full SaaS deployment model. |
Passphrase | An optional passphrase to use with your private key. Field only available if Credentials type is Key pair and you are using a Full SaaS deployment model. |
Vault name | For Hybrid SaaS on Azure deployment models only. Select the Azure Key Vault instance that this project will use to store secrets. Select [Default] to use the default key vault specified in the agent environment variables. |
Private key secret name | A named entry created in AWS Secrets Manager or Azure Key Vault denoting the secret that holds your Snowflake private key. Read Using Snowflake key-pair authentication to learn how to store the key as a secret. Field only available if using a Hybrid SaaS deployment model. |
Passphrase secret name (optional) | A named entry created in AWS Secrets Manager or Azure Key Vault denoting the secret that holds your Snowflake key pair passphrase. Field only available if using a Hybrid SaaS deployment model. |
Passphrase secret key (optional) | The secret key tied to your passphrase secret name. Field only available if using a Hybrid SaaS deployment model. |
Note
If your private key has been shared or copied between systems, its format may have been altered. To validate the key and re-emit it in the correct PEM format, run the following command:
```bash
openssl rsa -in key.pem -check
```
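For Hybrid SaaS deployments, the private key is never entered in the Designer; instead, it must exist as a secret that your agent can read. As a rough sketch (the secret, vault, and file names here are placeholders, and Using Snowflake key-pair authentication describes the exact secret format your agent expects), you could create the secret from the command line:

```bash
# AWS Secrets Manager: store the key file as the secret value.
aws secretsmanager create-secret \
    --name snowflake/private-key \
    --secret-string file://rsa_key.p8

# Azure Key Vault: store the key in the vault your agent is configured to use.
az keyvault secret set \
    --vault-name my-key-vault \
    --name snowflake-private-key \
    --file rsa_key.p8
```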
Password
Refer to this table if you're using your Snowflake password to authenticate to Snowflake.
Parameter | Description |
---|---|
Account | Enter your Snowflake account name and region. In the URL you use to log in to Snowflake, this is the part between https:// and .snowflakecomputing.com. |
Credentials type | Select Username and password. |
Username | Your Snowflake username. |
Password | Your Snowflake password. This field is only available if using a Full SaaS deployment; otherwise, you will specify your password as a secret. |
Vault name | For Hybrid SaaS on Azure deployment models only. Select the Azure Key Vault instance that this project will use to store secrets. Select [Default] to use the default key vault specified in the agent environment variables. |
Secret name | A named entry created in AWS Secrets Manager or Azure Key Vault for holding your Snowflake password. Field only available if using a Hybrid SaaS deployment model. |
Secret key | A named secret key tied to your secret name. Field only available if using a Hybrid SaaS on AWS deployment model. |
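On AWS, Secret name and Secret key refer to two different things: the secret is a key-value document, the Secret name identifies that document, and the Secret key identifies the entry inside it that holds your password. A hypothetical example, with both names chosen by you:

```bash
# Creates a secret named "snowflake/login" containing one key-value pair.
# In the environment setup, Secret name = snowflake/login, Secret key = password.
aws secretsmanager create-secret \
    --name snowflake/login \
    --secret-string '{"password":"my-snowflake-password"}'
```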
Select defaults
Parameter | Description |
---|---|
Default role | The default Snowflake role for this environment connection. Read Overview of Access Control to learn more. |
Default warehouse | The default Snowflake warehouse for this environment connection. Read Overview of Warehouses to learn more. |
Default database | The default Snowflake database for this environment connection. Read Database, Schema, and Share DDL to learn more. |
Default schema | The default Snowflake schema for this environment connection. Read Database, Schema, and Share DDL to learn more. |
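Before saving, it can be worth confirming that the user you configured can actually see the role, warehouse, database, and schema you intend to use as defaults. One way to do this, assuming SnowSQL is installed and key-pair authentication is set up as above (the account and user names are illustrative):

```bash
# Shows what the session resolves to for this user; a NULL value suggests
# the corresponding default is missing or not visible to the user.
snowsql -a myaccount.eu-west-1 -u my_user --private-key-path rsa_key.p8 \
    -q "SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_SCHEMA();"
```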
Databricks
Specify credentials
Use the reference tables below to set up your environment connection to your cloud data platform. If you're using a Full SaaS deployment, credentials such as passwords and private keys are stored directly as strings. However, if you're using a Hybrid SaaS deployment with your own AWS or Azure agent, credentials such as passwords and private keys are only retrieved via references to secrets created in either AWS Secrets Manager or Azure Key Vault.
Parameter | Description |
---|---|
Instance name | Your Databricks instance name. Read the Databricks documentation to learn how to determine your instance name. |
Personal Access Token | Your Databricks personal access token. Read the Databricks documentation to learn how to create a personal access token. |
Vault name | For Hybrid SaaS on Azure deployment models only. Select the Azure Key Vault instance that this project will use to store secrets. Select [Default] to use the default key vault specified in the agent environment variables. |
Secret name | A named entry created in AWS Secrets Manager or Azure Key Vault. |
Secret key | For Hybrid SaaS on AWS deployment model only. A named secret key tied to your secret name. |
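A quick way to confirm that the instance name and token are valid before entering them is to call the Databricks REST API directly. A minimal sketch, assuming your token is in the DATABRICKS_TOKEN environment variable:

```bash
# A 200 response with a JSON body confirms the instance name resolves and the
# token authenticates; a 403 usually means the token is wrong or expired.
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
    "https://<your-instance-name>/api/2.0/clusters/list"
```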
Select defaults
Parameter | Description |
---|---|
Endpoint/Cluster | The Databricks SQL warehouse (endpoint) or cluster that Data Productivity Cloud will connect to. |
Catalog | Choose a Databricks Unity Catalog to connect to. |
Schema | Choose a Databricks schema to connect to. |
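To see which Unity Catalog catalogs (and, within one, which schemas) your token can access, you can query the Unity Catalog API. This sketch assumes the same DATABRICKS_TOKEN variable as above and uses jq only for readability:

```bash
# List catalog names visible to this token.
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
    "https://<your-instance-name>/api/2.1/unity-catalog/catalogs" | jq -r '.catalogs[].name'

# List schemas within a given catalog (replace my_catalog).
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
    "https://<your-instance-name>/api/2.1/unity-catalog/schemas?catalog_name=my_catalog" | jq -r '.schemas[].name'
```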
Amazon Redshift
Specify credentials
Use the reference tables below to set up your environment connection to your cloud data platform. If you're using a Full SaaS deployment, credentials such as passwords and private keys are stored directly as strings. However, if you're using a Hybrid SaaS deployment with your own AWS or Azure agent, credentials such as passwords and private keys are only retrieved via references to secrets created in either AWS Secrets Manager or Azure Key Vault.
Parameter | Description |
---|---|
Endpoint | The physical address of the leader node. This will be either a name or an IP address. |
Port | This is usually 5439 or 5432, but it can be configured differently when setting up your Amazon Redshift cluster. |
Use SSL | Select this to encrypt communications between Data Productivity Cloud and Amazon Redshift. Some Amazon Redshift clusters may be configured to require this. |
Username | The username for the environment connection. |
Password | For Full SaaS deployment model only. Your Redshift password. |
Vault name | For Hybrid SaaS on Azure deployment models only. Select the Azure Key Vault instance that this project will use to store secrets. Select [Default] to use the default key vault specified in the agent environment variables. |
Secret name | For Hybrid SaaS deployment model only. A named entry created in AWS Secrets Manager or Azure Key Vault. |
Secret key | For Hybrid SaaS on AWS deployment model only. A named secret key tied to your secret name. |
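Because Amazon Redshift speaks the PostgreSQL wire protocol, you can check the endpoint, port, and SSL behavior with any PostgreSQL client before entering these values. A sketch using psql, with an illustrative endpoint and names:

```bash
# A successful "SELECT 1" confirms the endpoint, port, credentials, and SSL
# requirement all line up with what you plan to enter above.
PGSSLMODE=require psql \
    -h my-cluster.abc123xyz456.eu-west-1.redshift.amazonaws.com \
    -p 5439 -U awsuser -d dev -c "SELECT 1;"
```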
Next, in the Specify AWS cloud credentials dialog, use the drop-down menu to choose one of the following options:
- Use the cloud credentials assigned to the agent you specified when creating this environment.
- Enter different cloud credentials. This will override the IAM role belonging to the agent you specified.
If you choose to enter different cloud credentials, use the fields to enter the cloud credential name, access key ID, and secret access key. For details about access keys, read the AWS documentation.
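If you are unsure whether an access key pair is still valid, you can check it before entering it here. The sketch below assumes the AWS CLI is installed; the key values shown are placeholders:

```bash
# Returns the account and IAM identity the keys belong to; an error such as
# InvalidClientTokenId means the key pair is invalid or deactivated.
AWS_ACCESS_KEY_ID=AKIA................ \
AWS_SECRET_ACCESS_KEY=your-secret-access-key \
    aws sts get-caller-identity
```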
Select defaults
Parameter | Description |
---|---|
Default database | The database you created when setting up your Amazon Redshift cluster. Your cluster may contain multiple databases; choose the one you want to use for this environment. |
Default schema | This is public by default, but if you have configured multiple schemas within your Amazon Redshift database, you should specify the schema you want to use. |
Default S3 bucket | The S3 bucket that this environment will use for staging data by default, unless specifically overridden within a component. |
Note
If you use a Matillion Full SaaS solution, the cloud credentials associated with your environment will be used to access the S3 bucket.
If you use a Hybrid SaaS solution, your new environment will inherit the agent's execution role (service account role) to access the default S3 bucket specified here.
To override this role, associate different cloud credentials with this environment after you have finished creating it. You can create these credentials before or after creating the environment.
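One way to confirm that the agent's role (or the overriding credentials) can actually use the default S3 bucket is a quick read/write/delete round trip. The bucket name below is illustrative, and the commands assume the AWS CLI runs with the same identity the environment will use:

```bash
# List the bucket, write a small test object from stdin, then remove it.
aws s3 ls s3://my-staging-bucket
echo "staging access test" | aws s3 cp - s3://my-staging-bucket/dpc-access-test.txt
aws s3 rm s3://my-staging-bucket/dpc-access-test.txt
```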
Associate cloud provider credentials with an environment
Each environment in your project must have at least one set of cloud credentials associated with it. This allows you to access resources on cloud platforms other than the one hosting your project. For example, if your project is on AWS and you want to access resources in Azure, you need to associate your Azure cloud credentials with the environment.
You can associate credentials from multiple providers, but only one set of credentials for each cloud provider. For example, you can associate both AWS and Azure credentials, but not two different AWS credentials.
You can associate credentials with an environment when you first Create cloud provider credentials, or you can associate them later as follows:
- In your project, click the Environments tab.
- Click the three dots ... on the corresponding row of the environment you want to associate, and select Associate Credentials.
- Select the credentials from the drop-down lists. You can associate one set of credentials for each cloud provider.
- Click Associate.
Manage environments
To view your environments:
- From the Your projects menu, select your project.
- Navigate to the Environments tab.
Note
Click the column headers to sort your environments by name, default agent, cloud data warehouse account name, or credential type.
Edit an environment
- Click the three dots ... in the row of the environment you want to edit.
- Click Edit environment.
Delete an environment
Warning
Deleting an environment permanently removes the environment from your project. All artifacts and schedules in the deleted environment will be inaccessible. This action cannot be undone.
Before you can delete an environment, you must:
- Disable any active schedules that run pipelines in this environment.
- Change the default environment of any branches that currently use this environment as their default. For more information, read Branches.
To delete an environment:
- Click the three dots ... in the row of the environment you want to delete.
- Click Delete environment.
- In the confirmation dialog, enter the name of the environment you want to delete.
- Click Delete environment.