Feature differences in the Data Productivity Cloud

Some components and features need specific treatment, mitigation, or workarounds when migrated. If you use any of the following features, make sure you understand what treatment each one will require.

API trigger

API triggers are supported in the Data Productivity Cloud, but the API works differently. Read the API documentation for further details.

You may need to update your trigger scripts to work with the Data Productivity Cloud API. Evaluate your triggers on a case-by-case basis and update where needed, guided by the API documentation.
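
For example, an updated trigger script might make an HTTPS request to the Data Productivity Cloud API with a bearer token. The following is a minimal sketch, not a definitive implementation: the endpoint URL, payload shape, and token handling shown here are assumptions, so confirm the actual values against the API documentation.

```python
# Minimal sketch of an updated trigger script (Python).
# The endpoint URL, payload, and authentication scheme are assumptions --
# check the Data Productivity Cloud API documentation for the real values.
import os

import requests

API_URL = "https://example.com/dpc/v1/pipeline-executions"  # hypothetical endpoint
API_TOKEN = os.environ["DPC_API_TOKEN"]                     # hypothetical token variable

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"pipelineName": "my-pipeline"},                   # hypothetical payload
    timeout=30,
)
response.raise_for_status()
print(response.json())
```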

Git

Unlike Matillion ETL, where Git integration is an optional feature, the Data Productivity Cloud is built with Git as an integral element, providing pipeline version control and making it simple to collaborate on and manage data pipelines within your team. Read Git in Designer to learn more about this feature of the Data Productivity Cloud.

If you currently use Git in Matillion ETL, we don't recommend that you use the same Git repository for the Data Productivity Cloud. The Data Productivity Cloud won't recognize the format of Matillion ETL files stored in Git. Although it is possible to connect to the same repository, you won't be able to access your previous Git history. The process of migrating to Git in the Data Productivity Cloud should be:

  1. Create a Data Productivity Cloud project using the Connect your own Git repository option.
  2. Connect the project to a new Git repository with your preferred provider.
  3. Migrate jobs that use Git.
  4. Perform any necessary manual changes to the imported pipelines.
  5. Commit and push the migrated pipelines to the Data Productivity Cloud Git repository.

If you don't currently use the optional Git feature within Matillion ETL, you simply need to select which type of Git repository you want to use in the Data Productivity Cloud and configure it prior to migrating your jobs.

OAuths

For security reasons, we do not migrate credentials such as OAuths from Matillion ETL to the Data Productivity Cloud. Any OAuths you have set up in Matillion ETL will have to be recreated manually in the Data Productivity Cloud to allow your pipelines to run. Read OAuth for details.

Secrets

For security reasons, we do not migrate credentials such as secrets and passwords from Matillion ETL to the Data Productivity Cloud. Any secrets or other credentials you have set up in Matillion ETL will have to be recreated manually in the Data Productivity Cloud to allow your pipelines to run. Read Secrets and secret definitions and Cloud provider credentials for details.

Passwords can't be entered directly into Data Productivity Cloud components. This is by design, to enforce security. All passwords must be stored in secrets, which the component references. Secrets are stored in the Data Productivity Cloud secret manager in a Full SaaS environment, or in your own cloud platform's secret manager in a Hybrid SaaS environment.

  1. Create secrets with the credentials that your pipelines will need to connect to third-party services, as shown in the sketch after these steps. Read Secrets and secret definitions and Cloud provider credentials for details.
  2. Update components to point to the secrets you have created.
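
As an illustration of step 1 in a Hybrid SaaS environment on AWS, a secret could first be created in AWS Secrets Manager and then exposed to pipelines through a secret definition in the Data Productivity Cloud. This is a minimal sketch with placeholder names and values; in a Full SaaS environment you would create the secret in the Data Productivity Cloud secret manager instead.

```python
# Minimal sketch: storing a third-party password in AWS Secrets Manager
# for a Hybrid SaaS project. The secret name and value are placeholders.
import boto3

client = boto3.client("secretsmanager", region_name="us-east-1")

response = client.create_secret(
    Name="dpc/salesforce/password",           # hypothetical secret name
    SecretString="my-third-party-password",   # placeholder value
)
print(response["ARN"])  # reference this secret in a Data Productivity Cloud secret definition
```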

Webhook and queue triggers for pipelines

The Data Productivity Cloud doesn't natively support triggering pipelines from a webhook or a queue such as SQS or Azure Queue. However, the Data Productivity Cloud architecture shouldn't suffer from the internal queuing, scaling, or availability limitations that can make a queuing solution necessary in Matillion ETL, so triggering from a webhook or queue is unnecessary in most scenarios.

We recommend using the Data Productivity Cloud API for running pipelines directly.

If you need to integrate the Data Productivity Cloud with an existing system based on webhooks or queues (such as triggering a pipeline when a file lands in an S3 bucket), we recommend using AWS Lambda or Azure Functions to implement an API call based on an event.
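
For example, an AWS Lambda function subscribed to S3 event notifications could call the Data Productivity Cloud API whenever a file lands in the bucket. The handler below is a minimal sketch under stated assumptions: the endpoint URL, request payload, and token are hypothetical and read from Lambda environment variables, and should be replaced with the values described in the API documentation.

```python
# Minimal sketch: an AWS Lambda handler that triggers a Data Productivity Cloud
# pipeline run when a file lands in an S3 bucket. The endpoint, payload shape,
# and token handling are assumptions -- consult the API documentation.
import json
import os
import urllib.request

API_URL = os.environ["DPC_API_URL"]      # hypothetical pipeline-execution endpoint
API_TOKEN = os.environ["DPC_API_TOKEN"]  # hypothetical bearer token


def lambda_handler(event, context):
    # Extract the bucket and object key from the S3 event notification so they
    # can be passed to the pipeline, for example as variable values.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = json.dumps({"variables": {"bucket": bucket, "key": key}}).encode("utf-8")
    request = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return {"statusCode": response.status}
```

An equivalent approach in Azure would use an Azure Functions Blob Storage or Queue Storage trigger to make the same API call.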

Variables

Read the following articles to understand how variables will be migrated to the Data Productivity Cloud:

Components

Some components need specific treatment when migrated. The following articles describe this in detail: