Migration: Bash scripts

Bash scripts aren't supported in the Data Productivity Cloud, as there is no underlying Linux virtual machine to provide the compute needed to run Bash.


Migration path

Any Bash processes will need refactoring to run in a Data Productivity Cloud pipeline. Our recommendations are (in order of preference):

  1. See if the functionality has a native component equivalent, such as the Print Variables component.
  2. See if the workload can be written as a Python script using the Python Pushdown component. This option is only available for Snowflake environments (a refactoring sketch follows this list).
  3. See if the workload can be written as a Python script using the Python Script component. This option is only available for Hybrid SaaS deployments.
  4. Use the Bash Pushdown component, and supply your own Linux machine that the Data Productivity Cloud can connect to over SSH. Advantages of this approach are:
    • You can set the CPU and memory on the Linux VM as needed.
    • You can install any packages or third-party applications you need on the Linux VM.
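To illustrate options 2 and 3, here is a minimal sketch of a typical Bash workload, downloading a file and counting its lines (roughly `curl -s URL -o FILE && wc -l FILE`), rewritten as a Python script using only the standard library. The URL and file path are hypothetical placeholders, not values from the Data Productivity Cloud:

    import urllib.request

    SOURCE_URL = "https://example.com/export.csv"  # hypothetical endpoint
    LOCAL_PATH = "/tmp/export.csv"                 # hypothetical working path

    # Download the file (replaces the curl call in the Bash version).
    urllib.request.urlretrieve(SOURCE_URL, LOCAL_PATH)

    # Count the lines (replaces wc -l).
    with open(LOCAL_PATH, "r", encoding="utf-8") as handle:
        line_count = sum(1 for _ in handle)

    print(f"{LOCAL_PATH}: {line_count} lines")

Because the sketch sticks to the standard library, the same script should run unchanged in either the Python Pushdown or the Python Script component.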

However, if you choose the Bash Pushdown option, consider the following:

  • You need to set up, secure, update, and manage the Linux machine yourself.
  • You need network access from the Data Productivity Cloud agent to the compute source (see the connectivity sketch below).
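Before committing to Bash Pushdown, it can be worth confirming that the agent's network can reach your Linux machine on the SSH port. A minimal Python sketch of that check, using a hypothetical hostname and run from a host on the agent's network:

    import socket

    HOST = "my-bash-vm.internal.example.com"  # hypothetical Linux VM hostname
    PORT = 22                                 # default SSH port

    # Attempt a TCP connection to confirm the SSH port is reachable.
    try:
        with socket.create_connection((HOST, PORT), timeout=5):
            print(f"{HOST}:{PORT} is reachable")
    except OSError as err:
        print(f"Cannot reach {HOST}:{PORT}: {err}")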

If none of these options suit your use case, reach out to your Matillion Account Manager to discuss alternatives.

Automatic variables

The Data Productivity Cloud doesn't support directly accessing automatic variables through the Bash Script component.

If you require this functionality, you can use an Update Scalar component to write the values to user-defined variables, which can then be passed to the script.
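As a minimal sketch of the script side of that workaround, assuming an upstream Update Scalar component has populated a user-defined variable named run_timestamp (a hypothetical name) and that pipeline variables are exposed to the script by name:

    # run_timestamp is a hypothetical user-defined variable, assumed to have
    # been populated by an Update Scalar component upstream and injected into
    # the script by name. The fallback below only makes the sketch runnable
    # outside a pipeline; inside one, the variable would already be set.
    run_timestamp = globals().get("run_timestamp", "<not set>")

    print(f"Pipeline run started at: {run_timestamp}")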