Troubleshooting common Designer errors

This guide lists and explains some common error messages that you may encounter in Designer. Where applicable, it also explains how to resolve the underlying issue. If you still see the same error message after following the resolution steps, raise a Support ticket. For more information about Support tickets, read Getting support.

If you encounter an error that you think we should add to this guide, contact Matillion Support to let them know.

Pipeline errors

These errors are related to your pipelines themselves, rather than the components they contain.

Column [Column name] is of type [data type] and cannot have null entries

This error message appears if a column is expected to contain entries with the specified data type, but contains one or more null values. This could be because your source data contains null values, there is a mismatch between the data types in your source data and pipeline configuration, or a transformation component has inadvertently added null values to your data.

To resolve this issue:

  1. Check your source data and clean it to remove any null values in this column. This might involve deleting rows or adding appropriate values.
  2. If there aren't any null values in your source data, find the component where this column is defined and check that the data type is correct.
  3. If the data type is correct, check the configuration of your transformation components to make sure that a component isn't adding null values in this column.

If none of these steps work, you can add a component in your transformation pipeline to handle null values in the specified column, for example by replacing any null values with a default value of the correct data type.
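
If you prefer to clean the source data before it reaches the pipeline, the following minimal Python sketch shows the same idea using pandas. The file and column names are placeholders for illustration:

```python
import pandas as pd

# Placeholder file and column names for illustration.
df = pd.read_csv("source_data.csv")

# Report how many null entries the column contains.
print("Null entries in Quantity:", df["Quantity"].isna().sum())

# Replace nulls with a default value of the correct data type, then
# enforce the type so the column matches the pipeline schema.
df["Quantity"] = df["Quantity"].fillna(0).astype("int64")

df.to_csv("source_data_clean.csv", index=False)
```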

Expected: [Column name]:[Data type] Actual: [Column name]:[Data type]

This error message appears if the expected data type of a column does not match its actual data type. For example, this message might appear as Expected: Quantity:VARCHAR(40) Actual: Quantity:VARCHAR(2000). This could be because the source data schema is different to the expected schema in your pipeline, the column type is set incorrectly, or a transformation component is changing the data type.

To resolve this issue:

  1. Find the component where the column is defined and check the column's data type. If it is not correct, update the data type.
  2. If the data type is correct, check the schema of your source data. If it is different to your pipeline configuration, change your pipeline configuration accordingly.
  3. If the schema is the same as your pipeline configuration, check the configuration of your transformation components to make sure that a component isn't changing the column's data type.
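
For example, if the error is Expected: Quantity:VARCHAR(40) Actual: Quantity:VARCHAR(2000), you can check whether the source values actually fit the declared width. Here is a hedged pandas sketch, where the file, column name, and declared length are placeholders:

```python
import pandas as pd

# Placeholder file, column name, and declared width for illustration.
DECLARED_LENGTH = 40  # the pipeline expects Quantity:VARCHAR(40)

df = pd.read_csv("source_data.csv")

# Measure the longest value actually present in the source column.
max_length = int(df["Quantity"].astype(str).str.len().max())
print(f"Declared: VARCHAR({DECLARED_LENGTH}), longest source value: {max_length}")

if max_length > DECLARED_LENGTH:
    print("Widen the column definition or trim the source values.")
```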

Failed to find pipeline with name [Pipeline name]

This error message appears if the Data Productivity Cloud tries to run a pipeline, but cannot find a pipeline with the specified name. This could be because you have spelled the pipeline name incorrectly, the pipeline has been renamed or deleted, or the pipeline is not in the specified location.

To resolve this issue:

  1. Check the pipeline name to ensure that it is spelled correctly.
  2. If the name is spelled correctly, check that the pipeline has not been renamed or deleted. If the pipeline has been renamed, change the component configuration to match. If the pipeline has been deleted, you may need to rebuild it.
  3. If the pipeline has not been renamed or deleted, check that it is in the correct location. If the pipeline is in a different location, change the component configuration to match.

Run pipeline: Something went wrong running your pipeline, please try again

This error message appears if you have an orchestration pipeline and a transformation pipeline with the same name, and the orchestration pipeline contains a Run transformation component to run the transformation pipeline.

To resolve this issue, rename either pipeline and adjust the component configuration if necessary.

Note

We recommend using a naming convention for your pipelines to avoid this issue. For example, you could use O_social_media_data for an orchestration pipeline that loads your data and T_social_media_data for a transformation pipeline that transforms this data.

SQL compilation error: Database [Database name] does not exist or not authorized

This error message appears if the database you are trying to access does not exist or you do not have the required permissions to access this database. As a result, your pipeline cannot compile the SQL query. This could be because the database name is configured incorrectly, the database does not exist yet, or your permissions are not correct.

To resolve this issue:

  1. Check that the database exists and that the database name is correct in your database system.
    • If the database doesn't exist in your database system, create the database.
    • If the database name is different in your database system, update your pipeline configuration to match.
  2. If the database exists and the name is correct, check that your user account has the permissions required to access this database.
  3. If your permissions are correct, check the database name, username and password in your Data Productivity Cloud connection settings. If any of these details are incorrect, update them.
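
You can verify steps 1 and 2 directly against Snowflake. This is a minimal sketch using the snowflake-connector-python package; the connection details and database name are placeholders:

```python
import snowflake.connector

# Placeholder connection details and database name for illustration.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
)
cur = conn.cursor()

# Step 1: does a database with this exact name exist, and is it visible?
cur.execute("SHOW DATABASES LIKE 'ANALYTICS'")
print(cur.fetchall())  # An empty list means it doesn't exist or isn't visible.

# Step 2: which roles have been granted access to it?
cur.execute("SHOW GRANTS ON DATABASE ANALYTICS")
print(cur.fetchall())

conn.close()
```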

Validation failed due to parameter errors

This error message appears if the Data Selection property contains options that Designer doesn't recognize. The error message will contain details of the options that have not been recognized. Possible reasons include typos in the option names, options that are not valid for this component, or options that have been deprecated.

To resolve this issue, check that all the options you have entered or selected in the Data Selection property are valid and correct. You may find it useful to check our component documentation and the data model for the component, if applicable.


Authorization errors

These errors relate to your authorization methods, account permissions, and credentials.

Authorization permission mismatch: Signature did not match

The Authorization permission mismatch error message appears if you try to perform an action, but do not have the required permissions. The Signature did not match part of the message indicates a mismatch between the signature generated by your Software Development Kit (SDK) and the signature expected by the server. This can happen because your account does not have the required permissions, the string your SDK uses to generate the signature is incorrect, or your credentials are configured incorrectly.

To resolve this issue:

  1. Check your account permissions:
    1. Log in to the Azure portal.
    2. Go to the resource where the action is being performed.
    3. Check the access control settings and make sure that your account or service principal has the required permissions for the action you are trying to perform.
    4. Update or grant permissions if required.
  2. If your account permissions are correct, make sure the signatures match:
    1. Enter 'Azure-Storage-Log-String-To-Sign': true into the relevant method call. For a SAS token, this is the generateSas method call.
    2. Compare the logged signature to the signature generated by your SDK.
    3. If they do not match, correct the signature.
    4. If this resolves the issue, we recommend disabling 'Azure-Storage-Log-String-To-Sign' afterwards to avoid potentially exposing sensitive information.
  3. If the signatures match but the issue is not resolved:
    1. Generate a new StorageSharedKeyCredential or SAS token.
    2. Update your credentials in the Data Productivity Cloud.
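
For step 3, this is a minimal sketch for generating a new SAS token with the azure-storage-blob Python package; the account name, key, container, and blob are placeholders:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder account details for illustration.
sas_token = generate_blob_sas(
    account_name="mystorageaccount",
    container_name="mycontainer",
    blob_name="myblob.csv",
    account_key="<storage-account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# Paste the new token into your Data Productivity Cloud credentials.
print(sas_token)
```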

You are not authorized for this operation

This error message appears if you try to perform an action, but do not have the required permissions. As a result, your pipeline fails when trying to access data. This could be because the credentials you have provided are invalid, your user account doesn't have the required permissions, or your authentication token has expired or is invalid.

To resolve this issue:

  1. Check that your credentials (your username and password, or your token) are entered correctly.
  2. If your credentials are correct, log in to the corresponding user account and check that it has the required permissions for the action you are trying to perform.
  3. If you are using an authentication token, generate a new authentication token and update the component configuration to match.
  4. If applicable, check that the API endpoint you are trying to access is correct and that you have the required permissions to access it.
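
For step 4, a quick check with Python's requests library can help distinguish an invalid token from missing permissions. The endpoint and token here are placeholders:

```python
import requests

# Placeholder endpoint and token for illustration.
response = requests.get(
    "https://api.example.com/v1/resource",
    headers={"Authorization": "Bearer <your-token>"},
    timeout=30,
)

# 401 usually means the token is invalid or expired; 403 usually means
# the token is valid but the account lacks the required permissions.
print(response.status_code, response.reason)
```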

Git errors

These errors relate to Git actions in Designer.

"Something went wrong" when merging changes into a branch

This error message may appear if there are uncommitted changes on the branch that you are trying to merge changes into.

To resolve this issue:

  1. Open the branch that you want to merge changes into.
  2. In the branch menu, click Commit changes.
  3. Go back to your original branch and commit any uncommitted changes.
  4. Merge the changes into your chosen branch.

Component errors

These errors relate to specific component properties or configurations.

Azure Blob: Stream is already closed

This error message appears if your permissions for accessing Azure storage accounts and blobs have not been defined correctly.

To resolve this issue, make sure that all your Azure app registrations for the Data Productivity Cloud contain all the required roles and permissions listed in our Roles and permissions for Cloud Storage documentation.

Google Ads Query: Error retrieving data with a Manager account

This error message can appear if your Google Ads Query component is trying to retrieve data using a Manager account with multiple sub-accounts.

To resolve this issue:

  1. Click the Connection Options field and create a parameter for your ManagerID. For more information about the ManagerID, read the Google Ads Query data model.
  2. If you are using multiple client customer IDs, make sure that you have entered them in a comma-delimited list in the Client Customer ID field.

Table Input: Table Input is missing column names field

This error message appears if the Column Names property of your Table Input component has not been configured correctly. This means that the Table Input component doesn't know which columns to load from your table. This may be because no columns were selected in this property, the pipeline is configured incorrectly, or the table schema has changed since the Table Input component was configured.

To resolve this issue:

  1. Check that you have selected the correct columns to load in the Column Names property of your Table Input component.
  2. If the correct columns are selected, validate the pipeline to make sure that all components before the Table Input component are configured correctly.

    Note

    You might also want to check the configuration of any pipelines that write to the relevant table, to make sure that your data is being loaded into the table correctly.

  3. Check the schema of the table you are trying to load from. If the schema has changed, update the Table Input component with the new column names.
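
For step 3, you can inspect the table's current schema directly in your warehouse. Here is a minimal sketch for a Snowflake project using the snowflake-connector-python package; the connection details and table name are placeholders:

```python
import snowflake.connector

# Placeholder connection details and table name for illustration.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# List the table's current columns and types, then compare them with
# the Column Names property of the Table Input component.
cur.execute("DESCRIBE TABLE SALES")
for name, data_type, *_ in cur.fetchall():
    print(name, data_type)

conn.close()
```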


Databricks errors

These errors relate to your Databricks cloud data warehouse.

Failure occurred when interacting with third party service 'Databricks'

This error message appears if your pipeline encounters a ConcurrentAppendException when interacting with Databricks. This happens when your pipeline and another process or user simultaneously perform operations to update the same data partition, causing a conflict.

To resolve this issue, retry the failed operation. You can automatically retry some failed tasks by adjusting the Retry option in the task properties.
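
If you retry in your own code instead, a minimal sketch looks like this. It assumes a Databricks environment with the delta-spark package installed, and write_batch is a hypothetical function standing in for the failing operation:

```python
import time

from delta.exceptions import ConcurrentAppendException

MAX_RETRIES = 5

def run_with_retries(write_batch):
    """Retry a Delta operation with exponential backoff on append conflicts."""
    for attempt in range(MAX_RETRIES):
        try:
            return write_batch()  # The operation that raised the conflict.
        except ConcurrentAppendException:
            if attempt == MAX_RETRIES - 1:
                raise
            # Back off so the competing writer can finish its commit.
            time.sleep(2 ** attempt)
```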

To prevent conflicts like this from happening in the future, we recommend improving concurrency control using the following methods:

  • Configure your Databricks environment to support transactions, then wrap your update operations in transactions to maintain data integrity.
  • Adjust your data partitions to minimize concurrent access to the same partitions, and use finer-grained partitions to reduce the likelihood of conflicts.
  • Set up an alert for ConcurrentAppendException events in Databricks to notify you if this occurs again.
  • Review your data access patterns and adjust your scheduling to avoid any recurring conflicts.
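
As an illustration of the partitioning advice above, this hedged PySpark sketch restricts an overwrite to a single partition with replaceWhere, so writers working on different partitions no longer conflict. The paths, column name, and date value are placeholders:

```python
# Assumes an existing SparkSession named `spark`, for example in a
# Databricks notebook or job; paths and column names are placeholders.
df = spark.read.format("delta").load("/tmp/staging/sales_updates")

(
    df.write.format("delta")
    .mode("overwrite")
    # Only rewrite the one partition this batch touches, so concurrent
    # writers on other dates don't raise ConcurrentAppendException.
    .option("replaceWhere", "event_date = '2024-01-01'")
    .save("/tmp/tables/sales")
)
```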

Snowflake errors

These errors relate to your Snowflake cloud data warehouse.

Component failure: Failed to connect to Snowflake: JDBC driver encountered communication error

This error message appears if your pipeline cannot connect to Snowflake because of a communication error. This could be because the hostnames and port numbers are blocked by a firewall, there are temporary problems with Snowflake services, or a network configuration issue has occurred.

To resolve this issue:

  1. Check the Snowflake service status page. If any issues are reported here, wait until they are resolved and then re-run your pipeline.
  2. If there are no Snowflake service issues, make sure that the hostnames and port numbers are not blocked by your firewall settings:
    1. Find the hostnames and port numbers required for connecting to Snowflake. You can retrieve these by calling the SYSTEM$ALLOWLIST function in Snowflake.
    2. Add these hostnames and port numbers to your firewall's allowed list.
  3. If your firewall settings are correct, fix any network issues:
    1. Check your network configuration to make sure there are no restrictions or blocks on the required ports.
    2. If you are using a VPN or proxy, make sure that it is configured correctly.
    3. Test your connection to Snowflake using a diagnostic tool.
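
For the final diagnostic step, Snowflake's SnowCD tool is one option. Another is a minimal round-trip test with the snowflake-connector-python package; the connection details are placeholders:

```python
import snowflake.connector

# Placeholder connection details for illustration.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
)
cur = conn.cursor()

# A successful round trip confirms basic network connectivity.
cur.execute("SELECT CURRENT_VERSION()")
print(cur.fetchone())

# List the hostnames and ports your firewall must allow. Run this from
# a network that can already connect, then apply the list to the one
# that is blocked.
cur.execute("SELECT SYSTEM$ALLOWLIST()")
print(cur.fetchone())

conn.close()
```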