Cloud credentials

You need credentials to connect to certain destinations. AWS, Azure, and GCP each require valid user credentials to successfully create a destination. You can view and create saved credentials in Data Loader by clicking Manage → Cloud credentials. Credentials are organized in tabs according to their respective service.

The following guide explains how to view and manage your cloud credentials in Data Loader, and provides a basic example of how to create them.

Edit credentials by clicking the pencil icon next to a credential. You can permanently delete a set of credentials by clicking the corresponding trashcan icon.


Create cloud credentials

To create cloud credentials in Data Loader:

  1. Click Manage → Cloud Credentials.
  2. Click the AWS, Azure, or GCP tab.
  3. If the tab has no existing credentials, click Add {cloud provider} credentials; otherwise, click Add new {cloud provider} credential.
  4. Use the appropriate section below to complete the credential creation form for your cloud provider.

Warning

You can't create credentials as part of the destination configuration process. You will need to create them before you create a pipeline.


AWS

AWS credentials are needed for Data Loader to access different services, such as Amazon S3 staging buckets, and for listing Amazon Redshift clusters.

The instance credential is the Matillion credential used to encrypt and decrypt passwords via the AWS Key Management Service (KMS). Data Loader uses credentials to access Amazon S3.

For AWS credentials, you are required to enter the following for an existing AWS IAM user:

  • AWS credential label: Give your set of credentials a clear and descriptive name. This is how they're referred to throughout Data Loader.
  • Access Key ID: Enter the required Access Key ID from your AWS account.
  • Secret Access Key: Enter the required Secret Access Key from your AWS account.
  • Region: Choose the AWS region that corresponds to your existing AWS resources (and/or where you will add new resources if none currently exist). If the region does not match, an error will occur.

Note

For credentials that you created in Data Loader before August 2023, the region will be inferred as the region that the Hub account is in. If users edit their credentials, they will be asked to explicitly add a region.

Read AWS security credentials for more information about access and secret access keys.

Click Test credentials to confirm that the details are correct at any point. Click Test and save to test the credentials and return to the Manage Cloud Credentials page. If successful, your credentials will be saved and stored in Data Loader. If unsuccessful, you will not be able to continue.

The AWS IAM user credentials must have the following permissions to allow Data Loader to stage data to the S3 bucket:

Policy: An appropriate policy name
Permissions:
  • s3:DeleteObject
  • s3:GetObject
  • s3:PutObject
  • s3:ListAllMyBuckets
  • s3:ListBucket
  • s3:GetBucketLocation
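As a sketch, the permissions above can be collected into an IAM policy document like the following. This is an illustrative example, not Matillion's official policy: the statement ID and the staging bucket name are placeholders you would replace with your own values.

```python
import json

# Assumption: replace with the name of your own S3 staging bucket.
STAGING_BUCKET = "my-staging-bucket"

# Hypothetical IAM policy document granting Data Loader the S3 staging
# permissions listed above.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DataLoaderStaging",  # placeholder statement ID
            "Effect": "Allow",
            "Action": [
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListAllMyBuckets",
                "s3:ListBucket",
                "s3:GetBucketLocation",
            ],
            "Resource": [
                f"arn:aws:s3:::{STAGING_BUCKET}",
                f"arn:aws:s3:::{STAGING_BUCKET}/*",
            ],
        }
    ],
}

print(json.dumps(policy, indent=2))
```

Note that a stricter policy could split `s3:ListAllMyBuckets` into a separate statement scoped to `Resource: "*"`, since it is an account-level permission rather than a bucket-level one.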

Azure

For Microsoft Azure credentials, the relevant information can be found in the following places while logged into your Azure portal:

  • Azure credential label: Give your set of credentials a clear and descriptive name. This is how they're referred to throughout Data Loader.
  • Tenant ID: (AKA Directory ID) Azure Active Directory → Properties → Directory ID.
  • Client ID: (AKA Application ID) Azure Active Directory → App Registrations → Registered App → Application ID.
  • Secret Key: Azure Active Directory → App Registrations → Registered App → Settings → Keys.

Click Test credentials to confirm that the details are correct at any point. Click Test and save to test the credentials and return to the Manage Cloud Credentials page. If successful, your credentials will be saved and stored in Data Loader. If unsuccessful, you will not be able to continue.

For more information about Azure credentials, read What is Azure Active Directory?.

Azure users should be assigned a Storage Blob Data Contributor role with the following Actions and DataActions included:

Role: Storage Blob Data Contributor
Actions:
  • Microsoft.Storage/storageAccounts/blobServices/containers/delete
  • Microsoft.Storage/storageAccounts/blobServices/containers/read
  • Microsoft.Storage/storageAccounts/blobServices/containers/write
DataActions:
  • Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete
  • Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read
  • Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write
  • Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action
  • Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action
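For reference, the relevant entries of that role can be sketched in Azure's role-definition JSON shape. This is an illustrative sketch of the built-in role's permissions listed above; the subscription ID in AssignableScopes is a placeholder.

```python
import json

# Sketch of the relevant portion of a role definition matching the
# Storage Blob Data Contributor permissions listed above.
role_definition = {
    "Name": "Storage Blob Data Contributor",
    "IsCustom": False,
    "Actions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/delete",
        "Microsoft.Storage/storageAccounts/blobServices/containers/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/write",
    ],
    "DataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action",
    ],
    # Placeholder subscription ID; replace with your own scope.
    "AssignableScopes": ["/subscriptions/00000000-0000-0000-0000-000000000000"],
}

print(json.dumps(role_definition, indent=2))
```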

GCP

Google Cloud Platform credentials are required to access Google BigQuery and to use Google Cloud Storage for staging.

For GCP credentials, you are required to enter the following:

  • GCP credential label: Give your set of credentials a clear and descriptive name. This is how they're referred to throughout Data Loader.
  • Access Key ID: Upload a JSON file no larger than 1 MB containing your Service Account File. You can drag-and-drop the file directly into the entry field or click the field to open a file browser.
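Before uploading, you can sanity-check the key file locally. The following is a minimal sketch, assuming a standard GCP service account key file; the field names checked (`type`, `project_id`, `private_key`, `client_email`) are those found in such a file, and the 1 MB limit mirrors the upload limit above.

```python
import json
import os

MAX_SIZE = 1 * 1024 * 1024  # mirrors Data Loader's 1 MB upload limit


def check_service_account_file(path: str) -> None:
    """Sanity-check a GCP service account key file before upload."""
    if os.path.getsize(path) > MAX_SIZE:
        raise ValueError("File exceeds the 1 MB upload limit")
    with open(path) as fh:
        key = json.load(fh)  # raises if the file is not valid JSON
    # Fields present in a standard service account key file.
    for field in ("type", "project_id", "private_key", "client_email"):
        if field not in key:
            raise ValueError(f"Missing expected field: {field}")
    if key["type"] != "service_account":
        raise ValueError("Not a service account key file")
```

If the function returns without raising, the file is at least structurally plausible; it does not verify that the key itself is valid or active.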

Click Test credentials to confirm that the details are correct at any point. Click Test and save to test the credentials and return to the Manage Cloud Credentials page. If successful, your credentials will be saved and stored in Data Loader. If unsuccessful, you will not be able to continue.

The GCP service account used by Data Loader must have the following permissions to stage data to a Google Cloud Storage bucket:

Role: Storage Admin
Permissions: storage.buckets.*