
MySQL

This page describes how to configure a MySQL data source. With Data Loader, you can replicate and load your source data into your target destination.

:::info{title='Note'} Tables with defined primary key columns load data incrementally into the target table once the pipeline is configured. If no primary key columns have been defined for a table, the target table is truncated and reloaded on each run. :::

Schema Drift Support: Yes. Read Schema Drift to learn more.

Return to any page of this wizard by clicking Previous.

Click X in the upper-right of the UI and then click Yes, discard to close the pipeline creation wizard.


Prerequisites

  • Read the Allowed IP addresses topic before you begin. You may not be able to connect to certain data sources without first allowing the Batch IP addresses. In these circumstances, connection tests will always fail and you will not be able to complete the pipeline.
  • Your MySQL database server must be running.
  • Enable TCP/IP protocols with the TCP port set to 3306.
  • You must have access to your MySQL database host's IP address or domain.
  • Make sure the MySQL database user has SELECT privileges. A basic reachability check is sketched after these prerequisites.

:::info{title='Note'} You must have permission to access your MySQL database resources. :::
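
Before creating the pipeline, you may want to confirm that the database server is running and reachable on its TCP port from a machine on the allowed network path. The following is a minimal sketch, assuming a placeholder host name and the default port 3306; it only checks TCP reachability, so swap in a real client or a SELECT against one of your source tables to confirm credentials and privileges.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class MySqlReachabilityCheck {
    public static void main(String[] args) {
        // Placeholder values: replace with your MySQL host's IP address or domain.
        String host = "mysql.example.com";
        int port = 3306; // Default MySQL TCP port.

        try (Socket socket = new Socket()) {
            // Fails quickly if the server is down, TCP/IP is disabled, or a firewall blocks the port.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("TCP connection to " + host + ":" + port + " succeeded.");
        } catch (IOException e) {
            System.out.println("Could not reach " + host + ":" + port + ": " + e.getMessage());
        }
    }
}
```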


Create pipeline

  1. In Data Loader, click Add pipeline.
  2. Choose MySQL from the grid of data sources.
  3. Choose Batch Loading.

Connect to MySQL

Configure the MySQL database connection settings, specifying the following:

| Property | Description |
|----------|-------------|
| Server address | The URL required to connect to the MySQL database server. |
| Port | The port number required to connect to the MySQL database server. The default value is 3306. |
| Database name | The name of the MySQL database you want to connect to. |
| Username | A valid login username for the MySQL database server. |
| Password | A managed entry representing your MySQL database login password. Choose an existing password from the dropdown menu, or click Manage and then Add new password to configure a new managed password entry. Give the password a label, which is what appears in the password dropdown menu, and then enter the password value. Read Manage Passwords to learn more. |
| Advanced settings | Additional JDBC parameters or connection settings. Click Advanced settings, choose a parameter from the dropdown menu, and enter a value for the parameter. Click Add parameter for each extra parameter you want to add. Read Configuration Properties for a reference of MySQL connection options. For a list of compatible connection properties, read Allowed connection properties. |
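
These settings correspond to standard MySQL JDBC connection options. Purely as an illustration of how they fit together (not of Data Loader's internals), the sketch below connects with the MySQL Connector/J driver on the classpath, using placeholder values throughout; each Advanced settings entry becomes an extra name/value connection property.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class MySqlConnectionSketch {
    public static void main(String[] args) throws Exception {
        // Server address, port, and database name form the JDBC URL (placeholder values).
        String url = "jdbc:mysql://mysql.example.com:3306/sales_db";

        Properties props = new Properties();
        props.setProperty("user", "dataloader_user");   // Username
        props.setProperty("password", "your-password"); // Value of the managed password entry

        // An Advanced settings parameter is a name/value pair, for example the
        // Connector/J connectTimeout option (milliseconds).
        props.setProperty("connectTimeout", "10000");

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```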

Click Test and Continue to test your settings and move forward. You can't continue if the test fails for any reason.


:::error{title='Error'} Zeroed dates can cause pipelines to fail with an error such as:

Value '0000-00-00' can't be represented as java.sql.Date

or:

Value '0000-00-00 00:00:00' can't be represented as java.sql.Timestamp

You can resolve this error by selecting the zeroDateTimeBehavior parameter under Advanced settings and assigning it a value of convertToNull. :::
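
In Data Loader you add this under Advanced settings. Purely for illustration, the equivalent setting on a raw JDBC URL looks like the sketch below (placeholder host and database); the parameter name and value follow the guidance above.

```java
public class ZeroDateWorkaroundSketch {
    public static void main(String[] args) {
        // zeroDateTimeBehavior=convertToNull tells the driver to return zeroed dates as NULL
        // instead of raising the java.sql.Date / java.sql.Timestamp conversion errors shown above.
        String url = "jdbc:mysql://mysql.example.com:3306/sales_db?zeroDateTimeBehavior=convertToNull";
        System.out.println(url);
    }
}
```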


Choose tables

Choose any tables you wish to include in the pipeline. Use the arrow buttons to move tables to the Tables to extract and load listbox and then reorder any tables with click-and-drag. Additionally, select multiple tables using the SHIFT key.

Click Continue with X tables to move forward.


Configure columns

Choose the columns from each table to include in the pipeline. By default, Data Loader selects all columns from a table.

Click Configure on a table to open Configure table. This dialog lists columns in a table and the data type of each column. Additionally, you can set a primary key and assign an incremental column state to a column.

:::info{title='Note'}

  • Primary Key columns should represent a true PRIMARY KEY that uniquely identifies each record in a table. Composite keys are supported, but you must specify all columns that make up the key. Duplicate records, as determined by the primary key, aren't permitted. Jobs may fail or replicate data incorrectly if these rules aren't followed.
  • Make sure an Incremental column is a true change data capture (CDC) column that can identify whether there has been a change for each record in the table. This column should be a TIMESTAMP/DATE/DATETIME type or an INTEGER type representing a date key or UNIX timestamp (see the sketch after this note). :::
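
Data Loader performs incremental extraction for you; the sketch below only illustrates the kind of delta query a suitable incremental column makes possible. It assumes a hypothetical orders table with an updated_at TIMESTAMP column, placeholder connection details, and the MySQL Connector/J driver on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;

public class IncrementalColumnSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://mysql.example.com:3306/sales_db"; // Placeholder connection details.
        Timestamp lastSync = Timestamp.valueOf("2024-01-01 00:00:00"); // High-water mark from the previous run.

        try (Connection conn = DriverManager.getConnection(url, "dataloader_user", "your-password");
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT id, status, updated_at FROM orders WHERE updated_at > ?")) {
            stmt.setTimestamp(1, lastSync);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    // Only rows changed since the last sync come back, which is exactly
                    // what a true CDC column makes possible.
                    System.out.println(rs.getLong("id") + " changed at " + rs.getTimestamp("updated_at"));
                }
            }
        }
    }
}
```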

Click Add and remove columns to modify a table before a load. Use the arrow buttons to move columns out of the Columns to extract and load listbox. Order columns with click-and-drag. Select multiple columns using SHIFT.

Click Done adding and removing to continue and then click Done.

Click Continue once you have configured each table.


Choose destination

  1. Choose an existing destination or click Add a new destination.
  2. Select a destination from Snowflake, Amazon Redshift, or Google BigQuery.

Set frequency

| Property | Description |
|----------|-------------|
| Pipeline name | A descriptive label for your pipeline. This is how the pipeline appears on the pipeline dashboard and how Data Loader refers to the pipeline. |
| Sync every | The frequency at which the pipeline should sync. Day values range from 1 to 7, hour values from 1 to 23, and minute values from 5 to 59. The value you enter is also the length of the delay before the first sync. |

Currently, you can't specify a start time.

Once you are happy with your pipeline configuration, click Create pipeline to complete the process and add the pipeline to your dashboard.


Troubleshooting

If you get an error stating "Communications link failure", first check that your credentials are correct. There is also a known issue regarding TLS support. As a workaround, you can set connection options by editing the pipeline, but consider these carefully and in conjunction with your local security guidelines:

  1. Set useSSL to false, or
  2. Set enabledTLSProtocols to a supported TLS version (such as TLSv1.2 or above). This option is only available if your MySQL instance is version 8.0.8 or above. A sketch of both options appears after this list.

    Note

    If you're using an AWS Aurora MySQL database, read Using TLS with Aurora MySQL DB clusters and check your MySQL version against a supported TLS version before setting the enabledTLSProtocols JDBC parameter.
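
In Data Loader these options are entered as Advanced settings parameters when you edit the pipeline. As a rough, hypothetical illustration of the same two workarounds expressed on a JDBC URL (placeholder host and database):

```java
public class TlsWorkaroundSketch {
    public static void main(String[] args) {
        // Option 1: disable SSL entirely; weigh this against your local security guidelines.
        String noSslUrl = "jdbc:mysql://mysql.example.com:3306/sales_db?useSSL=false";

        // Option 2: pin a supported TLS version instead of disabling SSL.
        String pinnedTlsUrl = "jdbc:mysql://mysql.example.com:3306/sales_db?enabledTLSProtocols=TLSv1.2";

        System.out.println(noSslUrl);
        System.out.println(pinnedTlsUrl);
    }
}
```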