Matillion's MCP server
Matillion's Model Context Protocol (MCP) server offers a secure, standardized way for Large Language Models (LLMs) to interact with your Data Productivity Cloud resources. By configuring the MCP server, you can enable AI assistants to help with tasks such as monitoring pipeline executions, analyzing credit consumption, and triggering pipeline runs.
You can configure popular AI assistant clients to use the Matillion MCP server (@matillion/mcp-server). The server is distributed as an NPM package and launched using npx, which works with clients that can start local processes.
Use cases
| Use case | Scenario | MCP server solution |
|---|---|---|
| Pipeline monitoring | Data engineers need to quickly investigate pipeline failures and understand execution history. | Ask your AI assistant to investigate failed pipelines in natural language. It can search through recent executions, identify failures, drill into specific step-level errors, and explain what went wrong. You can query by time range, pipeline name, or execution status, and see comprehensive failure analysis, including step-by-step breakdowns and data lineage impact assessment. |
| Cost optimization | Finance and data teams need to understand credit consumption patterns and optimize spending. | Ask your AI assistant to analyze your credit consumption and identify optimization opportunities. See breakdowns of credit usage by pipeline and execution frequency. Your AI assistant can identify your most expensive pipelines, suggest scheduling optimizations, and correlate execution patterns with costs to help you make informed decisions about resource allocation. |
| Pipeline execution | Data engineers need to manually trigger pipeline runs with custom parameters. | Execute pipelines on demand with custom parameters through natural language requests like "run the customer_data pipeline with start_date=2025-01-01." Your AI assistant can trigger manual runs with custom variables and monitor execution progress. This gives you flexible control over pipeline execution without needing to remember complex parameter formats. |
| Streaming pipelines | Teams managing streaming pipelines need to monitor their status and control execution dynamically. | Check the status of streaming pipelines and run them through conversation. Ask your AI assistant questions like "is the Kafka to Snowflake pipeline running?" to get real-time status updates, performance metrics, and event processing statistics. You can also request actions like "stop the streaming pipeline temporarily" to dynamically control execution. |
Prerequisites
Before setting up the Matillion MCP server, ensure you meet the following prerequisites:
- Node.js is installed on your system. We recommend at least v24.
- npm is installed on your system. We recommend at least version 11, which ships with Node.js v24 and above.
- The npx command must be available in your system's global PATH. You can verify all three with the commands shown after this list.
- You have your Matillion region, client ID, and client secret.
- You have at least one AI client installed, such as Claude Desktop or the Gemini CLI.
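You can quickly confirm the tooling prerequisites from a terminal. The exact versions on your machine will differ; the point is simply that Node.js, npm, and npx are installed and resolvable on your PATH.

```shell
node --version   # expect v24.x or later
npm --version    # expect 11.x or later
npx --version    # confirms npx is available on your PATH
```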
Obtaining Matillion credentials
To find your Matillion region, client ID, and client secret:
- Log in to the Data Productivity Cloud.
- In the left navigation, click your Profile & Account icon.
  Note: Your Matillion region is shown below your account name, e.g. eu or us.
- Select API credentials from the menu.
- If you do not already have credentials, click Set an API Credential to create your Client ID and Client Secret.
  Note: If you already have a set of credentials for your account, you will need to delete the existing credentials before you can create a new set. To do this, read Delete existing API credentials.
- Give your credential set a descriptive Name. We recommend a name that reflects the application or purpose the credentials will be used for.
- Click Save to create the Client ID and Secret.
- Copy the secret immediately. You will not be able to view the secret again, and the Secret window closes automatically. If you do not copy the secret, or otherwise lose it, you will need to delete these credentials and generate a new set.
JSON configuration
All supported clients use the same JSON configuration structure to define the server. Add your credentials to the configuration below, then use that configuration for each client that you want to use.
Warning
Do not share this file publicly after you have entered your credentials.
```json
{
  "mcpServers": {
    "matillion": {
      "command": "npx",
      "args": [
        "@matillion/mcp-server"
      ],
      "env": {
        "MATILLION_CLIENT_ID": "[Your client ID]",
        "MATILLION_CLIENT_SECRET": "[Your client secret]",
        "MATILLION_REGION": "[Your region]"
      }
    }
  }
}
```
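As an optional check before configuring a client, you can launch the server manually with the same command and environment variables that this configuration defines. This is only a sketch of a troubleshooting step: substitute your real values, and expect the process to sit waiting for an MCP client on stdin (press Ctrl+C to stop it).

```shell
# Optional manual launch, mirroring the "command", "args", and "env" values above.
# Replace the placeholders with your own credentials and region.
MATILLION_CLIENT_ID="[Your client ID]" \
MATILLION_CLIENT_SECRET="[Your client secret]" \
MATILLION_REGION="[Your region]" \
npx @matillion/mcp-server
```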
Client-specific setup
After adding your credentials to the JSON configuration above, follow the instructions below to set up your chosen client.
Claude Code (Visual Studio Code)
- Open your project workspace in Visual Studio Code.
- Create or open the .vscode/settings.json file.
- Add the mcpServers block from the JSON configuration above to the .vscode/settings.json file. If the file already has content, merge the mcpServers object into the existing content (see the example after this list).
- Restart Visual Studio Code. The Matillion MCP server will now be available.
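For illustration only, a merged .vscode/settings.json might look like the sketch below. The editor.formatOnSave entry is a placeholder standing in for whatever settings your workspace already contains; only the mcpServers object comes from the configuration above.

```json
{
  "editor.formatOnSave": true,
  "mcpServers": {
    "matillion": {
      "command": "npx",
      "args": ["@matillion/mcp-server"],
      "env": {
        "MATILLION_CLIENT_ID": "[Your client ID]",
        "MATILLION_CLIENT_SECRET": "[Your client secret]",
        "MATILLION_REGION": "[Your region]"
      }
    }
  }
}
```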
Claude Desktop
Claude Desktop uses a global configuration file to launch MCP servers. Find your Claude Desktop configuration file. If it doesn't exist, create it at the following location:
- For macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- For Windows: %APPDATA%\Claude\claude_desktop_config.json
- Add the entire JSON configuration above, including the opening and closing {}, into the Claude Desktop configuration file. If the file already has content, merge the mcpServers object into the existing content.
- Quit the Claude Desktop app by right-clicking the Claude icon and then clicking Quit. Closing the window will not quit the app completely.
- Restart Claude Desktop. The Matillion MCP server will now be available.
Gemini CLI
The Gemini CLI uses a global settings.json file in your user profile. Find your Gemini CLI configuration file. If it doesn't exist, create it at the following location:
- For macOS or Linux: ~/.gemini/settings.json
- For Windows: $HOME/.gemini/settings.json (where $HOME is your user profile directory, e.g. C:\Users\YourName)
- Add the entire JSON configuration above, including the opening and closing {}, into the Gemini CLI configuration file. If the file already has content, merge the mcpServers object into the existing content.
- If the Gemini CLI is currently running in your terminal, use the /quit command to quit the CLI.
- Restart the Gemini CLI. The Matillion MCP server will now be available.
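To confirm that the Gemini CLI picked up the new configuration, you can list its MCP servers after restarting. Recent Gemini CLI versions provide a /mcp slash command for this; if your version supports it, the matillion server should appear along with its available tools.

```shell
gemini   # start the Gemini CLI in your terminal
# then, inside the CLI session:
/mcp     # lists configured MCP servers and their status
```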
Unsupported clients
The Matillion MCP server cannot currently be configured for the following clients:
- ChatGPT (Desktop and Web)
- Claude.ai (Web)
- Browser-based AI clients