February 2025 changelog
Here you'll find the February 2025 changelog for the Data Productivity Cloud. Just want to read about new features? Read our New Features blog.
28th February
Designer · Improvements 🔧
- Added support for Snowflake "Named" stages to third-party connectors (Asana, GitHub, Mailchimp, etc.).
- Added support for Snowflake storage integration stage access strategies when using AWS or Azure as the staging cloud platform with third-party connectors (Asana, GitHub, Mailchimp, etc.).
26th February
CDC · Improvements 🔧
- Updated the CDC agent version from 2.107.4 to 2.108.1.
- Library updates and security fixes.
- Fixed an issue where the agent could not compact legacy history.dat files.
25th February
Data Productivity Cloud · Improvements 🔧
- Updated account admin privileges so that account admins can enable and disable the ability for each user to create projects within an account.
20th February
Designer · Improvements 🔧
- Updated the SQL Script component for Snowflake and Databricks projects. You can now choose whether to specify which project and pipeline variables to declare as SQL variables upon component execution, or to declare all as SQL variables. By default, no project or pipeline variables are declared as SQL variables prior to an SQL Script component execution.
- Updated the OpenAI Prompt, Azure OpenAI Prompt, and Amazon Bedrock Prompt components to add an Append setting to the Create Table Options property.
18th February
Streaming · Improvements 🔧 · New features 🎉
- Updated pre-built pipelines. Pre-built pipelines are now supported on Amazon Redshift.
17th February
CDC · Improvements 🔧
- Updated the CDC agent version from 2.102.27 to 2.107.4.
- Updated the matillion.compact-history default value to true.
- Library updates and security fixes.
13th February
Agents · New features 🎉
- Hybrid SaaS agents running on AWS can now be connected via AWS PrivateLink, an AWS service that provides a secure, private connection. With AWS PrivateLink, no traffic is exposed to the public internet when it travels between the Data Productivity Cloud and your own AWS virtual private cloud.
11th February
API · New Endpoints 🎉
The following new endpoints have been added to the Data Productivity Cloud Flex connector:
- List All Schedules
- Create Schedule
- Get Schedule
- List Artifacts
- Get Artifact
- Promote Artifact
- Pipeline Executions
- List Custom Connectors
- List Flex Connectors
- List All Secret References
- Create Secret Reference
7th February
Designer · Improvements 🔧
- Workday and Workday Custom Reports now support OAuth 2.0 Authorization Code Grant authentication. For more information about creating an OAuth connection for these components, read the Workday authentication guide.
6th February
Designer · New features 🎉 · Improvements 🔧
Agents:
- Added the ability to restart a Hybrid SaaS agent from within the Data Productivity Cloud.
New orchestration components:
Database transactions for Snowflake and Amazon Redshift allow multiple database changes to be executed as a single, logical unit of work. With the introduction of database transactions in the Data Productivity Cloud, the following orchestration components have been added:
- The Begin component starts a new transaction in the database.
- The Commit component completes a transaction, making all changes since the most recent Begin component visible to other users.
- The Rollback component cancels a transaction, undoing all changes made since the most recent Begin component. These changes remain invisible to other users.
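The Begin/Commit/Rollback semantics described above follow standard database transaction behavior. As an illustrative sketch only (using an in-memory SQLite database as a stand-in for Snowflake or Amazon Redshift, which the real components target), the same pattern in plain SQL looks like this:

```python
import sqlite3

# Stand-in database for illustration; the Data Productivity Cloud components
# issue equivalent statements against Snowflake or Amazon Redshift.
conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions explicitly
cur = conn.cursor()
cur.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")

# Begin: start a new transaction.
cur.execute("BEGIN")
cur.execute("INSERT INTO accounts VALUES ('alice', 100)")
cur.execute("INSERT INTO accounts VALUES ('bob', 50)")
# Commit: make all changes since BEGIN visible to other users.
cur.execute("COMMIT")

# Rollback: cancel the transaction, undoing all changes since BEGIN.
cur.execute("BEGIN")
cur.execute("UPDATE accounts SET balance = 0")
cur.execute("ROLLBACK")

balances = dict(cur.execute("SELECT name, balance FROM accounts"))
print(balances)  # the rolled-back update leaves balances unchanged
```

The committed inserts persist, while the rolled-back update is discarded, which is exactly the "single, logical unit of work" guarantee the new components provide.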
New connector:
- Added the LinkedIn Ads Flex connector for developing data pipelines.
4th February
Designer · Improvements 🔧
Components:
- Improved the dbt Core component by adding two new properties:
- dbt Project Location: Use this property to indicate whether the dbt project is located in an external Git repository that is not connected to your Data Productivity Cloud project, or is already hosted in the Git repository that is connected to it.
- dbt Project: This property will list any dbt projects that reside in the Git repository connected to your Data Productivity Cloud project. A directory is a dbt project when it includes a dbt_project.yml file.
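The detection rule is simple: any directory containing a dbt_project.yml file counts as a dbt project. A minimal sketch of that rule (the repository layout below is hypothetical, for illustration only):

```python
from pathlib import Path
import tempfile

def find_dbt_projects(repo_root):
    """Return directories that qualify as dbt projects, i.e. any directory
    containing a dbt_project.yml file (the same rule the dbt Project
    property applies to the connected Git repository)."""
    root = Path(repo_root)
    return sorted(p.parent.relative_to(root).as_posix()
                  for p in root.rglob("dbt_project.yml"))

# Hypothetical repository layout for illustration.
with tempfile.TemporaryDirectory() as repo:
    (Path(repo) / "analytics").mkdir()
    (Path(repo) / "analytics" / "dbt_project.yml").write_text("name: analytics\n")
    (Path(repo) / "docs").mkdir()  # no dbt_project.yml, so not a dbt project
    projects = find_dbt_projects(repo)

print(projects)  # ['analytics']
```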
3rd February
API · New Endpoints 🎉
The following endpoints have been added to the Data Productivity Cloud REST API:
Schedules
Method | Endpoint | Description |
---|---|---|
GET | /v1/projects/{projectId}/schedules | List all schedules for a project |
POST | /v1/projects/{projectId}/schedules | Create a new schedule |
DELETE | /v1/projects/{projectId}/schedules/{scheduleId} | Delete the schedule by the given schedule ID |
GET | /v1/projects/{projectId}/schedules/{scheduleId} | Get a schedule summary for a given schedule ID |
PATCH | /v1/projects/{projectId}/schedules/{scheduleId} | Update the schedule by the given schedule ID and schedule request |
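As a minimal sketch of calling one of these endpoints, the snippet below builds an authenticated request for the list-schedules endpoint. The base URL, project ID, and token are placeholders, not real values:

```python
import json
import urllib.request

BASE_URL = "https://eu1.api.matillion.com/dpc"  # hypothetical region base URL
PROJECT_ID = "a1b2c3d4"                          # placeholder project ID

def build_request(method, path, token, body=None):
    """Build an authenticated request for a Data Productivity Cloud endpoint."""
    return urllib.request.Request(
        BASE_URL + path,
        method=method,
        data=None if body is None else json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

# List all schedules for the project: GET /v1/projects/{projectId}/schedules
req = build_request("GET", f"/v1/projects/{PROJECT_ID}/schedules", token="YOUR_TOKEN")
print(req.get_method(), req.full_url)
```

Sending the request (for example with urllib.request.urlopen) would return the schedule list as JSON; the other endpoints in the tables follow the same URL pattern with the method and path swapped.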
Artifacts
Method | Endpoint | Description |
---|---|---|
GET | /v1/projects/{projectId}/artifacts | Get a list of artifacts |
PATCH | /v1/projects/{projectId}/artifacts | Enable or disable an artifact |
POST | /v1/projects/{projectId}/artifacts | Create an artifact |
GET | /v1/projects/{projectId}/artifacts/details | Get an artifact by a given version name |
POST | /v1/projects/{projectId}/artifacts/promotions | Promote an artifact to a specific environment |
Connectors
Method | Endpoint | Description |
---|---|---|
GET | /v1/custom-connectors | Lists custom connector profiles for the requesting account |
GET | /v1/flex-connectors | Lists Flex connector profiles |
Secret References
Method | Endpoint | Description |
---|---|---|
GET | /v1/projects/{projectId}/secret-references | List all secret references |
POST | /v1/projects/{projectId}/secret-references/{secretReferenceName} | Create a secret reference |