SAP ODP troubleshooting
Note
- This component is available for Matillion ETL for Snowflake, Amazon Redshift, and Google BigQuery.
- This component is in public preview as of version 1.70.
This page includes troubleshooting information for known issues with the SAP ODP Extract component, and will be updated as new issues and workarounds are discovered.
Initialization of repository destination failed
If the password is incorrect, or there is some other authentication problem, you may receive errors such as:
Initialization of repository destination INERP_JCO_DESTINATION_NAME3 failed
File RODPS_REPL_TRACE.TXT could not be closed by the operating system
If you receive the error File RODPS_REPL_TRACE.TXT could not be closed by the operating system, the /usr/sap filesystem has run out of space. Each SAP instance has its own work directory. If the trace is enabled, the file RODPS_REPL_TRACE.txt is written into the work directory.
The trace is disabled by default.
You can:
- Delete the RODPS_REPL_TRACE.txt file.
- Disable the trace. Read SAP's ODP Trace Guide to learn more.
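If you have OS-level access to the SAP host, a quick check along these lines can confirm whether the filesystem is full and how large the trace file is. This is a sketch: the S4H/D00 path is an example, so substitute your own system ID and instance number.

```shell
# Example work directory for SAP system S4H, instance 00 -- adjust for
# your own landscape before running.
WORK_DIR="${WORK_DIR:-/usr/sap/S4H/D00/work}"
TRACE_FILE="$WORK_DIR/RODPS_REPL_TRACE.TXT"

# Free space on the filesystem holding the work directory
# (falls back to the root filesystem if the path does not exist).
df -h "$WORK_DIR" 2>/dev/null || df -h /

# Size of the trace file, if one is present.
if [ -f "$TRACE_FILE" ]; then
  ls -lh "$TRACE_FILE"
else
  echo "No trace file found at $TRACE_FILE"
fi
```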
How to check whether the trace is active
Check RSADMIN; there you will find whether RODPS_REPL_TRACE is active. To view the trace, call transaction AL11 and double-click the DIR_HOME line.
How to check the size of the trace file
SAP transaction AL11 shows the RODPS_REPL_TRACE.txt file in its logical location, DIR_HOME.
How to delete the trace file
To delete the trace file, navigate to the work directory of your SAP system and remove the file, for example: rm /usr/sap/S4H/D00/work/RODPS_REPL_TRACE.TXT
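As a sketch, the deletion can be guarded so it reports cleanly when no trace file is present; the path below is the same example path and should be adjusted for your system.

```shell
# Example trace file path for system S4H, instance 00 -- adjust as needed.
TRACE_FILE="${TRACE_FILE:-/usr/sap/S4H/D00/work/RODPS_REPL_TRACE.TXT}"

if [ -f "$TRACE_FILE" ]; then
  rm "$TRACE_FILE" && echo "Deleted $TRACE_FILE"
else
  echo "Nothing to delete at $TRACE_FILE"
fi
```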
How to start/stop the trace
Starting and stopping the trace can be achieved by running SAP transaction SA38 in the SAP web GUI and executing program SAP_RSADMIN_MAINTAIN.
- Set the object field to RODPS_REPL_TRACE.
- Set the value to X to enable the trace, or leave it blank to disable the trace.
- Select the Update button.
When active, the trace produces one file per SAP instance. The trace file appears to reset when the trace is started or stopped, and may grow large enough to become difficult to view.
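When the file has grown too large to view comfortably, one workaround is to inspect only its size and most recent entries from the OS shell. This is a sketch using the example S4H/D00 path; adjust for your own system.

```shell
# Example path -- substitute your system ID and instance number.
TRACE_FILE="${TRACE_FILE:-/usr/sap/S4H/D00/work/RODPS_REPL_TRACE.TXT}"

if [ -f "$TRACE_FILE" ]; then
  du -h "$TRACE_FILE"        # current size on disk
  tail -n 50 "$TRACE_FILE"   # most recent trace entries
else
  echo "Trace file not found: $TRACE_FILE"
fi
```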
Data source not visible in the list of data sources
ABAP CDS data sources should be visible in the drop-down list of the Data Source property. They don't require activation in SAP. If you don't see data sources listed, you have either typed a string that doesn't have any match, or the permissions on the SAP side don't allow the SAP user to access them.
SAPI data sources do require activation in SAP. Read Set Up and Activate DataSources for information. If a SAPI data source isn't visible in the Data Source drop-down, it could be due to one of the following:
- You have selected the wrong context.
- You have typed a string that doesn't match any data source technical name/description.
- The data source in SAP may not be active.
- The user permissions in SAP don't allow the user to access the data source.
Can't select fields to be extracted; the connector always returns all fields in the data source
The majority of SAP ABAP CDS and SAPI data sources allow selecting specific fields, but SAP still returns all fields. It isn't possible to determine from the data source metadata which data sources allow field selection. Currently, the connector requires you to select fields, but we intend to modify this behavior to improve the user experience: either defaulting to all fields as selected, removing the requirement to select them, or only showing the list of fields for user information.
Delta extraction does not return changes to orders (sales/material/purchase) made in SAP
Deltas for orders in SAP are part of the Logistic Cockpit. Deltas in logistic queues can be configured to run immediately (Direct) or to build up in a bucket (Queued).
If the data source is configured to run Direct, then as soon as you perform the change in SAP, Matillion ETL can pull the data and extract the change records.
If the data source is instead configured to run Queued, then when you perform an order change in SAP the change is sent into a bucket and parked. The data source needs to have a scheduled job that retrieves the changes from the bucket and makes them available for extraction. The job needs to be scheduled on SAP through transaction LBWE, and the Matillion ETL delta extraction job should be scheduled, taking into account the SAP job scheduling and processing.
How to connect to the Matillion ETL instance via SSH
The method for connecting to your Matillion ETL instance via SSH will differ depending on your hosting platform:
Prior to an upgrade from Amazon Linux 1 to CentOS, connecting via SSH to your Matillion ETL instance used the default username ec2-user. Following the upgrade, the default username is centos.
This should not be confused with the Matillion ETL application default username, which remains ec2-user.
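As a sketch for an EC2-hosted instance, the connection command looks like the following. The hostname and key path are placeholders, not real values.

```shell
# Placeholder values -- replace with your own key file and instance address.
KEY_FILE="${KEY_FILE:-$HOME/.ssh/matillion.pem}"
HOST="${HOST:-ec2-198-51-100-1.compute-1.amazonaws.com}"
# Use 'centos' after the CentOS upgrade, 'ec2-user' on Amazon Linux 1.
SSH_USER="${SSH_USER:-centos}"

SSH_CMD="ssh -i $KEY_FILE $SSH_USER@$HOST"
echo "$SSH_CMD"   # run this command from your terminal
```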