Modeling Guide for SAP Data Hub

SAP Data Services Job

The SAP Data Services Job operator executes an SAP Data Services job in a remote system.

Executing an SAP Data Services job lets users integrate and transform data and improve its quality.

This operator has one input port (input) and two output ports (output and error). Use this operator only in graphs that consist of data workflow operators.

Connect the input port to another data workflow operator to trigger the job's start. If an error occurs while executing the job, the engine writes a message to the error output port; on success, it writes a message to the output port. To let the graph execution proceed, connect the respective output ports to other data workflow operators.
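The wiring described above can be pictured as a small data workflow graph. The following is a minimal sketch written as a Python dictionary rather than the Modeler's actual graph definition; the trigger and terminator operator names and their port names are simplified assumptions for illustration, while input, output, and error are the ports documented here.

```python
# Simplified sketch of a data workflow around the SAP Data Services Job operator.
# This is NOT the Modeler's real graph serialization; operator and port
# identifiers other than "input", "output", and "error" are assumptions.
workflow = {
    "operators": {
        "trigger":  {"type": "Workflow Trigger"},
        "ds_job":   {"type": "SAP Data Services Job"},
        "on_ok":    {"type": "Workflow Terminator"},
        "on_error": {"type": "Workflow Terminator"},
    },
    "connections": [
        # Starting the graph triggers the Data Services job via its input port.
        ("trigger.output", "ds_job.input"),
        # Success message: the graph proceeds to the next data workflow operator.
        ("ds_job.output", "on_ok.stop"),
        # Error message: because the error port is connected, the graph keeps
        # running instead of terminating with the operator error.
        ("ds_job.error", "on_error.stop"),
    ],
}

if __name__ == "__main__":
    for src, dst in workflow["connections"]:
        print(f"{src} -> {dst}")
```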

Configuration Parameters

- SAP Data Services Connection (object): Connection ID that references a connection to a remote SAP Data Services system.
- Job (string): Name of the SAP Data Services job to execute.
- Repository (string): Name of the repository that contains the job.
- Job Server (string): Name of the job server used to execute the job.
- System Configuration (string): Name of the system configuration to use for running the job.
- Global Variables (string): Name-value pairs for global variables. The default is no variables.
- Substitution Parameters (string): Name-value pairs for substitution parameters. The default is no parameters.
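Global Variables and Substitution Parameters are entered as name-value pairs. As a rough illustration only, the values might look like the following when written out as Python dictionaries; the variable and parameter names below are hypothetical, and the exact format the Modeler expects may differ.

```python
# Hypothetical name-value pairs for the Global Variables and Substitution
# Parameters configuration fields. The names ($GV_LOAD_DATE, $$SP_TARGET_ENV)
# are made up for illustration; use the variables and parameters defined in
# your own Data Services job and repository.
global_variables = {
    "$GV_LOAD_DATE": "2021-01-31",
    "$GV_FULL_LOAD": "TRUE",
}

substitution_parameters = {
    "$$SP_TARGET_ENV": "QA",
}

# Both fields default to no variables / no parameters when left empty.
```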

Input

- input (string): Accepts a message from a connected data workflow operator.

Output

- output (string): If the operator executed successfully, this port carries the success message.
- error (string): Carries the operator error. If this port is connected, the graph keeps running; if it is not connected, the graph terminates with the operator error.

For more information, see Execute an SAP Data Hub Pipeline.