Modeling Guide

Execute an SAP Data Services Job

Use the SAP Data Services Job operator in the SAP Data Hub Modeler tool to execute an SAP Data Services job in a remote system.

Prerequisites

  • You have created a connection to an SAP Data Services system using the SAP Data Hub Connection Management application.

Context

Executing an SAP Data Services job helps you integrate, transform, and improve the quality of your data. In SAP Data Services, the unit of execution is called a job. Executing an SAP Data Services job in SAP Data Hub helps with:
  • Ingesting data into a Hadoop cluster for further processing natively in the cluster.
  • Moving data out of SAP Data Hub after processing.

Procedure

  1. Start the SAP Data Hub Modeler.
  2. In the navigation pane, select the Graphs tab.
  3. In the navigation pane toolbar, choose + (Create Graph).
    The tool opens an empty graph editor in the same window, where you can define your graph.
  4. Select the operator.
    A graph can contain a single operator or a network of operators, depending on the business requirement.
    1. In the navigation pane, choose the Operators tab.
    2. In the search bar, search for the SAP Data Services Job operator.
    3. In the search results, double-click the SAP Data Services Job operator (or drag and drop it to the graph editor) to add it as a process in the graph execution.
  5. Configure the operator.
    1. In the graph editor, double-click the SAP Data Services Job operator.
    2. In the Connection Id text field, enter the required connection ID.
      You can also browse and select the required connection.
    3. In the Description text field, provide a description for the operator.
    4. In the Job field, browse and select the required SAP Data Services job.
      In the Repository field, the tool automatically populates the repository that contains the Data Services job. The tool also displays the global variables and the substitution parameters (if any) that are associated with the job.
    5. In the Job Server dropdown list, select the required job server that the modeler must use to execute the job.
    6. In the System Configuration field, enter the required system configuration that the modeler must use to run the job.
      An illustrative sketch of these operator settings, together with the global variables and substitution parameters from the next steps, follows this procedure.
  6. (Optional) Edit global variables.
    If the selected job is associated with global variables, the tool displays them in the Global Variables section along with their data types and default values. You can edit the default value of a global variable associated with the job.
    1. In the Global Variables section, select the required global variable.
      The tool populates values in the Global Variables section depending on the version of the SAP Data Services system in which you created the selected job.
    2. In the Value text field, provide the required value.
    3. If you want to define a new global variable for the SAP Data Services job, choose + in the Global Variables section and provide the variable name and value.
  7. (Optional) Edit substitution parameters.
    The Substitution Parameters section displays the substitution parameters associated with the job, their data types, and default values. You can edit the default value of a substitution parameter or define new substitution parameters.
    1. In the Substitution Parameters section, choose + (Add) to add a new substitution parameter.
    2. Under the Name column, select (or enter) the name of the substitution parameter.
    3. In the Value text field, provide the required parameter value.
  8. Save and execute the graph.
    Use the Workflow Trigger and Workflow Terminator operators in the graph to control the start and stop of the graph execution, respectively. A minimal sketch of this graph topology follows this procedure.
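
The following sketch summarizes the operator settings collected in steps 5 through 7 as a Python dictionary. It is purely illustrative: the key names and sample values are assumptions made for this guide and do not necessarily match the operator's actual configuration property names.

  # Illustrative sketch only: key names and values are assumptions that
  # mirror the fields described in steps 5-7, not the operator's exact
  # configuration schema.
  ds_job_operator_config = {
      "connectionId": "DS_PROD",              # connection created in Connection Management (step 5.2)
      "description": "Load customer deltas",  # operator description (step 5.3)
      "job": "JOB_CUSTOMER_DELTA",            # selected SAP Data Services job (step 5.4)
      "repository": "DS_REPO_01",             # populated automatically from the selected job (step 5.4)
      "jobServer": "JS_PROD_01",              # job server used to execute the job (step 5.5)
      "systemConfiguration": "PROD",          # system configuration used for the run (step 5.6)
      # Global variables associated with the job; default values can be overridden (step 6).
      "globalVariables": [
          {"name": "$G_LOAD_DATE", "type": "date", "value": "2021-01-31"},
      ],
      # Substitution parameters and their overriding values (step 7).
      "substitutionParameters": [
          {"name": "[$$TargetDir]", "value": "/data/out"},
      ],
  }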
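
The following minimal sketch shows the graph topology described in step 8, assuming a simple processes-and-connections description; the structure is illustrative and does not reproduce the Modeler's actual graph format.

  # Illustrative topology sketch: a Workflow Trigger starts the SAP Data
  # Services Job operator, and a Workflow Terminator stops the graph when
  # the job completes. Names and structure are assumptions for this guide.
  graph_sketch = {
      "processes": {
          "trigger":    {"operator": "Workflow Trigger"},
          "ds_job":     {"operator": "SAP Data Services Job"},
          "terminator": {"operator": "Workflow Terminator"},
      },
      "connections": [
          {"from": "trigger", "to": "ds_job"},     # start the job when the graph starts
          {"from": "ds_job", "to": "terminator"},  # terminate the graph after the job finishes
      ],
  }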