You use the data transfer process (DTP) to transfer data from source objects to target objects within BW. You can also use the data transfer process to access InfoProvider data directly.
You have used transformations to define the data flow between the source and target object.
Creating a Data Transfer Process Using a Process Chain
You are in the plan view of the process chain that you want to use for the data transfer process.
Process type Data Transfer Process is offered in the Load Process and Post-Processing process category.
Drag or double-click the process to insert it into the process chain.
To create a data transfer process as a new process variant, enter a technical name and choose Create.
The dialog box for creating a data transfer process appears.
Select Standard (Can Be Scheduled) as the type of data transfer process.
You can only use the type DTP for Direct Access as the target of the data transfer process for a VirtualProvider. For more information, see Creating Data Transfer Processes for Direct Access.
If you use the data transfer process in a process chain, you can only use the standard data transfer as the target of the data transfer process for a DataStore object. For more information about data transfer processes for real-time data acquisition, see Creating Data Transfer Processes for Real-Time Data Acquisition.
Select the target and source objects.
First select the object type.
Two input helps are available when you select the source and target objects:
Input help: Existing paths: This input help offers the objects that are already defined in the data flow for the selected start object. If there is only one such object in the data flow, it is selected by default.
Input help: List of all objects: This input help enables you to select the object from the complete list of BW objects.
Object types that are supported as sources: DataSources, InfoCubes, MultiProviders, DataStore objects, InfoSets, InfoObjects, semantically partitioned objects and query elements (Using Queries as InfoProviders).
Object types supported as targets: InfoCubes, DataStore objects, InfoObjects and Open Hub Destinations.
The data transfer process maintenance screen appears.
The header data for the data transfer process shows the description, ID, version and status of the data transfer process, along with the delta status.
Define the parameters on the Extraction tab page:
Choose Extraction Mode.
You can choose Delta or Full mode.
Unlike delta transfers with an InfoPackage, explicit initialization of the delta process is not necessary for delta transfer with a DTP. When the data transfer process is executed in delta mode for the first time, all existing requests are retrieved from the source and the delta status is initialized.
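This first-run behavior can be sketched in Python. This is an illustrative model only, not an SAP API; the request store and the set of fetched request IDs are hypothetical stand-ins for the internal delta bookkeeping:

```python
def run_delta_dtp(source_requests, fetched_ids):
    """First delta run: no explicit initialization is needed. Every source
    request not yet marked as fetched is transferred, which initializes
    the delta status as a side effect (illustrative sketch)."""
    new = [r for r in source_requests if r["id"] not in fetched_ids]
    for r in new:
        fetched_ids.add(r["id"])          # delta status is now initialized
    return new

# The first execution in delta mode retrieves everything in the source:
store = [{"id": 1}, {"id": 2}, {"id": 3}]
fetched = set()
assert [r["id"] for r in run_delta_dtp(store, fetched)] == [1, 2, 3]

# A later run only picks up requests added in the meantime:
store.append({"id": 4})
assert [r["id"] for r in run_delta_dtp(store, fetched)] == [4]
```

Subsequent delta runs with no new source requests transfer nothing, mirroring the behavior described above.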
Only the Full extraction mode is available for the following sources:
DataStore Object for Direct Update
In full mode, the DTP supports MultiProviders and semantically partitioned objects as sources. In delta mode, the DTP supports MultiProviders and semantically partitioned objects as sources only if they contain InfoCubes exclusively. If the MultiProvider contains a HybridProvider, only the InfoCube contained in the HybridProvider is used for the delta. The DTP does not support delta mode for other InfoProviders that are part of a MultiProvider. A delta is also not possible if multiple InfoProviders (InfoCubes) of the MultiProvider are updated via the same InfoPackage in a 3.x data flow (and are therefore filled with the same 3.x source requests). Error handling is not supported for MultiProviders used as DTP sources.
If you have selected transfer mode Delta, you can define further parameters:
With Only Get Delta Once you define whether the source requests should be transferred only once.
Setting this flag ensures that the content of the InfoProvider is an exact representation of the source data.
A scenario of this type might be required if you always want an InfoProvider to contain the most up-to-date data set for a source, but technical reasons prevent the DataSource on which it is based from delivering a delta (new, changed, or deleted data records). For this type of DataSource, the current data set for the required selection can only be transferred using a Full Update.
In this case, a DataStore object cannot usually be used to determine the missing delta information (overwrite and creation of delta). If this is not logically possible because data is deleted in the source without delivering reverse records for example, you can set this flag and perform a snapshot scenario. Only the most up-to-date request for this DataSource is retained in the InfoProvider. Earlier requests for the DataSource are deleted from the (target) InfoProvider before a new one is requested (this is done by a process in a process chain, for example). They are not transferred again during the DTP delta process. When the system determines the delta when a new DTP request is generated, these earlier (source) requests are seen as already fetched.
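The snapshot behavior of Only Get Delta Once can be sketched as follows. This is an illustrative Python model, not SAP code; the request names are hypothetical:

```python
def snapshot_load(target_requests, new_request):
    """'Only Get Delta Once' sketch: earlier requests for the DataSource are
    deleted from the target before the new one is loaded, so the target is
    an exact representation of the most recent source data."""
    target_requests.clear()               # earlier requests are deleted first
    target_requests.append(new_request)   # only the newest request is kept
    return target_requests

# After each load, the target holds exactly one (the latest) request:
target = ["REQ_20240101"]
assert snapshot_load(target, "REQ_20240102") == ["REQ_20240102"]
```

The deletion step would typically be performed by a separate process in the process chain before the DTP runs; the sketch folds both steps together for brevity.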
Under Get All New Data Request By Request, specify how you want data to be retrieved from the source.
Since a DTP bundles all transfer-relevant requests from the source, it can generate very large requests. If you do not want to use a single DTP request to transfer the dataset from the source because the dataset is too large for example, you can set the Get All New Data Request by Request flag. This specifies that you want the DTP to read only one request from the source at a time. Once processing is completed, the DTP request checks for further new requests in the source. If it finds any, it automatically creates an additional DTP request.
You can change this flag at any time, even if data has already been transferred. If you set this flag, you can transfer data request by request as a one-off activity. If you deselect the flag, the DTP reverts to transferring all new source requests at once at periodically scheduled intervals.
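The request-by-request behavior resembles the following loop. This is an illustrative Python sketch, not the SAP scheduler:

```python
def get_all_new_data_request_by_request(source_queue, process):
    """With the flag set, the DTP reads exactly one source request per DTP
    request; after processing, it checks the source for further new
    requests and creates a follow-on DTP request if any exist (sketch)."""
    dtp_requests = 0
    while source_queue:                   # further new requests in the source?
        request = source_queue.pop(0)     # read only one request at a time
        process(request)
        dtp_requests += 1                 # each cycle is its own DTP request
    return dtp_requests

processed = []
assert get_all_new_data_request_by_request([10, 11, 12], processed.append) == 3
assert processed == [10, 11, 12]
```

Three source requests therefore produce three DTP requests instead of one large one.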
If you created the DTP before SAP NetWeaver 7.0 Support Package Stack 13, a DTP request with this flag only retrieves the first source request. This restricts the way in which the DTPs can be used because requests might accumulate in the source, and the target might not contain the current data. To avoid this, you have to execute the DTP manually until all the source requests have been retrieved.
The system therefore also displays the following flag for this kind of DTP: Retrieve Until No More New Data.
If this flag is set in addition to Get All New Data Request by Request, and the DTP has been activated, the DTP behaves as described above and creates DTP requests until all the new data has been retrieved from the source. In DTP maintenance, the Retrieve Until No More New Data flag is therefore not displayed any more after activation.
If the Retrieve Until No More New Data flag was set, and you remove the selection, the text in the indicator changes from Get All New Data Request by Request to Get One Request Only. If you activate the DTP now, only the first source request will be retrieved with the DTP request.
Under DeltaInit without Data, specify whether the first request of a delta DTP should be transferred without any data (in the same way as when simulating delta initialization for the InfoPackage).
If this flag is set, the source data is only flagged as retrieved and is not transferred to the target. The data flagged as retrieved is not read by the next request. Instead only the new source data accrued in the meantime is read. Use this setting in scenarios where the DTP is automatically implemented within a process chain to rebuild a data target.
This setting is a transport-relevant change to the metadata of the DTP. If you want to mark source data as retrieved for test purposes only, do not use this flag. Instead, on the Execute tab page, set the processing type to Mark Source Data as Retrieved. This means you avoid making transport-relevant changes to the metadata of the DTP (see below).
If a DTP source returns requests with a large number of small data packages, such as a write-optimized DataStore object that is supplied with data via real-time data acquisition, and the DTP is processed in processing mode Parallel Extraction and Processing, you can speed up request processing by optimizing the package size on the Extraction tab page. The optimum package size is used to bundle multiple source packages from the same source request into a single data package in the DTP request, thus reducing the number of packages in request processing.
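The bundling of small source packages can be sketched like this. The sketch is illustrative Python, with package sizes measured in records; it is not the SAP implementation:

```python
def bundle_packages(source_packages, optimum_size):
    """Bundle many small source packages from one source request into fewer
    DTP data packages of roughly the optimum size (records per package)."""
    bundles, current, size = [], [], 0
    for pkg in source_packages:
        current.extend(pkg)               # append the small package
        size += len(pkg)
        if size >= optimum_size:          # optimum size reached: close bundle
            bundles.append(current)
            current, size = [], 0
    if current:                           # flush the last partial bundle
        bundles.append(current)
    return bundles

# Six 2-record packages with an optimum size of 5 become two data packages:
small = [[1, 2]] * 6
out = bundle_packages(small, 5)
assert len(out) == 2
assert sum(len(b) for b in out) == 12     # no records lost in bundling
```

Fewer, larger packages reduce per-package overhead in request processing.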
If required, you can use Filter to define filter criteria for the data transfer.
This means that you can use multiple data transfer processes with disjunctive selection conditions to efficiently transfer small sets of data from a source into one or more targets, instead of transferring large volumes of data. The filter therefore restricts the amount of data to be transferred and works like the selections in the InfoPackage. You can specify single values, multiple selections, intervals, selections based on variables, or routines. Choose Change Selection to change the list of InfoObjects that can be selected.
The icon next to the pushbutton indicates that predefined selections exist for the data transfer process. The tool tip for this icon displays the selections as a character string.
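The filter behaves like the selections in an InfoPackage. The following Python sketch illustrates the idea; the field names (CALYEAR, REGION) and the interval representation are hypothetical examples, not SAP structures:

```python
def apply_filter(records, selections):
    """DTP filter sketch: keep only records whose fields fall inside the
    defined selection intervals, like InfoPackage selections."""
    def matches(rec):
        return all(lo <= rec[field] <= hi
                   for field, (lo, hi) in selections.items())
    return [r for r in records if matches(r)]

data = [{"CALYEAR": 2023, "REGION": "EMEA"},
        {"CALYEAR": 2024, "REGION": "EMEA"}]

# Two DTPs with disjunctive year intervals could split this load:
assert apply_filter(data, {"CALYEAR": (2024, 2024)}) == \
    [{"CALYEAR": 2024, "REGION": "EMEA"}]
```

Running several DTPs with disjunctive intervals (for example, one per calendar year) transfers small, non-overlapping slices of the source instead of one large volume.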
Choose Semantic Groups to specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, you define key fields. Data records that have the same key are combined in a single data package.
This setting is only relevant for DataStore objects with data fields that are overwritten. This setting also defines the error stack key fields for handling invalid data records. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
During parallel processing of time-dependent master data, the semantic key of the DTP must not contain the field of the data source that contains the DATETO information.
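The semantic grouping of records into packages can be sketched as follows (illustrative Python; the key field CUST is a hypothetical example):

```python
from itertools import groupby

def build_packages(records, key_fields):
    """Semantic-group sketch: records sharing the same key always land in
    the same data package, so corrected error-stack records can later be
    updated to the target in the correct order."""
    keyfn = lambda r: tuple(r[f] for f in key_fields)
    # groupby requires sorted input to collect all records per key:
    return [list(grp) for _, grp in
            groupby(sorted(records, key=keyfn), key=keyfn)]

recs = [{"CUST": "A", "AMT": 1},
        {"CUST": "B", "AMT": 2},
        {"CUST": "A", "AMT": 3}]
pkgs = build_packages(recs, ["CUST"])
assert len(pkgs) == 2                               # one package per key value
assert all(len({r["CUST"] for r in p}) == 1 for p in pkgs)
```

Because all records for customer A end up in one package, a correction in the error stack cannot be overtaken by a later record for the same key in another package.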
Make any further settings that depend on the source object and data type.
Define the parameters on the Update tab page:
Make the settings for error handling:
Under Error Handling, you can either activate or deactivate error handling and specify how valid records are updated in the event of an error.
Under Maximum Number of Errors per Package, specify when (after how many errors) the system terminates the loading process.
For more information about error handling settings, see Handling Data Records with Errors.
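The interaction of the two settings can be sketched like this. This is an illustrative Python model of the error-stack idea, not the SAP implementation:

```python
class MaxErrorsExceeded(Exception):
    """Raised when a package exceeds the configured error limit."""

def process_package(records, validate, max_errors):
    """Error-handling sketch: invalid records go to an error stack and
    valid records are still updated; once more than max_errors records
    in one package are invalid, the load terminates."""
    error_stack, valid = [], []
    for rec in records:
        if validate(rec):
            valid.append(rec)
        else:
            error_stack.append(rec)       # parked for later correction
            if len(error_stack) > max_errors:
                raise MaxErrorsExceeded("too many errors in package")
    return valid, error_stack

# One invalid record is tolerated with max_errors=1; the load continues:
ok, bad = process_package([1, -2, 3], lambda r: r > 0, max_errors=1)
assert ok == [1, 3] and bad == [-2]
```

With error handling deactivated, any invalid record would instead terminate the whole request.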
Make any further settings that are relevant for the target object.
Define the parameters on the Execute tab page:
On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.
Specify the status that you want the system to adopt for the request if there are warnings in the log.
Specify how you want the system to determine the overall status of the request.
Once the technical processing stage for a DTP request has been completed, the overall status of the request can be set automatically (based on the technical status of the request) or set manually (by the user). If the overall status is set manually, the status initially remains unchanged, if the technical processing stage was completed with a red or green status. In particular, this means that data for a green request is not released for reporting or further processing. The overall status has to be set manually by the user or by a process in a process chain.
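The status logic can be reduced to the following sketch (illustrative Python; the status values are simplified labels, not SAP status codes):

```python
def overall_status(technical_status, set_manually):
    """Overall request status sketch: derived from the technical status
    when set automatically; otherwise it stays unchanged ('pending') until
    a user or a process-chain step sets it, even for a green technical
    status, so the data is not yet released for reporting."""
    if set_manually:
        return "pending"                  # not released for reporting yet
    return technical_status

assert overall_status("green", set_manually=False) == "green"
assert overall_status("green", set_manually=True) == "pending"
```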
Specify whether red requests should be automatically repeated in process chains. If a DTP resulted in a canceled request during the previous run of a periodically scheduled process chain, this setting is evaluated the next time the process chain is started. If the flag is set, the previous request containing errors is automatically deleted and a new one is started. If the flag is not set, the DTP is terminated and an error message appears explaining that a new request cannot be started until the previous request is either repaired or deleted.
If a delta update is carried out, the repaired request not only contains the data of the terminated request, but also all the data that has been added to the source since then. This can lead to both performance and consistency problems. Example of a consistency problem: A transfer with the prerequisite that data can only be transferred for one day at a time. This transformation then produces incorrect results, which leads to consistency problems if, during the repair, data for more than one day is to be transported at the same time.
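The evaluation of this flag at the next chain start can be sketched as follows (illustrative Python; the status strings and messages are hypothetical):

```python
def next_chain_run(previous_status, repeat_red):
    """Sketch of 'automatically repeat red requests': evaluated when the
    periodic process chain starts again after a canceled DTP request."""
    if previous_status == "red":
        if repeat_red:
            return "previous request deleted, new request started"
        raise RuntimeError("repair or delete the previous request first")
    return "new request started"

assert next_chain_run("red", repeat_red=True) == \
    "previous request deleted, new request started"
assert next_chain_run("green", repeat_red=False) == "new request started"
```

Note the caveat above: for a delta update, the repeated request also carries all data added to the source since the canceled run, which can cause the performance and consistency problems described.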
Normally the system automatically defines the processing mode for the background processing of the data transfer process in question. You can choose from the following modes:
If you want to execute a delta without transferring any data for test purposes, choose No data transfer; delta status in source: retrieved as the processing mode. This processing mode is available when the data transfer process extracts in delta mode. In this case you execute the DTP directly in dialog. A request started in this way marks the data found in the source as fetched without actually transferring it to the target. You can still choose this mode even if delta requests have already been transferred for this data transfer process. With this processing mode, no transport-relevant changes are made to the DTP metadata.
If you want to execute the data transfer process in debugging mode, choose Serially in the Dialog Process (for Debugging) as the processing mode. In this case, you can define breakpoints in the tree structure for the process flow of the program. The request is processed synchronously in a dialog process, and the update of the data is simulated. If you select Expert Mode, you can also define selections for the simulation and activate or deactivate intermediate storage. You can set breakpoints as well. More information: Simulating and Debugging DTP Requests.
More information: Processing Types in the Data Transfer Process
Check the data transfer process, then save and activate it.
Go back to the process chain maintenance transaction.
The data transfer process is displayed in the plan view and can be linked into your process chain. When you activate and schedule the chain, the system executes the data transfer process as soon as it is triggered by an event in the predecessor process in the chain.
Creating Data Transfer Processes from the Object Tree in the Data Warehousing Workbench
The starting point when creating a data transfer process is the target where you want to transfer data to. In the Data Warehousing Workbench, an object tree is displayed, and you have selected the target object.
In the context menu, choose Create Data Transfer Process.
The dialog box for creating a data transfer process appears.
Proceed as described in steps 3 to 10 of the procedure for creating a data transfer process using a process chain. In step 4, however, specify only the source object.
You can now execute the data transfer process directly.
You can display information about the source and target objects, the transformations, and the most recent changes to the data transfer process.
You can define the DB storage parameters with DB Memory Parameters.