Creating Data Transfer Processes 
You use the data transfer process (DTP) to transfer data from source objects to target objects within BW. You can also use the data transfer process to access InfoProvider data directly.
You have used transformations between the source and target objects to define the data flow.
You are in the plan view of the process chain that you want to use for the data transfer process.
The process type Data Transfer Process is available in the Load Process and Post-Processing process category.
Drag or double-click the process to insert it into the process chain.
To create a data transfer process as a new process variant, enter a technical name and choose Create.
The dialog box for creating a data transfer process appears.
Select Standard (Can Be Scheduled) as the type of data transfer process.
Note
You can only use the type DTP for Direct Access if the target of the data transfer process is a VirtualProvider. For more information, see Creating Data Transfer Processes for Direct Access.
If you use the data transfer process in a process chain with a DataStore object as its target, you can only use the standard data transfer process. More information about data transfer processes for real-time data acquisition: Creating Data Transfer Processes for Real-Time Data Acquisition.
Select the target and source objects.
First select the object type.
Two input helps are available when you select the source and target objects:
(Input Help: Existing Paths): This input help provides a selection of the objects that are already connected to the starting object in the data flow. If only one such object exists in the data flow, it is selected by default.
(List) with the quick info Input Help: List of All Objects: This input help enables you to select the object from the complete list of BW objects.
Object types that are supported as sources: DataSources, InfoCubes, MultiProviders, DataStore objects, InfoSets, InfoObjects, semantically partitioned objects and query elements (Using Queries as InfoProviders).
Object types supported as targets: InfoCubes, DataStore objects, InfoObjects and Open Hub Destinations.
Choose Continue.
The data transfer process maintenance screen appears.
The header data for the data transfer process shows the description, ID, version, and status of the data transfer process, along with the delta status.
On the Extraction tab page, specify the parameters:
Choose Extraction Mode.
You can choose Delta or Full mode.
Note
In contrast to a delta transfer with an InfoPackage, an explicit initialization of the delta process is not necessary for the delta transfer with a DTP. When the data transfer process is executed in delta mode for the first time, all existing requests are retrieved from the source and the delta status is initialized.
Only the Full extraction mode is available for the following sources:
InfoObjects
InfoSets
DataStore objects for direct update
In full mode, the DTP supports MultiProviders and semantically partitioned objects as sources. In delta mode, the DTP supports MultiProviders and semantically partitioned objects as sources, as long as they only contain InfoCubes. If the MultiProvider contains a HybridProvider, only the InfoCube contained in the HybridProvider is used for the delta. Error handling is not supported for MultiProviders used as DTP sources.
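The delta behavior described in the note above can be pictured as simple request bookkeeping. The following Python sketch is illustrative only (not SAP code, with hypothetical integer request IDs); it shows why no explicit delta initialization is needed: the first execution simply finds every source request to be new.

```python
# Illustrative model (not SAP code): how a delta DTP initializes its
# delta status implicitly on the first execution. Request IDs are
# hypothetical integers standing in for source requests.

class DeltaDTP:
    def __init__(self):
        self.retrieved = set()  # source requests already fetched

    def execute(self, source_requests):
        # First run: nothing is marked as retrieved yet, so all existing
        # requests count as new; this initializes the delta status.
        new = [r for r in source_requests if r not in self.retrieved]
        self.retrieved.update(new)
        return new

dtp = DeltaDTP()
first = dtp.execute([1, 2, 3])      # initial run fetches all existing requests
second = dtp.execute([1, 2, 3, 4])  # later run fetches only the new request
```

The same mechanism explains why subsequent delta runs transfer only data that arrived in the source after the previous run.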
If you selected transfer mode Delta, you can define further parameters:
With Only Get Delta Once you define whether the source requests should be transferred only once.
Setting this indicator ensures that the content of the InfoProvider is an exact representation of the source data.
A scenario of this type may be required if you always want an InfoProvider to contain the most up-to-date data set for a source, but technical reasons prevent the DataSource on which it is based from delivering a delta (new, changed, or deleted data records). For this type of DataSource, the current data set for the required selection can only be transferred using a Full Update.
In this case, a DataStore object cannot usually be used to determine the missing delta information (overwrite and creation of delta). If this is not logically possible because, for example, data is deleted in the source without delivering reverse records, you can set this indicator and perform a snapshot scenario. Only the most up-to-date request for this DataSource is retained in the InfoProvider. Earlier requests for the DataSource are deleted from the (target) InfoProvider before a new one is requested (this is done by a process in a process chain, for example). They are not transferred again during the DTP delta process. When the system determines the delta when a new DTP request is generated, these earlier (source) requests are seen as already fetched.
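The snapshot behavior of Only Get Delta Once can be sketched as follows. This is an illustrative Python model under assumed names (the request labels and the snapshot_load helper are hypothetical, not SAP functionality): earlier requests are deleted from the target, only the newest request is kept, and deleted requests remain flagged as fetched so the delta never re-reads them.

```python
# Hypothetical sketch of the "Only Get Delta Once" snapshot scenario:
# only the most recent request is kept in the target, and all earlier
# requests stay flagged as already fetched for the delta determination.

def snapshot_load(target_requests, fetched, new_request):
    """Delete earlier requests from the target, load the new one,
    and record every request as fetched so it is never re-read."""
    target_requests.clear()             # drop earlier requests from the target
    target_requests.append(new_request)
    fetched.add(new_request)
    return target_requests

target, fetched = [], set()
snapshot_load(target, fetched, "REQ_MONDAY")
snapshot_load(target, fetched, "REQ_TUESDAY")
# target now holds only the latest request; both requests are marked fetched
```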
Under Get All New Data Request By Request, specify how you want data to be retrieved from the source.
Since a DTP bundles all transfer-relevant requests from the source, it can generate very large requests. If you do not want to transfer the entire dataset from the source in a single DTP request because, for example, the dataset is too large, set the Get All New Data Request by Request indicator. This specifies that the DTP reads only one request from the source at a time. At the end of processing, the DTP request checks whether there are further new requests in the source; if so, it automatically creates an additional DTP request.
Note
You can change this indicator at any time, even if data has already been transferred. If you set this indicator, you can transfer data request by request as a one-off activity. If you deselect the indicator, the DTP goes back to transferring all new source requests at once at periodically scheduled intervals.
If you created the DTP before SAP NetWeaver 7.0 Support Package Stack 13, a DTP request with this indicator only retrieves the first source request. This restricts the way in which the DTPs can be used because requests might accumulate in the source, and the target might not contain the current data. To avoid this, you must manually execute the DTP until all the source requests have been retrieved.
The system therefore also displays the following indicator for this kind of DTP: Retrieve Until No More New Data.
If this indicator is set in addition to Get All New Data Request by Request, and the DTP has been activated, the DTP behaves as described above and creates DTP requests until all the new data has been retrieved from the source. In DTP maintenance, the Retrieve Until No More New Data indicator is therefore not displayed any more after activation.
If the Retrieve Until No More New Data indicator was set, and you remove the selection, the text in the indicator changes from Get All New Data Request by Request to Get One Request Only. If you activate the DTP now, only the first source request will be retrieved with the DTP request.
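The request-by-request behavior described above can be modeled with a short loop. This Python sketch is illustrative only (the request labels are hypothetical, and the loop stands in for the DTP's internal check for further new source requests):

```python
# Illustrative loop (not SAP code): with "Get All New Data Request by
# Request" set, each DTP request reads exactly one source request and,
# as long as further new requests exist, another DTP request follows.

def transfer_request_by_request(source_queue):
    dtp_requests = []
    while source_queue:                  # are there further new source requests?
        req = source_queue.pop(0)        # read exactly one source request
        dtp_requests.append([req])       # one DTP request per source request
    return dtp_requests

runs = transfer_request_by_request(["R1", "R2", "R3"])
# three separate DTP requests, one per source request
```

Without the indicator, the equivalent model would bundle all three source requests into a single DTP request.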
Under DeltaInit without Data, specify whether the first request of a delta DTP should be transferred without any data (in the same way as when simulating delta initialization for the InfoPackage).
If this flag is set, the source data is only flagged as retrieved and is not transferred to the target. The data flagged as retrieved is not read by the next request. Instead only the new source data accrued in the meantime is read. Use this setting in scenarios where the DTP is automatically implemented within a process chain to rebuild a data target.
Note
This setting is a transport-relevant change to the metadata of the DTP. If you want to mark source data as retrieved for test purposes only, do not use this flag. Instead, on the Execute tab page, set the processing type to Mark Source Data as Retrieved. This means you avoid making transport-relevant changes to the metadata of the DTP (see below).
If a DTP source returns requests with a large number of small data packages, such as a write-optimized DataStore object that is supplied with data via real-time data acquisition, and the DTP runs in the processing mode Parallel Extraction and Processing, you can speed up request processing by optimizing the package size on the Extraction tab page. The optimum package size is used to bundle multiple source packages from the same source request into a single data package in the DTP request, thus reducing the number of packages during request processing.
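The effect of the optimum package size can be sketched as follows. This Python model is illustrative only (the bundle_packages helper and the record counts are hypothetical): many small source packages are merged into fewer data packages of roughly the configured size.

```python
# Hypothetical sketch of the optimum-package-size setting: small source
# packages from the same source request are bundled into fewer DTP data
# packages of at least the configured size.

def bundle_packages(source_packages, optimum_size):
    bundled, current = [], []
    for pkg in source_packages:
        current.extend(pkg)
        if len(current) >= optimum_size:  # bundle has reached the target size
            bundled.append(current)
            current = []
    if current:
        bundled.append(current)           # remainder forms the last package
    return bundled

small = [[1, 2], [3], [4, 5], [6]]        # four small source packages
result = bundle_packages(small, optimum_size=3)
# four source packages are reduced to two DTP data packages
```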
If necessary, use (Filter) to define filter criteria for the data transfer.
This means that you can use multiple data transfer processes with disjunctive selection conditions to efficiently transfer small sets of data from a source into one or more targets, instead of transferring large volumes of data. The filter thus restricts the amount of data to be transferred and works like the selections in the InfoPackage. You can specify single values, multiple selections, intervals, selections based on variables, or routines. Choose Change Selection to change the list of InfoObjects that can be selected.
The icon next to the (Filter) pushbutton indicates that predefined selections exist for the data transfer process. The tooltip for this icon displays the selections as a character string.
Choose (Semantic Groups) to specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, specify key fields. Data records that have the same key are combined in a single data package.
This setting is only relevant for DataStore objects with data fields that are overwritten. This setting also defines the error stack key fields for handling invalid data records. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
For more information, see Handling of Data Records with Errors and Error Stack.
Note
During parallel processing of time-dependent master data, the semantic key of the DTP must not contain the field of the data source that contains the DATETO information.
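The grouping of records into packages by semantic key can be illustrated with a small Python model (not SAP code; the field names and the build_packages helper are hypothetical). Records sharing the chosen key fields always land in the same data package, which is what allows corrected records from the error stack to be updated in the right order.

```python
# Illustrative model of Semantic Groups: records that share the chosen
# key fields are combined in the same data package. Field names here
# are hypothetical.

from itertools import groupby

def build_packages(records, key_fields):
    keyfn = lambda r: tuple(r[f] for f in key_fields)
    ordered = sorted(records, key=keyfn)  # bring equal keys together (stable)
    return [list(grp) for _, grp in groupby(ordered, key=keyfn)]

records = [
    {"customer": "C1", "amount": 10},
    {"customer": "C2", "amount": 20},
    {"customer": "C1", "amount": 30},
]
packages = build_packages(records, key_fields=["customer"])
# both C1 records end up in the same package, in their original order
```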
Define any further settings that depend on the source object and data type.
On the Update tab page, specify the parameters:
Make the settings for error handling:
Under Error Handling, you can either activate or deactivate error handling and specify how valid records are updated in the event of an error.
Under Maximum Number of Errors per Package, specify when (that is, after how many errors) the system terminates the loading process.
For more information about error handling settings, see Handling Data Records with Errors.
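The interplay of the two error-handling settings can be sketched in Python (illustrative only; the process_package helper and the validity rule are hypothetical, not SAP functionality): valid records are updated, invalid records are collected for later correction, and the load terminates once the configured error limit is exceeded.

```python
# Hypothetical sketch of error handling with Maximum Number of Errors
# per Package: valid records are updated, invalid records are collected
# (as in an error stack), and the load aborts above the error limit.

def process_package(records, is_valid, max_errors):
    updated, error_stack = [], []
    for rec in records:
        if is_valid(rec):
            updated.append(rec)
        else:
            error_stack.append(rec)        # keep the record for correction
            if len(error_stack) > max_errors:
                raise RuntimeError("load terminated: too many errors")
    return updated, error_stack

# two invalid records are tolerated here; a third would terminate the load
ok, bad = process_package([1, -2, 3, -4], lambda r: r > 0, max_errors=2)
```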
Make any further settings that are relevant for the target object, as required.
On the Execute tab page, specify the parameters:
On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.
Specify the status that you want the system to adopt for the request if there are warnings in the log.
Specify how you want the system to determine the overall status of the request.
Once the technical processing stage for a DTP request has been completed, the overall status of the request can be set automatically (based on the technical status of the request) or manually (by the user). If the overall status is to be set manually, it initially remains unchanged even if the technical processing stage completed with a red or green status. In particular, this means that the data of a green request is not released for reporting or further processing until the overall status is set by the user or by a process in a process chain.
Specify whether red requests should be automatically repeated in process chains. If a DTP resulted in a canceled request during the previous run of a periodically scheduled process chain, this setting is evaluated the next time the process chain is started. If the flag is set, the previous request containing errors is automatically deleted and a new one is started. If the flag is not set, the DTP is terminated and an error message appears explaining that a new request cannot be started until the previous request is either repaired or deleted.
Note
If a delta update is carried out, the repaired request contains not only the data of the terminated request but also all the data that has been added to the source since then. This can lead to both performance and consistency problems. Example of a consistency problem: a transformation presupposes that data is only ever transferred for one day at a time. If the repair then transfers data for more than one day in a single request, the transformation produces incorrect results, leading to inconsistencies.
Normally the system automatically defines the processing mode for the background processing of the respective data transfer process. You can choose from the following modes:
If you want to execute a delta without transferring any data for test purposes, choose No data transfer; delta status in source: retrieved as the processing mode. This processing mode is available when the data transfer process extracts in delta mode. In this case you execute the DTP directly in the dialog. A request started in this way marks the data found in the source as fetched without actually transferring it to the target. You can still choose this mode even if delta requests have already been transferred for this data transfer process. With this processing mode, no transport-relevant changes are made to the metadata of the DTP.
If you want to execute the data transfer process in debugging mode, choose Serially in the Dialog Process (for Debugging) as the processing mode. In this case, you can define breakpoints in the tree structure for the process flow of the program. The request is processed synchronously in a dialog process, and the update of the data is simulated. If you select expert mode, you can also define selections for the simulation and activate or deactivate intermediate storage, in addition to setting breakpoints. For more information see Simulating and Debugging DTP Requests.
More information: Processing Modes of Data Transfer Process
Check the data transfer process and save and activate it.
Go back to the process chain maintenance.
The data transfer process is displayed in the plan view and can be linked into your process chain. When you activate and schedule the chain, the system executes the data transfer process as soon as it is triggered by an event in the predecessor process in the chain.
The starting point when creating a data transfer process is the target into which you want to transfer the data. In the Data Warehousing Workbench, an object tree is displayed and you have selected the target object.
In the context menu, choose Create Data Transfer Process.
The dialog box for creating a data transfer process appears.
Proceed as described in steps 3 to 10 in the procedure for creating a data transfer process using a process chain. However, in step 4 you specify the source object only.
You can now execute the data transfer process directly.
You can display information about the source and target objects, the transformations, and the last changes to the data transfer process.
You can make settings for parallel processing with the data transfer process. More information: Setting Parallel Processing of BW Processes
You can define the settings for the temporary storage. More information: Handling of Data Records with Errors
You can define the DB storage parameters. More information: DB Memory Parameters