
Transferring Transaction Data from SAP Source Systems (Using Service API)

Prerequisites

The DataSource is delta-enabled and supports real-time data acquisition. In your source system, you have installed the BW Service API with the status of SAP NetWeaver 2004s (Plug-In Basis 2005.1) or higher, or, for release 4.6C, Plug-In 2004.1 SP10.

Context

If you want to use data that is available in SAP source systems for operational reporting, use the real-time data acquisition scenario.

Procedure


  1. Activate the DataSource from BI Content and replicate the DataSource in the BW system, if necessary.

    See: Installing the Business Content DataSources in Active Version

    If no BI Content DataSource is available for your case, you can create a generic DataSource and replicate it in the BW system. Set up the generic delta for the DataSource and set the real-time enabled indicator for the DataSource.

    See: Maintaining Generic DataSources
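
    The generic delta mentioned above can be pictured as a pointer on a delta-relevant field (for example, a timestamp) plus a safety interval. The following Python sketch is purely illustrative; the class and field names are assumptions, not SAP objects or API:

```python
from datetime import datetime, timedelta

class GenericDeltaSource:
    """Illustrative model of a generic delta on a timestamp field.
    Class and field names are assumptions, not SAP objects."""

    def __init__(self, safety_interval_sec=0):
        self.pointer = datetime.min              # upper limit of the last run
        self.safety = timedelta(seconds=safety_interval_sec)

    def extract_delta(self, table, now):
        """Return only the records changed since the last run."""
        upper = now - self.safety                # safety interval for late commits
        delta = [r for r in table if self.pointer <= r["changed_at"] < upper]
        self.pointer = upper                     # advance the pointer
        return delta
```

    On each run, the extractor returns only the records changed since the last confirmed pointer; the safety interval guards against records that are committed with a slight delay.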

  2. Specify an InfoProvider as the target of the real-time data transfer. This can be a HybridProvider, a DataStore object that is not being used as part of a HybridProvider, or an InfoObject (texts, attributes).

    Note

    The procedure describes the data transfer to a standard DataStore object. Write-optimized DataStore objects have no activation steps and no activation requests.

    See: Creating HybridProviders, Creating DataStore Objects, Creating InfoObjects
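
    The difference noted above between standard and write-optimized DataStore objects can be sketched as follows. This is a simplified illustrative model in Python, not the SAP implementation:

```python
class StandardDSO:
    """Illustrative model of a standard DataStore object: loaded data waits
    in an activation queue; a separate activation step moves it to the
    active table and writes before/after images to the change log."""

    def __init__(self):
        self.activation_queue = []
        self.active = {}                 # semantic key -> active record
        self.change_log = []

    def load(self, records):
        self.activation_queue.extend(records)

    def activate(self):
        for rec in self.activation_queue:
            before = self.active.get(rec["key"])
            if before is not None:
                self.change_log.append(("before", before))
            self.change_log.append(("after", rec))
            self.active[rec["key"]] = rec
        self.activation_queue.clear()


class WriteOptimizedDSO:
    """Write-optimized: loaded requests are final; no activation step."""

    def __init__(self):
        self.active = []

    def load(self, records):
        self.active.extend(records)
```

    In the standard case, loaded data only becomes visible after activation; in the write-optimized case, it is available immediately, which is why the activation-related steps below do not apply to write-optimized objects.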

  3. Create a transformation with the DataSource as the source and the InfoProvider you created previously as the target.

    See: Creating Transformations

  4. If you want to process the data further after real-time data acquisition and update additional InfoProviders, create the required InfoProviders and transformations.

  5. Create an InfoPackage for the DataSource for performing or simulating the delta initialization.

    See: Creating InfoPackages

  6. Create a data transfer process for real-time data acquisition with the DataSource as the source and the InfoProvider as the target.

    See: Creating Data Transfer Processes for Real-Time Data Acquisition

  7. If you want to connect further processing and update processes to real-time data acquisition, and you have created the relevant objects in step 4, create an appropriate process chain. In the start process of the chain, choose the scheduling option Direct Scheduling with an immediate start, and activate the chain.

  8. Perform the delta initialization.

    If you have transferred data during the delta initialization, execute the following steps. Otherwise, proceed to step 11.

  9. Update the data from the PSA into the InfoProvider.

    1. Go to the maintenance screen for the data transfer process that you created in step 6.

    2. Choose Change to Standard DTP.

    3. On the Execute tab page, start the data transfer process.

    4. For DataStore Objects and HybridProvider: When the data has been successfully loaded into the DataStore object (of the HybridProvider), activate the DTP request in the DataStore object.

    If you are loading data into a HybridProvider, the system automatically starts a process chain after the data has been activated successfully. The process chain updates the data into the InfoCube of the HybridProvider.

  10. To start the process chain that processes the data further, choose Schedule in the process chain maintenance.

  11. Change the type of the data transfer process again.

    In the maintenance screen for the data transfer process, choose Change to Real-Time DTP.

  12. Create an InfoPackage for real-time data acquisition for the DataSource.

    See Creating InfoPackages for Real-Time Data Acquisition.

  13. Switch from the InfoPackage to the monitor for real-time data acquisition.

    In InfoPackage maintenance, on the Schedule tab page, choose Assign Daemon.

    The monitor for real-time data acquisition appears (transaction RSRDA).

  14. Define a daemon (using Create Daemon) or select an existing daemon.

  15. Assign the DataSource (and the InfoPackage) to the daemon.

    The InfoPackage appears in the monitor for real-time data acquisition, in the Unassigned Nodes area, under the DataSource for which the InfoPackage was created. In the context menu of the DataSource, choose Assign Daemon to assign the InfoPackage to the daemon. The daemon can now use the InfoPackage for data extraction.

  16. Assign the data transfer process to the daemon.

    In the context menu of the DataSource, choose Assign DTP. The daemon now uses the data transfer process to process data.

    Note

    You can update the data from the DataSource into multiple DataStore objects. In this case, assign the corresponding data transfer processes to the daemon (using the DataSource).
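
    The assignments made in steps 14 to 16 can be modeled as a simple mapping: per DataSource, the daemon holds one InfoPackage for extraction and one or more data transfer processes for further processing. The following Python sketch uses illustrative names only, not SAP objects:

```python
class Daemon:
    """Illustrative model of the RDA daemon's assignments: per DataSource,
    one InfoPackage (extraction) and one or more data transfer processes
    (further processing)."""

    def __init__(self, name):
        self.name = name
        self.assignments = {}    # datasource -> {"infopackage": ..., "dtps": [...]}

    def assign_infopackage(self, datasource, infopackage):
        entry = self.assignments.setdefault(
            datasource, {"infopackage": None, "dtps": []})
        entry["infopackage"] = infopackage

    def assign_dtp(self, datasource, dtp):
        if datasource not in self.assignments:
            raise ValueError("assign an InfoPackage for this DataSource first")
        self.assignments[datasource]["dtps"].append(dtp)
```

    Requiring the InfoPackage before any DTP mirrors the order of steps 15 and 16, and the list of DTPs per DataSource reflects the note above about updating multiple DataStore objects.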

  17. Assign the process chain created earlier to the data transfer process.

    In the context menu of the DTP choose Assign Subsequent Process Chain.

    Note

    You can assign more than one subsequent process chain to a data transfer process.

  18. Change the scheduling option in the subsequent process chain so that it can be scheduled by real-time data acquisition.

    To do this, in the DTP context menu choose Maintain Subsequent Process Chain, go to the start process, and choose scheduling option Using Meta-Chain or API. Save and activate the chain, schedule it, and then return to the monitor for real-time data acquisition.

  19. In the monitor for real-time data acquisition, start the daemon.

    In the context menu of the daemon, choose Start Daemon with All InfoPackages.

    The daemon waits for a free background job.

Results

While the background job is running, the daemon starts the delta transfer using the InfoPackage for real-time data acquisition. The daemon executes three steps:

  1. In the first step, the daemon calls the service API in the source system. The service API transfers the data records to BW (writing them to the delta queue beforehand, if necessary) and passes them to the daemon. The data is updated to the PSA table.

  2. In the second step, when the data has been successfully updated to the PSA table, the daemon confirms the transfer, and the service API changes the status of the records in the delta queue.

    Once the request has been successfully closed and the next request is open, the records are deleted from the delta queue.

  3. In the third step, the daemon starts the data transfer process for real-time data acquisition, which updates the data from the PSA table into the InfoProvider. For DataStore objects (of a HybridProvider), the changes are logged in the change log request that belongs to the object. The request that transfers the data to the PSA table (PSA request), the data transfer process request, and the change log request of each DataStore object have a 1:1 relationship to one another. The data is activated automatically and written to the change log; this makes a separate activation step in the DataStore object unnecessary, and no data is written to the activation queue of the DataStore object.
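
The three steps above can be sketched end to end. The following Python model is illustrative only; the simplified queue, PSA, and provider objects stand in for the SAP components, and all names are assumptions:

```python
class DeltaQueue:
    def __init__(self, records):
        self._open = list(records)
        self.confirmed = []

    def fetch(self):
        return list(self._open)

    def confirm(self):
        # In SAP, records are deleted from the delta queue only after the
        # next request opens; here we simply mark them as confirmed.
        self.confirmed, self._open = self._open, []


class PSA:
    def __init__(self):
        self.requests = []

    def write(self, records):
        self.requests.append(records)
        return len(self.requests)            # PSA request id


class Provider:
    def __init__(self):
        self.rows = []
        self.requests = 0

    def update(self, records):
        self.rows.extend(records)
        self.requests += 1
        return self.requests                 # DTP request id


def rda_cycle(source_queue, psa, provider, on_request_closed=None):
    """One daemon cycle: extract, confirm, then update via the DTP.
    `on_request_closed` stands in for starting a subsequent process chain."""
    records = source_queue.fetch()           # step 1: extract to the PSA
    psa_request = psa.write(records)
    source_queue.confirm()                   # step 2: confirm -> status change
    dtp_request = provider.update(records)   # step 3: DTP request (1:1 to PSA)
    if on_request_closed:
        on_request_closed(dtp_request)
    return psa_request, dtp_request
```

Note the 1:1 relationship: each cycle produces exactly one PSA request and one matching DTP request, and the callback models the subsequent process chain being started when the daemon closes a request.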

The data is now available for analysis and reporting purposes.

If you have assigned a subsequent process chain to the data transfer process, the system starts this process chain whenever the daemon closes a request.