
Real-Time Data Acquisition


Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real time. You then use a daemon to transfer the data to DataStore objects in the operational DataStore layer at frequent, regular intervals. The data is stored persistently in BI.

We recommend that you use real-time data acquisition if you want to transfer data to BI at shorter intervals (every minute) than scheduled data transfers and you need up-to-date data to be regularly available in reporting (several times a day, at least).

The following overview displays the differences between standard data acquisition using scheduled data requests and real-time data acquisition:

[Figure: comparison of standard data acquisition using scheduled data requests and real-time data acquisition]


Prerequisites

      The DataSource has to support real-time data acquisition.

Web service DataSources and DataSources from SAP source systems can support real-time data acquisition. In DataSource maintenance, the Extraction tab page shows whether the Real Time property is set.

DataSources from SAP source systems can be used for real-time data acquisition if the following prerequisites are met:

       BI Content DataSources have to be delivered with the property for supporting real-time data acquisition.

       The Real-Time Enabl. indicator has to be set in the generic delta settings for generic DataSources (more information: Delta Transfer to BI).


Note that real-time data acquisition as a property of DataSources in SAP source systems is technically possible only if the BI Service API in the source system has at least version Plug-In-Basis 2005.1 (or, for 4.6C source systems, Plug-In 2004.1 SP10).

Note also that you can only use a Web service DataSource for real-time data acquisition if you have defined it in the (new) DataSource maintenance in BI.

      You use a transformation between the PSA and the DataStore object, and you update the data from the PSA into the DataStore object using a data transfer process.

Process Flow

The following figure illustrates the process flow for real-time data acquisition:

[Figure: process flow for real-time data acquisition]

Data is loaded into BI at frequent, regular intervals and is then posted to the DataStore objects that are available for operational reporting. In BI, special InfoPackages are used for this purpose, and data transfer processes for real-time data acquisition are created to further process the data from the PSA in the DataStore objects. This is scheduled and executed regularly by a dedicated background process (the daemon). Data is available for reporting as soon as it has been successfully posted to the DataStore object and activated. Refresh the query display to see the most recent data. The query shows the time of the last daemon run, even if no new data was posted.
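The daemon's role described above can be sketched as a small simulation. This is a conceptual illustration only, not SAP's implementation: the class and attribute names (`RdaDaemon`, `psa`, `dso_active`, `last_run`) are assumptions made for the example.

```python
class RdaDaemon:
    """Conceptual sketch of the RDA daemon's job: at each scheduled run
    it transfers the records waiting in the PSA into the DataStore
    object and activates them immediately. The run time is recorded
    even when no new data arrived, since queries display the time of
    the last daemon run."""

    def __init__(self):
        self.psa = []          # entry layer: records waiting for transfer
        self.dso_active = []   # activated data, visible to reporting
        self.last_run = None   # shown by queries even without new data

    def run_once(self, now):
        # Transfer and activate everything currently in the PSA.
        transferred = self.psa[:]
        self.psa.clear()
        self.dso_active.extend(transferred)
        self.last_run = now    # updated even if nothing was transferred
        return len(transferred)
```

Note that an empty run still advances `last_run`, which mirrors the behavior described above: the query timestamp moves forward with every daemon run, whether or not new data was posted.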

You can transfer data from the source to the entry layer of BI (the PSA) in two ways:


      Using a Web service

You use the Web service to write the data from the source into the PSA. The transfer of data is controlled externally, without a request from BI. Only an InfoPackage (for full upload) is required to determine specific parameters for real-time data acquisition.

      Using a service API

Data from an SAP source system can be loaded into the PSA using an InfoPackage created specifically for this purpose. The transfer is triggered when BI requests data from the delta queue in the source system. You have to simulate the initialization of the delta process for the DataSource beforehand.

The following two scenarios are possible:

       The source system application writes the data to the delta queue.

In this case, the daemon retrieves the data without calling the extractor.

       The application does not write data to the delta queue automatically; the extractor writes the data to the delta queue at the request of BI.

For extractors that transfer data synchronously to the service API on request from BI (generic extractors, for example), the daemon calls the extractor, and the extractor writes the data to the delta queue. The data is then transferred to BI directly from the delta queue.
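The two scenarios above can be modeled with a minimal sketch of a delta queue. The names (`DeltaQueue`, `push`, `fetch_for_daemon`) are hypothetical; the real service API is part of the SAP source system and is not reproduced here.

```python
class DeltaQueue:
    """Sketch of the two delta-queue scenarios: either the source
    application pushes records into the queue itself, or an extractor
    fills the queue only when BI requests data."""

    def __init__(self, extractor=None):
        self.records = []
        self.extractor = extractor  # set only for pull-style DataSources

    def push(self, record):
        # Scenario 1: the source application writes the delta itself;
        # the daemon later retrieves it without calling an extractor.
        self.records.append(record)

    def fetch_for_daemon(self):
        # Scenario 2: on the daemon's request, call the extractor first
        # so it can write its data to the queue, then hand everything
        # over to BI and empty the queue.
        if self.extractor is not None:
            self.records.extend(self.extractor())
        handed_over, self.records = self.records, []
        return handed_over
```

In the first scenario the extractor stays out of the picture; in the second, the extractor runs only inside `fetch_for_daemon`, matching the description that the daemon call triggers extraction.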


      You can only use real-time data acquisition to fill DataStore objects. A two-step data transfer is supported; data is first transferred into the PSA and then into the DataStore object. The DataStore object cannot be used as the source for a further real-time data transfer to another DataStore object.

      Master data cannot be transferred to the BI system with real-time data acquisition; otherwise, the navigation attributes of the characteristic could no longer be used in aggregates. Aggregates cannot react to real-time updates because the change run cannot be triggered automatically for data loaded in the RDA process.

      DataSources that are used for real-time data acquisition cannot be used in the delta process for standard data transfer (scheduled staging). A data transfer with RDA and a scheduled data transfer cannot be executed simultaneously in the delta process for a DataSource because there may be only one entry in the delta queue for each DataSource and target system.

In addition to a pure RDA update and a pure standard update, the following update method, in which both transfer mechanisms are used in parallel, is possible for data that was transferred to the PSA by real-time data acquisition:

The data is updated from the PSA to a DataStore object with a DTP for real-time data acquisition, and to another DataStore object with a standard DTP.

[Figure: parallel update from the PSA with an RDA data transfer process and a standard data transfer process]

      If you load data into a DataStore object with real-time data acquisition, you cannot load data into this DataStore object simultaneously with an additional DTP. This is because there can be only one open activation request in a DataStore object. Real-time data acquisition keeps an activation request open parallel to each DTP request. In a DTP request, multiple data packages can be loaded during a given time span. Each data package is activated in the DataStore object immediately after it is transferred. A further data transfer process cannot load into the same DataStore object as long as an activation request is open.
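The single-open-activation-request rule above can be illustrated with a small sketch. Again, this is not the SAP implementation; the class and method names are assumptions chosen for the example.

```python
class DataStoreObject:
    """Sketch of the activation-request rule: at most one activation
    request may be open in a DataStore object, so a second DTP cannot
    load while the RDA request is still open."""

    def __init__(self):
        self.open_request = None   # owner of the open activation request
        self.active_data = []

    def open_activation_request(self, owner):
        if self.open_request is not None:
            raise RuntimeError("an activation request is already open")
        self.open_request = owner

    def load_and_activate(self, owner, package):
        # RDA activates each data package immediately after transfer,
        # but keeps the activation request open for further packages.
        if self.open_request != owner:
            raise RuntimeError("no open activation request for this DTP")
        self.active_data.extend(package)

    def close_request(self, owner):
        if self.open_request == owner:
            self.open_request = None
```

Only after the RDA request is closed (for example, in a time window controlled by a process chain, as described below) can a standard DTP open its own activation request against the same DataStore object.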

Depending on your requirements, you can nevertheless merge the data that you load with real-time data acquisition with data from additional sources in an InfoProvider:

       The DataStore object in which you load data with real-time data acquisition can be used in a MultiProvider or InfoSet.

       Using a process chain, you can restrict the time in which you load data into the DataStore object with real-time data acquisition. You can load data into the same DataStore object with a different data transfer process during the remainder of the time.

More information:

Controlling Real-Time Data Acquisition with Process Chains    

Request Concept     

Converting an Existing Data Flow to Real-Time Data Acquisition

If you want to integrate the transfer of data with real-time data acquisition into an existing data flow, you have two options:

      Using two different DataSources

One DataSource executes the standard data transfer. The other DataSource transfers the data with real-time data acquisition. The data is then combined in a MultiProvider.

      Using a single DataSource

You have to replace the standard data transfer completely with a real-time data acquisition scenario.

More information: Real-Time Data Acquisition


