Real-time data acquisition (RDA) supports tactical decision-making and operational reporting by allowing you to send data to the delta queue or PSA table in real time. An RDA job then transfers the data into InfoProviders in the Operational DataStore layer at defined intervals. The data is stored persistently in BW.
We recommend real-time data acquisition if you want to transfer data to BW at shorter intervals (hourly, or even every minute) than scheduled data transfers allow, and you need up-to-date data to be available for analysis and reporting several times a day.
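The basic RDA mechanism described above can be pictured as a background job that drains the delta queue or PSA at a fixed interval. The following is a minimal, hypothetical Python sketch of that loop; `rda_daemon`, the queue, and the target store are illustrative stand-ins, not SAP objects:

```python
import time

def rda_daemon(delta_queue, target, interval_seconds, max_runs):
    """Illustrative RDA loop: at each interval, drain the delta queue
    and persist the records in the target store (the InfoProvider)."""
    for _ in range(max_runs):
        while delta_queue:
            record = delta_queue.pop(0)
            target.append(record)  # persist (and, conceptually, activate) in BW
        time.sleep(interval_seconds)
    return target
```

A real RDA job is configured declaratively in BW rather than coded, but the interval-driven drain of a queue into a persistent target is the core idea.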
The following overview displays the differences between standard data acquisition using scheduled data requests and real-time data acquisition:
Web Service DataSources, as well as DataSources that provide data using the BW Service API (SAPI) or the Operational Data Provisioning framework (ODP), can support real-time data acquisition. In DataSource maintenance, the Extraction tab page shows whether the Real Time property is set. DataSources that provide data using the SAPI, or using ODP with ODP context SAPI, can be used for real-time data acquisition under the following conditions.
For other ODP contexts, support for real-time data acquisition is determined at context level rather than at DataSource level: the ODP contexts HANA, BYD, and BW do not support real-time data acquisition, while the context SLT~<Queue-Alias> does (the property is displayed accordingly in DataSource maintenance).
Real-time data acquisition as a property of DataSources that provide data using the BW Service API is only possible if the following technical prerequisites are met: The BW Service API in the source system has Plug-In-Basis 2005.1 or higher. For 4.6 source systems, Plug-In 2004.1 SP10 or higher is required.
For real-time data acquisition, Web Service DataSources can only be used if they have been defined using DataSource maintenance in BW (transaction RSDS).
Data is loaded into BW at defined intervals and posted to the InfoProviders available for operational reporting.
DataStore objects (standard-optimized and write-optimized DataStore objects) as well as HybridProviders can be provided with data using real-time data acquisition. A HybridProvider consists of a DataStore object and an InfoCube. The actual real-time update takes place in the DataStore object of the HybridProvider. The InfoCube acts as an aggregate for the DataStore object. It is loaded using a standard DTP generated with the HybridProvider that updates successfully closed RDA requests. A query that is defined on a HybridProvider refers to the data from the InfoCube (history) and from the change log of the DataStore object (current data). This HybridProvider architecture enables high-performance access to data during analysis and reporting. More information: Creating HybridProviders.
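The HybridProvider query behavior described above can be sketched as a merge of two row sets: historical rows from the InfoCube and current rows from the DataStore object's change log, with the change-log row winning for the same key. This is a hypothetical illustration of the concept, not SAP query logic; the key `doc` and the function name are invented:

```python
def hybrid_query(infocube_rows, changelog_rows):
    """Combine history (InfoCube) with current data (change log of the
    DataStore object); a change-log row for the same document key
    supersedes the historical row."""
    merged = {row["doc"]: row for row in infocube_rows}
    for row in changelog_rows:
        merged[row["doc"]] = row  # current data overrides history
    return sorted(merged.values(), key=lambda r: r["doc"])
```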
DataStore objects that are part of a HybridProvider cannot be used independently of the HybridProvider. A DataStore object can therefore be either the standalone target of an RDA data model or part of a HybridProvider in an RDA data model, but not both at the same time.
To ensure that transaction data and master data are synchronized, you can use real-time data acquisition to transfer data into InfoObjects as well as into InfoProviders. Real-time transfer of attributes and texts is supported.
In the rest of this documentation, the term InfoProvider is used when the information applies equally to HybridProviders, independent DataStore objects, and InfoObjects.
Special InfoPackages and data transfer processes are used for transferring data with real-time data acquisition. There are also special background processes (RDA jobs) that control and monitor data transfer with real-time data acquisition. These jobs are referred to as "daemons" for Web Service and BW Service, and "data package jobs" for ODP. Data is available for analysis and reporting as soon as it has been posted to the master data tables or DataStore object (of a HybridProvider) and activated. Refresh the query to display the current data. The query shows the time of the last daemon run, even if no new data was posted.
The RDA job that controls and monitors data transfer for Web Service and BW Service API is referred to as "daemon". In this case, the data is transferred to the persistent staging area (PSA) in BW. This requires an InfoPackage. The data is transferred using a data transfer process from the PSA to the InfoProvider. The data transfer procedure from the source to the PSA can differ as follows:
The data in the source can be written to the PSA directly using the Web Service. The data transfer is therefore controlled externally without being requested in BW. An InfoPackage is only required here in order to define certain parameters for real-time data acquisition.
Data from an SAP source system can be loaded into the PSA using an InfoPackage created specifically for this purpose. This is triggered when the delta queue in the source system requests data. You have to perform the initialization of the delta process for the DataSource beforehand.
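The precondition mentioned above, that the delta process must be initialized before delta requests can succeed, can be illustrated with a small hypothetical sketch (the function and its parameters are invented for illustration, not SAP APIs):

```python
def request_delta(delta_queue, delta_initialized):
    """Illustrative delta request: it only succeeds if the delta process
    has been initialized for the DataSource beforehand."""
    if not delta_initialized:
        raise RuntimeError("initialize the delta process for the DataSource first")
    delta = list(delta_queue)
    delta_queue.clear()  # requested records leave the delta queue
    return delta
```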
The following two scenarios are possible:
For extractors that write data to the delta queue themselves (push), the daemon retrieves the data from the delta queue without calling the extractor.
For extractors that transfer data synchronously from BW to the service API on request (generic extractors, for example), the daemon calls the extractor, which then writes the data to the delta queue. The data is transferred directly to BW from the delta queue.
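The two scenarios can be sketched as one daemon routine with an optional extractor call. This is a conceptual Python illustration under stated assumptions (the daemon, queue, and extractor are plain stand-ins, not SAP interfaces):

```python
def daemon_run(delta_queue, extractor=None):
    """Two illustrative daemon scenarios:
    - push: the extractor has already written to the delta queue,
      so the daemon only drains the queue (extractor=None);
    - pull: a synchronous extractor is called first, writes its data
      to the delta queue, and the queue is then drained."""
    if extractor is not None:
        delta_queue.extend(extractor())  # pull: daemon calls the extractor
    transferred = list(delta_queue)      # transfer queue contents to BW
    delta_queue.clear()
    return transferred
```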
In addition to a pure RDA update and a pure standard update, the following update methods, in which both transfer mechanisms are used in parallel, are possible for data that was transferred to the PSA by real-time data acquisition:
The data is updated from the PSA to an InfoProvider with a DTP for real-time data acquisition, and to another InfoProvider with a standard DTP. The graphic below illustrates this process:
Real-time data acquisition keeps an activation request open parallel to each DTP request. In a DTP request for RDA, multiple data packages can be loaded during a given time period. Each data package is activated in the DataStore object immediately after it is transferred. A further data transfer process cannot load into the same standard DataStore object as long as an activation request is open. When you model your data flow, you should therefore not use real-time data acquisition to write to a standard DataStore object into which you need to load data simultaneously with another data transfer process.
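The locking behavior described above, that a standard DataStore object rejects other loads while an RDA activation request is open, can be modeled with a toy class. This is a hypothetical sketch of the rule, not SAP's implementation:

```python
class StandardDSO:
    """Toy model: while an RDA activation request is open, no other
    data transfer process may load into the same standard DataStore object."""
    def __init__(self):
        self.active_request = None
        self.data = []

    def open_rda_request(self):
        self.active_request = "RDA"

    def close_rda_request(self):
        self.active_request = None

    def load_with_standard_dtp(self, rows):
        if self.active_request is not None:
            raise RuntimeError("activation request open: standard DTP load rejected")
        self.data.extend(rows)
```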
Depending on your requirements, you can still merge the data that you load with real-time data acquisition with data from additional sources in an InfoProvider.
More information: Write-Optimized DataStore Objects
If you want to integrate the transfer of data with real-time data acquisition into an existing data flow, you have two options:
Use two DataSources: one DataSource executes the standard data transfer, and the other transfers the data with real-time data acquisition. The data is then combined in a CompositeProvider.
Use a single DataSource: in this case, you have to replace the standard data transfer completely with a real-time data acquisition scenario.
More information: Real-Time Data Acquisition