
Transferring Data Using Web Services

Purpose

Data is generally transferred into BI by means of a data request, which is sent from BI to the source system (pull from the scheduler). You can also use Web services if you want the data transfer to be controlled from outside the BI system and sent into the inbound layer of BI, the Persistent Staging Area (PSA). This is a data push into the BI system.

If you are using Web services to transfer data into BI, you can use real-time data acquisition to update the data into BI. Alternatively, you can update data using a standard data transfer process:

      If you access the data frequently and want it to be refreshed at intervals ranging from once an hour to once a minute, use real-time data acquisition. The data is first written to the PSA of the BI system. From there, a background process (daemon), which runs at short, regular intervals, updates the data to a DataStore object, where it is immediately available for operational reporting.

      If you do not need to refresh the data in BI on an hourly basis to meet your analysis and reporting requirements, use the standard update. Again, the data is first written to the PSA of the BI system. Process chains control the update and further processing of data.

In SAP NetWeaver 7.0, you generate Web services for data loading when you activate a DataSource defined in the BI system. The Web services provide you with WSDL descriptions, which can be used to send data to BI regardless of the technology used.
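For illustration only, the following sketch shows how a non-SAP client could consume such a generated WSDL description using the Python zeep library. The WSDL URL is a made-up placeholder; the actual URL, operations, and message types depend on the DataSource that was activated.

    # Sketch: consuming the generated WSDL from a non-SAP client (Python, zeep).
    from zeep import Client

    # The URL below is a placeholder; the real WSDL comes from the Web service
    # definition that the system generates when the DataSource is activated.
    WSDL_URL = "http://bi-host:8000/sap/bc/srt/rfc/sap/z_example_datasource?wsdl"

    # Building the client downloads the WSDL and generates a local proxy for the
    # operations and message types it defines, regardless of the technology on
    # the BI side.
    client = Client(WSDL_URL)

    # zeep's command-line dump is a convenient way to inspect the generated
    # operation and its message structure before writing the actual data push:
    #     python -m zeep "http://bi-host:8000/sap/bc/srt/rfc/sap/z_example_datasource?wsdl"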

Background documentation

The BI server SOAP interface can ensure guaranteed delivery, since an XML message is returned to the client upon success as well as failure. If the client receives an error or no message at all (due to connection termination when sending a success message, for example), the client can resend the data.

It is not currently possible to guarantee exactly-once delivery, since there is no reconciliation at transaction-ID level that could determine whether a data package was inadvertently resent and should not be updated. If deltas are built using after-images (delta process AIM), however, the update to a DataStore object can deal consistently with data that is sent more than once, as long as serialization is guaranteed. Serialization is the task of the client.
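The following Python sketch illustrates this client-side behavior under stated assumptions: send_package is a stand-in for whatever SOAP call your client proxy exposes, packages are sent strictly one after the other (serialization), and a package is resent whenever an error message or no message at all is received.

    import time

    def send_serialized_with_resend(send_package, packages, max_attempts=5, wait_seconds=30):
        # Send data packages to the BI SOAP interface one at a time; serialization
        # is the client's task. send_package is a stand-in callable that performs
        # the SOAP call for one package and raises an exception if an error
        # message, or no message at all, is received.
        for package in packages:
            for attempt in range(1, max_attempts + 1):
                try:
                    send_package(package)      # success message received
                    break                      # move on to the next package
                except Exception:
                    if attempt == max_attempts:
                        raise                  # give up after repeated failures
                    time.sleep(wait_seconds)   # error or no message: resend the same package
        # Delivery is therefore at least once: a package can arrive twice if only
        # the success message was lost. With after-image deltas (delta process AIM),
        # the DataStore object update stays consistent despite such duplicates.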

Prerequisites

You are familiar with Web service standards and technology.

See also:

Web Services

Process Flow

Design Time


1. You define the Web service DataSource in BI. When you activate the DataSource, the system generates an RFC-enabled function module for the data transfer, along with a Web service definition, which you can use to generate a client proxy in an external system. For example, you can implement the Web service in ABAP in SAP systems.

2. Depending on how you want to update data to BI, proceed as follows:

   You specify the dataflow for real-time data acquisition:

      i. After you have defined a DataStore object, you create a transformation with the DataSource as the source and the DataStore object as the target; you also create a corresponding data transfer process for real-time data acquisition.

Note

You have to use a standard data transfer process to further update data to subsequent InfoProviders.

      ii. In an InfoPackage for real-time data acquisition, you specify the threshold values for the size of the data packages and requests; this information is required to process the sent data.

      iii. In the monitor for real-time data acquisition, you define a background process (daemon) and assign the DataSource (with InfoPackage) and data transfer process to it.

For more information, see Transferring Transaction Data Using Web Services (RDA).

   You specify the dataflow for the standard update:


      i. After you have defined an InfoProvider, you create a transformation with the DataSource as the source and the InfoProvider as the target; you also create a corresponding (standard) data transfer process. Specify any subsequent InfoProviders, transformations, and data transfer processes, as required.

      ii. In an InfoPackage, you specify the threshold values for the size of the data packages and requests; this information is required to process the sent data.

Caution

Since threshold values can only be specified in an InfoPackage for real-time data acquisition, this type of InfoPackage is also used with the standard update.

As with real-time data acquisition, the PSA request remains open across several load processes. The system closes the PSA request automatically when one of the threshold values defined in the InfoPackage is reached. If you want to update data using a standard data transfer process, however, it must also be possible to close the PSA request without waiting for the threshold values to be reached. In a process chain, this is controlled by the process type Close Real-Time InfoPackage Request.

      iii. You create a process chain to control data processing in BI.

This process chain starts with the process Close Real-Time InfoPackage Request; it also contains the update processes and the processes for further processing.

For more information, see Transferring Transaction Data Using Web Services (Standard).

Runtime

You use the Web service to send data to the PSA of the BI system.

Background documentation

A WSDL description of the Web service, along with a test function to call the Web service, is available in administration for the SOAP runtime (transaction WSADMIN).
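As an illustration of such a call, the sketch below pushes a small data package with the Python zeep library. The WSDL URL, the operation name SomeGeneratedOperation, the DATA parameter, and the record fields are hypothetical placeholders that must be replaced with the names from the WSDL generated for your DataSource.

    from zeep import Client

    # Placeholder WSDL URL of the generated Web service (see transaction WSADMIN).
    client = Client("http://bi-host:8000/sap/bc/srt/rfc/sap/z_example_datasource?wsdl")

    # The records must match the field list of the activated DataSource;
    # the field names used here are purely illustrative.
    records = [
        {"DOC_NUMBER": "4711", "CALDAY": "20240101", "AMOUNT": "100.00"},
        {"DOC_NUMBER": "4712", "CALDAY": "20240102", "AMOUNT": "250.00"},
    ]

    # "SomeGeneratedOperation" and the "DATA" parameter stand in for the operation
    # and table parameter defined in the generated WSDL.
    response = client.service.SomeGeneratedOperation(DATA=records)

    # The BI server returns an XML message for success as well as failure; inspect
    # it before treating the package as delivered (see the resend sketch above).
    print(response)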

If you are using real-time data acquisition and the daemon is running, the daemon controls the regular update of data from the PSA to the DataStore object. The data is activated automatically and is available immediately for analysis and reporting.

If you are using the standard update and the process chain is running, the process chain controls when the PSA request is closed and triggers the processes for update and further processing.
