DataStore Object

Use

With Write Data to DataStore Object as the data target, you can save the calculation results of an analysis process in a DataStore object for direct update. With every run of the analysis process, all of the data in the DataStore object is first deleted, then the result is calculated and written to the DataStore object again. Each run of an analysis process thus corresponds to a full update, with any existing data being deleted beforehand.
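
The following minimal sketch illustrates this full-update behavior, using an in-memory Python list as a stand-in for the DataStore object for direct update. The names dso_table and run_analysis_process are made up for this illustration and are not APD or BW APIs.

```python
# Minimal sketch of the full-update behavior, assuming an in-memory list as a
# stand-in for the DataStore object for direct update (names are illustrative).

dso_table = [
    {"BPARTNER": "0000004711", "CLASS": "GOLD"},  # data from a previous run
]

def run_analysis_process(calculate_result):
    """Each run deletes all existing data, then writes the new result."""
    dso_table.clear()                         # full deletion first
    dso_table.extend(calculate_result())      # then the fresh result is inserted

run_analysis_process(lambda: [
    {"BPARTNER": "0000004711", "CLASS": "SILVER"},
    {"BPARTNER": "0000004712", "CLASS": "GOLD"},
])
print(dso_table)  # only the result of the latest run remains
```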

If you want to compare multiple calculation results with one another, you can use a DataStore object for direct update as a temporary buffer for the calculation result of the analysis process. Execute the analysis process. Check the data in the DataStore object as needed. If the calculation result is OK, use an update rule to write the data to an InfoCube or a normal DataStore object.

Features

On the Target Area tab page, you can partition the DataStore object into several subareas. The analysis process then only deletes and writes data into one partition at a time. You use the target area when you want to manually parallelize the processing of an analysis process.

Example

An ABC classification of customers is to be carried out for each business area, and each business area gets its own analysis process. Each analysis process then reads only the data of one business area and writes the calculation result for exactly this business area. In this case, select the business area characteristic and specify a characteristic value to restrict the partition. The analysis process then writes data to this one partition only, while other analysis processes can write to other partitions. Since you create an analysis process for each business area, processing can be started in parallel.
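
The sketch below illustrates, in the same hypothetical Python style, how each analysis process deletes and rewrites only its own partition; the names dso_table, classify_customers, and run_for_business_area are made up for this illustration and are not part of the BW system.

```python
# Hypothetical sketch of partitioned processing: each analysis process deletes
# and rewrites only the rows of "its" business area partition.

dso_table = [
    {"BUS_AREA": "1000", "BPARTNER": "0000004711", "CLASS": "GOLD"},
    {"BUS_AREA": "2000", "BPARTNER": "0000004712", "CLASS": "SILVER"},
]

def classify_customers(bus_area):
    """Stand-in for the ABC classification of one business area."""
    return [{"BUS_AREA": bus_area, "BPARTNER": "0000004713", "CLASS": "BRONZE"}]

def run_for_business_area(bus_area):
    """Delete and rewrite only the partition restricted to bus_area."""
    dso_table[:] = [r for r in dso_table if r["BUS_AREA"] != bus_area]
    dso_table.extend(classify_customers(bus_area))

# Each call corresponds to one analysis process; because the partitions are
# disjoint, the real analysis processes could be started in parallel.
for area in ("1000", "2000"):
    run_for_business_area(area)
```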

Updating in Detail:

  • When writing the data, records are always only inserted (INSERT). If the inbound table contains several records with the same key, the system terminates the analysis process. If necessary, use an aggregation transformation before the data target to aggregate the data to the key fields of the DataStore object (a sketch after this list illustrates this).

  • To enable you to process mass data, the data from the analysis process is processed internally in technical packages. When the data is written to the target area, the system saves the data of each package in the database.

    Caution

    If the analysis process terminates during execution, it is not clear which data has already been written up to that point. The data in the target area may have been deleted completely, or only part of it may have been written. In this case, eliminate the cause of the error and restart the execution.

  • When writing to a DataStore object for direct update, a lock is currently not set. This means that several analysis processes can write to different partitions of the same DataStore object at the same time. Make sure that two analysis processes do not write to the same partition of the same DataStore object.

  • Currently, there is no validity check on the field values before they are written to the target area. This means that the data is expected to have the following format:

    • For characteristics with master data tables, only valid characteristic values may be transferred.

    • The data must be in internal format: NUMC fields contain leading zeros, date fields are filled in all digits, and fields with conversion routines have already been converted (a sketch after this list illustrates this normalization).
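
The sketch below illustrates the insert-only, package-wise write with a duplicate-key check, together with an aggregation to the key fields of the DataStore object. The key fields CALMONTH and BPARTNER, the package size, and all function names are assumptions made for this illustration; this is not the APD implementation.

```python
# Sketch of the insert-only update with a duplicate-key check and an optional
# aggregation step, assuming a DataStore object keyed by (CALMONTH, BPARTNER).
from collections import defaultdict
from itertools import islice

KEY_FIELDS = ("CALMONTH", "BPARTNER")

def aggregate_to_key(records):
    """Emulates an aggregation transformation: one record per key combination."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[tuple(rec[f] for f in KEY_FIELDS)].append(rec)
    # here the last record per key is kept; a real aggregation would sum, count, etc.
    return [recs[-1] for recs in grouped.values()]

def write_in_packages(records, package_size=50_000):
    """Insert-only write, processed package by package; duplicate keys abort."""
    written, seen_keys = [], set()
    it = iter(records)
    while package := list(islice(it, package_size)):
        for rec in package:
            key = tuple(rec[f] for f in KEY_FIELDS)
            if key in seen_keys:  # a second record for the same key terminates the run
                raise RuntimeError(f"duplicate key {key}: analysis process terminated")
            seen_keys.add(key)
            written.append(rec)   # INSERT only, existing records are never updated
        # in the real system, each package is saved to the database at this point
    return written

rows = [{"CALMONTH": "200306", "BPARTNER": f"{i:010d}", "CLASS": "GOLD"} for i in range(5)]
print(len(write_in_packages(aggregate_to_key(rows), package_size=2)))  # 5
```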
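
A further sketch illustrates the internal format that the incoming data is expected to have. The field lengths, field names, and the master data values used for the check are assumptions for this illustration; since the real system performs no such check, this preparation has to happen upstream in the analysis process.

```python
# Hedged sketch of preparing values in internal format before the write:
# NUMC fields padded with leading zeros, dates as fully filled YYYYMMDD strings,
# and characteristic values checked against a (hypothetical) master data table.
from datetime import date

MASTER_DATA_BPARTNER = {"0000004711", "0000004712", "0000004713"}

def to_numc(value, length=10):
    """NUMC internal format: digits only, padded with leading zeros."""
    return str(int(value)).zfill(length)

def to_internal_date(day: date) -> str:
    """Internal date format: every digit filled, for example 20030615."""
    return day.strftime("%Y%m%d")

record = {
    "BPARTNER": to_numc("4711"),                    # "0000004711"
    "CALDAY": to_internal_date(date(2003, 6, 15)),  # "20030615"
}

# only valid characteristic values may be transferred for characteristics
# with master data tables
assert record["BPARTNER"] in MASTER_DATA_BPARTNER, "invalid characteristic value"
```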

The following restrictions apply:

  • From a technical viewpoint, the same DataStore object can be used as a data source and as a data target in one analysis process simultaneously. However, because the data in the target area is always deleted first, and then the data is read, this approach cannot be used to change a data field in a DataStore object.

  • Only key fields of the DataStore object can be used to define the target area.

  • Fields that are already used in the settings for the target area to restrict it are no longer offered in the field assignment for this data target.

  • You enter a value on the Target Area tab page in a similar way to entering a constant in the field assignment on the inbound data flow arrow. The difference lies in which data is deleted and written again: only the characteristics specified on the Target Area tab page restrict the target area, so only this partition is deleted. If the characteristic value is defined only as a constant in the field assignment, all data in the DataStore object is deleted (see the sketch below).
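
The difference in deletion scope described in the last point can be summarized in a short, hypothetical sketch; dso_table and the field name CALMONTH are illustrative only.

```python
# Sketch of the two deletion scopes (illustrative names, not APD APIs).

def write_with_target_area(dso_table, new_rows, calmonth):
    """Value entered on the Target Area tab page: only that partition is deleted."""
    dso_table[:] = [r for r in dso_table if r["CALMONTH"] != calmonth]
    dso_table.extend(new_rows)

def write_with_constant(dso_table, new_rows):
    """Constant defined only in the field assignment: all data is deleted first."""
    dso_table.clear()
    dso_table.extend(new_rows)
```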

Activities
  1. On the Data Target tab page, select the DataStore object for direct update that you want to fill.

  2. On the Target Area tab page, enter values to restrict the partition as required.

  3. Explicitly define the field assignment for this data target by double-clicking the inbound data flow arrow.

Example

You have modeled an analysis process that implements a customer evaluation. The result of the evaluation is expressed as the attribute Customer Classification, which takes the values Gold, Silver, and Bronze. The analysis process delivers a table with two columns, Business Partner and Customer Classification:

Business Partner    Customer Classification
4711                Gold
4712                Silver
4713                Bronze

To track changes to customer classifications, you should run this evaluation once a month and save the result for each month.

To do this, create a DataStore object for direct update with the key fields Calendar Month and Business Partner. In the data part, include the field Customer Classification. You use this field to store the customer classification of the business partner every month.

In this example, the target area is all the data for one month. Enter a value for the month on the Target Area tab page, for example, June 2003. In the field assignment on an inbound data flow arrow, you can now only assign the fields Business Partner and Customer Classification.

Execute the analysis process. The result of the analysis then appears in the DataStore object under the calendar month June 2003. If you want to perform the analysis again in July, copy the analysis process and change the value on the Target Area tab page from June 2003 to July 2003.
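
The monthly scenario can be summarized in a short, hypothetical sketch. The key fields CALMONTH and BPARTNER, the data field CLASS, and the values are assumptions for this illustration; the calendar month is given in internal format (200306 for June 2003).

```python
# Worked sketch of the monthly scenario with key fields CALMONTH and BPARTNER
# and the data field CLASS (illustrative names and values).

dso_table = []

def write_month(calmonth, classification):
    """One analysis process run, restricted to a single calendar month."""
    dso_table[:] = [r for r in dso_table if r["CALMONTH"] != calmonth]
    dso_table.extend(
        {"CALMONTH": calmonth, "BPARTNER": bp, "CLASS": cls}
        for bp, cls in classification
    )

# June 2003 run (internal format 200306)
write_month("200306", [("4711", "Gold"), ("4712", "Silver"), ("4713", "Bronze")])
# July 2003 run: a copy of the analysis process with the target area changed
write_month("200307", [("4711", "Silver"), ("4712", "Gold"), ("4713", "Bronze")])

# both monthly snapshots are kept, so classification changes can be tracked
assert len(dso_table) == 6
```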