Modeling

 

Modeling data in the BW system principally involves data staging and modeling the layers of the data warehouse.

The concept of the layered, scalable architecture (LSA) assists you in designing and implementing the various layers in the BW system for data acquisition, Corporate Memory, data distribution, and data analysis. Here we differentiate between two main layers: the Enterprise Data Warehouse layer and the Architected Data Mart layer.

The following graphic illustrates the structure of the different layers:

[Figure: Structure of the layers of the layered, scalable architecture]

More information:

Enterprise Data Warehouse Layer

Architected Data Mart Layer
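
The following sketch is purely conceptual: it models, in Python, the four purposes named above and groups them under the two main layers. The way the purposes are assigned to the main layers here, like every name in the code, is an assumption made for illustration and does not correspond to any SAP object or API.

```python
# Conceptual sketch only: models the LSA grouping described above.
# The assignment of purposes to main layers is an assumption for illustration;
# nothing here corresponds to an actual SAP object or API.
from enum import Enum


class LayerPurpose(Enum):
    DATA_ACQUISITION = "data acquisition"
    CORPORATE_MEMORY = "Corporate Memory"
    DATA_DISTRIBUTION = "data distribution"
    DATA_ANALYSIS = "data analysis"


# The two main layers, each covering a subset of the purposes (assumed split).
MAIN_LAYERS = {
    "Enterprise Data Warehouse layer": {
        LayerPurpose.DATA_ACQUISITION,
        LayerPurpose.CORPORATE_MEMORY,
        LayerPurpose.DATA_DISTRIBUTION,
    },
    "Architected Data Mart layer": {
        LayerPurpose.DATA_ANALYSIS,
    },
}


def main_layer_for(purpose: LayerPurpose) -> str:
    """Return the main layer that covers a given purpose."""
    for layer, purposes in MAIN_LAYERS.items():
        if purpose in purposes:
            return layer
    raise ValueError(f"No main layer covers {purpose}")


print(main_layer_for(LayerPurpose.CORPORATE_MEMORY))  # Enterprise Data Warehouse layer
```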

In the following, the different BW objects for integrating, transforming, consolidating, cleansing, and storing data are considered independently of these layers.

The following figure outlines how BW objects are integrated in the dataflow:

[Figure: Integration of BW objects in the data flow]

The tool you use for modeling is the Data Warehousing Workbench.

The complexity of data flows varies. As a minimum, you need a DataSource, a transformation, and an InfoProvider. DataSources are used to extract data from a source system and transfer it into the BW system. The transformation allows you to consolidate and cleanse data from multiple sources and to synchronize data from heterogeneous sources semantically. You integrate the data into the BW system by assigning fields of the DataSource to InfoObjects. InfoProviders consist of multiple InfoObjects; they are either data stores or views of the data. The InfoProvider supplies the data that is evaluated in queries.
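
To make this minimal flow concrete, the following Python sketch models the three objects and the way data passes between them: the DataSource extracts records, the transformation assigns fields to InfoObjects and cleanses the values, and the InfoProvider stores the result and answers a query. All class, field, and rule names are chosen for this illustration only; the sketch does not use or represent any SAP API.

```python
# Purely illustrative sketch of the smallest possible data flow:
# DataSource -> transformation -> InfoProvider. Not an SAP API.
from typing import Callable, Dict, List, Optional

Record = Dict[str, object]


class DataSource:
    """Extracts records from a source system (here simply an in-memory list)."""
    def __init__(self, source_records: List[Record]) -> None:
        self._source_records = source_records

    def extract(self) -> List[Record]:
        return list(self._source_records)


class Transformation:
    """Assigns DataSource fields to InfoObjects and applies cleansing rules."""
    def __init__(self, field_to_infoobject: Dict[str, str],
                 rules: Optional[Dict[str, Callable[[object], object]]] = None) -> None:
        self._mapping = field_to_infoobject
        self._rules = rules or {}

    def run(self, records: List[Record]) -> List[Record]:
        transformed = []
        for record in records:
            row = {}
            for field, infoobject in self._mapping.items():
                value = record.get(field)
                rule = self._rules.get(infoobject)
                row[infoobject] = rule(value) if rule else value
            transformed.append(row)
        return transformed


class InfoProvider:
    """Stores the transformed data and answers a simple aggregation query."""
    def __init__(self) -> None:
        self._data: List[Record] = []

    def load(self, records: List[Record]) -> None:
        self._data.extend(records)

    def query(self, key_figure: str, by: str) -> Dict[object, float]:
        totals: Dict[object, float] = {}
        for record in self._data:
            totals[record[by]] = totals.get(record[by], 0.0) + float(record[key_figure])
        return totals


# Wiring the minimal flow together: extract -> transform -> load -> query.
datasource = DataSource([{"MATNR": "m-01", "REVENUE": "100.0"},
                         {"MATNR": "M-01", "REVENUE": "50.5"}])
transformation = Transformation(
    field_to_infoobject={"MATNR": "0MATERIAL", "REVENUE": "0AMOUNT"},
    rules={"0MATERIAL": lambda value: str(value).upper(),  # simple cleansing rule
           "0AMOUNT": float})
infoprovider = InfoProvider()
infoprovider.load(transformation.run(datasource.extract()))
print(infoprovider.query(key_figure="0AMOUNT", by="0MATERIAL"))  # {'M-01': 150.5}
```

The point of the sketch is the division of responsibilities: cleansing and the field-to-InfoObject assignment happen in the transformation, so the InfoProvider only ever sees harmonized InfoObject values that queries can evaluate directly.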

If required, you can then update the data to further InfoProviders. To execute multiple transformations one after the other, you can use an intermediate InfoSource. You can also extract data from InfoProviders to downstream systems using an open hub destination.
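
The sequencing can again be illustrated with a small, purely hypothetical Python sketch: an intermediate result takes on the role of the InfoSource so that two transformations run one after the other, and the final result is handed to a downstream target in the spirit of an open hub destination. The function names, field names, and the CSV target are assumptions made for this example.

```python
# Illustrative sketch of chaining two transformations via an intermediate
# staging step and handing the result to a downstream target. Not an SAP API.
import csv
from typing import Dict, List

Record = Dict[str, object]


def first_transformation(records: List[Record]) -> List[Record]:
    """Step 1: harmonize the unit of measure (grams -> kilograms)."""
    return [{**record, "QUANTITY_KG": float(record["QUANTITY_G"]) / 1000.0}
            for record in records]


def second_transformation(records: List[Record]) -> List[Record]:
    """Step 2: derive a value key figure from the harmonized quantity."""
    return [{**record, "VALUE": record["QUANTITY_KG"] * float(record["PRICE_PER_KG"])}
            for record in records]


source_records = [{"QUANTITY_G": 2500, "PRICE_PER_KG": 4.0},
                  {"QUANTITY_G": 750, "PRICE_PER_KG": 4.0}]

# The intermediate result plays the role of the InfoSource: it exists only so
# that the two transformations can be executed one after the other.
intermediate = first_transformation(source_records)
final = second_transformation(intermediate)

# Hand the final result to a downstream target (open-hub-style): here a CSV
# file that another system could pick up.
with open("downstream_extract.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=sorted(final[0].keys()))
    writer.writeheader()
    writer.writerows(final)
```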

Alternatively, you can generate a simple data flow on the basis of a DataSource. Note that this data flow can only illustrate simple models, since there is no transformation of the data and the data model is not optimized.