System Sizing, Tuning, and Limits
These guidelines cover sizing and tuning your SAP Analytics Cloud system, as well as the limits that apply when acquiring data.
- Data Integration Limits
- File Upload Limits
- Limits when Importing and Exporting Content as Files
- System Sizing and Tuning
Data Integration Limits
Data integration consists of data acquisition and data preparation workflows. Each workflow has its own set of limits and restrictions. Depending on the data sources, content types, and actions, these workflows may be:
- Two independent steps (for example, when invoking the data acquisition workflow to import draft data, then at a later time invoking the data preparation workflow to create a model from draft data).
- One integrated step (for example, when scheduling a model refresh or refreshing a model manually).
- Just a data acquisition workflow (for example, acquiring data from a Business Planning and Consolidation data source, or importing data to a Dataset).
Data Acquisition: File size limits
- Microsoft Excel (XLSX only): Maximum file size: 200 MB.
- Comma-separated values files (CSV): Maximum file size: 2 GB.
- Excel files (XLSX only) in cloud storage through Cloud Elements: Maximum file size: 200 MB.
- CSV files in cloud storage through Cloud Elements: Maximum file size: 2 GB.
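As a sketch only, a local pre-upload check against these documented size limits could look like the following. The helper and its name are hypothetical and not part of any SAP API:

```python
import os

# Documented maximum upload sizes per file type (hypothetical helper, not an SAC API).
MAX_BYTES = {
    ".xlsx": 200 * 1024**2,  # 200 MB for Microsoft Excel files
    ".csv": 2 * 1024**3,     # 2 GB for comma-separated values files
}

def within_upload_limit(path: str) -> bool:
    """Check a local file against the documented acquisition size limits."""
    ext = os.path.splitext(path)[1].lower()
    limit = MAX_BYTES.get(ext)
    if limit is None:
        raise ValueError(f"unsupported file type: {ext}")
    return os.path.getsize(path) <= limit
```

Running such a check before upload avoids starting an acquisition job that is certain to be rejected.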
Data Acquisition: row, column, and cell limits
- Models and stories:
- For SAP BW, SAP Universe, SAP HANA, Google BigQuery, and SQL data sources only: 100,000,000 cells; 100 columns.
- For CSV and XLSX files, there is a limit on file size and a maximum of 2,000,000,000 rows; 100 columns.
- Google Sheets allows a maximum of 5 million cells (CSV and XLSX files stored in Google Drive follow the 2,000,000,000-row, 100-column limit above).
- For all other data sources: 800,000 rows; 100 columns.
- The row limitation doesn't apply when importing data from any version of SAP Business Planning and Consolidation (BPC).
- Datasets:
- For SAP BW, SAP HANA, Google BigQuery, and SQL data sources only: 1,000,000,000 cells; 1,000 columns.
- For CSV and XLSX files, there is a limit on file size and a maximum of 2,000,000,000 rows; 1,000 columns.
- Google Sheets allows a maximum of 5 million cells (CSV and XLSX files stored in Google Drive follow the 2,000,000,000-row, 1,000-column limit above).
- For all other data sources: 1,000,000 rows; 1,000 columns.
For more information, see Differences Between Acquired and Live Datasets.

Caution: When applying a predictive model to an application dataset, Smart Predict generates additional columns. The apply process can be blocked if the application dataset is already close to the 1,000-column limit.
- The maximum number of characters in a cell is 4,998.
- Each tenant can run a maximum of 30 concurrent data acquisition jobs; additional jobs are queued. This limit applies to every SAP Analytics Cloud tenant type, whether private dedicated or shared.
- Each data acquisition job has a maximum run time of 24 hours; jobs that reach this limit are terminated.
Data Preparation/Modeling: Row limits
(These are limits related to the fact table)
Determining row counts:
The number of rows in a fact table is not necessarily the same as the number of rows in the imported data, because imported values are aggregated into fact rows. Depending on how measures and dimension members combine, 2 rows of imported data may become 4 rows in the fact table, while 4 rows of imported data may remain 4 rows in the fact table.
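As an illustrative sketch with hypothetical data (not taken from the product): in a model with an account dimension, each measure column of the imported data is stored as its own fact row, so 2 imported rows with 2 measure columns yield 4 fact rows:

```python
# Hypothetical illustration: each measure column of the imported data
# becomes its own (dimensions, account, value) row in the fact table.
imported = [
    {"Region": "East", "Sales": 100, "Cost": 60},
    {"Region": "West", "Sales": 80,  "Cost": 50},
]

def to_fact_rows(rows, dims, measures):
    """Unpivot measure columns into per-account fact rows."""
    fact = []
    for row in rows:
        for m in measures:
            entry = {d: row[d] for d in dims}
            entry["Account"] = m
            entry["Value"] = row[m]
            fact.append(entry)
    return fact

fact_table = to_fact_rows(imported, dims=["Region"], measures=["Sales", "Cost"])
print(len(fact_table))  # 2 imported rows x 2 measures -> 4 fact rows
```

This is why the fact-table row count, not the imported row count, is what the modeling limits below apply to.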
- Data preparation row limit: The row limit is determined by the data acquisition limit. However, each data preparation job will time out if the browser times out. You can increase the session timeout setting if necessary. See System Sizing and Tuning below.
- Modeling row limit:
- Subsequent data imports to an existing model cannot exceed a total of 2^31-1 (2,147,483,647) rows.
- You cannot import data or schedule a data import into an existing model if the resulting fact data would include over 2^31-1 (2,147,483,647) rows.
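The arithmetic behind this limit can be sketched as follows (the helper and the row counts are hypothetical, for illustration only):

```python
MAX_FACT_ROWS = 2**31 - 1  # 2,147,483,647, the documented fact-table ceiling

def import_allowed(existing_rows: int, new_rows: int) -> bool:
    """Return True if an import would keep the fact table within the limit."""
    return existing_rows + new_rows <= MAX_FACT_ROWS

print(import_allowed(2_100_000_000, 100_000_000))  # False: 2.2 billion exceeds 2^31 - 1
print(import_allowed(1_000_000_000, 500_000_000))  # True: 1.5 billion fits
```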
Data Preparation: General limits
- Analytic models: if there are more than 1,000,000 unique members, the dimension is made read-only.
Data Preparation/Modeling: General limits
(These are limits related to the master data)
- Models and stories with embedded models (that is, when adding data into a story): 100 columns
- Dimension members: 1,000,000
- Dimension members with geo enrichment: 200,000
- Dimension members with parent/child hierarchy: 150,000 (for other kinds of attributes, the 1,000,000 limit applies)
- The maximum length of imported data values is 256 characters.
File Upload Limits
The maximum file size when uploading files on the Files page is 100 MB.
Limits when Importing and Exporting Content as Files
The maximum file size when importing and exporting content as files in the Deployment area is 100 MB.
System Sizing and Tuning
Session Timeout setting
Users must maintain an active browser session during data import and processing (scheduled refreshes excluded). If users frequently encounter session timeouts during large imports, a system administrator can increase the session timeout value.
SAP Analytics Cloud Agent
To handle large data volumes, ensure that the Apache Tomcat instance hosting the SAP Analytics Cloud Agent has at least 4 GB of memory.
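One common way to raise the Tomcat JVM heap is through CATALINA_OPTS in a setenv script; this is a standard Tomcat convention, but the exact path and values below are assumptions about your installation:

```shell
# Hypothetical example: $CATALINA_BASE/bin/setenv.sh (create the file if absent).
# -Xmx4g raises the maximum JVM heap to 4 GB; adjust -Xms/-Xmx to your sizing.
export CATALINA_OPTS="$CATALINA_OPTS -Xms1g -Xmx4g"
```

Restart Tomcat after changing the setting so the new heap size takes effect.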
Data source system tuning
If you plan to import large amounts of data, make sure your data source system is tuned accordingly. Refer to the documentation from your data source vendor for information about server tuning.