Import Data Connection to an SQL Database

You can create a connection that allows you to import data from an on-premises SQL database (JDBC + Freehand SQL).


Before you begin, you must have completed ONE of the following setup options:

  • Performed setup using the SAP Analytics Cloud Agent Simple Deployment Kit. For more information, see SAP Analytics Cloud Agent Simple Deployment Kit.
    JDBC drivers must be installed using the instructions in the Post-Setup Guide included in the kit.
  • Performed a manual setup of the following required components:
    1. The cloud connector is installed. For more information, see Installing the SAPCP Cloud Connector.
    2. The SAP Analytics Cloud agent is installed. For more information, see Installing SAP Analytics Cloud Agent.
    3. JDBC drivers are installed. (Step 7 describes how to install them.)
    4. The cloud connector is configured. For more information, see Configuring the SAPCP Cloud Connector.
    5. The SAP Analytics Cloud agent is configured in SAP Analytics Cloud. For more information, see Configuring SAP Analytics Cloud Agent.

    6. You must restart the SAP Analytics Cloud agent, using either the Java option -DSAP_CLOUD_AGENT_PROPERTIES_PATH or the environment variable SAP_CLOUD_AGENT_PROPERTIES_PATH to specify the complete path up to and including the properties file. Example path: C:\<full path to file>
      • If you choose to use a Java option, restart the agent via the command line by navigating to the tomcat/bin directory and doing the following:
        1. Run the shutdown.bat (Windows) or shutdown.sh (Linux) script.
        2. Open the catalina.bat file in the tomcat/bin directory and find the line where Java options are set. It should look similar to this: set "JAVA_OPTS=%JAVA_OPTS% %LOGGING_CONFIG% -Xms1024m -Xmx10246m -XX:NewSize=256m -XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=4096m"
        3. Modify this line so that the -DSAP_CLOUD_AGENT_PROPERTIES_PATH option is included and points to your .properties file.

          Example: set "JAVA_OPTS=%JAVA_OPTS% %LOGGING_CONFIG% -Xms1024m -Xmx10246m -XX:NewSize=256m -XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=4096m -DSAP_CLOUD_AGENT_PROPERTIES_PATH=C:\<path to driver config file> \"

        4. Run the startup.bat (Windows) or startup.sh (Linux) script.
      • If you choose to use an environment variable, restart the agent as follows:
        • If you're running Tomcat via command line: Shut down the existing Tomcat process, then add the environment variable, and then start Tomcat from a new command line window.
        • If you're running Tomcat as a Windows service: Restart the Tomcat service by using the Tomcat configuration manager.
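        For example, on Windows with Tomcat run from the command line, the environment-variable approach might look like the following. The Tomcat path and the agent.properties file name here are placeholders, not actual values:

        ```
        cd C:\<tomcat install>\bin
        shutdown.bat
        rem "set" is session-scoped; define a system environment variable instead if you want it to persist
        set SAP_CLOUD_AGENT_PROPERTIES_PATH=C:\<path to driver config file>\agent.properties
        startup.bat
        ```

        If you define the variable system-wide instead, start Tomcat from a new command line window so that it picks up the change, as described above.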
    7. A JDBC driver is installed. If you haven't already installed one, follow these steps:
      1. Download JDBC drivers for the databases that you want to connect to, and then place the downloaded .jar file(s) in an appropriate folder. The .jar files can be in any location on your machine as long as the .properties file includes the correct file locations (see the next step). For information about supported databases, see System Requirements and Technical Prerequisites.

        You must use a Simba JDBC driver for Amazon Redshift, Amazon EMR, Apache Hadoop Hive, Cloudera Impala, Apache Spark, and Hortonworks Data Platform.

      2. Create a properties file that specifies the paths to JDBC drivers you want to use, and place it in the same file system where the SAP Analytics Cloud agent is installed. The properties file can have any name.

        Example of a properties file:
        #This file is to specify where your JDBC drivers are on the file system of the machine where the SAP BusinessObjects Cloud Agent is installed
        #For each database you have a driver for, you can remove the # from before the name of the database and specify the path to the jar file
        #If a driver requires multiple jar files, you can separate the paths with semicolons
        #This list corresponds to BOE 4.2 SP5
        #Amazon EMR 5.6 (Hive 2.1)="path_to_JDBCdriver"
        #Amazon EMR Hive 0.11="path_to_JDBCdriver"
        #Amazon EMR Hive 0.13="path_to_JDBCdriver"
        #Amazon Redshift="path_to_JDBCdriver"
        #Apache Hadoop HIVE="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.10="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.12="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.13 HiveServer2="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.14 HiveServer2="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.7="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.8="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.9="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.x HiveServer1="path_to_JDBCdriver"
        #Apache Hadoop Hive 0.x HiveServer2="path_to_JDBCdriver"
        #Apache Hadoop Hive 1.0 HiveServer2="path_to_JDBCdriver"
        #Apache Hadoop Hive 1.x HiveServer2="path_to_JDBCdriver"
        #Apache Hadoop Hive 2.x HiveServer2="path_to_JDBCdriver"
        #Apache Spark 1.0="path_to_JDBCdriver"
        #Apache Spark 2.0="path_to_JDBCdriver"
        #BusinessObjects Data Federator Server XI R3="path_to_JDBCdriver"
        #BusinessObjects Data Federator Server XI R4="path_to_JDBCdriver"
        #Cloudera Impala 1.0="path_to_JDBCdriver"
        #Cloudera Impala 2.0="path_to_JDBCdriver"
        #DB2 10 for LUW="path_to_JDBCdriver"
        #DB2 10 for z/OS="path_to_JDBCdriver"
        #DB2 10.5 for LUW="path_to_JDBCdriver"
        #DB2 11 for LUW="path_to_JDBCdriver"
        #DB2 UDB v5="path_to_JDBCdriver"
        #DB2 UDB v6="path_to_JDBCdriver"
        #DB2 UDB v7="path_to_JDBCdriver"
        #DB2 UDB v8="path_to_JDBCdriver"
        #DB2 for z/OS v11="path_to_JDBCdriver"
        #DB2 for z/OS v12="path_to_JDBCdriver"
        #DB2 v9="path_to_JDBCdriver"
        #Data Federator Server="path_to_JDBCdriver"
        #Data Federator Server XI R3="path_to_JDBCdriver"
        #Data Federator Server XI R4="path_to_JDBCdriver"
        #Generic JDBC datasource="path_to_JDBCdriver"
        #GreenPlum 3="path_to_JDBCdriver"
        #GreenPlum 4="path_to_JDBCdriver"
        #HP Vertica 6.1="path_to_JDBCdriver"
        #HP Vertica 7.1="path_to_JDBCdriver"
        #HP Vertica 8="path_to_JDBCdriver"
        #Hortonworks Data Platform 2.3="path_to_JDBCdriver"
        #Hortonworks Data Platform 2.5="path_to_JDBCdriver"
        #IBM Puredata (Netezza)="path_to_JDBCdriver"
        #IBM Puredata (Netezza) Server 7="path_to_JDBCdriver"
        #Informix Dynamic Server 10="path_to_JDBCdriver"
        #Informix Dynamic Server 11="path_to_JDBCdriver"
        #Informix Dynamic Server 12="path_to_JDBCdriver"
        #Ingres Database 10="path_to_JDBCdriver"
        #Ingres Database 9="path_to_JDBCdriver"
        #MS Parallel Data Warehouse="path_to_JDBCdriver"
        #MS SQL Server="path_to_JDBCdriver"
        #MS SQL Server 2000="path_to_JDBCdriver"
        #MS SQL Server 2005="path_to_JDBCdriver"
        #MS SQL Server 2008="path_to_JDBCdriver"
        #MS SQL Server 2012="path_to_JDBCdriver"
        #MS SQL Server 2014="path_to_JDBCdriver"
        #MS SQL Server 2016="path_to_JDBCdriver"
        #MS SQL Server 2017="path_to_JDBCdriver"
        #MS SQL Server 6.5="path_to_JDBCdriver"
        #MS SQL Server 7.x="path_to_JDBCdriver"
        #MaxDB 7.7="path_to_JDBCdriver"
        #MaxDB 7.9="path_to_JDBCdriver"
        #MySQL 5="path_to_JDBCdriver"
        #Netezza Server="path_to_JDBCdriver"
        #Netezza Server 4="path_to_JDBCdriver"
        #Netezza Server 5="path_to_JDBCdriver"
        #Netezza Server 6="path_to_JDBCdriver"
        #Netezza Server 7="path_to_JDBCdriver"
        #Oracle 10="path_to_JDBCdriver"
        #Oracle 11="path_to_JDBCdriver"
        #Oracle 12="path_to_JDBCdriver"
        #Oracle 12c Release 2="path_to_JDBCdriver"
        #Oracle 7.3="path_to_JDBCdriver"
        #Oracle 8="path_to_JDBCdriver"
        #Oracle 8.0="path_to_JDBCdriver"
        #Oracle 8.1="path_to_JDBCdriver"
        #Oracle 9="path_to_JDBCdriver"
        #Oracle Exadata="path_to_JDBCdriver"
        #Oracle Exadata 11="path_to_JDBCdriver"
        #Oracle Exadata 12="path_to_JDBCdriver"
        #PostgreSQL 8="path_to_JDBCdriver"
        #PostgreSQL 9="path_to_JDBCdriver"
        #Progress OpenEdge 10="path_to_JDBCdriver"
        #Progress OpenEdge 11="path_to_JDBCdriver"
        #Sybase ASIQ 12="path_to_JDBCdriver"
        #Sybase Adaptive Server 11="path_to_JDBCdriver"
        #Sybase Adaptive Server 12="path_to_JDBCdriver"
        #Sybase Adaptive Server 15="path_to_JDBCdriver"
        #Sybase Adaptive Server Enterprise 15="path_to_JDBCdriver"
        #Sybase Adaptive Server Enterprise 15.5="path_to_JDBCdriver"
        #Sybase Adaptive Server Enterprise 15.7="path_to_JDBCdriver"
        #Sybase Adaptive Server Enterprise 16.0="path_to_JDBCdriver"
        #Sybase IQ 15="path_to_JDBCdriver"
        #Sybase IQ 16="path_to_JDBCdriver"
        #Sybase SQL Anywhere 10="path_to_JDBCdriver"
        #Sybase SQL Anywhere 11="path_to_JDBCdriver"
        #Sybase SQL Anywhere 12="path_to_JDBCdriver"
        #Sybase SQL Anywhere 16="path_to_JDBCdriver"
        #Sybase SQL Anywhere 17="path_to_JDBCdriver"
        #Sybase SQLServer 11="path_to_JDBCdriver"
        #Teradata 12="path_to_JDBCdriver"
        #Teradata 13="path_to_JDBCdriver"
        #Teradata 14="path_to_JDBCdriver"
        #Teradata 15="path_to_JDBCdriver"
        #Teradata 16="path_to_JDBCdriver"
        #Teradata V2 R="path_to_JDBCdriver"
        #Teradata V2 R6="path_to_JDBCdriver"
        Any JDBC driver can be added as a Generic JDBC datasource as long as it uses queries that are supported by MySQL.
      3. Modify your properties file by uncommenting the lines for the databases you want to connect to, and enter the path to the driver as the property value. If a driver requires more than one .jar file, separate the paths with semicolons.
        The names of databases in your properties file must EXACTLY match the names shown in this example. If you change the names, the SQL connections will fail.
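        The edits in the last step are easy to get wrong (a stray #, a typo in a path, a missing semicolon). As a quick sanity check before restarting the agent, a small script along the lines of the following sketch can parse the properties file and flag driver .jar paths that don't exist. This is an illustration only, not part of the product:

        ```python
        import os

        def parse_driver_properties(text):
            """Parse an agent driver .properties file into {database name: [jar paths]}.

            Lines starting with '#' are still commented out and are skipped;
            multiple jar paths for one driver are separated by semicolons.
            """
            drivers = {}
            for line in text.splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                name, sep, value = line.partition("=")
                if not sep:
                    continue  # not a name=value line
                paths = [p.strip().strip('"') for p in value.split(";") if p.strip()]
                drivers[name.strip()] = paths
            return drivers

        def missing_jars(drivers):
            """Return (database name, path) pairs whose jar file does not exist."""
            return [(name, path)
                    for name, paths in drivers.items()
                    for path in paths
                    if not os.path.isfile(path)]
        ```

        Running parse_driver_properties on the contents of your properties file and then missing_jars on the result lists every uncommented driver entry whose .jar file can't be found at the specified location.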


  1. In the Modeler, while creating a new model or importing data into an existing model, choose Get data from a data source.
  2. Under Acquire data, select SQL Databases.

    The Create Model from SQL Databases dialog will appear.

  3. Select the Location of your cloud connector from the list.
  4. Select the Connection Type.
  5. Under Connection, select Create New Connection.

    The new connection dialog will appear.

  6. Enter a name and a description for the connection.
  7. In the Datasource Configuration section, select the Location of your cloud connector from the list.
  8. Add your Server information.
    You must use the server name; an IP address cannot be used instead. If you use a named installation of MS SQL Server, the instance name is also required. If the SQL Browser service is running, the port number is not required.
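    For reference, the Server value for MS SQL Server typically follows the usual host\instance and host,port conventions. The server and instance names below are placeholders:

    ```
    MYSERVER                   default instance; SQL Browser or the default port is used
    MYSERVER\SQLEXPRESS        named instance; the SQL Browser service resolves the port
    MYSERVER\SQLEXPRESS,1433   named instance with an explicit port (SQL Browser not needed)
    ```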

  9. Enter your User Name and Password.
    You must use an SQL Server authenticated user.
    If you want to share the credential details, select the option Share these credentials when sharing this connection. Otherwise, users will need to enter their own credentials to use the connection. If you don't share your credentials, users can edit their own credentials at any time without having to start a data acquisition process.
  10. Choose Create.
    The new connection is added to the list of connections on the Connections screen.
  11. If you want to create a model based on this connection, continue to Importing SQL Data to a New or Existing Model.