Modeling Guide

SAP Vora Loader

The SAP Vora Loader operator works as a client for the SAP Vora transaction coordinator and allows you to load data into SAP Vora.

Currently, loading data is supported from HDFS, WebHDFS, S3, and ADLS.

Configuration Parameters

Parameter

Type

Description

connection

object

Holds the connection information for the services.

configurationType

string

connection parameter: The type of connection information to use: manual (user input) or retrieved from the Connection Management Service.

Default: ""

connectionID

string

connection parameter: The ID of the connection information to retrieve from the Connection Management Service.

Default: ""

connectionProperties

object

connection parameter: All connection properties for the selected service, used when the connection information is entered manually.

storage

string

The file service to operate on. The additional parameters depend on the selected service.

Default: "local"

host

string

Hostname of SAP Vora instance.

Default: "localhost"

port

string

Port of SAP Vora instance.

Default: "9090"

tenant

string

Tenant of SAP Vora instance.

Default: ""

user

string

SAP Vora username.

Default: ""

password

string

SAP Vora password.

Default: ""

clientId

string

Mandatory. ADL parameter: The client ID from ADLS.

Default: ""

tenantId

string

Mandatory. ADL parameter: The tenant ID from ADLS.

Default: ""

clientKey

string

Mandatory. ADL parameter: The client key from ADLS.

Default: ""

accountName

string

Mandatory. ADL parameter: The account name from ADLS.

Default: ""

rootPath

string

ADL parameter: The optional root path name for browsing. Starts with a slash (e.g. /MyFolder/MySubfolder).

Default: "/MyFolder/MySubfolder"

host

string

Mandatory. HDFS parameter: The IP address of the Hadoop name node.

Default: "127.0.0.1"

port

string

Mandatory. HDFS parameter: The port of the Hadoop name node.

Default: "9000"

user

string

Mandatory. HDFS parameter: The Hadoop user name.

Default: "hdfs"

rootPath

string

HDFS parameter: The optional root path name for browsing. Starts with a slash (e.g. /MyFolder/MySubfolder).

Default: "/MyFolder/MySubfolder"

accessKey

string

Mandatory. S3 parameter: The AWS access key ID.

Default: "AWSAccessKeyId"

secretKey

string

Mandatory. S3 parameter: The AWS secret access key.

Default: "AWSSecretAccessKey"

endpoint

string

S3 parameter: Allows specifying a custom endpoint, for example http://awsEndpointURL.

Default: ""

awsBucket

string

S3 parameter: AWS bucket name.

Default: ""

region

string

Mandatory. S3 parameter: The AWS region to create the bucket in.

Default: "eu-central-1"

rootPath

string

Mandatory. S3 parameter: The root path name for browsing. Starts with a slash and the bucket name (e.g. /MyBucket/MyFolder).

Default: "/MyBucket/MyFolder"

rootPath

string

WebHDFS parameter: The optional root path name for browsing. Starts with a slash (e.g. /MyFolder/MySubfolder).

Default: "/MyFolder/MySubfolder"

protocol

string

Mandatory. WebHDFS parameter: The scheme used for the WebHDFS connection (webhdfs/http or swebhdfs/https).

Default: "webhdfs"

host

string

Mandatory. WebHDFS parameter: The IP address of the WebHDFS node.

Default: "localhost"

port

string

Mandatory. WebHDFS parameter: The port of the WebHDFS node.

Default: "9000"

user

string

Mandatory. WebHDFS parameter: The WebHDFS user name.

Default: "hdfs"

protocol

string

Mandatory. S3 parameter: The protocol scheme to be used (HTTP or HTTPS).

Default: "HTTP"

initStatements

string

SQL code to be executed when the operator is initialized (see the example configuration after this parameter list).

Default: "CREATE SCHEMA TEST;CREATE TABLE TEST.T1(V VARCHAR(\*));"

tableName

string

Mandatory. The name of the table where the loaded file is inserted.

Default: "TEST.T1"

delimiter

string

Mandatory. The character used to separate column values.

Default: ","

fileFormat

string

The format of the file to be loaded.

Default: "csv"

numRetryAttempts

int

The number of retry attempts.

Default: 0

retryPeriodInMs

int

The waiting time in milliseconds between consecutive retry attempts.

Default: 0

hadoopNamenode

string

The Hadoop name node address to receive data from.

Default: "localhost:9000"

Input

Input

Type

Description

inFilename

string

The storage path of the file to be loaded into SAP Vora through the transaction coordinator.

Output

Output

Type

Description

outResult

string

Either "Successfully executed sql!" or the appropriate error message.