As the integration lead for your organization, help your integration developers design enterprise-grade integration flows.

A design guideline acts as a rule that helps integration developers design an improved integration flow, for example, handling attachments the right way, applying the appropriate security standards, or implementing a specific pattern. Enable all or a subset of the available design guidelines that you think are appropriate for your organization's business needs.

See: Integration Flow Design Guidelines.

Before you enable a design guideline, understand what it does and its implications. Use the in-app help available for each design guideline to learn more.

After you enable a design guideline, all integration flows designed in your tenant must abide by it. You can enable or disable a design guideline anytime.

You can't process JMS and JDBC transactions together in a single transaction.

You can't process distributed transactions. If there's an error after one transaction is committed, the other transaction can't be committed. The message remains in the inbound queue or isn't committed into the outbound queue.

See: Avoid mixing JDBC and JMS transactions.

Keep your data store operations in a local integration process pool to keep the transaction duration shorter.

If transaction handling is activated within an integration flow, the related database transaction is only committed if the whole integration flow execution is completed successfully. If you keep your transaction in a main integration process pool, the transaction is active until the entire process is executed. Each database transaction requires a dedicated connection to the database. If your transaction is open for a longer duration, the number of available connections to the database can potentially be exhausted.

For better design, keep your transactions in a dedicated local integration process pool so that the transaction execution is faster.

See: Control the Number of Simultaneously Opened Database Connections.

Parallel processing of messages can't be transactional.

Splitters and Multicast don't support transaction handling. If you need a transaction, use sequential processing.

See: Parallel Processing of Messages.

Optimize the memory footprint during the execution of your integration scenario.

Avoid using String as the output type unless the next flow step expects a String input. Use ByteArray as the output type where possible, especially when you process large messages.

See: Optimize Memory Footprint.
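
As a minimal Java illustration of the underlying idea (Cloud Integration scripts run on the JVM, so the same java.io APIs are available from Groovy; the class and method names here are invented for the example), the sketch below copies a payload in byte chunks without ever converting it to a String, which would create an additional decoded copy on the heap:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ByteCopy {
    // Copies a payload stream in fixed-size byte chunks. No String is ever
    // materialized, so a large payload is never duplicated as decoded text.
    static byte[] copyAsBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = "large payload...".getBytes(StandardCharsets.UTF_8);
        byte[] copy = copyAsBytes(new ByteArrayInputStream(payload));
        System.out.println(copy.length == payload.length);
    }
}
```
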

Empty or reset all the data that isn't required beyond a multicast branch.

When your flow has multiple branches through a multicast, make sure that before you end a branch via join and gather, you empty or reset all the data that isn't required beyond that step. Each branch duplicates the entire content of the parent branch. Unless you explicitly reset it, the branch data is kept in memory until the integration process ends, leading to unnecessary consumption of memory resources.

See: Reset Data For Every Branch.

When using XPath expressions in your conditions, use absolute paths as much as possible.

Relative XPath expressions are memory-expensive, especially for large documents, because the parser searches the whole document for the specified element.

See: XPATH_CONDITIONS.
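
To make the contrast concrete, here is a minimal, self-contained Java sketch using the standard javax.xml.xpath API (illustrative only; the runtime evaluates your routing conditions for you, and a document this small won't show the cost):

```java
import java.io.StringReader;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.xml.sax.InputSource;

public class XpathDemo {
    public static void main(String[] args) throws Exception {
        String xml = "<Order><Header><Status>NEW</Status></Header></Order>";
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Absolute path: the evaluator walks directly along the named ancestors.
        String absolute = xpath.evaluate("/Order/Header/Status",
                new InputSource(new StringReader(xml)));

        // Relative path with //: forces a scan of the entire document tree
        // for every matching element, which is expensive on large payloads.
        String relative = xpath.evaluate("//Status",
                new InputSource(new StringReader(xml)));

        System.out.println(absolute + " " + relative);
    }
}
```

Both expressions return the same value here; the difference is how much of the document the evaluator has to visit to find it.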

Manage large batch messages appropriately to avoid out-of-memory issues.

When calling an external source, design the integration flow to read and process the data in chunks so that the messages created by the call don't get too large. Certain adapters let you set the Process in Pages property to process the data from the receiver system in pages of a certain size.

See: Manage Large Batch Sizes.
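
The chunked-read pattern can be sketched as follows in Java; fetchPage is a hypothetical stand-in for a receiver system that supports paging, not an actual adapter API:

```java
import java.util.List;

public class PagedReader {
    // Hypothetical data source: stands in for a receiver that supports
    // server-side paging (the idea behind the Process in Pages property).
    static List<String> fetchPage(List<String> source, int offset, int pageSize) {
        if (offset >= source.size()) {
            return List.of();
        }
        return source.subList(offset, Math.min(offset + pageSize, source.size()));
    }

    public static void main(String[] args) {
        List<String> remote = List.of("r1", "r2", "r3", "r4", "r5");
        int pageSize = 2;
        int processed = 0;

        // Read and process the data page by page instead of loading the
        // whole result set into one large message.
        for (int offset = 0; ; offset += pageSize) {
            List<String> page = fetchPage(remote, offset, pageSize);
            if (page.isEmpty()) {
                break;
            }
            processed += page.size(); // process the chunk here
        }
        System.out.println(processed);
    }
}
```
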

Use streaming to process large messages.

Processing large messages can lead to high processing times and high memory consumption, which in turn can cause out-of-memory errors and downtime of the runtime components. In addition to decreasing memory consumption, streaming also improves runtime performance. Redesign your integration scenario using the streaming-related features available in the integration flow components.

See: Optimize Integration Flow Design for Streaming.

An integration flow must be able to appropriately control the incoming traffic.

Many inbound adapters in SAP Integration Suite allow you to block processing of messages that exceed a certain size. If the incoming message size exceeds the predefined size, the integration flow sends back an error message. The subsequent steps of the integration flow aren't processed.

See: Limit Size of Incoming Messages.

Set the timeout parameter for SuccessFactors OData API query based on the data load.

SuccessFactors supports a maximum session duration of 10 minutes. Make sure that the session is reused for the next batch within this time range to trigger the new batch request.

See: Configure the SuccessFactors OData V2 Receiver Adapter.

Use secure authentication methods.

You must create a custom role for inbound communication.

See: Avoid Using a Generic User Role for Sender-Side Authorization.

Disable the processing of doctype declarations in your integration flow if you parse XML data in scripts.

Any application that parses XML data is prone to the risk of XML External Entity processing attacks. Take measures to protect your integration flows that contain script steps against XXE processing attacks.

See: Switch Off Resolving of External Entities.
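
For scripts that parse XML with the standard JAXP APIs, the hardening can be sketched in Java as follows (Cloud Integration scripts run on the JVM, so the same calls apply from Groovy; the feature URIs shown are the ones understood by the JDK's built-in Xerces parser):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class SafeXmlParse {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        // Reject DOCTYPE declarations entirely: with no doctype, no external
        // entity can be declared, which blocks XXE attacks at the root.
        factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        // Defense in depth: also switch off external entity resolution.
        factory.setFeature("http://xml.org/sax/features/external-general-entities", false);
        factory.setFeature("http://xml.org/sax/features/external-parameter-entities", false);

        Document doc = factory.newDocumentBuilder()
                .parse(new InputSource(new StringReader("<payload>ok</payload>")));
        System.out.println(doc.getDocumentElement().getTextContent());
    }
}
```

With this configuration, any document containing a DOCTYPE declaration is rejected before entity resolution can occur.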

Avoid using the default method provided for timezone settings.

Don't use the TimeZone.setDefault method. This method changes the default time zone of the virtual machine, which can lead to multiple technical issues.

See: Avoid Using Default Timezone.
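
A small Java sketch of the alternative (the class name is invented for the example): instead of changing the JVM-wide default, pass the zone explicitly wherever one is needed:

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class ExplicitTimezone {
    public static void main(String[] args) {
        // Don't do this: java.util.TimeZone.setDefault(...) changes the zone
        // for the whole virtual machine and affects everything running on it.
        // TimeZone.setDefault(TimeZone.getTimeZone("UTC"));

        // Instead, make the zone explicit in each date-time operation.
        ZonedDateTime fixed = ZonedDateTime.of(2024, 1, 15, 12, 0, 0, 0,
                ZoneId.of("UTC"));
        String formatted = fixed.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm"));
        System.out.println(formatted + " " + fixed.getZone());
    }
}
```
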

Avoid writing the payload as message processing log attachment.

An Integration Suite tenant comes with a limited storage size. If you expect high message volumes and your integration flow contains many persistency steps, you can easily reach the upper limit of the disk space.

See: Avoid Creating MPL Attachments in Scripts.

Understand how to use JsonSlurper class in Groovy scripts.

When using the JsonSlurper class in Groovy scripts, stream the message body to the JsonSlurper by using message.getBody(java.io.Reader). Parsing the message body with JsonSlurper without streaming can lead to out-of-memory issues.

See: Use of JsonSlurper.

Understand how to use XMLSlurper class in Groovy scripts.

When using the XMLSlurper class in Groovy scripts, stream the message body to the XMLSlurper by using message.getBody(java.io.Reader). Parsing the message body with XMLSlurper without streaming can lead to out-of-memory issues.

See: Use of XMLSlurper.

Define your script’s import statement so that it consumes supported native APIs.

We don’t recommend using unsupported external Java archives (.jar) uploaded as integration flow resources. Instead, use the supported native APIs, such as Groovy and the Cloud Integration capability's SDK.

See: Use Only Supported External Libraries.

Understand how static analysis of your scripts works.

We analyze your Groovy scripts for defects, bad practices, inconsistencies, and style issues. Your script must clear all such static checks.

See: Static Analysis of Groovy Scripts.

Usage of Eval classes can lead to out-of-memory situations.

As a best practice, avoid using Eval or GroovyShell classes for evaluating expressions or scripts. The recommendation is to use native Groovy functions to evaluate expressions. Usage of eval methods or GroovyShell classes can lead to multiple errors, including out-of-memory issues.

See: Avoid Using Eval Classes.

Avoid accessing secure parameters and assigning them to headers or properties.

As a best practice, don't access sensitive security information using script functions or store it in a header or property. When you store sensitive information as a header or a property, you risk exposing this data accidentally. For example, a header can be sent to a receiver system if you take no further actions to prevent this.

See: Avoid Accessing Secure Parameters in Scripts.

Use a secure authentication method when connecting to external systems.

Integration Suite offers a range of authentication methods when accessing integration flow endpoints (adapter sender channels), connecting to external systems via adapter receiver channels, as well as when consuming public OData APIs. Depending on the adapter and channel type, different authentication options are supported.

See: Use Secure Authentication Methods.

Upload WSDLs to an integration flow instead of using external references.

Always upload and use WSDLs as integration flow resources. Also, make sure that externally referenced XSD definitions are either resolved as part of the uploaded WSDL or uploaded as additional resources.

See: Upload WSDLs as Integration Flow Resources.

Use encrypted protocols to ensure secure communication.

To ensure secure communication with remote systems, check the protocols that are supported by the system. It is recommended that you use encrypted protocols, if supported by the system.

See: Use Secure Protocols.

Protect the system against cross-site request forgery attacks when using HTTPS sender adapter.

HTTP-based adapters are subject to CSRF attacks and therefore offer protection against them. When an integration flow can be called via modifying HTTP requests (POST, PUT, DELETE, PATCH), make sure that the CSRF protection feature is always activated.

See: Use CSRF Protection.

Avoid exhausting the resources of data store.

If you keep performing write operations on the data store, you continue to consume the tenant storage and eventually exhaust the resources. Remember that the tenant space is limited and shared with other tenant data like message processing logs.

Read the information from the data store whenever required, and delete entries after use. If you keep performing write operations without reading or deleting, you soon exhaust the resources of the tenant.

See: Anticipate Message Throughput When Choosing a Storage Option.
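
The write, read, and delete lifecycle can be sketched as follows; the Map here is a hypothetical in-memory stand-in for the data store, chosen to show the lifecycle rather than the actual API:

```java
import java.util.HashMap;
import java.util.Map;

public class DataStoreLifecycle {
    public static void main(String[] args) {
        // Hypothetical stand-in for the tenant data store: the point is the
        // write -> read -> delete lifecycle, not the concrete storage API.
        Map<String, String> dataStore = new HashMap<>();

        dataStore.put("order-4711", "<payload/>");   // write operation
        String entry = dataStore.get("order-4711");  // read when required
        dataStore.remove("order-4711");              // delete after use

        // Storage stays bounded because every write is eventually deleted.
        System.out.println(entry + " " + dataStore.size());
    }
}
```
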

Handle exceptions by extending an integration flow by adding an exception subprocess flow step.

Exceptions that occur during message processing are identified and handled by the logic implemented in the exception subprocess. The exception subprocess can distinguish between different error situations and, according to the error category, send back a custom error message to the sender application system.

See: Handle Exceptions.

Handle exceptions by extending every local integration process pool with an exception subprocess.

In addition to handling exceptions in the main Integration Process pool, implement an exception subprocess in the local integration process pools as well.

See: Handle Exceptions.

Persist the data in transit using the JMS adapter to avoid message processing failure.

Integration Suite offers storage to persist data in transit during message processing, using JMS adapter's message queue storage. Use this storage to persist the message at the beginning of the processing sequence. That way, processing is executed faster for the sender, who immediately receives a response, and the subsequent processing steps are executed asynchronously.

See: Apply the Retry Pattern with JMS Queue.

Write standard or custom header properties to store business- or payload-related information in the message processing log.

The message monitoring view supports search based on custom header properties. In addition, search is also supported via the Cloud Integration capability's OData API. While designing your integration flow, define standard and custom header properties. Later, in message monitoring, you can search the processed messages based on the defined headers to look for specific information.

See: Use Custom Header Properties to Search for Message Processing Logs.

Remove all flow steps that are unused and not connected via message flows.

See: Consider Basic Layout Principles.

Externalizing volatile properties of a receiver adapter increases its level of maintainability.

Certain parameters of a receiver adapter must be changed depending on the environment in which the integration flow is executed. We refer to such parameters as volatile because they need to be adapted each time an integration flow is deployed.

By externalizing such properties, you don't need to understand the complete logic of the integration flow to perform a change. In addition, using externalized parameters allows you to reuse certain configurations across the integration flow.

See: Externalize Volatile Configurations.

Externalizing volatile properties of a sender adapter increases its level of maintainability.

Certain parameters of a sender adapter must be changed depending on the environment in which the integration flow is executed. We refer to such parameters as volatile because they need to be adapted each time an integration flow is deployed.

By externalizing such properties, you don't need to understand the complete logic of the integration flow to perform a change. In addition, using externalized parameters allows you to reuse certain configurations across the integration flow.

See: Externalize Volatile Configurations.

Create additional integration flows to handle a larger number of process pools and flow steps.

You can only add a certain number of process pools to an integration flow. Also, there is a maximum limit on the number of flow steps that you can add inside a process pool.

See: Consider Basic Layout Principles.