Integration Content Catalogs in SAP for Standard Interfaces

Looking for standard SAP interfaces?

There are many interfaces and a lot of integration content available that you can use for your implementation project with SAP Process Orchestration (SAP PRO) / SAP Process Integration (SAP PI) or SAP HANA Cloud Platform, integration service (SAP HCI).

Here is an overview:



  • SAP API Hub
  • SAP Content Hub
  • SAP App Center
  • Enterprise Services Workplace
  • SAP Best Practices Explorer


IDocs

  • You have to check in transaction WE60 or in the SAP Help which IDoc might fit your needs. Please consider that IDocs are no longer delivered by SAP since 2003…

WHINT InterfaceCatalog for SAP Process Orchestration / PI

Functionality

This solution provides you with an overview of your integration landscape: all information from the configuration (Integration Directory) of your SAP Process Orchestration system. The tool downloads (e.g. periodically)

  • the Interface Catalog (based on all Integrated Configurations),
  • the Channel Catalog (based on all channels or on all channels used in the Integrated Configurations) and
  • the Interface Catalog Classic (optionally for PI only, based on Dual-Stack classic routings).

as an Excel Sheet.

All routing information can be read, as well as all communication channel attributes (e.g. host name) and all adapter modules including their parameters.

It answers questions like the following (a small offline lookup sketch follows the list):

  • What are the routing conditions for Partner X or IDoc Y?
  • Where do we use a specific FTP host?
  • Where do we use Content Conversion (also via the MessageTransformBean)?
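Since the catalogs are generated as plain Excel sheets, you can also answer such questions offline with a few lines of code. Here is a minimal sketch using Apache POI – the file name and the search term are just examples, and the tool itself answers these questions without any coding:

```java
import java.io.File;
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

// Scan every cell of the exported catalog for a search term
// (e.g. an FTP host) and print the matching row numbers.
// File name and search term are placeholders - adjust to your export.
public class CatalogSearch {
    public static void main(String[] args) throws Exception {
        String searchTerm = "ftp.example.com"; // hypothetical FTP host
        try (Workbook wb = WorkbookFactory.create(new File("ChannelCatalog.xlsx"))) {
            DataFormatter fmt = new DataFormatter();
            Sheet sheet = wb.getSheetAt(0);
            for (Row row : sheet) {
                for (Cell cell : row) {
                    if (fmt.formatCellValue(cell).contains(searchTerm)) {
                        System.out.println("Match in row " + (row.getRowNum() + 1));
                        break;
                    }
                }
            }
        }
    }
}
```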

The solution is not invasive (no separate installation needed) and runs entirely as interfaces using standard PI/Process Orchestration technology. Only read access via the Integration Directory API is required.



Prerequisites

  1. Import the TPZ file provided by Whitepaper InterfaceDesign into the ESR
  2. Configure the Process Integration Scenario using the Swing Client (Integration Builder: Directory) or Eclipse (SAP NetWeaver Developer Studio – NWDS) – see Configuration Guide below



Configuration Guide

  1. Create a user for access via the Integration Directory API (used by the SOAP receiver lookup channels) with the roles SAP_XI_API_DISPLAY_J2EE and SAP_XI_APPL_SERV_USER (a small connectivity check sketch follows this list)
  2. Create two components according to your naming conventions, e.g. WHINT_IFC_Reader and <SID>_FILESYSTEM (in PI 7.11 it must be a Business System; in newer releases it can be either a Business Component or a Business System)
  3. Create all necessary SOAP Lookup Channels using the Channel Templates provided for the Objects
    • Communication Channel
    • Integrated Configuration
    • Configuration Scenario
    • Classic only: Receiver Determination
    • Classic only: Interface Determination
    • Classic only: Sender Agreement
    • Classic only: Receiver Agreement
  4. Start the wizard with the Process Integration Scenario WHINT_InterfaceCatalog, select the Component View matching your release and assign the systems to generate your iFlows/ICOs
  5. Assign the lookup channel parameters and select the correct NFS path where the files are written and read
  6. Repeat this activity for the Process Integration Scenario WHINT_InterfaceCatalog_Classic if you also want to create the Dual-Stack InterfaceCatalog
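Before starting the scenario, it can save time to verify that the lookup user from step 1 actually has access. The following is a minimal, hypothetical sketch in plain Java (not part of the WHINT delivery): it performs an authenticated HTTP GET against a Directory API URL of your system – for example the WSDL URL you maintain in the SOAP lookup channels – and prints the HTTP status code. URL, user and password are placeholders passed on the command line.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Smoke test for the Integration Directory API lookup user:
// HTTP 200 means URL and credentials are fine, 401/403 usually points
// to wrong credentials or missing roles. Usage:
//   java DirectoryApiCheck <directory-api-url> <user> <password>
public class DirectoryApiCheck {
    public static void main(String[] args) throws Exception {
        String url = args[0], user = args[1], password = args[2];
        String auth = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP status: " + response.statusCode());
    }
}
```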



Messages at Runtime



How to check if a file exists with SAP PRO/SAP PI

Sometimes we have to check whether a file has been written to a file system and trigger actions based on that. With PI, we usually poll a directory using the File Sender Adapter (e.g. in TEST mode if we cannot delete the file). Here is a smarter way:

  1. Schedule a Trigger that runs the File Check periodically (based on your requirement)
  2. Read the directory of the NFS/FTP/SFTP using the FileReader Adapter

Example Configuration

  • Sender: SOAP Message coming from WHINT MessageTrigger Job
  • Receiver: FileReader Query using WHINT FileReader Adapter
  • Both sides use the Business Component “FileChecker” here

Create the Business Component.


Define the iFlow/Integrated Configuration (SOAP to FileReader).


Configure the FileReader Adapter channel with the file name you are looking for.


As the FileReader Adapter reads synchronously, we have to switch from async to sync using the RequestResponseBean and back to async using the ResponseOnewayBean.


If the file is found, we need a second (dummy) iFlow/ICO that receives the response message from the adapter without processing it.



Set up a MessageTriggerJob in NWA / Operations / Jobs according to your needs (the period can be configured very flexibly: daily, every 10 minutes, …).


If the file cannot be found, the message goes into error.


VOILÀ – here is the message in error (System Error).
If you have defined alerting, an automatic e-mail is triggered!

If the file is found, no additional message is generated. This behaviour can be changed by adjusting the routing of the dummy iFlow.
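For comparison: outside of PI – e.g. for a quick local test against an NFS mount – the same trigger-and-check logic is only a few lines of plain Java. This is just a conceptual sketch (path, interval and the "alert" are made up) and does not replace the setup above, which gives you monitoring, alerting and message persistence for free:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Conceptual equivalent of the MessageTrigger + FileReader setup:
// periodically check whether an expected file exists and raise an
// "alert" (here just a log line) if it does not.
public class FileCheck {
    public static void main(String[] args) {
        Path expectedFile = Path.of("/nfs/inbound/orders.csv"); // hypothetical path
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            if (Files.exists(expectedFile)) {
                System.out.println("OK: file found - " + expectedFile);
            } else {
                // in the PI scenario this is where the message goes into
                // (system) error and the alert e-mail is sent
                System.err.println("ALERT: expected file is missing - " + expectedFile);
            }
        }, 0, 10, TimeUnit.MINUTES); // e.g. every 10 minutes, like the job above
    }
}
```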


This blog post uses the following stand-alone solutions: WHINT MessageTrigger and WHINT FileReader.

Usage Dependencies in Software Components

There is a useful functionality for reusing objects from one software component in another.

You can define this usage dependency in SLD or in ESR (software dependencies & underlying software components).

It is needed if you want to enhance standard Enterprise Services or when you create a shared pool of objects that is strictly reused and not copied into other software components, for example:

  • Data Type Enhancements (to enhance SAP Data Types)
  • Central Mappings, Function Libraries

Please do not use it to share interface objects across several components/systems.

Different release, deployment and maintenance cycles in the connected systems require a clearly decoupled approach, which is also reflected by the software components. If your data types/message types/service interfaces are identical in both software components, simply copy them over. From then on, you have to add a new field in both components (or you copy again, overwriting the target object).

Key takeaway: Decoupling is more important than reuse (in this case)!

No Mapping required!

As your software components will use their own namespaces, you should be careful when designing the messages. When you copy a message type from SWC_1 to SWC_2, the XML namespace in SWC_2 will automatically keep the namespace of SWC_1. To use this feature smartly, simply use a cross-component, company-wide, global namespace.

XML Namespace in the Message Type: http://<company-domain>/xi/GLOBAL
Btw: This approach is also used by SAP to design their namespaces.

Example

  • SWC_1: OTC_RETAILER 2016 of whitepaper-id.com
    Namespace: http://whint.de/xi/OTC/RETAILER
  • SWC_2: OTC_SAP_ERP 6.0 of whitepaper-id.com
    Namespace: http://whint.de/xi/OTC/SAP/ERP
  • XML Namespace across all SWC: http://whint.de/xi/GLOBAL

SAP Plain J2SE Adapter Engine in Proxy Mode

In case you want to track your HTTP requests (e.g. to SOAP / HTTP / REST receivers), the SAP Plain J2SE Adapter Engine might offer a nice surprise.

You simply have to reactivate the adapter type “httpwatcher” and then you can easily set up your own proxy server.


In the Log of this HTTP Watcher Proxy you can see the complete HTTP (not HTTPS) traffic that is going through!
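To make traffic show up in the watcher log, the calling client has to be pointed at the proxy (receiver channels usually offer proxy host/port settings for this). As a generic, hypothetical illustration, this is how a plain Java client would route a request through it – proxy host, port and target URL are placeholders for your own setup:

```java
import java.net.InetSocketAddress;
import java.net.ProxySelector;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Send an outbound HTTP call through the HTTP watcher proxy so the
// complete request/response appears in its log. Remember: only plain
// HTTP is readable this way, HTTPS stays encrypted.
public class ViaWatcherProxy {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newBuilder()
                .proxy(ProxySelector.of(new InetSocketAddress("proxyhost", 8080))) // watcher host/port
                .build();
        HttpRequest request = HttpRequest.newBuilder(
                        URI.create("http://receiver.example.com/service")) // plain HTTP target
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP status: " + response.statusCode());
    }
}
```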

 

Decoupling and service-orientation at Amazon

Great article at API Evangelist about Amazon's transformation, about a decade ago, from an online bookseller into a billion-dollar IaaS/cloud computing leader.

It is a clear and direct mandate, issued by Jeff Bezos (CEO and founder), to make sure all teams interact through well-defined service interfaces and do not interact on a point-to-point level.

My favorite sentence: “Anyone who doesn’t do this will be fired. Thank you; have a nice day!”

Hybrid Integration / OnPremise & Cloud Connectivity

Now, in 2015, I believe in two major future trends becoming reality within the next 10 years:

  • Cars will be mainly electric
  • Software will mainly run in the cloud (outside of companies' own data centers)

Getting there will sometimes mean selling your beloved diesel car and buying a Tesla. This is a really big change, as the charging infrastructure is not as good as the classic network of gas stations. So doing this in Germany/Europe means you are an early adopter and sacrifice some luxury to be innovative.

The same applies to software managed in the cloud. No company is interested in driving projects just to replace their existing IT systems running in a server room / data center and have them hosted and managed by another company… Also, simply hosting your current IT landscape at some provider does not deliver the benefits you might be looking for. You still have to buy the licenses and decide which hardware/database you want to use. If you are lucky, updates will be managed by the hosting provider, but that's usually it.

But the future is clear: you pay for your software on a subscription basis, it is always up to date (no headaches about release changes and versioning), it is fully scalable, and you do not have to worry about hardware, security and so on…

To get there over time, a pragmatic approach could be to replace your “old” applications by looking for cloud alternatives. Instead of installing a new release and/or migrating to a newer version, you make the paradigm shift to the cloud step by step.

Of course there will be questions about security, availability, performance, SLAs, change management and support, but that's why you have an IT department. They should work those things out (with some help from good partners).


So instead of replacing everything in one shot, a hybrid model seems to be the first choice: some applications run on premise, and some are (already) running in the cloud. I guess this concept also applies to cars for the time being: plug-in hybrid cars offer a good mix of existing gas-station infrastructure while pointing in the right direction with electric-only driving modes.


The next question will then be: how can I INTEGRATE my on-premise applications with my cloud applications in a smart way?

  • Option 1: You already have an on-premise middleware (SAP PI/PO, Microsoft BizTalk, …)
    This works well if you connect OP (on premise) – OD (on demand/cloud).
  • Option 2: You decide for a cloud middleware
    This does not mean you can get rid of the OP middleware, but if you connect OD – OD it would be bad to go through an OP component (the same applies to OP – OP connectivity).
    So you keep your OP middleware and enhance it with a cloud (OD) middleware.

For SAP customers using SAP NetWeaver PI or PO (PRO), it seems logical to use SAP HANA Cloud Integration (HCI), as the concepts are quite similar (although more flexible) and OP mappings can be reused… Moreover, there will be a hybrid option in the future where runtime elements from HCI run on your PRO installation in runtime containers.

If you do not want to narrow your options down to one single platform, you can be quite relaxed. There are many great solutions out there, including Dell Boomi and elastic.io, so the reality might be to integrate through multiple cloud middleware solutions, depending on what they are able to provide (existing content for plug-and-play connectivity) or on the business application you want to integrate.

Avoid using flat files!

This is about why we should not use flat files anymore:


Integration across systems in the 1990s looked like this:
– The sender system runs an export program in batch to produce a flat file
– A transfer script sends the flat file via FTP to the receiver system
– The receiver system runs an import program in batch to update its database


What is so bad about it:

– STORAGE: storing files on a (local) file system is not good from a security and auditing point of view. Any intruder who is able to access the file system can read the data (unless it is encrypted right away, e.g. via PGP). It is better to persist your (especially sensitive) data in a database.
– TRANSFER: the transmission protocol FTP is not very secure and not always reliable (depending on the FTP server implementation). To secure the transmission, use at least SSH/SSL encryption via SFTP/FTPS.
– FORMAT: there are issues with flat files in general. The change management is horrible if you want to extend something (all senders and receivers have to switch their export/import programs in one shot) – and believe me: there will always be changes…

The only advantage is file size.
Nothing is leaner than a flat file, especially if you deal with high volumes/large messages.
But especially in SAP projects we have already learned: sometimes it is better to go a structured and organized way instead of choosing the lightweight, easy approach.


Options to improve existing integration flows (keeping the flat file format for the time being or for message-size reasons) – a small sketch in plain Java follows this list:
– Clean up the SAP side first where you can instead of redesigning the flow completely: use an ABAP proxy with the flat file as an attachment to avoid the STORAGE problem and solve the TRANSFER issue by using HTTP(S):
– TRANSFER: no more FTP to/from SAP servers, only RFC or HTTP
– STORAGE: change your programs to write files into memory only and then pass the data to an ABAP proxy. This is of course only possible for your own programs, but SAP is also modernizing the way standards-based files are transferred.
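As a hypothetical illustration of the TRANSFER and STORAGE points (not an ABAP proxy, just the same idea in plain Java): the content stays in memory and is pushed over HTTPS with authentication instead of being written to disk and moved around via FTP. Endpoint URL and credentials are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Build the payload in memory and transfer it over HTTPS instead of
// writing a flat file to disk and sending it via FTP.
public class InMemoryHttpsTransfer {
    public static void main(String[] args) throws Exception {
        byte[] payload = "4711;2016-05-01;42.00".getBytes(StandardCharsets.UTF_8); // flat content, memory only
        String auth = Base64.getEncoder()
                .encodeToString("user:password".getBytes(StandardCharsets.UTF_8)); // placeholder credentials

        HttpRequest request = HttpRequest.newBuilder(
                        URI.create("https://integration.example.com/inbound/orders")) // placeholder endpoint
                .header("Authorization", "Basic " + auth)
                .header("Content-Type", "text/plain")
                .POST(HttpRequest.BodyPublishers.ofByteArray(payload))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP status: " + response.statusCode());
    }
}
```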

For new integrations simply try to avoid flat files. They are not state-of-the-art anymore.

SAP PO Integration Patterns & BPM content

SCN article about Integration Patterns for SAP NetWeaver Process Orchestration (PO)

  • stateless patterns via AEX connectivity
  • stateful patterns via BPM connectivity

Download the enterprise pattern reference models for BPM modeling from SCN