Thursday, November 5, 2015

BizTalk 2010 integrating with SAP using a custom web service instead of the WCF-SAP LOB adapter part 1

There are a lot of experts blogging about interfacing BizTalk with SAP using the WCF-SAP LOB Adapter.  Off the top of my head, two come to mind, Sandro Pereira and Kent Weare.  They have numerous posts on installing the adapter, configuring the adapter, and receiving/sending XML IDocs.

In this post, I'm going to detail a scenario where we didn't use the BizTalk WCF-SAP LOB Adapter.  Instead, BizTalk was used to integrate with a custom web service created in SAP to pull information.  Part 1 will cover consuming the WSDL and creating the send port.  Part 2 will cover calling the web service and wrap things up.

To start off, I have very little knowledge of or experience working in SAP.  I primarily rely on an SAP team to understand the inner workings of that system.  I make that point to set the stage: this post is primarily BizTalk related.  Creating and configuring SAP to expose a web service is outside the scope of this post.

So why no WCF-SAP LOB Adapter?  I actually discussed the options with the SAP team, and together we decided the better option was to try to develop a custom web service on the SAP side.  I'll probably get some dissension from the BizTalk community for designing the solution this way, but I'm not going to go into all the details of the reasoning here.  Suffice it to say, it was the better option for this particular process.

The specific interface I'll discuss is a synchronous web service call from BizTalk to SAP for currency exchange data.  Below are the steps I took to get BizTalk configured to interface with a custom SAP web service.  I'm including some of the problems that I encountered along the way.

Step 1: Consuming the WSDL


Just like with most of the other web services I've worked on, I started off by trying to consume the WSDL provided by the SAP team.  From a development perspective, the tool used to consume that WSDL is the "BizTalk WCF Service Consuming Wizard", found in Visual Studio when selecting "Add Generated Items".  There are two benefits to using the wizard.  The first is getting a working schema of the data being messaged.  The second is the binding file created by the wizard (I'll discuss this file later on).

One difference from other web service WSDLs I've worked with: a file was provided instead of a URL.  No problem, the wizard can handle that:


I then added the WSDL file provided by the SAP team:


After hitting the "Next" button and clicking "Finish", I received the following error:


I have encountered the dreaded "Object reference not set to an instance of an object" error at other times when using the BizTalk WCF consuming wizard.  If you notice, the wizard expects an .xsd file as well as the .wsdl file.  However, if an .xsd file isn't available, how do you get around this issue?  In this case, the schema was provided within the WSDL; however, sometimes the targetNamespace in the schema record of the WSDL is missing.  Sure enough, when I opened it up, it was that exact problem:
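To make the problem concrete, here is a sketch of the shape of the issue (the element names and namespace below are invented for illustration, not taken from the actual SAP WSDL):

    <!-- Before: the inline schema has no targetNamespace, which trips up the wizard -->
    <wsdl:types>
      <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
        ...
      </xsd:schema>
    </wsdl:types>

    <!-- After: a temporary targetNamespace added so the wizard can generate the .xsd -->
    <wsdl:types>
      <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                  targetNamespace="http://tempuri.org/zfi_exchange_rate_pull2">
        ...
      </xsd:schema>
    </wsdl:types>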


After inserting a temporary targetNamespace in the .wsdl file and re-running the wizard, it was able to successfully create the .xsd from the WSDL:


Here is an expanded view of the xsd created.  Remember, this is a synchronous web service, so there is both a request and response message:


Step 2: Creating the send port

In addition to the xsd, the wizard also creates another resource that is of some use: the files ending in BindingInfo.xml.  These files can be used to generate a configured WCF send port in the BizTalk Administration Console.  In general, I tend to use the file ending in Custom.BindingInfo.xml because it gives you more flexibility in configuring the binding when it is created.

To import the binding file, first open up the BizTalk Administration Console.  You can then right click on the application you want to import the binding into, which will give you the option to "Import" the "Bindings":

After selecting the correct binding file:


The application should import the appropriate binding information and create the send port.  However, in this case, no port was created.  So I opened up the zfi_exchange_rate_pull2_Custom.BindingInfo.xml file to see what it contained:
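The send port section looked roughly like the sketch below (a simplified reconstruction from memory, with the root element's attributes trimmed):

    <BindingInfo>
      <ModuleRefCollection />
      <!-- SendPortCollection is empty, so no send port gets created on import -->
      <SendPortCollection />
      <DistributionListCollection />
      <ReceivePortCollection />
    </BindingInfo>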


So why is the SendPortCollection record empty?  In this case, the BizTalk WCF Service Consuming Wizard must have had some issues with the WSDL file.  That means I had to go and create the send port manually.  As previously stated, I try to use the WCF-Custom send port whenever possible.  Here are some of the steps I took when creating the send port:

To create the send port, I needed some assistance from the WSDL file.  Specifically, I looked for the location attribute in the soap:address section of the file:
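It sits in a section shaped something like this (the host, port, and path are invented for illustration; SAP-exposed SOAP services typically live under /sap/bc/srt/):

    <wsdl:service name="zfi_exchange_rate_pull2">
      <wsdl:port name="zfi_exchange_rate_pull2" binding="tns:zfi_exchange_rate_pull2">
        <soap:address location="http://sapserver:8000/sap/bc/srt/rfc/sap/zfi_exchange_rate_pull2" />
      </wsdl:port>
    </wsdl:service>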



The location attribute is what you need to use for the address URI property in the send port:

The second property of importance on the send port is the Action property under the SOAP Action header area.  I pulled that value from the soap:operation element under the wsdl:operation record of the WSDL:
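In the WSDL, it lives in a section like the following (the soapAction value here is a made-up example; use whatever value your WSDL actually specifies):

    <wsdl:binding name="zfi_exchange_rate_pull2" type="tns:zfi_exchange_rate_pull2">
      <soap:binding transport="http://schemas.xmlsoap.org/soap/http" style="document" />
      <wsdl:operation name="ZfiExchangeRatePull2">
        <soap:operation soapAction="urn:sap-com:document:sap:soap:functions:mc-style:ZFI_EXCHANGE_RATE_PULL2" />
      </wsdl:operation>
    </wsdl:binding>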


Here is what the send port looks like with the SOAP Action Header filled:


The last piece of configuring the send port was under the "Binding" tab.  When selecting a WCF-Custom send port, you need to manually configure the binding as well.  The first step is to select the Binding Type.  In this instance, I used the "customBinding".  In addition, I had to go into the "messageVersion" property under the textMessageEncoding extension and change the value to "Soap11":
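Expressed as WCF configuration XML, the resulting binding amounts to something like this (the binding name is arbitrary; only the messageVersion deviates from the defaults):

    <customBinding>
      <binding name="zfi_exchange_rate_pull2_CustomBinding">
        <!-- Soap11 matches what the SAP web service expects -->
        <textMessageEncoding messageVersion="Soap11" />
        <httpTransport />
      </binding>
    </customBinding>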



I left all other send port properties at their defaults.  In the next post, I'll explain how I went about testing the web service.  I'll also touch on how this interface may impact future design decisions.

Sunday, July 12, 2015

Business Activity Monitoring (BAM), BAMPrimaryImport, OLAP Cubes, and archiving

There are already a lot of great posts about Business Activity Monitoring (BAM) and how to go about setting up archiving.  In particular, the blog post by Richard Seroter and the post by BizTalk Bill are two I closely followed to set up my environment.  Since originally configuring the archiving, I've been super slammed and have only just come back around to performing some environment maintenance.

Needless to say, I was a bit surprised to see that my BAMPrimaryImport database had grown to almost 40 GB:


Wait, what???  I went through and checked all my configurations to make sure everything was running correctly.  First, I checked to make sure the SQL job that I had created to run all the BAM SSIS packages was working.  BizTalk Bill had set up an SSIS package to do this, but I went down the route of creating two SQL jobs: the first used to dynamically build and execute the list of packages to run (any package starting with DM_), and the second to control and monitor the execution of the first.  Both jobs looked like they were running successfully (these run on a nightly basis).
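For anyone curious, the "build and execute" step of the first job is conceptually similar to the T-SQL sketch below.  This is a simplification, and it assumes the BAM SSIS packages are deployed to MSDB and that xp_cmdshell is enabled, so treat it as a starting point rather than the exact job definition:

    DECLARE @pkg sysname, @cmd varchar(1000);
    DECLARE pkgs CURSOR FOR
        SELECT [name] FROM msdb.dbo.sysssispackages
        WHERE [name] LIKE 'DM[_]%';
    OPEN pkgs;
    FETCH NEXT FROM pkgs INTO @pkg;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- run each data maintenance (DM_) package via dtexec
        SET @cmd = 'dtexec /SQL "\' + @pkg + '" /SERVER "(local)"';
        EXEC master..xp_cmdshell @cmd;
        FETCH NEXT FROM pkgs INTO @pkg;
    END
    CLOSE pkgs;
    DEALLOCATE pkgs;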

My second check was to look at the BAMPrimaryImport database tables.  I first did a visual inspection, in which I noticed an unusual number of partition tables for the first message being tracked using BAM activities and views:



Secondly, I ran a quick SQL command to return the number of tables in the BAMPrimaryImport database:
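Something along these lines:

    SELECT COUNT(*) AS TableCount
    FROM BAMPrimaryImport.sys.tables;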


I had over 7000 tables in the BAMPrimaryImport database!!!  Granted, we do a lot of BAM tracking on all of our different messages, but that number sounded excessive.  So I wanted to confirm what Richard Seroter had written in his blog and looked at the Metadata_Activities table to see how long data should be kept before archiving.  As I suspected, it was configured to only keep a month's worth of tracking data:
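The check itself is a simple select (table and column names are from memory, so verify them against your own BAMPrimaryImport database):

    SELECT ActivityName, OnlineWindowTimeUnit, OnlineWindowTimeLength
    FROM BAMPrimaryImport.dbo.bam_Metadata_Activities;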




So from the above I could determine:

  1. SQL Jobs were running - bueno
  2. SSIS packages creating table partitions - bueno
  3. Partition tables being archived to the BAMArchive database - no bueno
So why were the partition tables not being moved to the BAMArchive database?  Looking at the properties on some of the tables, I could see they had been created way back in October and November of 2014.  While revisiting the aforementioned blog posts, I noticed something different in my Integration Services environment: I had what appeared to be a lot of SSIS packages starting not only with "DM_", but with "AN_" as well.

I did a quick Google search on the "AN_" SSIS packages and found a great article by the Microsoft India team.  In the last paragraph of the article, I found my problem.  It appears that if your BAM tracking takes advantage of creating OLAP cubes, you need to set up the SSIS packages that begin with "AN_" to run daily.  If you fail to do this, the partition tables will fail to be moved to the BAMArchive database.

As a test, I went ahead and ran the "AN_" SSIS package for the first tracked message.  I then executed the "DM_" SSIS package for that same tracked message.  Sure enough, the appropriate partition tables were moved to the BAMArchive database:
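In dtexec terms, the test boiled down to something like the following (the view name is a made-up example):

    dtexec /SQL "\AN_CurrencyExchangeView" /SERVER "(local)"
    dtexec /SQL "\DM_CurrencyExchangeView" /SERVER "(local)"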


In order to play it safe and get the appropriate tables archived, I manually ran each SSIS package with the "AN_" prefix (all 143 of them).  What's worse about this whole ordeal is that the current environment in which I work doesn't even use these OLAP cubes (you can read why here).  I have to admit this was a sloppy mistake on my part, and it shows how little I really understood the archiving process when using BAM.  On a positive note, maybe I can convince the BizTalk360 team to automate this process in an upcoming release?

Friday, February 13, 2015

Real World Business Activity Monitoring (BAM) using BizTalk 2010 part 2

In this second blog post on Real World Business Activity Monitoring (BAM), I'm going to discuss BAM tracking.  There are already a number of great resources on the process of creating BAM tracking, including the book Pro BAM in BizTalk 2009 and a recent blog post I saw on CodeProject.  So instead of talking about the creation of BAM tracking, I'm going to go over some important tips I follow when setting up BAM tracking in BizTalk.

Proper Naming Conventions in Definition Files

When creating activities for a definition file in MS Excel, I think an important factor that is often overlooked is the use of proper naming conventions.  For one, your activity names in Excel are limited to 48 characters.  In the environment I work in, I've tried to standardize on using four distinct factors in the name.  These factors can be seen below, with an example following:

Activity Naming Convention: [MessagePattern][MessageName][SourceSystem][TargetSystem]

Activity Naming Example: PublishInvoicePOSERP

I know what you might be thinking: there is only one pattern represented here.  This is just meant as an example, and you should come up with a convention that fits the messaging used in your environment.  What do I mean by that?  You might not have common messaging patterns like Publish/Subscribe or Request/Response like I do.  The important thing is to keep your approach consistent.

Additionally, in the above, the abbreviation POS stands for Point of Sale and ERP stands for Enterprise Resource Planning.  If inclined, you could replace ERP with SAP, PeopleSoft, Dynamics, etc., depending on the ERP implemented.  Again, personal preference on the granularity of your naming.

When creating views, you have even less flexibility in the naming convention, as you are limited to 18 characters.  No, that's not a typo; you've got 18 characters.  So "short and sweet" is the name of the game.  For that reason, the naming I've used has been shortened to the below:

View Naming Convention: [MessagePattern][MessageName][TargetSystem]

View Naming Example: PublishInvoiceERP

Again, with the 18 character limitation, there will be times you need to modify your convention. The point is to remain consistent with your approach.  I can't stress this enough.  This is especially true if you don't leverage the out of the box BAM monitoring portal for your end users (like in my environment).

Modifying deployed Definition files

I make it a point to always remove the BAM definition file via the command prompt before making any changes to activities or views.  This is especially true if you're directly deploying your xlsx definition file and not the generated XML definition file.  If you do change an activity or view while the xlsx BAM definition file is still deployed, you run the risk of having to manually remove all activities and views in the definition file.

To go along with not modifying definition files before removing them, it's also important to deploy your BAM definition file from the XML, not the xlsx.  The XML is created using the menu command "Export XML" from the BAM Add-In.  This insulates your deployed definition file from your Excel definition file.  In this manner, if you do change an activity or view without removing the definition file, you have an extra layer of protection from the change.

So here is the order of events that I execute when dealing with changes to a definition file (example commands follow the list):
  1. Remove Definition File using command prompt and the remove-all command
  2. Make changes to view /activity in Excel spreadsheet
  3. Create XML of the BAM definition file using "Export XML" from the Excel BAM menu command
  4. Deploy the new Definition File using command prompt and the update-all command
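The two command prompt steps use bm.exe (found under the Tracking folder of your BizTalk installation).  For a definition file named, say, PublishInvoiceERP.xml (a made-up name), they look like this:

    bm.exe remove-all -DefinitionFile:PublishInvoiceERP.xml
    bm.exe update-all -DefinitionFile:PublishInvoiceERP.xml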

Maintaining BAM History

I'm not going to rehash what has been blogged about many times on BAM maintenance.  There are a lot of great posts, including but not limited to ones written by Saravana Kumar and Richard Seroter.  Out of the box, BAM history is saved for 6 months.  Talk to the business and learn the requirements for retention.  Although tracking data doesn't seem like it should consume much space, depending on factors like the number of tracked activities and the volume of messages, the database can build up quickly and eventually cause more serious problems.

My next post will focus on why we chose to develop a custom UI for displaying BAM tracking data for our users instead of the out-of-the-box BAM portal.