Thursday, November 5, 2015

BizTalk 2010 integrating with SAP using a custom web service instead of the WCF-SAP LOB adapter part 1

There are a lot of experts blogging about interfacing BizTalk with SAP using the WCF-SAP LOB Adapter.  Off the top of my head, two come to mind, Sandro Pereira and Kent Weare.  They have numerous posts on installing the adapter, configuring the adapter, and receiving/sending XML IDocs.

In this post, I'm going to detail a scenario where we didn't use the BizTalk WCF-SAP LOB Adapter.  Instead, BizTalk was used to integrate with a custom web service created in SAP to pull information.  Part 1 will cover consuming the WSDL and creating the send port.  Part 2 will cover calling the web service and wrap things up.

To start off, I have very little knowledge or experience working in SAP.  I primarily rely on an SAP team to understand the inner workings of that system.  I make that point to set the stage in that this post is primarily BizTalk related.  Creating and configuring SAP to expose a web service is outside the scope of this post.

So why no WCF-SAP LOB Adapter?  I actually discussed the options with the SAP team, and together we decided the better option was to try to develop a custom web service on the SAP side.  I'll probably get some dissension from the BizTalk community for designing the solution this way, but I'm not going to go into all the details of the reasoning here.  Suffice it to say, it was the better option for this particular process.

The specific interface I'll discuss is a synchronous web service call from BizTalk to SAP for currency exchange data.  Below are the steps I took to get BizTalk configured to interface with a custom SAP web service.  I'm including some of the problems that I encountered along the way.

Step 1: Consuming the WSDL


Just like most of the other web services I've worked on, I started off with trying to consume the WSDL provided by the SAP team.   From a development perspective, the tool used to consume that WSDL is the "BizTalk WCF Service Consuming Wizard" found in Visual Studio when selecting "Add Generated Items".  There are two benefits of using this.  The first is to get a working schema of the data being messaged.  The second benefit is the binding file that is created by the wizard (I'll discuss this file later on).

One difference from other web service WSDLs I've worked with: a file was provided instead of a URL.  No problem, the wizard can handle that:


I then added the WSDL file provided by the SAP team:


After hitting the "Next" button and clicking "Finish", I received the following error:


I have encountered the dreaded "Object reference not set to an instance of an object" error other times when using the BizTalk WCF consuming wizard.  If you notice, the wizard expects an .xsd file as well as the .wsdl file.  However, if an .xsd file isn't available, how do you get around this issue?  In this case, the schema was provided within the WSDL; however, sometimes the targetNamespace in the schema record of the WSDL is missing.  Sure enough, when I opened it up, it was that exact problem:


After inserting a temporary targetNamespace in the .wsdl file and re-running the wizard, it was able to successfully create the .xsd from the WSDL:
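For reference, the fix looks something like this inside the WSDL's inline schema (the namespace value here is just a placeholder; any temporary URI will do, as long as the schema has one):

```xml
<wsdl:types>
  <!-- The wizard fails with "Object reference not set to an instance of an object"
       when this targetNamespace attribute is missing from the inline schema -->
  <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
              targetNamespace="http://tempuri.org/sap/zfi_exchange_rate_pull2"
              elementFormDefault="unqualified">
    <!-- request/response element definitions provided by the SAP team -->
  </xsd:schema>
</wsdl:types>
```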


Here is an expanded view of the xsd created.  Remember, this is a synchronous web service, so there is both a request and response message:


Step 2: Creating the send port

In addition to the xsd, the wizard also creates another resource that is of some use.  In particular, the files ending in BindingInfo.xml.  These files can be used to generate a configured WCF send port in BizTalk Administrator.  In general, I tend to use the file ending in Custom.BindingInfo.xml because it gives you more flexibility in configuring the binding when it is created.

To import the binding file, first open BizTalk Administrator.  You can then right-click the applicable application you want to import the binding into.  This will give you an option to "Import" the "Bindings":

After selecting the correct binding file:


The application should import the appropriate binding information and create the send port.  However, in this case, no port was created.  So I opened up the zfi_exchange_rate_pull2_Custom.BindingInfo.xml file to see what it contained:


So why is the SendPortCollection record empty?  In this case, the BizTalk WCF Service Consuming Wizard must have had some issues with the WSDL file.  That means I had to go and create the send port manually.  As previously stated, I try to use the WCF-Custom send port whenever possible.  Here are some of the steps I took when creating the send port:

To create the send port, I needed some assistance from the WSDL file.  Specifically, I looked for the location attribute in the soap:address section of the file:



The location attribute is what you need to use for the address URI property in the send port:

The second property of importance on the send port is the Action property under the SOAP Action header area.  That value I pulled from the soap:operation element under the wsdl:operation record of the wsdl:
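To illustrate where these two values live, here is a skeleton of the relevant WSDL sections (the binding, operation, soapAction, and URL values are placeholders, not the actual values from the SAP WSDL):

```xml
<wsdl:binding name="zfi_exchange_rate_pull2_binding" type="tns:zfi_exchange_rate_pull2">
  <wsdl:operation name="ZfiExchangeRatePull2">
    <!-- This soapAction value becomes the Action in the send port's SOAP Action header -->
    <soap:operation soapAction="urn:sap-com:document:sap:soap:functions:ZFI_EXCHANGE_RATE_PULL2"/>
  </wsdl:operation>
</wsdl:binding>
<wsdl:service name="zfi_exchange_rate_pull2_service">
  <wsdl:port name="zfi_exchange_rate_pull2_port" binding="tns:zfi_exchange_rate_pull2_binding">
    <!-- This location value becomes the Address (URI) on the send port -->
    <soap:address location="http://sap-host:8000/sap/bc/srt/rfc/sap/zfi_exchange_rate_pull2"/>
  </wsdl:port>
</wsdl:service>
```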


Here is what the send port looks like with the SOAP Action Header filled:


The last piece to configuring the send port was under the "Binding" tab.  When selecting a WCF-Custom Send Port, you need to manually configure the binding as well.  The first step is to select the Binding Type.  In this instance, I used the "customBinding".  In addition, I had to go into the "messageVersion" property under the textMessageEncoding extension and change the value to "Soap11":
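For reference, the resulting configuration on the "Binding" tab is equivalent to this WCF customBinding snippet (a sketch only; the binding name and transport settings will vary with your environment):

```xml
<customBinding>
  <binding name="zfi_exchange_rate_pull2_binding">
    <!-- The SAP web service expects SOAP 1.1, so override the customBinding
         default messageVersion of Soap12 -->
    <textMessageEncoding messageVersion="Soap11" />
    <httpTransport />
  </binding>
</customBinding>
```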



All other send port properties I left defaulted.  In the next post, I'll explain how I went about testing the web service.  I'll also touch on how this interface may impact future design decisions.

Sunday, July 12, 2015

Business Activity Monitoring (BAM), BAMPrimaryImport, OLAP Cubes, and archiving

There are already a lot of great posts about Business Activity Monitoring (BAM) and how to go about setting up archiving.  In particular, the blog post by Richard Seroter and the post by BizTalk Bill are two I closely followed to set up my environment.  Since I originally configured the archiving, I've been super slammed and have only just come back around to performing some environment maintenance.

Needless to say, I was a bit surprised to see that my BAMPrimaryImport database had grown to almost 40 GB:


Wait, what???  I went through and checked all my configurations to make sure everything was running correctly.  Firstly, I checked to make sure the SQL Job that I had created to run all the BAM SSIS packages was working.  BizTalk Bill had set up an SSIS package to do this, but I went down the route of creating two SQL Jobs, the first used to dynamically build and execute the list of packages to run (any package starting with DM_) and the second to control and monitor the execution of the first.  Both jobs looked like they were running successfully (these run on a nightly basis).

My second check was to look at the BAMPrimaryImport database tables.  I first did a visual inspection of the tables, in which I noticed an unusual amount of partition tables for the first message being tracked using BAM activities and views:



Secondly, I ran a quick SQL Command to return the number of tables in the BAMPrimaryImport database:


I had over 7000 tables in the BAMPrimaryImport database!!!  Granted, we do a lot of BAM tracking on all of our different messages, but that number sounded excessive.  So I wanted to confirm what Richard Seroter had written in his blog and looked at the Metadata_Activities table to see how long data should be kept before archiving.  As I suspected, it was configured to only keep a month's worth of tracking data:
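For anyone wanting to run the same checks, these are sketches of the kinds of queries involved (column names follow the standard BAMPrimaryImport schema; verify against your own database before relying on them):

```sql
-- Count the tables in BAMPrimaryImport
USE BAMPrimaryImport;
SELECT COUNT(*) AS TableCount
FROM sys.tables;

-- Check how long each activity keeps data online before archiving
SELECT ActivityName, OnlineWindowTimeUnit, OnlineWindowTimeLength
FROM dbo.bam_Metadata_Activities;
```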




So from the above I could determine:

  1. SQL Jobs were running - bueno
  2. SSIS packages creating table partitions - bueno
  3. Partition tables being archived to the BAMArchive database - no bueno
So why were the partition tables not being moved to the BAMArchive database?  Looking at the properties on some of the tables, they had been created way back in October and November of 2014.  While revisiting the aforementioned blog posts, I noticed something different in my Integration Services environment.  I had what appeared to be a lot of SSIS packages starting not only with "DM_", but with "AN_" as well.

I did a quick Google search on the AN SSIS packages and found a great article by the Microsoft India team.  In the last paragraph of the article I found my problem.  It appears that if your BAM tracking takes advantage of creating OLAP cubes, you need to set up the SSIS packages that begin with "AN_" to run daily.  If you fail to do this step, the partition tables will fail to be moved to the BAMArchive database.
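One way to keep these running daily is a SQL Agent job step that calls dtexec for each "AN_" package before its matching "DM_" package; a hedged sketch, assuming the packages are stored in MSDB on the local SQL Server (the package and server names here are placeholders):

```bat
rem Run the analysis (cube-processing) package for one tracked message,
rem then its corresponding data-maintenance (archiving) package
DTExec /SQL "AN_PublishInvoicePOSERP" /SERVER "MyBizTalkSqlServer"
DTExec /SQL "DM_PublishInvoicePOSERP" /SERVER "MyBizTalkSqlServer"
```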

As a test, I went ahead and ran the "AN_" SSIS package for the first tracked message.  I then went and executed the "DM_" SSIS package for that same tracked message.  Sure enough, the appropriate partition tables were moved to the BAMArchive database:


In order to play it safe and get the appropriate tables archived, I manually ran each SSIS package that had the "AN_" prefix (all 143 of them).  What's worse about this whole ordeal is that the current environment in which I work doesn't even use these OLAP cubes (you can read why here).  I have to admit this was a sloppy mistake on my part, and shows how little I really understood the archiving process when using BAM.  On a positive note, maybe I can convince the BizTalk360 team to automate this process in an upcoming release?

Friday, February 13, 2015

Real World Business Activity Monitoring (BAM) using BizTalk 2010 part 2

In this second blog post on Real World Business Activity Monitoring (BAM), I'm going to discuss BAM tracking.  There are already a number of great resources on the process of creating BAM tracking, including the book Pro BAM in BizTalk 2009 and a recent blog post I just saw on code project.  So instead of talking about the creation of BAM tracking, I'm going to go over some important tips I follow when setting up BAM tracking in BizTalk.

Proper Naming Conventions in Definition Files

When creating activities for a definition file in MS Excel, I think an important factor that is often overlooked is the use of proper naming conventions.  For one, your activity names in Excel are limited to 48 characters.  In the environment I work in, I've tried to standardize on using four distinct factors in the name.  These factors can be seen below with an example following:

Activity Naming Convention: [MessagePattern][MessageName][SourceSystem][TargetSystem]

Activity Naming Example: PublishInvoicePOSERP

I know what you might be thinking, there is only one pattern represented here.  This is just meant as an example and you should come up with a convention that fits the messaging used in your environment. What do I mean by that?  You might not have common messaging patterns like Publish/Subscribe or Request/Response like I do.  The important thing to keep in mind is to have a consistent approach.

Additionally, in the above, the abbreviation POS stands for Point of Sale and ERP stands for Enterprise Resource Planning.  If inclined, you could replace ERP with SAP, PeopleSoft, Dynamics, etc., depending on the ERP implemented.  Again, personal preference on the granularity of your naming.

When creating views, you have even less flexibility in the naming convention, as you are limited to 18 characters.  No, that's not a typo, you've got 18 characters.  So the term "short and sweet" is the name of the game.  For that reason, the naming I've used has been shortened to the below:   

View Naming Convention: [MessagePattern][MessageName][TargetSystem]

View Naming Example: PublishInvoiceERP

Again, with the 18 character limitation, there will be times you need to modify your convention. The point is to remain consistent with your approach.  I can't stress this enough.  This is especially true if you don't leverage the out of the box BAM monitoring portal for your end users (like in my environment).

Modifying deployed Definition files

I make it a point to always remove the BAM definition file via the command prompt before making any changes to activities or views.  This is especially true if you're directly deploying your xlsx definition file and not the generated XML definition file.  If you do change an activity or view while the xlsx BAM definition file is still deployed, you run the risk of having to manually remove all activities and views in the definition file.

To go along with not modifying definition files before removing them, it's also important to deploy your BAM definition file from the XML not the xlsx.  The XML is created using the menu command "Export XML" from the BAM Add-In.  This insulates your deployed definition file from your Excel definition file.  In this manner, if you do change an activity or view without removing the definition file, you have an extra layer of protection from change.

So the order of events that I execute when dealing with changes to a definition file:
  1. Remove the definition file using the command prompt and the remove-all command
  2. Make changes to the view/activity in the Excel spreadsheet
  3. Create the XML of the BAM definition file using "Export XML" from the Excel BAM menu command
  4. Deploy the new definition file using the command prompt and the update-all command
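On the command prompt side, steps 1 and 4 use the BAM management utility (bm.exe) from the BizTalk Tracking folder; a sketch, with a hypothetical definition file name:

```bat
cd "C:\Program Files (x86)\Microsoft BizTalk Server 2010\Tracking"

rem Step 1: remove the currently deployed definition
bm.exe remove-all -DefinitionFile:CurrencyTracking.xml

rem Steps 2 and 3 happen in Excel (edit, then "Export XML")

rem Step 4: deploy the updated definition from the exported XML
bm.exe update-all -DefinitionFile:CurrencyTracking.xml
```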

Maintaining BAM History

I'm not going to rehash what has been blogged about many times on BAM maintenance.  There are a lot of great posts including but not limited to ones written by Saravana Kumar and Richard Seroter.  Out of the box, BAM history is saved for 6 months.  Talk to the business and learn the requirements for retention.  Although tracking data doesn't seem like it should consume much space, depending on factors like number of messages and volume of messages, the database can build up quickly and eventually cause more serious problems.
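If the business does want something other than the default window, the retention period can be adjusted per activity in the bam_Metadata_Activities table; a sketch with a hypothetical activity name, to be tested in a non-production environment first:

```sql
-- Extend the online window for one activity from the default 6 months
-- (or whatever it is currently set to) out to 12 months
UPDATE dbo.bam_Metadata_Activities
SET OnlineWindowTimeLength = 12   -- OnlineWindowTimeUnit stays 'Month'
WHERE ActivityName = 'PublishInvoicePOSERP';
```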

My next post will focus on why we choose to develop a custom UI for displaying BAM tracking data for our users instead of the out of box BAM portal.  

Wednesday, November 12, 2014

Real World Business Activity Monitoring (BAM) using BizTalk 2010 part 1

In all my years working with BizTalk, my current employer is the first that I've had a real requirement for using Business Activity Monitoring (BAM).  From a high level, we needed to provide the business with a real time view of all the different messages coming into and out of the Enterprise Service Bus (ESB) environment.  Additionally, the business wanted to see some high level statistics of these messages like volume throughput, average size, and average time.

To meet these requirements, we used two different solutions.  BizTalk BAM tracking was used to track the receiving and sending of each different type of message in the ESB.  The second solution was a custom ASP.NET web site to display the information in a consolidated manner for  the business users.  I'll explain why we didn't use the out of the box BAM monitoring portal in a subsequent post.

As I've mentioned in some previous posts, the current BizTalk environment I work in performs a lot of content based routing for messaging.  When looking at a way to monitor all of the messages, it made sense to leverage BAM tracking.  This not only allowed the tracking of all the different individual messages, but it also provided a means of tracking specific information related to each message.   Below is a screen shot of the web site that is displayed to the user:



From looking at the above screen shot, the intent of the web page is to display a high level view of the different messages moving through the ESB.  The information presented to the user on the site is based mostly off of the views created with BAM tracking.  Each row shown above identifies a different subscriber to a particular message publication.  While meeting the requirements of the business, it also gives some indicators to the health of each individual message in the ESB.  *Note: Credit must be given to my then manager Darrick Johnson for creating the original web site.

My next post will walk through the "how to" of creating the tracking for an individual message. In my last post, I will look into more specifics of how the ESB monitor is organized and what it provides for the users.

Thursday, July 31, 2014

BizTalk Toolkit 2.1 ESB Add Namespace Pipeline Component and Encoding

Here is something that tripped me up the other week.  I was working on a Proof of Concept (POC) with BizTalk and writing XML messages to a Red Hat Linux Network File System (NFS) share.  To connect to the NFS, I followed a great blog post How to connect to an NFS with Windows Server 2008 R2 by Randy Aldrich Paulo.  Anyway, what tripped me up was the encoding of the files that were written to the server.

On send ports, I usually use the "Out of the box" XMLTransmit pipeline component.  However, this particular POC had a requirement to strip all namespace prefix tags from the XML elements contained in the message while retaining the target namespace on the root element.  I thought, no problem, the ESB Toolkit has two pipeline components I can use to solve this.  Specifically, I created a pipeline component that used the ESB Remove Namespace Pipeline Component and the ESB Add Namespace Pipeline Component.

So I get the send port created and assign the new pipeline component that satisfies the requirements:
  

I run a test through and everything looks fine from a BizTalk perspective.  I inform the Linux Administrator that a new XML message is available for him to review.  He confirms the receipt of the file and I think life is good.  

I get a call back from the Linux Administrator after about 10 minutes with a problem.  Apparently, the file being sent has some special characters ('ÿþ') at the start of the XML.  I think to myself, that is strange, maybe it is Byte Order Mark (BOM) related.  I change the property "RemoveBOM" on the ESB Remove Namespace Component and set it to "True", thinking this was the problem:



So I run another test through and inform the Linux Administrator that a new file is available.  He confirms the receipt but on inspection of the file the special character problem persists.  Furthermore, he follows up with the fact that the encoding on the XML file is UTF-16 when they are expecting UTF-8.  Again, I think to myself, shouldn't BizTalk be sending this message in UTF-8?  

I go to the Tracked Message in BizTalk Administrator for the send port and look at the MessagePart:



So the Character set on the message looks to be "utf-8".  I go back to the Linux Administrator and argue a bit with him until he finally sends me the output file from Putty showing the file properties.  One of those properties being the encoding which was set to: Little-endian UTF-16 Unicode text.

So, being my hard-headed self, I set out to prove the Linux Administrator wrong.  I find a cool utility on CodePlex called File Encoding Checker and download the tool.  I create a dummy send port (using the new pipeline component) that outputs the file to a local folder on my development BizTalk server.  I check the encoding and this is what I find:





What???  How could this be?  I check the documentation for the ESB Add Namespace Pipeline Component at msdn.microsoft.com and I can find no information about what encoding is used.  Apparently, the ESB Add Namespace Pipeline Component defaults the encoding to UTF-16.  Additionally, the pipeline component has no property for setting the encoding (although the Remove Namespace Pipeline Component does).

In order to resolve this problem, I had to create a custom pipeline component which adds a namespace to the root element and uses UTF-8 to encode the output.  An interesting lesson on the Add Namespace Pipeline Component and something to keep in mind when leveraging it.  It would be great if the Add Namespace Pipeline Component could be updated with a property to set the encoding, just like the Remove Namespace Pipeline Component.

Tuesday, April 8, 2014

BizTalk 2010 and integrating with the Java Messaging Service (JMS) SonicMQ Part 3 Creating the Send Port and Wrap Up

Has it really been three months since my last post?  Sorry about the delay in getting this finished, but I just got through a substantial sized project.  On to the goods...

Creating the JMS Adapter Send Port
I'm assuming anyone reading this post already knows how to create a send port.  Setting the send port properties of the JMS Adapter is pretty straightforward and all dependent on the queue you are connecting to:

The first section of properties, entitled "Adapter Behavior", is pre-configured, and I left the default settings alone.  I haven't played around with these settings at all, but they revolve around how the send port handles exceptions and timeouts.

The next section of properties to configure is under the title "Connection Properties".  The Host Name and Port Number coincide with the JMS server you are connecting to and the port on which the JMS server is listening.

The third section is called "JMS Operation Properties" and has to do with the specific instance of JMS being connected to.  The JMS Object Name is the actual name of the Queue or Topic that messages will be sent to.  The JMS Object Type is a pull-down list with either Queue or Topic.  Message Type is another pull-down list of the different message types.  In my case, I selected Text, but the options are: Bytes, Map, Text, Text ISO-8859-15, and Text UTF.  Lastly, the Transactions Enabled property tells the port whether or not transactions are enabled.

The last section is called "Override Transmit Handler Properties", and although I didn't configure these, I'm assuming you can use this to override what was configured in the JMS send adapter transport handler.

After creating the send port, you will need to spend some time getting communication between BizTalk and your JMS server working.  Some of the less obvious issues you will run into could be security related but those are too environment specific to go into detail here.  Additionally, you may run into issues with how the JMS server is configured.  For instance, I had created multiple ports to communicate with the JMS server, but the server was configured to only allow one connection.  This caused the JMS adapter to throw infrequent errors when trying to communicate.  It took several tests to isolate the issue and change the configuration on the JMS server to allow multiple connections.

Lastly, there are other aspects to consider before implementing the use of the JMS adapter.  In my opinion, two of the more important aspects are security and cost.  Depending on how mission critical this adapter is to your business, the former may weigh more heavily on the decision than the latter.  This was the case in my scenario.

Security
From a high level, how to handle security should always be in the back of your mind with any adapter being used.  In my opinion, defense in depth is always the way to go.  If you're unfamiliar with this term, it essentially means layering multiple security controls rather than relying on any single one.  Still, there are considerations to weigh: whether messaging is internal only or external (meaning outside of your network), what kind of infrastructure support you have to help set up security, and what type of data is being messaged.

Cost
The only negative about this adapter has been the cost.  Since my environment consists of migrating code through four different environments, there was a cost associated with the adapter for each environment. Since my current company leverages an integrated test environment that mimics our production environment, the cost for the adapter was even higher.  This is something to consider before going forward with the adapter.

All in all, I've been really happy with the adapter and how we were able to spin up a proof of concept fairly quickly.  Not having any prior experience with integrating BizTalk and JMS, this adapter helped bridge that gap.  It does a lot of the heavy lifting so that I didn't have to become an expert in JMS.

Tuesday, January 14, 2014

BizTalk 2010 and integrating with the Java Messaging Service (JMS) SonicMQ Part 2 Configuration

The second post on the JNBridge JMS Adapter for SonicMQ will cover configuration of the JMS send adapter transport handler.  If you haven't already read Part 1, I suggest you do so before continuing.  That post covers the installation of the adapter onto a BizTalk 2010 Server.  Although installation is pretty straightforward, the post also details how to get the third party adapter from JNBridge, as this adapter doesn't come out of the box with BizTalk.

After installing the JMS Adapter, configuring the adapter falls into two parts.  The first part covers configuring the send adapter transport handler and will be explained in this post.  The second part will cover creating the send port in a BizTalk Application and will be detailed in a subsequent post. Setting up the JMS adapter for the receive transport handler and receive location should be very similar to the send handler and port and will not be covered.

Configuring the JMS Adapter Send Transport Handler

When configuring the send transport handler for JMS, there are several properties that are important and dependent upon the SonicMQ instance being integrated with.  If you're unfamiliar with SonicMQ (like me), it's important to have a detailed discussion with whoever is responsible for this application.  Doing so will ensure the below properties are properly configured, which in turn will save a lot of time and effort.

In order to edit the properties of the send handler, you will need to open BizTalk Administrator, and navigate to "Platform Settings", "Adapters", "JNBridge SonicMQ", and finally right click the "SendHost" and select the properties option:


The first section of properties that need to be configured are under the section "JMS Properties".  I started with the property called "Acknowledge Mode".  A pull-down list is provided for this property.  There are three choices, "AUTO ACKNOWLEDGE", "CLIENT ACKNOWLEDGE", and "DUPS OK ACKNOWLEDGE".  In my case, I selected "AUTO ACKNOWLEDGE", which also happens to be the default:


The second property to be configured is called the "Initial Context Factory".  This is an editable field and needs the name of a JNDI class.  I'm not going to explain what JNDI is, as it's outside the scope of this post.  In my scenario, a named SonicMQ domain is required, which the JNBridge Adapter supports.  In order for the JMS adapter to support this, I am required to use com.sonicsw.jndi.mfcontext.FixedMFContextFactory as the initial context factory:

The JMS Scheme is the protocol used to connect to the JMS service. In the case of SonicMQ, that is tcp:


The next property(s) that need to be configured are the Queue Factory or Topic Factory.  This is the class name of the SonicMQ factory.  In my case, I only filled out the Queue Factory property since that is what I'm connecting to.  I left the Topic Factory property blank:


The Security mode property is the last under the "JMS Properties" section of the send handler.  This property is another drop down list and allows the passing of credentials to the JMS server.  The options are none, simple, and strong.  In my case, I need to pass credentials so I selected the simple option:


The second section of properties that need to be configured are under the "JNBridge Properties" section. These properties are JN Bridge specific and I'm not going to go into details on the set up of these properties. The documentation provided by JN Bridge is pretty thorough and will guide you on the proper values required for each property.  One point I do want to comment on is the "JVM Arguments" property.  This property allows me to pass a named SonicMQ Domain and ties directly to the "Initial Context Property" configured above:

The third section called "Security Properties" has to do with credentials needed to connect to the JMS server.  Since I selected the "simple" in the Security Mode Property under the JMS Properties section, I will need to pass a username and password.  This should be supplied by the owner of the SonicMQ application:


The last section, "Debug Properties", isn't required, but is worth mentioning.  I relied upon this heavily when first trying to connect to the JMS SonicMQ Queue.  It provides verbose logging when the JMS adapter is trying to connect to SonicMQ.  After you're successful in connecting, you can just switch the "Log Errors" flag to false.

That is the last section of the JMS adapter send handler transport properties to configure.  In the next post, I'll cover creating the send port in BizTalk Administrator and what you should consider before using the JNBridge JMS adapter.