
Sunday, July 12, 2015

Business Activity Monitoring (BAM), BAMPrimaryImport, OLAP Cubes, and archiving

There are already a lot of great posts about Business Activity Monitoring (BAM) and how to go about setting up archiving.  In particular, the blog post by Richard Seroter and the post by BizTalk Bill are two I closely followed to set up my environment. Since I originally configured the archiving, I've been super slammed and have only just come back around to performing some environment maintenance.

Needless to say, I was a bit surprised to see that my BAMPrimaryImport database had grown to almost 40 GB:


Wait, what???  I went through and checked all my configurations to make sure everything was running correctly.  First, I checked that the SQL Job I had created to run all the BAM SSIS packages was working.  BizTalk Bill had set up an SSIS package to do this, but I went down the route of creating two SQL Jobs: the first dynamically builds and executes the list of packages to run (any package starting with DM_), and the second controls and monitors the execution of the first.  Both jobs looked like they were running successfully (these run on a nightly basis).

My second check was to look at the BAMPrimaryImport database tables.  I first did a visual inspection, in which I noticed an unusual number of partition tables for the first message being tracked using BAM activities and views:



Secondly, I ran a quick SQL Command to return the number of tables in the BAMPrimaryImport database:


I had over 7,000 tables in the BAMPrimaryImport database!!!  Granted, we do a lot of BAM tracking on all of our different messages, but that number sounded excessive.  So I wanted to confirm what Richard Seroter had written in his blog and looked at the Metadata_Activities table to see how long data should be kept before archiving.  As I suspected, it was configured to keep only a month's worth of tracking data:
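For reference, both checks above come down to two quick queries. The bam_Metadata_Activities table and its online window columns are as described in Richard Seroter's post; verify the names against your own BAMPrimaryImport database:

```sql
-- Count the tables in BAMPrimaryImport (partition tables dominate the total)
SELECT COUNT(*) AS TableCount
FROM BAMPrimaryImport.sys.tables;

-- Inspect how long tracking data is kept online before archiving
SELECT ActivityName, OnlineWindowTimeUnit, OnlineWindowLength
FROM BAMPrimaryImport.dbo.bam_Metadata_Activities;
```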




So from the above I could determine:

  1. SQL Jobs were running - bueno
  2. SSIS packages creating table partitions - bueno
  3. Partition tables being archived to the BAMArchive database - no bueno
So why were the partition tables not being moved to the BAMArchive database?  Looking at the properties on some of the tables, they had been created way back in October and November of 2014.  While revisiting the aforementioned blog posts, I noticed something different in my Integration Services environment.  I had what appeared to be a lot of SSIS packages starting not only with "DM_", but with "AN_" as well.

I did a quick Google search on the AN_ SSIS packages and found a great article by the Microsoft India team.  In the last paragraph of the article I found my problem.  It appears that if your BAM tracking takes advantage of creating OLAP cubes, you need to set up the SSIS packages that begin with "AN_" to run daily.  If you fail to do this step, the partition tables will not be moved to the BAMArchive database.

As a test, I went ahead and ran the "AN_" SSIS package for the first tracked message.  I then executed the "DM_" SSIS package for that same tracked message.  Sure enough, the appropriate partition tables were moved to the BAMArchive database:
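The test boils down to running the two packages in the right order. A sketch from the command prompt, using a hypothetical package name based on the tracked activity and assuming the packages live in the MSDB package store:

```
rem Process the OLAP cube first, then run the archiving package:
dtexec /SQL "\AN_PublishInvoicePOSERP" /SERVER "MySqlServer"
dtexec /SQL "\DM_PublishInvoicePOSERP" /SERVER "MySqlServer"
```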


In order to play it safe and get the appropriate tables archived, I manually ran each SSIS package with the "AN_" prefix (all 143 of them).  What's worse about this whole ordeal is that the environment I work in doesn't even use these OLAP cubes (you can read why here).  I have to admit this was a sloppy mistake on my part, and it shows how little I really understood the archiving process when using BAM.  On a positive note, maybe I can convince the BizTalk360 team to automate this process in an upcoming release?

Friday, February 13, 2015

Real World Business Activity Monitoring (BAM) using BizTalk 2010 part 2

In this second blog post on Real World Business Activity Monitoring (BAM), I'm going to discuss BAM tracking.  There are already a number of great resources on the process of creating BAM tracking, including the book Pro BAM in BizTalk 2009 and a recent blog post I saw on CodeProject.  So instead of talking about the creation of BAM tracking, I'm going to go over some important tips I follow when setting up BAM tracking in BizTalk.

Proper Naming Conventions in Definition Files

When creating activities for a definition file in MS Excel, an important factor that is often overlooked is the use of proper naming conventions.  For one, your activity names in Excel are limited to 48 characters.  In the environment I work in, I've tried to standardize on using four distinct factors in the name.  These factors can be seen below, followed by an example:

Activity Naming Convention: [MessagePattern][MessageName][SourceSystem][TargetSystem]

Activity Naming Example: PublishInvoicePOSERP

I know what you might be thinking: there is only one pattern represented here.  This is just meant as an example, and you should come up with a convention that fits the messaging used in your environment. What do I mean by that?  You might not have common messaging patterns like Publish/Subscribe or Request/Response like I do.  The important thing to keep in mind is to have a consistent approach.

Additionally, in the above, the abbreviation POS stands for Point of Sale and ERP stands for Enterprise Resource Planning.  If inclined, you could replace ERP with SAP, PeopleSoft, Dynamics, etc., depending on the ERP implemented.  Again, personal preference on the granularity of your naming.

When creating views, you have even less flexibility in the naming convention, as you are limited to 18 characters.  No, that's not a typo, you've got 18 characters.  So "short and sweet" is the name of the game.  For that reason, the naming I've used has been shortened to the below:

View Naming Convention: [MessagePattern][MessageName][TargetSystem]

View Naming Example: PublishInvoiceERP

Again, with the 18 character limitation, there will be times you need to modify your convention. The point is to remain consistent with your approach.  I can't stress this enough.  This is especially true if you don't leverage the out of the box BAM monitoring portal for your end users (like in my environment).

Modifying deployed Definition files

I make it a point to always remove the BAM definition file via the command prompt before making any changes to activities or views.  This is especially true if you're directly deploying your xlsx definition file and not the generated XML definition file.  If you do change an activity or view while the xlsx BAM definition file is still deployed, you run the risk of having to manually remove all activities and views in the definition file.

To go along with not modifying definition files before removing them, it's also important to deploy your BAM definition file from the XML, not the xlsx.  The XML is created using the "Export XML" menu command from the BAM Add-In.  This insulates your deployed definition file from your Excel definition file.  In this manner, if you do change an activity or view without removing the definition file, you have an extra layer of protection from change.

So the order of events that I execute when dealing with changes to a definition file:
  1. Remove the definition file using the command prompt and the remove-all command
  2. Make changes to the view/activity in the Excel spreadsheet
  3. Create the XML of the BAM definition file using "Export XML" from the Excel BAM menu command
  4. Deploy the new definition file using the command prompt and the update-all command
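From the command prompt, the cycle above looks roughly like the below, using bm.exe from the BizTalk Tracking folder and a hypothetical definition file name:

```
rem 1. Remove the currently deployed definition
bm.exe remove-all -DefinitionFile:PublishInvoiceDef.xml

rem 2-3. Edit activities/views in Excel, then "Export XML"

rem 4. Deploy the updated definition
bm.exe update-all -DefinitionFile:PublishInvoiceDef.xml
```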

Maintaining BAM History

I'm not going to rehash what has been blogged about many times on BAM maintenance.  There are a lot of great posts, including but not limited to ones written by Saravana Kumar and Richard Seroter.  Out of the box, BAM history is saved for 6 months.  Talk to the business and learn the requirements for retention.  Although tracking data doesn't seem like it should consume much space, depending on factors like the number and volume of messages, the database can build up quickly and eventually cause more serious problems.
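If the business settles on a different retention window, it can be changed per activity. A hedged example, using the bam_Metadata_Activities table described in Richard Seroter's post and a hypothetical activity name; verify the column names against your own environment before running anything like this:

```sql
-- Extend the online window for one activity to 6 months of tracking data
UPDATE BAMPrimaryImport.dbo.bam_Metadata_Activities
SET OnlineWindowTimeUnit = 'MONTH',
    OnlineWindowLength   = 6
WHERE ActivityName = 'PublishInvoicePOSERP';  -- hypothetical activity name
```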

My next post will focus on why we choose to develop a custom UI for displaying BAM tracking data for our users instead of the out of box BAM portal.  

Wednesday, November 12, 2014

Real World Business Activity Monitoring (BAM) using BizTalk 2010 part 1

In all my years working with BizTalk, my current employer is the first where I've had a real requirement for using Business Activity Monitoring (BAM).  From a high level, we needed to provide the business with a real time view of all the different messages coming into and out of the Enterprise Service Bus (ESB) environment.  Additionally, the business wanted to see some high level statistics of these messages like volume throughput, average size, and average time.

To meet these requirements, we used two different solutions.  BizTalk BAM tracking was used to track the receiving and sending of each different type of message in the ESB.  The second solution was a custom ASP.NET web site to display the information in a consolidated manner for the business users.  I'll explain why we didn't use the out of the box BAM monitoring portal in a subsequent post.

As I've mentioned in some previous posts, the current BizTalk environment I work in performs a lot of content based routing for messaging.  When looking at a way to monitor all of the messages, it made sense to leverage BAM tracking.  This not only allowed the tracking of all the different individual messages, but it also provided a means of tracking specific information related to each message.   Below is a screen shot of the web site that is displayed to the user:



As the above screen shot shows, the intent of the web page is to display a high level view of the different messages moving through the ESB.  The information presented to the user on the site is based mostly on the views created with BAM tracking.  Each row shown above identifies a different subscriber to a particular message publication.  While meeting the requirements of the business, it also gives some indicators of the health of each individual message in the ESB.  *Note: Credit must be given to my then manager Darrick Johnson for creating the original web site.

My next post will walk through the "how to" of creating the tracking for an individual message. In a final post, I will look into more specifics of how the ESB monitor is organized and what it provides for the users.

Tuesday, January 14, 2014

BizTalk 2010 and integrating with the Java Messaging Service (JMS) SonicMQ Part 2 Configuration

The second post on the JNBridge JMS Adapter for SonicMQ will cover configuration of the JMS send adapter transport handler.  If you haven't already read Part 1, I suggest you do so before continuing.  That post covers the installation of the adapter onto a BizTalk 2010 Server.  Although installation is pretty straightforward, the post also details how to get the third-party adapter from JNBridge, as this adapter doesn't come out of the box with BizTalk.

After installing the JMS Adapter, configuring the adapter falls into two parts.  The first part covers configuring the send adapter transport handler and will be explained in this post.  The second part will cover creating the send port in a BizTalk Application and will be detailed in a subsequent post. Setting up the JMS adapter for the receive transport handler and receive location should be very similar to the send handler and port and will not be covered.

Configuring the JMS Adapter Send Transport Handler

When configuring the send transport handler for JMS, there are several properties that are important and dependent upon the SonicMQ instance being integrated with.  If you're unfamiliar with SonicMQ (like me), it's important to have a detailed discussion with whoever is responsible for this application.  Doing so will ensure the below properties are properly configured, which in turn will save a lot of time and effort.

In order to edit the properties of the send handler, you will need to open BizTalk Administrator, and navigate to "Platform Settings", "Adapters", "JNBridge SonicMQ", and finally right click the "SendHost" and select the properties option:


The first section of properties that need to be configured is under the section "JMS Properties".  I started with the property called "Acknowledge Mode".  A pull-down list is provided for this property.  There are three choices: "AUTO ACKNOWLEDGE", "CLIENT ACKNOWLEDGE", and "DUPS OK ACKNOWLEDGE".  In my case, I selected "AUTO ACKNOWLEDGE", which also happens to be the default:


The second property to be configured is called the "Initial Context Factory".  This is an editable field and needs the name of a JNDI class.  I'm not going to explain what JNDI is, as it's outside the scope of this post.  In my scenario, a named SonicMQ domain is required, which the JNBridge Adapter supports.  In order for the JMS adapter to support this, I am required to use com.sonicsw.jndi.mfcontext.FixedMFContextFactory as the initial context factory:

The JMS Scheme is the protocol used to connect to the JMS service. In the case of SonicMQ, that is tcp:


The next properties that need to be configured are the Queue Factory and Topic Factory.  This is the class name of the SonicMQ factory.  In my case, I only filled out the Queue Factory property since that is what I'm connecting to.  I left the Topic Factory property blank:


The Security Mode property is the last under the "JMS Properties" section of the send handler.  This property is another drop-down list and allows the passing of credentials to the JMS server.  The options are none, simple, and strong.  In my case, I need to pass credentials, so I selected the simple option:


The second section of properties that need to be configured is under the "JNBridge Properties" section. These properties are JNBridge specific, and I'm not going to go into details on setting them up. The documentation provided by JNBridge is pretty thorough and will guide you on the proper values required for each property.  One point I do want to comment on is the "JVM Arguments" property.  This property allows me to pass a named SonicMQ domain and ties directly to the "Initial Context Factory" property configured above:

The third section, called "Security Properties", has to do with the credentials needed to connect to the JMS server.  Since I selected "simple" in the Security Mode property under the JMS Properties section, I will need to pass a username and password.  This should be supplied by the owner of the SonicMQ application:


The last section, "Debug Properties", isn't required, but is worth mentioning.  I relied upon this heavily when first trying to connect to the JMS SonicMQ queue.  It provides verbose logging when the JMS adapter is trying to connect to SonicMQ.  After you're successful in connecting, you can just switch the "Log Errors" flag to false.

That is the last section of the JMS adapter send handler transport properties to configure.  In the next post, I'll cover creating the send port in BizTalk Administrator and what you should consider before using the JNBridge JMS adapter.  

Sunday, December 15, 2013

BizTalk 2010 and Clustered Enterprise Single Sign-On Support for Multiple BizTalk Groups

My current company follows the traditional code promotion environments: Development, Test, and Production. These environments are fully integrated with other systems so we can test messaging of our Enterprise Service Bus (ESB).  Where things really start to get atypical is in the fact that we support three different development and test environments.  Wait, what???

Yeah, and since we have three different test environments, we have three different BizTalk groups which all mimic our production set up.  Which in turn means three clustered databases.  Here is one thing that is super nice, we leverage the same Master Secret Server database and Enterprise Single Sign-On Service (SSO) for all three BizTalk groups.  Wait, what???

That's right, when installing and configuring different BizTalk groups, you can configure these BizTalk groups to leverage the same Enterprise Single Sign-On Service (SSO), and in my case, that resides on the cluster.  So how did I set this up?  In my particular scenario, all of our test BizTalk databases reside on the same cluster server. When I start the Failover Cluster Manager application, I'm presented with all the database clusters (In my case, 1, BIZTALK2, and SQL Server):


The clustered Enterprise Single Sign-On (SSO) Service resides on the cluster with the name of "1", which houses the original BizTalk database I created.  If I click this cluster in the left hand pane, it shows me a summary of my resources, which includes the Enterprise Single Sign-On Service:


So knowing that the Enterprise SSO resides on the first cluster application, I can leverage this set up when configuring my second BizTalk group.  To do this, during the configuration of the second BizTalk group, I selected the Enterprise Single Sign-On feature.  Instead of creating a new SSO System, I selected the "Join an existing SSO system".  I then entered the server name and database of where the Master Secret server was originally created (on the first cluster named "1").  I also used the same domain account that the service runs under:


Configuration of the remaining BizTalk features can be set up to leverage the second cluster application labeled BIZTALK2.  This cluster has a SQL Server database that can be used when configuring the BizTalk group, BizTalk Runtime, Business Rules Engine, and Business Activity Monitoring (BAM).

I used the same process when configuring the third BizTalk group.  The only changes are during the configuration of BizTalk.  Instead of using the BIZTALK2 clustered application, I used the clustered application labeled "SQL Server".

So what do I gain in having three different BizTalk groups leveraging one Enterprise SSO?
  • For starters, this minimizes the number of clustered servers.  Instead of having three different clustered servers, I have one, which means fewer servers to maintain and support.
  • Reduce Total Cost of Ownership (TCO) which directly relates to the first point.
  • Maintain only one Enterprise Single Sign-On Service.  If I had multiple clustered servers, each one would require its own Master Secret Server and Enterprise SSO (which need to be clustered).
  • Avoid multiple Master Secrets to backup and store for a non production environment.
As a side note, some may argue that I've altered my test environment in a way so that it no longer mirrors production.  In theory, I can agree with this, however in practice I'm willing to take the risk.  I don't think multiple BizTalk groups sharing the same Enterprise SSO creates a test environment that deviates in a meaningful way from the production environment.

Wednesday, November 13, 2013

BizTalk 2013 and configuring the Oracle Adapter in a 64 bit environment

At my current company, I was on a recent project that needed to query master data housed in our PeopleSoft Enterprise Resource Planning (ERP) system.  Generally, we have leveraged the Request/Response message pattern via web services when querying ERP master data.  However, on this project, we needed multiple succinct request messages to get the required data.

These smaller, high-volume messages were a definite limitation for PeopleSoft.  PeopleSoft uses Integration Broker (IB) for messaging, and we discovered early on that its ability to respond in a timely manner to a high volume of requests was subpar.  Luckily we have BizTalk, which is interoperable and provides an Oracle adapter that could be used to access the PeopleSoft Oracle database.  Having never used the Oracle Adapter before, I started Googling around for help.

I found two blog posts that got me rolling.  The first, by Sandro Pereira, covers installing the BizTalk LOB Adapter Pack.
It is a step-by-step guide with screen shots for installing the BizTalk Line of Business Adapter Pack. Although I was more interested in the Oracle adapter, I went ahead and installed the other adapters as well (SAP, Siebel, Oracle E-Business Suite, etc.).  In addition, this blog post also guides the user on how to add the adapter to the BizTalk Administration Console.

The second article, written by Jason Agostini, covers configuring the BizTalk Oracle adapter.
The only problem I had with the second article was that it was geared to a 32-bit development environment and an earlier version of the Oracle database.  So using that article as a guideline, here are the steps I took to get the Oracle adapter configured in a 64-bit environment:
  • First I contacted our PeopleSoft Admin to get the current version of the PeopleSoft Oracle Database (11.2.0.2).  I then went to the Oracle website and downloaded the appropriate 32-bit and 64-bit ODAC zip files (ODAC112021XCopy).  Why not just the 64-bit?  For design time development, you will need the 32-bit version to create a connection to the database.
  • From the readme.txt included in the ODAC zip files, I installed the assemblies by running the "install.bat" file from the command prompt.  Again, I had to do this twice, once for the 32-bit version and once for the 64-bit version.
  • I then navigated to the directory where the BizTalk Adapter Pack was installed (In my case, C:\Program Files (x86)\Microsoft BizTalk Adapter Pack\bin) and opened the file "Microsoft.Adapters.OracleDB.config".  I then added an entry to reference the new version of the Oracle DataAccess assembly:
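For illustration, the entry looked roughly like the below. Treat this as a sketch: the publicKeyToken is Oracle's well-known token for the Oracle.DataAccess assembly, but the version numbers are placeholders and must match the ODAC build you actually installed (mine was for 11.2.0.2):

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="Oracle.DataAccess"
                        publicKeyToken="89b483f429c47342"
                        culture="neutral" />
      <!-- Redirect older Oracle.DataAccess references to the installed
           ODP.NET version; substitute the version from your ODAC install -->
      <bindingRedirect oldVersion="2.0.0.0-4.112.9.9"
                       newVersion="4.112.2.1" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```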

  • Having completed the installation, I opened Visual Studio 2012 and brought up my BizTalk solution. In my schemas project, I right clicked the project and selected "Add Generated Items" selection:
  • From the "Add Generated Schema" window, I selected the "ConsumeAdapterService" option:

  • The next step is to select your binding, in my case "OracleDBBinding", and click the "Configure" button.  This is where I entered database information like ServerName and ServerAddress in the "URI Properties" tab.  Enter any credentials needed for access in the "Security" tab:
  • Once the database information is entered, click the "OK" button.  From the "Configure a URI" text box, copy the text and paste into an application like notepad for later (you will need this information when setting up your send port in BizTalk Administrator).  
  • Click the "Connect" button.  Once connected to the database, I selected "Client (Outbound operations)" from the "Select contract type" pull-down list (which will allow me to pull data from the database).  Select the appropriate database object and operation from the menu panes.  Click the "Add" button so that your operation is added to the "Added categories and operations" menu pane.  Click the "OK" button:
  • From this, a schema of the database object is created in the BizTalk project that can be used to retrieve data.  In my case, this was a table. Included in the schema's "Select" record is a filter element which acts as a "WHERE" clause to narrow your return set.  Below is a screenshot of a sample table schema with the "Select" and the "SelectResponse" records:

  • After completing development (in my case an orchestration), building, and deploying my project, it was time to configure the send port in BizTalk Administrator.  The first step is to create a new send port for the application.  Right-click "Send Ports" and add a new "Static Solicit-Response Send Port".
  • When selecting the type of adapter to use, I have two choices.  I can either select the "WCF-Custom" or the "Oracle" adapter:
  • Since I selected the "WCF-Custom", under the "Binding" tab of the adapter I needed to select "oracleDBBinding":


  • When configuring the adapter, under the "General" tab, there are two important steps.  First, I entered the Address (URI) by copying and pasting the text saved in notepad from my previous step of connecting to the Oracle database.  Second, the Soap Action should come from the "Target Namespace" of the database object schema created with the Adapter service.

  • To ensure I was using the 64-bit version of the adapter, I made sure the BizTalk Host associated with the adapter was configured with the check box for "32-bit only" not selected:
  • Running a test message, I was able to confirm that the send port was sending and receiving messages with the Oracle database successfully.  One thing to note, directly accessing an ERP database table is not a recommended approach, and I only used a table to get a working example up and running.