
An Elementary Comparison Of TIBCO BW With Ab Initio

Project Details

Project Code: NOREDEAI


Project Type: EAI
Software Tools: TIBCO BW, TIBCO Admin, TIBCO EMS
Project Code: PEDEFFIX
Project Type: EAI specific
Software Tools: Ab Initio
TIBCO BW is primarily a process integration tool, used where the
enterprise involves real-time processing together with process
management.
Ab Initio is a data integration tool, concerned with data
synchronization between applications and single-step interactive
processing. It is the tool of choice when a large amount of data is
involved together with some complex transformations.
The Ab Initio GDE (Graphical Development Environment) is the front
end of the tool, wherein we design the processes known as Graphs.
The GDE is somewhat like our familiar TIBCO Designer. As Ab Initio
is primarily a data warehousing tool, its performance in moving and
transforming large chunks of data, mainly from the database, is quite
high. The catch, however, is that for discontinuous workflows, large
numbers of small transactional messages, or process integration in
general, Ab Initio is not as efficient as TIBCO.
Thus Ab Initio handles data integration quite well, whereas TIBCO is
specialized for process integration.
A comparison of components in Ab Initio and the means to achieve the
corresponding functionality in TIBCO BW

We will now look into some of the components in Ab Initio and the
corresponding method to achieve the same functionality in TIBCO BW.
We shall go for some simple process designs in both the tools and
visualize the basic working of these tools from a common perspective.
This is also a small step to familiarize some of the important
components in Ab Initio like the Normalize and Denormalize Sorted.
At the outset let us be clear that the processes discussed here are
quite simple and elementary. This is just to give a feel of the style of
code and approach that we need to follow together with a focus on the
analogy of the processes involved in both the tools.

Let us take the following simple XML structure for the processes that
we are going to design.

<?xml version="1.0" encoding="UTF-8"?>
<address xmlns="http://xmlns.example.com/unique/default/namespace/1127735342853"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://xmlns.example.com/unique/default/namespace/1127735342853 D:/Tib_prog/Schema.xsd">
  <name>Harry</name>
  <city>Dallas</city>
  <city>Bangalore</city>
</address>

Our sole aim is to map the name tag to the city tags. To be more
precise, we need to map Harry to the two cities Dallas and Bangalore.
For parsing the XML structure Ab Initio has a component akin to XML
Parser in TIBCO BW known as Read XML.
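As a rough illustration of what such parsing yields, here is a plain Java sketch using the JDK's built-in DOM parser (purely hypothetical; the actual Read XML and XML Parser components are configured graphically, and the namespaces are omitted here for brevity):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class AddressParser {
    public static void main(String[] args) throws Exception {
        // The sample address structure from above, namespaces omitted
        String xml = "<address>"
                   + "<name>Harry</name>"
                   + "<city>Dallas</city>"
                   + "<city>Bangalore</city>"
                   + "</address>";
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        String name = doc.getElementsByTagName("name").item(0).getTextContent();
        NodeList cities = doc.getElementsByTagName("city");
        // Map the single name to each of the city tags
        for (int i = 0; i < cities.getLength(); i++) {
            System.out.println(name + " -> " + cities.item(i).getTextContent());
        }
    }
}
```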
We assume that the reader is already familiar with the Hashtable
structure in Java, wherein there is a concept of a key-value pair
relationship. Let us take the following Hashtable-like structure.
Key   Value
b     A
c     A
d     A
e     B
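This key-value relationship can be sketched with java.util.Hashtable; several keys may share the same value, but each key is unique:

```java
import java.util.Hashtable;

public class KeyValueDemo {
    public static void main(String[] args) {
        Hashtable<String, String> table = new Hashtable<String, String>();
        // Keys b, c and d all map to the same value A
        table.put("b", "A");
        table.put("c", "A");
        table.put("d", "A");
        table.put("e", "B");
        System.out.println(table.get("b")); // A
        System.out.println(table.get("e")); // B
        System.out.println(table.size());   // 4
    }
}
```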

We note that the same value A is related to more than one key, viz. b,
c and d. In Ab Initio such structures can be represented by a data
type termed a Vector.
So the Vector structure in Ab Initio for the above Hashtable
representation would be:

Value   Key
A       [vector b, c, d]

Please note that only the bracketed [vector ...] part is the actual
Vector structure.
Going by the above rule, we can deduce that for our sample XML the
structure can be put as:

name      city
"Harry"   [vector "Dallas", "Bangalore"]

Following is the sample process for the above transformation in Ab
Initio:

Now let us delve a bit deeper into the main component of the above
process viz. Read XML.
The Read XML component uses the Expat parser by default, though a
Xerces parser can also be used. The important part here is to define
the out port DML, which has to be imported in accordance with the XML
structure we have defined.

We select the Record Format Source as Embed and click on the edit
button that will land us on the following screen.

After this we click on Import XML as shown, which will give us the
following:

The filename is the actual file wherein we have used the XML
structure.

We click on the Import button, and doing so gives us the following,
which is in fact the actual DML imported from the XML structure.

Incidentally, we have another component in Ab Initio called
Denormalize Sorted, which can give us the same functionality as
described above. However as the Read XML has the inherent
capability of converting structures into Vectors we generally use
Denormalize Sorted when the incoming structure is not an XML
structure. Let us extend the above process so as to visualize the
functionality of Denormalize Sorted. Thus in the same process we will
normalize the vector that we finally had i.e. we will reverse the
transformation and convert the same Vector structure to the incoming
structure after the XML is parsed. So our very first step would be to
use a Normalize component just after the Read XML and then add
another Denormalize Sorted component in the process to finally have
the output of the Vector form. Following is the output structure from
the Normalize component.

Record 1:
[record
name "Harry"
city "Dallas"]
Record 2:
[record
name "Harry"
city "Bangalore"]

Following are the transformations in the Normalize functions.

The output of the length(in) function is mapped to length_of(city),
which denotes the length of the incoming city, which is a Vector. It
is simply a way of telling the component to loop the mapping of the
fields (name and city, the latter defined as a string in the out port
DML) as many times as the length of the incoming vector.

The normalize(in, index) function is the main mapping, wherein the
elements of the Vector city are mapped to the String city with the
help of the index, which is incremented from 0 to length_of(city)-1.
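The looping behaviour of Normalize can be mimicked in plain Java (a hypothetical sketch, not DML): length_of(city) fixes the number of iterations, and each index produces one flat record.

```java
import java.util.Arrays;
import java.util.List;

public class NormalizeDemo {
    public static void main(String[] args) {
        String name = "Harry";
        List<String> city = Arrays.asList("Dallas", "Bangalore");
        // One output record per element of the incoming vector,
        // for index = 0 .. length_of(city) - 1
        for (int index = 0; index < city.size(); index++) {
            System.out.println("[record name \"" + name
                    + "\" city \"" + city.get(index) + "\"]");
        }
    }
}
```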
Now we add the final Denormalize component. Below is a sample of the
transform used in this component. It can serve as generic code for
the component (though it requires some preliminary customization).

type element_type =
record
string("\001") city;
end /* Element Type*/;

type denormalization_type =
element_type[3] /* Denorm vector*/;
type temporary_type =
record
decimal(4) count;
end;

/*This function may be optionally defined. Initialize temporary*/
out::initialize(in) =
begin
out.count :: 0;
end;
/*Initialize vector element*/
out::initial_denormalization() =
begin
out.city :: " ";
end;
/*Rollup on normalize*/
out::rollup(temp, in) =
begin
out.count :: if(!is_null(in.city) && !is_blank(in.city))(temp.count + 1);
end;
/*Do computation*/
denorm_out::denormalize(temp, denorm, in, count) =
begin
denorm_out.index :: count;
denorm_out.elt.city :: if(!is_null(in.city) && !is_blank(in.city)) in.city;
end;

/*Create output record*/
ret::fun(length, denorm, in) =
begin
let
record
string("\001") city=NULL;
end[length] city_temp=allocate();
let integer(4) cnt =0;
for(cnt,cnt<length)
begin
if(!is_null(denorm[cnt].city) && !is_blank(denorm[cnt].city))
city_temp[cnt].city=denorm[cnt].city;
end
ret::city_temp;
end;
out::finalize(temp, denorm, in) =
begin
let integer(4) cnt =0;
let integer(4) length =0;
for(cnt,cnt<length_of(denorm))
begin
if(!is_null(denorm[cnt].city) && !is_blank(denorm[cnt].city))
length=length+1;
end
out.name :: in.name;
out.city :: fun(length,denorm,in);
end;

Thus the above transformations would give us the following output:

Record 1:
[record
name "Harry"
city [vector
[record
city "Dallas"],
[record
city "Bangalore"]]]
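The rollup-and-finalize flow of Denormalize Sorted can be sketched in plain Java (a hypothetical equivalent of the DML above, assuming the input is already sorted on the key, as the component requires):

```java
import java.util.ArrayList;
import java.util.List;

public class DenormalizeDemo {
    public static void main(String[] args) {
        // Flat records, already sorted on the key (name)
        String[][] records = { { "Harry", "Dallas" }, { "Harry", "Bangalore" } };
        String currentName = null;
        List<String> cities = new ArrayList<String>();
        for (String[] rec : records) {
            if (currentName != null && !currentName.equals(rec[0])) {
                // finalize: key changed, emit the accumulated vector
                System.out.println(currentName + " " + cities);
                cities = new ArrayList<String>();
            }
            currentName = rec[0];
            cities.add(rec[1]); // rollup: collect each city into the vector
        }
        if (currentName != null) {
            System.out.println(currentName + " " + cities); // emit last group
        }
    }
}
```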

Hence we have seen the main functionality of a Denormalize Sorted
component.
The following graph depicts the final process that we have discussed
above.

Now let us try to achieve a similar functionality in TIBCO BW.


We will accomplish this by converting the above incoming XML data
into a Hashtable, using a simple piece of Java code in the Java Code
palette in TIBCO BW.
Subsequently we will write the same Hashtable to a file in the form
of a String for our verification. Note, however, that we could also
convert it to an Object Reference instead of a String object, which
could then be used downstream in a more complex process.


The input and the output parameters can be defined as follows.


(The output parameter len, which is mapped to the length of the
Hashtable, is not used downstream, but we note that this length can
sometimes ease a transformation.)

The following is the piece of Java code


package Denormalize;

import java.util.*;
import java.io.*;

public class DenormalizeJavaCodetoconverttotheHashtable {
    protected String name = "";
    protected String[] city = null;
    protected String HTable = "";
    protected int len = 0;

    public String getname() {
        return name;
    }
    public void setname(String val) {
        name = val;
    }
    public String[] getcity() {
        return city;
    }
    public void setcity(String[] val) {
        city = val;
    }
    public String getHTable() {
        return HTable;
    }
    public void setHTable(String val) {
        HTable = val;
    }
    public int getlen() {
        return len;
    }
    public void setlen(int val) {
        len = val;
    }

    public DenormalizeJavaCodetoconverttotheHashtable() {
    }

    public void invoke() throws Exception {
        // Each city becomes a key; the single name is the shared value
        Hashtable city_hash = new Hashtable();
        for (int i = 0; i < city.length; i++) {
            city_hash.put(city[i], name);
        }
        setHTable(city_hash.toString());
        setlen(city.length);
    }
}


We can also follow an alternate approach to achieve our goal. The
following process defines the same.

The Map palette has the following transformation

Following is the output from the above process.


Harry[city[1] Dallas]
[city[2] Bangalore]


We now extend the above process to understand the functionality of a
Normalize (in Ab Initio) component in TIBCO BW.
We will make a Hashtable object and send it to a JMS queue using a
JMS Queue Sender. Later we will receive it using a JMS Queue Receiver
and write the key-value pairs to a file.
The following process will send the Hashtable to a JMS queue.

This time, however, we won't take a String version of the Hashtable.
Instead we will be dealing with the Hashtable object reference
directly.

The message type of the JMS Queue Sender will be Object Ref.
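Under the covers a JMS ObjectMessage carries any Serializable payload, and java.util.Hashtable is Serializable. The queue hop can be simulated for illustration with plain Java serialization (the real transport is of course handled by EMS, not by this sketch):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.Hashtable;

public class HashtableRoundTrip {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> sent = new Hashtable<String, String>();
        sent.put("Dallas", "Harry");
        sent.put("Bangalore", "Harry");
        // Serialize, as an ObjectMessage would when the message is sent
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(buf);
        out.writeObject(sent);
        out.close();
        // Deserialize on the receiving side and cast back to Hashtable
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()));
        @SuppressWarnings("unchecked")
        Hashtable<String, String> received =
                (Hashtable<String, String>) in.readObject();
        System.out.println(received.get("Dallas")); // Harry
        System.out.println(received.size());        // 2
    }
}
```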


The following process will receive the Hashtable object.

The Group activity will have its number of iterations set to the
length of the city array (to which the Hashtable is finally
converted). Hence we will have the following condition.


The input and the output parameters will be

Let us now look into the Java code.


package Normalize;

import java.util.*;
import java.io.*;

public class NormalizeJavaCodeUnwraptheHashtable {
    protected Object Hashtable = null;
    protected String[] city = null;
    protected String name = "";
    protected int length = 0;

    public Object getHashtable() {
        return Hashtable;
    }
    public void setHashtable(Object val) {
        Hashtable = val;
    }
    public String[] getcity() {
        return city;
    }
    public void setcity(String[] val) {
        city = val;
    }
    public String getname() {
        return name;
    }
    public void setname(String val) {
        name = val;
    }
    public int getlength() {
        return length;
    }
    public void setlength(int val) {
        length = val;
    }

    public NormalizeJavaCodeUnwraptheHashtable() {
    }

    public void invoke() throws Exception {
        Hashtable city_hash = (Hashtable) getHashtable();
        // The keys are the cities; every key maps to the same name
        String[] city_arr = new String[city_hash.size()];
        String nameVal = "";
        int count = 0;
        Iterator city_itr = city_hash.keySet().iterator();
        while (city_itr.hasNext()) {
            city_arr[count] = (String) city_itr.next();
            nameVal = (String) city_hash.get(city_arr[count]);
            count++;
        }
        setlength(count);
        setcity(city_arr);
        setname(nameVal);
    }
}
The text content of the Write File activity will be the following.
concat($Java-Code--Unwrap-the-Hashtable/javaCodeActivityOutput/name,"
", $Java-Code--Unwrap-the-Hashtable/javaCodeActivityOutput/city[$i])

Thus we have the following output.


Harry Bangalore
Harry Dallas

Now we have an idea of how we can manipulate the Hashtable object to
meet our needs and achieve the functionality of normalizing or
denormalizing the Vector in Ab Initio.
From the above discussion we have seen the concepts behind two of the
most important components in Ab Initio, viz. Normalize and
Denormalize Sorted, and how we can develop the same functionality in
TIBCO BW. Of course, there will always be other means to develop the
same logic.
We would like to mention that there is another versatile component in
Ab Initio which is extensively used viz. Reformat. We can look at it as
the Map Data component in TIBCO BW. However depending upon the
use we can achieve the same functionality using some different
palettes.


Another important point to note regarding Ab Initio is that we don't
have the concept of Topics here. There are only Queues in Ab Initio,
to which we can send data and from which we can then receive it.
These are basically simple data files that are created and stored on
the UNIX system when we create the queues using commands. The
component names Subscriber and Publisher (in Ab Initio) may therefore
be misleading for those who have stepped into this tool from the
field of TIBCO BW.
Once a process is developed in Ab Initio, it is converted into a
(ksh) script, which is subsequently run. Depending on whether the
process is a batch or a continuous one, it can be run at a particular
time of day or left running continuously.
The process deployment in TIBCO BW, however, is a bit different,
because here we first need to build the EAR file of the process,
which is then deployed in the TIBCO Administrator. A lot of
parameters can be monitored and controlled using TIBCO Admin, which
can help us to get optimal performance.
We hope that the above discussion is helpful in giving a high-level
insight into both Ab Initio and TIBCO BW from the viewpoint of both
their common and disparate functionalities.

