
Steps Involved in building an ETL process…

1. Create Source Definition
2. Create Target Definition
3. Design Mapping with or without Transformation Rules
4. Create a Session for each Mapping
5. Create Workflow
6. Execute Workflow

Prerequisites:
1. Creation of User Accounts

Source DB                    Target DB
Username: scott              Username: batch7
Password: tiger              Password: target
Tables: EMP, DEPT, BONUS     Table: Emp

Process
1. Create a user in Oracle (it can be any database)
a) Start → Programs → Oracle → Application Development → SQL*Plus
b) Login with
Username: system
Password: manager
Host String: ORCL
SQL> CREATE USER BATCH7 IDENTIFIED BY target;
SQL> GRANT DBA TO BATCH7;
SQL> CONNECT BATCH7/target@ORCL;
SQL> CREATE TABLE DIM_EMP
( EMPNO NUMBER(5) PRIMARY KEY,
ENAME VARCHAR2 (10),
SAL NUMBER(7,2),
DNO NUMBER(3) );
2. Create the ODBC connections
ODBC is middleware, an interface that provides access to databases.
Start → Settings → Control Panel → Performance & Maintenance → Administrative
Tools → Data Sources (ODBC)
For a standalone PC use the "User DSN" tab.
For a PC on a network:
Select the System DSN tab → Click Add
-- Now the Create New Data Source window will appear...
Select the driver "Oracle in oradb10g_home" (for Oracle 9i: "Oracle in OraHome90")
Click → Finish
-- Now the Oracle ODBC Driver Configuration window will appear...
Data Source Name: Batch7_Source_Oracle
TNS Service Name: ORCL
User ID: scott
Click Test Connection → give the password (tiger)
A message will appear: "Connection Successful". This means your user scott is now
connected through ODBC. Otherwise, check the configuration settings again and
correct them.
** One more ODBC connection is required for the target. Similarly, create an ODBC
connection named BATCH7_TARGET_ORACLE by repeating the same process
explained above, but here the username is BATCH7 (which you created earlier) with
password target.

3. Starting Services
To start the services you can run the MSCONFIG command from the Run prompt and
choose Services in the window that appears, or find Services under
Control Panel → Administrative Tools.
Start these two services:
1. Informatica Orchestration Server
2. Informatica Services 8.6.0
4. Creation of Folder
Process
1. Start → Programs → Informatica PowerCenter 8.6.0 → PowerCenter Repository
Manager.
2. From the Repository Navigator pane select the repository (nipuna_rep) → Right
Click → Connect (you can also go through the Repository menu)
3. Enter the login details such as username and password (Administrator,
Administrator) → Connect
4. Folder menu → Create
5. Enter the folder details (Name)
** Everything is now ready for the creation of mappings. Follow the steps
described earlier for creating a mapping.

Step 1: Creation of Source Definition

A source definition is created using the Source Analyzer tool in the Designer client
component.
Process
1. Start → Programs → Informatica PowerCenter 8.6.0 → Client → PowerCenter Designer
** Now you are at the Designer component window.
2. Connect → Repository → Select the desired folder
3. Tools menu → Source Analyzer
4. Sources menu → Import from Database
** An Import Tables window will appear.
Connect to the database with the following details...
ODBC data source: give the connection name you entered earlier while creating your
ODBC connection.
Username: SCOTT
Owner name: SCOTT
Password: tiger
Click Connect
Select the desired tables you want as source definitions → OK
Repository menu → Save
** Now your source definition has been created and saved in the repository.

Step 2: Create Target Definition

A target definition is created using the Target Designer tool in the Designer client
component.
Procedure
1. Tools menu → Target Designer
2. Targets menu → Import from Database
Connect to the database with the following details:
ODBC data source
Username
Password
Click → Connect
Select Tables → OK
Repository menu → Save

Step 3: Design a Mapping without Transformation Rule

** A mapping without a transformation rule is called a Simple Pass Mapping.
A mapping is created using the Mapping Designer tool. Every mapping is uniquely
identified by its name.
Procedure
1. Tools menu → Mapping Designer
2. Mappings menu → Create
3. Enter the mapping name → OK
4. From the Repository Navigator pane drag the source (EMP) and target (Dim_Emp)
table definitions and drop them on the Mapping Designer workspace.
5. From the Source Qualifier (SQ_EMP) connect each column to the corresponding column
in the target table definition by dragging. (You can also use Autolink.)
6. Repository menu → Save

Note: Every source table definition by default associates with a Source Qualifier
transformation.
The Source Qualifier transformation prepares the SQL statement which is used for
extraction by the Integration Service.
Step 4: Creation of Session
Process
1. Open the Workflow Manager client
2. Connect → Repository
3. Select → Folder (your folder from the Repository pane) → Tools menu → Task Developer
4. Task menu → Create
5. Select → Task type Session → Enter the name → Click → Create
6. Select → Mapping → OK
7. Click → Done
** Creation of Source Connection: (This connection is required for extraction and
loading of the actual data. The ODBC connection you made earlier is only for importing
the structure of the table.)
Connections menu → Relational
From the list → Oracle → New → Enter the details
Name: BATCH7PM_SRC
Username: scott
Password: tiger
Connect String: ORCL
** Similarly create the target connection following the process stated above.
8. Double Click → Session (S_simplepass) → Mapping tab
9. From the left pane → SQ_EMP → Set Connection with the value which you
created for the "Extraction" connection name
10. Repeat the process → Dim_employee → Set Connection
11. From Properties → set Target Load Type to "Normal" → Apply → OK
12. Repository → Save

Step 5: Creation of Workflow

Process
1. Tools menu → Workflow Designer
2. Workflows menu → Create
3. Enter the workflow name (WKF_simplepass)
4. From the Repository Navigator window expand → Sessions subfolder → Drag the
session, drop it beside the workflow
5. From Task menu → Link Task → Drag the link from the workflow, drop on the session
6. Repository menu → Save

Step 6: Executing Workflow

Process
1. Workflows menu → Start Workflow

** Now start the Workflow Monitor to view your workflow status.

** There is also an option in Informatica PowerCenter to create the target
definition manually in the target database.

Target DefinitionManual Approach


Process
1. Tools menuTarget Designer
2. Target menuCreate
3. Enter the Target Table nameselect Database type (Oracle)CreateDone
4. Double ClickTarget DefinitionColumn Tab from toolbarAdd new column
The column Structure will look like this…
Column name Data Type Precision Scale Not Null Key Type
DEPTNO Number(p,s) 2 0 √ Primary key
DNAME Varchar2 10 0 Not A key
SUMSAL Number(p,s) 7 2 Not A key
5. ClickOkDone
6. Target menuGenerate/Execute SQL
7. ClickConnectGive the Information
ODBC Data source
Username
Password
8. ClickConnect
9. Select Create TableGenerate/ExecuteClose
Transformation
1. (Filter, Rank, Expression)

DFD: Emp → SQ_EMP (14 rows) → Filter [Deptno = 30] (14 in / 6 out) → Rank [Top 3]
(6 in / 3 out) → Expression [Tax = Sal * 0.15] (3 in / 3 out) → T_Emp

Business Logic: Calculate the tax for the top 3 employees of department number 30.
Ans: First we use the Filter transformation to pass only the data of department number 30,
then the Rank transformation to take the top 3 employees of department number 30,
and finally the Expression transformation to calculate the tax for each employee.
Procedure
1. Create the Source Definition → Emp
2. Target Designer → Create Target Definition → Emp_tax_cal
3. Create Mapping → Name: M_tax_cal → Drop the source & target definitions onto the
Mapping Designer workspace.
4. Transformation menu → Create → Select transformation type Filter → Enter the
name → Create → Done
-- Repeat the same process to add the Rank & Expression transformations.
5. From SQ_Emp copy the required ports to the Filter transformation.
6. Double click → Filter Transformation → Properties tab → Set
Filter Condition → DEPTNO = 30
7. Click → Apply → OK
8. From the Filter transformation copy the ports to the Rank transformation.
9. Double click → Rank Transformation → Ports tab → Set
For the port name SAL → check Rank Port
10. Select → Properties tab → Set
Top/Bottom → Top
Number of Ranks → 3
11. From the Rank transformation copy the ports to the Expression transformation.
12. Double click → Expression Transformation → Ports tab
13. From the toolbar → Add New Port → Set
Port Name = Tax
Data Type = Decimal
Precision = 7
Scale = 2
Uncheck → Input Port (I)
Expression → SAL * 0.15
14. Click → Apply → OK
15. From the Expression transformation connect the ports to the target definition.
16. Repeat the process of creating the session and workflow and execute the workflow.
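The mapping implements the same logic as this SQL sketch, which you can run against the
source to cross-check the loaded rows (assuming the standard scott.EMP table):
SQL> SELECT EMPNO, ENAME, SAL, SAL * 0.15 AS TAX   -- Expression: Tax = Sal * 0.15
     FROM (SELECT * FROM EMP
           WHERE DEPTNO = 30                       -- Filter transformation
           ORDER BY SAL DESC)                      -- Rank: order by salary
     WHERE ROWNUM <= 3;                            -- Rank: keep the top 3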

2. Sorter, Aggregator, Lookup

DFD: Emp → SQ_EMP (Ename, Sal, Deptno) → Sorter T/R [key: Deptno] → Aggregator T/R
[group by Deptno; Sumsal (O) = SUM(Sal)] → Emp_Sum (Deptno, Dname, Sumsal)
Lookup T/R on the Dept table (Deptno, Dname, Location) supplies Dname, matched on Deptno.

Business Logic: Calculate the total salary paid for each department.
Ans: The Sorter transformation is used for better performance of the Aggregator
transformation by grouping the data department-wise; the Aggregator transformation then
aggregates the salary (SUM) per department. There is a Dname port in the target table,
but Dname is not in the Emp table, so we use the Lookup transformation to get Dname
from the Dept table (Emp.Deptno = Dept.Deptno).

Procedure
1. Create the Source and Target Definitions
2. Create the mapping with the name M_LKP
3. Drop the Source & Target Definitions
4. Create the transformations of type Sorter and Aggregator.
5. From SQ_EMP copy the ports (Deptno, Sal) to the Sorter transformation
6. Double click → Select Ports tab → for the port name Deptno check the Key
checkbox → Apply → OK
7. From the Sorter transformation copy the ports to the Aggregator transformation
8. Double click on the Aggregator transformation → Select Ports tab → for the port name
Deptno → check the Group By checkbox
9. For the port name SAL uncheck the output port (O)
10. From the toolbar add a new port
Port Name   DataType   P   S   I   O   V   Expression
Sumsal      decimal    7   2       √       SUM(SAL)
11. Select the Properties tab → check Sorted Input → Click Apply → OK
12. From the Aggregator connect the ports to the target definition.
13. Transformation menu → Create → Lookup transformation → Name:
LKP_SRC → Create
14. Select the Source tab → select the source table Dept definition → OK
15. From the Aggregator transformation copy the port Deptno to the Lookup
transformation → double click on the Lookup transformation
16. Select the Condition tab → from the toolbar click Add a New Condition.
Lookup Table Column   Operator   Transformation Port
Deptno                =          Deptno1
17. Click → Apply → OK
18. From the Lookup transformation connect the Dname port to the target.
19. Click → Repository menu → Save
20. Repeat the process of creating the session and workflow and run the workflow.
Note: The Lookup transformation supports both equi-joins and non-equi-joins.
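The result loaded into Emp_Sum matches this SQL sketch (assuming the standard scott
schema, where Dept carries the department names):
SQL> SELECT d.DEPTNO, d.DNAME, SUM(e.SAL) AS SUMSAL   -- Aggregator: SUM per Deptno
     FROM EMP e
     JOIN DEPT d ON e.DEPTNO = d.DEPTNO               -- Lookup: fetch Dname by Deptno
     GROUP BY d.DEPTNO, d.DNAME;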
3. Joiner Transformation

DFD: Emp (Empno, Ename, Job, Sal, Deptno) and Dept (Deptno, Dname, Location) →
Joiner Transformation [Deptno1 = Deptno] → Emp_Dept (Empno, Ename, Job, Sal,
Deptno, Dname, Location)

Business Logic: Merge the tables Emp and Dept.
Ans: Use the Joiner transformation; copy the ports from both the Emp and Dept tables to
the Joiner transformation, and set the condition on the port which is available in both
tables, just like an equi-join (Deptno).
Procedure
1. Create the Source and Target Definitions
2. Create Mapping → Name → M_DATA_JOIN
3. Drop the Source and Target Definitions
4. Create the transformation of type Joiner
5. From SQ_EMP copy the following ports to the Joiner transformation (Empno,
Ename, Job, Sal, Deptno)
6. From SQ_DEPT copy the following ports to the Joiner transformation (Deptno,
Dname, Location)
7. Double click on the Joiner transformation → Condition tab → From the toolbar Add
New Condition
Master    Operator   Detail
Deptno1   =          Deptno
8. Click → Apply → OK
9. From the Joiner transformation connect the ports to the target definition.
10. Repeat the process for creating the session & workflow and run the workflow
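Functionally the mapping produces the same rows as this join, a sketch for verifying the
load (in the standard scott schema the location column of DEPT is named LOC):
SQL> SELECT e.EMPNO, e.ENAME, e.JOB, e.SAL,
            d.DEPTNO, d.DNAME, d.LOC AS LOCATION
     FROM EMP e
     JOIN DEPT d ON e.DEPTNO = d.DEPTNO;   -- the Joiner's equi-join condition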

Router Transformation

DFD: Sales → Sales_SQ → Router Transformation (Input) with output groups:
State=HR → State HR target
State=DL → State DL target
State=KA → State KA target
Default → Default target

Business Logic: Divide the Emp table records department-wise.
Ans: For this we use the Router transformation, because it takes one input and provides
multiple outputs. Connect each target table to the corresponding department group of
output ports in the Router transformation.
Procedure
1. Create the Source and Target Definitions
2. Create a mapping with the name M_Router
3. Drag and drop the source and target definitions onto the Mapping Designer
workspace.
4. Create the Router transformation from the Transformation menu.
5. Copy all the ports from the Source Qualifier to the Router transformation
6. Double click the Router Transformation → Groups tab → Add New Group from the toolbar
Group Name   Group Filter Condition
Dept 10      Deptno = 10
Dept 20      Deptno = 20
Dept 30      Deptno = 30
7. Click → Apply → OK
8. Copy the ports from the Dept 10 group to the Emp_dept10 target. (Repeat the process
for the Dept 20 and Dept 30 groups.)
9. Click → Repository menu → Save
10. Repeat the process for creating the session and workflow and run the workflow.
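The router's three groups behave like three filtered inserts. A sketch (the target table
names Emp_dept10/20/30 follow the group names above and are illustrative):
SQL> INSERT INTO EMP_DEPT10 SELECT * FROM EMP WHERE DEPTNO = 10;  -- group "Dept 10"
SQL> INSERT INTO EMP_DEPT20 SELECT * FROM EMP WHERE DEPTNO = 20;  -- group "Dept 20"
SQL> INSERT INTO EMP_DEPT30 SELECT * FROM EMP WHERE DEPTNO = 30;  -- group "Dept 30"
-- Rows matching no condition go to the Default group.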

Union Transformation
Note: All the sources should have the same structure.

DFD: Emp → SQ_Emp → Union Transformation (Group 1)
Employees → Employees_SQ → Union Transformation (Group 2)
Union Transformation (Output) → Emp_Union

Business Logic: Merge two tables, Emp and Employees, into one table Emp_Union.
Ans: For this we use the Union transformation. It takes multiple inputs and provides one
output, but the various sources must have the same structure. Connect the target table
to the single output group of the Union transformation.

Procedure
1. Import the metadata of Emp and Employees from the source database using the Source
Analyzer.
2. Create the target table Emp_Union
3. Create a mapping with the name M_Union
4. Drag and drop the Source and Target Definitions onto the Mapping Designer
workspace.
5. Create the Union transformation with the name Emp_Union
6. Double click on the Union Transformation → Groups tab → Add New Group → Name the
groups Emp and Employees
7. Group Ports tab → Add New Port
Name    Datatype   Precision   Scale
Empno   decimal    2           0
Ename   String     10          0
Job     String     10          0
8. Copy the ports from SQ_Emp to the Emp group
9. Copy the ports from Employees_SQ to the Employees group
10. Copy the ports from the output group to the target (Emp_Union)
11. Repeat the process of creating the session and workflow and run the workflow.
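The Union transformation behaves like SQL UNION ALL: it merges the groups without
removing duplicates. A sketch of the equivalent query:
SQL> SELECT EMPNO, ENAME, JOB FROM EMP          -- group Emp
     UNION ALL
     SELECT EMPNO, ENAME, JOB FROM EMPLOYEES;   -- group Employees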

Stored Procedure Transformation

Create the following stored procedure in the database:
CREATE OR REPLACE PROCEDURE ANNUAL_TAX
  (SAL IN NUMBER,
   TAX OUT NUMBER)
IS
BEGIN
  TAX := SAL * 0.15;
END;
/
Business Logic: Calculate the tax of each employee for the new column Tax in the
target table.
Ans: For calculating the tax this time we use the Stored Procedure transformation,
because it reduces the overhead of the Integration Service and improves
performance.
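You can verify the procedure in SQL*Plus before wiring it into the mapping (a quick
sketch using a bind variable):
SQL> VARIABLE T NUMBER
SQL> EXEC ANNUAL_TAX(2000, :T);
SQL> PRINT T     -- expect 300, i.e. 2000 * 0.15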
Procedure
1. Create the Source and Target Definitions
2. Create the mapping with the name M_StoredP
3. Drop the source and target definitions onto the Mapping Designer workspace.
4. Create the transformation of type Stored Procedure
5. Give the ODBC connection for the stored procedure according to its location,
i.e., whether it resides in the source database or the target database.
If the stored procedure is in the target database there are no additional settings to
configure; otherwise do the additional settings as follows:
6. Double Click the Stored Procedure → Properties tab → Set the "Connection
Information" → as per your relational connection name for the source.
7. While configuring the mapping in the session, set the connection for the
transformation as well → as per your relational connection name for the source.
8. From SQ_Emp connect the SAL port to the Stored Procedure transformation.
9. From the Stored Procedure connect the TAX port to the target definition.
10. From SQ_Emp connect the remaining ports to the target definition.
11. Click → Repository menu → Save
12. Repeat the process for the session & workflow and run the workflow.

Source Qualifier Transformation

With this transformation we generally modify the SQL code generated by the Source
Qualifier.
Business Logic: Load into the target the data of only those employees who belong to
department numbers 20 and 30, with the records sorted by salary in ascending order.
Ans: Earlier we did this with the Filter transformation, but now we will do it with the
Source Qualifier transformation. It will certainly increase the efficiency of the
Integration Service.
Procedure
1. Create the Source and Target Definitions
2. Create a mapping with the name M_Source_filter
3. Drop the source and target definitions onto the Mapping Designer window.
4. From SQ_Emp connect the required ports to the target table just like a simple pass.
5. Double click SQ_Emp → Properties tab → Set the value "Number of Sorted
Ports = 1" → Set the value for "SQL Query" as:
SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL, EMP.DEPTNO
FROM EMP
WHERE EMP.DEPTNO IN (20, 30)
ORDER BY EMP.SAL
→ Click Generate SQL → Click Apply → OK
** Here you can also set properties for your query, such as Select Distinct, by checking
the Distinct checkbox.
** By default the ORDER BY clause is imposed on EMPNO, because when you set Number
of Sorted Ports = 1 the Integration Service takes the ports sequentially starting from
EMPNO; if you chose the value 2 it would take the ports EMPNO and ENAME, so you must
adjust the SQL query according to your requirements.
Note: If you set the value for the SQL query without connecting the ports to the target,
you will get an error message, so connect the ports from SQ_Emp to the target first.

6. Repeat the process for creating the session and workflow and run the workflow.

User Defined Join in Source Qualifier Transformation

User-defined joins are possible in the Source Qualifier only when the two sources belong
to the same database user account, or schema.
Business Logic: Join the two tables Emp and Dept to get the department name and location
from the Dept table for each employee in the Emp table.
Ans: Instead of using a Joiner transformation we use the User Defined Join option in the
Source Qualifier, because both tables are in the same scott schema. It will certainly
improve performance as well.
Procedure
1. Create the Source and Target Definitions
2. Create a mapping with the name M_Source_join
3. Drag and drop the Source and Target Definitions onto the Mapping Designer
workspace.
4. From the Source Qualifier connect the ports to the target definition.
5. Double click on the Source Qualifier → Properties tab → Set the value for "User
Defined Join" → Emp.Deptno = Dept.Deptno
6. Click → Apply → OK
7. Repository menu → Save
8. Repeat the process for creating the session and workflow and run the workflow.
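With that property set, the query the Source Qualifier issues is effectively the join
below (a sketch; PowerCenter generates the actual statement, and in the scott schema the
location column of DEPT is named LOC):
SQL> SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL,
            DEPT.DNAME, DEPT.LOC
     FROM EMP, DEPT
     WHERE EMP.DEPTNO = DEPT.DEPTNO;   -- the user-defined join condition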

Mapplet
A mapplet is a reusable metadata object created with business logic using a set of
transformations.
Procedure
1. Tools menu → Mapplet Designer
2. Mapplets menu → Create → Give the name of the mapplet
3. Transformation menu → Create → Mapplet Input → Enter a proper
name → Create → Done
4. Transformation menu → Create → Mapplet Output → Enter a proper
name → Create → Done
5. Create the Filter transformation and Expression transformation.
6. Double Click → Mapplet Input Transformation → Ports tab → Create the desired
ports → OK
For this exercise: (Empno, Ename, Job, Sal, Deptno)
7. From the Mapplet Input copy the ports to the Filter transformation → Change the data
type, precision, and scale for the required ports → Define the filter condition
Deptno = 20 OR Deptno = 30
8. From the Filter transformation copy the ports to the Expression transformation.
9. Create an output port with the name TAX, uncheck the Input Port checkbox, and
develop the expression with the following syntax:
IIF(SAL > 2000, SAL * 0.15, SAL * 0.20)
10. From the Expression transformation copy the ports to the Mapplet Output
transformation.
11. Repository menu → Save
Design a Mapping with a Mapplet
Business Logic: Create a mapping for extracting and loading the data from the table Emp
for those employees who belong to department number 20 or 30, and also calculate their
annual tax.
Ans: We will use the mapplet that we just created in the exercise above, because we have
already implemented this business logic while creating the mapplet.
Procedure
1. Create the Source and Target Definitions
2. Create a mapping with the name M_Mapplet
3. Drop the source and target definitions onto the Mapping Designer workspace.
4. From the Mapplets subfolder drag the mapplet and drop it beside the Source Qualifier.
5. From SQ_Emp connect the ports to the mapplet input, and from the mapplet output
connect the ports to the target definition.
6. Repository menu → Save
7. Repeat the process for creating the workflow and run the workflow.

Constraint Based Load Ordering

CBL is specified when you want to load data into snowflake dimensions that have a
primary key / foreign key relationship.
Exercise: Using CBL, load the data into the dimensions named DEPT and EMP, in which
Deptno is the primary key in the DEPT table and a foreign key in the EMP table.
Procedure
1. Create the Source and Target Definitions
2. Create a mapping with the name M_CBL
3. Drag and drop the source and target definitions onto the mapping workspace
4. From SQ_Emp_dept connect the ports to the target definitions.
5. Create a session with the name S_CBL
6. Double click → Session → Config Object tab → Check Constraint Based Load Ordering
7. Select → Mapping tab → Set the source and each target connection (relational
type) → Apply → OK
8. Repeat the process for creating the workflow and run the workflow.
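CBL relies on the key relationship declared between the target tables. A sketch of target
DDL expressing the dependency (the foreign key is what forces DEPT to load before EMP):
SQL> CREATE TABLE DEPT
     ( DEPTNO NUMBER(2) PRIMARY KEY,
       DNAME  VARCHAR2(14) );
SQL> CREATE TABLE EMP
     ( EMPNO  NUMBER(4) PRIMARY KEY,
       ENAME  VARCHAR2(10),
       DEPTNO NUMBER(2) REFERENCES DEPT(DEPTNO) );  -- child rows need parents first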

Scheduling Workflow
A schedule specifies the date and time to run the workflow.
Procedure
1. From the Workflow Manager → Tools menu → Workflow Designer → Create
Workflow
2. Select the Scheduler tab → Select the Reusable radio button → Set the values for the
scheduler:
Run Option → Run on Integration Service Initialization
Schedule Option → Select Run Every Day
End Option → Forever
Set the start date and time
3. Click → Apply → OK
4. Repeat the rest of the process of creating the workflow.

Working with Flat Files

Procedure
Step 1 → Creation of Source Definition
i) Tools menu → Source Analyzer → Sources menu → Import from File
ii) Browse to the location of the flat file → Select the file → OK
iii) In the pop-up window make these settings:
a) Select Flat File Type → Delimited
b) Select (check) "Import field names from the first line" → Click → Next
c) Select the delimiter type → Next
d) If required, alter the data types for the source definition → Finish
iv) Repository menu → Save
Step 2 → Create the target definition in the target database and repeat the process for
the target definition
Step 3 → Create Mapping → Give a proper name → From the Source Qualifier connect the
ports to the target definition.
Step 4 → Create a session → Give a proper name
Step 5 → Double click the session → Mapping tab → Select SQ_Customer from the left
pane → Set the attributes as follows:
Attribute               Value
Source File Directory   D:\Flatfiles
Source File Name        Customer.txt
Source File Type        Direct
Step 6 → Set the target settings for loading as usual with a relational connection.
Step 7 → Repeat the process for creating the workflow and run the workflow.

Direct and Indirect Communication of the Integration Service with Source File Type

Direct: the Integration Service reads the named file itself, e.g.
C:\Flatfile\Customer.txt

Indirect: the named file (e.g. D:\Files\Cust.txt) is a list of files, and the
Integration Service reads each file in the list, e.g.
C:\Flatfiles\Customer.txt
D:\Customer\Cust1.txt
D:\Sales\Customer.txt

Working with File List
A file list is a list of flat files with the same data definition which need to be
merged, with the source file type set to Indirect.
Note: A file list works only when all the flat files have the same data definition.
Procedure
1. All the processes are the same, except that you set the session attribute
"Source File Type" to Indirect.

XML Source Qualifier Transformation

We use the XML Source Qualifier transformation to read data from XML files.
Every XML source definition by default associates with an XML Source Qualifier
transformation.
Procedure
1. Create the source definition Emp.xml
2. Tools menu → Source Analyzer → Sources menu → Import XML Definition
3. Browse to the location of the XML file → select the file → Open → OK
4. A window will pop up for settings → Click → Next → Select the options
i. Hierarchy Relationship
ii. Denormalized XML views → Finish
5. Repository menu → Save
6. Create the target definition → Emp_xml
7. Create a mapping, giving it a proper name.
8. Drag and drop the source and target definitions onto the mapping workspace.
9. From XML_SQ connect the ports to the target definition.
10. Create Session → S_XML
11. Double click the session → Mapping tab → from the left pane select the XML Source
Qualifier → In the Properties section set the following attributes:
Source File Directory → D:\XML Files
Source File Name → Emp.xml
Source File Type → Direct
12. Repeat the process for the target relational connection and for creating the
workflow, and run the workflow.
Normalizer Transformation
The Normalizer transformation functions like a Source Qualifier while reading data from
COBOL sources.
Use the Normalizer transformation to convert a single input record from the source into
multiple output data records. (This process is known as data pivoting.)
Process
1. Create the Source and Target Definitions
2. Create a mapping with the name → M_Pivot
3. Drop the source and target definitions onto the mapping workspace.
4. Create the transformation of type Normalizer
5. Double click → Normalizer Transformation → Normalizer tab → Add new columns
Column Name   Level   Occurs   Datatype   P    S
Year          0       0        Number     4    0
Account       0       0        String     10   0
Amount        0       3        Number     7    2
6. Click → Apply → OK
7. From the Source Qualifier connect the ports to the Normalizer transformation
8. From the Normalizer transformation connect the ports to the target definition
(including GCID_Month)
9. Repository menu → Save
10. Repeat the process for the session & workflow and run the workflow.

GCID → Generated Column ID / Global Column ID

Transaction Control Transformation

A type of active transformation which allows you to control transactions through a set of
commit and rollback conditions.
Business Logic: Commit only the information of those employees who belong to department
number 20.
Ans: Use the Transaction Control transformation and give the condition:
IIF(DEPTNO = 20, TC_COMMIT_AFTER, TC_ROLLBACK_AFTER)
Process
1. Create the Source and Target Definitions
2. Create the mapping with the name → M_TCT
3. Drag & drop the source and target definitions onto the Mapping Designer workspace.
4. Create the transformation of type → Transaction Control
5. Copy the ports from SQ_Emp to the Transaction Control transformation
6. Copy the ports from the Transaction Control transformation to the target table
7. Double Click → Transaction Control Transformation → Select the Transaction Control
Condition → Give the condition
IIF(DEPTNO = 20, TC_COMMIT_AFTER, TC_ROLLBACK_AFTER)
To select a predefined variable → click on the Variables → Built-in folder.

Slowly Changing Dimension Type 2 (Type 2 Mapping)

SCD Type 2 → Complete history + current data

Process
1. Source Definition → Emp table
2. Target Definition (Emp_type2) → (Empkey, Empno, Ename, Sal, Version)
3. Drag and drop the source and target definitions onto the Mapping Designer workspace.
Note: Drop the target definition twice on the Mapping Designer workspace.
4. Create a Lookup transformation to perform a lookup on the target table.
5. From SQ_Emp copy the port Empno to the Lookup transformation and make it
input-only.
6. Double click on the Lookup transformation → Select the Condition tab → Add New
Condition → Set the condition as follows:
Lookup Table Column   Operator   Transformation Port
Empno                 =          Empno1
7. Select the Properties tab → Set the following properties:
Transformation Attribute          Value
Lookup Policy on Multiple Match   Use Last Value
8. Create an Expression transformation
9. From SQ_Emp copy the ports (Empno, Ename, Sal) to the Expression transformation
10. From the Lookup transformation copy the ports (Empkey, Sal1, Version) to the
Expression transformation
11. Rename the ports copied from the Lookup transformation for better understanding
as (TRG_EMPKEY, TRG_SAL, TRG_VERSION)
12. In the Expression transformation → Create two output ports with the names
NEW_RECORD and UPDATE_RECORD and develop the following expressions:
NEW_RECORD → ISNULL(TRG_EMPKEY)
UPDATE_RECORD → NOT ISNULL(TRG_EMPKEY) AND (SAL != TRG_SAL)
13. Click → Apply → OK
Defining the New Record Data Flow
14. Create transformations of type Filter, Sequence Generator, Expression, and Update
Strategy.
15. From the Expression transformation copy the ports (Empno, Ename, Sal, New_Record)
to the Filter transformation.
16. Double click → Filter Transformation → Properties tab
Transformation Attribute   Value
Filter Condition           New_Record
17. From the Filter transformation copy the ports (Empno, Ename, Sal) to the Expression
transformation
18. From the Sequence Generator transformation copy the Next_Val port to the Expression
transformation
19. Double click → Expression Transformation → Ports tab → For the port Next_Val select
input only → Add new ports
Port Name   Datatype   Precision   Scale   O   Expression
S_key       Decimal    6           0       √   Next_Val * 100
Version     Decimal    5           0       √   0
20. Click → Apply → OK
21. From the Expression transformation copy the ports (S_key, Empno, Ename, Sal,
Version) to an Update Strategy transformation
22. Double Click → Update Strategy Transformation → Properties tab
Transformation Attribute     Value
Update Strategy Expression   DD_INSERT
23. From the Update Strategy connect the ports to the first target definition
Defining the Update Record Data Flow
24. Create transformations of type Filter, Expression, and Update Strategy
25. From the Expression transformation copy the ports (Empno, Ename, Sal, TRG_Empkey,
TRG_Version, Update_Record) to the Filter transformation
26. Double Click → Properties tab
Transformation Attribute   Value
Filter Condition           Update_Record
27. From the Filter transformation copy the ports (Empno, Ename, Sal, TRG_Empkey,
TRG_Version) to the Expression transformation
28. Double Click → Expression Transformation → Ports tab → Uncheck the output port
for the port names i. TRG_Empkey ii. TRG_Version
29. Toolbar → Add new ports
Port Name   Datatype   Precision   Scale   O   Expression
Dimkey      Decimal    6           0       √   TRG_Empkey + 1
Version     Decimal    5           0       √   TRG_Version + 1
30. Click → Apply → OK
31. From the Expression transformation → copy the ports (Dimkey, Empno, Ename, Sal,
Version) to the Update Strategy transformation
32. Double Click → Update Strategy Transformation → Properties tab
Transformation Attribute     Value
Update Strategy Expression   DD_INSERT
33. From the Update Strategy transformation connect the ports to the second target
definition.
34. Repository menu → Save
35. Repeat the process for creating the session and workflow and execute the
workflow.
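A worked illustration of what the two flows produce in Emp_type2 (the values are
hypothetical; the key arithmetic follows the expressions above):
SQL> -- First run: EMPNO 7369 is new, so the NEW_RECORD flow inserts:
     --   EMPKEY = Next_Val * 100 = 100, SAL = 800, VERSION = 0
     -- Later run: 7369's salary changes to 900, so the UPDATE_RECORD flow inserts:
     --   EMPKEY = TRG_EMPKEY + 1 = 101, SAL = 900, VERSION = TRG_VERSION + 1 = 1
     SELECT * FROM EMP_TYPE2 WHERE EMPNO = 7369;   -- both history rows remain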

Link Condition
In sequential batch processing the sessions can be executed sequentially and
conditionally using link conditions.
Define the link condition using a predefined variable called PrevTaskStatus.
Process
1. Create three mappings.
2. Create three sessions: S10, S20, S30
3. Create a workflow and arrange the sessions in a sequential fashion.
4. Double click on the link between the S10 and S20 sessions → Predefined tab → Double
click on PrevTaskStatus → Type SUCCEEDED
$S10.PrevTaskStatus = SUCCEEDED
5. Repeat the process for S20 and S30
6. Repository menu → Save
7. Repeat the process for creating the workflow and execute the workflow.

Worklet
A worklet is defined as a group of tasks. There are two types of worklets:
i. Reusable worklets
ii. Non-reusable worklets
Process
1. Reusable Worklet
1. Tools menu → Worklet Designer
2. Worklets menu → Create → Give a proper name
3. Drag and drop the sessions in a parallel fashion and link the sessions.
4. Create a workflow → Drop the worklet → Link the workflow and worklet
5. Run the workflow
2. Non-Reusable Worklet
1. Tools menu → Workflow Designer → Workflows menu → Create → Enter the
name → OK
2. Task menu → Create → Task type "Worklet" → Enter the name
3. Link the workflow and the non-reusable worklet
4. Select the non-reusable worklet → Right Click → Open Worklet → Drag and
drop the sessions in a parallel fashion
5. Select the Workflow Designer tool → Repository menu → Save
6. Run the workflow.

Conversion from Non-Reusable to Reusable Worklet

1. Select the non-reusable worklet
2. Right Click → Edit
3. Check the "Make Reusable" checkbox
4. Click → Apply → OK

Command Task
You can specify one or more shell commands (for UNIX) or DOS commands (for Windows)
to run during the workflow with the Command task.
You specify the shell commands in the Command task to delete, reject, or copy a file, etc.
Use the Command task in the following ways:
1. Standalone Command task: use a Command task anywhere in the workflow or
worklet to run shell commands.
2. Pre/post-session shell command: you can call the Command task as the pre- or
post-session shell command for a Session task.
You can use any valid UNIX command for a UNIX server and any valid DOS command for
a Windows server.
Control Flow

WKF → S10 → on success → Command Task [copy C:\test.txt C:\Success] →
Event-Wait [C:\Success\test.txt] → S20
WKF → S10 → on failure → Command Task [copy C:\test.txt C:\Failed] →
Event-Wait [C:\Failed\test.txt] → S20

Prerequisites
1. Create three sessions
2. Create two folders in drive C:\ named Success and Failed
3. Create a blank text file named test.txt
Procedure
1. Tools menu → Task Developer → Task menu → Create
2. Select Task type "Command" → Enter the name → Create → Done
3. Double click → Command Task → Commands tab → Add a new command
Name         Command
On_Success   copy C:\test.txt C:\Success
4. Click → Apply → OK
5. Select Task type "Command" → Enter the name → Create → Done
6. Double click → Command Task → Commands tab → Add a new command
Name        Command
On_Failed   copy C:\test.txt C:\Failed
7. Double click on the session S10 → Select the Components tab
Task                           Type       Value
Post-Session Success Command   Reusable   On_Success
Post-Session Failure Command   Reusable   On_Failed
8. Click → Apply → OK
9. Task menu → Create → Event Wait → Enter the name → Create → Done
10. Double click the Success "Event Wait" task → Select the Events tab → Select the
predefined event
Enter the name of the file to watch:
C:\Success\test.txt
11. Select the Properties tab → Select "Delete Filewatch File"
12. Click → Apply → OK
13. Make links between the tasks → Define the link conditions on the links between
session S10 and the command tasks for success and failure as follows:
$S10.PrevTaskStatus = SUCCEEDED
$S10.PrevTaskStatus = FAILED
14. Repository menu → Save → Execute the workflow

User Defined Events

Event tasks: Event Raise and Event Wait.
An Event Wait task can wait either for user-defined events or for pre-defined events
(file-watch events).

Example workflow:
WKF → S10 → S30 → Event Raise [raise the event "S10 S30 Complete"]
WKF → S20 → Event Wait [wait for "S10 S30 Complete"] → S40
Procedure
1. Create four sessions.
2. Tools menu → Workflow Designer → Workflows menu → Create
3. Enter the workflow name → Events tab → Create a new event → Enter the event
name → Click OK
4. Create the workflow as shown in the figure above → Task menu → Create → Select the
tasks "Event Raise" and "Event Wait" → Create → Done
5. Make the links between the tasks.
6. Double click the Event Raise task → Select the Properties tab
Attribute            Value
User Defined Event   S10 S30 Complete
7. Double Click → Event Wait Task → Events tab
8. Select the option "User Defined" → Click on Browse Events to choose an
event → Select the event → Click OK
9. Repository menu → Save
10. Execute the workflow

Workflow with Decision Task

You can enter a condition that determines the execution of the workflow with the
Decision task, similar to a link condition.
Use a Decision task instead of multiple link conditions in the workflow.

WKF → S10, S20, S30 (in parallel) → Decision Task → Command Task → Event Wait
Task → S40
The Decision task evaluates the decision condition.

Procedure
1. Create four sessions.
2. Tools menu → Workflow Designer → Workflows menu → Create
3. Enter the workflow name → Events tab → Create a new event → Enter the event
name → Click OK
4. Create the workflow as shown in the figure above → Task menu → Create → Select the
tasks "Decision", "Command" and "Event Wait" → Create → Done
5. Make links between the tasks → Double Click on the "Decision Task" → Select the
Properties tab
Attribute       Value
Decision Name   $S10.Status = SUCCEEDED AND
                $S20.Status = SUCCEEDED AND
                $S30.Status = SUCCEEDED
6. Click → OK
7. Double Click → the link into the "Command Task" → Properties tab → Create the
expression
Expression: $Decision.Condition = TRUE
8. Double Click → Command Task → Commands tab
Name      Command
Success   copy D:\CMDTASK\RESULT.txt D:\Success
9. Double Click the "Event Wait" task → Predefined Event → Enter the file name
D:\Success\RESULT.txt
Note: If you want to delete the watch file after the task completes, select the option
"Delete Filewatch File" from the Properties tab of the Event Wait task.
10. Repository menu → Save

Timer Task
You can specify a period of time to wait before the Integration Service runs the next
task in the workflow with the Timer task.

Procedure
1. Create a Timer task from the Task menu of the Workflow Designer.
2. Double click → Timer Task → Timer tab
3. Select Absolute Time → specify the date and time → Apply → OK

Design a Workflow with Multiple Link Conditions (Alternative to the Decision Task)

WKF runs S10 and S20 in parallel; both link into the Command Task, with the link
conditions $S10.Status = SUCCEEDED and $S20.Status = SUCCEEDED.
Procedure
1. Design the workflow as shown in the figure above.
2. Double click → Command Task → General tab
Treat the Input Links As → AND (or OR)
3. Commands tab → Give the command
4. Repository menu → Save

Assignment Task
You can assign a value to a user-defined workflow variable with the Assignment task.
To use an Assignment task in the workflow, first create and add the Assignment task to
the workflow, then configure it to assign a value or expression to the user-defined
variable.

** Weekly and Daily Loading

Procedure
1. Create three sessions.
2. From the Tools menu → select Workflow Designer → From the Workflows menu → select
Create
3. Enter the workflow name → Select the Variables tab → From the toolbar → Click Add
New Variable
Name         Datatype   Persistent
$$WKF_RUNS   Integer    √
Enter the default value 0
4. From the Repository Navigator window → drag the session S10 and drop it beside
the Start task.
5. Create tasks of type "Decision" and "Assignment"
6. Drag and drop the sessions S20, S30
7. Make the links between the tasks → Double click on the link between S10 and the
Assignment task → Develop the following expression:
$S10.Status = SUCCEEDED
8. Double click the "Assignment task" → Expressions tab → From the toolbar click Add
a New Expression
User Defined Variable   Operator   Expression
$$WKF_RUNS              =          $$WKF_RUNS + 1
9. Double click → the link between the Assignment task and the Decision task → Develop
the following expression:
$Assign_value.Status = SUCCEEDED
10. Double click → Decision Task → Properties tab
Attribute       Value
Decision Name   MOD($$WKF_RUNS, 7) = 0
11. Double click → the link between the Decision task and session S20 → Develop the
link condition
$Decision.Condition = TRUE
12. Double click → the link between the Decision task and session S30 → Develop the
link condition
$Decision.Condition = FALSE
13. Repository menu → Save
(A worked example of the MOD condition follows this procedure.)

E-mail Task
Used to send an e-mail within a workflow.
Procedure
1. Develop a workflow with an E-mail task.
2. Double click → Email Task → Properties tab → Set the following attributes:
Attribute         Value
Email User Name   pandey.dipu@in.com
Email Subject     Daily Load Completed
Email Text        Success
3. Click → OK
4. Repository menu → Save

User Defined Function

It lets you create customized, user-specific functions to meet specific business tasks
that are not possible with the built-in functions.
User-defined functions can be private or public.
Procedure
1. From the Repository Navigator window in the Designer client, select the "User-Defined
Functions" subfolder
2. Tools menu → User-Defined Functions → New
3. Enter the name and type
4. Add a new argument
Name   Datatype   Precision   Scale
ARG1   String     12          0
5. To define the expression click on "Launch Editor":
LTRIM(RTRIM(ARG1))
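Once saved as public, the function can be called from any transformation expression with
the :UDF prefix. A hypothetical call, assuming the function was named TRIM_BOTH:
:UDF.TRIM_BOTH(ENAME)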
