Source DB — Username: scott, Password: tiger (tables: EMP, DEPT, BONUS)
Target DB — Username: batch7, Password: target (table: Emp)
Process
1. Create a user in Oracle (it can be any database)
a) Start → Programs → Oracle → Application Development → SQL*Plus
b) Log in with
Username: system
Password: manager
Host String: ORCL
SQL> CREATE USER BATCH7 IDENTIFIED BY target;
SQL> GRANT DBA TO BATCH7;
** One more ODBC connection is required for the target. Create an ODBC connection named BATCH7_TARGET_ORACLE by repeating the same process explained above, but here the username will be BATCH7 (the user you just created) with password target.
3. Starting Services
To start services you can run the MSCONFIG command from the Run prompt; in the window that appears, choose the Services tab. Alternatively, you can find Services under Control Panel → Administrative Tools.
Start these two services:
2.
3.
4.
Folder menu → Create
5.
** Now everything required for the creation of mappings is in place. Follow the steps described earlier for creating a mapping.
Step 1: Creation of Source Definition
A source definition is created using the Source Analyzer tool in the Designer client component.
Process
1. Start → Programs → Informatica PowerCenter Client → PowerCenter Designer
** Now you are at the Designer component window.
2. Connect → Repository → Select desired folder
3. Tools menu → Source Analyzer
4. Sources menu → Import from Database
** An Import Tables window will appear.
Connect to the database with the following details:
ODBC data source: the connection name you gave earlier while creating your ODBC connection
Username: SCOTT
Owner name: SCOTT
Password: tiger
Click Connect.
Select the desired tables you want as source definitions → OK
Repository menu → Save
** Now your source definition has been created and saved in the repository.
Step 2: Create Target Definition
The target definition can be created using the Target Designer tool in the Designer client component.
Procedure
1. Tools menu → Target Designer
2. Targets menu → Import from Database
Connect to the database with the following details:
ODBC data source
Username
Password
Click Connect.
Select tables → OK
Repository menu → Save
Step3: Design a mapping without Transformation Rule
** A mapping without a transformation rule is called a Simple Pass Mapping.
A mapping is created using the Mapping Designer tool. Every mapping is uniquely identified by its name.
Procedure
1. Tools menu → Mapping Designer
2. Mappings menu → Create
3. Enter the mapping name → OK
4. From the repository navigator pane, drag the source (EMP) and target (Dim_Emp) table definitions and drop them on the Mapping Designer workspace.
5. From the Source Qualifier (SQ_EMP), connect the columns to the corresponding columns in the target table definition by dragging. (You can also use auto connect.)
6. Repository menu → Save
Note: Every source table definition is by default associated with a Source Qualifier transformation.
The Source Qualifier transformation prepares the SQL statement that is used for extraction by the Integration Service.
**
Column Name | Data Type   | Precision | Scale | Not Null | Key Type
DEPTNO      | Number(p,s) |           |       | Not Null | Primary key
DNAME       | Varchar2    | 10        |       |          | Not a key
SUMSAL      | Number(p,s) |           |       |          | Not a key
5. Click OK → Done
6. Targets menu → Generate/Execute SQL
7. Click Connect → Give the information:
ODBC data source
Username
Password
8. Click Connect
Transformation
1. Filter, Rank, Expression
DFD: Emp (14 rows) → SQ_EMP (14 rows) → Filter [Dept = 30] (14 in, 6 out) → Rank [Top 3] (6 in, 3 out) → Expression [Tax] (3 in, 3 out) → T_Emp (3 rows)
Business Logic: Calculate the Tax for top 3 employees of department number 30
Ans: First we use the Filter transformation to keep only the rows of department number 30, then the Rank transformation to take the top 3 employees from department number 30, and finally the Expression transformation to calculate the tax for each of those employees.
Procedure
1. Create the source definition Emp.
2. Target Designer → Create target definition → Emp_tax_cal
3. Create mapping → Name M_tax_cal → Drop the source and target definitions onto the Mapping Designer workspace.
4. Transformation menu → Create → Select transformation type Filter → Enter the name → Create → Done
-- Repeat the same process to add the Rank and Expression transformations.
5. From SQ_Emp copy the required ports to the Filter transformation.
6. Double click the Filter transformation → Properties tab → Set
Filter Condition: DEPTNO = 30
7. Click Apply → OK
8. From the Filter transformation copy the ports to the Rank transformation.
9. Double click the Rank transformation → Ports tab → Set
For the port named SAL, check Rank Port.
10. Select the Properties tab → Set
Top/Bottom: Top
Number of Ranks: 3
11. From the Rank transformation copy the ports to the Expression transformation.
12. Double click the Expression transformation → Ports tab
13. From the toolbar → Add new port → Set
Port Name = Tax
Data Type = Decimal
Precision = 7
Scale = 2
Uncheck Input Port (I)
Expression: SAL * 0.15
14. Click Apply → OK
15. From the Expression transformation connect the ports to the target definition.
16. Repeat the process of creating a session and workflow, and run the workflow.
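The logic of this mapping can be sketched in plain Python as a rough stand-in for what the Integration Service computes (the sample rows below are hypothetical, not the full 14-row EMP table):

```python
# Hypothetical sample of the EMP source table.
emp = [
    {"empno": 7499, "ename": "ALLEN",  "sal": 1600, "deptno": 30},
    {"empno": 7521, "ename": "WARD",   "sal": 1250, "deptno": 30},
    {"empno": 7654, "ename": "MARTIN", "sal": 1250, "deptno": 30},
    {"empno": 7698, "ename": "BLAKE",  "sal": 2850, "deptno": 30},
    {"empno": 7782, "ename": "CLARK",  "sal": 2450, "deptno": 10},
    {"empno": 7844, "ename": "TURNER", "sal": 1500, "deptno": 30},
]

# Filter transformation: DEPTNO = 30.
dept30 = [r for r in emp if r["deptno"] == 30]

# Rank transformation: top 3 rows by SAL.
top3 = sorted(dept30, key=lambda r: r["sal"], reverse=True)[:3]

# Expression transformation: derived output port Tax = SAL * 0.15.
result = [{**r, "tax": round(r["sal"] * 0.15, 2)} for r in top3]
```

On this sample, `result` holds BLAKE, ALLEN, and TURNER with their computed tax.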
2. Sorter, Aggregator, Lookup
DFD: Emp → SQ_EMP (Empno, Ename, Sal, Deptno) → Sorter [key: Deptno] (Sal, Deptno) → Aggregator [Deptno; Sal (I); Sumsal (O) = SUM(Sal)] → Emp_Sum (Deptno, Dname, Sumsal). A Lookup transformation on the Dept table (Deptno, Dname, Location) supplies Dname.
Business Logic: Calculate the total Salary Paid for each department
Ans: The Sorter transformation is used for better performance of the Aggregator transformation, grouping the data department-wise; the Aggregator transformation then sums the salary department-wise. The Dname port exists in the target table but not in the Emp table, so we use the Lookup transformation to get Dname from the Dept table (Emp.deptno = Dept.deptno).
Procedure
1.
2.
3.
4.
5.
sum(sal)
11. Select the Properties tab → check Sorted Input → Click Apply → OK
12. From the Aggregator connect the ports to the target definition.
13. Transformation menu → Create → Lookup transformation → Name LKP_SRC → Create
14. Select the Source tab → select the source table definition Dept → OK
15. From the Aggregator transformation copy the port Deptno to the Lookup transformation → double click on the Lookup transformation.
16. Select the Condition tab → from the toolbar click Add a new condition:
Lookup Table Column | Operator | Transformation Port
Deptno              | =        | Deptno1
17. Click Apply → OK
18. From the Lookup transformation connect the Dname port to the target.
19. Click Repository menu → Save
20. Repeat the process of creating a session and workflow, and run the workflow.
Note: The Lookup transformation supports both equi-joins and non-equi-joins.
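The Sorter → Aggregator → Lookup flow above can be sketched in Python; the sample rows and department names are hypothetical, and `itertools.groupby` plays the role of an Aggregator running with Sorted Input:

```python
from itertools import groupby

# Hypothetical sample rows; real data comes from the scott schema.
emp = [
    {"empno": 7782, "ename": "CLARK", "sal": 2450, "deptno": 10},
    {"empno": 7839, "ename": "KING",  "sal": 5000, "deptno": 10},
    {"empno": 7369, "ename": "SMITH", "sal":  800, "deptno": 20},
    {"empno": 7499, "ename": "ALLEN", "sal": 1600, "deptno": 30},
    {"empno": 7521, "ename": "WARD",  "sal": 1250, "deptno": 30},
]
dept = {10: "ACCOUNTING", 20: "RESEARCH", 30: "SALES"}  # Lookup source (DEPT)

# Sorter: sort on the group key so the Aggregator can stream with Sorted Input.
rows = sorted(emp, key=lambda r: r["deptno"])

result = []
for deptno, grp in groupby(rows, key=lambda r: r["deptno"]):
    sumsal = sum(r["sal"] for r in grp)   # Aggregator: SUMSAL = SUM(SAL) per DEPTNO
    result.append({"deptno": deptno,
                   "dname": dept.get(deptno),  # Lookup: Emp.deptno = Dept.deptno
                   "sumsal": sumsal})
```

Sorting first is exactly why the manual recommends the Sorter: a streaming group-by only works on pre-sorted input.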
3. Joiner Transformation
DFD: Emp (Empno, Ename, Job, Sal, Deptno) and Dept (Deptno, Dname, Location) → Joiner transformation → Emp_Dept (Empno, Ename, Job, Sal, Deptno, Dname, Location)
2. Create mapping → Name → M_DATA_JOIN
3. Drop the source and target definitions.
4. Create the transformation type Joiner.
5. From SQ_EMP copy the following ports to the Joiner transformation: Empno, Ename, Job, Sal, Deptno.
6. From SQ_DEPT copy the following ports to the Joiner transformation: Deptno, Dname, Location.
7. Double click on the Joiner transformation → Condition tab → From the toolbar, add a new condition:
Master  | Operator | Detail
Deptno1 | =        | Deptno
8. Click Apply → OK
9. From the Joiner transformation connect the ports to the target definition.
10. Repeat the process for creating a session and workflow, and run the workflow.
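With the default Normal join type, the Joiner behaves like an inner join on the condition Deptno1 = Deptno; a sketch in Python (sample rows hypothetical):

```python
emp = [
    {"empno": 7369, "ename": "SMITH", "job": "CLERK",    "sal":  800, "deptno": 20},
    {"empno": 7499, "ename": "ALLEN", "job": "SALESMAN", "sal": 1600, "deptno": 30},
]
dept = [
    {"deptno": 20, "dname": "RESEARCH", "location": "DALLAS"},
    {"deptno": 30, "dname": "SALES",    "location": "CHICAGO"},
]

# Normal (inner) join on Deptno1 = Deptno: each Emp row picks up
# Dname and Location from the matching Dept row.
emp_dept = [
    {**e, "dname": d["dname"], "location": d["location"]}
    for e in emp
    for d in dept
    if e["deptno"] == d["deptno"]
]
```

The Joiner also offers master/detail outer join types, which would keep unmatched rows instead of dropping them.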
Router Transformation
DFD: Sales → Sales_SQ → Router transformation with input groups State = HR, State = DL, State = KA, and Default.
Exercise DFD: Router groups Deptno = 10 → Dept 10, Deptno = 20 → Dept 20, Deptno = 30 → Dept 30, and Default.
7. Click Apply → OK
8. Copy ports from the Dept 10 group to the Emp_dept10 target. (Repeat the process for the Dept 20 and Dept 30 groups.)
9. Click Repository menu → Save
10. Repeat the process for creating a session and workflow, and run the workflow.
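The Router evaluates every user-defined group condition for each row, so a row can land in more than one group; only rows that match no group go to the Default group. A Python sketch of that routing behavior (group names and rows are hypothetical):

```python
rows = [
    {"empno": 1, "deptno": 10},
    {"empno": 2, "deptno": 20},
    {"empno": 3, "deptno": 30},
    {"empno": 4, "deptno": 40},
]

# One user-defined group per condition, plus the Default group.
conditions = {
    "Dept10": lambda r: r["deptno"] == 10,
    "Dept20": lambda r: r["deptno"] == 20,
    "Dept30": lambda r: r["deptno"] == 30,
}
groups = {name: [] for name in conditions}
groups["Default"] = []

for r in rows:
    matched = False
    for name, cond in conditions.items():
        if cond(r):                 # a row enters EVERY group whose condition it meets
            groups[name].append(r)
            matched = True
    if not matched:                 # rows matching no condition go to Default
        groups["Default"].append(r)
```

This is what makes a Router cheaper than several Filter transformations: the source rows are read once and tested against all conditions in a single pass.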
Union Transformation
Note: All the sources should have the same structure.
DFD: Emp → SQ_Emp → Union transformation (Group 1); Employees → Employees_SQ → Union transformation (Group 2) → Output → Emp_Union
Business Logic: Merge two tables, Emp and Employees, into one table, Emp_Union.
Ans: For this we will use the Union transformation. It takes multiple inputs and provides one output, but the various sources must have the same structure. Connect the target table through the single output group of the Union transformation.
Procedure
1. Import metadata of Emp and Employees from source database using Source
Analyzer.
Column | Datatype | Precision | Scale
Empno  | Decimal  |           |
Ename  | String   | 10        |
Job    | String   | 10        |
3. Drop the source and target definitions onto the Mapping Designer workspace.
4. Create the transformation type Stored Procedure.
5. Give the ODBC connection to run the stored procedure according to where it resides, i.e., whether it is in the source database or the target database.
If the stored procedure is in the target database, no additional settings need to be configured; otherwise do the additional settings as follows:
6. Double click the Stored Procedure transformation → Properties tab → Set the Connection Information → as per your relational connection name for the source.
7. While configuring the mapping in the session, set the connection for the transformation as well → as per your relational connection name for the source.
8. From SQ_Emp connect the SAL port to the Stored Procedure transformation.
9. From the Stored Procedure transformation connect the TAX port to the target definition.
10. From SQ_Emp connect the remaining ports to the target definition.
11. Click Repository menu → Save
12. Repeat the process for session and workflow, and run the workflow.
Source Qualifier Transformation
In this transformation we generally change the SQL code generated by the Source Qualifier.
Business Logic: Load into the target the data of employees who belong only to department numbers 20 and 30, with the records sorted by salary in ascending order.
Ans: Earlier we did this with the Filter transformation, but now we will do it with the Source Qualifier transformation. This will certainly increase the efficiency of the Integration Service, since the filtering and sorting are pushed down to the database.
Procedure
1. Create Source and Target Definition
2. Create Mapping with a name M_Source_filter
3. Drop the source and target definition on the mapping designer window.
4. From SQ_Emp connect the required port to the Target table just like simple pass.
5. Double click SQ_Emp → Properties tab → Set the value for Number of Sorted Ports = 1 → Set the value for SQL Query as:
SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL, EMP.DEPTNO
FROM EMP
WHERE DEPTNO IN (20, 30)
ORDER BY EMP.SAL
→ Click Generate SQL → Click Apply → OK
** Here you can also set properties for your query, such as Distinct, by checking the Distinct checkbox.
** By default the ORDER BY clause is imposed on EMPNO because you set Number of Sorted Ports = 1; the Integration Service takes ports sequentially starting from EMPNO. If you chose the value 2, it would take the ports EMPNO and ENAME, so you must change the SQL query according to your requirements.
Note: If you set the value for the SQL query without connecting the target ports, you will get an error message, so connect the target ports from SQ_Emp first.
6. Repeat the process for creating session and workflow and Run the workflow.
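The effect of the overridden query (filter on department, then ascending sort on salary) can be sketched in Python; the sample rows are hypothetical:

```python
emp = [
    {"empno": 7369, "ename": "SMITH", "sal":  800, "deptno": 20},
    {"empno": 7499, "ename": "ALLEN", "sal": 1600, "deptno": 30},
    {"empno": 7782, "ename": "CLARK", "sal": 2450, "deptno": 10},
    {"empno": 7521, "ename": "WARD",  "sal": 1250, "deptno": 30},
]

# WHERE DEPTNO IN (20, 30) ORDER BY SAL (ascending):
rows = sorted((r for r in emp if r["deptno"] in (20, 30)),
              key=lambda r: r["sal"])
```

In the real mapping this work is done by the database itself, which is exactly why the override is more efficient than filtering inside the pipeline.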
User Define Join in Source Qualifier Transformation
A user-defined join is possible in the Source Qualifier only when the two sources belong to the same database user account or schema.
Business Logic: Join two tables Emp and Dept to get the Department name and Location
from Dept table for each employee in the Emp table.
Ans: Instead of using the Joiner transformation, we will use the User Defined Join option in the Source Qualifier, because both tables are in the same schema (scott). This will also certainly improve performance.
Procedure
1. Create the source and target definitions.
2. Create a mapping with the name M_Source_join.
3. Drag and drop the source and target definitions onto the Mapping Designer workspace.
4. From the Source Qualifier connect the ports to the target definition.
5. Double click on the Source Qualifier → Properties tab → Set the value for User Defined Join: Emp.Deptno = Dept.Deptno
6. Click Apply → OK
7. Repository menu → Save
8. Repeat the process for creating a session and workflow, and run the workflow.
Mapplet
A mapplet is a reusable metadata object created with business logic using a set of transformations.
Procedure
1. Tools menu → Mapplet Designer
2. Mapplets menu → Create → Give the name of the mapplet
Constraint Based Loading (CBL)
A CBL is specified when you want to load data into snowflake dimensions that have a primary key/foreign key relationship.
Exercise: Using CBL, load the data into dimensions named DEPT and EMP, in which Deptno is the primary key in the DEPT table and a foreign key in the EMP table.
Procedure
1. Create the source and target definitions.
2. Create a mapping with the name M_CBL.
3. Drag and drop the source and target definitions onto the mapping workspace.
4. From SQ_Emp_dept connect the ports to the target definitions.
5. Create a session with the name S_CBL.
6. Double click the session → Config Object tab → Check Constraint Based Load Ordering.
7. Select the Mapping tab → Set the source and each target connection relation type → Apply → OK
8. Repeat the process for creating the workflow and run the workflow.
Scheduling Workflow
A schedule specifies the date and time to run the workflow.
Procedure
1. From the Workflow Manager → Tools menu → Workflow Designer → Create Workflow
2. Select the Scheduler tab → Select the Reusable radio button → Set the values for the scheduler:
Run Options: Run on Integration Service Initialization
Schedule Options: Select Run Every Day
End Options: Select Forever
Set the start date and time.
3. Click Apply → OK
4. Repeat the rest of the process of creating the workflow.
Working with Flat files
Procedure
Step 1: Creation of Source Definition
i)
ii)
iii)
Repository menu → Save
Step 2: Create the target definition in the target database and repeat the process for the target definition.
Step 3: Create a mapping → Give a proper name → From the Source Qualifier connect the ports to the target definition.
Step 4: Create a session → Give a proper name.
Step 5: Double click the session → Mapping tab → Select SQ_Customer from the left pane → Set the attributes as follows:
Attribute              | Value
Source File Directory  | D:\Flatfiles
Source Filename        | Customer.txt
Source Filetype        | Direct
Step 6: Set the target settings for loading as usual, with a relational connection.
Step 7: Repeat the process for creating the workflow and run the workflow.
Direct and Indirect Communication of Integration Service with Source File type
Direct: the Integration Service reads the source file itself, e.g., C:\Flatfile\Customer.txt.
Indirect: the Integration Service reads a list file (e.g., Path: D:\Files\Cust.txt, File Name: Cust.txt) that contains the paths of the actual source files, e.g.:
C:\Flatfiles\Customer.txt
D:\Customer\Cust1.txt
D:\Sales\Customer.txt
Use the Normalizer transformation to convert a single input record from the source into multiple output records. (This process is known as data pivoting.)
Process
1. Create the source and target definitions.
2. Create a mapping with the name M_Pivot.
3. Drop the source and target definitions onto the mapping workspace.
4. Create the transformation type Normalizer.
5. Double click the Normalizer transformation → Normalizer tab → Add new columns:
Column Name | Level | Occurs | Datatype | Prec | Scale
Year        | 0     | 0      | Number   | 4    | 0
Account     | 0     | 0      | String   | 10   | 0
Amount      | 0     | 3      | Number   | 7    | 2
6. Click Apply → OK
7. From the Source Qualifier connect the ports to the Normalizer transformation.
8. From the Normalizer transformation connect the ports to the target definition (GCID → Month).
9. Repository menu → Save
10. Repeat the process for session and workflow, and run the workflow.
GCID: Generated Column ID / Global Column ID
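The data pivoting performed by the Normalizer can be sketched in Python: a column that "occurs" 3 times in one input record becomes 3 output rows, with the GCID numbering each occurrence (sample record hypothetical):

```python
# One input record with an Amount column that occurs 3 times (e.g. quarters).
record = {"year": 2024, "account": "Sales", "amounts": [100.0, 250.5, 75.25]}

# The Normalizer emits one output row per occurrence;
# GCID (Generated Column ID) numbers the occurrences 1..Occurs.
pivoted = [
    {"year": record["year"],
     "account": record["account"],
     "amount": amount,
     "gcid_amount": gcid}
    for gcid, amount in enumerate(record["amounts"], start=1)
]
```

Connecting GCID to a Month (or Quarter) port in the target, as in step 8, turns that occurrence number into a meaningful column.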
Transaction Control Transformation
A type of active transformation that allows you to control transactions through a set of commit and rollback conditions.
Business Logic: Commit only the information of those employees who belong to department number 20.
Ans: Use the Transaction Control transformation and give the condition:
IIF(DEPT = 20, TC_COMMIT_AFTER, TC_ROLLBACK_AFTER)
Process
1. Create the source and target definitions.
2. Create the mapping with the name M_TCT.
3. Drag and drop the source and target definitions onto the Mapping Designer workspace.
4. Create the transformation type Transaction Control.
5. Copy the ports from SQ_Emp to the Transaction Control transformation.
6. Copy the ports from the Transaction Control transformation to the target table.
7. Double click the Transaction Control transformation → Select Transaction Control Condition → Give the condition.
15. From the Expression transformation copy the Empno, Ename, Sal, and New_Record ports to the Filter transformation.
16. Double click the Filter transformation → Properties tab:
Transformation Attribute | Value
Filter Condition         | New_Record
17. From the Filter transformation copy the Empno, Ename, and Sal ports to the Expression transformation.
18. From the Sequence Generator transformation copy the Next_Val port to the Expression transformation.
19. Double click the Expression transformation → Ports tab → For the port Next_Val select only Input Port → Add new ports:
Port Name | Datatype | Precision | Scale | Expression
S_key     | Decimal  |           |       | Next_Val * 100
Version   | Decimal  |           |       |
20. Click Apply → OK
21. From the Expression transformation copy the S_key, Empno, Ename, Sal, and Version ports to an Update Strategy transformation.
22. Double click the Update Strategy transformation → Properties tab:
Transformation Attribute   | Value
Update Strategy Expression | DD_INSERT
23. From the Update Strategy transformation connect the ports to the first target definition.
Defining the Update Record Data Flow
24. Create the transformation types Filter, Expression, and Update Strategy.
25. From the Expression transformation copy the Empno, Ename, Sal, TRG_Empkey, TRG_Version, and Update_Record ports to the Filter transformation.
26. Double click → Properties tab:
Transformation Attribute | Value
Filter Condition         | Update_Record
27. From the Filter transformation copy the Empno, Ename, Sal, TRG_Empkey, and TRG_Version ports to the Expression transformation.
28. Double click the Expression transformation → Ports tab → Uncheck the output ports for the port names (i) TRG_Empkey and (ii) TRG_Version → Add new ports:
Port Name | Datatype | Precision | Scale | Expression
Dimkey    | Decimal  |           |       | TRG_Empkey + 1
Version   | Decimal  |           |       | TRG_Version + 1
Transformation Attribute   | Value
Update Strategy Expression | DD_INSERT
33. From the Update Strategy transformation connect the ports to the second target definition.
34. Repository menu → Save
35. Repeat the process for creating the session and workflow, and execute the workflow.
Link Condition
In sequential batch processing, sessions can be executed sequentially and conditionally using link conditions.
Define the link condition using a predefined variable called PrevTaskStatus.
Process
1. Create three mappings.
2. Create three sessions: S10, S20, S30.
3. Create a workflow and arrange the sessions in a sequential fashion.
4. Double click on the link between sessions S10 and S20 → Predefined tab → Double click on PrevTaskStatus → Type SUCCEEDED:
$S10.PrevTaskStatus = SUCCEEDED
5. Repeat the process for S20 and S30.
6. Repository menu → Save
7. Repeat the process for creating the workflow and execute the workflow.
Worklet
Process
1. Reusable Worklet
1. Tools menu → Worklet Designer
2. Worklets menu → Create → Give a proper name
3. Drag and drop the sessions in parallel fashion and link the sessions.
4. Create a workflow → Drop the worklet → Link the workflow and worklet.
5. Run the workflow.
2. Non-Reusable Worklet
1. Tools menu → Workflow Designer → Workflows menu → Create → Enter the name → OK
2.
3.
4. Select the non-reusable worklet → Right click → Open Worklet → Drag and drop the sessions in parallel fashion.
5.
6.
Control Flow
DFD: WKF → S10 → CMD Task → Event Wait (Success) → S20; and S10 → CMD Task → Event Wait (Fail) → S20
Prerequisites
1. Create three sessions.
2. Create two folders in drive C:\ named Success and Failed.
3. Create a blank text file named test.txt.
Procedure
1. Tools menu → Task Developer → Tasks menu → Create
2. Select task type Command → Enter the name → Create → Done
3. Double click the Cmd task → Commands tab → Add a new command:
Name       | Command
On_Success | copy C:\test.txt C:\Success
4. Click Apply → OK
5. Select task type Command → Enter the name → Create → Done
6. Double click the Cmd task → Commands tab → Add a new command:
Name      | Command
On_Failed | copy C:\test.txt C:\Failed
7. Double click on session S10 → Select the Components tab:
Task                         | Type     | Value
Post-Session Success Command | Reusable | On_Success
Post-Session Failure Command | Reusable | On_Failed
8. Click Apply → OK
9. Tasks menu → Create → Event Wait → Enter the name → Create → Done
10. Double click the Success Event Wait task → Select the Events tab → Select the predefined event → Enter the name of the file to watch:
C:\Success\test.txt
11. Select the Properties tab → Select Delete Filewatch File.
12. Click Apply → OK
13. Make links between the tasks → Define the link conditions on the links between session S10 and the Command tasks for success and failure as follows:
$S10.PrevTaskStatus = SUCCEEDED
$S10.PrevTaskStatus = FAILED
14. Repository menu → Save → Execute the workflow.
User-Defined Events
Event Tasks: Event Raise, Event Wait
Pre-Defined Events [File Watch Events]
DFD: WKF → S10 and S30 (in parallel) → Event Raise [event to be raised: "S10 S30 Complete"] → Event Wait → S20 → S40
Procedure
1. Create four sessions.
2. Tools menu → Workflow Designer → Workflows menu → Create
3. Enter the workflow name → Events tab → Create new event → Enter the event name → Click OK
4. Create the workflow as shown in the figure above → Tasks menu → Create → Select the task types Event Raise and Event Wait → Create → Done
5. Make the links between the tasks.
6. Double click the Event Raise task → Select the Properties tab:
Attribute          | Value
User Defined Event | S10 S30 Complete
7. Double click the Event Wait task → Events tab
8. Select the option User Defined → Click Browse Events to choose an event → Select the event → Click OK
9. Repository menu → Save
10. Execute the workflow.
Workflow with Decision Task
With a decision task you can enter a condition that determines the execution of the workflow, similar to a link condition.
Use a decision task instead of multiple link conditions in the workflow.
DFD: WKF → S10, S20, S30 → Decision Task [decision condition] → Command Task / Event Wait Task → S40
Procedure
1. Create four sessions.
2. Tools menu → Workflow Designer → Workflows menu → Create
3. Enter the workflow name → Events tab → Create new event → Enter the event name → Click OK
DFD: S10 [$S10.Status = SUCCEEDED] and S20 [$S20.Status = SUCCEEDED] → Command Task (within WKF)
Procedure
1. Design the workflow as shown in the figure above.
2. Double click the Command task → General tab → Treat Input Links As: AND / OR
3. Commands tab → Give the command.
4. Repository menu → Save
Assignment Task
You can assign a value to a user-defined workflow variable with the Assignment task.
To use an Assignment task in a workflow, first create and add the Assignment task to the workflow, then configure it to assign a value or expression to a user-defined variable.
** Weekly and Daily Loading
Procedure
1. Create three sessions.
2. From the Tools menu → Select Workflow Designer → From the Workflows menu → Select Create
3. Enter the workflow name → Select the Variables tab → From the toolbar → Click Add New Variable:
Name       | Datatype | Persistent
$$WKF_RUNS | Integer  |
Name | Datatype | Precision | Scale
Arg 1 | String  | 12        | 0
5. To define the expression, click Launch Editor:
LTRIM(RTRIM(ARG1))
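The expression trims trailing spaces first (RTRIM), then leading spaces (LTRIM). A sketch in Python (note one assumption: Informatica's LTRIM/RTRIM with no second argument trim spaces, while Python's strip family trims all whitespace, which is close enough for this illustration):

```python
def ltrim_rtrim(arg1: str) -> str:
    # Equivalent of LTRIM(RTRIM(ARG1)): remove trailing spaces, then leading spaces.
    return arg1.rstrip().lstrip()
```

Applied to a padded string like "   Informatica   ", the result is the bare value with the surrounding spaces removed.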