Scenario 1:
We have a source table containing 3 columns: Col1, Col2 and Col3. There is only 1 row in the table, as follows:

Col1  Col2  Col3
a     b     c

There is a target table containing only 1 column, Col. Design a mapping so that the target table contains 3 rows, as follows:
Col
a
b
c
Do this without using a Normalizer transformation.
Solution:
Create 3 expression transformations exp_1, exp_2 and exp_3, with 1 port each. Connect Col1 from the Source Qualifier to the port in exp_1, Col2 from the Source Qualifier to the port in exp_2, and Col3 from the Source Qualifier to the port in exp_3. Make 3 instances of the target. Connect the port from exp_1 to target_1, the port from exp_2 to target_2, and the port from exp_3 to target_3.
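The fan-out above can be sketched outside Informatica as well. This is only an illustration of the reshaping the three expression transformations perform; the field names are stand-ins for the ports:

```python
def columns_to_rows(row):
    """Turn one row {col1, col2, col3} into three single-column rows,
    mirroring the exp_1/exp_2/exp_3 fan-out to three target instances."""
    return [{"col": row["col1"]}, {"col": row["col2"]}, {"col": row["col3"]}]

source_row = {"col1": "a", "col2": "b", "col3": "c"}
target_rows = columns_to_rows(source_row)
```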
Scenario:
There is a source table with columns ID, Name and Phone No. Split it into two target tables, each keeping the key column ID:

ID  Name  Phone No
10  AAA   123
20  BBB   234
30  CCC   434
40  DDD   343
50  EEE   442
Target Table 1
ID Name
10 AAA
20 BBB
30 CCC
40 DDD
50 EEE
Target Table 2
ID Phone No
10 123
20 234
30 434
40 343
50 442
Solution:
Step 1: Source qualifier: get the source table to the mapping area. See image below.
Step 2: Drag all the ports from the Source Qualifier (from the previous step) to an Aggregator transformation and group by the key column. Since we have to split the columns into two different tables, with the key column in each, we are going to use two expression transformations, each taking the key column and one non-key column. Connect the Aggregator transformation to each of the expression transformations as follows.
Step 3: We need another set of Aggregators, one associated with each of the expression transformations from the previous step.
Step 4: In the final step connect the aggregators with the two target tables as follows.
Scenario:
A source table has columns COL1, COL2 and COL3 and contains duplicate rows. Separate the original records from the duplicate records.
Solution:
Step 1: Drag the source to mapping and connect it to an aggregator transformation.
Step 2: In the Aggregator transformation, group by the key column and add a new port, call it count_rec, to count the key column.
Step 3: Connect a Router to the Aggregator from the previous step. In the Router make two groups, one named "original" and another named "duplicate".
In "original" write count_rec=1 and in "duplicate" write count_rec>1.
The picture below depicts the group names and the filter conditions.
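The count_rec logic can be sketched in Python. This is only an illustration of what the Aggregator (group by plus count) and the two Router groups do; the mapping itself lives in the Designer:

```python
from collections import Counter

def split_original_duplicate(rows, key):
    """Count rows per key (the Aggregator's count_rec), keep one row per key
    (the group-by), then route: count_rec = 1 -> original, > 1 -> duplicate."""
    counts = Counter(r[key] for r in rows)
    grouped = {r[key]: r for r in rows}          # one row per key, like the group-by
    original = [grouped[k] for k, c in counts.items() if c == 1]
    duplicate = [grouped[k] for k, c in counts.items() if c > 1]
    return original, duplicate

original, duplicate = split_original_duplicate(
    [{"COL1": "a"}, {"COL1": "b"}, {"COL1": "a"}], key="COL1")
```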
Scenario:
A flat file contains a header row and a footer row, with data rows in between:

Header here
col1  col2  col3  col4
data1 data2 data3 data4
data5 data6 data7 data8
data1 data2 data3 data4
data1 data2 data3 data4
footer

We just have to remove the footer from the file.
Solution:
After creating the mapping, go to the workflow and schedule it.
Step 4: Choose the advanced option. Set "Number of initial rows to skip" to 1 (it can be more, as per requirement).
(screenshot: advanced properties)
Scenario:
Send the first half of the records of a table to one target and the second half to another.

Solution:
1. Drag and drop the source to the mapping.
2. In the Source Qualifier, override the SQL query for the first half:

select * from emp where rownum <= ( select count(*)/2 from emp )

3. Then connect to the target. Now you are ready to run the mapping and see it in action.

For the second half of the records, use this query instead:

select * from emp minus select * from emp where rownum <= ( select count(*)/2 from emp )

Then connect to the target and run the mapping to see the results.
Scenario 9: How to send alternate records to targets?
Or: sending odd-numbered records to one target and even-numbered records to another target.
Solution:
Step 1: Drag the source and connect it to an expression transformation.
Step 2: Add the next value of a sequence generator to the expression transformation.
Step 3: In the expression transformation make two ports, one "odd" and another "even", and write the expressions like below.
(screenshots: expression and router properties)
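The "odd"/"even" ports compute MOD(NEXTVAL, 2) and the router splits on them. As a rough Python sketch of that logic (enumerate stands in for the sequence generator's NEXTVAL):

```python
def route_odd_even(rows):
    """Pair each row with NEXTVAL (1, 2, 3, ...), then route like the
    router groups: MOD(NEXTVAL,2)=1 -> odd, MOD(NEXTVAL,2)=0 -> even."""
    odd, even = [], []
    for nextval, row in enumerate(rows, start=1):
        if nextval % 2 == 1:
            odd.append(row)
        else:
            even.append(row)
    return odd, even

odd, even = route_odd_even(["r1", "r2", "r3", "r4", "r5"])
```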
---------------------------------------------------------------------
Scenario:
Get the top salaried records of the source table (Col1, Col2, Col3) to the target table.
Solution:
Step 1: Bring the source to mapping.
(screenshots: Rank properties, Router transformation, Router to target)
2. Arrange the salary in descending order in a Sorter as follows, and send the records to an expression.
(screenshot: Sorter properties)
3. Add the next value of a sequence generator to the expression (start the value from 1 in the sequence generator).
4. Connect the expression transformation to a filter or router. In the properties, set the condition as follows.
5. Finally connect to the target.
1. Put the source to the mapping and connect the ports to an Aggregator transformation.
2. In each port write the expression like in the picture below.
3. Then send it to an expression transformation.
4. In the expression make four output ports (dept10, dept20, dept30, dept40) to validate the dept no, and provide the expression like in the picture below.
5. Then connect to a Router transformation. Create the group and fill in the condition like below.
6. Finally connect to the target table, having one column, that is dept no.
1. Add an expression transformation after the Source Qualifier. Connect the next value port of a sequence generator to the expression transformation.
2. In the expression create a new port (validate) and write the expression like in the picture below.
3. Connect a Filter transformation to the expression and write the condition in the properties like in the picture below.
4. Finally connect to the target.
1. Drag the source and connect it to an expression. Connect the next value port of a sequence generator to the expression.
2. Send all the ports to a Router and make three groups as below:
Group1
mod(NEXTVAL,30) >= 1 and mod(NEXTVAL,30) <= 10
Group2
mod(NEXTVAL,30) >= 11 and mod(NEXTVAL,30) <= 20
Group3
mod(NEXTVAL,30) >= 21 and mod(NEXTVAL,30) <= 29 or mod(NEXTVAL,30) = 0
3. Finally connect Group1 to T1, Group2 to T2 and Group3 to T3.
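The three MOD(NEXTVAL,30) conditions send rows to the targets in blocks of 10. A Python sketch of the same routing (illustrative only; NEXTVAL is simulated with enumerate):

```python
def route_in_blocks(rows, block=10, groups=3):
    """MOD(NEXTVAL, 30) routing: rows 1-10 -> T1, 11-20 -> T2,
    21-30 -> T3 (MOD = 0 falls in the last group), then repeat."""
    targets = [[] for _ in range(groups)]
    for nextval, row in enumerate(rows, start=1):
        m = nextval % (block * groups)
        if m == 0:
            m = block * groups
        targets[(m - 1) // block].append(row)
    return targets

t1, t2, t3 = route_in_blocks(list(range(1, 36)))
```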
Scenario: You have two columns in source table T1, in which col2 may contain duplicate values. All the duplicate values in col2 will be transformed into comma-separated values in column col2 of target table T2.
Source Table: T1
Col1  Col2

Target Table: T2
col1  col2
      x,m
      y,n
Solution:
1. We have to use the following transformations as below. First connect a Sorter transformation to the source and make col1 the key, with ascending order. After that connect it to an expression transformation.
2. In the Expression make four new ports and give them names as in the picture below.
3. In concat_val write the expression like described below and send it to an Aggregator.
4. In the Aggregator group by col1 and send it to the target.
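The sort-then-concatenate logic can be sketched in Python. This only illustrates what the Sorter, the concat_val expression and the Aggregator (group by col1, keep the last row) achieve together:

```python
def concat_duplicates(rows):
    """Stable-sort by col1, then build a running comma-separated string per
    col1 value and keep the last (fully concatenated) row per group."""
    out = {}
    for col1, col2 in sorted(rows, key=lambda r: r[0]):
        out[col1] = out[col1] + "," + col2 if col1 in out else col2
    return [{"col1": k, "col2": v} for k, v in out.items()]

result = concat_duplicates([(1, "x"), (2, "y"), (1, "m"), (2, "n")])
```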
SOURCE TABLE
id  Sal
    200
    300
    500
    560

TARGET TABLE
Id  Sal
    200
    500
    1000
    1560
1. Pull the source to the mapping and then connect it to an expression.
2. In the expression add one column, make it an output port (sal1), and make the sal port input only.
We will make use of a function named CUME() to solve our problem, rather than using any complex mapping. Write the expression in sal1 as cume(sal) and send the output rows to the target.
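CUME() is simply a running total over the incoming rows. A minimal Python sketch of what the sal1 port produces:

```python
def cume(values):
    """Running total of the input port, like Informatica's CUME()."""
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

sal1 = cume([200, 300, 500, 560])
```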
Solution:
1. Drag your target file to the target designer and add a column as shown in the picture. It's not a normal column: click on the "add file name to the table" property. (I have given it a red mark there.)
2. Then drag your source to the mapping area and connect it to an expression transformation.
3. In the expression transformation add a new port of string data type and make it an output port.
4. In that output port write the condition like described below and then map it to the filename port of the target. Also send the other ports to the target. Finally run the session. You will find two files, one with the sys date and the other a .out file, which you can delete.
1. In the repository go to the Tools menu, then Queries. The Query Browser dialog box will appear. Then click on the New button.
2. In the Query Editor, choose the folder name and object type as shown in the picture.
3. The Query Results window will appear. Select a single mapping (by selecting one) or all mappings (by pressing Ctrl+A) and go to "Tools", then the "Validate" option, to validate them.
1. Go to the mapping, then the Parameters and Variables tab in the Informatica designer. Give the name as $$v1, choose type parameter (you can also choose variable), data type as integer, and give the initial value as 20.
2. Create a mapping as shown in the figure (I have considered a simple scenario where a particular department id will be filtered to the target).
3. In the filter set deptno=$$v1 (that means only dept no 20 records will go to the target).
A mapping parameter's value can't change throughout the session, but a variable's can. We can change a variable's value by using a text file. I'll show it in the next scenario.
Scenario:
Remove the currency symbol ($) from the SAL column and load the numeric value to the target.

Source
EMPNO  ENAME  JOB       MGR   HIREDATE   SAL    DEPTNO
7369   SMITH  CLERK     7902  17-DEC-80  $800   20
7499   ALLEN  SALESMAN  7698  20-FEB-81  $1600  30

Target
EMPNO  ENAME  JOB       MGR   HIREDATE   SAL   DEPTNO
7369   SMITH  CLERK     7902  17-DEC-80  800   20
7499   ALLEN  SALESMAN  7698  20-FEB-81  1600  30
1. Drag the source to the mapping area and connect each port to an expression transformation.
2. In the expression transformation add a new column sal1, make it an output port, and make sal input only, as shown in the picture.
3. In sal1 write the expression to strip the currency symbol from sal.
4. Finally connect the ports to the target.
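The sal1 port just drops the leading currency symbol when one is present. A Python sketch of that expression logic (illustrative only; in the mapping it would be a SUBSTR/TO_DECIMAL-style expression):

```python
def strip_currency(sal):
    """Drop a leading non-digit currency symbol and return the number."""
    s = str(sal)
    return int(s[1:]) if not s[0].isdigit() else int(s)

sal1 = [strip_currency(s) for s in ["$800", "$1600", "300"]]
```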
Currency converter
Q22: Suppose that a source contains a column which holds the salary information prefixed with the currency code, for example:
EMPNO  ENAME  JOB       MGR   HIREDATE   SAL   DEPTNO
7369   SMITH  CLERK     7902  17-DEC-80  $300  20
7499   ALLEN  SALESMAN  7698  20-FEB-81  1600  30
7521   WARD   SALESMAN  7698  22-FEB-81  8500  30
In the target the different currencies will evaluate to a single currency value, for example convert all to Rupees.
1. First we should consider that there are different types of currency, like pound, dollar, yen etc. So it's a good idea to use mapping parameters or variables. Go to Mapping => Mapping Parameters and Variables, then create three parameters (for this example $$DOLLAR, $$POUND and $$YEN).
2. Then drag the source to the mapping area and connect it to an expression transformation.
3. In the expression create an output port sal1 and make sal input only, as below.
4. $$DOLLAR, $$POUND and $$YEN are mapping parameters. You can multiply by the price in rupees directly, for example the dollar price in rupees, i.e. 48.
5. Connect the required output ports from the expression directly to the target, and run the session.
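The conversion is one multiplication per currency prefix. As a Python sketch of the sal1 expression (the rates here are made-up examples standing in for the $$DOLLAR/$$POUND/$$YEN parameter values):

```python
# Hypothetical rupee rates standing in for the mapping parameters.
RATES = {"$": 48, "£": 80, "¥": 1}

def to_rupees(sal):
    """Multiply by the rate matching the currency prefix; plain numbers pass through."""
    s = str(sal)
    if s[0] in RATES:
        return int(s[1:]) * RATES[s[0]]
    return int(s)

converted = to_rupees("$300")
```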
2. Drag a sequence generator transformation, set its properties like this, and connect the next value port to the expression.
3. Drag all output ports of the expression to a Router. In the Router make three groups and give the conditions like this.
4. Finally connect the groups to the targets.
Source
EMPNO  ENAME  JOB       MGR   HIREDATE   SAL    DEPTNO
7369   SMITH  CLERK     7902  17-DEC-80  $800   20
7499   ALLEN  SALESMAN  7698  20-FEB-81  $1600  30

Target
EMPNO  ENAME  JOB       MGR   HIREDATE   SAL      DEPTNO
7369   SMITH  CLERK     7902  17-DEC-80  Rs.800   20
7499   ALLEN  SALESMAN  7698  20-FEB-81  Rs.1600  30
1. Drag the source to the mapping and connect it to an expression transformation.
2. In the expression make an output port sal1 and make sal an input port only.
ename  total_vowels_count
Scott
Ward
1. Drag the source to the mapping and connect it to an expression transformation.
2. In the Expression add 6 columns like in the picture below. You could make it two columns (one for all the vowels and one for the vowel count); for better understanding I have added 6 columns, 5 for the individual vowels and one for the vowel count.
The way I achieved it: for each of the vowels in ename, I replaced it with null, and in the total vowel count port I subtract the length of each vowel-stripped port from the ename length, which gives me the individual count of that vowel; after adding these up for all vowels I have the total number of vowels present.
For the variable ports A, E, I, O and U write:
For A write REPLACECHR(0,ENAME,'a',NULL)
For E write REPLACECHR(0,ENAME,'e',NULL)
For I write REPLACECHR(0,ENAME,'i',NULL)
For O write REPLACECHR(0,ENAME,'o',NULL)
For U write REPLACECHR(0,ENAME,'u',NULL)
And for the o/p column total_vowels_count write the expression like this:
(length(ENAME)-length(A)) + (length(ENAME)-length(E)) + (length(ENAME)-length(I)) + (length(ENAME)-length(O)) + (length(ENAME)-length(U))
3. Send the ports to the target.
This could have been done a different or better way; if you have a suggestion, don't forget to comment.
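The REPLACECHR trick translates directly: removing a vowel and measuring the length difference counts its occurrences. A Python sketch of the total_vowels_count expression:

```python
def total_vowels(ename):
    """For each vowel, strip it and count the length difference;
    the sum over all vowels is the total vowel count."""
    return sum(len(ename) - len(ename.replace(v, "")) for v in "aeiou")

counts = {name: total_vowels(name.lower()) for name in ["Scott", "Ward"]}
```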
1. Drag the source to the mapping and connect it to an expression transformation.
2. In the expression make empno an input, and create another port empno1 as an output port with date datatype. In empno1 write the condition like this, and finally send it to the target.
EMPNO   HIRE_DATE (numeric)
------  -----------
        20101111
        20090909

Target
EMPNO   HIRE_DATE (date)
------  -----------
        11/11/2010
        09/09/2009
1. Drag the source to the mapping and connect it to an expression transformation.
2. In the expression make hire_date input only and make another port hire_date1 as an o/p port with date data type.
3. In hire_date1 write the conversion expression.
4. Finally connect to the target.
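The numeric 20101111 is just YYYYMMDD; the hire_date1 port parses it and re-renders it. A Python sketch of that conversion (in the mapping it would be a TO_DATE(TO_CHAR(...), 'YYYYMMDD')-style expression):

```python
from datetime import datetime

def numeric_to_date(hire_date):
    """Parse a YYYYMMDD number and render it as MM/DD/YYYY."""
    return datetime.strptime(str(hire_date), "%Y%m%d").strftime("%m/%d/%Y")

converted = [numeric_to_date(d) for d in [20101111, 20090909]]
```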
EMPNO   HIRE_DATE
------  -----------
        12-11-87
        02-04-88;
        02-2323

empno is a number and hire_date is in string format. We have to check the hire_date column: if it is in date format like 'dd-mm-yy', then convert it to date, in the format mm/dd/yy, and send it to the target; else send null.
Output
EMPNO   HIRE_DATE
------  ---------
        11-DEC-87
        null
        null
2. In the expression create another output port hire_date1 and make it date data type, as shown in the picture.
3. In hire_date1 write the validation condition.
4. Finally connect to the target.
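The hire_date1 port is an IS_DATE-style guard: parse as dd-mm-yy if possible, otherwise emit NULL. A Python sketch of that check:

```python
from datetime import datetime

def validate_hire_date(s):
    """If the string parses as dd-mm-yy, reformat to mm/dd/yy;
    otherwise return None (NULL in the mapping)."""
    try:
        return datetime.strptime(s, "%d-%m-%y").strftime("%m/%d/%y")
    except ValueError:
        return None

out = [validate_hire_date(s) for s in ["12-11-87", "02-04-88;", "02-2323"]]
```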
ORDER_DATE  DELIVERY_DATE
----------  -------------
11-JAN-83   13-JAN-83
04-FEB-83   07-FEB-83
08-DEC-81   09-DEC-81

We have to calculate the difference between order_date and delivery_date in hours and send it to the target.
The o/p will be:

ORDER_NO  ORDER_DATE  DELIVERY_DATE  DIFF_IN_HH
--------  ----------  -------------  ----------
          11-JAN-83   13-JAN-83      48
          04-FEB-83   07-FEB-83      72
          08-DEC-81   09-DEC-81      24
2. In the expression create one output port diff and make it integer type.
3. In that port write the condition like this and send it to the target.
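The diff port is a DATE_DIFF in hours. A Python sketch of the same computation over the sample rows:

```python
from datetime import datetime

def diff_in_hours(order_date, delivery_date, fmt="%d-%b-%y"):
    """Hours between two dd-MON-yy dates, like DATE_DIFF(..., 'HH')."""
    d1 = datetime.strptime(order_date, fmt)
    d2 = datetime.strptime(delivery_date, fmt)
    return int((d2 - d1).total_seconds() // 3600)

diffs = [diff_in_hours(o, d) for o, d in
         [("11-Jan-83", "13-Jan-83"), ("04-Feb-83", "07-Feb-83"),
          ("08-Dec-81", "09-Dec-81")]]
```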
ORDER_DATE  DELIVERY_DATE
----------  -------------
11-JAN-83   13-JAN-83
04-FEB-83   07-FEB-83
08-DEC-81   09-DEC-81

Target
ORDER_NO  ORDER_DATE  DELIVERY_DATE
--------  ----------  -------------
          11-JAN-83   13-JAN-83
          04-FEB-83   07-FEB-83
E_NO    YEAR       DAYNO
------  ---------  -----
        01-JAN-07  301
        01-JAN-08  200

The Year column is a date and dayno is a numeric that represents a day (as in 365 for 31-Dec of that year). Convert the dayno to the corresponding year's month and date and then send it to the target.
Target
E_NO    YEAR_MONTH_DAY
------  --------------
        29-OCT-07
        19-JUL-08
2. In the expression create one o/p port c_year_mm_dd, make it date type, and in that port write the condition like this.
3. Finally connect to the target.
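Matching the sample output, the port adds DAYNO days to the date in the YEAR column (an ADD_TO_DATE(YEAR, 'DD', DAYNO)-style expression). A Python sketch:

```python
from datetime import datetime, timedelta

def year_plus_dayno(year_date, dayno, fmt="%d-%b-%y"):
    """Add DAYNO days to the date held in the YEAR column."""
    return (datetime.strptime(year_date, fmt) + timedelta(days=dayno)).strftime(fmt)

out = [year_plus_dayno("01-Jan-07", 301), year_plus_dayno("01-Jan-08", 200)]
```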
E_NO     JOIN_DATE
-------  ---------
         07-JUL-11
         05-JUL-11
         05-MAY-11

If the current month is July 2011, then the target will be like this:
Target
E_NO     JOIN_DATE
-------  ---------
         07-JUL-11
         05-JUL-11
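The filter keeps rows whose JOIN_DATE falls in the current month. A Python sketch of that condition (the "today" argument stands in for SYSDATE so the example is deterministic):

```python
from datetime import datetime

def joined_this_month(join_dates, today):
    """Keep dates whose month and year match 'today' (SYSDATE in the mapping)."""
    keep = []
    for join_date in join_dates:
        d = datetime.strptime(join_date, "%d-%b-%y")
        if (d.year, d.month) == (today.year, today.month):
            keep.append(join_date)
    return keep

out = joined_this_month(["07-Jul-11", "05-Jul-11", "05-May-11"], datetime(2011, 7, 15))
```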
empno  ename
1      Amit Rao
       Chitra Dash

In the target we have to separate the ename column into firstname and lastname, like this:

empno  firstname  lastname
       Amit       Rao
       Chitra     Dash
1. Drag the source to the mapping area and connect it with an expression transformation as shown below.
2. In the expression transformation create two output ports, one f_name and the other l_name.
3. In f_name and l_name write the substring expressions that split ename at the space.
4. Finally connect to the target.
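The f_name/l_name ports split on the first space (SUBSTR/INSTR-style expressions in the mapping). A Python sketch:

```python
def split_name(ename):
    """f_name is everything before the first space, l_name everything after."""
    first, _, last = ename.partition(" ")
    return first, last

f_name, l_name = split_name("Amit Rao")
```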
ename
Sekher Prasad

2. In the Expression create two ports, one name1 (as a variable port) and Middle_Name (an o/p port).
SCD TYPE 1
TYPE 1 MAPPING
The SCD Type 1 methodology overwrites old data with new data, and therefore does not need to track historical data.
1. Connect a Lookup to the source. In the Lookup fetch the data from the target table, and send only the CUSTOMER_ID port from the source to the Lookup.
2. Send the rest of the columns from the source to one Router transformation.
3. For new records we have to generate a new customer_id. For that, take a sequence generator and connect its next value column to an expression. Connect the New_rec group from the Router to target1 (bring two instances of the target to the mapping, one for new records and the other for old records). Then connect next_val from the expression to the customer_id column of the target.
4. Bring the Change_rec group of the Router to one Update Strategy and give the condition like this.
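The Type 1 flow (lookup on the key, overwrite matches, insert misses with a generated surrogate key) can be sketched in Python. The PM_KEY field name is a made-up stand-in for the surrogate key the sequence generator produces:

```python
def scd_type1_merge(target, source, key="CUSTOMER_ID"):
    """SCD Type 1: rows whose key is found in the target overwrite it
    (dd_update); misses are inserted with a new surrogate key (dd_insert)."""
    next_id = max((row["PM_KEY"] for row in target.values()), default=0) + 1
    for row in source:
        k = row[key]
        if k in target:                 # change_rec -> update strategy
            target[k].update(row)
        else:                           # new_rec -> sequence generator + insert
            target[k] = dict(row, PM_KEY=next_id)
            next_id += 1
    return target

tgt = {1: {"CUSTOMER_ID": 1, "NAME": "old", "PM_KEY": 1}}
scd_type1_merge(tgt, [{"CUSTOMER_ID": 1, "NAME": "new"}, {"CUSTOMER_ID": 2, "NAME": "x"}])
```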
SCD Type2
In a Type 2 Slowly Changing Dimension, if a new record is added to the existing table with new information, then both the original and the new record are kept, the new record having its own primary key.

1. All the procedure is the same as described in the SCD TYPE 1 mapping. The only difference is: from the Router, new_rec will go to one Update Strategy where the condition will be given as dd_insert, and a new_pm (primary key) and version_no will be added before sending to the target.
2. Old_rec will also go to an Update Strategy, the condition will be given as dd_insert, and then it will be sent to the target.
SCD Type 3
In SCD Type 3, two columns are added to track a single attribute. It stores one-time historical data along with current data.

1. Up to the Router transformation, all the procedure is the same as described in Scenario_36 (SCD Type 1).
2. The only difference is: after the Router, bring the new_rec group to an Update Strategy, give the condition dd_insert, and send to the target. Create one new primary key and send it to the target.
3. For old_rec, send to an Update Strategy, set the condition dd_insert, and send to the target.
6.
Unit Testing
In unit testing, what we need to do is something like below:
- Database performance
After unit testing we generally prepare one document, as described below:
UNIT TEST CASE FOR LOAN_MASTER
FUNCTIONALITY_ID
FIELD_NAME