Scenario 2: Split the non-key columns into separate tables, with the key column in both. How do we split the data of a source table column-wise with respect to the primary key? See the source and target tables below.

Source table: ID is the key column; Name and Phone No are non-key columns.

ID  Name  Phone No
10  AAA   123
20  BBB   234
30  CCC   434
40  DDD   343
50  EEE   442

Target Table 1

ID  Name
10  AAA
20  BBB
30  CCC
40  DDD
50  EEE

Target Table 2

ID  Phone No
10  123
20  234
30  434
40  343
50  442

Solution:

Step 1: Source Qualifier: get the source table into the mapping area. See image below.
Step 2: Drag all the ports from the Source Qualifier (from the previous step) to an Aggregator transformation and group by the key column. Since we have to split the columns into two different tables with the key column in each, we are going to use two Expression transformations; each will take the key column and one non-key column. Connect the Aggregator transformation to each of the Expression transformations as follows.
Step 3: We need another Aggregator to be associated with each of the Expression transformations from the previous step.

Step 4: In the final step, connect the Aggregators to the two target tables as follows.
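Outside Informatica, the column-wise split above can be sketched in a few lines of plain Python. This is a minimal illustration of the row logic, not the mapping itself; the table is represented as a list of dicts with the column names from the scenario.

```python
# Split a source table column-wise: the key column (ID) goes to both
# targets, each paired with one non-key column.
source = [
    {"ID": 10, "Name": "AAA", "Phone No": 123},
    {"ID": 20, "Name": "BBB", "Phone No": 234},
    {"ID": 30, "Name": "CCC", "Phone No": 434},
    {"ID": 40, "Name": "DDD", "Phone No": 343},
    {"ID": 50, "Name": "EEE", "Phone No": 442},
]

# Target 1 takes the key column plus Name; Target 2 takes the key
# column plus Phone No.
target1 = [{"ID": r["ID"], "Name": r["Name"]} for r in source]
target2 = [{"ID": r["ID"], "Phone No": r["Phone No"]} for r in source]

print(target1[0])  # {'ID': 10, 'Name': 'AAA'}
print(target2[0])  # {'ID': 10, 'Phone No': 123}
```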
Target Table 2: table containing all the duplicate rows

COL1  COL2  COL3
a     b     c
a     b     c

Solution:
Step 2: In the Aggregator transformation, group by the key column and add a new port, call it count_rec, to count the key column.

Step 3: Connect a Router to the Aggregator from the previous step. In the Router, make two groups: one named "original" and another named "duplicate". In "original" write count_rec=1 and in "duplicate" write count_rec>1.
The picture below depicts the group names and the filter conditions.
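The count-and-route idea behind the Aggregator/Router pair can be sketched in Python. This is an illustration of the logic only; the sample rows are made up, and the "duplicate" output keeps every copy of a duplicated row, as in the target table above.

```python
from collections import Counter

# Sample source rows (illustrative data).
rows = [("a", "b", "c"), ("a", "b", "c"), ("x", "y", "z")]

# Aggregator step: count occurrences of each full row (the group key).
count_rec = Counter(rows)

# Router step: count_rec = 1 -> "original" group,
#              count_rec > 1 -> "duplicate" group (all copies kept).
original = [r for r, n in count_rec.items() if n == 1]
duplicate = [r for r, n in count_rec.items() if n > 1 for _ in range(n)]

print(original)   # [('x', 'y', 'z')]
print(duplicate)  # [('a', 'b', 'c'), ('a', 'b', 'c')]
```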
Step 2: After that, connect a Filter or Router transformation.

Step 3: In the Filter, write the condition like in the picture.

Scenario 6: Get the first top five salaries without using the Rank transformation.
flat file properties

Step 4: Choose the advanced option. Set "Number of initial rows to skip" to 1 (it can be more, as per requirement).
adv properties
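One way to get the top five salaries without a Rank transformation is to sort descending and keep the first five rows, which is essentially what the Sorter-plus-filter steps above do. A minimal Python sketch of that logic, with made-up salary data:

```python
# Sample salary column (illustrative data).
salaries = [4000, 2500, 6000, 3200, 5100, 7300, 1800]

# Sorter transformation: order the rows by salary, descending.
sorted_desc = sorted(salaries, reverse=True)

# Filter step: keep only the first five rows of the sorted stream.
top_five = sorted_desc[:5]
print(top_five)  # [7300, 6000, 5100, 4000, 3200]
```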
2. Make 4 output ports in the Aggregator as in the picture above: count_d10, count_d20, count_d30, count_d40. For each port, write an expression like in the picture below.
3. Then send it to an Expression transformation.
4. In the Expression, make four output ports (dept10, dept20, dept30, dept40) to validate the dept no, and provide the expressions like in the picture below.
5. Then connect to a Router transformation, create a group, and fill in the condition like below.
6. Finally, connect to the target table having one column, that is, dept no.
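The count-per-department-then-validate idea above can be sketched in Python. This is only an illustration of the logic; the dept numbers 10/20/30/40 come from the port names in the steps, while the sample rows are made up.

```python
# Source rows: the dept number of each employee (illustrative data).
deptnos = [10, 10, 20, 40, 10, 20]

# Aggregator step: one count port per department of interest
# (count_d10 .. count_d40).
counts = {d: deptnos.count(d) for d in (10, 20, 30, 40)}

# Expression/Router step: a dept no reaches the target only if at
# least one source row carried it.
target = [d for d, n in counts.items() if n > 0]
print(target)  # [10, 20, 40]
```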
src to target mapping

Step 2: In the Source Qualifier, go to the properties and write the SQL query:

select * from emp where rownum <= (select count(*)/2 from emp)
src qualifier sql query

Step 3: Then connect to the target. Now you are ready to run the mapping and see it in action.
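The ROWNUM query above simply keeps the first half of the incoming rows. A minimal Python sketch of the same selection, with made-up employee rows:

```python
# Sample emp rows (illustrative data, even row count for simplicity).
emp = [("SMITH",), ("ALLEN",), ("WARD",), ("JONES",), ("MARTIN",), ("BLAKE",)]

# Equivalent of:
#   select * from emp where rownum <= (select count(*)/2 from emp)
first_half = emp[: len(emp) // 2]
print(first_half)  # the first 3 of 6 rows
```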
2. In the Expression, create a new port (validate) and write the expression like in the picture below.
3. Connect a Filter transformation to the Expression and write the condition in the properties like in the picture below.
4. Finally, connect to the target.
Step 2: In the Source Qualifier, go to the properties and write the SQL query:

select * from emp minus select * from emp where rownum <= (select count(*)/2 from emp)
src qualifier sql query

Step 3: Then connect to the target, and run the mapping to see the results.
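The MINUS query above keeps the rows of emp that are not in the first half, i.e. the second half. A minimal Python sketch of that set difference, assuming the rows are distinct (MINUS also removes duplicates, which slicing alone would not):

```python
# Sample emp rows (illustrative data; rows assumed distinct).
emp = [("SMITH",), ("ALLEN",), ("WARD",), ("JONES",), ("MARTIN",), ("BLAKE",)]

# Rows picked by the first-half query (rownum <= count(*)/2).
first_half = emp[: len(emp) // 2]

# MINUS: rows of emp that are not in the first half -> the second half.
second_half = [r for r in emp if r not in first_half]
print(second_half)  # the last 3 of 6 rows
```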
2. Send all the ports to a Router and make three groups as below.
Group1: mod(NEXTVAL,30) >= 1 and mod(NEXTVAL,30) <= 10
Group2: mod(NEXTVAL,30) >= 11 and mod(NEXTVAL,30) <= 20
Group3: mod(NEXTVAL,30) >= 21 and mod(NEXTVAL,30) <= 29 or mod(NEXTVAL,30) = 0
3. Finally, connect Group1 to T1, Group2 to T2, and Group3 to T3.
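The mod(NEXTVAL,30) routing above deals rows to the three targets in blocks of ten. A minimal Python sketch of the same router logic, using 60 sample sequence values:

```python
# Sequence-generator NEXTVAL values routed in blocks of ten using
# mod(NEXTVAL, 30), mirroring the three router groups above.
t1, t2, t3 = [], [], []
for nextval in range(1, 61):  # 60 sample rows
    m = nextval % 30
    if 1 <= m <= 10:          # Group1
        t1.append(nextval)
    elif 11 <= m <= 20:       # Group2
        t2.append(nextval)
    else:                     # Group3: 21..29 or 0
        t3.append(nextval)

print(len(t1), len(t2), len(t3))  # 20 20 20
```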
scr to seq mapping

Step 3: In the Expression transformation, make two ports, one "odd" and another "even", and write the expressions like below.
expression property

Step 4: Connect a Router transformation to the Expression. Make two groups in the Router and give conditions like below.
rtr property

Step 5: Then send the two groups to different targets. The entire mapping is as below.
1. We have to use the following transformations as below. First connect a Sorter transformation to the source, make col1 the key, with ascending order. After that, connect it to an Expression transformation.
2. In the Expression, make four new ports and name them as in the picture below.
4. In the Aggregator, group by col1 and send it to the target.
Target table rows, with each row as the sum of all previous rows from the source table.
Scenario: How do we produce rows in the target table with every row as the sum of all previous rows in the source table? See the source and target tables to understand the scenario.

SOURCE TABLE

Id  Sal
1   200
2   300
3   500
4   560

TARGET TABLE

Id  Sal
1   200
2   500
3   1000
4   1560

1. Pull the source into the mapping and then connect it to an Expression.
2. In the Expression, add one column and make it an output port (sal1), and make the sal port input only. We will make use of a function named CUME() to solve our problem, rather than using any complex mapping. Write the expression in sal1 as CUME(sal) and send the output rows to the target.
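CUME(sal) is a running total over the incoming rows. The same result can be sketched in Python with itertools.accumulate, using the table data from the scenario:

```python
from itertools import accumulate

# Source rows from the scenario: (Id, Sal).
source = [(1, 200), (2, 300), (3, 500), (4, 560)]
sals = [sal for _, sal in source]

# CUME(sal) equivalent: running total paired back with each Id.
target = list(zip((i for i, _ in source), accumulate(sals)))
print(target)  # [(1, 200), (2, 500), (3, 1000), (4, 1560)]
```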
2. Then drag your source into the mapping area and connect it to an Expression transformation.
3. In the Expression transformation, add a new port of string data type and make it an output port.
4. In that output port, write the condition as described below and then map it to the FileName port of the target. Also send the other ports to the target. Finally, run the session. You will find two files, one with the sys date in its name and another .out file, which you can delete.
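The effect of the FileName port is to write the output to a file whose name carries the current date. A minimal Python sketch of that idea; the "EMP_" prefix and the row data are illustrative, not from the original mapping:

```python
from datetime import date

# Build a target file name carrying the sys date, similar to what the
# FileName port produces (name pattern is an assumption).
filename = "EMP_" + date.today().strftime("%Y%m%d") + ".dat"

# Write the rows to the dated file.
with open(filename, "w") as f:
    f.write("10,AAA\n20,BBB\n")

print(filename)
```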