
DAC Parameters for Informatica Workflows
The main parameter files for DAC are located in $DAC_HOME/Informatica/parameters/input:

- parameterfileOLTP.txt is used for source systems / SDE mappings
- parameterfileDW.txt is used for DW mappings / SIL and PLP mappings

Another place to define parameters is the task's Parameters tab in the DAC console itself. Parameter values defined here override those set in the text files. When DAC starts to execute a task, it creates the parameter file needed for each Informatica workflow on the fly, based on these definitions. DAC formats the file to suit Informatica, including changing the command name into the actual Informatica session name [FolderName.SessionName]. It also adds common DAC parameters to each workflow parameter file.
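The precedence described above can be sketched as follows. This is a simplified illustration, not DAC's actual implementation: the function names and the $$ parameter style are assumptions for the example, showing only that file-level values are read first, task-level values override them, and the result is written under an Informatica-style [FolderName.SessionName] header.

```python
# Hypothetical sketch of how DAC might assemble a workflow parameter file.
# Function names are illustrative, not real DAC internals.

def merge_parameters(file_params, task_params):
    """Task-level values override the text-file values."""
    merged = dict(file_params)
    merged.update(task_params)
    return merged

def render_parameter_file(folder, session, params):
    """Render parameters under Informatica's [Folder.Session] header."""
    lines = [f"[{folder}.{session}]"]
    lines += [f"$${name}={value}" for name, value in sorted(params.items())]
    return "\n".join(lines)

file_params = {"DBConnection_OLTP": "SCOTT_CONN"}   # from parameterfileOLTP.txt
task_params = {"DBConnection_OLTP": "SCOTT_CONN2"}  # from the task's Parameters tab

merged = merge_parameters(file_params, task_params)
print(render_parameter_file("INFA_FOLDER1", "s_SPT", merged))
```

Running the sketch shows the task-level value winning, which mirrors the override behaviour described above.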

Why do I need to use DAC with Oracle BI EE? The Data Warehouse Administration Console (DAC) works together with Informatica to accomplish the ETL for the pre-packaged BI Applications. Here is what happens behind the scenes:

- DAC publishes the changes from the OLTP source
- Informatica extracts the changes from the change log published by DAC, as well as from the base tables
- Informatica loads the data into the staging tables and then the target tables in the data warehouse
- DAC manages performance by dropping indexes, truncating staging tables, rebuilding the indexes, and analyzing the tables during the process
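The maintenance in the last step can be sketched as the SQL statements issued around a load. This is a hedged illustration that only generates example statements; the table and index names are hypothetical, and real DAC derives them from its repository metadata.

```python
# Illustrative sketch: generate the kind of maintenance SQL run around a load.
# Table and index names here are made-up examples, not DAC metadata.

def pre_load_statements(stage_table, target_indexes):
    """Before loading: truncate the staging table and drop target indexes."""
    stmts = [f"TRUNCATE TABLE {stage_table}"]
    stmts += [f"DROP INDEX {idx}" for idx in target_indexes]
    return stmts

def post_load_statements(target_table, target_indexes):
    """After loading: rebuild indexes and gather optimizer statistics."""
    stmts = [f"CREATE INDEX {idx} ON {target_table} (...)" for idx in target_indexes]
    stmts.append(f"ANALYZE TABLE {target_table} COMPUTE STATISTICS")
    return stmts

print(pre_load_statements("W_STAGE_TMP", ["W_FACT_IDX1"]))
print(post_load_statements("W_FACT", ["W_FACT_IDX1"]))
```

Dropping indexes before a bulk load and rebuilding them afterwards is what makes full loads fast; DAC automates exactly this bookkeeping.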

If you do not use DAC, you have to write your own custom change capture process and redesign from scratch an ETL method that allows the ETL process to restart from the point of failure at the record level. The biggest saving is that DAC survives upgrades, while your custom processes cannot.

OBIEE DAC Tutorial - With a Simple Hands-On Example
This tutorial will help someone new to OBI Apps, and especially to DAC, get the hang of DAC as a tool and its purpose. Most bloggers cover configuring OBI Apps and then jump straight to customizations; people new to the tool may find this article useful.
Prerequisites
========
1) Working knowledge of Informatica
2) A configured OBI Apps environment

Our Objective
==========
To keep things simple, we have data in SCOTT.EMP. We need to transfer this to TGT.EMPT and then create an aggregation of departmental salary in AGG.DEPTSAL. The following Oracle SQL sets up the environment to mimic the OLTP-to-DWH flow and the aggregate table:

sqlplus / as sysdba
grant resource,connect,dba to TGT identified by TGT;
grant resource,connect,dba to AGG identified by AGG;
grant select on SCOTT.EMP to TGT,AGG;
create table TGT.EMPT as select * from SCOTT.EMP where 1=2;
create table AGG.DEPTSAL as select deptno, sum(sal) SUMSAL from SCOTT.EMP where 1=2 group by deptno;

Informatica Transformation
===========================
Create a folder named INFA_FOLDER1 for our test transformations. Create a simple pass-through Informatica transformation for transferring data from SCOTT.EMP to TGT.EMPT, and an aggregation transformation for loading AGG.DEPTSAL from TGT.EMPT. Create three relational connections named SCOTT_CONN, TGT_CONN, and AGG_CONN. Then create a workflow for each mapping:

- wf_SPT with session s_SPT. Source: SCOTT_CONN, Target: TGT_CONN
- wf_AGG with session s_AGG. Source: TGT_CONN, Target: AGG_CONN

Run the Informatica workflows once and ensure everything works fine.

Working in DAC
===========
1) Start the DAC Server.
2) Launch the DAC Client. Notice the DAC Server status icon in the top right corner. If it is red, the DAC Client could not connect to the DAC Server. If yellow, the connection is working, and if green, DAC is running and some activity is going on.

SETTING UP THE DAC ENVIRONMENT
==============================
We set up the DAC environment using the DAC Client. First create a new container: File -> New Source System Container. Let's name it DAC_CTNR1.

Connection to DAC Server
------------------------
Tools -> DAC Server Management -> DAC Server Setup. The details should point to the DAC repository that you have created. If the connection was successful, the DAC Server status icon will change to pale yellow.

Setting up the Sources and Informatica Connection Information
-------------------------------------------------------------
To do the setup, in the DAC Client click Setup.

DAC System Properties: here we may not have to do anything. The only thing we need to supply is the DAC repository name. For complicated environments we may also have to set the DAC Server port and URL.
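For reference, the logic that the wf_AGG mapping performs (departmental salary totals) can be sketched in plain Python. This is only an illustration of the aggregation using made-up sample rows, not Informatica code:

```python
# Sketch of the wf_AGG logic: group EMP rows by deptno and sum sal,
# mimicking: select deptno, sum(sal) from TGT.EMPT group by deptno.
# Sample rows below are made up for illustration.
from collections import defaultdict

def dept_sal(emp_rows):
    """Return {deptno: total salary} for the given rows."""
    totals = defaultdict(int)
    for row in emp_rows:
        totals[row["deptno"]] += row["sal"]
    return dict(totals)

emp = [
    {"deptno": 10, "sal": 2450},
    {"deptno": 10, "sal": 5000},
    {"deptno": 20, "sal": 3000},
]
print(dept_sal(emp))  # {10: 7450, 20: 3000}
```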

Informatica Servers
-------------------
We need to create two new entries:

1) MY_INFA - Ensure the type is set to Informatica. Enter the Integration Service name, server port (usually 4006), user/password, and repository name. Tip: if you cannot edit the service port in the bottom pane, edit it within the record entry itself.
2) MY_INFA_REP - Type is Repository. Give the hostname, server port (usually 6001), user/password, and the Informatica repository name.

Physical Data Sources
---------------------
The names here should correspond to the names of the connections we created in Informatica. In our case, create three data sources:

- SCOTT_CONN - type Source
- TGT_CONN - type Warehouse
- AGG_CONN - type Warehouse

and give the other details of each connection.

DESIGN THE CONTAINER
====================
1) Create the tables. Design -> Tables, right-click on the blank area -> Import from Database -> Database Tables. Import all three tables created in the beginning.
2) Source System Folders. Here we need to map the logical folder to the physical folder name. You can create a logical folder name and a physical folder name from Tools -> Seed Data -> Task Logical/Physical Folder:
Task Logical Folder = DAC_FOLDER1
Task Physical Folder = INFA_FOLDER1
Now in the Source System Folders tab map DAC_FOLDER1 = INFA_FOLDER1. The task physical folder name must be the same as the Informatica folder name.
3) Create a new task with the name of the Informatica workflow. Design -> Tasks. Give the simple pass-through workflow name that we already created, wf_SPT, as both the incremental-load and full-load command. You can create the Folder Name, Primary Source, Primary Target, and Task Phase entries at Tools -> Seed Data (it is just giving a name):
Task Phase - Extract
Folder Name - select the task logical folder created earlier, DAC_FOLDER1
Primary Source - a data source, say SCOTT_LOGICAL_DS
Primary Target - TGT_LOGICAL_DS
Select the Execution Type as Informatica. Right-click -> Synchronize Tasks; if we get a success message, things are fine.
Similarly, create a new task for wf_AGG:
Name = wf_AGG, Command for incremental load = wf_AGG, Command for full load = wf_AGG
Folder Name = DAC_FOLDER1, Primary Source = TGT_LOGICAL_DS, Primary Target = AGG_LOGICAL_DS
Task Phase = Aggregate, Execution Type = Informatica, Priority = 5
4) Create a dummy parameter, else DAC does not compile: Source System Parameters -> give a dummy name, say dummy_param.

5) Subject Area. Give a name, say SPTAGG. At the bottom -> Tasks -> Add Tasks, and select both the tasks that we created earlier.
6) Now assemble the design. When it pops up a prompt, accept it.

Execute
=======
1) Create a new execution plan and give it a name (say SPTAGG_EXEC_PLAN).
2) Subject Areas -> add the new subject area created (SPTAGG).
3) Parameters -> Generate. You should now see 4 parameters generated. Give appropriate values to the parameters:
Type DataSource:
SCOTT_LOGICAL_DS -> value = SCOTT_CONN
TGT_LOGICAL_DS -> value = TGT_CONN
AGG_LOGICAL_DS -> value = AGG_CONN
Type Folder:
DAC_FOLDER1 -> value = INFA_FOLDER1 (the Informatica folder)
Ideally you could give the same names for the logical and physical entries; I have differentiated them for clarity.
4) Now select the subject area and run a build. DAC automatically builds the execution plan.
5) To do a unit test, go to Ordered Tasks, select the workflow, and click Unit Test. Do the unit testing for both tasks. You can monitor the workflow in Informatica Monitor.
6) Running the execution plan: Execute -> Run Now to execute the execution plan.
7) You can monitor the current run in DAC. You can also see the workflow progress in Informatica.
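The parameter resolution in step 3 amounts to a simple lookup: the execution plan maps each logical name used by the tasks to the physical connection or folder that is handed to Informatica at run time. A hedged sketch (the names come from this tutorial; the logic is illustrative only, not real DAC code):

```python
# Illustrative sketch of execution-plan parameter resolution. Not DAC code.

LOGICAL_TO_PHYSICAL = {
    "SCOTT_LOGICAL_DS": "SCOTT_CONN",   # type DataSource
    "TGT_LOGICAL_DS": "TGT_CONN",       # type DataSource
    "AGG_LOGICAL_DS": "AGG_CONN",       # type DataSource
    "DAC_FOLDER1": "INFA_FOLDER1",      # type Folder
}

def resolve(logical_name):
    """Fail loudly if a generated parameter was left without a value."""
    try:
        return LOGICAL_TO_PHYSICAL[logical_name]
    except KeyError:
        raise ValueError(f"No physical value set for parameter {logical_name}")

print(resolve("SCOTT_LOGICAL_DS"))  # SCOTT_CONN
```

Keeping tasks bound to logical names is what lets the same container run against different physical environments: only this mapping changes, not the tasks.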
