
1) OBIA Architecture Components

1. Source systems
2. Oracle Business Intelligence Enterprise Edition
3. Oracle Business Analytics Warehouse
4. OBIA components repository
5. OBIA Oracle Data Integrator repository
6. OBIA Configuration Manager
7. Functional Setup Manager
8. Oracle Data Integrator
9. Oracle GoldenGate
2) Oracle Business Analytics Warehouse
The Oracle Business Analytics Warehouse (OBAW) is a prebuilt data warehouse data model that was
designed using dimensional modeling techniques to support the analysis requirements of OBIA. It
includes:
1. A universal data warehouse design that enables you to integrate data from different source systems
2. Multiple instance support that enables you to use one data warehouse deployment for multiple source
system instances
3. Conformed dimensions that enable you to view the data from different subject areas
3) OBIA Repositories
1. OBIA Oracle Data Integrator repository
a. Contains the OBIA-specific prebuilt ETL logic
2. OBIA components repository
a. Repository for Configuration Manager and Functional Setup Manager
b. Contains load plan definitions, OBIA product hierarchy, setup objects, such as parameters and
domain mappings, and a list of functional tasks
4) OBIA Configuration Manager
Configuration Manager is a web application that you use to set up and maintain an OBIA environment.
You use Configuration Manager to:
1. Configure offerings, which are the products you have purchased (for example, Oracle Financial Analytics)
2. Configure functional areas, which are the component parts of the offering (for example, Accounts
Receivable is a functional area of Oracle Financial Analytics)
3. Monitor and manage OBIA setup data
4. Monitor and manage load plans, which you use to perform extract, transform, and load (ETL) processes
5. Migrate configuration data across environments using the import and export options
5) Functional Setup Manager
Functional Setup Manager is a web application that works in conjunction with Configuration Manager to enable
you to:
1. Manage and perform functional configuration tasks for offerings
2. Deploy an offering and its functional areas
3. Manage a list of configuration tasks, which you can assign to different functional developers
4. Monitor task implementation status
6) Oracle Data Integrator
1. OBIA uses Oracle Data Integrator (ODI) as its data integration platform.
2. ODI is a comprehensive, unified ETL tool for building, deploying, and managing complex data
warehouses.
3. ODI performs high-volume, high-performance batch and real-time loads, as well as data validation.
4. In an OBIA environment, ODI works in conjunction with OBIA Configuration Manager, which
provides:
a. A user interface for managing load plans
b. Access to ODI Console, a web application that enables you to view objects in the ODI repository,
and control and monitor ETL processes
7) OBIA Product Life Cycle
A typical OBIA product lifecycle consists of the following processes, some of which may happen concurrently:
1. Installing prerequisites and OBIA
2. Configuring OBIA
3. Performing a full load
4. Testing and moving to production
5. Running periodic ETL
6. Patching OBIA
7. Administering OBIA
8) Configuring OBIA
After installation, use Configuration Manager and Functional Setup Manager to perform functional
configuration for each of the OBIA offerings that you want to deploy.
Configuration tasks include:
a. Enabling offerings and functional areas
b. Creating an implementation project
c. Setting up functional configuration data
d. Assigning tasks to developers
9) Performing a Full Load
After you have configured your load plan and your functional configuration data, you perform a full load of
your transactional data into the Oracle Business Analytics Warehouse.
10) Testing and Moving to Production
Test your OBIA deployment in a preproduction environment.
Continue to customize the deployment based on user feedback and any encountered issues.
After your developers, users, and administrators have verified that your test deployment meets your
organization's needs, you can move the deployment to production.
11) Running Periodic ETL
OBIA processes data by using two ETL modes:
Full ETL load: Initially, a full load is performed to extract all the required data and load all tables in the Oracle
Business Analytics Warehouse.
Incremental ETL load: Subsequently, the data warehouse is updated incrementally and loads only data that
has changed since the most recent ETL was processed.
Once your OBIA deployment is in production, run periodic incremental ETL loads to update data in your
Oracle Business Analytics Warehouse.
12) Patching OBIA
Periodically Oracle provides patches for OBIA and related products.
1. Patching involves copying a small collection of files over an existing installation.
2. A patch is normally associated with a particular version of an Oracle product and involves:
a. Updating from one minor version of the same product to a newer minor version of the same product
b. Applying an interim patch to resolve a specific issue
3. An OBIA patch can include bug fixes, metadata, and binary file updates.
13) Administering OBIA
Ongoing administration of OBIA includes:
1. Functional configuration
2. Managing and monitoring ETL processing
3. Customization
4. Security
14) Which of the following is not an accurate description of the Oracle Business Analytics Warehouse?
a. A universal data warehouse design that enables you to integrate data from different source systems
b. A business intelligence platform that is used to access and present data in easy-to-understand formats, such as
tables and graphs
c. Multiple instance support that enables you to use one data warehouse deployment for multiple source system
instances
d. Conformed dimensions that enable you to view the data from different subject areas

15) The OBIA Oracle Data Integrator (ODI) Repository contains load plan definitions, BI Applications product
hierarchy, setup objects, such as parameters and
domain mappings, and a list of functional tasks
a. True
b. False
16) Which of the following are tasks that you perform using OBIA Configuration Manager?
a. Configure offerings, which are the products you have purchased.
b. Configure functional areas, which are the component parts of the offering.
c. Build, deploy, and manage complex data warehouses.
d. Monitor and manage load plans that you use to perform ETL processes.
e. Used to access and present data in easy-to-understand formats, such as tables and graphs.
f. Migrate configuration data across environments, using the import and export options.
17) Which of the following statements does not describe Oracle Data Integrator?
a. Comprehensive, unified ETL tool for building, deploying, and managing complex data warehouses
b. Performs high-volume, high-performance batch and real- time loads, as well as data validation
c. A data replication tool used to create a replicated OLTP schema or schemas, facilitate change data capture,
and aid in ETL and transactional system performance
d. Works in conjunction with OBIA Configuration Manager
18) Minidimension Tables
1. Include most queried attributes of parent dimensions
2. Are used to increase query performance
3. Are identified with the suffix _MD
4. Prebuilt minidimension tables include:
a. Responses
b. Agreements
c. Assets
d. Opportunities
e. Orders
f. Quotes
g. Service Requests
19) Helper tables
1. Helper tables are used by the Oracle Business Analytics Warehouse to solve complex problems that
cannot be resolved by simple dimensional schemas.
2. In a typical dimensional schema, fact records join to dimension records with a many-to-one relationship.
3. To support a many-to-many relationship between fact and dimension records, a helper table is inserted
between the fact and dimension tables.
4. The helper table can have multiple records for each fact and dimension key combination.
5. This allows queries to retrieve facts for any given dimension value.
It should be noted that any aggregation of fact records over a set of dimension values might contain overlaps (due to a
many-to-many relationship) and can result in double counting.
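The bridge pattern and the double-counting caveat above can be sketched in a few lines. This is a minimal illustration only; the table layout, key names, and revenue figures are hypothetical, not actual OBAW schema.

```python
# Illustrative sketch of a helper (bridge) table resolving a many-to-many
# relationship between fact and dimension records. All names and values
# here are hypothetical, not actual OBAW tables.

facts = [
    {"fact_key": 1, "revenue": 100},
    {"fact_key": 2, "revenue": 50},
]

# Helper table: one fact row can link to several dimension members.
helper = [
    {"fact_key": 1, "dim_key": "A"},
    {"fact_key": 1, "dim_key": "B"},  # fact 1 belongs to both A and B
    {"fact_key": 2, "dim_key": "A"},
]

def revenue_for(dim_key):
    """Retrieve facts for a given dimension value via the helper table."""
    keys = {h["fact_key"] for h in helper if h["dim_key"] == dim_key}
    return sum(f["revenue"] for f in facts if f["fact_key"] in keys)

# Querying a single dimension value is safe:
print(revenue_for("A"))  # 150

# But aggregating over a set of dimension values double counts fact 1,
# because it appears under both A and B:
total = revenue_for("A") + revenue_for("B")
print(total)             # 250, although true total revenue is only 150
```

The last two lines show exactly the overlap the note warns about: summing per-dimension results across a set of members is not the same as summing the facts once.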
20) Subset Dimension Tables
1. Are large dimension tables that include well-defined subsets
2. Are extracted from dimension tables
3. Enhance query performance by segregating subsets of frequently queried data
21) Hierarchy Tables
1. Hierarchies stored in transactional systems are flattened in hierarchy tables in the data warehouse.
2. For example, W_ORG_DH stores the hierarchy relationships for the organization dimension,
W_PRODUCT_DH stores hierarchy relationships for the product dimension, and so on.
3. Hierarchy tables are rebuilt with each ETL run. Examples of hierarchy tables in the data warehouse
include:
a. Industry (W_INDUSTRY_DH) Organization (W_ORG_DH)
b. Internal Organization (W_INT_ORG_DH) Employee Positions (W_POSITION_DH)
c. Product (W_PRODUCT_DH)
The screenshot shows a partial view of W_INT_ORG_DH, which stores the flattened hierarchy of internal
organizations. When one organization rolls up into multiple hierarchies, the multiple hierarchies are stored in
this table. Each hierarchy is differentiated by a hierarchy number and name.
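The flattening step described above can be sketched as follows. This is an assumption-laden illustration: the parent-child records, four-level limit, and padding-by-repetition are stand-ins, not the actual W_*_DH column layout.

```python
# Minimal sketch of flattening a parent-child hierarchy into fixed-level
# columns, the way W_*_DH hierarchy tables store hierarchies.
# The org names and the four-level layout below are illustrative only.

# Transactional parent-child records: child -> parent (None = top).
parent_of = {
    "West Sales": "Sales",
    "East Sales": "Sales",
    "Sales": "Corporate",
    "Corporate": None,
}

def flatten(node, max_levels=4):
    """Walk up to the top ancestor, then return a top-down list of levels,
    padded to max_levels by repeating the leaf node."""
    path = []
    while node is not None:
        path.append(node)
        node = parent_of[node]
    path.reverse()                                  # top ancestor first
    path += [path[-1]] * (max_levels - len(path))   # pad remaining levels
    return path

print(flatten("West Sales"))
# ['Corporate', 'Sales', 'West Sales', 'West Sales']
```

Because the flattened rows depend on the current parent-child data, tables like this must be rebuilt with each ETL run, as the slide notes.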
22) Staging Tables
1. Are intermediate storage tables within the OBAW
2. Hold data for transformation before loading into the dimension and fact tables
Data is loaded from Oracle and non-Oracle transactional databases.
3. Are normally populated only with incremental data
4. Are not persistent
Truncated after each load
5. Are loaded by source-dependent processes
Contain the prefix SDE (Source Dependent Extract)
6. Are sources for Source Independent Load (SIL) processes that load and transform staging table data,
internal and external, into the dimension and fact tables
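The truncate-extract-load cycle in the list above can be sketched end to end. The table contents, the change filter, and the "amount to cents" transform are all hypothetical; the point is the shape of the SDE-to-SIL flow, not any real OBIA mapping.

```python
# Hedged sketch of the staging-table pattern: a source-dependent extract
# (SDE) truncates and repopulates a non-persistent staging table with
# incremental rows, and a source-independent load (SIL) transforms the
# staged rows into the target table. All names/values are illustrative.

source_rows = [
    {"id": 1, "amount": 10, "updated": "2024-01-01"},
    {"id": 2, "amount": 20, "updated": "2024-02-01"},
]

staging = []        # not persistent: truncated on every load
warehouse = {}      # target dimension/fact table, keyed by id

def sde_extract(last_extract_date):
    """Source-dependent: truncate staging, then stage only changed rows."""
    staging.clear()
    staging.extend(r for r in source_rows if r["updated"] > last_extract_date)

def sil_load():
    """Source-independent: transform staged rows into the target table."""
    for row in staging:
        warehouse[row["id"]] = {"amount_cents": row["amount"] * 100}

sde_extract("2024-01-15")   # incremental: only rows changed after this date
sil_load()
print(warehouse)            # {2: {'amount_cents': 2000}}
```

Note how only row 2 survives the incremental filter, and how `staging` would be wiped again on the next run, matching points 3 and 4 above.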
23) Internal Tables
Assist in the load process and contain warehouse metadata
Examples:
A new row is entered into the W_ETL_RUN_S table with a new process ID for every load.
W_EXCH_RATE_G stores exchange rates.
24) Internal Tables
Internal tables in the OBAW are those tables that cannot be classified as staging, fact, dimension, hierarchy,
extension, or dimensional map tables. These tables store important information that is used during the ETL
process, are rebuilt during each ETL process, and are not used by end-user query tools. Internal tables include
the following:
1. W_COSTLST_G: The cost list information used by the ETL process. It is mirrored in the Siebel
transactional database by S_ETL_COSTLST.
2. W_DUAL_G: The table used by the ETL process to generate calculated values. It is similar to S_DUAL in
the Siebel transactional database.
3. W_EXCH_RATE_G: The exchange rate information used by the ETL process. It is mirrored in the
transactional database by S_ETL_EXCH_RATE.
4. W_DIM_TABLES_G: A list of data warehouse tables that are map enabled
5. W_LOV_EXCPT_G: An intermediate table for finding exceptions in List of Values
6. W_LST_OF_VAL_G: A list of Values used in the ETL process
25) Using a Separate Database for the OBAW
1. ETL is configured to maximize hardware resources.
2. Analytical queries interfere with normal use of the transactional database.
3. The data in a transactional database is normalized for update efficiency.
4. Historical data cannot be purged from a transactional database.
5. Transactional databases are tuned for one specific application.
6. The OBAW database can be specifically tuned for the analytical queries and extract, transform, and load
(ETL) processing.
26) Apply the ATGLite Patch.
1. ATGLite is a J2EE component used by OBIA Configuration Manager and Functional Setup Manager.
2. The ATGLite patch updates tables and seed data in the BI Applications Components Repository schema
(BIACOMP).
27) Performance Tuning Recommendations
1. Tuning underlying systems
2. Guidelines for Oracle Business Analytics Warehouse (OBAW) databases
3. Using a separate database for the OBAW
4. General guidelines for Oracle databases
5. Using Oracle template files
6. Configuring base and installed data warehouse languages
7. Minidimension tables
8. Aggregate tables
9. Creating indices
10. Prune Days
11. Oracle Golden Gate
28) OBIA Configuration Manager
Configuration Manager is a web application that you use to set up and maintain an OBIA environment. You use
Configuration Manager to:
1. Configure offerings, which are the products you have purchased (for example, Oracle Financial Analytics)
2. Configure functional areas, which are the component parts of the offering (for example, Accounts
Receivable is a functional area of Oracle Financial Analytics)
3. Monitor and manage OBIA setup data
4. Monitor and manage load plans, which you use to perform extract, transform, and load (ETL) processes
5. Migrate configuration data across environments using the import and export options
29) Functional Setup Manager
Functional Setup Manager is a web application that works in conjunction with Configuration Manager to enable
you to:
1. Manage and perform functional configuration tasks for offerings
2. Deploy an offering and its functional areas
3. Manage a list of configuration tasks, which you can assign to different functional developers
4. Monitor task implementation status
30) Register the Source System in Configuration Manager.
Use the Register Source page in OBIA Configuration Manager to register the source system. Launch OBIA
Configuration Manager and sign in as the BI Applications Administrator.
In the navigation pane, select System Setups > Define Business Intelligence Applications
Instance. The Source Systems tab is displayed (not shown here). Click the Add icon to
display the Register Source dialog box.
To register the source in Configuration Manager, use the Register Source in Configuration
Manager page to specify the following properties:
1. Product Line
2. Product Line Version
3. Source Instance Name
4. Description
5. Data Source Number
31) Enable Offerings for Deployment.
Use Configuration Manager to enable the OBIA offerings that you have purchased and are deploying.
From the Tasks bar, under System Setups, click Manage Business Intelligence Applications to display the
Manage Business Intelligence Applications dialog box. Select the Business Intelligence Application Offerings
tab.
Select the Enabled check box next to the desired offering(s). In this example, Oracle Financial Analytics and its
associated functional areas are selected. Enabling an offering makes the setup data associated with that offering
available in Configuration Manager.
32) Edit Preferred Currency Display Names.
Use the Manage Preferred Currencies dialog box in Configuration Manager to edit the default currency display
names.
Oracle Business Intelligence is installed with a set of preferred currencies with preconfigured preferred currency
names and preferred currency codes. Preferred currency names are used in Oracle Business Intelligence
dashboards in the Currency drop-down in the My Account dialog box and on the Preferences tab for a user
logged into Oracle Business Intelligence.
You can use the Manage Preferred Currencies dialog box in Configuration Manager to edit the default currency
display names. You edit preferred currency name values to change the currency labels that are displayed in all
modules associated with BI dashboards. For example, you might want to change the Local Currency label
from Ledger Currency to Local Currency.
33) Set Languages for Data Load.
1. OBIA supports data loads in multiple languages.
2. Specify the languages for data loads in OBIA Configuration Manager.
3. American English is the default installed language; all other languages are disabled.
34) Run the Domain-Only Load Plan.
Define, generate, and run a domain-only load plan to load source-specific data into OBIA Configuration
Manager tables.
This enables Configuration Manager to display the appropriate source-specific values as choices in drop-
down lists for setup objects.
Before you perform this step, you must have registered the source system and propagated connection details to
ODI as discussed earlier in this lesson.
35) Grant User Access to Configuration Manager, FSM, and ODI.
Work with your security administrator to grant users access to OBIA Configuration Manager, Functional
Setup Manager, and ODI.
Grant the appropriate application (duty) roles to a user based on the user's job responsibilities.
36) ETL Phases
An Oracle BI Applications ETL process includes the following phases:
Source-dependent extract (SDE) tasks: Extract data from the source system and stage it in staging tables
SDE tasks are source specific.
Source-independent load (SIL) tasks: Transform and port data from staging tables to base fact or dimension
tables
SIL tasks are source independent.
Post load process (PLP) tasks: Are executed only after the dimension and fact tables are populated
A typical use of a PLP task is to transform data from a base fact table and load it into an aggregate table.
PLP tasks are source independent.
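The three phases above can be sketched with a focus on the PLP step, whose typical use is rolling a base fact table up into an aggregate table. The table names, regions, and figures are illustrative, not actual OBIA task or table names.

```python
# Sketch of the post load process (PLP) phase: after SDE and SIL have
# populated the base fact table, a PLP-style task aggregates it.
# All names and values below are illustrative.

base_fact = [   # populated by the SDE + SIL phases
    {"region": "West", "revenue": 100},
    {"region": "West", "revenue": 50},
    {"region": "East", "revenue": 70},
]

def plp_aggregate(rows):
    """Roll base fact rows up into an aggregate table keyed by region."""
    agg = {}
    for r in rows:
        agg[r["region"]] = agg.get(r["region"], 0) + r["revenue"]
    return agg

print(plp_aggregate(base_fact))  # {'West': 150, 'East': 70}
```

Because the aggregation reads only the warehouse's own fact table, the PLP step is source independent, just as the slide states.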
37) Load Plan
A load plan is an executable object that comprises and organizes the child objects (referred to as steps) that
carry out the ETL process.
1. You define a load plan in Oracle BI Applications Configuration Manager by selecting a data source and one
or more fact groups.
2. After you define the load plan, you generate it to build it in the Oracle Data Integrator (ODI) repository.
3. You then execute the load plan in Oracle BI Applications Configuration Manager to perform the ETL
process.
38) ODI Run-Time Agent
The ODI Run-Time Agent is a Java EE agent, which handles schedules and orchestrates ETL sessions.
1. At design time, developers generate scenarios from the business rules that they have designed.
2. The code of these scenarios is then retrieved from the repository by the Run-Time Agent.
3. This agent then connects to the data servers and orchestrates code execution on these servers.
4. It retrieves the return codes and messages for the execution, as well as logging information.
39) ODI Console
ODI Console is a web application deployed in Oracle WebLogic application server.
Provides read access to the ODI repository and the ability to perform topology configuration and production
operations
Can be accessed via Oracle BI Applications Configuration Manager
40) Package
1. A package is the largest unit of execution in Oracle Data Integrator.
2. A package is a workflow, made up of a sequence of steps organized into an execution diagram.
3. Packages assemble and reference other components from a project, such as interfaces, procedures, or variables.
In the example in the slide, notice there are three steps for the SDE_ORA_GLRevenueFact
package in the SDE_ORA_GLRevenueFact task folder.
Refresh IS_INCREMENTAL:
This is a variable step, which declares, sets, refreshes, or evaluates the value of a variable. In this case, this is a
refresh variable step, which refreshes the variable by running the query specified in the variable definition.
More specifically, this variable is used to determine whether this package should be run in full
or incremental mode.
Refresh LAST_EXTRACT_DATE:
This is also a refresh variable step that determines the date on which data was last extracted for this package.
This variable is used to determine if this package should execute a full or incremental load.
Run SDE_ORA_GLRevenueFact.W_GL_REVN_FS:
This is a flow step, which executes an interface. In this example, it executes the W_GL_REVN_FS interface,
which loads the W_GL_REVN_FS fact staging table in the data warehouse.
You can also view and administer package steps in an execution diagram in the ODI Designer editor.
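The role of the two refresh-variable steps can be sketched as plain functions. This is not ODI code: the run-history structure and the refresh logic are assumptions standing in for the queries defined on the real IS_INCREMENTAL and LAST_EXTRACT_DATE variables.

```python
# Hedged sketch of how refresh-variable steps can drive a package's
# full-vs-incremental decision, mirroring the IS_INCREMENTAL and
# LAST_EXTRACT_DATE steps described above. The "run history" and the
# refresh queries are illustrative stand-ins.

etl_run_history = ["2024-03-01"]   # stand-in for the warehouse's ETL run log

def refresh_is_incremental():
    """Incremental only if a prior successful run is on record."""
    return len(etl_run_history) > 0

def refresh_last_extract_date():
    """Date on which data was last extracted for this package."""
    return etl_run_history[-1] if etl_run_history else None

def run_package():
    """Flow step: choose the load mode based on the refreshed variables."""
    if refresh_is_incremental():
        return f"incremental load since {refresh_last_extract_date()}"
    return "full load"

print(run_package())   # incremental load since 2024-03-01
```

With an empty run history the same package falls back to a full load, which is the behavior the two variable steps exist to decide.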
41) Change Capture
During a full load, Oracle BI Applications extracts:
All records from tables that are sources for dimension tables
Records that were created after an Initial Extract Date from tables that are sources for fact tables
During an incremental load, Oracle BI Applications extracts:
Records that have changed or were created after a "Last Extract Date." This is done by comparing the Last
Extract Date value to a "Last Updated Date" (or LUD) type column in the source table. If the source table does
not have such a column, Oracle BI Applications extracts all records from that table.
The Initial Extract Date defines a cut-off so that not all records are loaded into the data warehouse. You set the
Initial Extract Date value for each data source in Oracle BI Applications Configuration Manager.
An ETL process can extract a record from a single table or from multiple tables. When a record is the result of
joining multiple tables, one of these tables is identified as the base table, which defines the granularity of the
record. When extracting fact records, Oracle BI Applications compares the Created Date of the base table
only with the Initial Extract Date.
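The incremental extraction rule above, including the fall-back when a source table has no "Last Updated Date" column, can be sketched as a single filter. Column names and dates are hypothetical; ISO-formatted date strings are used so plain string comparison orders them correctly.

```python
# Sketch of the incremental change-capture rule: extract rows whose
# last-updated column is after the Last Extract Date, and fall back to
# a full extract when the source table has no such column.
# Column names and dates below are hypothetical.

def incremental_extract(rows, lud_column, last_extract_date):
    if not rows or lud_column not in rows[0]:
        return list(rows)            # no LUD-type column: extract everything
    return [r for r in rows if r[lud_column] > last_extract_date]

source = [
    {"id": 1, "last_upd": "2024-01-10"},
    {"id": 2, "last_upd": "2024-03-05"},
]

# With a LUD column, only records changed after the Last Extract Date:
print(incremental_extract(source, "last_upd", "2024-02-01"))  # only id 2

# Without the column, all records are extracted:
print(len(incremental_extract(source, "created", "2024-02-01")))  # 2
```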
42) Prune Days
The Prune Days parameter is used to:
1. Extend the window of the ETL extract beyond the last time the ETL actually ran
2. Ensure that records that may have somehow been missed in an earlier ETL process are picked up in the
next ETL
Last Extract Date is a value that is calculated based on the last time data was extracted from
that table less a Prune Days value. Records can be missed in an ETL process when a record
is being updated while the ETL process is running and was not committed until after the ETL
completed.
You set the Prune Days parameter value in Oracle BI Applications Configuration Manager.
Setting a small value means the ETL will extract fewer records, thus improving performance;
however, this increases the chances that records are not detected. Setting a large number is
useful if ETL runs are infrequent, but this increases the number of records that are extracted
and updated in the data warehouse. Therefore, you should not set the Prune Days value to a
very large number. A large Prune Days number can also be used to trigger re-extracting
records that were previously processed but have not changed. The value for Prune Days
should never be set to 0.
43) Multi-Source Environments
Oracle BI Applications supports the loading of data into the Oracle Business Analytics Warehouse from
multiple source systems as long as the source systems are different.
Oracle BI Applications does not support multiple instances of the same source system.
For example, your environment could have a PeopleSoft 9.0 source and an Oracle EBS 11.5.10 source, but not
two instances of the Oracle EBS 11.5.10 source.
Multi-source ETL processes are different from single-source ETL processes in that you could potentially
have both sources being used to load the same fact groups and dimensions.
44) ETL Application Roles
Oracle BI Applications has two application (duty) roles for ETL operations:
Load Plan Operator
Load Plan Administrator
Oracle BI Applications has the following additional roles:
BI Applications Administrator
BI Applications Functional Developer
BI Applications Implementation Manager
45) Functional Configuration Terminology
Source instance
The transactional system that serves as the source of data for the Oracle Business Analytics Warehouse
(OBAW)
Offering
A BI Application product that you have purchased
For example, Oracle Financial Analytics, or Oracle Sales Analytics
Functional area
A component part of an OBIA offering; the smallest unit of an OBIA offering that can be implemented
For example, Accounts Receivable, Accounts Payable, and General Ledger are functional areas in Oracle
Financial Analytics.
46) Functional Configuration
Functional setup of OBIA offerings must be performed to ensure the accurate and successful movement of data
from a source database to the target OBAW.
These functional setups, based on either business requirements or on transactional source system settings or
configurations, direct the manner in which relevant data is moved and transformed from source database to
target database.
Additionally, some functional setups of OBIA control the manner in which data is displayed.
Functional setups are also called functional configurations.
47) Functional Configuration Tools
Functional configuration for OBIA is done using the following tools:
Configuration Manager
Functional Setup Manager (FSM)
48) Configuration Manager
OBIA Configuration Manager is a web application for setting up and maintaining an OBIA environment.
You use Configuration Manager to:
Launch FSM to configure offerings and functional areas
Monitor and manage setup data, and extend the OBAW where necessary
Monitor and manage load plans that you use to perform extract, transform, and load (ETL) processes
Migrate configuration data across environments, using the import and export options
49) Performing Functional Configuration
1. Enabling offerings and functional areas
2. Creating an implementation project
3. Performing functional tasks
4. Assigning tasks to developers
5. Managing offerings and functional areas
50) Enabling Offerings and Functional Areas
At the start of a deployment project, you need to enable your offerings and functional areas for implementation.
51) Creating an Implementation Project
Use FSM to create implementation projects to configure offerings and the modules that you want to deploy.

1. To create an implementation project, in the Tasks bar, select Implementations > Manage Implementation
Projects to display the Manage Implementation Projects page.
2. Then choose Actions > Create to enter a name for the project and select the offering to implement.
3. To make offerings easier to manage, Oracle recommends that you deploy one offering per project.
4. In other words, if you are deploying three offerings, then create three implementation projects.
5. In this example, you have installed Oracle Financial Analytics and you create an implementation project
to configure the ETL for Oracle Financial Analytics.
6. To configure ETL for Oracle Financial Analytics, you must create at least one implementation project.
7. When you create an implementation project, you select the offering to deploy as part of that project.
8. Once you create an implementation project, FSM generates the tasks required to configure the specified
offering.
9. By default, the tasks are assigned to the BI Administrator user.
10. If required, you can optionally assign tasks to functional developers, who will then perform the tasks.
11. Use the Go to Task column to complete functional configuration tasks.
12. The example in the slide shows an implementation project named IP_Financial_Analytics, configured
for the Oracle Financial Analytics offering, with its associated tasks.
52) Performing Functional Tasks
When you click Go to Task, you display a configuration page that enables you to complete the task.
53) Changing Task Status
Use the Status icon for a task to edit its status.
When you have completed the steps listed in the informational task, you must manually set the status of the task.
You can edit the status by clicking the Status icon for the task, or selecting the task and clicking the Edit Status
button on the toolbar (not shown here).
54) Assigning Tasks
1. By default, tasks are assigned to the BI Applications Administrator user (weblogic, in this example).
2. You can assign tasks to functional developers so that functional developers can configure OBIA offerings.
3. When functional developers log in and display the Assigned Implementation Tasks tab, they would only
see the tasks that have been assigned to them.
4. When BI Administrators log in and display the Implementation Projects tab, they see all tasks.
5. In a small deployment project, a single person with BI Applications Administrator privileges might
perform all of the setup and functional configuration tasks for OBIA.
6. Notice that you could also add notes and assign a due date for a task.
55) Managing Offerings and Functional Areas

In the Tasks bar, select Implementation Objects > Manage Offerings and Functional Areas. From here, you can
manage the various offerings and their associated functional areas. For example, you can drill on an offering to
view detail information about the offering and the associated functional areas. You can also search by name,
offering, functional area, or product.
56) Domains and Domain Mappings
Domains are pre-seeded dimensional values that help define business metrics.

Domains are typically located in the source system. For example, in Financial Analytics, domains store
information about the General Ledger accounts. If domains are not available in a source system, then they can be
sourced from a flat file.
The screenshot shows the Domain Mappings tab. You access this tab by selecting Tasks > Domains
Administration > Manage Domain Mappings and Hierarchies in Configuration Manager.
This tab shows how data fields in the source system map to data fields in the Oracle Business Analytics
Warehouse (OBAW). The domain mappings specify how data in a source system is extracted and loaded into
the OBAW.
For example, the data in the source domain Account Employee Size (ACCNT_EMP_SIZE) extracts and loads
into the target domain Customer Employee Size Category (W_ACCNT_EMPLOYEE_SIZE).
Notice that you can search for domain mappings by source instance, offering, fact group, dimension group,
domain name, and so on.
57) Domain Member Mappings
Domain member mappings specify how domain member data in a source system is extracted and loaded into
domain member data in the OBAW.
There are two types of domain member mappings:
Regular domains have members consisting of a single value. These single values map to single member
values in the target system.
Band domains have members consisting of two values (Range Start and Range End) that specify a range.
In the screenshot, Account Employee Size is an example of a band domain. Each range maps to a single target
domain member. In this example, the range 1 to 5000 maps to Employee Total Between 0 and 5000, the
range 5,001 to 10,000 maps to Employee Total Between 5001 and 10000, and so on.
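The band-domain lookup in the example above can be sketched as a small range table. The ranges and target labels are taken from the example; the data structure itself is an illustration, not how Configuration Manager stores mappings.

```python
# Sketch of a band-domain member mapping: each source member is a
# (range start, range end) pair that maps to a single target member,
# as in the Account Employee Size example. Structure is illustrative.

bands = [
    (1, 5000, "Employee Total Between 0 and 5000"),
    (5001, 10000, "Employee Total Between 5001 and 10000"),
]

def map_band(value):
    """Map a source value to the target member whose range contains it."""
    for start, end, target in bands:
        if start <= value <= end:
            return target
    return "Unmapped"

print(map_band(4200))   # Employee Total Between 0 and 5000
print(map_band(7500))   # Employee Total Between 5001 and 10000
```

A regular domain, by contrast, would be a plain one-to-one dictionary from single source values to single target values.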
58) Source Domains
Data fields in a source system application are referred to as source domains.

The screenshot shows the Manage Source Domains page. To navigate to this page, select Tasks > Domains
Administration > Manage Source Domains in Configuration Manager. Source domains displayed on the
Source Domains tab are read-only.
59) Source Domains: Domain Members
Select a source domain to display domain members in the lower pane.
On the Source Domains page, you can select a source domain to view its members in the lower pane. In the
example in the slide, you select the Source AP Transaction Source domain to view its domain members. Domain
members are the permitted values for a source domain.
Notice that you do not have the ability to edit and add domain members in this pane. To maintain data integrity in
Oracle Business Intelligence Applications, some domains have been designed as non-extensible, and are,
therefore, read-only.
60) Warehouse Domains

Data fields in the OBAW are referred to as warehouse domains.


The screenshot shows the Manage Warehouse Domains page. To navigate to this page, select Tasks > Domains
Administration > Manage Warehouse Domains in Configuration Manager.
Unauthorized reproduction or distribution prohibited Copyright 2014, Oracle and/or its affiliates
61) Warehouse Domains: Warehouse Members

Select a warehouse domain to display warehouse members in the lower pane.


In the example in the slide, you select the Country warehouse domain to view its warehouse members.
Warehouse members are the permitted values for a warehouse domain.
Notice that, unlike when working with source domain members, you have the ability to edit and add warehouse
members in this pane.
62) Warehouse Domain Hierarchies

Warehouse domain hierarchies are domains that have been organized into hierarchies to enable the data to be
more effectively analyzed.
Domain hierarchies are displayed in inverted format. In the example in the slide, W_COUNTRY is the parent of
child W_REGION.
63) Warehouse Domain Hierarchies: Domain Member Mappings
Select a domain mapping in the hierarchy to display domain member mappings in the lower pane.
Select a domain mapping in the hierarchy to display domain member mappings in the lower pane.
Use the field next to Source Domain Members to display mapped, unmapped, or all source domain members.
64) Editing Domain Member Mappings

Oracle Business Intelligence Applications ships with default domain value mappings that map the seeded BI
Application domain values to the seeded configuration data. If you want to use these default categories, you do
not need to make any changes to these mappings before you start your ETL processes. The example in the slide
shows the member mappings for Account Employee Size.
If you want to edit a domain member mapping, click the Edit icon to display the Edit Domain Member
Mappings dialog box. Use this dialog box if you want to make changes to default domain-mapping values. For
example, you could change the range values or click Add Range Domain Member Mapping to create a new
range.
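Conceptually, a range (band) domain mapping resolves a numeric source value to the warehouse member whose range contains it. The sketch below is purely illustrative: the function, the band tuples, and the "Account Employee Size"-style values are assumptions for demonstration, not actual Configuration Manager code.

```python
# Hypothetical sketch of how a band domain mapping resolves a source value.
# Band members hold two values (low/high) that specify a range.

def map_to_band(value, bands):
    """Return the warehouse member whose [low, high) range contains value."""
    for low, high, member in bands:
        if low <= value < high:
            return member
    return "UNASSIGNED"  # no range matched

# Example ranges, loosely modeled on an "Account Employee Size" style domain
employee_size_bands = [
    (0, 100, "SMALL"),
    (100, 1000, "MEDIUM"),
    (1000, float("inf"), "LARGE"),
]

print(map_to_band(250, employee_size_bands))  # MEDIUM
```

Editing a range in the dialog box, or adding a new range with Add Range Domain Member Mapping, corresponds to changing or appending one of the band tuples above.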
65) Using Batch Edit
You can set up a target domain by using the Batch Edit option to update multiple target domain members with
the same value.

Using batch edit is useful for large domains with many member mappings that require the same value. In the
Tasks bar, click Manage Domains and Mappings, display the Domain mappings tab, select the Domain that you
want to edit, and then click the Edit Domain Member Mappings icon in the Domain Member Mappings pane to
display the Edit Domain Member Mappings dialog box.
To use batch edit, select one or more rows in the table, select a value from the Batch Edit drop-down list, and
then click Change to apply the value selected in the Batch Edit drop-down list to all specified members.
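The batch edit behavior can be sketched as a single assignment applied to every selected member. All names below are hypothetical, chosen only to illustrate the idea; the real edit happens inside the Edit Domain Member Mappings dialog box.

```python
# Illustrative sketch of the Batch Edit option: apply one target value
# to every selected domain member mapping. Not an actual OBIA API.

def batch_edit(member_mappings, selected_members, batch_value):
    """Set the same target value for every selected source member."""
    for member in selected_members:
        member_mappings[member] = batch_value
    return member_mappings

mappings = {"CREDIT": None, "DEBIT": None, "MISC": "OTHER"}
batch_edit(mappings, ["CREDIT", "DEBIT"], "STANDARD")
print(mappings["CREDIT"])  # STANDARD
```

Unselected members (MISC here) keep their existing values, which is why batch edit suits large domains where many, but not all, members share a value.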
66) Using Sync To Source
You can set up a non-ranged target domain using the Sync To Source option to automatically synchronize a
target domain with values from the source domain.

In some scenarios, you might not know what target domain member values should be when you deploy
OBIA. For example, in Order Management or Supply Chain Analytics, UOM (Unit of Measurement) is
typically not known until deployment time.
You can set up a non-ranged target domain using the Sync To Source option to automatically synchronize a
target domain with values from the source domain. This process inserts new target members from the source
domain and automatically generates 1:1 mappings. This is useful for large domains with many member mappings
that might otherwise take a long time to set up. Sync To Source is only available for extensible, non-ranged
domains.
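The Sync To Source behavior described above can be sketched roughly as follows. The function and the UOM codes are illustrative assumptions; the real synchronization runs inside Configuration Manager.

```python
# Hedged sketch of Sync To Source: insert target members that exist only
# in the source domain and generate 1:1 mappings for them.

def sync_to_source(source_members, target_members, mappings):
    for member in source_members:
        if member not in target_members:
            target_members.append(member)    # insert new target member
        mappings.setdefault(member, member)  # generate a 1:1 mapping
    return target_members, mappings

# Example: UOM codes discovered in the source at deployment time
targets, maps = sync_to_source(["EA", "KG", "LB"], ["EA"], {"EA": "EA"})
print(targets)  # ['EA', 'KG', 'LB']
```

Existing members and mappings are left untouched, which is what makes the operation safe to repeat as new source values appear.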
67) Data Load Parameters
Data load parameters are configuration values that specify how source system data is loaded into the OBAW.

To view and manage data load parameters, select Tasks > Data Load Parameters Administration > Manage Data
Load Parameters to open the Manage Data Load Parameters page in Configuration Manager. Use the Search
section to specify source instance, offering, fact group, and so on. In the example in the slide, data load
parameters are shown for the Oracle Financial Analytics offering.
Data load parameters can be either global or application-specific.
Global parameters apply to all applications and are indicated by the (ab) and globe icons. Global data load
parameters can also be associated with specific fact groups or dimension groups.
Application-specific parameters apply to specific applications and are indicated by the (ab) icon. Application-
specific data load parameters are always associated with one or more fact groups or dimension groups.
Some parameters have a warning icon that indicates that this parameter value must be set before running a full
load.
Some parameters have a read-only icon that indicates that this parameter value cannot be edited.
68) Editing Data Load Parameters
In the example in the slide, the Slowly Changing Dimension Flag parameter is selected. This parameter
indicates whether the slowly changing dimension type 2 flag is set for dimensions or dimension groups.
Notice that this is a global parameter and the Global Parameter Value is set to No. If the global parameter that
you edit is associated with fact groups or dimension groups, then a warning message is displayed to verify that
you want to update the value for all associated fact groups and dimension groups. If you click Yes at the
warning message, then the values of all occurrences of the parameter at the group level will be updated to the
new value.
69) Editing Group-Specific Parameter Values

In the example in the slide, the Slowly Changing Dimension Flag data load parameter is selected, and the Group-
Specific Parameter Values pane is visible. This pane shows the value of the Slowly Changing Dimension Flag
parameter for specific dimension groups within the Oracle Financial Analytics offering.
In this example, the Business Location Dimension group-specific parameter value is selected.
To edit the group-specific parameter value, click the Edit icon on the toolbar to open the Edit Dialog dialog box.
Notice that you can change the parameter value by selecting from a list of values: Yes or No in this example.
The fields that are displayed in this dialog box differ depending on the type of parameter being edited.
For example, the parameter data type might be Boolean, date, multi-value select list of values, number, single-
value select list of values, string, and so on. It is also possible to edit more than one group-specific parameter
value at a time by using the Edit All icon (two pencils).
70) Reporting Parameters
Reporting parameters are configuration values that specify how data is presented in Oracle Business
Intelligence dashboards.
In the Tasks bar, select Manage Reporting Parameters to view or edit reporting parameters. In the example in
the slide, the Global tab is selected. Global parameters apply to all applications. Application-specific parameters
apply to specific applications. To edit a reporting parameter, select the parameter in the parameter list, and then
either click the Edit icon or click the value in the Parameter Value column.
71) Exporting and Importing Setup Data
You can export and import setup data for Configuration Manager.

You can export and import setup data for Oracle BI Applications Configuration Manager to:
Make a backup of your configuration settings for security purposes. For example, you might keep a record of
the configuration changes that you have made.
Migrate the Setup Data for OBIA Configuration Manager from one environment to another environment. For
example, you might move the configuration changes that you have made from a Test environment to a
Production environment.
In the Tasks bar, select Export Setup Data and then click the Export icon to display the Export dialog box.
Name the export file and use the Export dialog box to specify the setup objects that you want to export.
When you export setup data, you can export only the changes that you have made to the values of the following
objects: data load parameters, domains and mappings, reporting parameters, and system setups. Unchanged
configuration values are not exported. For example, if you only change the value of DEFAULT_CURRENCY
from USD to Euro and then export your data, the export zip file that is produced will contain only
columns for DEFAULT_CURRENCY=Euro. The Export Details pane (in the Export Setup Data pane)
displays the details of the selected export file.
To import setup data, select Tasks > Import Setup Data and follow a similar process to import a previously
exported file. When you import setup data from a zip file, you import whatever configuration changes were
exported to that zip file.
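The "only changed values are exported" behavior can be pictured as a diff against the shipped defaults. The sketch below is a conceptual illustration with made-up parameter names; it is not the real setup-object model or file format.

```python
# Sketch of exporting only the setup values that differ from shipped defaults.

def export_changed(shipped_defaults, current_values):
    """Return only the settings whose values differ from the shipped defaults."""
    return {key: value
            for key, value in current_values.items()
            if shipped_defaults.get(key) != value}

defaults = {"DEFAULT_CURRENCY": "USD", "GLOBAL_RATE_TYPE": "Corporate"}
current  = {"DEFAULT_CURRENCY": "EUR", "GLOBAL_RATE_TYPE": "Corporate"}
print(export_changed(defaults, current))  # {'DEFAULT_CURRENCY': 'EUR'}
```

Importing the resulting file then only has to re-apply that small delta in the target environment, which is what makes export/import suitable for Test-to-Production migration.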
72) Monitoring Setup Data
Use the Overview page to monitor setup data to ensure that your offerings are correctly configured.

You can use the Reports panes on the Overview page to monitor setup data. For example:
Use the System Setups list to monitor which Offerings have been enabled for deployment.
Use the Parameters By Offerings report to visually monitor the number of parameters that have been
configured.
Use the Load Plan Executions report to monitor load plans.
Use the Domain Mappings by Offerings report to monitor domain mappings.
You can drill into each report for more detailed information. For example, drilling on the Parameters bar graph
in the Parameters By Offerings report will take you to a list of parameters by offering. Drilling on the
Parameters with no values bar in the graph will open a page where you can view and edit parameters with
unassigned values.
Quiz
Which of the following statements about domains is incorrect?
a. Domains are pre-seeded dimensional values that help define business metrics.
b. Domains are configuration values that specify how source system data is loaded into the OBAW.
c. Domains are typically located in the source system.
d. If domains are not available in a source system, then they can be sourced from a flat file.
Answer: b
Quiz
Which of the following statements describe domain member mappings?
a. Data fields in a source system application are referred to as domain member mappings.
b. Domain member mappings specify how domain member data in a source system is extracted and loaded into
domain member data in the OBAW.
c. Regular domains have members consisting of a single value.
d. Band domains have members consisting of two values that specify a range.
Answer: b, c, d

Quiz
Batch edit is useful for large domains with many member mappings that require different values.
a. True
b. False

Answer: b
Batch edit is useful for large domains with many member mappings that require the same value.
Quiz
Which of the following statements describe data load parameters?
a. Data load parameters are configuration values that specify how source system data is loaded into the OBAW.
b. Data load parameters can be either global or application-specific.
c. Some parameters have a warning icon that indicates that this parameter value must be set before running a
full load.
d. Some parameters have a read-only icon that indicates this parameter value cannot be edited.
Answer: a, b, c, d
Quiz
Which of the following is not a reason for exporting setup data?
a. To make a backup of your configuration settings for security purposes
b. To automatically synchronize a target domain with values from the source domain
c. To migrate the setup data for OBIA Configuration Manager from one environment to another environment
d. To monitor setup data to ensure that your offerings are correctly configured.
Answer: b, d
73) Load Plans
A load plan is:
An executable object that comprises and organizes the child objects (referred to as steps) that carry out the
ETL process
Defined in OBIA Configuration Manager
Generated to build it as an object in the ODI repository
Executed to perform the ETL process

A load plan is an executable object that comprises and organizes the child objects (referred to as steps) that carry
out the ETL process. A load plan is made up of a sequence of several types of steps. Each step can contain
several child steps. Depending on the step type, the steps can be executed conditionally, in parallel, or
sequentially.
You define a load plan in OBIA Configuration Manager by selecting a data source and one or more fact groups.
This selection determines which steps need to be performed during the ETL process. Each fact group belongs to
a specific functional area or areas that are associated with one or more offerings, which, in turn, are related to a
data server. A transactional data source is associated with one or more data servers.
After you define the load plan, you then generate it to build it in the ODI repository. You then execute the load
plan to perform the ETL process.
74) Overview of a Load Plan Life Cycle
A load plan life cycle comprises the following phases:
A load plan life cycle comprises the following phases:
Phase 1: Define load plan. In this phase, you define load plan properties in OBIA Configuration Manager. You
select a data source and one or more fact groups, and this selection determines the steps to be performed during
the ETL process.
Phase 2: Generate load plan. In this phase, you launch a generation process from OBIA Configuration
Manager that propagates the load plan properties to the ODI repository, where the load plan is built.
Phase 3: Execute load plan. In this phase, you start a load plan run from OBIA Configuration Manager, which
executes the steps of the load plan. Executing a load plan creates a load plan instance and a first load plan run. If
a run is restarted, a new load plan run is created under this load plan instance. Each execution attempt of the load
plan instance is preserved as a different load plan run in the log.
Phase 4: Monitor load plan. In this phase, you monitor the load plan run on the Load Plan Details page of
OBIA Configuration Manager. The Load Plan Details page provides a view of the ODI repository through
Oracle Data Integrator Console.
75) Defining a Load Plan
Perform the following steps to define a load plan in Configuration Manager:
1. In the Tasks pane of OBIA Configuration Manager, select Manage Load Plans, which appears under the Load
Plans Administration heading. The Manage Load Plans page is displayed. On the Load Plans toolbar, click the
Add icon. The Create Load Plan page is displayed.
2. On the first page of the Create Load Plan series, specify the load plan name, description, type, and source.
Load plan types include:
- Source Extract (SDE): Includes only those tasks that extract from the source and load data into staging
tables.
- Source Extract and Load (SDE, SIL, and PLP): Includes all tasks to extract from the source and load the
data warehouse tables.
- Warehouse Load (SIL and PLP): Includes only those tasks that extract from the staging tables and load the
data warehouse tables.
- Domain-Only: Includes all tasks required to extract domain-related records from the source and load the data
into the domain-related tables in the Oracle Business Analytics Warehouse.
3. On the second page of the Create Load Plan series, select the fact groups that you want to include in the load
plan definition. Note that fact groups may belong to a hierarchy of fact groups. You can select only the top-
level parent fact group and not a child fact group. A load plan must contain at least one fact group, and multiple
fact groups may be selected from one or more data sources.
4. Save the load plan definition to display it in the Load Plans master list.
76) Generating a Load Plan
When you generate a load plan, the load plan is built in the ODI repository. A load plan must be generated
successfully before it can be executed. Note: Load plans must be generated serially or the process will fail. Do not
launch a second load plan generation if one is already underway. You must wait until the first generation process
completes before you launch the next generation process.
To generate a load plan:
1. In the Load Plans master list, select the load plan that you want to generate.
2. In the Load Plans toolbar, click the Generate icon.
3. Use the Generation Status field to monitor progress. Click the Refresh icon to refresh the display.
4. When the generation process completes, the Succeeded icon is displayed in the Generation Status field.
You can execute a load plan or schedule it for execution after it has been successfully generated.
77) Executing a Load Plan
You can only execute a load plan if it was successfully generated. You can have separate load plans for each
source, but load plans should not run in parallel.
To execute a load plan:
1. In the Load Plans list, select the load plan that you want to execute.
2. On the Load Plans toolbar, click the Execute icon to display the load plan dialog box.
3. Specify the following information in the load plan dialog box:
- Context: The ODI context to be used when the load plan is run (Note that Global is the only supported
context.)
- Local Agent: The ODI local agent to be used when the load plan is run
- ODI Work Repository: The name of the ODI Work Repository
4. Use the Execution Status field to monitor execution progress.
78) Monitoring a Load Plan
You can monitor a load plan run by viewing the execution status information on the Load Plan Execution
Details page of OBIA Configuration Manager.
To view load plan execution details:
1. In the Load Plans master list, select the load plan whose run you want to view.
2. On the Load Plans toolbar, click the Show Execution Status Details icon. The Oracle Data Integrator Console
login screen is displayed (not shown in the slide).
3. Log in to Oracle Data Integrator Console by entering an appropriate user ID and password.
4. ODI Console is displayed.
Within ODI Console, the navigation pane is displayed in the left pane and the Load Plan Execution page for the
selected load plan is displayed in the right pane. The Load Plan Execution page displays the load plan execution
name and load plan details. You can use the Load Plan Execution page to view detailed information about the
definition and execution status of the load plan.
79) Copying a Load Plan
Copying a load plan enables you to define a new load plan with the same fact groups as the selected load plan
definition, but with a different name and identifier.
1. In Configuration Manager, in the Load Plans list, select the load plan that you want to copy.
2. On the Load Plans toolbar, click the Copy icon to display the Copy Load Plan page.
3. On the first page of the Copy Load Plan series, modify the load plan information.
4. On the second page of the Copy Load Plan series, verify that the same fact groups are selected.
5. Save the copied load plan.
80) Stopping a Load Plan
You can stop a load plan run in ODI Console or ODI Studio.
You can stop a load plan run from the Load Plan Execution page in Configuration Manager (click Show
Execution Status Details on the toolbar) or from ODI Studio. To stop a load plan run from ODI Studio:
1. In Operator Navigator, select the running or waiting load plan run to stop from the Load Plan Executions
accordion.
2. Right-click the load plan and select Stop Normal or Stop Immediate.
- Stop Normal: In normal stop mode, the agent in charge of stopping the load plan sends a Stop Normal signal
to each agent running a session for this load plan. Each agent will wait for the completion of the current task of
the session and then end the session in error. Exception steps will not be executed by the load plan and, once all
exceptions are finished, the load plan is moved to an error state.
- Stop Immediate: In immediate stop mode, the agent in charge of stopping the load plan sends a Stop
Immediate signal to each agent running a session for this load plan. Each agent will immediately end the session
in error and not wait for the completion of the current task of the session. Exception steps will not be executed
by the load plan and, once all exceptions are finished, the load plan is moved to an error state.
3. In the Stop Load Plan dialog box (not shown here), select an agent to stop the load plan and click OK.
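The difference between the two stop modes can be reduced to one flag: whether each session is allowed to finish its current task before ending in error. The toy sketch below illustrates only that contrast; it is not ODI agent code, and the session fields are invented for the example.

```python
# Conceptual sketch of Stop Normal vs. Stop Immediate. In both modes every
# session ends in error; only the treatment of the current task differs.

def stop_sessions(sessions, mode):
    for session in sessions:
        session["waits_for_current_task"] = (mode == "normal")
        session["status"] = "ERROR"  # the session ends in error either way
    return sessions

runs = [{"name": "SDE_Session"}, {"name": "SIL_Session"}]
stop_sessions(runs, "immediate")
print(runs[0]["waits_for_current_task"])  # False
```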
81) Overview of Restarting a Load Plan
When you execute a load plan, you may need to restart the load plan after a failure.
Examples of reasons for load plan failure include:
Problem with access either to the source or target database
Failure of the ODI agent
Problem with space or storage
Problem with data
When you run ETL to load data from a source system into the Oracle Business Analytics Warehouse
(OBAW), you may need to restart the ETL load after a failure. Examples of circumstances and reasons for load
plan failure include:
Problem with access either to the source or target database due to network failure or expired or otherwise
incorrect usernames and passwords
Failure of the ODI agent
Problem with space or storage (For example, you are able to connect to the source or target database, but the
query fails to run due to lack of temp space, disk space, and so on. For files, it could be due to inadequate space
where the file needs to be placed.)
Problem with data (for example, incorrect data with lengths larger than the target column can hold, or null
values in Not Null columns)
After such a failure during ETL, to avoid restarting the entire load plan, which would
require inefficient re-runs of all ETL tasks, you must restart the load from the same point in its execution once
the cause of failure has been diagnosed and resolved.
82) Understanding Restartability Grain
When you restart a load plan after a failure, you may not be able to restart from the exact point of failure.
To maintain data integrity in the case of restart, the grain will vary depending on:
The location in the step hierarchy of the failed step
The Restart setting for the step
When you restart a load plan after a failure, you may not be able to restart from the exact point of failure,
depending on where it occurred and on dependencies between load plan steps. The goal of restartability is that
the result of the load plan execution is the same regardless of any load plan failure.
To maintain data integrity in the case of restart, the grain varies depending on the location in the step hierarchy
of the failed step and on the restart setting for the step.
Within the Steps Hierarchy, you can view the restart setting of a step in the Restart column. The default settings
for different steps in the hierarchy support data integrity in restarts:
Root steps are set to Restart from Failure if serial and Restart from Failed Children if parallel.
Substeps are set to Restart from Failure if serial and Restart from Failed Children if parallel.
Scenario steps are set to Restart from Failed Step.
The following examples highlight the implications for each type of load plan step.
Serial Load Plan Step
Serial steps are represented by a vertical icon in the load plan steps hierarchy and, by default, have a restart
setting of Restart from Failure. In a case where the load plan fails when running such a step to load a
dimension group with multiple serial substeps loading individual dimensions, the load plan, on restart, starts
from the individual substep that failed. Any successfully completed serial substeps are not run again.
Parallel Load Plan Step
Parallel steps are represented by a horizontal icon in the load plan steps hierarchy and, by default, have a restart
setting of Restart from Failed Children. In a typical run, a parallel step with five parallel substeps under it has
all five substeps executed in parallel, subject to free sessions being available. If two of those five steps complete
and then the load plan fails, all the steps that did not complete or failed would be started again when the load
plan is restarted.
Scenario Step
At the lowest order in any load plan are the scenario steps. While the parent steps, whether serial or parallel, are
used to set the dependencies, the scenario steps are those that load the tables. A scenario step in turn may have
one or more substeps, corresponding to the number of steps inside the package. In the case of a scenario-step
failure during execution, the scenario step may have multiple steps, all under the same session in the operator
log, but identified with different step numbers: 0, 1, 2, and so on. If the plan is restarted, the scenario executes
from the failed parent scenario step, re-running all substeps.
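The serial and parallel examples above can be sketched as follows. This is a rough, illustrative model of the two default restart grains only; the step names, status strings, and function are assumptions, not ODI internals.

```python
# 'serial'   models Restart from Failure: resume at the failed substep and
#            continue from there; earlier completed substeps are skipped.
# 'parallel' models Restart from Failed Children: rerun every substep that
#            did not complete successfully.

def steps_to_rerun(substeps, mode):
    names = [s["name"] for s in substeps]
    statuses = [s["status"] for s in substeps]
    if mode == "serial":
        first_failed = statuses.index("FAILED")
        return names[first_failed:]  # completed earlier steps are skipped
    if mode == "parallel":
        return [n for n, st in zip(names, statuses) if st != "DONE"]
    raise ValueError("unknown mode: " + mode)

dims = [{"name": "W_DIM_A", "status": "DONE"},
        {"name": "W_DIM_B", "status": "FAILED"},
        {"name": "W_DIM_C", "status": "NOT_RUN"}]
print(steps_to_rerun(dims, "serial"))  # ['W_DIM_B', 'W_DIM_C']
```

In both modes, completed work before the failure point is preserved, which is the goal stated above: the result of the load plan execution is the same regardless of any failure.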
Restarting a Load Plan
You can use ODI Studio or ODI Console to restart a load plan. The slide shows how to restart a load plan using
ODI Console.
1. In ODI Console, navigate to Runtime > Sessions/Load Plan Executions and select the load plan execution
that has failed.
2. Click the Restart button. The Restart button is displayed only when the selected load plan is the most recent
run of the load plan. The restart option is enabled only on the last run for a load plan. A load plan can be
restarted any number of times and each time it progresses from the last failure.
3. A new instance of the load plan is generated.
4. Monitor the load plan.
83) Restarting a Session
It is also possible to restart a session. To avoid restarting the entire load plan after a failure, which would
require inefficient re-runs of all ETL tasks, you can restart the load from the same point in its execution once the
cause of failure has been diagnosed and resolved.
You can use ODI Studio or ODI Console to restart a session. The slide shows how to restart a session using
ODI Console.
1. In ODI Console, navigate to Runtime > Sessions/Load Plan Executions and select the session that has failed.
2. Click the Restart button. The Restart button is displayed only when the selected session has failed.
84) Troubleshooting Load Plans
A load plan must be restarted when it has stopped with an error or is non-responsive.
Use the following checklist to assist in troubleshooting a non-responsive load plan:
1. Check the maximum number of sessions set to run against the agent.
2. Clean out stale sessions.
3. Check whether the agent is alive.
4. Verify that the ODI repository and the server hosting it are running and have not experienced a failure.
5. If your load plan is in error and you have verified all of the above, restart the load plan.
A load plan must be restarted when it has stopped with an error. An alternate case where restart may be required
is when a load plan is not doing anything at all (for example, when a load plan is executed and nothing has
changed after 30 minutes). Use the following checklist to assist in troubleshooting a non-responsive load plan:
1. Check the maximum number of sessions set to run against the agent. In ODI Operator, verify that the number
of sessions running is equal to the maximum. If so, then the other sessions are waiting for the running sessions
to complete. Proceed to the next step.
2. Clean out stale sessions. Stale sessions are sessions that are incorrectly left in a running state after an agent or
repository crash. If an agent crashes or loses its connection to the repository after it has started a session, it is not
able to update the status of the session in the repository, and such a session becomes stale. Until the stale session
is cleaned, it shows up as running in the repository but actually is not.
3. Check whether the agent is alive. To test the agent to see whether it is running and still has a connection to
the repository, open it in the Topology Navigator in ODI Studio and select the Test tab. If the agent test fails,
restart the agent after fixing the issue.
4. Verify that the ODI repository and the server hosting it are running and have not experienced a failure.
5. If your load plan is in error and you have verified all of the above, restart the load plan.
85) Using Mark as Complete
When you mark a load plan step as complete, it ensures that when the load plan is restarted, the marked step is
not executed.
In most cases, the load plan restart method described earlier in this lesson is the recommended approach. This
approach ensures data integrity and leaves no scope for manual error. However, at times you may want to run a
load plan step manually. For example, if a step is inserting duplicate records that are causing failure, rerunning
the step would still insert duplicates. In such a case, you may need to manually correct the data outside of the
load plan and then skip that step when you restart the load plan.
For this kind of situation, you can use the Mark as Complete option. When you mark a load plan step as
complete, it ensures that when the load plan is restarted, the marked step is not executed. It is then the
responsibility of the person making this setting to ensure that the load for that step is carried out outside the load
plan.
To mark a step as complete, right-click the step and select Mark as Complete. This can be done at the
scenario step or at any step higher than that. Marking a step complete at a higher level in the step hierarchy
means that none of the child steps under that parent step is executed upon load plan restart, even if it is
otherwise eligible. For this reason, marking a step as complete should be treated as an advanced task and must
be done only with a full understanding of its impact. There is no single recommendation that pertains in all
cases, so the setting must be done carefully and only on a case-by-case basis.
86) Running a Stand-Alone Scenario
You can fix a failed scenario step and run it individually outside the load plan to complete the load.
When you are monitoring a load plan, you may not know how to completely fix a scenario-step failure, but may
wish to use the Mark as Complete option for the failed scenario step instead of waiting for complete
resolution. This prevents a step failure from precluding an entire load plan from completing, while allowing you
to inform the ETL team about the failed scenario step so that they can work on a solution. The ETL team might
then fix the scenario and want to run it stand-alone outside the load plan to complete the load. You can use
ODI Studio or ODI Console to run a stand-alone scenario. The slide shows how to run a stand-alone scenario by
using ODI Console.
In ODI Console, navigate to Runtime > Scenarios/Load Plan Executions and select the scenario.
Click the Execute button. You can also right-click the scenario and select Execute.
As in marking a step as complete, running a stand-alone scenario should be treated as an advanced task and the
person running the scenario must be aware of the following:
A scenario run outside of a load plan by itself invokes the Table Maintenance process. This could, depending
on the setting, truncate the table before the load.
A scenario step could have many variable values set, either dynamically in the case of a refresh variable or
explicitly by overriding its value at that scenario step in the load plan. When running a scenario outside the load
plan, all the scenario variables would have only their default values. For this reason, care should be taken to set
the variables appropriately before calling a scenario from outside the load plan.
87) Managing Load Plans: Fact Groups Tab
Use this tab to view the fact groups associated with a load plan selected in the Load Plans list.
Use this tab to view the fact groups associated with a load plan selected in the Load Plans list. The fact groups
displayed may belong to a hierarchy of fact groups. You can expand the fact group node to view the hierarchy.
If a fact group is a child of another fact group in a hierarchy, it appears twice in the tree table, because it is
associated with both the functional area and the parent fact group.
88) Managing Load Plans: Data Load Parameters Tab
Use this tab to view and edit the data load parameters associated with a load plan selected in the Load Plans list.
Use this tab to view and edit the data load parameters associated with a load plan selected in the Load Plans list.
The Data Load Parameters list includes both application-specific and global parameters. Application-specific
parameters are associated with one or more fact groups included in the load plan definition. Global parameters
apply to all applications and can also be associated with specific fact groups. Key points to note about the Data
Load Parameters tab:
If a listed parameter requires a value but a value has not been assigned, the respective row in the table is
tagged with an error icon. Parameters that do not require a value (value can be null) are not tagged even if no
value has been assigned.
You can filter the list of parameters to display only the data load parameters that have no value by using the
Show drop-down list in the toolbar.
You can export and save content displayed in the table to a Microsoft Excel formatted file by clicking the
Export icon on the toolbar.
You can change a parameter value by selecting the parameter in the list, and then clicking the Edit icon on the
toolbar. The Edit Parameter Value dialog box is displayed. To change a parameter value, the user must have
been assigned a role that has the appropriate privilege.
89) Managing Load Plans: Domains and Mappings Tab
Use the Domains and Mappings tab to view and edit domains and mappings related to a load plan selected in
the Load Plans list. The domains and mappings are associated with the fact groups included in the load plan
definition. Key points to note about the Domains and Mappings tab:
If a source domain in the list contains members that have not been mapped to an appropriate warehouse
domain member, the row in the table is tagged with an error icon. Some source domain members are not
applicable, and, therefore, are not tagged even if they are unmapped.
You can filter the list of mappings to display only the domains that have unmapped source members using the
Show drop-down list in the toolbar.
You can export and save content displayed in the table to a Microsoft Excel formatted file by clicking the
Export icon on the toolbar.
You can change a domain mapping by selecting the mapping in the list and then clicking the Edit icon on the
toolbar. The Edit Domain Member Mappings dialog box is displayed. To change a domain member mapping,
the user must have been assigned a role that has the appropriate privilege.
90) Managing Load Plans: Schedules Tab
Use the Schedules tab to view, create, edit, and delete schedules for the execution of a load plan. A load plan
schedule includes the following required properties:
Context: The ODI context to be used when the load plan is run (Note that Global is the only supported context.)
Logical Agent: The ODI Agent to be used when the load plan is run
Recurrence: The frequency of occurrence
Status: The status of the schedule
Scheduled Time: The date and time the load plan is to be executed
91) Resetting the Data Warehouse
Resetting the data warehouse truncates the W_ETL_LOAD_DATES table and ensures that the subsequent load
will truncate all target tables and do a fresh full load.
Select Actions > Execute Reset Data Warehouse Scenario. This command resets the data warehouse by
truncating the W_ETL_LOAD_DATES table. This ensures that the subsequent load will truncate all target
tables and do a fresh full load.
In the Execute Reset Data Warehouse Scenario Dialog, set the Context, Agent, and ODI Work Repository and
click OK.
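The effect of the reset can be sketched with an in-memory stand-in. The two-column layout of W_ETL_LOAD_DATES below is a simplification for illustration, not the shipped schema:

```python
import sqlite3

# In-memory stand-in for the warehouse: W_ETL_LOAD_DATES records which
# target tables have been loaded, and when (simplified illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE W_ETL_LOAD_DATES (TARGET_TABLE TEXT, LAST_LOAD_DATE TEXT)")
conn.execute("INSERT INTO W_ETL_LOAD_DATES VALUES ('W_GEO_D', '2024-01-01')")

def load_mode(conn, target_table):
    """Full load if no load-date entry exists; incremental otherwise."""
    row = conn.execute(
        "SELECT LAST_LOAD_DATE FROM W_ETL_LOAD_DATES WHERE TARGET_TABLE = ?",
        (target_table,),
    ).fetchone()
    return "incremental" if row else "full"

before = load_mode(conn, "W_GEO_D")           # incremental: a load date exists
conn.execute("DELETE FROM W_ETL_LOAD_DATES")  # the "reset": truncate the table
after = load_mode(conn, "W_GEO_D")            # full: no load date, so full load
```

Because the load-date entries are gone, the next load behaves as a first-time load: it truncates the target tables and reloads them in full.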
Quiz
Which of the following does not describe a load plan?
a. Generated to build it as an object in the ODI repository
b. An executable object that comprises and organizes the child objects that carry out the ETL process
c. Queried to perform the ETL process
d. Defined in OBIA Configuration Manager
Answer: c
Quiz
A load plan must be generated successfully before it can be executed.
a. True b. False
Answer: a
Quiz
When you restart a load plan after a failure, you must restart from the exact point of failure.
a. True b. False
Answer: b
When you restart a load plan after a failure, you may not be able to restart from the exact point of failure,
depending on where it occurred and on dependencies between load plan steps. The goal of restartability is that
the result of the load plan execution is the same regardless of any load plan failure.
Quiz
Which of the following should you use to troubleshoot a non-responsive load plan?
a. Verify that the ODI repository and the server hosting it are running and have not experienced a failure.
b. Check whether the agent is alive.
c. Verify that Scenario steps are set to Restart from Failed Child.
d. Check the maximum number of sessions set to run against the agent.
e. Clean out stale sessions.
Answer: a, b, d, e
Quiz
Which of the following are alternate options for restarting load plans?
a. Using Mark as Incomplete.
b. Running a stand-alone session.
c. Using Mark as Complete.
d. Running a stand-alone scenario.
Answer: c, d
92) Customization
In Oracle Business Intelligence Applications, customization is defined as changing the preconfigured behavior
to enable you to analyze new information in your business intelligence dashboards.
To accommodate new data, preconfigured OBIA objects can be modified:
Oracle Business Analytics Warehouse (OBAW) star schema objects
Oracle Data Integrator (ODI) ETL metadata objects
OBIA metadata repository
OBIA presentation objects (dashboards)
The focus of this lesson is the customization of OBAW star schema objects and ODI ETL metadata objects.
Oracle Business Intelligence Applications ship with a collection of preconfigured objects, including star schema
tables, a metadata repository, presentation objects (dashboards), and Oracle Data Integrator (ODI) ETL metadata.
These preconfigured objects may not completely accommodate the analysis needs of a business. To
accommodate new data, the preconfigured objects can be modified.
Customization is the process by which the Oracle Business Analytics Warehouse (OBAW) objects, Oracle
Business Intelligence Applications metadata and presentation objects, and ODI ETL metadata objects, are
modified to accommodate new data for analysis.
This lesson describes concepts and techniques for customizing the ETL functionality in Oracle Business
Intelligence Applications. This includes the customization of OBAW star schema objects and ODI ETL
metadata objects. These objects are customized using ODI Studio.
93) Customization Categories
Customizations are grouped into categories based on data source type and modification type.
The type of data source that you have determines the type of customization that you can do. Data sources can be
one of the following types:
Packaged applications (for example, Oracle E-Business Suite), which use prepackaged adapters
Nonpackaged data sources, which use the Universal adapter
Customizations are grouped into the following categories:
Category 1: In a Category 1 customization, you add additional columns from source systems that have
prepackaged adapters and load the data into existing Oracle Business Analytics Warehouse tables.
Category 2: In a Category 2 customization, you use prepackaged adapters to add new fact or dimension tables
to the Oracle Business Analytics Warehouse. Category 2 customizations normally require that you build new
source-dependent extract (SDE) and source-independent load (SIL) mappings.
Category 3: In a Category 3 customization, you use the Universal adapter to load data from sources that do not
have prepackaged adapters.
This lesson and the next two lessons focus on Category 1 and Category 2 customizations. For more information
about Category 3 customizations, refer to the Oracle Fusion Middleware Administrator's Guide for Oracle
Business Intelligence Applications.
94) Category 1 Customization: Overview
Category 1 customizations add additional columns from source systems that have prepackaged adapters and
load the data into existing Oracle Business Analytics Warehouse tables.
Category 1 customizations involve extracting additional columns from source systems for which prepackaged
adapters are included (for example, Oracle E-Business Suite) and loading the data into existing Oracle Business
Analytics Warehouse tables.
For Category 1 customizations, data can also come from nonpackaged sources, but this assumes that the sources
have already been mapped with a Universal adapter and only need to be extended to capture additional columns.
In the example in the slide, you would use the DESCRIPTION column in the HZ_LOCATIONS table in the
source system to capture data related to locations. You would then run a load plan to load the data into a custom
column, X_DESCRIPTION, in the W_GEO_DS dimension staging table in the data warehouse, and
ultimately into a custom column, X_DESCRIPTION, in the W_GEO_D dimension table in the data warehouse.
95) Category 1 Customization: Extending Mappings and Tables
Existing mappings and tables are extensible.
Sample placeholders demonstrate how to pass and store additional data.
OBIA provides a methodology to extend preconfigured mappings to include additional columns and load the
data into existing tables.
Do not modify existing logic or columns.
Copy objects to custom folders before modifying.
In order to see additional columns in the Oracle Business Analytics Warehouse, the columns must first be passed
through the ETL process. The existing mappings and tables are extensible. Oracle Business Intelligence
Applications provides a methodology to extend preconfigured mappings to include these additional columns and
load the data into existing tables.
Oracle Business Intelligence Applications recognizes two types of customization: extension and modification.
The supported extension logic allows you to add to existing objects. For example, you can extract
additional columns from a source, pass them through existing mappings, and populate new columns added to an
existing table.
Generally, Oracle Business Intelligence Applications does not allow you to modify existing logic or columns.
You should copy existing logic to custom folders, and then modify it. You should not change existing
calculations to use different columns, and you should not remap existing columns to be loaded from different
sources.
96) Category 1 Customization: Safe Path
Most datastores have a single placeholder column named X_CUSTOM.
Each ETL task has mapping expressions to populate this column, which marks a safe path through the ETL
task.

Most datastores have a single placeholder column named X_CUSTOM. Each ETL task has mapping expressions
to populate this column, which marks a safe path through the ETL task. These serve as templates for
customizing ODI datastores and interfaces. When creating new custom columns, follow the naming convention
of including the X_ prefix to help distinguish custom columns.
In the figure in the slide, the preconfigured logic is shaded in gray. You should not modify anything contained
within these objects. You should add customizations to existing objects rather than creating new packages and
interfaces, which allows them to run parallel to the existing logic.
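The X_ naming convention can be checked mechanically. The helper below is hypothetical (not part of OBIA) and simply illustrates the rule:

```python
def is_custom_column(name: str) -> bool:
    """Return True if a warehouse column follows the X_ custom-column
    naming convention (X_ prefix plus a non-empty suffix)."""
    return name.startswith("X_") and len(name) > len("X_")
```

For example, is_custom_column("X_DESCRIPTION") is True, while the shipped DESCRIPTION column is not flagged as custom.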
97) Category 1 Customization: Steps
1. Create custom SDE and SIL folders in ODI Studio.
2. Create a version of the task folder to be customized.
3. Copy the preconfigured task folder to the custom folder.
4. Create versions of the copied task folder.
5. Create versions of the model.
6. Edit the target datastores.
7. Map the new column in the interface.
8. Generate DDL scripts.
9. Execute the DDL procedure.
10. Modify the scenario naming convention.
11. Generate scenarios.
12. Generate a load plan.
13. Open the generated load plan.
14. Update the load plan step.
15. Execute the load plan.
16. Verify that the data is loaded.

The most common reason for extending the Oracle Business Analytics Warehouse is to extract existing columns
from a source system and map them to an existing Oracle Business Analytics Warehouse table (either fact or
dimension). This type of change typically requires you to extend the interfaces within an SIL package. If the
data is coming from a packaged source, then you will also need to extend the interfaces within an
appropriate SDE adapter package. If the data is coming from a nonpackaged source, then you must use a
Universal adapter package. If an appropriate package does not already exist, you will need to create a
Universal adapter package with interfaces.
This slide and the next slide list the typical steps needed for a Category 1 customization. The steps are provided
here to give you a high-level overview of the Category 1 customization process. Do not be concerned with the
step details at this point. These steps are covered in more detail in Lesson 11: Building a Category 1
Customization.
98) Category 2 Customization: Overview
Category 2 customizations use prepackaged adapters to add new fact or dimension tables to the Oracle
Business Analytics Warehouse.
Category 2 customizations normally require that you build new source-dependent extract (SDE) and source-
independent load (SIL) objects.
In a Category 2 customization, you use prepackaged adapters to add new fact or dimension tables to the Oracle
Business Analytics Warehouse. Category 2 customizations normally require that you build new SDE and SIL
mappings.
A typical Category 2 customization involves building entirely new tables that will be loaded with data from a
source table that is not already extracted from. For example, you might want to create a new PARTNER
dimension table. In this case, you create new dimension and staging tables as well as new extract and load ETL
mappings.
In the example in the slide, you would use the PARTNER table in the source system to capture data related to
partners. You would then load the data into a new dimension staging table and subsequently a new dimension
table in the data warehouse.
99) Category 2 Customization: Required Columns
For staging tables:
INTEGRATION_ID, DATASOURCE_NUM_ID
For dimension and fact tables:
INTEGRATION_ID, DATASOURCE_NUM_ID, ROW_WID, ETL_PROC_WID

When you create a new dimension or fact table, use the required system columns that are part of each of the
Oracle Business Analytics Warehouse tables to maintain consistency and enable you to reference existing table
structures.
For staging tables, the following columns are required:
INTEGRATION_ID: Stores the primary key or the unique identifier of a record as in the source table
DATASOURCE_NUM_ID: Stores the data source from which the data is extracted
For dimension and fact tables, the required columns are the INTEGRATION_ID and
DATASOURCE_NUM_ID columns as well as the following:
ROW_WID: A sequence number generated during the ETL process, which is used as a unique identifier for the
Oracle Business Analytics Warehouse
ETL_PROC_WID: Stores the ID of the ETL process information
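In DDL form, a new dimension carrying the required system columns might be sketched as follows. The WC_PARTNER_D table name (following the WC_ prefix convention for new tables, and the PARTNER example above) and the PARTNER_NAME business column are hypothetical:

```python
import sqlite3

# Hypothetical custom dimension table carrying the required OBAW
# system columns alongside its own business columns.
ddl = """
CREATE TABLE WC_PARTNER_D (
    ROW_WID            INTEGER PRIMARY KEY,  -- surrogate key generated during ETL
    INTEGRATION_ID     TEXT NOT NULL,        -- source-system primary key
    DATASOURCE_NUM_ID  INTEGER NOT NULL,     -- identifies the source instance
    ETL_PROC_WID       INTEGER NOT NULL,     -- ID of the ETL process run
    PARTNER_NAME       TEXT,                 -- example business column
    UNIQUE (INTEGRATION_ID, DATASOURCE_NUM_ID)
)
"""
conn = sqlite3.connect(":memory:")
conn.execute(ddl)
columns = [row[1] for row in conn.execute("PRAGMA table_info(WC_PARTNER_D)")]
```

A staging counterpart would carry only INTEGRATION_ID and DATASOURCE_NUM_ID from this set.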
100) Category 2 Customization: Steps
1. Create new tables in the OBAW.
2. Import the custom tables (datastores) into ODI.
3. Move the datastores to the appropriate submodels.
4. Set properties for the datastores.
5. Create an ODI sequence for the custom dimension.
6. Create custom SDE and SIL tasks to load the dimension.
7. Extend the fact staging datastore.
8. Extend the fact datastore.
9. Add a foreign key constraint to the fact table.
10. Add a non-unique bitmap index to the fact table.
11. Create custom SDE and SIL tasks to load the fact table.
This slide lists the typical steps needed for a Category 2 customization where a new dimension table is added to
the OBAW. The steps are provided here to give you a high-level overview of the Category 2 customization
process. Do not be concerned with the step details at this point. These steps are covered in more detail in Lesson
12: Building a Category 2 Customization.
101) Additional Customization Considerations
Understanding the DATASOURCE_NUM_ID column
Understanding the impact of patches on customizations
Using custom folders
Applying an update strategy
Creating indices
Applying naming conventions
Using Configuration Manager
102) Understanding the DATASOURCE_NUM_ID Column
DATASOURCE_NUM_ID is part of the unique user key for all tables in the warehouse schema.
DATASOURCE_NUM_ID permits rows to be loaded in the same warehouse tables from different sources,
provided that the column is given a different value for each source.
The tables in the Oracle Business Analytics Warehouse schema have DATASOURCE_NUM_ID as part of their
unique user key. While the transactional application normally ensures that a primary key is unique, it is possible
that a primary key is duplicated between transactional systems.
To avoid problems when loading this data into the data warehouse, uniqueness is ensured by including the
DATASOURCE_NUM_ID as part of the user key. This means that the rows can be loaded in the same data
warehouse tables from different sources if this column is given a different value for each data source.
DATASOURCE_NUM_ID is maintained in ODI. Make sure that each source system has a unique value
assigned to it. It is possible to have multiple instances of the same source system (for example, a U.S.-based and
a European-based Oracle transactional database both loading into the same data warehouse). The two different
transactional database systems should be assigned different DATASOURCE_NUM_ID values in ODI.
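A minimal sketch of this composite user key, with the table trimmed to illustrative columns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The user key combines the source-system key with the data source number,
# so identical primary keys from two systems do not collide.
conn.execute("""
    CREATE TABLE W_GEO_D (
        INTEGRATION_ID    TEXT NOT NULL,
        DATASOURCE_NUM_ID INTEGER NOT NULL,
        NAME              TEXT,
        UNIQUE (INTEGRATION_ID, DATASOURCE_NUM_ID)
    )
""")
# Same source primary key ('1001') from a U.S. and a European instance:
conn.execute("INSERT INTO W_GEO_D VALUES ('1001', 1, 'US row')")
conn.execute("INSERT INTO W_GEO_D VALUES ('1001', 2, 'EU row')")
row_count = conn.execute("SELECT COUNT(*) FROM W_GEO_D").fetchone()[0]

# The same key from the SAME source, however, violates the user key:
try:
    conn.execute("INSERT INTO W_GEO_D VALUES ('1001', 1, 'duplicate')")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
```

This is why each source instance must be assigned its own DATASOURCE_NUM_ID value in ODI.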
103) Understanding the Impact of Patches on Customizations
In some cases, you must reapply a customization to an object that has been patched.
ODI's version-compare utility identifies the changes introduced by the patch.
You only need to reapply customizations to mappings that have been changed by the patch.

In some cases, you must reapply a customization to an object that has been patched. For example, if you
install an Oracle Business Intelligence Applications patch that modifies the Supply Chain and Order
Management application, you might need to manually reapply customizations that you have made to the Supply
Chain and Order Management application.
As part of customizing an ETL task (including interfaces and packages under a specific task folder), you copy the
task folder to be customized, version the original, and version the copy. Any patches are applied to the current
version of the original task. Leverage ODI's version-compare utility to identify the changes introduced by the
patch.
The copy is also versioned so that any changes introduced can be isolated. After a patch, compare any changes
with those introduced by the patch and verify that there is no conflict, and then manually apply the same changes
introduced by the patch to the customized ETL tasks.
A patch only installs changed repository objects, not the entire ODI Work Repository. Therefore, you only need
to reapply customizations to mappings that have been changed by the patch.
For example, if a patch only modifies the Supply Chain and Order Management application, you only need to
manually reapply customizations that you have made to the Supply Chain and Order Management
application. Customizations in other applications would not be affected by the patch.
104) Using Custom Folders
On the Designer tab in ODI Studio, select BI Apps Project > Mappings > New Sub-Folder to create custom
SDE and SIL folders.
You use ODI Studio to customize ETL objects. If you want to make changes to preconfigured ODI objects,
create a custom folder and make the changes in it. Do not change objects in any of the preconfigured folders
unless explicitly directed by Oracle. This is because preconfigured folders and the objects within them may be
overwritten in future upgrades. Using custom folders is not required, but it is strongly recommended and
considered a best practice to make identifying customized content easier.
The preconfigured ODI repository does not include any custom folders. You must create your own. You should
create a custom folder for each prepackaged SDE Adaptor folder that you have deployed that will have
customizations. In the example in the slide, a custom folder is created for the SDE_ORA11510_Adaptor folder.
You should also create a separate custom folder for customizations that you want to make to objects in the
SILOS folder. Do not store customized SDE and SIL objects in the same folder. In the example in the slide, a
CUSTOM_SILOS folder is created to hold custom SIL objects.
105) Applying an Update Strategy
Design a custom process to detect new and modified records.
The process should be designed to load only changed data.
If data is loaded without an incremental process, the previously loaded data will be updated again.
Example logic in preconfigured SIL mappings:
1. A mapping looks up destination tables based on INTEGRATION_ID and DATASOURCE_NUM_ID.
2. If the combination exists, ROW_WID is returned and the record is updated.
3. If the combination does not exist, the lookup returns NULL and a record is inserted.
For loading new fact and dimension tables, design a custom process on the source side to detect the new and
modified records. The SDE process should be designed to load only the changed data (new and modified). If the
data is loaded without the incremental process, the data that was previously loaded will be erroneously updated
again.
For example, the logic in the preconfigured SIL mappings looks up the destination tables based on the
INTEGRATION_ID and DATASOURCE_NUM_ID and returns the ROW_WID if the combination exists, in
which case it updates the record. If the lookup returns NULL, it inserts the record instead. In some cases, last
update date(s) stored in target tables are also compared in addition to the columns specified above to determine
insert or update. Look at the similar mappings in the preconfigured folder for more details.
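The lookup-then-insert-or-update logic can be sketched as follows. This is a simplified stand-in for the SIL behavior, not the shipped ODI code, and the column set is trimmed:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE W_GEO_D (
        ROW_WID           INTEGER PRIMARY KEY AUTOINCREMENT,
        INTEGRATION_ID    TEXT NOT NULL,
        DATASOURCE_NUM_ID INTEGER NOT NULL,
        DESCRIPTION       TEXT
    )
""")

def upsert(conn, integration_id, datasource_num_id, description):
    """Look up the target on (INTEGRATION_ID, DATASOURCE_NUM_ID);
    update the row if found, insert a new one otherwise."""
    row = conn.execute(
        "SELECT ROW_WID FROM W_GEO_D "
        "WHERE INTEGRATION_ID = ? AND DATASOURCE_NUM_ID = ?",
        (integration_id, datasource_num_id),
    ).fetchone()
    if row:  # combination exists: update, reusing the existing ROW_WID
        conn.execute("UPDATE W_GEO_D SET DESCRIPTION = ? WHERE ROW_WID = ?",
                     (description, row[0]))
        return "update"
    # combination not found: insert a new record
    conn.execute(
        "INSERT INTO W_GEO_D (INTEGRATION_ID, DATASOURCE_NUM_ID, DESCRIPTION) "
        "VALUES (?, ?, ?)",
        (integration_id, datasource_num_id, description))
    return "insert"

first = upsert(conn, "1001", 1, "initial load")        # new record: insert
second = upsert(conn, "1001", 1, "changed in source")  # existing: update
count = conn.execute("SELECT COUNT(*) FROM W_GEO_D").fetchone()[0]
```

Without an incremental filter on the source side, every source row would pass through this logic on each run and previously loaded rows would be updated again needlessly.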
106) Creating Indices
Define indices to improve query performance. Staging tables typically do not require indices.
Create indices on all columns that the ETL uses for dimensions and facts. For example:
ROW_WIDs of dimensions and facts
INTEGRATION_ID
DATASOURCE_NUM_ID
Flags
Carefully consider on which columns to put filter conditions.
Inspect preconfigured objects for guidance.
Register indices in the appropriate ODI model.
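As an illustration, index DDL for a custom fact table might look like the following. The table and index names are hypothetical; in a real deployment such indices are also registered in the ODI model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE WC_PARTNER_F (
        ROW_WID           INTEGER,
        INTEGRATION_ID    TEXT,
        DATASOURCE_NUM_ID INTEGER,
        PARTNER_WID       INTEGER,  -- reference to the partner dimension
        DELETE_FLG        TEXT
    )
""")
# Index the columns the ETL uses for lookups and joins.
conn.execute("CREATE INDEX WC_PARTNER_F_U1 "
             "ON WC_PARTNER_F (INTEGRATION_ID, DATASOURCE_NUM_ID)")
conn.execute("CREATE INDEX WC_PARTNER_F_F1 ON WC_PARTNER_F (PARTNER_WID)")
index_names = [r[1] for r in conn.execute("PRAGMA index_list(WC_PARTNER_F)")]
```

Staging tables, which are truncated and reloaded, typically go without indices as the note above says.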
107) Using Naming Conventions
Name all the newly created tables with the prefix WC_.
This helps to visually isolate the new tables from the shipped tables.
Keep good documentation of the customizations done. This helps when upgrading your data warehouse.

108) Using Configuration Manager


After making changes to ETL objects in ODI Studio, you can use OBIA Configuration Manager to modify,
generate, execute, and monitor load plans that contain the customized objects.
Quiz
Identify the correct statement about a Category 1 customization:
a. You use prepackaged adapters to add new fact or dimension tables to the Oracle Business Analytics
Warehouse.
b. You use the Universal adapter to load data from sources that do not have prepackaged adapters.
c. You add additional columns from source systems that have prepackaged adapters and load the data into
existing Oracle Business Analytics Warehouse tables.
Answer: c
Quiz
Identify the incorrect statement about the safe path in a Category 1 customization:
a. Each ETL task has mapping expressions to populate the X_CUSTOM column, which marks a safe path
through the ETL task.
b. Add customizations to existing objects rather than creating new packages and interfaces.
c. All new custom columns must be named X_CUSTOM.
d. When creating new custom columns, include the X_ prefix to help distinguish custom columns.
Answer: c

Quiz
Which of the following is not a required column for dimension and fact tables:
a. INTEGRATION_ID
b. DATASOURCE_NUM_ID
c. W_ETL_RUN_S
d. ROW_WID
e. ETL_PROC_WID

Answer: c

Quiz
Which of the following statements about OBAW customization are not true?
a. Leverage Configuration Manager's version-compare utility to identify changes introduced by a patch.
b. ODI objects are customized using ODI Studio.
c. ODI objects are customized using ODI Configuration Manager.
d. The most common reason for extending the OBAW is to extract existing columns from a source system and
map them to an existing OBAW table.
e. Leverage ODI's version-compare utility to identify changes introduced by a patch.
Answer: a, c
109) Category 1 Customization: Adding Columns to Existing Fact or Dimension Tables
In a Category 1 customization, you add additional columns from source systems that have prepackaged adapters
and load the data into existing Oracle Business Analytics Warehouse tables.

A Category 1 customization is one of the three customization types that were initially presented in the previous
lesson, titled "Customizing the Oracle Business Analytics Warehouse." Customizations are categorized based
on the data source (packaged or nonpackaged) and the desired Oracle Business Analytics Warehouse (OBAW)
modification (additional columns, tables, or rows). Category 1 customizations involve extracting additional
columns from source systems that are already mapped and loading the data into existing data warehouse tables.
110) Category 1 Customization: Scenario
Scenario: Extract data from a column in a table in the source database and load it into a custom column in a
dimension table in the OBAW.
Example:
Use the DESCRIPTION column in the HZ_LOCATIONS table in the source system to capture data related to
locations.
Load the data into a custom column, X_DESCRIPTION, in the W_GEO_D dimension table in the data
warehouse.
This slide presents the scenario for a Category 1 customization used in this lesson. Data is extracted from a table
in a source transactional database and loaded into a custom column in a dimension table in the Oracle Business
Analytics Warehouse. This scenario is used throughout the lesson and associated practices.
111) Category 1 Customization Steps
1. Create custom SDE and SIL folders in ODI Studio.
2. Create a version of the task folder to be customized.
3. Copy the preconfigured task folder to the custom folder.
4. Create versions of the copied task folder.
5. Create versions of the model.
6. Edit the target datastores.
7. Map the new column in the interface.
8. Generate DDL scripts.
9. Execute the DDL procedure.
10. Modify the scenario naming convention.
11. Generate scenarios.
12. Generate a load plan.
13. Open the generated load plan.
14. Update the load plan step.
15. Execute the load plan.
16. Verify that the data is loaded.

1. Create Custom SDE and SIL Folders in ODI Studio.


On the Designer tab in ODI Studio, select BI Apps Project > Mappings > New Sub-Folder to create custom
SDE and SIL folders.

If you want to make changes to preconfigured ODI objects, you must create a custom folder and make the
changes in it. Do not change objects in any of the preconfigured folders unless explicitly directed by Oracle.
This is because preconfigured folders and the objects within them may be overwritten in future upgrades.
The preconfigured ODI repository does not include any custom folders. You must create your own. You should
create a custom folder for each prepackaged SDE Adaptor folder you have deployed that will have
customizations. In the example in the slide, a custom folder is created for the SDE_ORA11510_Adaptor folder.
You should also create a separate custom folder for customizations that you want to make to objects in the
SILOS folder. Do not store customized SDE and SIL objects in the same folder.
The customization steps in the slides that follow use SDE objects as examples. The steps apply to SIL objects as
well.
2. Create a Version of the Task Folder to be Customized.
Before you begin customization, create a version of the preconfigured task folder to be customized.

Before you begin customization, enable versioning for the preconfigured task folder to be customized. The
version comment should indicate that this is the base (original) version of the task. Subsequent patches applied
to this task in the future would require increasing the version in the comment so that it can be compared to the
original task to identify any changes. To create a version, right-click the task folder and select Version > Create
Version to open the Version dialog box. In the example in the slide, a version number and description have been
enabled for the SDE_ORA_GeographyDimension_HZLocations task folder.
3. Copy the Preconfigured Task Folder to the Custom Folder.
Before you begin customization, duplicate the task folder to be customized by copying it to the custom folder.

Duplicate the task folder to be customized by copying it. Paste the copied task folder to the custom folder, and
rename it by removing the 'Copy of' prefix.
In the example in the slide, you copy the preconfigured task folder,
SDE_ORA_GeographyDimension_HZLocations, in the preconfigured adaptor folder,
SDE_ORA11510_Adaptor, and then paste the copied task folder to the
CUSTOM_SDE_ORA11510_Adaptor folder. This creates a new task folder named Copy of
SDE_ORA_GeographyDimension_HZLocations, which you rename to
SDE_ORA_GeographyDimension_HZLocations.
4. Create Versions of the Copied Task Folder.
Before you begin customization, create versions of the copied task folder to be customized.
Before you begin customization, enable versioning of the copied task folder to be customized. The version
comment should indicate that this is the original version. This versioning enables comparison of the customized
task to a copy of the original version to determine all changes that have been introduced.
Create another version of the copied task folder so that it can be compared to the original task to identify any
changes. The version comment should indicate that this is the customized version. To enable versioning, right-
click the task folder and select Version > Create Version to open the Version dialog box.
In the example in the slide, a version number and description have been enabled for both the original version and
the customized version of the SDE_ORA_GeographyDimension_HZLocations task folder in the
CUSTOM_SDE_ORA11510_Adaptor folder.
5. Create Versions of the Model.
Before you begin customization, create versions of the model in which the datastores to be customized exist.
Version the model in which the datastores to be customized exist. Submodels and datastores cannot be
versioned. The version comment should indicate that this is the base or original version.
Create another version of the model, with a version comment indicating that this is where customizations are
introduced. The models can now be compared to show differences. If the model ever needs to be patched, the
model should be versioned again so that the patched version can be compared to the custom and original
versions.
The example in the slide shows navigation to ODI Designer > Models > OBIA (folder) > OBIA (model). A
version number and description have been enabled for both the original version and the customized version of
the OBIA model. This model has the datastores that you will customize in a later step in this lesson.
6. Edit the Target Datastores.
Before you apply customizations to a task, edit the target datastores to include the required column.

Before you apply customizations to a task, edit the target datastores to include the required column. In the
example in the slide, you navigate to ODI Designer > Models > OBIA (folder) > OBIA (model) > Dimension
Stage and then edit the W_GEO_DS dimension staging table to include the custom column, X_DESCRIPTION.
7. Map the New Column in the Interface.
Some task folders have multiple interfaces, each of which must be customized with new mappings. The example
in the slide shows how to map the custom column, X_DESCRIPTION, in the W_GEO_DS interface for the
SDE_ORA_GeographyDimension_HZLocations task folder. This assumes that a similar mapping has already
been completed for the W_GEO_DS_SQ_HZ_LOCATIONS interface, which contains the source,
SQ_HZ_LOCATIONS, for the W_GEO_DS interface.
To map the custom column in the W_GEO_DS interface, perform the following steps:
1. Expand Projects > BI Apps Project > Mappings > CUSTOM_SDE_ORA11510_Adaptor >
SDE_ORA_GeographyDimension_HZLocations > Interfaces.
2. Double-click the W_GEO_DS interface to open it in the editor.
3. Click the Mapping tab at the bottom of the editor.
4. Drag the DESCRIPTION column from the HZ_LOCATIONS source datastore to the custom column,
X_DESCRIPTION, in the W_GEO_DS target datastore.
Notice that the mapping indicates both the table and the column from which it comes:
SQ_HZ_LOCATIONS.DESCRIPTION.
8. Generate DDL Scripts.
Generate DDL scripts to synchronize changes between the ODI model and the data warehouse.
When a diagram or data model is designed or modified in ODI, it is necessary to synchronize changes between
the ODI model and the data warehouse. This operation is performed with DDL scripts.
For example, you need to synchronize the X_DESCRIPTION custom column that you created in the BI
Applications ODI model with the physical tables in the data warehouse. The example in the slide shows the
steps to synchronize the X_DESCRIPTION column with the W_GEO_DS table:
1. To generate DDL scripts, right-click the OBIA model folder and select Generate DDL.
2. When processing completes, the Generate DDL Editor appears with the differences detected.
3. Select the check box in the Synchronization column for the objects that you want to synchronize. The example
in the slide shows a check box in the Synchronization column for X_DESCRIPTION.
4. When you click OK to generate the DDL script, ODI generates the DDL scripts in a procedure, opens the
procedure in the editor in the right pane, and stores the procedure in a designated location.
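The generated procedure wraps ordinary DDL. A minimal sketch of what it would issue for this example follows; the actual script is produced by ODI, and the VARCHAR2(80) data type shown here is an assumption:

```sql
-- Sketch of the DDL that the generated procedure would run for the
-- custom column (data type assumed, not confirmed by the source).
ALTER TABLE W_GEO_DS ADD (X_DESCRIPTION VARCHAR2(80));
```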
9. Execute the DDL Procedure.
Execute the DDL procedure to run the DDL scripts.
Navigate to the location in Designer where the DDL procedure is stored and execute the procedure. Use the
Operator tab to monitor the procedure and verify that it completes successfully. Use a SQL query tool to
confirm that the physical tables are modified as expected. In the example in the slide, the
X_DESCRIPTION custom column has been added to both the W_GEO_DS dimension staging table and the
W_GEO_D dimension table in the data warehouse.
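One way to confirm the change with a SQL query tool is to check the Oracle data dictionary, for example:

```sql
-- Verify that the custom column now exists in both the staging table
-- and the dimension table.
SELECT table_name, column_name, data_type
FROM   user_tab_columns
WHERE  column_name = 'X_DESCRIPTION'
AND    table_name IN ('W_GEO_DS', 'W_GEO_D');
```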

10. Modify the Scenario Naming Convention.
Prior to generating scenarios, ensure that the Scenario Naming Convention user parameter has a value of
%FOLDER_NAME(2)%_%OBJECT_NAME%. This ensures that generated scenarios are easily identified as
custom scenarios because the custom folder is included in the naming convention.
1. In ODI Studio, select ODI > User Parameters.
2. Scroll to locate the Scenario Naming Convention parameter.
3. Change the value from the default, %OBJECT_NAME%, to %FOLDER_NAME(2)%_%OBJECT_NAME%.
Newly generated scenarios will now be named according to the modified Scenario Naming Convention user
parameter.
11. Generate Scenarios.
Generate scenarios for any new custom objects.
When a component is finished and tested, you can generate the scenario corresponding to its actual state.
Generating a scenario for an object compiles the code for this object for deployment and execution in a
production environment. When a set of packages, interfaces, procedures, and variables grouped under a project
or folder is finished and tested, you can generate a group of scenarios.
Generate scenarios for any new custom adaptors, using the option to generate the scenario as if all underlying
objects were materialized. The scenario will be generated reflecting the custom adaptor name. In the future, if
you make changes to any of the interfaces or the package, you can either regenerate the existing scenario or
generate a new scenario.
The example in the slide shows how to create a group of scenarios for the objects in the
SDE_ORA_GeographyDimension_HZLocations custom task folder.
In the Scenario Generation dialog box, select the Creation generation mode. This creates, for each object, a new
scenario with the same name as the last scenario version and with an automatically incremented version number.
If no scenario exists for an object, a scenario named after the object with version number 001 is created. Select
Generate scenario as if all underlying objects are materialized. In the Objects to Generate section, select
Packages, Interfaces, and Procedures. Scenarios are generated for all underlying objects with the naming
convention set in User Parameters.
12. Generate a Load Plan.
Use the techniques you learned in Lesson 9, Managing Load Plans, to create and generate a load plan. In the
example in the slide, the GL Revenue SDE Custom load plan has been generated successfully.
13. Open the Generated Load Plan.
Open the generated load plan in ODI Studio Designer and locate the step that you want to update.
Open the generated load plan in the Designer editor and use the search field to locate the step that you want to
update. In this example, navigate to Designer > Load Plans and Scenarios > Generated Load Plans and open the
GL Revenue SDE Custom load plan. Use the Search field to locate the step
SDE_ORA11510_ADAPTOR_SDE_ORA_GEOGRAPHYDIMENSION_HZLOCATIONS.
14. Update the Load Plan Step.
Update the load plan step in the generated load plan to reference the custom scenario.
15. Execute the Load Plan.
Use the techniques you learned in Lesson 9, Managing Load Plans, to execute and monitor the load plan. In
the example in the slide, the GL Revenue SDE Custom load plan has been executed successfully.
16. Verify That the Data Is Loaded.
Use a SQL query tool to verify that the data loaded as expected.

Quiz
Which of the following statements about customization is not true?
a. If you want to make changes to preconfigured ODI objects, you must create a custom folder and make the
changes in it.
b. Preconfigured folders and the objects within them are typically not overwritten in future upgrades.
c. The preconfigured ODI repository does not include any custom folders.
d. You should create a custom folder for each prepackaged SDE Adaptor folder that you have deployed that will
have customizations.
Answer: b
Quiz
Which of the following should be versioned before beginning customization?
a. The task folder to be customized
b. The copied task folder
c. The model
d. The target datastores
Answer: a, b, c
Quiz
You should not create a separate custom folder for SILOS customizations. Store customized SDE and SIL
objects in the same folder.
a. True
b. False
Answer: b
You should create a separate custom folder for customizations that you want to make to objects in the
SILOS folder. Do not store customized SDE and SIL objects in the same folder.

Quiz
Generate DDL scripts to:
a. Edit the target datastores
b. Create mappings in interfaces
c. Synchronize changes between the ODI model and the data warehouse
d. Modify the scenario naming convention

Answer: c
112) Category 2 Customization: Adding New Fact or Dimension Tables
In a Category 2 customization, you use prepackaged adapters to add new fact or dimension tables to the
Oracle Business Analytics Warehouse.
Category 2 customizations typically require that you create new source-dependent extract (SDE) and source-
independent load (SIL) objects.
A Category 2 customization is one of the three customization types that were initially presented in Lesson 10,
Customizing the Oracle Business Analytics Warehouse. Customizations are categorized based on the data
source (packaged or nonpackaged) and the desired Oracle Business Analytics Warehouse (OBAW)
modification (additional columns, tables, or rows). Category 2 customizations involve using prepackaged
adapters to add new fact or dimension tables to the Oracle Business Analytics Warehouse.
113) Category 2 Customization Scenario
Scenario: Extract data from a table in the source database and load it into a new dimension table in the data
warehouse.
Example:
Use the PARTNER table in the source system to capture data related to partners.
Load the data into a new dimension staging table and subsequently a new dimension table in the data
warehouse.
This slide presents a scenario and example for the Category 2 customization used in this lesson. Data is extracted
from a table in the source transactional database, loaded into a new dimension staging table, and ultimately
loaded into a new dimension table in the Oracle Business Analytics Warehouse. This example is used
throughout the lesson and associated practices.
114) Category 2 Customization Steps
1. Create new tables in the OBAW.
2. Import the custom tables into ODI.
3. Move the imported tables to the appropriate submodels.
4. Set properties for the datastores.
5. Create an ODI sequence for the dimension.
6. Create custom SDE and SIL tasks.
7. Extend the fact staging datastore.
8. Extend the fact datastore.
9. Add a foreign key constraint to the fact table.
10. Add a non-unique bitmap index to the fact table.
11. Modify an SDE task to load the fact staging table.
12. Create a custom SIL task.
1. Create New Tables in the OBAW.
Create a new dimension staging table and a new dimension table in the data warehouse.
Use the prefix WC_ to help distinguish custom tables from tables provided by Oracle.
Include the required system columns.
In this example, you run a DDL script to manually create a new dimension table, WC_PARTNER_D, and a new
dimension staging table, WC_PARTNER_DS, in the data warehouse based on the standard data warehouse
structure with the required system columns.
When creating a new custom table, use the prefix WC_ to help distinguish custom tables from tables provided by
Oracle as well as to avoid naming conflicts in case Oracle later releases a table with a similar name.
Notice in the example in the slide that the dimension staging table contains the required columns:
DATASOURCE_NUM_ID and INTEGRATION_ID. The dimension table contains these two required columns
as well as the required columns ETL_PROC_WID and ROW_WID.
INTEGRATION_ID stores the primary key or the unique identifier of a record in the source table.
DATASOURCE_NUM_ID stores the data source from which the data is extracted. ETL_PROC_WID stores
the ID of the ETL process information.
ROW_WID is a sequence number generated during the ETL process, which is used as a unique identifier for
the Oracle Business Analytics Warehouse.
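A minimal sketch of the DDL for the two custom tables follows, showing only the required system columns plus one placeholder attribute. PARTNER_NAME is hypothetical; a real script would follow the full standard OBAW table structure:

```sql
CREATE TABLE WC_PARTNER_DS (
  DATASOURCE_NUM_ID  NUMBER(10)    NOT NULL,  -- source system identifier
  INTEGRATION_ID     VARCHAR2(80)  NOT NULL,  -- source primary key
  PARTNER_NAME       VARCHAR2(100)            -- placeholder attribute
);

CREATE TABLE WC_PARTNER_D (
  ROW_WID            NUMBER(10)    NOT NULL,  -- warehouse surrogate key
  DATASOURCE_NUM_ID  NUMBER(10)    NOT NULL,
  INTEGRATION_ID     VARCHAR2(80)  NOT NULL,
  ETL_PROC_WID       NUMBER(10)    NOT NULL,  -- ETL process identifier
  PARTNER_NAME       VARCHAR2(100)            -- placeholder attribute
);
```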

2. Import the Custom Tables into ODI.
Use reverse engineering to import the custom tables into ODI.
In the set of practices for the previous lesson, Building a Category 1 Customization, you manually created
columns in ODI and then generated DDL scripts to define the columns in the data warehouse. In this step, you
use a different technique. You manually defined the tables in the database in the previous step, and now you
import the table definitions into ODI using reverse engineering.
3. Move the Imported Tables to the Appropriate Submodels.
Imported tables are automatically placed in the Other submodel and must be moved into the appropriate
submodels.
When you use the reverse engineering technique, the imported tables are automatically placed in the Other
submodel and must be moved into the appropriate submodels. In the example in the slide, the
WC_PARTNER_DS dimension staging table is moved from the Other submodel to the Dimension Stage
submodel in the OBIA model.
Note: The specific submodel that a table belongs to drives the table maintenance behavior. For example, tables
in the Dimension Stage submodel are always truncated at each ETL run, while tables in the Dimension
submodel are truncated only during a full ETL run. Do not create a Custom submodel to place your
datastores, because table maintenance will not be implemented properly for tables in such a submodel.
4. Set Properties for the Datastores.
Open the new datastores in ODI Studio Designer and set the OLAP type.
Open the new datastores in the ODI Studio Designer editor and set the OLAP type. In the example in the slide,
the OLAP type is set to Dimension for the WC_PARTNER_DS datastore. Other OLAP types include Slowly
Changing Dimension and Fact Table.
5. Create an ODI Sequence for the Dimension.
Create an ODI sequence for the custom dimension to populate the ROW_WID column of the dimension.
In this example, name the sequence WC_PARTNER_D_SEQ. Generally, the Native sequence name
should match the ODI name unless this causes the name length to exceed 30 characters, in which case you can
shorten the name to meet this limit. This is the name of the database sequence used to populate the
ROW_WID column.
You also need to create the sequence in the data warehouse. You can do that manually by running SQL to create
the sequence in the database, or use Generate DDL in ODI to synchronize the ODI model with the data
warehouse.
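If you create the sequence manually, the warehouse-side statement can be as simple as the following sketch:

```sql
-- Native sequence backing the ODI sequence WC_PARTNER_D_SEQ;
-- the START WITH and INCREMENT BY values are illustrative.
CREATE SEQUENCE WC_PARTNER_D_SEQ START WITH 1 INCREMENT BY 1;
```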
6. Create Custom SDE and SIL Tasks.
Create custom SDE and SIL tasks to populate the dimension staging table and dimension table in the data
warehouse.
Create custom SDE and SIL tasks in the custom SDE and SIL adaptor folders to populate the new dimension
staging table and dimension table in the data warehouse. Creating custom SDE and SIL tasks includes creating
new task folders, interfaces, mappings, and packages. You can use the SDE_<Product Line
Code>_SampleDimension and SIL_SampleDimension tasks as templates. These sample tasks include the logic
required to populate the system columns.
In the example in the slide, the SDE_ORA_PartnerDimension task folder includes an
SDE_ORA_PartnerDimension package and an SDE_ORA_PartnerDimension.WC_PARTNER_DS interface.
These objects are used to extract data from the PARTNER source table and populate the
WC_PARTNER_DS dimension staging table in the data warehouse.
The SIL_PartnerDimension task folder includes an SIL_PartnerDimension package and an
SIL_PartnerDimension.WC_PARTNER_D interface. These objects are used to extract data from the
WC_PARTNER_DS dimension staging table in the data warehouse and load it into the
WC_PARTNER_D dimension table in the data warehouse.
7. Extend the Fact Staging Datastore.
Extend the fact staging datastore by adding an ID column.
The fact-related datastores and tasks must be extended to reflect the new dimension. In the example for this
lesson, both the W_GL_REVN_FS fact staging datastore and the W_GL_REVN_F fact datastore must be
extended. The OBIA model should already be versioned.
The example in the slide shows how to extend the W_GL_REVN_FS fact staging datastore by adding an
ID column that follows the naming convention X_<name>_ID with data type VARCHAR2(80). In this example,
the new column is X_PARTNER_ID. The next slide shows how to extend the W_GL_REVN_F fact datastore.
8. Extend the Fact Datastore.
Extend the fact datastore by adding a _WID column.
Extend the GL Revenue (W_GL_REVN_F) fact datastore by adding a _WID column that follows the naming
convention X_<name>_WID with data type NUMBER(10). In this example, the new column is
X_PARTNER_WID.
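The warehouse DDL corresponding to these two extensions (deployed later via Generate DDL; shown here only as an illustrative sketch):

```sql
ALTER TABLE W_GL_REVN_FS ADD (X_PARTNER_ID  VARCHAR2(80));  -- fact staging
ALTER TABLE W_GL_REVN_F  ADD (X_PARTNER_WID NUMBER(10));    -- fact
```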
9. Add a Foreign Key Constraint to the Fact Table.
Add a foreign key constraint to the fact table that refers to the custom dimension table created previously.
Add a foreign key constraint to the W_GL_REVN_F fact table that refers to the WC_PARTNER_D custom
dimension table created previously. The naming convention is FK_<Fact Table>_<Dimension Table>.
In this example, the new constraint is named FK_W_GL_REVN_F_WC_PARTNER_D. The foreign key
constraint ensures that the custom SIL task is included in the generated load plan. The custom SDE task is
included in the generated load plan because it populates the staging table that is used as a source for the custom
SIL task.
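As a database-level sketch, the constraint would look like the following; in practice you define it on the ODI datastore so that load-plan generation detects the dependency:

```sql
-- Assumes ROW_WID is the primary key of WC_PARTNER_D.
ALTER TABLE W_GL_REVN_F
  ADD CONSTRAINT FK_W_GL_REVN_F_WC_PARTNER_D
  FOREIGN KEY (X_PARTNER_WID)
  REFERENCES WC_PARTNER_D (ROW_WID);
```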
10. Add a Non-Unique Bitmap Index to the Fact Table.
Add a non-unique bitmap index on the X_PARTNER_WID column. The naming convention is <Fact
Table>_F<n>. In this example, the index is named W_GL_REVN_F_F99. Use the Description tab to enter the
name.
On the Columns subtab, add the X_PARTNER_WID column by using the shuttle button. On the Control subtab,
select the Defined in the Database and Active check boxes.
On the Flexfields subtab, deselect Default for OBI Index Type, and change the value from ETL to QUERY.
Confirm that the OBI Bitmap Index value is set to Y.
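The equivalent database DDL for the index defined on the datastore is a one-liner:

```sql
-- Bitmap index on the new foreign key column of the fact table.
CREATE BITMAP INDEX W_GL_REVN_F_F99
  ON W_GL_REVN_F (X_PARTNER_WID);
```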
11. Modify an SDE Task to Load the Fact Staging Table.
Modify SDE_ORA_GLRevenueFact to pass PARTNER.ROW_WID to W_GL_REVN_FS.X_PARTNER_ID.
In the example for this lesson, you modify a copy of the preconfigured SDE_ORA_GLRevenueFact task folder
to pass the ROW_WID value from the PARTNER dimension source table to the custom
X_PARTNER_ID column in the W_GL_REVN_FS fact staging table. It is assumed that the preconfigured
SDE_ORA_GLRevenueFact task folder has been versioned and copied to the custom folder in ODI Designer.
The first step is to add a lookup in the W_GL_REVN_FS_SQ_GL_REVENUE_EXTRACT interface to retrieve
the ROW_WID from the PARTNER source dimension table.
The next step is to create the mapping for the X_PARTNER_ID column in the
W_GL_REVN_FS_SQ_GL_REVENUE_EXTRACT interface. The mapping is to PARTNER.ROW_WID via the
lookup. Modify the mapping to convert the data type to VARCHAR2: TO_CHAR(LKP_PARTNER.ROW_WID).
The last step is to map the X_PARTNER_ID column from the SQ_GL_REVENUE_EXTRACT source to
X_PARTNER_ID in the W_GL_REVN_FS target datastore in the W_GL_REVN_FS interface.
12. Create a Custom SIL Task.
Modify the fact SIL task by adding logic to retrieve the ROW_WID value from the custom dimension.
Modify the preconfigured SIL_GLRevenueFact fact SIL task by adding logic to retrieve the ROW_WID value
from the WC_PARTNER_D custom dimension.
The first step is to add a new column to the SQ_W_GL_REVN_FS datastore in the
W_GL_REVN_F_SQ_W_GL_REVN_FS interface. The step for adding the column is not shown in the slide.
The next step is to add the WC_PARTNER_D dimension as a source to the interface and define the mapping on
the X_PARTNER_WID column: WC_PARTNER_D.ROW_WID.
Next, create a join on the fact table's ID column and the dimension table's INTEGRATION_ID column, and on
the fact and dimension DATASOURCE_NUM_ID columns.
Finally, create the mapping in the main interface. Modify the expression to include a function that defaults
NULL values to 0: COALESCE(SQ_W_GL_REVN_FS.X_PARTNER_WID,0).
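The resulting SIL logic is roughly equivalent to the following SQL; this is a conceptual sketch, not the code ODI actually generates:

```sql
-- Resolve the dimension surrogate key via the join;
-- unmatched rows default to 0 through COALESCE.
SELECT fs.*,
       COALESCE(d.ROW_WID, 0) AS X_PARTNER_WID
FROM   W_GL_REVN_FS fs
LEFT OUTER JOIN WC_PARTNER_D d
  ON  fs.X_PARTNER_ID      = d.INTEGRATION_ID
  AND fs.DATASOURCE_NUM_ID = d.DATASOURCE_NUM_ID;
```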
Complete the Remaining Steps
Generate DDL scripts to deploy the new ODI objects to the data warehouse.
Define and generate new load plans.
Generate scenarios for the custom tasks.
Update steps in the generated load plans to reference the custom scenarios.
Execute and monitor load plans to extract data from the source and load it into the data warehouse.
Query the data warehouse to confirm that tables are loaded with the expected data.

Quiz
Which of the following are not required system columns for a dimension staging table?
a. INTEGRATION_ID
b. DATASOURCE_NUM_ID
c. ETL_PROC_WID
d. ROW_WID
Answer: c, d

Quiz
Which of the following are not steps used to reverse engineer a table in ODI?
a. Manually create columns in ODI.
b. Generate DDL scripts to define the table in the data warehouse.
c. Manually define the table in the database.
d. Import the table definition into ODI by using reverse engineering.
Answer: a, b
Quiz
You should create a Custom submodel to place your custom datastores so that table maintenance will be
implemented properly.
a. True
b. False
Answer: b
Do not create a Custom submodel to place your datastores, because table maintenance will not be
implemented properly for tables in such a submodel.

Quiz
Which of the following statements does not apply to adding a foreign key constraint to the fact table?
a. Add a foreign key constraint to the fact table that refers to the custom dimension table.
b. The naming convention is FK_<Fact Table>_<Dimension Table>.
c. The foreign key constraint ensures that the custom SDE task is included in the generated load plan.
d. The foreign key constraint ensures that the custom SIL task is included in the generated load plan.
Answer: c
115) Optimizing Performance
Performance tuning is a balancing act involving hardware, the transactional schema, and the OBAW schema.
The graphic in the slide shows a performance triangle, which illustrates that performance tuning is a broad topic
that requires a balancing act involving hardware, the transactional schema, and the OBAW schema. All three
points of the performance triangle can be bottlenecks, and performance enhancement in one area can have an
impact on the performance of another area. This lesson provides high-level recommendations for improving
performance related to ETL and queries in OBIA. Real-world usage patterns and ETL scheduling may demand
further testing and tuning.
116) Common Performance Bottlenecks
Common performance bottlenecks involve ETL and queries.
This is a high-level overview of the three areas where bottlenecks occur. Bottlenecks can include transactional or
OBAW schema issues (for example, index usage) or hardware issues (for example, number of processors,
degree of parallelism, or I/O subsystems).
117) Performance Tuning Recommendations
Tuning underlying systems
Guidelines for Oracle Business Analytics Warehouse (OBAW) databases
Using a separate database for the OBAW
General guidelines for Oracle databases
Using Oracle template files
Configuring base and installed data warehouse languages
Minidimension tables
Aggregate tables
Creating indices
Prune Days
Oracle GoldenGate
Tuning Underlying Systems
Make sure that underlying systems are running optimally before you initiate performance tuning:
Multiprocessor systems
Memory
Disks
Network links
OS memory management
Native ODBC drivers
Database server configuration
Guidelines for Oracle Business Analytics Warehouse Databases
The following guidelines will help you set up the data warehouse physical database for performance and
growth:
Allocate around 50 to 70 percent of the total available server memory to the database, assuming no other
application is running on the same server.
At a minimum, separate the data and index tablespaces.
Create more tablespaces to separate heavily used tables and their indexes.
Oracle recommends using 8k block size for Oracle warehouses.
If you are using multiple disk storage systems, stripe the tablespace containers and files across as many disks
as possible.
Raw devices for tablespaces provide better performance as compared to cooked file systems.
RAID-5 is known to give a good balance of performance and availability.
Using a Separate Database for the Oracle Business Analytics Warehouse
Reasons for not putting the Oracle Business Analytics Warehouse in the same database as the transactional
database:
ETL is configured to maximize hardware resources. Therefore, the warehouse should not share any resources
with any other projects.
The analytical queries interfere with normal use of the transactional database, which is entering and managing
individual transactions.
The data in a transactional database is normalized for update efficiency; transactional queries join several
normalized tables and will be slow (as opposed to
pre-joined, de-normalized analytical tables).
Although it is technically possible to put the Oracle Business Analytics Warehouse in the same database as the
transactional database, it is not recommended for performance reasons. The transactional database is structured as
an online transaction processing (OLTP) database, whereas the Oracle Business Analytics Warehouse is
structured as an online analytical processing (OLAP) database, each optimized for its own purpose.
Historical data cannot be purged from a transactional database, even if not required for current transaction
processing, because you need it for analysis.
This causes the transactional database to further slow down.
In contrast, the analytical database is the warehouse for historical as well as current data.
Transactional databases are tuned for one specific application, and it is not productive to use these separate
transactional databases for analytical queries that usually span more than one functional application.
The analytical database can be specifically tuned for analytical queries and ETL processing. These are
different from transactional database requirements.
General Guidelines for Oracle Databases
Recommendations for optimizing performance for Oracle databases:
OBIA on Oracle databases supports only binary sorting. If you are running an Oracle client, do one of the
following to ensure adequate performance:
Set the NLS_SORT parameter to BINARY.
Choose an NLS_LANG setting that includes binary.
It is recommended that you gather workload system statistics.
To increase data throughput between Oracle BI Server and the Oracle database, change SDU and TDU
settings in listener.ora.
The default is 2 KB and can be increased to 8 KB.
On the server side, in the listener.ora file under the particular SID_LIST entry, modify SID_DESC as follows:
Set the number of log file groups to 4.
On the client side, in the tnsnames.ora file, modify the TNS alias by adding SDU= and TDU= as follows:
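As a hedged sketch, the server- and client-side entries might look like the following; the listener name, SID, service name, host, and port are placeholders, and 8192 reflects the 8 KB value mentioned above:

```
# listener.ora (server side) -- illustrative entry only
SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (SDU = 8192)
      (TDU = 8192)
      (SID_NAME = obaw)
    )
  )

# tnsnames.ora (client side) -- illustrative alias only
OBAW =
  (DESCRIPTION =
    (SDU = 8192)
    (TDU = 8192)
    (ADDRESS = (PROTOCOL = TCP)(HOST = dwhost)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = obaw))
  )
```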
Using Oracle Template Files
To configure the OBAW on Oracle databases more easily, refer to the parameter template file
init11gR2_template.ora or init11gR2_Exadata_template.ora, which are stored in
\<BI_Oracle_Home>\biapps\etl.
The parameter template file provides parameter guidelines based on the cost-based optimizer for Oracle
Database 11gR2. Use these guidelines as a starting point. You will need to make changes based on your
specific database sizes, data shape, server size (CPU and memory), and type of storage.
Copy the appropriate template file into your <ORACLE_HOME>/dbs directory. Then, review the
recommendations in the template file, and make the changes based on your specific database configuration. The
database administrator should make changes to the settings based on performance monitoring and tuning
considerations.
The NLS_LENGTH_SEMANTICS parameter enables you to define byte- or character-length semantics. OBIA
supports BYTE and CHAR values for this parameter. If you are using MLS characters, you can add this
parameter to the parameter template file (init<DB version>.ora) for your database version.
Configuring Base and Installed Data Warehouse Languages
You should install only the languages that you expect to use, because each installed language can significantly
increase the number of records stored in the data warehouse and can affect overall database performance.
After installing Oracle BI Applications, you use the OBIA Configuration Manager to configure which languages
you want to support in the Oracle Business Analytics Warehouse. You must configure one "base" language,
and you can also configure any number of "installed" languages. Typically, the base language specified for the
data warehouse should match the base language of the source system.
The installed languages that you specify for the data warehouse do not have to match the languages that are
installed in the source system. The data warehouse can have more, fewer, or completely different installed
languages compared to the source system.
Note that for languages that match between the transactional system and the data warehouse, the corresponding
record is extracted from the transactional system; languages that do not match will have a pseudo-translated
record generated.
Minidimension Tables
Are used to increase query performance
Include the most queried attributes of parent dimensions
Minidimension tables, which contain subsets of heavily queried attributes of parent dimensions, accrue benefits
by segregating these attributes in their own tables. They improve query performance because the database does
not need to join the fact tables with the big parent dimensions but can join these small tables with the fact tables
instead.
Aggregate Tables
Can dramatically improve query performance
Summarize detail-level facts at a higher level
Are identified with the suffix _A
Examples:
Daily to yearly sales
Sum the fact data by date or sales region
One of the main uses of a data warehouse is to sum up fact data with respect to a given dimension, for example,
by date or by sales region. Performing this summation on demand is resource-intensive, and slows down
response time. The Oracle Business Analytics Warehouse precalculates some of these sums and stores the
information in aggregate tables to speed up response time. In the Oracle Business Analytics Warehouse, the
aggregate tables are suffixed with _A.
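As an illustration of the idea only (not an actual OBAW object), an aggregate that rolls daily sales up to year and region might be built like this; all table and column names are hypothetical:

```sql
-- Hypothetical aggregate build: precompute revenue by year and region
-- so queries avoid scanning the detail-level fact table.
CREATE TABLE W_SALES_A AS
SELECT d.CAL_YEAR,
       f.SALES_REGION_WID,
       SUM(f.REVENUE_AMT) AS REVENUE_AMT
FROM   W_SALES_F f
JOIN   W_DAY_D d
  ON   f.DATE_WID = d.ROW_WID
GROUP BY d.CAL_YEAR, f.SALES_REGION_WID;
```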
Creating Indices
When customizing OBIA and the OBAW, define indices to improve query performance.
Create indices on all columns that the ETL uses for dimensions and facts, as in the following examples:
ROW_WIDs of dimensions and facts
INTEGRATION_ID
DATASOURCE_NUM_ID
Flags
Staging tables typically do not require indices.
Carefully consider on which columns to put filter conditions.
Inspect preconfigured objects for guidance.
Register indices in the appropriate ODI model.
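For example, indices on the custom dimension from this lesson might be defined as follows; the names and uniqueness choices are illustrative, and matching index objects should be registered in the ODI model:

```sql
-- Unique lookup index the ETL can use to match incoming source records.
CREATE UNIQUE INDEX WC_PARTNER_D_U1
  ON WC_PARTNER_D (INTEGRATION_ID, DATASOURCE_NUM_ID);

-- Index supporting joins on the surrogate key.
CREATE INDEX WC_PARTNER_D_M1
  ON WC_PARTNER_D (ROW_WID);
```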
Prune Days
Evaluate the performance effect of setting the Prune Days parameter:
Setting a small value for the Prune Days parameter means that the ETL will extract fewer records, thus
improving performance; however, this increases the chances that records are not detected.
The Prune Days parameter is used to extend the window of the ETL extract beyond the last time the ETL
actually ran and to ensure that records that may have somehow been missed in an earlier ETL process are
picked up in the next ETL.
Last Extract Date is a value that is calculated based on the last time data was extracted from that table, less the
Prune Days value. Records can be missed in an ETL process when a record is being updated while the ETL
process is running and is not committed until after the ETL completes.
You set the Prune Days parameter value in OBIA Configuration Manager. Setting a small value means the ETL
will extract fewer records, thus improving performance; however, this increases the chances that records are not
detected. Setting a large number is useful if ETL runs are infrequent, but this increases the number of records
that are extracted and updated in the data warehouse. Therefore, you should not set the Prune Days value to a
very large number. A large Prune Days number can also be used to trigger re-extracting records that were
previously processed but have not changed. The value for Prune Days should never be set to 0.
Oracle GoldenGate
Oracle GoldenGate is a data replication tool that you can use to create a replicated OLTP schema or schemas,
facilitate change data capture, and aid in ETL and transactional system performance.
Provides an OLTP-mirrored schema on the OBAW database instance using replication
This replicated source-dependent schema (SDS) is then used as a source during ETL.
Provides less network I/O during ETL because the source data for extract is in the same physical
instance/machine as the OBAW target tables
Shortens the length of time that ETL takes by minimizing impact to the OLTP database
Quiz
Which of the following is not a recommended guideline for improving performance in the data warehouse
physical database?
a. Allocate around 50 to 70 percent of the total available server memory to the database, assuming no other
application is running on the same server.
b. At a minimum, combine the data and index tablespaces.
c. Create more tablespaces to separate heavily used tables and their indexes.
d. Use 8k block size for Oracle warehouses.
Answer: b
Quiz
Which of the following are reasons for not putting the Oracle Business Analytics Warehouse in the same
database as the transactional database?
a. The analytical queries interfere with normal use of the transactional database.
b. The data in a transactional database is normalized for update efficiency.
c. Transactional databases are tuned for multiple applications.
d. The analytical database can be specifically tuned for the analytical queries and ETL processing.
Answer: a, b, d
Quiz
How does Oracle GoldenGate help to improve ETL and transactional system performance?
a. Provides an OLTP-mirrored schema on the OBAW database instance using replication
b. Provides less network I/O during ETL because the source data for extract is in the same physical
instance/machine as the OBAW target tables
c. Precalculates sums and stores the information in aggregate tables to speed up response time
d. Shortens the length of time that ETL takes by minimizing impact to the OLTP database
Answer: a, b, d
Quiz
Identify the incorrect statement about minidimension tables.
a. Contain subsets of heavily queried attributes of parent dimensions
b. Improve query performance because the database does not need to join the fact tables with the parent
dimensions
c. Include the most queried attributes of parent dimensions
d. Summarize detail-level facts at a higher level
Answer: d
118) Standardized Codes
Intuitive and nonintuitive codes must be standardized across source systems.
1. W_CODE_D stores all the standardized code and name combinations for all languages in the OBAW.
2. Code lookups in load mappings perform a separate extract for codes and insert codes and descriptions into
W_CODE_D.
Some source systems use intelligent codes that are intuitively descriptive, such as HD for hard
disks, while other systems use nonintuitive codes, such as numbers or other vague descriptors
(for example, 16 for hard disks). While codes are an important tool with which to analyze
information, the variety of codes and code descriptions in use poses a problem when performing
an analysis across source systems.
The lack of uniformity in source system codes must be resolved to integrate data for the OBAW.
There is a code lookup in load mappings that integrates both intelligent and nonintuitive codes
by performing a separate extract for codes, and inserting the codes and their description into the
W_CODE_D table. The codes table provides the load mapping with a resource from which it can
perform a lookup for code descriptions. The load mapping integrates multiple source system
codes by designating one source system instance as a master in a mapping. All other source
system codes are then mapped to the master. The screenshot shows a partial view of
W_CODE_D:
DATASOURCE_NUM_ID: Unique identifier of the source system from which data was
extracted
SOURCE_CODE: Concatenated value of the various source codes in the hierarchy
SOURCE_NAME_2: Description of the source code
MASTER_DATASOURCE_NUM_ID: Unique identifier of the master system
MASTER_VALUE: Identifies the description for the master code in the master source system
MASTER_CODE: Identifies the corresponding master code for the source code
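The lookup that the load mapping performs against W_CODE_D can be sketched as follows. This is a minimal, simplified in-memory model, not OBIA's actual ETL code: the sample rows, the CATEGORY column, and the lookup_master_code function are illustrative assumptions, using the HD/16 hard-disk example from the text (source 1 is the designated master source system).

```python
# Simplified in-memory model of W_CODE_D (illustrative rows only).
# Source 1 (the master) uses the intelligent code "HD"; source 2 uses
# the nonintuitive code "16". Both map to the master code "HD".
W_CODE_D = [
    {"datasource_num_id": 2, "category": "PRODUCT_TYPE", "source_code": "16",
     "source_name": "Hard Disk", "master_datasource_num_id": 1, "master_code": "HD"},
    {"datasource_num_id": 1, "category": "PRODUCT_TYPE", "source_code": "HD",
     "source_name": "Hard Disks", "master_datasource_num_id": 1, "master_code": "HD"},
]

def lookup_master_code(datasource_num_id, category, source_code):
    """Resolve a source-system code to its master code, as a load-mapping
    lookup against W_CODE_D would."""
    for row in W_CODE_D:
        if (row["datasource_num_id"] == datasource_num_id
                and row["category"] == category
                and row["source_code"] == source_code):
            return row["master_code"]
    return None  # unresolved codes are handled by the mapping's default logic

# The nonintuitive code "16" (source 2) and the intelligent code "HD"
# (master source 1) both resolve to the same master code:
print(lookup_master_code(2, "PRODUCT_TYPE", "16"))  # HD
print(lookup_master_code(1, "PRODUCT_TYPE", "HD"))  # HD
```

Because every source code row carries its master code, analyses that group by the master code aggregate consistently across source systems.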
Standardized Codes: Accounts Receivable Process Example
Key baseline analytic requirements:
1. Invoices
2. Schedules
3. Adjustments (Credit Memo/Debit Memo/Others)
4. Payments
5. Payment Applications
Further analysis supported:
1. Aging Buckets
2. Balances
3. Efficiency
4. Volume
5. Risks (Outstanding Balance against Credit Limit)