When one talks about an Oracle E-Business Suite implementation, the following questions about data arise:
While discussing these questions, a few terms get used interchangeably, viz. Data Migration, Data Conversion, Interfaces, etc. All of them can be brought under one umbrella category called Data Movement.
This white paper discusses various strategies and technologies that can be used for data communication between a legacy system and Oracle E-Business Suite. It gives an overview of the various technical components involved in taking care of that most important word: "DATA".
This white paper tries to give direction in answering the above questions. To make legacy data available to the new system, one has to perform data entry manually or use tools like Data Loader or WinRunner, which simulate keyboard/mouse actions and make data entry faster. If the data volume is huge, a programmer's/developer's help is needed to build programs that automate the data loading. If the two systems are going to coexist for some time and communicate with each other, then programs which run at regular intervals to move the data are needed.
Defining Various Terminologies
Data Migration is a process of moving required (and most often very large) volumes of
data from our clients' existing systems to new systems. Existing systems can be anything
from custom-built IT infrastructures to spreadsheets and standalone databases.
Data conversion can be defined as a process of converting data from one structural form
to another to suit the requirements of the system to which it is migrated.
An interface is a set of programs for connecting two systems in order to synchronize their data. Interfaces can be manual, batch or real-time. They are used repeatedly and should therefore be designed and constructed in the most efficient manner possible.
Though Data Migration, Conversion and Interfaces are treated differently, technical folks should not differentiate between them. All such programs should be designed like interfaces, because in most implementations data needs to be moved multiple times, and for some time both the legacy system and Oracle Apps 11i happen to be active. Bottom line: if data needs to be moved programmatically, i.e. without using tools like Data Loader or manual data entry, then it is better to design the program so that it can be executed multiple times as a concurrent program.
Legacy System
There can be various types of legacy systems, based on databases like Oracle, DB2, etc., or on file systems. Generally it is better to use files extracted from the legacy system while developing the data conversion program, as the data volume is very high. DB links can be used for connecting to other databases while developing interfaces.
Oracle Apps Custom Staging Tables :- As a thumb rule, the structure of a custom staging table should be the same as the source data file or source legacy database table. The table structure should also contain columns like Interface_ID, Batch_ID, Process_Flag, Error_Message, the WHO columns and the Extended WHO columns.
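As an illustration, a staging table for an item conversion might be defined as follows (the table and column names here are hypothetical, chosen only to show the rule above):

```sql
CREATE TABLE xxinv_item_stg
( interface_id           NUMBER,
  batch_id               NUMBER,
  -- columns mirroring the source data file / legacy table
  item_number            VARCHAR2(40),
  item_description       VARCHAR2(240),
  primary_uom_code       VARCHAR2(3),
  organization_code      VARCHAR2(3),
  -- processing columns
  process_flag           NUMBER DEFAULT 1,   -- 1 = PENDING
  error_message          VARCHAR2(2000),
  -- WHO columns
  created_by             NUMBER,
  creation_date          DATE,
  last_updated_by        NUMBER,
  last_update_date       DATE,
  -- Extended WHO columns
  request_id             NUMBER,
  program_application_id NUMBER,
  program_id             NUMBER,
  program_update_date    DATE
);
```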
Shell Script with SQL*Loader :- A shell script (Host) type of concurrent program should be used to load data files into custom staging tables instead of the SQL*Loader executable type. The shell script internally uses the SQL*Loader command line utility to load the data and then archives the data files. With a shell script the data file path is not hard coded, and archiving and monitoring of bad files are easily achievable, which is not the case with the SQL*Loader executable type. As a rule there should not be any hard coding in shell scripts, e.g. directory paths, username, password, etc. Inbound data files can be placed in directories like $XXINV_TOP/datafiles/in.
The shell script should read these files, load them into the custom tables and archive the data files in $XXINV_TOP/datafiles/in/archive with a date timestamp appended to the file name. SQL*Loader log files and bad files can reside in the $XXINV_TOP/datafiles/in/log and $XXINV_TOP/datafiles/in/bad folders.
This shell script should also read the bad files generated by SQL*Loader and complete the concurrent program in Success/Warning/Error status. A sample shell script is given in the annexure.
This shell script has the following sections: extracting the Apps user id/password from the concurrent program arguments, initializing the file path variables, checking file existence, invoking SQL*Loader and archiving the data file.
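For reference, a minimal SQL*Loader control file for such a load might look like the following (the staging table and its columns are illustrative, not part of the original script):

```
LOAD DATA
APPEND
INTO TABLE xxinv_item_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( item_number,
  item_description,
  primary_uom_code,
  organization_code,
  process_flag     CONSTANT 1,   -- 1 = PENDING
  creation_date    SYSDATE,
  last_update_date SYSDATE
)
```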
Shell Script with Java based XML Parser :- A shell script (Host) type of concurrent program should be used to call the Java based/PLSQL XML parser. The directory structure, archiving and logging related rules remain the same as mentioned above. In this case we have to develop a custom Java or PLSQL based XML parser which can parse the input XML files and load them into the custom staging table. PLSQL based parsers use the UTL_FILE package to perform the file read operations. Building a Java based XML parser is a topic in itself, which can be considered out of scope for this white paper.
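As a minimal sketch of the UTL_FILE read loop such a PLSQL parser would use (the directory object XXINV_DATA_DIR and the file name are assumptions):

```sql
DECLARE
  l_file UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(4000);
BEGIN
  -- XXINV_DATA_DIR is a hypothetical database directory object
  -- pointing at $XXINV_TOP/datafiles/in
  l_file := UTL_FILE.FOPEN ('XXINV_DATA_DIR', 'Item_OnHand.xml', 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE (l_file, l_line);
      -- parse l_line here and insert into the custom staging table
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        EXIT;  -- end of file reached
    END;
  END LOOP;
  UTL_FILE.FCLOSE (l_file);
END;
```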
PLSQL Package for reading Legacy Outbound Tables :- This can be a PLSQL package which reads the legacy database tables using database links and writes to the inbound custom staging tables in the Oracle Apps 11i instance.
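A minimal sketch of such a read, assuming a database link named LEGACY_DB and hypothetical legacy and staging table names:

```sql
INSERT INTO xxinv_item_stg
       (item_number, item_description, process_flag, creation_date)
SELECT item_no, item_desc, 1 /* PENDING */, SYSDATE
FROM   legacy_items@LEGACY_DB
WHERE  last_update_date > TRUNC (SYSDATE) - 1;  -- pick up recent changes only
```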
Oracle Apps Custom Lookup Table :- This table should be used for storing the lookup values, i.e. mapping legacy system values to Oracle values. It generally stores information like Unit of Measure, etc., which can be different on the Oracle and legacy systems. The table structure should have columns like Lookup_Type, Language, Legacy_System, Legacy_System_Value, Oracle_Value, Effectivity_Date, End_Date and the WHO columns.
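Spelled out as DDL, such a lookup table might look like this (the table name is hypothetical; the columns follow the list above):

```sql
CREATE TABLE xxcust_lookup_values
( lookup_type         VARCHAR2(30),   -- e.g. 'UOM'
  language            VARCHAR2(4),
  legacy_system       VARCHAR2(30),
  legacy_system_value VARCHAR2(80),
  oracle_value        VARCHAR2(80),
  effectivity_date    DATE,
  end_date            DATE,
  -- WHO columns
  created_by          NUMBER,
  creation_date       DATE,
  last_updated_by     NUMBER,
  last_update_date    DATE
);
```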
PLSQL Package for Data Validation and Mapping :- This PLSQL package should pick up pending records, validate them, map legacy values to Oracle values and record the outcome of each record in the Process_Flag column, using values such as:
1 - PENDING
2 - PICKED FOR VALIDATION
3 - ERROR
5 - VALIDATION SUCCESS
7 - PROCESSED/SUCCESS
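The Process_Flag transitions inside such a package can be sketched in SQL (the staging and lookup table names are hypothetical):

```sql
-- pick up pending records for validation
UPDATE xxinv_item_stg
SET    process_flag = 2              -- PICKED FOR VALIDATION
WHERE  process_flag = 1;             -- PENDING

-- example validation: the legacy unit of measure must map to an Oracle value
UPDATE xxinv_item_stg stg
SET    process_flag  = 3,            -- ERROR
       error_message = 'No UOM mapping found for ' || stg.primary_uom_code
WHERE  process_flag = 2
AND    NOT EXISTS
       (SELECT 1
        FROM   xxcust_lookup_values lkp
        WHERE  lkp.lookup_type         = 'UOM'
        AND    lkp.legacy_system_value = stg.primary_uom_code);

-- everything that survived all validations
UPDATE xxinv_item_stg
SET    process_flag = 5              -- VALIDATION SUCCESS
WHERE  process_flag = 2;
```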
PLSQL Package to write data into Oracle Open Interface Tables :- This PLSQL package should read all valid records (Process_Flag = 5), insert the data into the Oracle seeded interface tables and mark Process_Flag = 7.
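For an item conversion, for example, such a package might move validated rows into MTL_SYSTEM_ITEMS_INTERFACE. A sketch (column list trimmed; the staging table and its mapping are assumptions):

```sql
INSERT INTO mtl_system_items_interface
       (organization_id, segment1, description,
        primary_uom_code, process_flag, transaction_type, set_process_id)
SELECT organization_id, item_number, item_description,
       primary_uom_code, 1, 'CREATE', batch_id
FROM   xxinv_item_stg
WHERE  process_flag = 5;             -- VALIDATION SUCCESS

UPDATE xxinv_item_stg
SET    process_flag = 7              -- PROCESSED/SUCCESS
WHERE  process_flag = 5;
```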
Oracle Open Interface Tables :- These are the inbound interface tables provided by Oracle for various base tables in Oracle Applications.
Oracle Open Interface Concurrent Programs :- These are Oracle supplied import programs which read the Oracle open interface tables, validate the data and load the Oracle Apps base tables.
Oracle Supplied Public APIs :- Generally for online processing (when batch processing is not acceptable), public APIs are used to load the Oracle base tables. In this form of interface, open interface tables are not used. The public PLSQL APIs accept PLSQL table type input parameters and perform the validation and updates of the Oracle base tables.
Lookups and Profiles :- Custom profile options can be used for the following purposes:
Custom Lookup Table :- In case some mapping between the Oracle and legacy systems is needed, a custom lookup table can be designed along with a custom form which helps data entry for the mappings.
Custom Form to update Custom Staging Tables :- This form is used for performing UPDATE/DELETE operations on the custom staging tables. It is used to correct any data errors in the staging tables and re-submit the data for processing, which helps avoid manual correction of data files. This form can be a simple data entry screen which selects the custom table from a list of values, displays the data in tabular format on a stacked canvas and allows data updates.
Alert based Monitoring :- This can be developed using a custom table with columns which are generally required for an email, e.g. SUBJECT, TO_EMAIL, EMAIL_MESSAGE, PROCESS_STATUS, INSTANCE and the WHO columns. This custom table can have an event alert (on INSERT) which reads new records, sends out an email and updates the PROCESS_STATUS column value for the current record to 'SENT'.
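Spelled out as DDL, such an alert table might look like this (the table name is hypothetical; the columns follow the list above):

```sql
CREATE TABLE xxcust_email_alerts
( subject          VARCHAR2(240),
  to_email         VARCHAR2(240),
  email_message    VARCHAR2(4000),
  process_status   VARCHAR2(30),   -- e.g. 'NEW' -> 'SENT'
  instance         VARCHAR2(30),   -- which environment raised the alert
  -- WHO columns
  created_by       NUMBER,
  creation_date    DATE,
  last_updated_by  NUMBER,
  last_update_date DATE
);
```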
Error Reporting using UNIX Emails :- To build this functionality, additional effort is needed to populate an error table giving more details of the problem, build simple reports on it, submit these report concurrent programs at the end of the data conversion routine/interface, and email the output file to the respective users.
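A minimal sketch of the emailing step in shell. The report file name and recipient below are assumptions, and `mailx` availability depends on the host; the function only builds and prints the command so the sketch stays testable without a configured mailer:

```shell
#!/bin/sh
# Build the mailx command that would email a concurrent request
# output file to a user. Printing instead of executing keeps this
# runnable on hosts without a mail transfer agent.
build_mail_cmd()
{
  REPORT_FILE=$1   # e.g. an output file such as /tmp/o1234567.out
  TO_EMAIL=$2
  SUBJECT=$3
  echo "mailx -s \"$SUBJECT\" $TO_EMAIL < $REPORT_FILE"
}

build_mail_cmd "/tmp/o1234567.out" "dba@example.com" "Item Conversion Errors"
```

In a real interface this command would run at the end of the conversion routine, after the error report concurrent program has produced its output file.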
Summary
To summarize the data conversion and interface development strategies, one should always have components like custom staging tables, shell scripts, SQL*Loader control files, validation programs and monitoring in the design. There can be shortcuts in data conversion, like using the SQL*Loader executable type directly in concurrent programs instead of shell scripts, but these should definitely be avoided because such shortcuts involve a lot of hard coding, and maintenance and reusability of components cannot be achieved.
Monitoring and error reporting are generally ignored and considered overhead, but they help in the long run with debugging and maintenance.
# XXINV2402_ITEM_OnHand
#author :
# Date Written. 09/20/2007
# Purpose. Loading data in Staging table for Master Item
#######################################################################
######## get user id #######
#######################################################################
for i in $*
do
case $i in FCP_LOGIN=*)
#echo $i
UID_PWD=`echo $i | sed 's/FCP_LOGIN=//g'`
UID_PWD=`echo $UID_PWD | sed 's/"//g'`;;
esac
done
#######################################################################
####### Initialize the variables #######
#######################################################################
TODAY_DAT=`date +%Y%m%d%H%M`
CTL_FILE=$XXINV_TOP/bin/XXINV2402_01.ctl
DATA_FILE=$XXINV_TOP/datafiles/in/Item_OnHand.csv
BAK_FILE=$XXINV_TOP/datafiles/in/archive/Item_OnHand$TODAY_DAT.bak
LOG_FILE=$XXINV_TOP/datafiles/in/log/Item_OnHand$TODAY_DAT.log
BAD_FILE=$XXINV_TOP/datafiles/in/bad/Item_OnHand$TODAY_DAT.bad
#######################################################################
####### Check File Existence #######
#######################################################################
fileChk()
{
# if the file does not exist then exit the program
if [ -f "$1" ]
then
echo "$1 File Exists"
else
echo "$1 File Does Not Exist"
exit 1
fi
}
fileChk $DATA_FILE
fileChk $CTL_FILE
#######################################################################
####### Load data using the SQL*Loader command line utility #######
#######################################################################
sqlldr userid=$UID_PWD control=$CTL_FILE data=$DATA_FILE log=$LOG_FILE bad=$BAD_FILE
retcode=$?
echo "SQL*Loader exit code: "$retcode
if [ $retcode -eq 0 ]
then
# take a backup of the data file to the archive folder if successfully loaded
mv $DATA_FILE $BAK_FILE
fi
if [ -f $BAD_FILE ]
then
echo 'Please Check Bad file '$BAD_FILE
fi
# SQL*Loader exit codes: 0 = success, 1 = fail, 2 = warning, 3 = fatal.
# Exit with the same code so the concurrent program completes in
# Success/Warning/Error status accordingly.
exit $retcode
Comments (3)
Excellent Article
written by Dinesh Chauhan , May 20, 2008
Very good article indeed. While creating a flat file, or asking a third party to provide one, what do you think is the best approach: a comma/pipe delimited file, or one without any delimiter?
Excellent Article
written by R Mohanty , August 31, 2009
Excellent Article
Excellent article and Good Source of Data Migration
written by Boyapatisireesha , November 18, 2009
I am new to shell scripting, but the example of a shell script in the above article has given me more confidence to learn scripting. Could you please add the commands used to execute the script in Unix?