
These tools are used to transfer data from one Oracle database to another Oracle database. You use the Export tool to export data from the source database and the Import tool to load data into the target database. When you export tables from the source database, the Export tool extracts the tables and puts them into a dump file. This dump file is transferred to the target database, where the Import tool copies the data from the dump file into the target database.
From version 10g onwards, Oracle recommends using the Data Pump Export and Import tools, which are enhanced versions of the original Export and Import tools.
The export dump file contains objects in the following order:
1. Type definitions
2. Table definitions
3. Table data
4. Table indexes
5. Integrity constraints, views, procedures, and triggers
6. Bitmap, function-based, and domain indexes

When you import the tables, the Import tool performs its actions in the following order: new tables are created, data is imported and indexes are built, triggers are imported, integrity constraints are enabled on the new tables, and any bitmap, function-based, and/or domain indexes are built. This sequence prevents data from being rejected because of the order in which tables are imported. It also prevents redundant triggers from firing twice on the same data.
Invoking Export and Import

You can run the Export and Import tools in two modes:

Command Line Mode
Interactive Mode

When you just type exp or imp at the O/S prompt, the tool runs in interactive mode, i.e. it prompts you for all the necessary input. If you supply command line arguments when calling exp or imp, it runs in command line mode, as the sketch below shows.
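For example (a minimal sketch; the exact prompts shown by exp vary slightly between versions, and the file and table names are only illustrations):

$exp
(Export prompts for the username/password, buffer size, export file name, and so on)

$exp scott/tiger FILE=emp.dmp TABLES=(emp)
(everything is supplied as command line arguments, so no prompts appear)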
Command Line Parameters of Export tool
You can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:

Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)


Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
or TABLES=(T1:P1,T1:P2), if T1 is a partitioned table

Keyword                Description (Default)
--------------------------------------------------------------------------
USERID                 username/password
BUFFER                 size of data buffer
FILE                   output files (EXPDAT.DMP)
COMPRESS               import into one extent (Y)
GRANTS                 export grants (Y)
INDEXES                export indexes (Y)
DIRECT                 direct path (N)
LOG                    log file of screen output
ROWS                   export data rows (Y)
CONSISTENT             cross-table consistency (N)
FULL                   export entire file (N)
OWNER                  list of owner usernames
TABLES                 list of table names
RECORDLENGTH           length of IO record
INCTYPE                incremental export type
RECORD                 track incr. export (Y)
TRIGGERS               export triggers (Y)
STATISTICS             analyze objects (ESTIMATE)
PARFILE                parameter filename
CONSTRAINTS            export constraints (Y)
OBJECT_CONSISTENT      transaction set to read only during object export (N)
FEEDBACK               display progress every x rows (0)
FILESIZE               maximum size of each dump file
FLASHBACK_SCN          SCN used to set session snapshot back to
FLASHBACK_TIME         time used to get the SCN closest to the specified time
QUERY                  select clause used to export a subset of a table
RESUMABLE              suspend when a space related error is encountered (N)
RESUMABLE_NAME         text string used to identify resumable statement
RESUMABLE_TIMEOUT      wait time for RESUMABLE
TTS_FULL_CHECK         perform full or partial dependency check for TTS
TABLESPACES            list of tablespaces to export
TRANSPORT_TABLESPACE   export transportable tablespace metadata (N)
TEMPLATE               template name which invokes iAS mode export

The Export and Import tools support four modes of operation:

FULL       : Exports all the objects in all schemas
OWNER      : Exports objects belonging only to the given OWNER
TABLES     : Exports individual tables
TABLESPACE : Exports all objects located in a given TABLESPACE

Example of Exporting Full Database


The following example shows how to export the full database:
$exp USERID=scott/tiger FULL=y FILE=myfull.dmp
In the above command, the FILE option specifies the name of the dump file, the FULL option specifies that you want to export the full database, and the USERID option specifies the user account used to connect to the database. Note that to perform a full export the user must have the DBA role or the EXP_FULL_DATABASE role.
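The same arguments can also be kept in a parameter file and passed with the PARFILE keyword from the table above; a minimal sketch follows, in which the file name full_exp.par is only an illustration:

$cat full_exp.par
USERID=scott/tiger
FULL=y
FILE=myfull.dmp
LOG=myfull.log

$exp PARFILE=full_exp.par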
Example of Exporting Schemas
To export objects stored in particular schemas you can run the Export utility with the following arguments:
$exp USERID=scott/tiger OWNER=(SCOTT,ALI) FILE=exp_own.dmp
The above command will export all the objects stored in the SCOTT and ALI schemas.
Exporting Individual Tables
To export individual tables, give the following command:
$exp USERID=scott/tiger TABLES=(scott.emp,scott.sales) FILE=exp_tab.dmp
This will export SCOTT's emp and sales tables.
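The TABLESPACE mode listed earlier is invoked with the TABLESPACES parameter from the keyword table; a minimal sketch (the tablespace name USERS and the dump file name are only illustrations, and using TABLESPACES generally requires the EXP_FULL_DATABASE role):
$exp USERID=system/manager TABLESPACES=(users) FILE=exp_tbs.dmp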
Exporting a Consistent Image of the Tables
If you include the CONSISTENT=Y option in the export command, the Export utility will export a consistent image of the tables, i.e. changes made to the tables during the export operation will not be exported.
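For example, re-running the table export shown above as a read-consistent snapshot would look like this (a sketch based on the earlier command):
$exp USERID=scott/tiger TABLES=(scott.emp,scott.sales) CONSISTENT=y FILE=exp_tab.dmp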
Using Import Utility

Objects exported by the Export utility can only be imported by the Import utility. The Import utility can run in interactive mode or command line mode.

You can let Import prompt you for parameters by entering the IMP command
followed by your username/password:
Example: IMP SCOTT/TIGER
Or, you can control how Import runs by entering the IMP command followed
by various arguments. To specify parameters, you use keywords:
Format:  IMP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: IMP SCOTT/TIGER IGNORE=Y TABLES=(EMP,DEPT) FULL=N
         or TABLES=(T1:P1,T1:P2), if T1 is a partitioned table
USERID must be the first parameter on the command line.
Keyword                 Description (Default)
--------------------------------------------------------------------------
USERID                  username/password
BUFFER                  size of data buffer
FILE                    input files (EXPDAT.DMP)
SHOW                    just list file contents (N)
IGNORE                  ignore create errors (N)
GRANTS                  import grants (Y)
INDEXES                 import indexes (Y)
ROWS                    import data rows (Y)
LOG                     log file of screen output
FULL                    import entire file (N)
FROMUSER                list of owner usernames
TOUSER                  list of usernames
TABLES                  list of table names
RECORDLENGTH            length of IO record
INCTYPE                 incremental import type
COMMIT                  commit array insert (N)
PARFILE                 parameter filename
CONSTRAINTS             import constraints (Y)
DESTROY                 overwrite tablespace data file (N)
INDEXFILE               write table/index info to specified file
SKIP_UNUSABLE_INDEXES   skip maintenance of unusable indexes (N)
FEEDBACK                display progress every x rows (0)
TOID_NOVALIDATE         skip validation of specified type ids
FILESIZE                maximum size of each dump file
STATISTICS              import precomputed statistics (always)
RESUMABLE               suspend when a space related error is encountered (N)
RESUMABLE_NAME          text string used to identify resumable statement
RESUMABLE_TIMEOUT       wait time for RESUMABLE
COMPILE                 compile procedures, packages, and functions (Y)
STREAMS_CONFIGURATION   import streams general metadata (Y)
STREAMS_INSTANTIATION   import streams instantiation metadata (N)

Example Importing Individual Tables


To import individual tables from a full database export dump file, give the following command:
$imp scott/tiger FILE=myfullexp.dmp FROMUSER=scott TABLES=(emp,dept)
This command will import only the emp and dept tables into the SCOTT user, and you will get output similar to that shown below:
Export file created by EXPORT:V10.00.00 via conventional path
import done in WE8DEC character set and AL16UTF16 NCHAR character set
. importing SCOTT's objects into SCOTT
. . importing table                        "DEPT"          4 rows imported
. . importing table                         "EMP"         14 rows imported
Import terminated successfully without warnings.

Example: Importing Tables of One User Account into Another User Account
For example, suppose ALI has exported tables into a dump file mytables.dmp and now SCOTT wants to import these tables. To achieve this, SCOTT will give the following import command:
$imp scott/tiger FILE=mytables.dmp FROMUSER=ali TOUSER=scott
The Import utility will then give a warning that the tables in the dump file were exported by user ALI and not by you, and will then proceed.

Example Importing Tables Using Pattern Matching


Suppose you want to import all tables from a dump file whose names match a particular pattern. To do so, use the % wildcard character in the TABLES option. For example, the following command will import all tables whose names start with the letter "a" and all tables whose names contain the letter "d":
$imp scott/tiger FILE=myfullexp.dmp FROMUSER=scott TABLES=(a%,%d%)
Migrating a Database across platforms.

The Export and Import utilities are the only method that Oracle supports for moving
an existing Oracle database from one hardware platform to another. This includes
moving between UNIX and NT systems and also moving between two NT systems
running on different platforms.
The following steps present a general overview of how to move a database between
platforms.
1. As a DBA user, issue the following SQL query to get the exact name of all tablespaces. You will
need this information later in the process.

SQL> SELECT tablespace_name FROM dba_tablespaces;


2. As a DBA user, perform a full export from the source database, for example:

> exp system/manager FULL=y FILE=myfullexp.dmp


3. Move the dump file to the target database server. If you use FTP, be sure to copy it in binary
format (by entering binary at the FTP prompt) to avoid file corruption.
4. Create a database on the target server.
5. Before importing the dump file, you must first create your tablespaces, using the information obtained in Step 1 (a sketch of a CREATE TABLESPACE statement appears after these steps). Otherwise, the import will create the corresponding datafiles in the same file structure as on the source database, which may not be compatible with the file structure of the target system.
6. As a DBA user, perform a full import with the IGNORE parameter enabled:

> imp system/manager FULL=y IGNORE=y FILE=myfullexp.dmp

Using IGNORE=y instructs Oracle to ignore any creation errors during the import and permit the import
to complete.

7. Perform a full backup of your new database.
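The tablespace creation mentioned in Step 5 is done with CREATE TABLESPACE statements; a minimal sketch follows, in which the tablespace name and datafile path are only placeholders and must match the layout of your target system:

SQL> CREATE TABLESPACE users DATAFILE '/u02/oracle/targetdb/users01.dbf' SIZE 500M AUTOEXTEND ON;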

Using Data Pump Export Utility

To use Data Pump, the DBA has to create a directory on the server machine and create a directory object in the database that maps to the directory created in the file system.
The following example creates a directory in the filesystem, creates a directory object in the database, and grants privileges on the directory object to the SCOTT user.
$mkdir my_dump_dir
$sqlplus
Enter User:/ as sysdba
SQL> create directory data_pump_dir as '/u01/oracle/my_dump_dir';
Now grant access on this directory object to the SCOTT user:
SQL> grant read,write on directory data_pump_dir to scott;
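If you want to confirm that the directory object exists and points to the intended path, you can query the data dictionary as a DBA user (a quick check, not a required step):
SQL> select directory_name, directory_path from dba_directories;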
Example of Exporting a Full Database
To export the full database, give the following command:
$expdp scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob
The above command will export the full database and create the dump file full.dmp in the server directory /u01/oracle/my_dump_dir.
In some cases, where the database is in terabytes, the above command is not feasible, since the dump file size would be larger than the operating system file size limit and the export would fail. In this situation you can create multiple dump files by typing the following command:
$expdp scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp FILESIZE=5G LOGFILE=myfullexp.log JOB_NAME=myfullJob
This will create multiple dump files named full01.dmp, full02.dmp, full03.dmp, and so on. The FILESIZE parameter specifies the maximum size of each dump file.
Example of Exporting a Schema
To export all the objects of SCOTT's schema you can run the following Data Pump export command:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp SCHEMAS=SCOTT
You can omit SCHEMAS, since the default mode of Data Pump export is schema mode, as the sketch below shows.
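For instance, the following sketch exports SCOTT's own schema even though no SCHEMAS parameter is given (the dump file name is only an illustration):
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_default.dmp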
If you want to export objects of multiple schemas you can specify the following command:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp SCHEMAS=SCOTT,HR,ALI
Exporting Individual Tables using Data Pump Export
You can use the Data Pump Export utility to export individual tables. The following example shows the syntax to export tables:
$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tables.dmp TABLES=employees,jobs,departments

Exporting Tables Located in a Tablespace
If you want to export tables located in a particular tablespace you can type the following command:
$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tbs.dmp TABLESPACES=tbs_4,tbs_5,tbs_6
The above will export all the objects located in tbs_4, tbs_5 and tbs_6.
Excluding and Including Objects during Export
You can exclude objects while performing an export by using the EXCLUDE option of the Data Pump utility. For example, if you are exporting a schema and don't want to export tables whose names start with A, you can type the following command (depending on your shell, the quotation marks may need to be escaped, or the option placed in a parameter file):
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp SCHEMAS=SCOTT EXCLUDE=TABLE:"LIKE 'A%'"
Then all tables in SCOTT's schema whose names start with A will not be exported.
Similarly, you can use the INCLUDE option to export only certain objects, like this:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp SCHEMAS=SCOTT INCLUDE=TABLE:"LIKE 'A%'"
This is the opposite of the EXCLUDE option, i.e. it will export only those tables of SCOTT's schema whose names start with A.
Similarly, you can also exclude INDEXES, CONSTRAINTS, GRANTS, USER, SCHEMA, as the sketch below shows.
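For instance, the following sketch (the dump file name is only an illustration) exports SCOTT's schema while skipping indexes and grants; the EXCLUDE option can be repeated for each object type:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_noidx.dmp SCHEMAS=SCOTT EXCLUDE=INDEX EXCLUDE=GRANT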
Using Query to Filter Rows during Export
You can use the QUERY option to export only the required rows. For example, the following will export only those rows of the emp table whose sal is above 10000 and whose dept_id is greater than 10:
expdp hr/hr QUERY=emp:'"WHERE dept_id > 10 AND sal > 10000"' NOLOGFILE=y DIRECTORY=dpump_dir1 DUMPFILE=exp1.dmp
Suspending and Resuming Export Jobs (Attaching and Re-Attaching to Jobs)
With Data Pump Export you can suspend running export jobs and later resume them, or kill them. You can start a job on one client machine and then, because of some other work, suspend it. Afterwards, when that work is finished, you can continue the job from the same client where you stopped it, or you can restart the job from another client machine.
For example, suppose a DBA starts a full database export by typing the following command at one client machine, CLNT1:
$expdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob
After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He then gets the Export> prompt, where he can type interactive commands.
Now he wants to stop this export job, so he types the following command:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y

The job is placed in a stopped state and the client exits.
After finishing his other work, the DBA wants to resume the export job, but the client machine from which he actually started the job is locked because the user has locked his/her cabin. So the DBA goes to another client machine and re-attaches to the job by typing the following command:
$expdp scott/tiger@mydb ATTACH=myfulljob
After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the myfulljob job:
Export> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
Note: after re-attaching to the job, a DBA can also kill the job by typing KILL_JOB if he doesn't want to continue with the export job.
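For example, killing the job from the same interactive prompt is a single command; note that, unlike a job that was stopped with STOP_JOB, a killed job cannot be restarted later:
Export> KILL_JOB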
