Friday 8 March 2013

DATA PUMP ARCHITECTURE AND STEPS TO PERFORM EXPORT/IMPORT USING DATA PUMP


INTRODUCTION
The Oracle Data Pump utility enables very fast bulk data and metadata movement between Oracle databases. It provides parallel Export and Import utilities (expdp and impdp) and a web-based Oracle Enterprise Manager interface. Data Pump Export and Import are typically much faster than the original Export and Import utilities: a single thread of Data Pump Export is about twice as fast as original Export, while Data Pump Import is 15-45 times faster than original Import. Data Pump jobs can be restarted without loss of data, whether the stoppage was voluntary or involuntary. Data Pump jobs support fine-grained object selection; virtually any type of object can be included in or excluded from a Data Pump job. Data Pump can also load one instance directly from another (network import) and unload a remote instance (network export).

DATA PUMP ARCHITECTURE
Oracle Data Pump is made up of three distinct parts:
              The command-line clients, expdp and impdp
              The DBMS_DATAPUMP PL/SQL package (also known as the Data Pump API)
              The DBMS_METADATA PL/SQL package (also known as the Metadata API) 
EXPDP – Client interface to the Data Pump Export utility. The interface closely resembles its predecessor, Oracle Export.
IMPDP – Client interface to the Data Pump Import utility. The interface closely resembles its predecessor, Oracle Import.
Grid Control Interface – A web-based interface that provides access to the DBMS_DATAPUMP PL/SQL package.
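Because expdp and impdp are just clients of the DBMS_DATAPUMP package, a job can also be driven directly from PL/SQL. Below is a minimal sketch of a schema export through the API; the directory TEST_DIR and schema TEST are the same assumptions used in the examples that follow, and the file names are made up for illustration.

DECLARE
  h NUMBER;
BEGIN
  -- Open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- Name the dump file and log file in the TEST_DIR directory object
  DBMS_DATAPUMP.ADD_FILE(h, 'TEST_API.dmp', 'TEST_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(h, 'TEST_API.log', 'TEST_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Restrict the job to the TEST schema
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''TEST'')');
  -- Start the job and detach; the job keeps running in the server
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/

Because the job runs inside the server, DETACH simply releases the client session; the export continues and can be re-attached later, exactly as with the command-line clients.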
STEPS:
Creating Directory Objects
In order to use Data Pump, the database administrator must create a directory object and grant privileges on that directory object to the user. If a directory object is not specified, a default directory object called DATA_PUMP_DIR is provided. The default DATA_PUMP_DIR is available only to privileged users unless access is granted by the DBA.
CONN / AS SYSDBA
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';

After a directory is created, we need to grant READ and WRITE permission on it to other users. For example, to allow the Oracle database to read and write files in the directory named by test_dir on behalf of user test, you must execute the following command:
GRANT READ, WRITE ON DIRECTORY test_dir TO test;
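To confirm that the directory object exists and points at the intended path, you can query DBA_DIRECTORIES:

SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'TEST_DIR';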
Once access to the directory is granted, we can export a user's objects with command arguments that are very similar to exp and imp:
Schema Exports/Imports
The OWNER parameter of exp has been replaced by the SCHEMAS parameter, which is used to specify the schemas to be exported. The following is an example of the schema export and import syntax.
expdp test/test@tsn schemas=TEST directory=TEST_DIR dumpfile=TEST.dmp logfile=expdpTEST.log
impdp test/test@tsn schemas=TEST directory=TEST_DIR dumpfile=TEST.dmp logfile=impdpTEST.log

Table Exports/Imports
The TABLES parameter is used to specify the tables that are to be exported. The following is an example of the table export and import syntax.

expdp test/test@tsn tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log
impdp test/test@tsn tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log

TABLE_EXISTS_ACTION – The original imp would allow rows to be appended to existing tables if IGNORE=Y was specified. The TABLE_EXISTS_ACTION parameter for Data Pump impdp provides four options (see the example after this list):
1. SKIP is the default: a table is skipped if it already exists.
2. APPEND will append rows if the target table’s geometry is compatible.
3. TRUNCATE will truncate the table, then load rows from the source if the geometries are compatible and truncation is possible. For example, it is not possible to truncate a table if it is the target of referential constraints.
4. REPLACE will drop the existing table, then create and load it from the source.
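For example, to add rows to the EMP and DEPT tables when they already exist in the target, reusing the dump file from the table export above:

impdp test/test@tsn tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log table_exists_action=APPEND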
INCLUDE and EXCLUDE
The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. When the INCLUDE parameter is used, only the objects it specifies are included in the export/import. When the EXCLUDE parameter is used, all objects except those it specifies are included. The two parameters are mutually exclusive, so use whichever parameter requires the fewest entries to give you the result you require. The basic syntax for both parameters is the same.
expdp test/test@tsn schemas=TEST include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=TEST.dmp logfile=expdpTEST.log
expdp test/test@tsn schemas=TEST exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=TEST.dmp logfile=expdpTEST.log
Other filters
INCLUDE=VIEW
INCLUDE=PACKAGE:"LIKE '%API'"
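Operating-system shells frequently mangle the quotes inside INCLUDE and EXCLUDE values, so it is often easier to place the filters in a parameter file. A minimal sketch, assuming a file named test_exp.par with the following contents:

schemas=TEST
include=TABLE:"IN ('EMP', 'DEPT')"
directory=TEST_DIR
dumpfile=TEST.dmp
logfile=expdpTEST.log

expdp test/test@tsn parfile=test_exp.par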

Network Exports/Imports (NETWORK_LINK)
The NETWORK_LINK parameter identifies a database link to be used as the source for a network export/import. The following database link will be used to demonstrate its use.
1) Create the database link
CONN test/test
CREATE DATABASE LINK remote_test CONNECT TO test IDENTIFIED BY test USING 'TEST';
2) Export data using the database link
expdp test/test@tsn tables=TEST.EMP network_link=REMOTE_TEST directory=TEST_DIR dumpfile=EMP.dmp logfile=expdpEMP.log
3) Import data using the database link
impdp test/test@tsn tables=TEST.EMP network_link=REMOTE_TEST directory=TEST_DIR logfile=impdpTEST.log remap_schema=TEST:TEST_NEW
With NETWORK_LINK, impdp imports the objects directly from the source database into the local server without writing a dump file, which is why the command above specifies no DUMPFILE parameter.
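Before running a network job, it is worth a quick sanity check that the link resolves and connects:

SELECT * FROM dual@remote_test;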
Attaching to and Stopping an Existing Job
The ATTACH command attaches the client session to an existing export job and automatically places you in the interactive-command interface. Export displays a description of the job to which you are attached and also displays the export prompt. A job name does not have to be specified if there is only one export job associated with your schema. The job you attach to can be either currently executing or stopped.
Attach/Detach job in case of Export:
A) You have started the export job:
expdp mm_stg/mm_stg dumpfile=DPUMP:export.dmp logfile=DPUMP:export.log schemas=mm_stg job_name=SYS_EXPORT_SCHEMA
B) If you want to pause the job for some time, press CTRL+C and type the command given below:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([yes]/no): yes
C) When you want to restart the job, you need to attach to it:
$ expdp mm_stg/mm_stg attach=SYS_EXPORT_SCHEMA
Export> START_JOB
Export> STATUS=600
Export> CONTINUE_CLIENT

Attach/Detach job in case of Import:
A) You have started the import job:
impdp system/sys dumpfile=DPUMP:export3.dmp logfile=DPUMP:export2.log job_name=OEM_NEW_USER_job1
B) If you want to pause the job for some time, press CTRL+C and type the command given below:
Import> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([yes]/no): yes
C) When you want to restart the job, attach to it by name (when ATTACH is specified, no other Data Pump parameters are needed):
impdp system/sys attach=OEM_NEW_USER_job1
Import> START_JOB
Import> STATUS=600
Import> CONTINUE_CLIENT
Job Monitor
Typing STATUS at this prompt allows you to monitor the current job:
Import> status

Job: OEM_NEW_USER_JOB1
  Operation: IMPORT
  Mode: FULL
  State: EXECUTING
  Bytes Processed: 535,134,568
  Percent Done: 99
  Current Parallelism: 1
  Job Error Count: 0
  Dump File: /home/oracle/dpump/export3.dmp

Worker 1 Status:
  State: EXECUTING
  Object Type: SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
  Total Objects: 70
  Worker Parallelism: 1
Processing object type SCHEMA_EXPORT/MATERIALIZED_VIEW
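If you have lost track of the job name needed for ATTACH, the DBA_DATAPUMP_JOBS view lists every Data Pump job known to the database, along with its current state:

SELECT owner_name, job_name, operation, job_mode, state
FROM dba_datapump_jobs;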
