Tuesday, August 17, 2021

Difference between exp/imp and expdp/impdp

 


Datapump                                                        | Traditional exp/imp
----------------------------------------------------------------|---------------------------------------------------------------------------
Operates on a group of files called a dump file set.            | Operates on a single dump file.
Accesses files on the server, using Oracle directory objects.   | Can access files on both the client and the server (no directory objects).
Represents database metadata as XML documents in the dump file. | Represents database metadata as DDL statements in the dump file.
Uses parallel execution for improved performance.               | Uses a single stream of execution.
Does not support sequential media such as tapes.                | Supports sequential media such as tapes.
Jobs can be stopped and restarted.                              | Jobs cannot be stopped and restarted.
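As a sketch of the server-side directory and parallel points above, the commands below create a directory object and run a parallel export. The directory name DP_DIR, the path /u01/dpump, and the scott schema are hypothetical examples, not values from this post:

```shell
# Hypothetical setup: create a directory object on the server and grant access.
# Run as a privileged user (e.g. SYSDBA).
sqlplus / as sysdba <<'EOF'
CREATE DIRECTORY dp_dir AS '/u01/dpump';
GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;
EOF

# Parallel export writes a dump file SET on the server;
# %U expands to 01, 02, ... so each worker gets its own file.
expdp scott/tiger directory=dp_dir dumpfile=scott_%U.dmp \
      logfile=scott_exp.log parallel=4
```

Note that, unlike exp, the dump files land in the server directory pointed to by DP_DIR, not on the client machine.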

 

Data Pump will recreate the user, whereas the old imp utility required the DBA to create the user ID before importing.

 


 

Data Pump is faster than conventional export/import:

·         Data Pump works in block mode; traditional exp works in byte mode.

·         Data Pump performs parallel execution.

·         Data Pump uses the direct path API.

·         Data Pump typically gives a 15–50% performance improvement over exp/imp.

 

 

 

 

Parameter name changes in Data Pump:

EXP/IMP Parameter        | EXPDP/IMPDP Parameter
-------------------------|----------------------
owner                    | schemas
file                     | dumpfile
log                      | logfile / nologfile
fromuser, touser (IMP)   | remap_schema (IMPDP)
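The parameter renames above can be seen side by side in equivalent commands. The schema names (scott, hr), file names, and directory object dp_dir here are hypothetical placeholders:

```shell
# Traditional export of one schema (file is written where the client runs):
exp system/manager owner=scott file=scott.exp log=scott.log

# Data Pump equivalent (file is written on the SERVER, via a directory object):
expdp system/manager schemas=scott directory=dp_dir \
      dumpfile=scott.dmp logfile=scott_exp.log

# Importing into a different schema: fromuser/touser becomes remap_schema.
imp system/manager fromuser=scott touser=hr file=scott.exp
impdp system/manager remap_schema=scott:hr directory=dp_dir dumpfile=scott.dmp
```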

 

Additional features in Data Pump:

1.  Job estimation can be done before the export runs.

2.  The SAMPLE parameter exports a sample percentage of the data.

3.  Failed export/import jobs can be restarted.

4.  The EXCLUDE/INCLUDE parameters allow fine-grained object selection.

5.  Data remapping can be done using the REMAP_DATA parameter.

6.  There is no need to specify a BUFFER size.

7.  A job's estimated completion time can be monitored from the v$session_longops view.

8.  The dump file can be compressed with the COMPRESSION parameter.

9.  Data encryption can be done in Data Pump.

10. Data Pump has interactive commands such as ADD_FILE, START_JOB, STOP_JOB, and KILL_JOB.

11. The REUSE_DUMPFILES parameter allows an existing dump file to be overwritten.
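The stop/restart and interactive features in the list above can be sketched as follows; the job name SCOTT_JOB and the other identifiers are hypothetical:

```shell
# Give the job an explicit name so it can be attached to later:
expdp scott/tiger schemas=scott directory=dp_dir \
      dumpfile=scott.dmp job_name=scott_job

# While the job runs, press Ctrl+C to drop into interactive mode, then:
#   Export> STOP_JOB=IMMEDIATE

# Later, re-attach to the stopped job and resume it:
expdp scott/tiger attach=scott_job
#   Export> START_JOB
```

Because Data Pump jobs run inside the server (tracked in a master table), the client session can disconnect and re-attach without losing the job.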
