
10g Data Pump

Anup

We use Data Pump for faster export and import operations in Oracle 10g.
Data Pump export and import operations are performed by multiple server processes (jobs). Each job uses a master table, created within the user's schema, to control the requested operation, record its current status, and maintain restartability. The table is dropped on completion of the Data Pump job.

Note:

  • We can export/import at the table, schema, and database level.

  • All Data Pump ".dmp" and ".log" files are created on the server, not on the client machine.

  • Performance can be improved by using the PARALLEL parameter.

  • We can use "%U" in the DUMPFILE parameter to allow multiple dump files to be created.

  • We can use the TABLE_EXISTS_ACTION=APPEND parameter to import data into existing tables.

  • The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects.

  • DBA_DATAPUMP_JOBS view can be used to monitor the current jobs.
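Several of the notes above can be combined in a single export command. A minimal sketch, assuming a directory object named DPUMP_DIR and a four-way parallel export (all names are placeholders):

```shell
# Parallel schema export; %U expands to 01, 02, ... so each worker
# process can write to its own dump file. All names are placeholders.
expdp scott/tiger schemas=SCOTT directory=DPUMP_DIR \
      dumpfile=scott_%U.dmp logfile=scott_exp.log parallel=4
```

Matching PARALLEL with a %U dump file template avoids the workers contending for a single file.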

How to Set Up

Log in as SYSDBA and grant the following privileges to the user (replace <username> and <dir_name> with your own values):

grant create any directory to <username>;
create or replace directory <dir_name> as '/oracle/export';
grant read, write on directory <dir_name> to <username>;
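Oracle does not check the filesystem path when the directory object is created, so it is worth confirming that the object exists and points where you expect. A sketch, assuming SYSDBA access on the server:

```shell
# List all directory objects and their filesystem paths;
# the path must exist on the server and be readable/writable
# by the oracle OS user.
echo "select directory_name, directory_path from dba_directories;" | \
    sqlplus -s / as sysdba
```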

How to Export/Import Table

expdp user/password tables=EMP,DEPT directory=<dir_name> dumpfile=<dumpfile>.dmp logfile=<logfile>.log

impdp user/password tables=EMP,DEPT directory=<dir_name> dumpfile=<dumpfile>.dmp logfile=<logfile>.log
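As noted earlier, TABLE_EXISTS_ACTION controls what happens when an imported table already exists in the target schema (the default is SKIP; other values are APPEND, TRUNCATE, and REPLACE). A sketch appending rows to existing EMP and DEPT tables, with placeholder directory and file names:

```shell
# Append imported rows to tables that already exist;
# directory object and file names are placeholders.
impdp user/password tables=EMP,DEPT directory=DPUMP_DIR \
      dumpfile=emp_dept.dmp logfile=emp_dept_imp.log \
      table_exists_action=append
```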

How to Export/Import Schema

expdp user/password schemas=<schema_name> directory=<dir_name> dumpfile=<dumpfile>.dmp logfile=<logfile>.log

impdp user/password schemas=<schema_name> directory=<dir_name> dumpfile=<dumpfile>.dmp logfile=<logfile>.log
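A schema export is often imported into a different schema; the REMAP_SCHEMA parameter handles this. A sketch assuming a source schema SCOTT and a target schema SCOTT_DEV (both placeholders):

```shell
# Import SCOTT's objects into SCOTT_DEV instead;
# schema names and file names are placeholders.
impdp system/password schemas=SCOTT directory=DPUMP_DIR \
      dumpfile=scott.dmp logfile=scott_imp.log \
      remap_schema=SCOTT:SCOTT_DEV
```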

How to Export/import Database

expdp system/password full=Y directory=<dir_name> dumpfile=<dumpfile>.dmp logfile=<logfile>.log

impdp system/password full=Y directory=<dir_name> dumpfile=<dumpfile>.dmp logfile=<logfile>.log
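The INCLUDE/EXCLUDE note above applies to full exports as well. A sketch skipping one schema from a full export; the quoting usually needs escaping on the OS command line, so a parameter file is often easier:

```shell
# Full export excluding the SCOTT schema (schema name is a placeholder).
# The backslashes protect the quotes from the Unix shell.
expdp system/password full=Y directory=DPUMP_DIR \
      dumpfile=full.dmp logfile=full_exp.log \
      exclude=SCHEMA:\"=\'SCOTT\'\"
```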

How to monitor the current jobs

SELECT * FROM dba_datapump_jobs;
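Beyond querying the view, you can attach to a running or stopped job interactively and issue commands such as STATUS, STOP_JOB, or START_JOB at the prompt. A sketch, assuming a job name taken from DBA_DATAPUMP_JOBS (the name below is hypothetical):

```shell
# Attach to an existing job by name; once attached, type STATUS
# at the Export> prompt to see worker progress.
expdp user/password attach=SYS_EXPORT_SCHEMA_01
```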

What to Do If a Data Pump Job Crashes

As described above, each Data Pump job uses a master table, created in the schema of the user running the job, to track its progress and support restart; normally this table is dropped when the job completes.

If a Data Pump job crashes, the master table remains in that schema. Oracle will not permit another instance of that job to be started until the master table, which has the same name as the job, is removed with a DROP TABLE command.
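A sketch of the cleanup: find the orphaned job in DBA_DATAPUMP_JOBS, then drop the master table of the same name. The owner and job name below are hypothetical; substitute the values the query returns:

```shell
sqlplus -s / as sysdba <<'EOF'
-- Orphaned jobs typically show a NOT RUNNING state
select owner_name, job_name, state from dba_datapump_jobs;
-- Hypothetical owner and job name; the master table is named after the job
drop table SCOTT.SYS_EXPORT_SCHEMA_01;
EOF
```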
