
10g Data Pump

Anup - Tuesday, January 6, 2009

We use Data Pump for faster export and import operations in Oracle 10g.
A Data Pump export or import is performed by multiple server processes working under a job. Each job uses a master table, created in the schema of the user running the job, to control the requested operation, record its current status, and maintain its restartability. The master table is dropped when the job completes successfully.
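For example, a job can be given an explicit name with the JOB_NAME parameter (a standard expdp/impdp parameter); the master table is created with the same name as the job. A minimal sketch, where the angle-bracket names are placeholders:

expdp user/password schemas=<schema_name> job_name=<job_name> directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

While the job runs, a table named <job_name> is visible in the user's schema.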

Note:

  • We can export/import tables, schemas, and full databases.

  • All Data Pump .dmp and .log files are created on the server, not on the client machine.

  • Performance can be improved by using the PARALLEL parameter (see the combined example after this list).

  • We can use "%U" with the DUMPFILE parameter to allow multiple dump files to be created.

  • We can use the TABLE_EXISTS_ACTION=APPEND parameter to import data into existing tables.

  • The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects.

  • The DBA_DATAPUMP_JOBS view can be used to monitor the current jobs.
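A sketch combining several of these parameters. The angle-bracket names are placeholders, PARALLEL=4 and the EMP/DEPT filter are illustrative, and depending on your shell the quotes in the INCLUDE clause may need escaping or a parameter file:

expdp user/password schemas=<schema_name> directory=<directory_name> dumpfile=<dumpfile_name>%U.dmp logfile=<logfile_name>.log parallel=4 exclude=STATISTICS

impdp user/password schemas=<schema_name> directory=<directory_name> dumpfile=<dumpfile_name>%U.dmp logfile=<logfile_name>.log parallel=4 table_exists_action=APPEND include=TABLE:"IN ('EMP','DEPT')"

The %U in DUMPFILE expands to a two-digit number (01, 02, ...) so each parallel worker can write to its own file.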

How to Setup

Log in as SYSDBA and grant the following privileges to the user (the angle-bracket names are placeholders for your own user and directory object):

grant create any directory to <username>;
create or replace directory <directory_name> as '/oracle/export';
grant read, write on directory <directory_name> to <username>;
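To verify the setup, the directory object can be checked in the standard DBA_DIRECTORIES dictionary view:

select directory_name, directory_path from dba_directories;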

How to Export/Import Table

expdp user/password tables=EMP,DEPT directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

impdp user/password tables=EMP,DEPT directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

How to Export/Import Schema

expdp user/password schemas=<schema_name> directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

impdp user/password schemas=<schema_name> directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

How to Export/Import Database

expdp system/password full=Y directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

impdp system/password full=Y directory=<directory_name> dumpfile=<dumpfile_name>.dmp logfile=<logfile_name>.log

How to monitor the current jobs

Select * from dba_datapump_jobs;
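You can also re-attach the client to a running job with the ATTACH parameter; the job name is the one reported by DBA_DATAPUMP_JOBS:

expdp user/password attach=<job_name>

Once attached, interactive commands such as STATUS and STOP_JOB can be issued at the prompt.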

What to Do If a Data Pump Job Crashes


If a Data Pump job crashes, its master table remains in the schema. Oracle will not permit another instance of that job to be started until the master table is removed with a DROP TABLE <master_table_name>; command (the master table has the same name as the job).
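A sketch for finding and removing the leftover master table; the owner and job names come from DBA_DATAPUMP_JOBS, where a crashed job typically shows as NOT RUNNING:

select owner_name, job_name, state from dba_datapump_jobs;

drop table <owner_name>.<job_name>;

Drop the table only if you do not intend to restart the job, because the master table is what makes a restart possible.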

