What to Do If a DataPump Job Crashes
Each DataPump job uses a master table, created in the schema of the user account executing the job, to control the requested operation's activities, record its current status, and maintain its restartability. The master table has the same name as the DataPump job itself, even if that job name was generated automatically (the default behavior when no name is specified via the JOB_NAME parameter).
If a DataPump job crashes, however, this master table is left behind in that schema, and Oracle will not permit another instance of that job to be started until the master table is removed from the schema via the DROP TABLE <job_name>; command.
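Before dropping anything, it is worth confirming which DataPump jobs are actually defunct. One way to do this is via the DBA_DATAPUMP_JOBS view (the HR_EXPORT job name below is just an illustration):

```sql
-- Find DataPump jobs whose master tables are lingering after a crash
SELECT owner_name, job_name, state
  FROM dba_datapump_jobs
 WHERE state = 'NOT RUNNING';

-- Once the job is confirmed defunct, drop its master table
-- (the table name matches the job name)
DROP TABLE hr.hr_export;
```

Be careful not to drop the master table of a job that is merely stopped, since that destroys the job's restartability.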
Building a Custom DataPump Job
As I mentioned in my prior article, DataPump utilizes three components to do its work: a command-line interface to control DataPump operations; DBMS_METADATA, for dumping and processing DDL; and the DBMS_DATAPUMP PL/SQL package, supplied with Oracle 10g, that actually performs DataPump Export and Import operations. One really neat byproduct is that it is now possible to create a stored procedure or package that calls DBMS_DATAPUMP procedures and functions directly to create and manage DataPump jobs.
To illustrate, I will create a new stored procedure named SP_EXPORT. This new procedure will control a DataPump Export job that dumps out all metadata and data for all tables and their dependent objects in the Human Resources (HR) schema. Most of the DBMS_DATAPUMP procedure and function names are quite similar to those used in DataPump interactive mode, so I found coding this simple example to be quite intuitive (though not trivial).
Listing 2.10 shows the
stored procedure source code I used to implement this example, and Log 2.10 shows the resulting DataPump Export log
file generated when this export session was executed.
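The full implementation is in Listing 2.10; as a rough sketch, the core DBMS_DATAPUMP calls such a procedure makes look something like the following (the directory object DATA_PUMP_DIR, the file names, and the job name shown here are assumptions for illustration, not the listing's exact values):

```sql
CREATE OR REPLACE PROCEDURE hr.sp_export
AS
   hdl       NUMBER;         -- handle to the DataPump job
   job_state VARCHAR2(30);   -- terminal state reported by WAIT_FOR_JOB
BEGIN
   -- Create a schema-mode Export job; the job name also becomes
   -- the name of the job's master table
   hdl := DBMS_DATAPUMP.OPEN(
      operation => 'EXPORT'
     ,job_mode  => 'SCHEMA'
     ,job_name  => 'SP_HR_EXPORT');

   -- Attach a dump file and a log file to the job
   DBMS_DATAPUMP.ADD_FILE(
      handle    => hdl
     ,filename  => 'hr_export.dmp'
     ,directory => 'DATA_PUMP_DIR'
     ,filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
   DBMS_DATAPUMP.ADD_FILE(
      handle    => hdl
     ,filename  => 'hr_export.log'
     ,directory => 'DATA_PUMP_DIR'
     ,filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

   -- Restrict the job to the HR schema
   DBMS_DATAPUMP.METADATA_FILTER(
      handle => hdl
     ,name   => 'SCHEMA_EXPR'
     ,value  => 'IN (''HR'')');

   -- Start the job, then block until it reaches a terminal state
   DBMS_DATAPUMP.START_JOB(hdl);
   DBMS_DATAPUMP.WAIT_FOR_JOB(hdl, job_state);
END sp_export;
/
```

If you would rather have the procedure return immediately and let the export run in the background, replace the WAIT_FOR_JOB call with DBMS_DATAPUMP.DETACH(hdl) after START_JOB.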
Since I can create virtually any DataPump task as a stored procedure or package, I also gain the flexibility to schedule the task using the new Oracle 10g Scheduler. Here is an example of how to schedule this new stored procedure to execute nightly at 23:00:

BEGIN
   DBMS_SCHEDULER.CREATE_JOB(
      job_name        => 'HR_EXPORT'
     ,job_type        => 'PLSQL_BLOCK'
     ,job_action      => 'BEGIN HR.SP_EXPORT;END;'
     ,start_date      => '04/18/2005 23:00:00.000000'
     ,repeat_interval => 'FREQ=DAILY'
     ,enabled         => TRUE
     ,comments        => 'Performs HR Schema Export nightly at 11 PM'
   );
END;
/
For more information on the Oracle 10g Scheduler,
please review my earlier series
on this set of scheduling tools.
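To confirm that the Scheduler has registered the job and computed its next execution, you can (for example) query the *_SCHEDULER_JOBS views:

```sql
-- Check the nightly export job's status and next scheduled run
SELECT job_name, state, next_run_date
  FROM user_scheduler_jobs
 WHERE job_name = 'HR_EXPORT';
```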
As these practical examples have shown, Oracle 10g's DataPump Export and Import utilities offer a wider range of control over export and import operations than ever before. The new interactive command mode available during DataPump operations grants an Oracle DBA a high level of control over DataPump tasks. Finally, Oracle DBAs can create customized DataPump jobs by calling DBMS_DATAPUMP procedures and functions directly, and these jobs can be scheduled via the Oracle 10g Scheduler. These new capabilities are a welcome addition to any Oracle DBA's "tool belt" and offer significant alternatives to the original Import and Export utilities.
References and Additional Reading
While there is no substitute for direct experience, reading
the manual is not a bad idea, either. I have drawn upon the following Oracle
10g documentation for the deeper technical details of this article:
B10750-01 Oracle Database New Features Guide
B10825-01 Oracle Database
See All Articles by Columnist Jim Czuprynski