Oracle 10g DataPump, Part 2: Implementation - Page 3

By Jim Czuprynski

Posted Apr 28, 2005

What to Do If a DataPump Job Crashes

Each DataPump job uses a master table, created within the schema of the user account executing the job, to control the requested operation's activities, record its current status, and maintain its restartability. The master table is named after the DataPump job itself, even if that job name was generated automatically (the default behavior when no job name is specified via the JOB_NAME parameter).

If a DataPump job should crash, however, this master table will remain behind in that schema. Oracle will not permit another instance of that job to be started until this master table is removed from the schema via the DROP TABLE <job_name>; command.
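An orphaned master table is easy to spot: the DBA_DATAPUMP_JOBS view lists every DataPump job along with its current state, and a crashed job shows up there under the job's name. The cleanup sketch below is illustrative only; HR_EXPORT is a hypothetical job name owned by the HR schema:

-- List all DataPump jobs and their states; a crashed job typically
-- appears with a state of NOT RUNNING while its master table remains
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;

-- Drop the leftover master table (named after the job) so that
-- another instance of the job can be started
DROP TABLE hr.hr_export;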

Building a Custom DataPump Job

As I mentioned in my prior article, DataPump utilizes three components to do its work: a command-line interface to control DataPump operations; DBMS_METADATA, for dumping and processing DDL; and the DBMS_DATAPUMP PL/SQL package supplied with Oracle 10g Enterprise, which actually performs DataPump Export and Import operations. One really neat byproduct is that it is now possible to create a stored procedure or package that calls DBMS_DATAPUMP procedures and functions directly to create and manage DataPump jobs.

To illustrate, I will create a new stored procedure named HR.SP_EXPORT. This new procedure will control a DataPump Export job that dumps out all metadata and data for all tables and their dependent objects in the Human Resources (HR) schema. Most of the DBMS_DATAPUMP procedure and function names are extremely similar to those used in DataPump interactive mode, so I found coding this simple example to be quite intuitive (but not trivial).

Listing 2.10 shows the stored procedure source code I used to implement this example, and Log 2.10 shows the resulting DataPump Export log file generated when this export session was executed.
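Listing 2.10 contains the full source; in outline, a procedure along these lines opens a schema-mode Export job, attaches dump and log files, applies a schema filter, and starts the job. The sketch below is illustrative only: the directory object EXPORT_DIR and the job name HR_EXPORT are placeholder names, and error handling is omitted.

CREATE OR REPLACE PROCEDURE hr.sp_export
AS
    n_handle    NUMBER;
BEGIN
    -- Open a schema-mode DataPump Export job
    n_handle := DBMS_DATAPUMP.OPEN(
         operation => 'EXPORT'
        ,job_mode  => 'SCHEMA'
        ,job_name  => 'HR_EXPORT'
    );

    -- Attach the dump file and log file to the job; EXPORT_DIR is a
    -- placeholder for an existing directory object
    DBMS_DATAPUMP.ADD_FILE(
         handle    => n_handle
        ,filename  => 'hr_export.dmp'
        ,directory => 'EXPORT_DIR'
        ,filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
    );
    DBMS_DATAPUMP.ADD_FILE(
         handle    => n_handle
        ,filename  => 'hr_export.log'
        ,directory => 'EXPORT_DIR'
        ,filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE
    );

    -- Limit the job to the HR schema and its dependent objects
    DBMS_DATAPUMP.METADATA_FILTER(
         handle => n_handle
        ,name   => 'SCHEMA_EXPR'
        ,value  => 'IN (''HR'')'
    );

    -- Start the job, then detach; the job continues running in the
    -- background and writes its progress to the log file
    DBMS_DATAPUMP.START_JOB(n_handle);
    DBMS_DATAPUMP.DETACH(n_handle);
END sp_export;
/

Note that in a real nightly job the dump file name would need to be made unique per run (for example, by appending a date stamp), since DataPump will not overwrite an existing dump file.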

Since I can create virtually any DataPump task as a stored procedure or package, I also gain the flexibility to schedule that task using the new Oracle 10g Scheduler. Here is an example of how to schedule this new stored procedure to execute nightly at 23:00:

BEGIN
	DBMS_SCHEDULER.CREATE_JOB (
		 job_name => 'HR_EXPORT'
		,job_type => 'PLSQL_BLOCK'
		,job_action => 'BEGIN HR.SP_EXPORT;END;'
		,start_date => '04/18/2005 23:00:00.000000'
		,repeat_interval => 'FREQ=DAILY'
		,enabled => TRUE
		,comments => 'Performs HR Schema Export nightly at 11 PM'
    );
END;
/
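Once the Scheduler job exists, it is easy to confirm that it is enabled and see when it will next run, or to fire it immediately for a first test, from the owning account:

-- Confirm the job is enabled and scheduled
SELECT job_name, enabled, state, next_run_date
  FROM user_scheduler_jobs
 WHERE job_name = 'HR_EXPORT';

-- Optionally run the export immediately instead of waiting for 23:00
EXEC DBMS_SCHEDULER.RUN_JOB('HR_EXPORT');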

For more information on the Oracle 10g Scheduler, please review my earlier series on this set of scheduling tools.

Conclusion

As these practical examples have shown, Oracle 10g's DataPump Export and Import utilities offer a greater range of control over export and import operations than ever before. The new interactive command mode available during DataPump operations grants an Oracle DBA a high level of control over DataPump tasks. Finally, Oracle DBAs can create customized DataPump jobs by calling DBMS_DATAPUMP procedures and functions directly, and those jobs can be scheduled via the Oracle 10g Scheduler. These new capabilities are a welcome addition to any Oracle DBA's "tool belt" and offer significant alternatives to the original Import and Export utilities.

References and Additional Reading

While there is no substitute for direct experience, reading the manual is not a bad idea, either. I have drawn upon the following Oracle 10g documentation for the deeper technical details of this article:

B10750-01 Oracle Database New Features Guide

B10825-01 Oracle Database Utilities
