Database Journal
Featured Database Articles

Oracle

Posted Apr 28, 2005

Oracle 10g DataPump, Part 2: Implementation - Page 2

By Jim Czuprynski

DataPump Import Scenarios

Oracle 10g's DataPump Import utility also demonstrates its power and flexibility when it is aimed at some real-world data and object import requirements:

Importing From Multiple DataPump Export Dump Files. DataPump Import automatically uses all available export dump files during an import operation as long as the dump file name qualifier specified for the DUMPFILE parameter matches that of the desired export files (including that handy %U variable).

To illustrate, I will use the export files I created via the example in Listing 2.4 as the data source. I will target those few tables that need to be refreshed in the Human Resources schema for the Payroll application refresh scenario. Listing 2.6 shows the DataPump Import control file used to accomplish this, and Log 2.6 contains the resulting log file generated.
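Listing 2.6 itself is not reproduced on this page; as a rough sketch, a parameter file for this kind of multi-file, selective import might look like the following (the directory object, dump file qualifier, and table names are illustrative assumptions, not the actual listing):

```
# Hypothetical impdp parameter file: refresh selected HR tables
# from a multi-file DataPump export set (all names are assumptions)
DIRECTORY=export_dir
DUMPFILE=hr_export_%U.dmp
SCHEMAS=HR
INCLUDE=TABLE:"IN ('EMPLOYEES','JOBS','JOB_HISTORY')"
TABLE_EXISTS_ACTION=REPLACE
LOGFILE=hr_refresh.implog
```

Because the DUMPFILE value uses the %U wildcard, DataPump Import picks up every matching file in the export set automatically.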

Importing Selected Objects. Just as the original Import utility did, DataPump Import can load data from an existing DataPump Export dump file into existing tables. DataPump Import has enhanced this capability by providing a more robust set of parameters, INCLUDE and EXCLUDE, to specify limits on which objects should be imported.

Listing 2.7 shows the DataPump Import control file that illustrates this feature. I will use the disaster recovery nightly export dump file I generated in a previous example as the source for reloading data into the Sales History (SH) schema. I have specified the tables I want to reload in the INCLUDE parameter, and I have also instructed DataPump Import to first truncate the tables before loading them by specifying a value of TRUNCATE for the TABLE_EXISTS_ACTION parameter.
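A minimal sketch of the kind of parameter file Listing 2.7 describes follows; the dump file qualifier and table names are assumptions chosen to match the scenario, not the actual listing:

```
# Hypothetical impdp parameter file: reload selected SH tables,
# truncating each table before loading (names are assumptions)
DIRECTORY=export_dir
DUMPFILE=nightly_dr_%U.dmp
SCHEMAS=SH
INCLUDE=TABLE:"IN ('SALES','COSTS')"
TABLE_EXISTS_ACTION=TRUNCATE
LOGFILE=sh_reload.implog
```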

Log 2.7 contains the resulting log file from this import operation. Note that DataPump Import wisely refused to truncate any table on which a foreign key constraint depended, and gently reminded me of this fact by logging an error message during the attempted TRUNCATE operation.

Generating SQLFILEs Instead of Importing Data. DataPump Import also offers the ability to write out the DDL statements it would have executed to create the objects and tables in the export file to a SQLFILE, instead of actually performing the import.

To demonstrate, I will set up a DataPump Import control file (shown in Listing 2.8) that redirects the DDL to a SQLFILE stored within the export directory; Log 2.8 shows the SQLFILE output that results from this type of operation. The SQLFILE output produced is virtually identical to the metadata output generated when DBMS_METADATA.GET_DDL is run against a database's objects, which is not surprising, since DataPump relies on that supplied PL/SQL package to generate all metadata during export and import operations.
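The heart of such a control file is the SQLFILE parameter, which takes an optional directory object qualifier. A sketch, with assumed file names:

```
# Hypothetical impdp parameter file: extract DDL only, load nothing
DIRECTORY=export_dir
DUMPFILE=nightly_dr_%U.dmp
SQLFILE=export_dir:nightly_dr_ddl.sql
LOGFILE=sqlfile_run.implog
```

When SQLFILE is present, no data or objects are actually created; the run produces only the DDL script and a log file.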

Remapping Database Object Attributes During DataPump Imports. Another handy feature of DataPump Import is the ability to remap database tables and objects stored in one schema to another completely new schema. See Listing 1.4 from the prior article in this series for an example of how to implement this feature.
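Listing 1.4 from the prior article is not reproduced here; the core of such an operation is the REMAP_SCHEMA parameter, sketched below with assumed schema names:

```
# Hypothetical impdp parameter file: load HR's exported objects
# into a different schema, HR_TEST (names are assumptions)
DIRECTORY=export_dir
DUMPFILE=hr_export_%U.dmp
REMAP_SCHEMA=HR:HR_TEST
LOGFILE=hr_remap.implog
```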

Monitoring and Controlling DataPump Jobs

As mentioned in the prior article, the DataPump utility provides a new interactive mode that gives an Oracle DBA extensive control over DataPump Export and Import operations.
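Besides the interactive client, running DataPump jobs can also be observed from any SQL*Plus session via the DBA_DATAPUMP_JOBS data dictionary view. A minimal query might look like this:

```
-- List all DataPump jobs and their current state
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;
```

This is handy when no client session is attached to the job at all.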

Switching Between Logging and Interactive Client Mode. To illustrate, I will initiate a long-running DataPump Export based on the parameter file shown in Listing 2.9:
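Listing 2.9 itself is not reproduced on this page; a plausible reconstruction, based on the job name, dump file specification, file size, and log file name visible in the session output below (the remaining values are assumptions), would be:

```
# Hypothetical contents of longexport.dpectl
FULL=Y
JOB_NAME=LONGEXPORT
DIRECTORY=export_dir
DUMPFILE=long_export_%U.dmp
FILESIZE=5M
LOGFILE=LongExport.explog
ESTIMATE=BLOCKS
```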

C:\RMANCMD>expdp system/oracle parfile=c:\rmancmd\longexport.dpectl

Export: Release 10.1.0.2.0 - Production on Saturday, 16 April, 2005 12:44

Copyright (c) 2003, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
FLASHBACK automatically enabled to preserve database integrity.
Starting "SYSTEM"."LONGEXPORT":  system/******** parfile=c:\rmancmd\longexport.dpectl
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA

Now I will immediately switch this job to interactive client mode by typing CTRL-C:

Export>

Viewing Job Status. From interactive mode, I can issue the STATUS command to view the job's current status:

Export> status
Job: LONGEXPORT
  Operation: EXPORT
  Mode: FULL
  State: EXECUTING
  Bytes Processed: 0
  Current Parallelism: 1
  Job Error Count: 0
  Dump File: c:\oracle\export_dir\long_export_%u.dmp
    size: 262,144
  Dump File: C:\ORACLE\EXPORT_DIR\LONG_EXPORT_01.DMP
    size: 262,144
    bytes written: 4,096

Worker 1 Status:
  State: EXECUTING

Closing and Reattaching To A Job. To switch from interactive client mode back to logging mode, I can issue the CONTINUE_CLIENT command. If I want to detach from a client session but leave a running job still executing, I can issue the EXIT_CLIENT command:

Export> exit_client
C:\RMANCMD>

Likewise, to reattach to a running job after closing the client connection, I can reopen a session while specifying the ATTACH=<job_name> parameter on the command line:

C:\RMANCMD>expdp system/oracle attach=longexport

Export: Release 10.1.0.2.0 - Production on Saturday, 16 April, 2005 13:03
Copyright (c) 2003, Oracle.  All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options

Job: LONGEXPORT
  Owner: SYSTEM
  Operation: EXPORT
  Creator Privs: FALSE
  GUID: 0CC46AD48B4B4C5AB17B9CBD2C85810C
  Start Time: Saturday, 16 April, 2005 13:03
  Mode: FULL
  Instance: zdcdb
  Max Parallelism: 1
  EXPORT Job Parameters:
  Parameter Name      Parameter Value:
     CLIENT_COMMAND        system/******** parfile=c:\rmancmd\longexport.dpectl
     DATA_ACCESS_METHOD    AUTOMATIC
     ESTIMATE              BLOCKS
     INCLUDE_METADATA      1
     LOG_FILE_DIRECTORY    EXPORT_DIR
     LOG_FILE_NAME         LongExport.explog
     TABLE_CONSISTENCY     0
  State: IDLING
  Bytes Processed: 0
  Current Parallelism: 1
  Job Error Count: 0
  Dump File: c:\oracle\export_dir\long_export_%u.dmp
    size: 5,242,880
  Dump File: c:\oracle\export_dir\long_export_01.dmp
    size: 5,242,880
    bytes written: 4,096

Worker 1 Status:
  State: UNDEFINED
Export>

Controlling A Job's Execution. If I decide to add more dump files to the job, I can do that via the ADD_FILE command, and if I want to change the number of active worker processes, I can issue the PARALLEL command:

Export> status

Job: LONGEXPORT
  Operation: EXPORT
  Mode: FULL
  State: EXECUTING
  Bytes Processed: 0
  Current Parallelism: 1
  Job Error Count: 0
  Dump File: c:\oracle\export_dir\long_export_%u.dmp
    size: 262,144
  Dump File: C:\ORACLE\EXPORT_DIR\LONG_EXPORT_01.DMP
    size: 262,144
    bytes written: 4,096

Worker 1 Status:
  State: EXECUTING

Export> parallel=2

Export> status

Job: LONGEXPORT
  Operation: EXPORT
  Mode: FULL
  State: EXECUTING
  Bytes Processed: 0
  Current Parallelism: 2
  Job Error Count: 0
  Dump File: c:\oracle\export_dir\long_export_%u.dmp
    size: 262,144
  Dump File: C:\ORACLE\EXPORT_DIR\LONG_EXPORT_01.DMP
    size: 262,144
    bytes written: 4,096

Worker 1 Status:
  State: EXECUTING

Worker 2 Status:
  State: WORK WAITING
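The ADD_FILE command mentioned above follows the same interactive pattern; a hypothetical invocation that adds another dump file specification to the running job's file set (the file name here is an assumption) might look like:

```
Export> add_file=export_dir:long_export_extra_%u.dmp
```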

I can also halt the execution of the job via the STOP_JOB command:

Export> stop_job

At this point, I have two options. If I want to restart the job, I can reattach to it and issue the START_JOB command; the job will resume at its current point. On the other hand, if I want to end the job entirely (perhaps because it is running too long, was executed in error, or is consuming too much space), I can reattach to the session and then issue the KILL_JOB command:

C:\RMANCMD>expdp system/oracle attach=longexport

Export: Release 10.1.0.2.0 - Production on Saturday, 16 April, 2005 13:03

Copyright (c) 2003, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options

Job: LONGEXPORT
  Owner: SYSTEM
  Operation: EXPORT
  Creator Privs: FALSE
  GUID: 0CC46AD48B4B4C5AB17B9CBD2C85810C
  Start Time: Saturday, 16 April, 2005 13:03
  Mode: FULL
  Instance: zdcdb
  Max Parallelism: 1
  EXPORT Job Parameters:
  Parameter Name      Parameter Value:
     CLIENT_COMMAND        system/******** parfile=c:\rmancmd\longexport.dpectl
     DATA_ACCESS_METHOD    AUTOMATIC
     ESTIMATE              BLOCKS
     INCLUDE_METADATA      1
     LOG_FILE_DIRECTORY    EXPORT_DIR
     LOG_FILE_NAME         LongExport.explog
     TABLE_CONSISTENCY     0
  State: IDLING
  Bytes Processed: 0
  Current Parallelism: 1
  Job Error Count: 0
  Dump File: c:\oracle\export_dir\long_export_%u.dmp
    size: 5,242,880
  Dump File: c:\oracle\export_dir\long_export_01.dmp
    size: 5,242,880
    bytes written: 4,096

Worker 1 Status:
  State: UNDEFINED

Export> kill_job
Are you sure you wish to stop this job ([y]/n): y
C:\RMANCMD>

