Database Journal
Posted Jun 17, 2004

Striking Gold with LogMiner - Part 1: Getting Started

By Steve Callan

Oracle's LogMiner utility has been around for several years now, and it originally appeared as a new feature (as opposed to an improvement on an existing one). Each major release of Oracle typically ushers in a new set of features, some of which are genuinely useful and can make your job as a DBA easier. The "buzz" around LogMiner has worn off. Part of its initial allure was its ability to help DBAs recover data by generating SQL "undo" statements from the online (and archived) redo logs. Part of what made LogMiner unattractive to use is its interface and its reliance on executing a package with a laundry list of options.

The purpose of this series is to demystify some of the procedures, illustrate some examples of using LogMiner, and recommend cases where LogMiner should be a DBA's primary tool for recovering lost data.

Let's look at an outline of the setup and initialization steps you can use to make LogMiner more palatable. Using LogMiner involves (or can involve) a one-time setup, a command to start the utility or session, SQL queries to capture the "redo" or "undo" statements, and a command to end the session. Data is exposed through dynamic performance views ("v$" views), and the query to retrieve the SQL statements is actually the easiest part of using LogMiner.

Shown below is a summary table of the steps involved.

Step                      Notes
------------------------  ----------------------------------------------------
Create a dictionary       Use a flat file (UTL_FILE_DIR must be set), or
                          use the redo logs (database must be in archivelog
                          mode), or use an online catalog
Add logfiles (online      The first file is added as NEW; the second and
or archived)              later files are added with ADDFILE; files can also
                          be removed; only one member from each log group
                          is needed
Start the session         Use the PRINT_PRETTY_SQL option for better
                          readability; specify a time range or a range of SCNs
End the session           One statement with no options
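To preview where the series is headed, the start/query/end steps above can be sketched as a short SQL*Plus session. This is only a sketch: the dictionary path matches the one used later in this article, and the time range is a placeholder you would adjust to your own window of interest.

```sql
-- Start a LogMiner session against the already-added logfiles,
-- using the flat file dictionary and PRINT_PRETTY_SQL for readability
SQL> exec dbms_logmnr.start_logmnr( -
> dictfilename => 'c:\ora9i\admin\db00\file_dir\dictionary.ora', -
> starttime => to_date('17-JUN-2004 09:00:00','DD-MON-YYYY HH24:MI:SS'), -
> endtime   => to_date('17-JUN-2004 10:00:00','DD-MON-YYYY HH24:MI:SS'), -
> options   => dbms_logmnr.print_pretty_sql);

-- The mined statements are exposed in V$LOGMNR_CONTENTS
SQL> select sql_redo, sql_undo
  2  from v$logmnr_contents
  3  where seg_owner = 'SCOTT';

-- Ending the session is one statement with no options
SQL> exec dbms_logmnr.end_logmnr;
```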

Creating a dictionary

LogMiner uses a dictionary to "translate internal object identifiers and datatypes to object names and external data formats." (Oracle9i Database Administrator's Guide) You have a choice of three types of dictionaries for LogMiner. To use the flat file version (which I use later in the article), you will need the UTL_FILE_DIR parameter set.

Having picked the flat file version for my dictionary, what does it take to set the parameter? Trivia question: whether or not you are using an SPFILE, can you dynamically set the UTL_FILE_DIR parameter? One way to find out is to look at what is displayed in the Enterprise Manager Console "Configuration" window.

In the highlighted line, there is no checkmark in the "Dynamic" column, so if "utl_file_dir" is not already set, you will have to set it and bounce the instance.
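For reference, setting the parameter looks like the following (the directory shown is the one used later in the article; since UTL_FILE_DIR is static, the SCOPE=SPFILE clause defers the change until the instance restarts):

```sql
-- UTL_FILE_DIR is static, so record the change in the SPFILE
-- (or edit init.ora) and bounce the instance
SQL> alter system set utl_file_dir = 'c:\ora9i\admin\db00\file_dir'
  2  scope = spfile;

System altered.

SQL> shutdown immediate
SQL> startup

-- Verify after the restart
SQL> show parameter utl_file_dir
```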

Before running the DBMS_LOGMNR utility, you may need to compile the packages. Instead of compiling each invalid object one at a time, you can use the utlrp.sql script.

SQL> select object_name, status
from dba_objects
where object_name like '%LOG%'
and object_type = 'PACKAGE';

OBJECT_NAME                    STATUS
------------------------------ -------
DBMS_LOGMNR                    INVALID
DBMS_LOGMNR_D                  INVALID
DBMS_LOGSTDBY                  VALID
(a couple of others are not shown)
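Recompiling with utlrp.sql is a one-liner from SQL*Plus (run it as SYSDBA; the "?" expands to ORACLE_HOME):

```sql
-- Run Oracle's recompile-invalid-objects script
SQL> @?/rdbms/admin/utlrp.sql

-- Re-run the status query; the LogMiner packages should now be VALID
SQL> select object_name, status
  2  from dba_objects
  3  where object_name like '%LOGMNR%'
  4  and object_type = 'PACKAGE';
```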

After your DBMS_LOGMNR package is ready to go, you can create the LogMiner dictionary. Two recommendations from Oracle are to use a separate tablespace for LogMiner objects and to enable supplemental logging. The examples for these recommendations are shown below along with the "create dictionary" statement. Note that the package used to create the dictionary is DBMS_LOGMNR_D.

SQL> execute dbms_logmnr_d.build ('dictionary.ora', -
> 'c:\ora9i\admin\db00\file_dir', -
> options => dbms_logmnr_d.store_in_flat_file);

PL/SQL procedure successfully completed.

SQL> execute dbms_logmnr_d.set_tablespace('logmnr_ts');

PL/SQL procedure successfully completed.

SQL> alter database add supplemental log data
  2  (primary key, unique index) columns;

Database altered.
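You can confirm that supplemental logging took effect by querying V$DATABASE (the column names below are as documented for Oracle9i Release 2):

```sql
-- YES in these columns means primary key / unique index
-- supplemental logging is enabled
SQL> select supplemental_log_data_pk, supplemental_log_data_ui
  2  from v$database;
```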

Adding logfiles

With these steps completed, we are now ready to add logfiles to the collection of files we want to examine via SQL statements. Adding logfiles is very easy, and you can add and remove them at will. You can also use a shortcut to add files (no need to specify the "OPTIONS" parameter). Track down the names of your log files (we are not using archive logs here) with:

SQL> select * from v$logfile;

 GROUP# STATUS  TYPE    MEMBER
------- ------- ------- -----------------------------------
      3         ONLINE  C:\ORA9I\ORADATA\DB00\REDO03.LOG
      2         ONLINE  C:\ORA9I\ORADATA\DB00\REDO02.LOG
      1         ONLINE  C:\ORA9I\ORADATA\DB00\REDO01.LOG

With these file names in hand, use the ADD_LOGFILE procedure with the NEW option for the first file and the ADDFILE option for subsequent log files.

SQL> exec dbms_logmnr.add_logfile( -
> logfilename => 'C:\ORA9I\ORADATA\DB00\REDO03.LOG', -
> options => dbms_logmnr.new);

PL/SQL procedure successfully completed.

SQL> exec dbms_logmnr.add_logfile( -
> logfilename => 'C:\ORA9I\ORADATA\DB00\REDO02.LOG', -
> options => dbms_logmnr.addfile);

PL/SQL procedure successfully completed.

SQL> exec dbms_logmnr.add_logfile( -
> logfilename => 'C:\ORA9I\ORADATA\DB00\REDO01.LOG', -
> options => dbms_logmnr.addfile);

PL/SQL procedure successfully completed.

Log files are easily removed:

SQL> exec dbms_logmnr.add_logfile( -
> logfilename => 'C:\ORA9I\ORADATA\DB00\REDO01.LOG', -
> options => dbms_logmnr.removefile);

PL/SQL procedure successfully completed.

and easily replaced:

SQL> exec dbms_logmnr.add_logfile( -
> logfilename => 'C:\ORA9I\ORADATA\DB00\REDO01.LOG');

PL/SQL procedure successfully completed.
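At any point, you can check which files are currently registered with the session by querying V$LOGMNR_LOGS:

```sql
-- Each row is a logfile LogMiner will mine
SQL> select log_id, filename
  2  from v$logmnr_logs;
```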
