Posted Jan 14, 2008

Oracle 10g Backup Guide: A Small County Government Approach - Page 2

By Kevin E. Medlin

5. Create archivelog backup which includes Recovery Catalog housekeeping

The archivelog backup is taken every day. As mentioned earlier, we normally take a level 0 backup only once per week, so because this script runs daily, we also perform our RMAN Recovery Catalog maintenance in this step. The script is called DBNAME_arc_only_disk.rcv.

Code Listing 8:

#************* Configure controls for Recovery Manager ************
#************* in case they have been changed *********************
configure controlfile autobackup on;
configure controlfile autobackup format for device type disk 
	to 'X:\rman\CFILE_%F.bks';
configure retention policy to recovery window of 34 days;

#************* Perform nightly RMAN Archivelog Backup *************
backup archivelog all format 'X:\rman\ARC_%d_%U_%T_%s.bks';

#************* Maintenance Step to remove old Archivelogs *********
delete noprompt archivelog until time 'SYSDATE - 3';

#************* Maintenance Steps to clean Recovery Catalog ********
report obsolete device type disk;
crosscheck backup completed before 'sysdate-34';
delete noprompt obsolete recovery window of 34 days device type disk;
delete noprompt expired backup device type disk;
#************* Show all controls configured for this **************
#************* Database in RMAN ***********************************
show all;

#************* List all RMAN disk backups *************************
list backup of database device type disk;

The first two commands configure RMAN's control file and spfile autobackup. The first turns autobackup on, so that every time a backup runs for a target configured this way, the control file and spfile are backed up as well. The second tells RMAN how to name that file: in this case, backups to disk are named 'X:\rman\CFILE_%F.bks'. The third line tells the Recovery Catalog that backups of the target database are good for 34 days. The fourth line actually performs the backup, instructing RMAN to back up all the archivelogs to disk in the specified format. The next step removes all archivelogs older than three days; we like to keep three days of archivelogs on disk.

Then the Recovery Catalog maintenance begins. Catalog maintenance is very important: if these files were deleted and the maintenance steps not performed, the Recovery Catalog would contain information about backups that were no longer significant (Alapati, 2005, p. 661). The report command lists obsolete disk backups under our 34-day retention policy; any backups older than that are considered obsolete and can be deleted. The crosscheck command notifies you whether any of your files are missing; if they are, they are marked as expired. The two delete commands then remove the obsolete and expired information from the catalog. Remember, we delete all of our RMAN backup files from disk every 7 days, but our retention policy is set to 34 days because that is our on-site tape retention policy. If needed, we could restore RMAN files up to 34 days old to a server, and we could recover them to a database, because all the RMAN information about those backups is still in the Recovery Catalog. Finally, the show all command displays the parameters configured in RMAN (Hart & Freeman, 2007, p. 89), and the list command shows all the disk backups that are still relevant in the Recovery Catalog.
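As an aside, the retention arithmetic behind the 34-day window is easy to picture. The sketch below is ours, not RMAN internals; the function name and sample dates are purely illustrative:

```python
from datetime import datetime, timedelta

# Minimal sketch of the retention logic described above: a backup becomes
# obsolete once it falls outside the 34-day recovery window.
RECOVERY_WINDOW_DAYS = 34

def is_obsolete(completed: datetime, now: datetime) -> bool:
    """Mirrors the effect of 'delete obsolete recovery window of 34 days'."""
    return completed < now - timedelta(days=RECOVERY_WINDOW_DAYS)
```

With this window, a backup completed 44 days ago would be reported obsolete, while one from 13 days ago is still needed for recovery.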

Tip: Obviously, the Recovery Catalog is very important and needs to be backed up (Looney, 2004, p. 918). It is also the easiest to recover. If you lose your Recovery Catalog and you’re in a pinch, you can import the Recovery Catalog schema into any database and voila! You’ve got a new Recovery Catalog (Exporting and Importing, 2005).

The batch file for this script is simple. The batch file is called DBNAME_arc_only_disk.bat.

Code Listing 9:

set CURDIR=D:\oracle\admin\common\backup
cd /d %CURDIR%

rman target ID/pword@DBname catalog rcatID/rcatpword@rcatname log=logs\DBNAME_arc_only_disk.log @DBNAME_arc_only_disk.rcv

page_on_backup.vbs DBNAME_arc_only_disk.log page_DBNAME_arc_only.log DBNAME

The batch file first sets the working directory; this matters for scheduled tasks, because Windows needs to be pointed at the directory containing the RMAN script. Next, RMAN is called, logging into both the target and the catalog. A log file is written to a "logs" subdirectory of the current directory, and the RMAN script from code listing 8 is executed. If there are any errors, a Visual Basic script is called that pages support personnel; if there are no errors, an email of the log file is sent. There will be more details on paging in section 9.

6. Delete data pump export files

Data Pump needs new file names for the dump files each time it runs. Unlike export, it will not overwrite old dump files. So prior to any nightly data pump scheduled task, the old data pump files need to be removed. This batch file does just that.

Code Listing 10:

set CURDIR=D:\oracle\admin\common\batch
set logfile=logs\delete_Xdrive_expdp_files.log
echo 'logfile = ' %logfile% >  %logfile%
echo . >> %logfile%
echo . >> %logfile%
echo '*********************************************************' >> %logfile%
echo '* The following files will be deleted.                  *' >> %logfile%
echo '*********************************************************' >> %logfile%
echo . >> %logfile%
echo . >> %logfile%
FORFILES /p X:\data_pump\DMPs /m *.dmp   /d -0 /c "CMD /C echo @PATH\@FILE   @FDATE.@FTIME" >> %logfile%
echo . >> %logfile%
echo . >> %logfile%
echo '*********************************************************' >> %logfile%
echo '* Starting deletes now ...                              *' >> %logfile%
echo '*********************************************************' >> %logfile%
echo . >> %logfile%
echo . >> %logfile%
FORFILES /p X:\data_pump\DMPs /d -0 /c "CMD /C del  *.dmp" >> %logfile%
echo . >> %logfile%
echo . >> %logfile%

As in the other batch scripts here, we first set a variable for the working directory. We also set the log file name as a variable, since we use it frequently; in fact, the first entry written to the log is the log file's own name. The echoed dots are just for readability in the log. There are really only two significant commands in this script, and both are FORFILES: the first simply lists the files that will be deleted, and the second actually deletes them.
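For illustration, here is a hedged Python analogue of those two FORFILES commands: first log the *.dmp files that are about to go, then delete them. The directory is the article's path; the function names are our own.

```python
import glob
import os
import time

DUMP_DIR = r"X:\data_pump\DMPs"  # the article's dump directory

def list_dump_files(dump_dir):
    """List the *.dmp files, as the first FORFILES command does."""
    return sorted(glob.glob(os.path.join(dump_dir, "*.dmp")))

def delete_dump_files(dump_dir, log):
    """Log each dump file with its timestamp, then remove it."""
    for path in list_dump_files(dump_dir):
        stamp = time.strftime("%m/%d/%Y.%H:%M",
                              time.localtime(os.path.getmtime(path)))
        log.write("%s   %s\n" % (path, stamp))  # same info FORFILES echoes
        os.remove(path)
```

Only *.dmp files are matched, so other files in the directory (logs, scripts) are left alone, just as the /m *.dmp mask does above.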

7. Perform data pump export

As an additional safety measure in our portfolio, we also take nightly Data Pump exports. They have a side benefit: when we need a table or two restored, it is far easier to get them from an export than from RMAN.

We use par files to hold our data pump commands, just like regular exports (Kumar, Kanagaraj & Stroupe, 2005). We also have some directory objects created in the database; you will see them referenced in the par file. These are the SQL commands used to create them:

Code Listing 11:

create directory x_dp_dumps as 'X:\data_pump\DMPs';
create directory x_dp_logs  as 'X:\data_pump\logs';

These tell Data Pump where to write the dump files and the log files. Here are the par file contents:

Code Listing 12:

content   = all
dumpfile  = x_dp_dumps:DBNAME_FULL_%U.dmp
estimate  = statistics
full      = y
job_name  = DBNAME_FULL
logfile   = x_dp_logs:DBNAME_FULL.log
filesize  = 2G

Content equals all means we want to export everything, with no exclusions. The dumpfile parameter takes a file location and name; the location is given as a directory object, and the file name uses the substitution variable %U, which is replaced by a two-digit integer starting at 01. One file or many could be created, depending on the database size. Estimate gives you a good idea of how large your dump file will be; blocks is the default, but we use statistics since ours are current. Full specifies whether you want a full database mode export. Job_name is a preference, in case you like to name your own. Logfile is set up like dumpfile: a location given as a directory object, plus a name. Filesize we use as a preference; we like to keep our file sizes to 2 GB or less, because when copying or compressing, it is far easier and faster to move or compress 10 files at the same time than 1 big file.
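Since each database gets its own par file with the same shape, the rendering can be automated. The helper below is hypothetical (our own convenience, not part of the article's toolkit); the parameter values come straight from Code Listing 12:

```python
def render_parfile(db_name):
    """Render the per-database Data Pump par file from Code Listing 12."""
    params = [
        ("content",  "all"),
        ("dumpfile", "x_dp_dumps:%s_FULL_%%U.dmp" % db_name),
        ("estimate", "statistics"),
        ("full",     "y"),
        ("job_name", "%s_FULL" % db_name),
        ("logfile",  "x_dp_logs:%s_FULL.log" % db_name),
        ("filesize", "2G"),
    ]
    # Left-align keys so the output matches the listing's column layout.
    return "".join("%-9s = %s\n" % kv for kv in params)
```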

We call data pump as a scheduled task, but we set it up a little differently. We have an individual par file for each database and one common batch file to execute them. Here is the command used in Scheduled Tasks:

Code Listing 13:

D:\oracle\admin\common\expdp\expdp_DATABASE.bat DBNAME

Here is the actual batch file used to call the data pump par files.

Code Listing 14:

set ORACLE_HOME=D:\oracle\product\10.2.0\db_1

%ORACLE_HOME%\bin\expdp ID/pword@%DATABASE% parfile=D:\oracle\admin\common\expdp\expdp_%DATABASE%.par

The only thing being passed to the batch file is the database name. It becomes %DATABASE%. Performing data pump exports in this manner has worked out pretty well for us.
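The substitution in Code Listing 14 can be pictured in Python as well. In this sketch the single argument, the database name, ends up in both the connect string and the par file path; the credentials and paths are the article's placeholders and the function is ours:

```python
import os

ORACLE_HOME = r"D:\oracle\product\10.2.0\db_1"  # path from Code Listing 14

def expdp_command(database):
    """Build the expdp command line that the batch file assembles."""
    exe = os.path.join(ORACLE_HOME, "bin", "expdp")
    parfile = r"D:\oracle\admin\common\expdp\expdp_%s.par" % database
    return "%s ID/pword@%s parfile=%s" % (exe, database, parfile)
```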

8. Check logs for errors

Every night, after all the batch jobs on a server have completed, we run an error check on that server. It is a simple batch file that performs a qgrep for key words in key logs and formats the information in an easily readable fashion. As previously stated, you can easily tell the "old" logs from the current logs by how the files are named. Here is the batch file, called error_check.bat:

Code Listing 15:

set LOGFILE=error_check.log
echo ************************************************************ > %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo %SERVER% >> %LOGFILE%
echo Daily Error Report >> %LOGFILE%
date /T >> %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo .  >> %LOGFILE%
echo .  >> %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo The following files have been found with errors. >> %LOGFILE%
echo *********************************************************** >> %LOGFILE%
echo .  >> %LOGFILE%
echo .  >> %LOGFILE%
echo Backup files  >> %LOGFILE%
echo .  >> %LOGFILE%
qgrep -l RMAN- D:\oracle\admin\common\backup\logs\* >> %LOGFILE%
echo .  >> %LOGFILE%
echo .  >> %LOGFILE%
echo Batch files  >> %LOGFILE%
echo .  >> %LOGFILE%
qgrep -l error D:\oracle\admin\common\batch\logs\* >> %LOGFILE%
echo .  >> %LOGFILE%
echo .  >> %LOGFILE%
echo Clone files  >> %LOGFILE%
echo .  >> %LOGFILE%
qgrep -l RMAN- D:\oracle\admin\common\clone\logs\* >> %LOGFILE%
echo .  >> %LOGFILE%
echo .  >> %LOGFILE%
echo Alert Logs  >> %LOGFILE%
echo .  >> %LOGFILE%
qgrep -l ORA- D:\oracle\product\10.2.0\admin\DBNAME1\udump\* >> %LOGFILE%
qgrep -l ORA- D:\oracle\product\10.2.0\admin\DBNAME2\udump\* >> %LOGFILE%
echo .  >> %LOGFILE%
echo .  >> %LOGFILE%

The backup file check is for RMAN errors. The batch file check is for errors with file deletions and creations. The clone file check is for failed database duplications. The alert log check is a little misleading: it actually checks the udump directories for trace files containing errors. Shortly after the log is created, we send it to ourselves using a free email client called Bmail from Beyond Logic. This is what our email batch file, email_errors.bat, looks like:

Code Listing 16:

bmail -s -t support2@thecountyoverhere.gov -f Oracle@thecountyoverhere.gov -h -a "DBSERVER Daily Error Report" -m error_check.log
bmail -s -t support1@thecountyoverthere.gov -f Oracle@thecountyoverhere.gov -h -a "DBSERVER Daily Error Report" -m error_check.log
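For readers without qgrep, the qgrep -l checks in Code Listing 15 can be approximated in a few lines of Python. The function below is our own sketch and only mimics the "list files containing a marker" behavior; the path pattern and markers ("RMAN-", "error", "ORA-") are the ones the batch file uses:

```python
import glob

def files_with_errors(path_pattern, marker):
    """Return the files matching path_pattern whose contents contain marker,
    like 'qgrep -l marker path_pattern'."""
    hits = []
    for path in glob.glob(path_pattern):
        try:
            with open(path, errors="ignore") as f:
                if marker in f.read():
                    hits.append(path)
        except OSError:
            continue  # skip directories and unreadable files
    return sorted(hits)

# e.g. files_with_errors(r"D:\oracle\admin\common\backup\logs\*", "RMAN-")
```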

9. Page and/or email short error description

Some jobs need immediate notification upon failure. For these, we use a Visual Basic script that determines whether an error occurred and, if so, immediately sends us a page. The script runs at every execution and always sends an email with the log output; we want that on these jobs whether or not they complete successfully. But if a job fails, we also want a page indicating the failure. The script is called with three arguments, like this:

Code Listing 17:

page_on_backup.vbs DBNAME_arc_only_disk.log page_DBNAME_arc_only.log DBNAME

The arguments are log name, script log name, and database name. Here is a copy of page_on_backup.vbs. This is the script that runs in our RMAN level 0 backups and archive log backups.

Code Listing 18:

'This script emails the log file for a backup and searches it for the
'phrase "ORA-". If found, it pages the recipients.
'Additional pager numbers:
'   whodat - 7045551212@message.alltel.com
'   whodis - 7045551313@messaging.nextel.com

Dim ArgObj, var1, var2, var3
Set ArgObj = WScript.Arguments
var1 = ArgObj(0)   'backup log name
var2 = ArgObj(1)   'script log name
var3 = ArgObj(2)   'database name

'Email the log files
Dim WshShell1 : Set WshShell1 = CreateObject("WScript.Shell")
WshShell1.Run("D:\oracle\admin\common\error\bmail -s -t support1@thecountyoverthere.gov -f Oracle@thecountyoverhere.gov -h -a " & var1 & "  attached -m d:\oracle\admin\common\backup\logs\" & var1 & "")
WshShell1.Run("D:\oracle\admin\common\error\bmail -s -t support2@thecountyoverhere.gov -f Oracle@thecountyoverhere.gov -h -a " & var1 & "  attached -m d:\oracle\admin\common\backup\logs\" & var1 & "")

'msgbox "var1 = " & var1 & "  var2 = " & var2 & ""

Const ForReading = 1, ForWriting = 2

Set WshNetwork = WScript.CreateObject("WScript.Network")
Dim lgmain : Set lgmain = CreateObject("Scripting.FileSystemObject")
Dim lgmain2 : Set lgmain2 = lgmain.OpenTextFile("D:\Oracle\Admin\common\backup\logs\" & var2 & "", ForWriting, True)

lgmain2.WriteLine "Processing began: " & Now
lgmain2.WriteLine ""
lgmain2.WriteLine ""

Set objRegEx = CreateObject("VBScript.RegExp")
objRegEx.Global = True
objRegEx.Pattern = "ORA-"

Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile("D:\oracle\admin\common\backup\logs\" & var1 & "", ForReading)
strSearchString = objFile.ReadAll

Set colMatches = objRegEx.Execute(strSearchString)
If colMatches.Count > 0 Then
  Dim WshShell2 : Set WshShell2 = CreateObject("WScript.Shell")
  'The remaining arguments of these three bmail commands were truncated in
  'the original listing; only the surviving portion is reproduced here.
  WshShell2.Run("D:\oracle\admin\common\error\bmail -s -t 7045551414@my2way.com -f " & var3 & "@thecountyoverhere.gov -h -a " & var3)
  WshShell2.Run("D:\oracle\admin\common\error\bmail -s -t support1@thecountyoverthere.gov -f " & var3 & "@thecountyoverhere.gov -h -a " & var3)
  WshShell2.Run("D:\oracle\admin\common\error\bmail -s -t support2@thecountyoverhere.gov -f " & var3 & "@thecountyoverhere.gov -h -a " & var3)
  lgmain2.WriteLine "page completed"
End If
If colMatches.Count = 0 Then
  lgmain2.WriteLine "no problems found, no page required"
End If

The first thing the script does is send the log out by email. Next, it searches the log for error codes. If an error is found, a page is also sent; if not, the script completes without paging.
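That control flow is compact enough to restate as a sketch. In this hypothetical Python version, send_mail and send_page stand in for the bmail calls; only the "mail always, page on ORA-" logic comes from the script above:

```python
import re

# The same error marker page_on_backup.vbs scans for.
ERROR_PATTERN = re.compile(r"ORA-")

def process_backup_log(log_text, send_mail, send_page):
    """Email the log unconditionally; page only if an ORA- error appears."""
    send_mail(log_text)                    # log is emailed every run
    if ERROR_PATTERN.search(log_text):
        send_page("backup error found")    # page only on failure
        return True                        # 'page completed'
    return False                           # 'no problems found, no page required'
```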


A successful backup plan is a major portion of any database administrator's overall database strategy. Backups must be carefully planned and checked often. Automation is a good thing and can be very useful, but it must be thoroughly defined and rigorously tested. It can be done if you think about your environment logically. Ask questions such as, "What must be done first? What must be done next?" and so on. When you reach a roadblock, think about other ways to perform the same task. This becomes easier if you think through your environment and what you want to accomplish ahead of time; that can keep you from having to backtrack. Sometimes changing the order of tasks may accomplish your goal as well.


Alapati, Sam R. (2005). Expert Oracle Database 10g Administration. New York. Springer-Verlag New York, Inc.

Bersinic, Damir & Watson, John (2005). Oracle Database 10g OCP Certification All-In-One Exam Guide [Electronic Version]. California. The McGraw-Hill Companies, Inc.

Bryla, Bob & Thomas, Biju (2005). OCP: Oracle 10g New Features for Administrators Study Guide. California. Sybex, Inc.

Chien, Timothy (2005). Best Practices for Oracle Database 10g Backup and Recovery. Retrieved 11-15-2007 from http://www.oracle.com/technology/deploy/availability/pdf/S942_Chien.doc.pdf.

(2005). Exporting and Importing the Recovery Catalog. Retrieved 11-15-2007.
Fenner, Tim (2007-08-22). Demystifying the AutoExNT tool in the Windows Server 2003 Resource Kit. Retrieved 11-15-2007 from http://searchwincomputing.techtarget.com/tip/0,289483,sid68_gci1269060,00.html.

Freeman, Robert G. (2004). Oracle Database 10g New Features. California. The McGraw-Hill Companies, Inc.

Greenwald, Rick, Stackowiak, Robert & Stern, Jonathan. (2004). Oracle Essentials: Oracle Database 10g. California. O'Reilly Media, Inc.

Hart, Matthew & Freeman, Robert G. (2007). Oracle Database 10g RMAN Backup & Recovery. California. The McGraw-Hill Companies, Inc.

Kumar, Arun R., Kanagaraj, John & Stroupe, Richard (2005). Oracle Database 10g INSIDER SOLUTIONS [Electronic Version]. Indiana. Sams Publishing.

Looney, Kevin (2004). Oracle Database 10g: The Complete Reference. California. The McGraw-Hill Companies, Inc.

Kevin Medlin

Kevin Medlin has been administering, supporting, and developing in a variety of industries including energy, retail, insurance and government since 1997. He is currently a DBA supporting Oracle and SQL Server, and is Oracle certified in versions 8 through 10g. He received his graduate certificate in Storage Area Networks from Regis University and he will be completing his MS in Technology Systems from East Carolina University in 2008. When he's not trying to make the world a better place through IT, he enjoys spending time with his family, traveling, hanging out by the pool, riding horses, hiking, and camping.
