Posted Jan 24, 2003

Leaving the History Behind with Oracle's Fine Grained Access Control

By Marin Komadina

Oracle's export utility still offers no built-in way to export a user schema while skipping a few large tables, so DBAs are left to devise their own workarounds when necessary. This article covers the three most commonly used solutions:

  • Full export
  • Data dictionary view modification
  • Fine Grained Access Control (FGAC)

Scenario A:
The DBA needs to export the schema tables without the static history data. The user schema has approximately 1000 tables; four of them are static and hold historic partitioned data. These four tables need to be skipped in the export file.

Scenario B:
A recovery strategy based on a daily export has to be abandoned because it takes too long; exporting the large history tables consumes most of the export time.

In a case where production table data has been deleted, an export file can be used to quickly repopulate the small tables, leaving the big historical tables to be fixed later with a transportable tablespace solution or by importing them from a separate export file.

Some proposed solutions involve recreating the large tables under a different schema, then importing, exporting, duplicating, moving, and so on. I have found this approach too haphazard for business-oriented companies.

For a clever DBA, there are several possible solutions:

  • do a full schema export and send everything to the developers, claiming it is the only way possible
  • play with fire and modify a database catalog view
  • do a professional job and show others that you know what Fine Grained Access Control is

Test Case

As a test case, let's assume we have created a small Oracle database on Solaris.

  • User schema for this test will be user ARTIST
  • ARTIST has a few test tables
  • User ARTIST has DBA rights, just to make the test simpler.
  • The same logic applies to any other user you choose to export.
Database Version:
Oracle8i Enterprise Edition Release - Production
With the Partitioning option
JServer Release - Production
Schema ARTIST : 

TABLE_NAME                          COUNT(*)
BACKUP_STATUS                           411
CRITICAL_SEGMENTS                         0
INPUT                                10,000
PLAN_TABLE                               13
TRANSFER                             10,000
TS_STORAGE                           25,334


a.) Full export

. about to export ARTIST's tables via Conventional Path ...
. . exporting table                  BACKUP_STATUS        411 rows exported
. . exporting table              CRITICAL_SEGMENTS          0 rows exported
. . exporting table                          INPUT      10000 rows exported
. . exporting table                     PLAN_TABLE         13 rows exported
. . exporting table                       TRANSFER      10000 rows exported
. . exporting table                     TS_STORAGE      25334 rows exported

The full export we have just produced is one solution to our problem. You just have to send it, together with the log file, to the developers.

Here is a small tip regarding export: it is possible to make a selective table export with the TABLES parameter.

$exp parfile=artist.par userid=system/manager

Example of parameter file: 

$ cat artist.par 

The TABLES parameter accepts up to a maximum of 99 table names, in the form owner.table_name, inside the parameter file. In our case we want to export 996 tables (everything except the 4 large ones). Writing out all those names, and splitting them across several parameter files and export files, would be hard work. For that reason, we move on to solution b.)
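For illustration, a minimal parameter file of this form might look like the following sketch (the file and table names here are examples, not the actual 996-table list):

```shell
# artist.par -- illustrative contents only
tables=(artist.backup_status, artist.critical_segments, artist.plan_table)
file=artist.dmp
log=artist.log
```

Each owner.table_name entry names one table to include; exp would then be invoked with parfile=artist.par as shown above.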

b.) Data dictionary view modification

For this next solution, I'll remind you of the oft-repeated warning from Oracle:

"You should not change data dictionary objects without supervision from Oracle."

That is absolutely correct, but we're still going to do it. The operation is not that critical, because we are only changing the definition of one data dictionary view: the one responsible for table selection during export. Be sure to back up the original view definition before you start.

The view that drives table selection during export filters on the schema owner:

from exu81tab WHERE ownerid = uid

Version 8: view name - SYS.EXU81TABU
Version 7: view name - SYS.EXU7TAB
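The modification itself can be sketched as follows. This is a hedged outline, not a drop-in script: the SELECT list must be copied verbatim from the backed-up view definition, the excluded table names are specific to this test schema, and it assumes the base table exposes the table name in a NAME column.

```sql
-- Save the original definition first, so it can be restored later:
SELECT text FROM dba_views WHERE view_name = 'EXU81TABU';

-- Recreate the view with an extra predicate that filters out the
-- large history tables (SELECT list elided; copy it from the backup):
CREATE OR REPLACE VIEW sys.exu81tabu AS
SELECT /* ...original column list... */
  FROM exu81tab
 WHERE ownerid = uid
   AND name NOT IN ('INPUT', 'TRANSFER');
```

With the predicate in place, any export of the ARTIST schema silently skips the two listed tables; restoring the saved definition puts export back to normal.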

After modification, let's run an export of test user ARTIST:

. about to export ARTIST's tables via Conventional Path ...
. . exporting table                  BACKUP_STATUS        411 rows exported
. . exporting table              CRITICAL_SEGMENTS          0 rows exported
. . exporting table                     PLAN_TABLE         13 rows exported
. . exporting table                     TS_STORAGE      25334 rows exported

We have succeeded in getting rid of the large tables INPUT and TRANSFER. The export file is consistent; as a check, you can run an import to test your recovery scenarios.
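One way to run that check without actually loading any rows is import's SHOW option, which lists the contents of the dump file instead of executing it. A sketch (the file name is illustrative):

```shell
# Scan the export file and list its contents without inserting data:
imp system/manager file=artist.dmp show=y full=y log=imp_check.log
```

The log should list only the four small tables, confirming the large history tables are absent from the dump.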

All of this works, and yet there is still a better solution.
