Oracle Error When Importing An Interval Partitioned Table

An interesting issue can reveal itself in Oracle 12.1.0.2 when importing an interval partitioned table using Data Pump:

ORA-39083: Object type TABLE:<table name> failed to create with error:
ORA-01438: value larger than specified precision allowed for this column

The case was reported to Oracle Support back in March and, as of this writing, it still has no patch, workaround, or fix. The problem can be reproduced with the following code (the table and column names have been changed); the example, including its output, follows:


SQL> CREATE TABLE "EXP_IMP_TEST"
  2  (       "KEY_NR" NUMBER(6,0) NOT NULL ENABLE,
  3          "VERS" NUMBER(5,0) NOT NULL ENABLE,
  4          "MY_DAT" DATE NOT NULL ENABLE,
  5          "YONP" NUMBER(3,0) NOT NULL ENABLE,
  6          "LAST_MODIFY_TIME" TIMESTAMP (6) DEFAULT SYSTIMESTAMP NOT NULL ENABLE,
  7           CONSTRAINT "IX_PK_WAL" PRIMARY KEY ("KEY_NR","VERS", "MY_DAT","YONP")
  8    USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255
  9    STORAGE(
 10    BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT )
 11    LOCAL (PARTITION "P_EXP_IMP_201301"  PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING
 12    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
 13    BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
 14    TABLESPACE USERS )  ENABLE   ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255
 15    STORAGE(  BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT )
 16    TABLESPACE USERS
 17    PARTITION BY RANGE ("KEY_NR") INTERVAL (100)
 18    (PARTITION "P_EXP_IMP_201301"  VALUES LESS THAN (201401) SEGMENT CREATION IMMEDIATE
 19    PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS NOLOGGING
 20    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
 21    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
 22    BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT )
 23    TABLESPACE USERS )  ENABLE ROW MOVEMENT;

Table created.

SQL>
SQL>  insert into EXP_IMP_TEST("KEY_NR","VERS", "MY_DAT", "YONP") values(999901,1,sysdate,1);

1 row created.

SQL>   commit;

Commit complete.
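
The export that produced the dump file is not shown in the session; a table-mode export along the following lines would create it. The directory, dump file, and table names are taken from the import commands shown below; the log file name is an assumption:

expdp bing/********@yinski directory=data_pump_dir2 dumpfile=exp_imp.dmp tables=EXP_IMP_TEST logfile=exp_tst.log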

The export completes without error, but the import fails; notice that the table is being remapped to a different schema:


Starting "BING"."SYS_IMPORT_TABLE_02":  bing/********@yinski directory=data_pump_dir2 dumpfile=exp_imp.dmp tables=EXP_IMP_TEST remap_schema=BING:BONG logfile=ti_tst.log
Processing object type SCHEMA_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"BONG"."EXP_IMP_TEST" failed to create with error:
ORA-01438: value larger than specified precision allowed for this column
Failing sql is:
CREATE TABLE "BONG"."EXP_IMP_TEST" ("KEY_NR" NUMBER(6,0) NOT NULL ENABLE, "VERS" NUMBER(5,0) NOT NULL ENABLE, "MY_DAT" DATE NOT NULL ENABLE, "YONP" NUMBER(3,0) NOT NULL ENABLE, "LAST_MODIFY_TIME" TIME
STAMP (6) DEFAULT SYSTIMESTAMP NOT NULL ENABLE) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  STORAGE( BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
ORA-39112: Dependent object type INDEX:"BONG"."IX_PK_WAL" skipped, base object type TABLE:"BONG"."EXP_IMP_TEST" creation failed
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39112: Dependent object type CONSTRAINT:"BONG"."IX_PK_WAL" skipped, base object type TABLE:"BONG"."EXP_IMP_TEST" creation failed
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "BING"."SYS_IMPORT_TABLE_02" completed with 3 error(s) at Mon Nov 23 07:38:35 2015 elapsed 0 00:02:05

Oracle Support has logged this as Bug 20721183. Since the example uses remap_schema, the next run omits the remapping to see whether that is the cause of the problem:


Starting "BING"."SYS_IMPORT_TABLE_02":  bing/********@yinski directory=data_pump_dir2 dumpfile=exp_imp.dmp tables=EXP_IMP_TEST logfile=ti_tst_log.log
Processing object type SCHEMA_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"BING"."EXP_IMP_TEST" failed to create with error:
ORA-01438: value larger than specified precision allowed for this column
Failing sql is:
CREATE TABLE "BING"."EXP_IMP_TEST" ("KEY_NR" NUMBER(6,0) NOT NULL ENABLE, "VERS" NUMBER(5,0) NOT NULL ENABLE, "MY_DAT" DATE NOT NULL ENABLE, "YONP" NUMBER(3,0) NOT NULL ENABLE, "LAST_MODIFY_TIME" TIME
STAMP (6) DEFAULT SYSTIMESTAMP NOT NULL ENABLE) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  STORAGE( BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
ORA-39112: Dependent object type INDEX:"BING"."IX_PK_WAL" skipped, base object type TABLE:"BING"."EXP_IMP_TEST" creation failed
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39112: Dependent object type CONSTRAINT:"BING"."IX_PK_WAL" skipped, base object type TABLE:"BING"."EXP_IMP_TEST" creation failed
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "BING"."SYS_IMPORT_TABLE_02" completed with 3 error(s) at Mon Nov 23 07:49:31 2015 elapsed 0 00:02:02

The same error is generated, so remap_schema is not to blame; this is an issue with importing interval partitioned tables in Oracle 12.1.0.2 that have NUMBER columns, possibly because of their explicit precision declarations. The problem reproduces on all ports of 12.1.0.2; the original error was reported on Exadata, and the example shown here was run on Windows 7. This is a severe bug, as interval partitioned tables can be exported from 12.1.0.2 but cannot be imported, and it affects transportable tablespaces as well, since the same error is generated on import. To investigate whether the declared precision is the culprit, the table was re-created with all NUMBER columns declared simply as NUMBER, with no explicit precision, and the export/import was attempted again.
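
The re-created table is not shown in the original session; condensed, with the storage clauses and primary key omitted for brevity, it would look like the sketch below. The table and partition names match the import log that follows; the column list is assumed to mirror the original test table:

-- Same structure as EXP_IMP_TEST, but every NUMBER column is
-- declared without explicit precision
CREATE TABLE "EXP_IMP_TEST3"
(       "KEY_NR" NUMBER NOT NULL ENABLE,
        "VERS" NUMBER NOT NULL ENABLE,
        "MY_DAT" DATE NOT NULL ENABLE,
        "YONP" NUMBER NOT NULL ENABLE,
        "LAST_MODIFY_TIME" TIMESTAMP (6) DEFAULT SYSTIMESTAMP NOT NULL ENABLE
)
TABLESPACE USERS
PARTITION BY RANGE ("KEY_NR") INTERVAL (100)
(PARTITION "P_EXP_IMP3_201301" VALUES LESS THAN (201401) TABLESPACE USERS)
ENABLE ROW MOVEMENT;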


Starting "BING"."SYS_IMPORT_TABLE_02":  bing/********@yinski directory=data_pump_dir2 dumpfile=exp_imp_6b.dmp t
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "BONG"."EXP_IMP_TEST3":"SYS_P9323"          6.789 KB       1 rows
. . imported "BONG"."EXP_IMP_TEST3":"P_EXP_IMP3_201301"      0 KB       0 rows
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "BING"."SYS_IMPORT_TABLE_02" successfully completed at Mon Nov 23 08:52:38 2015 elapsed 0 00:01:54

Interestingly, the import succeeds when all of the NUMBER columns are declared simply as NUMBER. One more attempt was made, declaring the NUMBER columns as NUMBER(10) and inserting the same data as before, so that the values fall short of the declared precision.
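
Condensed as before (storage clauses and primary key omitted, column list assumed to mirror the original), the NUMBER(10) variant looks like this; the table and partition names match the import log that follows:

-- Same structure again, but the NUMBER columns are declared wider
-- than any value the test data will hold
CREATE TABLE "EXP_IMP_TEST2"
(       "KEY_NR" NUMBER(10) NOT NULL ENABLE,
        "VERS" NUMBER(10) NOT NULL ENABLE,
        "MY_DAT" DATE NOT NULL ENABLE,
        "YONP" NUMBER(10) NOT NULL ENABLE,
        "LAST_MODIFY_TIME" TIMESTAMP (6) DEFAULT SYSTIMESTAMP NOT NULL ENABLE
)
TABLESPACE USERS
PARTITION BY RANGE ("KEY_NR") INTERVAL (100)
(PARTITION "P_EXP_IMP2_201301" VALUES LESS THAN (201401) TABLESPACE USERS)
ENABLE ROW MOVEMENT;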


Starting "BING"."SYS_IMPORT_TABLE_02":  bing/********@yinski directory=data_pump_dir2 dumpfile=exp_imp_8b.dmp tables=EXP_IMP_TEST2 remap_schema=BING:BONG logfile=ti_tst_log8b.log
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "BONG"."EXP_IMP_TEST2":"SYS_P9342"          6.937 KB       1 rows
. . imported "BONG"."EXP_IMP_TEST2":"P_EXP_IMP2_201301"      0 KB       0 rows
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "BING"."SYS_IMPORT_TABLE_02" successfully completed at Mon Nov 23 09:09:24 2015 elapsed 0 00:01:56

The issue presents itself when the number of digits in the NUMBER data equals the declared precision of the column; when the columns are declared wider than the data is expected to be, the problem goes away. That is good news for those running 12.1.0.2 and using, or planning to use, interval partitioned tables: knowing how to declare such NUMBER columns can keep the bug reported here from affecting production systems. Some will object that declaring columns larger than necessary wastes space, and that is true. In this instance, however, wasting a little space works around a potential show-stopper, at least until Oracle Support provides a patch to fix the bug.
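
For databases that may be affected, a quick dictionary query can flag interval partitioned tables whose NUMBER columns carry explicit precision; whether the stored values actually reach that precision still has to be checked table by table. A sketch:

-- Candidate columns for the import failure described above:
-- NUMBER columns with explicit precision on interval partitioned tables
SELECT t.owner, t.table_name, c.column_name, c.data_precision
  FROM dba_part_tables t
  JOIN dba_tab_columns c
    ON c.owner = t.owner
   AND c.table_name = t.table_name
 WHERE t.interval IS NOT NULL
   AND c.data_type = 'NUMBER'
   AND c.data_precision IS NOT NULL
 ORDER BY t.owner, t.table_name, c.column_name;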
