10 Replies | Latest reply: Dec 8, 2012 8:53 AM by Baffy

DataPump failing with 'value too large' for exactly identical tables

Baffy Newbie
All,

Please help if you've ever come across and resolved this before:

1) The table was created on the target with a script taken from the source, so the tables are exactly the same.

2) NLS variables are the same:

Source:
=====
NLS_CHARACTERSET
UTF8

NLS_LENGTH_SEMANTICS
BYTE

Target:
=====
NLS_CHARACTERSET
UTF8

NLS_LENGTH_SEMANTICS
BYTE
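
(For anyone checking the same thing: a query along these lines against the standard NLS_DATABASE_PARAMETERS view returns those values on each side.)

SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_LENGTH_SEMANTICS');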



3) Export command:

expdp scott/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp tables=scott.sftv_track_ids

4) Import command used:

impdp scott/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp content=data_only tables=scott.sftv_track_ids

Error encountered:

KUP-11007: conversion error loading table "scott"."SFTV_TRACK_IDS"
ORA-12899: value too large for column EP_ID (actual: 5, maximum: 3)
KUP-11009: data for row: EP_ID : '10002'
  • 1. Re: DataPump failing with 'value too large' for exactly identical tables
    Srini Chavali-Oracle Oracle ACE Director
Please post details of the exact OS and database versions on the source and target servers. Please identify the NLS_LANG setting in effect when the export and import were performed. Please also post the first 20 lines of the export and import logs.

    HTH
    Srini
  • 2. Re: DataPump failing with 'value too large' for exactly identical tables
    Baffy Newbie
    Hi Srini,

    These are the answers to your questions:

    NLS_LANG for both databases:

    export NLS_LANG=ENGLISH.UTF8 (set in the shell script I used for both export and import)

    OS and DB Version for Source:
    Red Hat Enterprise Linux Server release 5.3 (Tikanga)
    Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production

    OS and DB Version for target:
    Red Hat Enterprise Linux Server release 5.6 (Tikanga)
    Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production

    LOGFILES:
    =======



    Export:
    (expdp bods_dm/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp tables=bods_dm.sftv_track_ids)


    ;;;
    Export: Release 10.2.0.5.0 - 64bit Production on Tuesday, 24 July, 2012 16:17:45

    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    ;;;
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    FLASHBACK automatically enabled to preserve database integrity.
    Starting "BODS_DM"."SYS_EXPORT_TABLE_01": bods_dm/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp tables=bods_dm.sftv_track_ids
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 128 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "BODS_DM"."SFTV_TRACK_IDS" 53.39 KB 1799 rows
    Master table "BODS_DM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    ******************************************************************************
    Dump file set for BODS_DM.SYS_EXPORT_TABLE_01 is:
    /shazamp_active/export/sftv.dmp
    Job "BODS_DM"."SYS_EXPORT_TABLE_01" successfully completed at 16:17:54


Import:
    (impdp bods_dm/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp content=data_only tables=bods_dm.sftv_track_ids)


    ;;;
    Import: Release 10.2.0.5.0 - 64bit Production on Tuesday, 24 July, 2012 16:18:34

    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    ;;;
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "BODS_DM"."SYS_IMPORT_TABLE_05" successfully loaded/unloaded
    Starting "BODS_DM"."SYS_IMPORT_TABLE_05": bods_dm/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp content=data_only tables=bods_dm.sftv_track_ids
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    KUP-11007: conversion error loading table "BODS_DM"."SFTV_TRACK_IDS"
    ORA-12899: value too large for column EP_ID (actual: 5, maximum: 3)
    KUP-11009: data for row: EP_ID : '10002'

    KUP-11007: conversion error loading table "BODS_DM"."SFTV_TRACK_IDS"
    ORA-12899: value too large for column EP_ID (actual: 5, maximum: 3)
    KUP-11009: data for row: EP_ID : '10002'

    KUP-11007: conversion error loading table "BODS_DM"."SFTV_TRACK_IDS"
    ORA-12899: value too large for column EP_ID (actual: 5, maximum: 3)
    KUP-11009: data for row: EP_ID : '10002'

    KUP-11007: conversion error loading table "BODS_DM"."SFTV_TRACK_IDS"
    ORA-12899: value too large for column EP_ID (actual: 5, maximum: 3)
    KUP-11009: data for row: EP_ID : '10001'
  • 3. Re: DataPump failing with 'value too large' for exactly identical tables
    Srini Chavali-Oracle Oracle ACE Director
> ...
> NLS_LANG for both databases:
>
> export NLS_LANG=ENGLISH.UTF8 (set in the shell script I used for both export and import)
> ...

This is an incorrect setting. Please review the docs, set it appropriately, and then try expdp and impdp again:

    http://docs.oracle.com/cd/B19306_01/server.102/b14225/ch3globenv.htm#sthref195
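
The format is <LANGUAGE>_<TERRITORY>.<CHARACTERSET>, for example something like:

export NLS_LANG=AMERICAN_AMERICA.UTF8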

    HTH
    Srini
  • 4. Re: DataPump failing with 'value too large' for exactly identical tables
    Baffy Newbie
    Hi Srini,

    Thanks for your help so far.

I changed NLS_LANG in both the export and import shell scripts, as specified in the documentation:

I tried NLS_LANG=AMERICAN_AMERICA.UTF8 and got the same error.

I also tried NLS_LANG="ENGLISH_UNITED KINGDOM.UTF8" and got the same error.
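
For reference, the export wrapper now looks roughly like this (the import script is the same apart from the impdp line):

#!/bin/sh
# Set NLS_LANG per the docs before invoking Data Pump
export NLS_LANG=AMERICAN_AMERICA.UTF8
expdp bods_dm/******** directory=DPUMP_DIR logfile=sftv.log dumpfile=sftv.dmp tables=bods_dm.sftv_track_ids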

    Thanks
  • 5. Re: DataPump failing with 'value too large' for exactly identical tables
    Srini Chavali-Oracle Oracle ACE Director
Can you please post a description of the table from the two instances?

    Srini
  • 6. Re: DataPump failing with 'value too large' for exactly identical tables
    Baffy Newbie
    Hi Srini,
    I've carefully extracted this from both source and target:

    SOURCE
    =====================

    CREATE TABLE SFTV_TRACK_IDS
    (
    TRACK_ID NUMBER NOT NULL,
    DATE_TIMESTAMP DATE DEFAULT SYSDATE,
    CAMPAIGN_ID INTEGER,
    EP_ID VARCHAR2(5 BYTE),
    TRACK_POSITION NUMBER,
    TYPE_ID INTEGER
    )
    TABLESPACE DA_BODI_PROD
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;


    ALTER TABLE SFTV_TRACK_IDS ADD (
    PRIMARY KEY
    (TRACK_ID)
    USING INDEX
    TABLESPACE DA_BODI_PROD
    PCTFREE 10
    INITRANS 2
    MAXTRANS 255
    STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    ));

    TARGET:
    =======

    CREATE TABLE SFTV_TRACK_IDS
    (
    TRACK_ID NUMBER NOT NULL,
    DATE_TIMESTAMP DATE DEFAULT SYSDATE,
    CAMPAIGN_ID INTEGER,
    EP_ID VARCHAR2(5 BYTE),
    TRACK_POSITION NUMBER,
    TYPE_ID INTEGER
    )
    TABLESPACE DA_BODI_PROD
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;


    ALTER TABLE SFTV_TRACK_IDS ADD (
    PRIMARY KEY
    (TRACK_ID)
    USING INDEX
    TABLESPACE DA_BODI_PROD
    PCTFREE 10
    INITRANS 2
    MAXTRANS 255
    STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    ));

    =========================

    They look the same.
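
In case the DDL scripts are hiding something, the column definitions could also be compared straight from the dictionary on both sides; a sketch against the standard USER_TAB_COLUMNS view (CHAR_USED distinguishes byte from character semantics):

SELECT column_name, data_type, data_length, char_length, char_used
  FROM user_tab_columns
 WHERE table_name = 'SFTV_TRACK_IDS';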
  • 7. Re: DataPump failing with 'value too large' for exactly identical tables
    Baffy Newbie
    Hi Srini,
I think we can narrow the problem down to the database being unable to insert '10002' into column EP_ID.

This must be a bug in Data Pump (10.2.0.5). That value already exists in the source, yet Data Pump is now reporting that the maximum it can insert is 3 characters:

    KUP-11007: conversion error loading table "BODS_DM"."SFTV_TRACK_IDS"
    ORA-12899: value too large for column EP_ID (actual: 5, maximum: 3)
    KUP-11009: data for row: EP_ID : '10001'
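
If it helps narrow things down further, a direct insert of the same value would show whether the column rejects it outside of Data Pump; a quick sanity-check sketch (the TRACK_ID value is made up):

INSERT INTO sftv_track_ids (track_id, ep_id)
VALUES (99999, '10002'); -- 99999 is only a placeholder key for the test
ROLLBACK;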
  • 8. Re: DataPump failing with 'value too large' for exactly identical tables
    Srini Chavali-Oracle Oracle ACE Director
Hmm, weird. Have you opened an SR?

    Srini
  • 9. Re: DataPump failing with 'value too large' for exactly identical tables
    User286067 Journeyer
    @Baffy,

Does this happen if the impdp process creates the table rather than you pre-creating it? If yes, then this is probably a bug. If not, can you show us the describe output for the table after you create it and before you run impdp?
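
For instance, a sketch that reuses your dump file and lets impdp drop and recreate the pre-created table (the log file name here is just an example):

impdp bods_dm/******** directory=DPUMP_DIR logfile=sftv_retry.log dumpfile=sftv.dmp tables=bods_dm.sftv_track_ids table_exists_action=replace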

    Raj
  • 10. Re: DataPump failing with 'value too large' for exactly identical tables
    Baffy Newbie
    Thanks all for your help.

    It's a bug.
