I get the following error on a few tables when I try to export from 10g and import into an 11g database.
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
. importing TBAADM's objects into TEST
. . importing table "ACCT_AUTH_SIGN_TABLE"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "TEST"."ACCT_AUTH_SIGN_TABLE"."MODE_OF_DESPATCH" (actual: 3, maximum: 1)
How can I overcome this?
[oracle@localhost sql]$ oerr ora 12899
12899, 00000, "value too large for column %s (actual: %s, maximum: %s)"
// *Cause: An attempt was made to insert or update a column with a value
//         which is too wide for the width of the destination column.
//         The name of the column is given, along with the actual width
//         of the value, and the maximum allowed width of the column.
//         Note that widths are reported in characters if character length
//         semantics are in effect for the column, otherwise widths are
//         reported in bytes.
// *Action: Examine the SQL statement for correctness. Check source
//          and destination column data types.
//          Either make the destination column wider, or use a subset
//          of the source column (i.e. use substring).
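The "(actual: 3, maximum: 1)" part is the key: the source US7ASCII value is one character, but after conversion to AL32UTF8 a non-ASCII character can occupy up to three bytes, and your column is only 1 byte wide. A quick illustration you can run in the target 11g database (the Euro sign here is just an example character; any multi-byte character behaves the same):

```sql
-- One character, but three bytes once encoded in AL32UTF8 --
-- and BYTE semantics count bytes, not characters.
SELECT LENGTH('€')  AS char_count,   -- counts characters
       LENGTHB('€') AS byte_count    -- counts bytes
FROM dual;
```

With an AL32UTF8 database character set this returns 1 character but 3 bytes, which matches the "actual: 3" in your error.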
Can you check the value in your 10g database — is it a single character?
In that case you can change your nls_length_semantics from BYTE to CHAR.
When you create the table again, it will use CHAR instead of BYTE.
Alternatively, you can create the table with the CHAR option explicitly instead of BYTE.
A multi-byte character will then fit, because the column is defined as 1 character rather than 1 byte.
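A sketch of both options, using the column from the error message above (the full column list is assumed — take the real DDL from your source database or from an imp INDEXFILE run):

```sql
-- Option 1: switch the session to character-length semantics
-- before recreating the table; VARCHAR2(1) then means 1 CHAR.
ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;

-- Option 2: state CHAR semantics explicitly per column.
CREATE TABLE acct_auth_sign_table (
  mode_of_despatch VARCHAR2(1 CHAR)  -- 1 character, up to 4 bytes in AL32UTF8
  -- ... remaining columns as in the source table
);
```

After pre-creating the table in the target schema, rerun the import with ignore=y so imp loads the rows into the existing table instead of trying to recreate it with BYTE semantics.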