We are going to create a new database (11g) with AL32UTF8 as the database character set and AL16UTF16 as the national character set. Some table columns (fewer than 25) might need to store Chinese or Japanese characters. What approach should we follow for the NLS_LENGTH_SEMANTICS setting?
Option 1 : set NLS_LENGTH_SEMANTICS to CHAR at DB Level.
Option 2: Leave NLS_LENGTH_SEMANTICS at the Oracle default (i.e. BYTE) and, in the DDL for those tables, declare the affected columns with explicit CHAR semantics, e.g. VARCHAR2(20 CHAR).
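A minimal sketch of what Option 2 looks like in practice (the table and column names are illustrative, not from the original question):

```sql
-- Database default NLS_LENGTH_SEMANTICS stays BYTE; only the columns
-- that must hold Chinese/Japanese text get explicit CHAR semantics.
CREATE TABLE customer_note (        -- hypothetical table
  id          NUMBER PRIMARY KEY,
  note_text   VARCHAR2(200 CHAR),   -- up to 200 characters; in AL32UTF8
                                    -- this can occupy up to 800 bytes
  created_by  VARCHAR2(30)          -- byte semantics (the default)
);
```

Note that a VARCHAR2 column declared with CHAR semantics is still bounded by the 4000-byte datatype limit, so VARCHAR2(4000 CHAR) does not guarantee room for 4000 multibyte characters.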
NLS_LENGTH_SEMANTICS should not be set database-wide. From the Reference manual:
Oracle strongly recommends that you do NOT set the NLS_LENGTH_SEMANTICS parameter to CHAR in the instance or server parameter file. This may cause many existing installation scripts to unexpectedly create columns with character length semantics, resulting in runtime errors, including buffer overflows.
Installation, patch, and upgrade scripts that create or change data dictionary objects would otherwise end up creating columns with unintended character length semantics.
See the MOS document:
Examples and limits of BYTE and CHAR semantics usage (NLS_LENGTH_SEMANTICS) [ID 144808.1]
Do not change this parameter at the database level, even though it has no effect on the data dictionary itself. (The data dictionary always uses byte semantics.)
It is always easy to include an ALTER SESSION command before the table creation in your scripts if you don't want to use the VARCHAR2(20 CHAR) syntax ...
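The ALTER SESSION approach could look like this (a sketch; the table name is illustrative):

```sql
-- Switch the current session to character semantics before running the
-- DDL, so a plain VARCHAR2(50) means 50 characters, not 50 bytes.
ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;

CREATE TABLE product_name_ml (   -- hypothetical table
  product_id NUMBER,
  name       VARCHAR2(50)        -- 50 characters under the session setting
);
```

The session setting affects only DDL executed in that session, so it avoids the database-wide risks quoted above.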
But IMHO it is best to use the explicit (20 CHAR) length in scripts, to make it obvious that char semantics is in use.
Also check the following MOS doc for open bugs/issues with this parameter:
Init.ora Parameter "NLS_LENGTH_SEMANTICS" Reference Note [ID 153365.1]
Also, a common mistake, originating from incorrect documentation, is the claim that NLS_LENGTH_SEMANTICS does not apply to tables in the SYS and SYSTEM schemas. This is true for the SYS schema only; the SYSTEM schema does honor the parameter.
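If you want to check which semantics a column actually ended up with, the CHAR_USED column of the dictionary views reports it ('C' = char semantics, 'B' = byte semantics). The schema name below is just an example:

```sql
-- List all columns in the SCOTT schema that use character semantics.
-- DATA_LENGTH is the byte limit; CHAR_LENGTH is the declared character limit.
SELECT owner, table_name, column_name,
       data_length, char_length, char_used
FROM   dba_tab_columns
WHERE  owner = 'SCOTT'       -- example schema
AND    char_used = 'C';
```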