NLS_NCHAR_CHARACTERSET controls the character set used to encode NCHAR and NVARCHAR2 columns. If you don't use NCHAR/NVARCHAR2 columns, the parameter doesn't affect you.
Both AL32UTF8 and AL16UTF16 are Unicode character sets. AL32UTF8 is a variable-length character set: basic ASCII characters require 1 byte of storage, Western European characters require 2 bytes, most Asian characters require 3 bytes, and a handful of supplementary characters require 4 bytes. AL16UTF16 is also a variable-length character set: almost all characters require 2 bytes of storage, with a handful of supplementary characters requiring 4 bytes. If you are going to store primarily English data, or primarily data in some Asian language, there can be substantial differences in the space required depending on the national character set.
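These per-character sizes can be checked outside the database, since AL32UTF8 and AL16UTF16 follow the standard UTF-8 and UTF-16 encoding forms. A minimal sketch in Python (the sample characters are illustrative, not from Oracle documentation):

```python
# Byte counts per character in the two Unicode encodings Oracle's national
# character sets are based on: UTF-8 (AL32UTF8) and UTF-16 (AL16UTF16).
samples = {
    "a":  "basic ASCII",
    "é":  "Western European",
    "漢": "Asian (CJK)",
    "😀": "supplementary character",
}
for ch, desc in samples.items():
    utf8_len = len(ch.encode("utf-8"))
    utf16_len = len(ch.encode("utf-16-be"))  # big-endian, no BOM
    print(f"{desc}: {utf8_len} byte(s) in UTF-8, {utf16_len} byte(s) in UTF-16")
```

Running this prints 1/2, 2/2, 3/2 and 4/4 bytes respectively, matching the storage figures described above.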
•The national character set is a second character set defined in an Oracle database in addition to the normal (database) character set.
•The normal character set is defined by the parameter NLS_CHARACTERSET and the national character set is defined by the parameter NLS_NCHAR_CHARACTERSET.
•The national character set is used for data stored in NCHAR, NVARCHAR2 and NCLOB columns, while the normal character set is used for data stored in CHAR, VARCHAR2 and CLOB columns.
•You can get the value of the national character set, i.e. NLS_NCHAR_CHARACTERSET, with any of the following queries:
select value from nls_database_parameters where parameter='NLS_NCHAR_CHARACTERSET';
select value$ from sys.props$ where name='NLS_NCHAR_CHARACTERSET';
select property_value from database_properties where property_name='NLS_NCHAR_CHARACTERSET';
•NLS_NCHAR_CHARACTERSET is defined when the database is created, as part of the CREATE DATABASE command.
•The default value of NLS_NCHAR_CHARACTERSET is AL16UTF16.
•From Oracle 9i onwards, NLS_NCHAR_CHARACTERSET can have only two values, UTF8 or AL16UTF16, both of which are Unicode character sets.
•National character set columns always use CHAR length semantics; you cannot define them in BYTE. That means if you define NCHAR(5), a maximum of 5 characters can be stored, regardless of how many bytes those characters occupy.
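The distinction between CHAR and BYTE semantics can be illustrated outside the database: a string's character count is fixed, but its byte size depends on the encoding. A hypothetical NCHAR(5) column counts the 5 in characters, never bytes. A sketch in Python (the sample string is an arbitrary illustration):

```python
# Character count vs. byte count: the same 5-character string occupies a
# different number of bytes depending on the encoding used to store it.
s = "日本語のテ"  # 5 characters, all from the Basic Multilingual Plane
print(len(s))                      # character count: 5
print(len(s.encode("utf-16-be")))  # bytes in UTF-16 (AL16UTF16-style): 10
print(len(s.encode("utf-8")))      # bytes in UTF-8 (AL32UTF8-style): 15
```

An NCHAR(5) column would accept this string under either national character set, because the limit of 5 is measured in characters.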
•Many people think that they need to use NLS_NCHAR_CHARACTERSET to have Unicode support in Oracle, but this is not true. You can use Unicode in either of two ways: store data in NCHAR, NVARCHAR2 or NCLOB columns, or use "normal" CHAR and VARCHAR2 columns in a database that has an AL32UTF8 / UTF8 NLS_CHARACTERSET.
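The reason "normal" columns suffice in an AL32UTF8 database is that UTF-8 can represent every Unicode character losslessly, just as UTF-16 can. A minimal round-trip check in Python (the sample text is an arbitrary mix of scripts):

```python
# Any Unicode text round-trips through UTF-8 exactly as it does through
# UTF-16, so a VARCHAR2 column in an AL32UTF8 database loses nothing.
text = "Grüße 漢字 😀"
assert text.encode("utf-8").decode("utf-8") == text
assert text.encode("utf-16-be").decode("utf-16-be") == text
print("both encodings round-trip losslessly")
```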
Reference: Google - NLS_NCHAR_CHARACTERSET
If you are storing primarily English data, it makes a huge difference. For English data, 1 character = 1 byte of storage in the AL32UTF8 character set and 1 character = 2 bytes of storage in the AL16UTF16 character set. So you're talking about a factor of 2 difference in the storage requirements.
Thanks. Where exactly is this storage incurred? Suppose that in a place where I need AL16UTF16, I used AL32UTF8 instead; what would be the impact?
I'm not sure I understand the question, so apologies if I'm answering the wrong question...
If I store the letter 'a' in an NVARCHAR2 column, Oracle would allocate 2 bytes of storage if the NLS_NCHAR_CHARACTERSET is AL16UTF16.
If I store the letter 'a' in an NVARCHAR2 column, Oracle would allocate 1 byte of storage if the NLS_NCHAR_CHARACTERSET is AL32UTF8.
If you are storing primarily English data, AL16UTF16 will cause Oracle to consume nearly twice as much space on disk and in RAM for that data as would be required if you used an AL32UTF8 character set.
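The factor-of-2 claim for English text is easy to verify outside the database, since the two national character sets follow the standard UTF-8 and UTF-16 encoding forms. A sketch in Python (the sample sentence is an arbitrary all-ASCII illustration):

```python
# Total bytes needed to store a mostly-English string in each encoding:
# UTF-16 (AL16UTF16-style) needs twice the bytes of UTF-8 (AL32UTF8-style)
# when every character is plain ASCII.
text = "The quick brown fox jumps over the lazy dog"
utf8_bytes = len(text.encode("utf-8"))
utf16_bytes = len(text.encode("utf-16-be"))
print(utf8_bytes, utf16_bytes, utf16_bytes / utf8_bytes)  # ratio is 2.0
```

For text that mixes in accented or Asian characters, the ratio shrinks, which is why the right choice depends on the languages you expect to store.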