For all the accented variants of "a", it shows a plain English "a"; the same happens with "e", "u", etc. I don't think it has anything to do with CONVERT; you probably have to set the code page and font. Try the following from a DOS window:
- set codepage to 1252 :
C:\> chcp 1252
- set the font to "Lucida Console"
- set NLS_LANG (your setting is OK)
- launch sqlplus and try
SQL> select chr(233) from dual;
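As a sanity check outside SQL*Plus, the same point can be illustrated with a short Python sketch (an added illustration, not part of the original thread): byte 233 (0xE9) is "é" in Windows code page 1252, but something entirely different under the default US DOS code page 437. If the console is still on 437, CHR(233) cannot display as "é" no matter what the database sends.

```python
# Why CHR(233) is a good test: code point 233 is "é" in Windows
# code page 1252, but the same byte maps to another glyph under the
# default DOS code page 437. (cp1252/cp437 are Python's codec names.)
raw = bytes([233])
print(raw.decode("cp1252"))          # é
print(raw.decode("cp437") == "é")    # False: wrong code page, wrong glyph
```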
Thanks for your reply, but I am still not able to get the correct display.
How will I find out which ANSI code page will work for me?
I have also visited the other post where you replied to a similar problem:
Re: © symbol is becoming ? in my Web
I have had a look in the ACP section of the Windows registry, and there I find the value 1252. After reading one of the Metalink notes, I understand that this is the Windows character set that is not allowing the expected accented characters to display.
How do I set that for SQL*Plus directly, without going through DOS?
Thanks for the help and the time you spent on this post.
Could you explain what you actually want Oracle to do? If I understand correctly, you have a WE8ISO8859P1 database and a Windows client. Your accented data should show up perfectly well with the standard NLS_LANG (=.WE8MSWIN1252) and Windows settings (ACP=1252). Why do you want to add a conversion to EBCDIC?
You understand correctly that I have an Oracle DB with the said character set. But the requirement of the work is such that I have to transfer this data outside my system in the said format, that is, EBCDIC. The other system has to read it in that format. I have no idea about things on the other side. My work is limited to getting the data from the tables through normal cursors, converting it to EBCDIC, and then writing it to a file using UTL_FILE.
I understand by now that I have to change some character set in my registry (Windows XP); my PC connects to the DB on Sun through SQL*Plus (9i).
I am just not able to identify which character set I should set in the registry parameter NLS_LANG. Currently it is AMERICAN_AMERICA.WE8MSWIN1252, and this character set does not show some accented characters, as well as some other characters.
I have a chart of characters that shows each character's representation in different systems like ASCII, EBCDIC, binary, hex, etc.
You can see that on the link
select convert('Ö', 'US7ASCII','WE8EBCDIC500' ) from dual;
select convert('O','WE8EBCDIC500','US7ASCII' ) from dual;
Ö - ASCII
O - EBCDIC version of the ASCII Ö
When I use the CONVERT function to first convert, say, the character Ö above and pass the result to an outer CONVERT, it correctly displays the original character. So CONVERT itself is able to identify and perform the conversion correctly; it is just a matter of SQL*Plus not being able to show it.
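The round trip described above can be mimicked outside Oracle. The following Python sketch is an added illustration, assuming that Python's cp500 codec matches Oracle's WE8EBCDIC500 and latin-1 matches WE8ISO8859P1:

```python
# Double conversion, as in the nested CONVERT calls: Latin text ->
# EBCDIC bytes -> Latin text. The intermediate bytes are different,
# but the round trip restores the original exactly.
text = "O1A"
ebcdic = text.encode("cp500")   # Latin -> EBCDIC (like the inner CONVERT)
back = ebcdic.decode("cp500")   # EBCDIC -> Latin (like the outer CONVERT)
print(ebcdic.hex())             # d6f1c1 - not the ASCII bytes 4f3141
print(back == text)             # True
```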
OK, first, you cannot normally see any EBCDIC text on Windows. This has nothing to do with SQL*Plus. Notepad will not show EBCDIC letters either. This is simply a matter of available fonts and their encoding. You need a mainframe terminal (or an emulator that does the conversion from EBCDIC back to Windows code page) to correctly view an EBCDIC file.
Second, if you use UTL_FILE, NLS_LANG does not matter. NLS_LANG is for the client. UTL_FILE does everything on the database server. You use SQL*Plus to start some PL/SQL code to call UTL_FILE, but the file is generated on the server in the database character set.
If your database is 9.2, use the UTL_FILE.PUT_RAW procedure together with the UTL_RAW.CAST_TO_RAW and UTL_RAW.CONVERT functions to achieve the effect in a supported way.
UTL_FILE.PUT_RAW( <file handle>,
  UTL_RAW.CONVERT(
    UTL_RAW.CAST_TO_RAW( <text expression> ),
    'AMERICAN_AMERICA.WE8EBCDIC850',
    'AMERICAN_AMERICA.WE8ISO8859P1' ) );
You can verify the generated text by looking at the EBCDIC codes with a hex editor.
You can verify that WE8EBCDIC850 corresponds to your requirement
by looking at the WE8EBCDIC850 definition in the standard Locale Builder utility.
I tried working out your suggestion, but it gives me an error for UTL_RAW.CONVERT. The error is
ORA-06502: PL/SQL: numeric or value error at m_iLine 3
Here is what I am doing:
UTL_RAW.CONVERT( UTL_RAW.CAST_TO_RAW(TO_CHAR(rec1.Field_01 )), 'AMERICAN_AMERICA.WE8EBCDIC850', 'AMERICAN_AMERICA.WE8ISO8859P1' ) );
Here, rec1.Field_01 is the cursor column containing the actual value to be converted to the other character set before writing to the file, and m_fileType is the file handle used to write to the file (as in UTL_FILE.PUT_LINE(m_fileType, m_RecordData);).
m_RecordData is the variable containing the column-wise text that was appended after conversion. For this trial, I have used only 2-3 columns.
I also added a DBMS_OUTPUT call before this function to see what values it is converting.
DBMS_OUTPUT.PUT_LINE(UTL_RAW.CONVERT(UTL_RAW.CAST_TO_RAW(TO_CHAR( rec1.Field_01 )),'AMERICAN_AMERICA.WE8EBCDIC850','AMERICAN_AMERICA.WE8ISO8859P1'));
The first two outputs worked fine: when the value of the column was 01, it was cast to RAW (the first time, in the same character set) as 3031. Now 3031 is to be converted to RAW in the other character set by this code, but at this point it gives me the error mentioned above. I am not able to understand what the error could be.
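For what it is worth, the two steps can be separated in a Python sketch (an added illustration; latin-1 and cp500 stand in for WE8ISO8859P1 and the WE8EBCDIC variants, and digits are encoded identically across the EBCDIC code pages): the cast merely relabels the Latin bytes, giving the observed hex 3031 for the string 01, while the conversion produces genuinely different EBCDIC bytes.

```python
# CAST_TO_RAW vs CONVERT for the string "01":
cast_to_raw = "01".encode("latin-1")  # relabel only: hex 3031, as observed
converted = "01".encode("cp500")      # real conversion: EBCDIC digits are f0-f9
print(cast_to_raw.hex())              # 3031
print(converted.hex())                # f0f1
```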
I tried to solve it by using a RAW variable, but it is not allowing me to declare one.
Can you help?
Sorry for the inconvenience.
Yes, now the conversion is working. But even then, what we are putting is raw data converted to EBCDIC, right? I mean, first we convert normal character data to RAW, and then the RAW is converted to EBCDIC. Will there be any difference between converting CHAR data directly to EBCDIC and doing it this way?
Let me put my question clearly.
I tried to convert char data to hex, but found that there is no direct conversion logic available for that. So what I did was first convert the data to ASCII and then convert the ASCII to hex. I was actually converting the ASCII values to hex: if I had the char 'A', then I was actually converting 65 to hex, not 'A' directly, and the same on the way back.
Is my question clear here? Whether I convert the text CHAR directly to EBCDIC, or cast it to RAW and then convert the RAW to EBCDIC, it should still give me the correct EBCDIC representation of that text, right?
I still feel the file I am generating through this is a normal .dat file and not a binary file. The intention was to generate a binary file. Any suggestions on that?
Anyway, thanks for the help. I just need to check if there is a hex editor or something like that, so that I can open the generated file and cross-check whether it shows the same values for the fname and lname columns as in the database.
Both forms will produce the same file. The problem with the first form is that it is formally illegal and may fail in certain database character sets. This is because the result of CONVERT() is VARCHAR2, but it contains data not in the database character set, and that is illegal.
In the second case we cast (not convert!) the character data to RAW, i.e. we just change the datatype label of a buffer, and do the same conversion and output as in the first case, just declaring the data as bytes and not characters. This is safe and will work in any database character set.
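A small Python sketch (an added illustration, with cp500 and latin-1 as stand-ins for WE8EBCDIC500 and WE8ISO8859P1) shows why labelling EBCDIC bytes as character data is unsafe, while keeping them as plain bytes is not:

```python
# Correctly converted EBCDIC bytes for "ABC" are C1 C2 C3.
ebcdic = "ABC".encode("cp500")
# Pretending they are Latin-1 character data (the "illegal" first form)
# yields mojibake; kept as plain bytes (the RAW route) they stay intact.
print(ebcdic.decode("latin-1"))   # ÁÂÃ - not "ABC"
print(ebcdic.hex())               # c1c2c3
```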
Yes, I understand what you are saying. I was able to write the file using this logic. But now there is another problem: whenever I have a variable that is NULL (the column coming from the table is null), the following does not work
and fails to write the null to the file. I cannot use NVL here, because the requirement is that if it is null, it should put that character in EBCDIC; that is, there is a specific representation: NULL = CHR(00) = NUL (in EBCDIC).
Please see the chart at the following link if I am not able to explain what I mean. This chart contains the representation of each character in hex, ASCII, EBCDIC, binary, etc.
Using this, I am able to cross-check whether it is doing the conversion the way I expect.
So now the only problem is when my cursor contains a NULL for a record's column. I want it to write NUL there without me using NVL. If I use NVL, I will be supplying some value that is not supposed to be there; if a column is null, it should be represented as NUL.
Can you please suggest something again? Your suggestions have helped me a lot.
Before posting the question about NULL data, I had tried the following code, which did not give me the expected output.
m_raw := UTL_RAW.CAST_TO_RAW(CHR(000));
UTL_FILE.PUT_RAW( m_fileType,
  UTL_RAW.CONVERT( m_raw,
    'AMERICAN_AMERICA.WE8EBCDIC850',
    'AMERICAN_AMERICA.WE8ISO8859P1' ) );
And even with CHR(0), which I also expected to give NUL, it actually gives a NULL, meaning a blank (as if a space character were there). So I don't know what to do to get NUL displayed in the EBCDIC file.
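One observation that may help, illustrated with an added Python sketch (cp500 standing in for the WE8EBCDIC variants): NUL is byte 0x00 in Latin-1 and in every EBCDIC code page alike, so the conversion leaves it untouched; many editors simply display that byte as a blank, which is why only a hex view proves whether it is actually in the file.

```python
# NUL survives the charset conversion unchanged: it is 0x00 on both sides.
nul = "\x00".encode("latin-1")
print(nul.hex())                        # 00
print("\x00".encode("cp500") == nul)    # True
```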
Actually, I have checked all the possible values and their equivalent characters in EBCDIC. In the sample file I was given, I do see the NUL character displayed (in the VEDIT editor).
Do let me know if you have tried something else as well.
Thanks so much for being so helpful.