The value of DBCOLUMNINFO.ulColumnSize returned by IColumnsInfo::GetColumnInfo has changed for OLEDB connections at Oracle Client level 12.2.0.1. In 12.1.0.2 the value was a size in bytes; now, for a DBTYPE_STR or DBTYPE_WSTR column, it is returned as a number of characters. I understand that this is the correct behavior as defined by the Microsoft OLEDB specification, but it has caused an issue that I am trying to correct.
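For reference, this is roughly how the metadata is read (a minimal sketch with error handling trimmed; pRowset, DumpColumnSizes, and the other names are placeholders, and the rowset is assumed to already be open through the Oracle OLEDB provider):

```
// Minimal sketch: read ulColumnSize for the string columns of an open rowset.
// pRowset is assumed to be an IRowset* obtained through OraOLEDB; error
// handling is trimmed for brevity.
#include <windows.h>
#include <oledb.h>
#include <cstdio>

void DumpColumnSizes(IRowset* pRowset)
{
    IColumnsInfo* pColumnsInfo = nullptr;
    if (FAILED(pRowset->QueryInterface(IID_IColumnsInfo,
                                       reinterpret_cast<void**>(&pColumnsInfo))))
        return;

    DBORDINAL cColumns = 0;
    DBCOLUMNINFO* rgInfo = nullptr;
    OLECHAR* pStringsBuffer = nullptr;

    if (SUCCEEDED(pColumnsInfo->GetColumnInfo(&cColumns, &rgInfo, &pStringsBuffer)))
    {
        for (DBORDINAL i = 0; i < cColumns; ++i)
        {
            if (rgInfo[i].wType == DBTYPE_STR || rgInfo[i].wType == DBTYPE_WSTR)
            {
                // 12.1.0.2 reported bytes here; 12.2.0.1 reports characters.
                wprintf(L"%ls ulColumnSize=%lu\n", rgInfo[i].pwszName,
                        static_cast<unsigned long>(rgInfo[i].ulColumnSize));
            }
        }
        CoTaskMemFree(rgInfo);
        CoTaskMemFree(pStringsBuffer);
    }
    pColumnsInfo->Release();
}
```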
The code I'm working on is a general interface that uses OLEDB to connect to any database. It takes a SQL statement and displays the returned data in columns on the terminal. The byte size was used to determine the display width for a column of string data. This change did not affect SBCS environments, but it is a problem in MBCS environments; the specific environment I am working with right now uses the JAPANESE_JAPAN.JA16SJISTILDE character set. The terminal environment we use displays a 2-byte Kanji character in 2 character cells, so a character count doesn't help me much at all. The maximum byte size of a column is what I need to size the column on the terminal.
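To illustrate what the terminal actually needs: the width of a Shift-JIS value in cells works out to its byte length. Here is a hypothetical helper (not code from the real interface) that assumes the standard Shift-JIS lead-byte ranges:

```
// Hypothetical helper: terminal display width, in cells, of a Shift-JIS
// (JA16SJISTILDE) string. A lead byte in 0x81-0x9F or 0xE0-0xFC starts a
// 2-byte character that takes 2 cells; every other byte takes 1 cell. The
// result therefore equals the byte length, which is why the old byte-based
// ulColumnSize could be used directly as a display width.
#include <cstddef>

std::size_t SjisDisplayCells(const unsigned char* s, std::size_t cbLen)
{
    std::size_t cells = 0;
    for (std::size_t i = 0; i < cbLen; )
    {
        unsigned char b = s[i];
        bool lead = (b >= 0x81 && b <= 0x9F) || (b >= 0xE0 && b <= 0xFC);
        if (lead && i + 1 < cbLen) { cells += 2; i += 2; }  // double-byte Kanji
        else                       { cells += 1; i += 1; }  // single-byte character
    }
    return cells;
}
```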
Simply doubling the column width in MBCS environments is not a viable solution: if the column was declared with a byte-size maximum, ulColumnSize reports that value, and doubling it eats up already limited terminal space. I can't leave it as it is either, because if the column was declared with a character-size maximum, ulColumnSize reports the character count and we will truncate data. For example, under JA16SJISTILDE a VARCHAR2(10 BYTE) column and a VARCHAR2(10 CHAR) column both report ulColumnSize = 10, but the first needs at most 10 terminal cells while the second can hold up to 20 bytes and so needs up to 20 cells.
I am looking for anything else in the OLEDB interfaces that could give me the maximum byte size of a DBTYPE_WSTR column, or anything that would tell me how the column was declared (CHAR or BYTE).
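For example, if OraOLEDB's IColumnsRowset exposed and actually populated the spec's optional CHARACTER_OCTET_LENGTH (DBCOLUMN_OCTETLENGTH) metadata column, that would presumably be the byte maximum I need, but I have not been able to confirm that it does. Purely as an illustration of the kind of thing I am hoping exists (names hypothetical, approach unverified):

```
// Hypothetical probe, not verified against OraOLEDB: ask the provider which
// optional metadata columns its IColumnsRowset supports. If a byte-length
// column such as DBCOLUMN_OCTETLENGTH were listed and populated, that would
// be the maximum byte size I am after.
#include <windows.h>
#include <oledb.h>

DBORDINAL CountOptionalMetadataColumns(IUnknown* pRowsetOrCommand)
{
    IColumnsRowset* pColRowset = nullptr;
    if (FAILED(pRowsetOrCommand->QueryInterface(IID_IColumnsRowset,
                                                reinterpret_cast<void**>(&pColRowset))))
        return 0;

    DBORDINAL cOptColumns = 0;
    DBID* rgOptColumns = nullptr;
    if (SUCCEEDED(pColRowset->GetAvailableColumns(&cOptColumns, &rgOptColumns)))
        CoTaskMemFree(rgOptColumns);  // real code would inspect these DBIDs first

    pColRowset->Release();
    return cOptColumns;
}
```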
Thank you to anyone who can provide some help.