OLEDB ulColumnSize Change in 12c

3625313 Member Posts: 1
edited May 24, 2018 8:59PM in Oracle Provider for OLE DB

The value of DBCOLUMNINFO.ulColumnSize returned by IColumnsInfo::GetColumnInfo has changed for OLEDB connections at the Oracle Client level. Previously the value was a size in bytes; now the size is returned as a number of characters for a DBTYPE_STR or DBTYPE_WSTR column. I understand that this is the correct behavior as defined by the Microsoft OLEDB standard, but it has caused an issue that I am trying to correct.

The code I'm working on is a general interface that uses OLEDB to connect to any database. It should be able to take a SQL statement and display the returned data in columns on the terminal. The byte size was used to set the display width for a column of string data. This change does not affect SBCS environments, but the issue arises in MBCS environments; the specific environment I am working with right now is the JAPANESE_JAPAN.JA16SJISTILDE character set. The terminal environment we are using displays a 2-byte Kanji character in 2 character positions on the terminal, so a character count doesn't help me much at all. The maximum byte size of a column is what I need to size the column on the terminal.
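The terminal-width behavior described above can be sketched as follows. This is a minimal illustration, not the poster's code, and it assumes the standard Shift-JIS lead-byte ranges (0x81-0x9F and 0xE0-0xFC) and a terminal where a double-byte Kanji occupies two cells and a single-byte character one; the function name is hypothetical:

```cpp
#include <cstddef>
#include <string>

// Number of terminal cells a Shift-JIS byte string occupies, assuming
// a double-byte character takes two cells and a single-byte character
// takes one (as on the terminal described in the post).
std::size_t sjisDisplayWidth(const std::string& bytes) {
    std::size_t cells = 0;
    for (std::size_t i = 0; i < bytes.size(); ++i) {
        unsigned char b = static_cast<unsigned char>(bytes[i]);
        // Standard Shift-JIS lead-byte ranges.
        bool lead = (b >= 0x81 && b <= 0x9F) || (b >= 0xE0 && b <= 0xFC);
        if (lead && i + 1 < bytes.size()) {
            cells += 2;  // double-byte Kanji: two bytes, two cells
            ++i;         // skip the trail byte
        } else {
            cells += 1;  // ASCII or half-width katakana: one byte, one cell
        }
    }
    return cells;
}
```

Under these assumptions the display width of a string always equals its byte length, which is exactly why a maximum byte size, and not the 12c character count, is what sizes the column correctly.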

Simply doubling the column width in MBCS environments is not a viable solution: if the column was declared with a byte-size maximum, ulColumnSize will report that value and we will eat up already limited terminal space. I can't leave it as it is either, because if the column was declared with a character-size maximum, ulColumnSize will report that and we will truncate data.
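The dilemma can be modeled with a small hypothetical helper, assuming the JA16SJISTILDE worst case of 2 bytes per character; `trueByteMax` and `declaredInChars` are illustrative names, not OLE DB or Oracle identifiers:

```cpp
#include <cstdint>

// Worst case for JA16SJISTILDE: one character may occupy 2 bytes.
constexpr std::uint32_t kMaxBytesPerChar = 2;

// Hypothetical helper: the byte capacity the column really has, if we
// knew how it was declared. With only the 12c character count from
// ulColumnSize, we cannot tell which branch applies.
std::uint32_t trueByteMax(std::uint32_t declaredSize, bool declaredInChars) {
    return declaredInChars ? declaredSize * kMaxBytesPerChar
                           : declaredSize;
}
```

For a column declared with a byte maximum of 10, 12c reports ulColumnSize = 10 and the true byte maximum is 10, so doubling wastes 10 terminal columns; for a character maximum of 10 it also reports 10, but the true maximum is 20, so not doubling truncates. The same reported value demands two different widths, which is why the declaration style is needed.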

I am looking for anything else in the OLEDB interfaces that could get me a maximum column byte size for a WSTR, or anything that would tell me how the column was declared (char or byte).

Thank you to anyone that can provide some help.
