APEX

Looking for Tree Widget API samples (Apex 19.2)

John like many others, Oct 21 2020 (edited Oct 21 2020)

Hello,
Before I put my questions, I would like to make something clear so that others don't pull their hair out or waste hours finding the cause:
If you use a generated APEX tree (widget) and you want to access it via the API, you will get some confusing error messages, and there is no help in the API documentation telling you why. The way to access a generated tree is:
At the tree region level, enter an ID in 'Static ID' (e.g. HIERARCHYTREE).
Access the tree widget in JavaScript via $('#HIERARCHYTREE_tree') - note the _tree suffix appended to the static ID.
The API documentation (I'm not allowed to link it here) is even misleading, as its samples use a class identifier (dot notation). So don't get confused by that.
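As a minimal sketch of that access pattern (HIERARCHYTREE is the example static ID from above; getSelectedNodes is a treeView widget method, but this only runs inside an APEX page):

```javascript
// Sketch: a tree region with Static ID "HIERARCHYTREE".
// APEX appends "_tree" to the region's static ID for the widget element.
var tree$ = apex.jQuery( "#HIERARCHYTREE_tree" );

// Call a treeView widget method, e.g. get the currently selected nodes:
var selectedNodes = tree$.treeView( "getSelectedNodes" );
console.log( selectedNodes );
```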
Questions:
1. I would like to add and delete nodes in the generated tree. The API documentation does not provide ANY samples showing how to add or delete a node. I even searched here (spyglass at the top) for "tree", but got no results. So if you know any source on the Internet, or a book, that provides examples, I would be thankful.
2. I would like to react in the background when the user clicks on any tree entry (node or child). The API doesn't expose simple click events. How can I (simply) attach a one-click handler to any entry of the tree?
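Not an official sample, but one common approach is a sketch like the following. It assumes the static ID HIERARCHYTREE from above; the treeviewselectionchange event follows the jQuery UI widget event naming for the treeView widget, and the a-TreeView-label CSS class is an assumption based on the markup the tree region generates:

```javascript
// Sketch: reacting to clicks/selection in a tree with Static ID "HIERARCHYTREE".
var tree$ = apex.jQuery( "#HIERARCHYTREE_tree" );

// Option 1: the treeView widget fires a selection-change event.
tree$.on( "treeviewselectionchange", function( event ) {
    var nodes = tree$.treeView( "getSelectedNodes" );
    if ( nodes.length ) {
        console.log( "Selected node label:", nodes[ 0 ].label );
    }
} );

// Option 2: a delegated click handler on the node labels
// (the a-TreeView-label class is taken from the generated markup).
tree$.on( "click", ".a-TreeView-label", function() {
    console.log( "Clicked:", apex.jQuery( this ).text() );
} );
```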
3. To add a node, I guess I have to use the "addNode" method described in the API: addNode(pParent, pIndex, pLabel_opt_, pContext_opt_, pCallback). It's already unclear how to obtain those parameters. For example, HOW do I get the PARENT node into which I would like to add a child? There is a function to get children, but no function to get the parent of, say, the selected node. And pIndex: why do I first have to count the children of the node I want to add a child to, just so I can append the new node at the end? Also, as I'm using the default generated tree, how can I pass the ID provided by that default node?
I think some simple code samples would resolve the questions in topic 3 by themselves. So if you know any source with simple examples for adding and/or deleting nodes via the API, that would be great.
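A sketch of how those parameters could be obtained: the parent can come from the current selection, and pIndex can be the current child count so the new node is appended at the end. getNodeAdapter, childCount, addNode, and removeNode are treeView/nodeAdapter API names; whether the default adapter of a generated tree actually implements the optional addNode/removeNode methods is an assumption here, hence the guards:

```javascript
// Sketch: add a child under the selected node, then delete a node.
// Assumes a tree region with Static ID "HIERARCHYTREE".
var tree$   = apex.jQuery( "#HIERARCHYTREE_tree" ),
    adapter = tree$.treeView( "getNodeAdapter" ),
    parent  = tree$.treeView( "getSelectedNodes" )[ 0 ];  // parent = current selection

if ( parent && adapter.addNode ) {
    // pIndex = current child count, so the new node is appended at the end
    adapter.addNode( parent, adapter.childCount( parent ), "New child", null,
        function( newNode ) {
            tree$.treeView( "refresh" );  // re-render the widget from the adapter
        } );
}

// Deleting works the same way via the adapter (if it implements removeNode):
// adapter.removeNode( nodeToDelete, function() { tree$.treeView( "refresh" ); } );
```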

Comments

Chanchal Wankhade
Hi,


Let us assume the database character set is UTF-8, which I believe is the default in recent versions of Oracle. In this case, some characters take more than one byte to store in the database.

If you define the field as VARCHAR2(11 BYTE), Oracle will allocate 11 bytes for storage, but you may not actually be able to store 11 characters in the field, because some of them take more than one byte to store, e.g. non-English characters.

By defining the field as VARCHAR2(11 CHAR) you tell Oracle to allocate enough space to store 11 characters, no matter how many bytes it takes to store each one. I believe that in Oracle 10g, 3 bytes per character were used.
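To illustrate (a sketch; the table and column names are made up, and the overflow behavior assumes a UTF-8 database character set):

```sql
-- 11 bytes vs. 11 characters of storage
CREATE TABLE t (
    col_byte  VARCHAR2(11 BYTE),
    col_char  VARCHAR2(11 CHAR)
);

-- 11 ASCII characters fit in both columns:
INSERT INTO t (col_byte, col_char) VALUES ('hello world', 'hello world');

-- 11 two-byte characters (22 bytes in UTF-8) overflow the BYTE column
-- with ORA-12899, but fit in the CHAR column:
INSERT INTO t (col_char) VALUES ('ééééééééééé');
```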


Also see this good link:

http://stackoverflow.com/questions/81448/difference-between-byte-and-char-in-column-datatypes
theoa
Declare your variables as table.column%TYPE.
Even if the type (length) of the database column changes, it will still fit in the variable.
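For example (a sketch; the employees table and its columns are hypothetical):

```sql
-- The variable inherits the column's exact type and length,
-- so it keeps working even if the column is later widened.
DECLARE
    v_last_name  employees.last_name%TYPE;
BEGIN
    SELECT last_name
      INTO v_last_name
      FROM employees
     WHERE employee_id = 100;
END;
/
```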
BluShadow
Chanchal Wankhade wrote:
Hi,


Let us assume the database character set is UTF-8, which I believe is the default in recent version of Oracle. In this case, some characters take more than 1 byte to store in the database.

If you define the field as VARCHAR2(11 BYTE), Oracle will allocate 11 bytes for storage, but you may not actually be able to store 11 characters in the field, because some of them take more than one byte to store, e.g. non-English characters.

By defining the field as VARCHAR2(11 CHAR) you tell Oracle to allocate enough space to store 11 characters, no matter how many bytes it takes to store each one. I believe that in Oracle 10g, 3 bytes per character were used.
It could be up to 4 bytes depending on the character set and the character being stored.

Also bear in mind that with a multi-byte character set, the limit on VARCHAR2 columns in the database is still 4000 bytes, not 4000 characters. So if multi-byte characters get stored, the number of characters that fit in the column can be as few as 1000. This causes confusion when people assume the limit is 4000 characters and then find they have trouble storing that many because of multi-byte characters. (The same principle applies to the 32767-byte limit on PL/SQL VARCHAR2 variables.)
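The character/byte distinction can be checked directly with LENGTH versus LENGTHB (a sketch; the byte count assumes an AL32UTF8 database character set):

```sql
-- LENGTH counts characters; LENGTHB counts bytes.
-- In a UTF-8 database, 'é' is 1 character but 2 bytes:
SELECT LENGTH('é')  AS char_count,
       LENGTHB('é') AS byte_count
  FROM dual;
```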
Umesh P
All the three options
1) varchar2(100 char)
2) varchar2(100 bytes)
3) varchar2(100)

are equally supported and recommended. Using any of them will result in insertion of 100 characters only.
BluShadow
992981 wrote:
All the three options
1) varchar2(100 char)
2) varchar2(100 bytes)
3) varchar2(100)

are equally supported and recommended. Using any of them will result in insertion of 100 characters only.
No it won't. Clearly you haven't read previous replies.

varchar2(100 bytes) will support at most 100 characters, but could be as few as 25 characters if it is populated with multi byte characters (each could take up to 4 bytes).
Post Details

Added on Oct 21 2020
6 comments
1,691 views