So what is the best way to generate this dynamic temp table?
You probably don't need a temp table in Oracle.
Your code is only using the temp table as intermediate storage for data that is then queried multiple times later.
In SQL Server, temp tables are used a lot to avoid locking issues that cause performance problems when other users are querying some of the same data.
Oracle doesn't have that locking problem. Readers do not block writers or other readers.
So you can just repeat your query (the one that loads the temp table) when you need it.
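For example (a minimal sketch; the customers and orders tables and the date filter are made up for illustration), the query that would have loaded the temp table can simply be run inline wherever its result is needed:

SELECT c.customer_id,
       c.customer_name
FROM   customers c
WHERE  c.customer_id IN (SELECT o.customer_id
                         FROM   orders o
                         WHERE  o.order_date >= DATE '2012-01-01');

Because readers never block writers (or other readers) in Oracle, rerunning the same query while other sessions are working on those tables does not create the locking problems you would see in SQL Server.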
See this link from the AskTom site that talks about global temporary tables in Oracle.
There are other links on this forum that talk about the same issue when migrating from SQL Server to Oracle.
Thank you for the answer.
What I did not mention is that I am not using the temporary table because of table locks; I am using it because of performance issues.
The tables I am selecting from are quite large, and because of the DISTINCT and the NOT IN the select takes a very long time compared to writing the data into a temp table and working from there.
Next thing: I am using the same technique while serializing data from different tables.
So while a monitor has parameters like "screen size" and "backlight", printers come with "color" and "pages per minute". Since the shopping frontend shows them serialized, and I do not want to change all the tables just to insert a new parameter, I prefer to save them serialized. But I usually get the data in table format, which I access through linked tables.
I am not using the temporary table because of table locks; I am using it because of performance issues.
Global temporary tables in Oracle are different from temp tables in SQL Server.
In Oracle the same table can be used by multiple users (sessions) without data conflicts; that is, to each user it appears as if it is their table and all the data in it belongs to them.
The table is a permanent table; you create it one time. But the data in it will be deleted automatically. When the data is deleted depends on the way the table is defined.
If you define it as
ON COMMIT DELETE ROWS
(this is the default) then when the session issues a COMMIT the table is truncated. Remember that only THIS session's data is removed. Data for other sessions is unaffected.
If you define the table as
ON COMMIT PRESERVE ROWS
then the data is not deleted automatically until the session ends. Of course the user can delete or truncate it manually at any time.
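A minimal sketch of both options (the table and column names here are made up for illustration):

CREATE GLOBAL TEMPORARY TABLE gtt_order_staging (
    order_id    NUMBER,
    customer_id NUMBER,
    order_total NUMBER(12,2)
) ON COMMIT DELETE ROWS;    -- default: rows disappear at COMMIT

CREATE GLOBAL TEMPORARY TABLE gtt_report_staging (
    report_id   NUMBER,
    report_data VARCHAR2(4000)
) ON COMMIT PRESERVE ROWS;  -- rows stay until the session ends

Each session that inserts into these tables sees only its own rows.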
See Creating a Temporary Table in the DBA guide.
So unless that table has to dynamically change structure for each run of the stored procedure you can just define the global temp table once and then use it for multiple sessions. You can also add any appropriate indexes on it and can add or drop columns as needed.
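For example (again with made-up names), an index on the temporary table is created once, like on any permanent table, and the indexed data is session-private just like the rows:

CREATE INDEX gtt_order_staging_ix ON gtt_order_staging (customer_id);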
If the table needs a different structure for each run, based on the number or types of columns in the result set you build dynamically, then you will need to create the table dynamically as well. In that case there is no need to use a temp table since a regular table will do.
NOTE - if you create the table dynamically then you MUST use dynamic SQL to access it, since the table will not exist at compile time and any static reference to it will cause a compile error.
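A minimal sketch of that pattern (the table name, columns, and statements are hypothetical):

DECLARE
    l_cnt NUMBER;
BEGIN
    -- The table does not exist when this block is compiled, so it is created dynamically.
    EXECUTE IMMEDIATE 'CREATE TABLE stage_results (item_id NUMBER, item_value VARCHAR2(100))';

    -- Every access must also go through dynamic SQL, otherwise the block would not compile.
    EXECUTE IMMEDIATE 'INSERT INTO stage_results (item_id, item_value) VALUES (:1, :2)'
        USING 1, 'first row';

    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM stage_results' INTO l_cnt;
    DBMS_OUTPUT.PUT_LINE('Rows staged: ' || l_cnt);

    -- Clean up when finished.
    EXECUTE IMMEDIATE 'DROP TABLE stage_results';
END;
/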
926165 wrote: I am trying to convert a MSSQL Database to Oracle 11g. But I am a little bit stuck on dynamic temp tables in stored procedures.
Just my 2 cents here.
Converting from Database Vendor A to Database Vendor B is never really a great idea. The products are fundamentally different in many respects. Analysing the application that needs to be ported from Database A to Database B and then implementing it in a specific manner to take advantage of the features of your new environment, accounting for any differences between Vendor implementations is going to be your best bet.
I say this because, to me at least, a conversion means you are trying to take what you have and slam it into what you need. The much better approach is to analyse, design (to account for any discrepancies in behaviour between vendor implementations, such as read consistency mechanisms) and proceed from there. That way you have a solid foundation on which to begin, instead of a skyscraper built out of toothpicks.