1. Unable to restore the DB properly
2. Unable to connect to the DB from client machines
Read the last part of this 'interesting' post -- Re: Date range using dual table
I was in class 7 then. ;-)
Recently I faced a lot of problems when a Perl process brought a lot of junk/unsupported characters into our database, which were not handled.
Ohhh..!!! It was a nightmare.
I'm sure most, if not all, of us here that have been around for a while would have lots of war stories about someone or something blowing up the database and the trials and tribulations of getting it put back together properly. Many of us would also have lots of stories about that really tricky report/ETL job and all of the tricks we used to get it to work correctly and fast enough. However, for me the biggest challenges I face regularly have to do with the types of things that come up on this forum every day.
Basic things like not understanding the correct datatype (dates are a real biggie here, but storing/passing numbers as strings is right up there) and relying on implicit type conversions. Developers who do not think in sets and resort to cursors and row-by-row processing for (almost) every task. Misusing dynamic SQL when it is not required; personally, I rarely use EXECUTE IMMEDIATE in my code, and when I do it is almost always for set-up/reset type activities in ETL code, like truncating a staging table or disabling constraints/indexes prior to a data load.
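To make the set-based point concrete, here is a minimal sketch (the `emp` table and its columns are invented for illustration):

```sql
-- Row-by-row (slow): a cursor loop issuing one UPDATE per row.
BEGIN
  FOR r IN (SELECT emp_id FROM emp WHERE dept_id = 10) LOOP
    UPDATE emp SET salary = salary * 1.05 WHERE emp_id = r.emp_id;
  END LOOP;
END;
/

-- Set-based (fast): one statement does the same work in a single pass.
UPDATE emp SET salary = salary * 1.05 WHERE dept_id = 10;
```

The set-based version lets the database optimize the whole operation instead of paying the context-switch and parse cost once per row.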
These types of things can have an impact on performance, not only for the particular task that the piece of code is designed to accomplish, but quite possibly database wide. Flooding the shared pool with hundreds of copies of a SQL statement that differ only by the literals used in a predicate (or worse, as in one app I dealt with, only by literals in the projection) is detrimental to the health of the whole database, and when the app has hundreds of SQL statements it gets really bad for the well-behaved developers in the database.
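The shared-pool flooding described above usually comes from concatenating literals into dynamic SQL instead of using bind variables; a hedged sketch (table and variable names are hypothetical):

```sql
-- Bad: every p_id value produces a distinct statement in the shared pool.
EXECUTE IMMEDIATE
  'SELECT name FROM customers WHERE customer_id = ' || p_id
  INTO v_name;

-- Good: static SQL in PL/SQL binds automatically; one shareable cursor.
SELECT name INTO v_name FROM customers WHERE customer_id = p_id;

-- If dynamic SQL is genuinely needed, bind explicitly with USING.
EXECUTE IMMEDIATE
  'SELECT name FROM customers WHERE customer_id = :id'
  INTO v_name USING p_id;
```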
Worse than the performance impact are the subtle bugs these types of things can introduce in the code. How many questions on this forum essentially boil down to "I put data in my table, now when I try to query for a date range I get no rows, but I know I put it in"? The answer usually comes down to this: at some point in the process of inserting or querying the data, someone relied on an implicit string-to-date conversion, and the date in the table (or perhaps in the predicate) is actually 19-something or 00-something instead of the expected 20-something.
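A minimal sketch of that failure mode (the `orders` table is made up; the exact result of the implicit conversion depends on the session's NLS_DATE_FORMAT setting):

```sql
-- Implicit conversion: what date is this? 2015-02-01? 1915-01-02?
-- It depends entirely on the session's NLS_DATE_FORMAT.
INSERT INTO orders (order_date) VALUES ('01-02-15');

-- Explicit conversion: unambiguous regardless of session settings.
INSERT INTO orders (order_date)
VALUES (TO_DATE('2015-02-01', 'YYYY-MM-DD'));
```

The same discipline applies in predicates: compare DATE columns to TO_DATE(...) expressions, never to bare strings.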
I cannot tell you how many hours I have spent debugging other people's code (usually after the data has been well and truly corrupted), asking how that value got into that column because (so I'm told) it's impossible for that to happen. In the vast majority of cases it is due to implicit data conversions.
Inspiring stories are all well and good, but if you are looking for inspiration to become a better developer/DBA look at the mindset of the people like Frank, Blushadow, Solomon and Billy that allow them to see the set-based solution to a problem (there is almost always one), to see the "correct" data types to use etc. And, to paraphrase Billy, programming is programming, basic good practices are the same in any language, so look at well-written code in other languages, not just PL/SQL.
But, if you want an inspiring story for a "junior", here is mine.
We had a very tricky bit of ETL code that involved joining 5 or 6 tables. Complicating matters, because of the nature of the source system, one of the tables had to be pivoted, one had to be unpivoted, and we needed a top-n query for one of the others. At the time, we were using 10 something, so no PIVOT/UNPIVOT, so I wrote a several-hundred-line query with sub-queries over sub-queries, analytic functions and hierarchical queries. This produced correct results, and the run-time was more than adequate (about 3 minutes for 6.5 million rows as an insert append). I gave that query to one of our junior developers for integration into the ETL package we were creating. She looked at that monster and said "Huh?". I told her, as I tell everyone: break it down into pieces, run each of the bits alone to see what they do, then build it back up again to see how it all fits together.
The next day, she came back to me and said "I spent a lot of time playing with your query, and I think I understand how it works. If I do understand, then I think this is an equivalent query, can you check it for me?" What she gave me was a fifty or so line query with two sub-queries, no analytics and no hierarchical queries. At first glance, it could not possibly work, but I checked it anyway and much to my surprise it was correct (and faster than mine). Sometimes, knowing too many tricks is not a good thing :-)
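For readers curious about the pre-PIVOT workaround mentioned in the story, here is a minimal sketch of a manual pivot alongside the 11g PIVOT clause (the `emp_attrs` table and its columns are invented):

```sql
-- Manual pivot, pre-11g style: aggregate over CASE expressions.
SELECT emp_id,
       MAX(CASE WHEN attr = 'PHONE' THEN val END) AS phone,
       MAX(CASE WHEN attr = 'EMAIL' THEN val END) AS email
FROM   emp_attrs
GROUP  BY emp_id;

-- The equivalent query using the PIVOT clause available from 11g.
SELECT *
FROM   emp_attrs
PIVOT  (MAX(val) FOR attr IN ('PHONE' AS phone, 'EMAIL' AS email));
```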
Wow... It was a nice story, John. :-)
Thanks a lot for sharing with all of us.