Statistics_Level = All for Stress Testing?
Date: Thu, 20 Mar 2008 14:36:36 -0500
Message-ID: <OF6E66B4D5.4A144CFD-ON86257412.006A28EA-86257412.006BB6B6@austinisd.org>
Anyone use this setting in a Dev or Test environment to bring out the
worst in your Developers' SQL?
Background is that I set Statistics_Level = All in our Production Student Information database (9.2.0.4 on 64bit Linux) to do some tracing and analysis with HotSOS Profiler over Spring Break last week and didn't change it back to Typical before School restarted on Monday. I hadn't yet searched MetaLink and found all the problems associated with the All setting, so didn't even think about it until we'd fought maxed-out CPUs for a day and a half and had a bunch of frustrated Users who couldn't use our applications. Once I finally remembered that I'd done that and set it back to Typical, CPU went way down to normal levels.
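For reference, the whole incident came down to one parameter change and its reversal. A minimal sketch (SHOW PARAMETER is SQL*Plus syntax; assumes SYSDBA access):

```sql
-- Check the current setting
SHOW PARAMETER statistics_level;

-- What I set over Spring Break: enables timed OS statistics,
-- row-source execution statistics, etc. The extra bookkeeping on
-- every execution is what drove CPU through the roof.
ALTER SYSTEM SET statistics_level = ALL;

-- The fix: back to the default
ALTER SYSTEM SET statistics_level = TYPICAL;
```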
In the course of trying to figure out what the heck was going wrong with our previously fairly nicely behaved apps, we found instances of inefficient SQL in several apps. Adding an index and hint or two here and there worked wonders and made performance noticeably better even after Statistics_Level was set back to Typical. That got me to thinking that maybe I should use the All setting in our Dev and Test databases to bring out the worst in new/revised apps and tune them before they go to Production.
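(For anyone wanting to hunt similar SQL without a profiler, here's a sketch of the kind of thing you can pull from v$sql -- this isn't how we found ours, we used HotSOS Profiler, and the ordering choice is arbitrary:)

```sql
-- Rough cut: statements doing the most logical I/O per execution.
-- Column names are standard data dictionary; tune the ORDER BY to
-- taste (cpu_time, disk_reads, etc.).
SELECT sql_text, executions, buffer_gets,
       buffer_gets / GREATEST(executions, 1) AS gets_per_exec
  FROM v$sql
 WHERE executions > 0
 ORDER BY gets_per_exec DESC;
```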
Does that seem reasonable, or does the All setting only affect certain kinds of SQL, or skew things in some other way that wouldn't give me the kind of general stress testing I'm envisioning?
Thanks.
Jack C. Applewhite - Database Administrator
Austin I.S.D.
414.9715 (phone) / 935.5929 (pager)
-- http://www.freelists.org/webpage/oracle-l
Received on Thu Mar 20 2008 - 14:36:36 CDT