Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Home -> Community -> Usenet -> c.d.o.server -> Re: How do I handle statistics on volatile tables
From: tgp_at_mci2000.com >>
Peoplesoft application, if you're familiar with it, and care about such things.
I am running 7.3.4 on AIX, with the cost-based optimizer. Things run pretty well in most situations. But the application has MANY "temp tables", defined to mean tables which are normally empty, but are filled with data during a batch run, used in some way, and truncated at the end.
I update my statistics every Sunday, like a good boy, but I'm certainly lying to Oracle when I do this, because all the "temp tables" have no rows in them at 6 AM Sunday, but may have hundreds or thousands of rows when I'd like a really good execution plan.
How do you handle this situation? <<
We have several tables that are truncated on a regular basis before use. To help the optimizer, we analyze these tables when they hold a normal load in them, and then we do not re-analyze them. If the number of rows, the average row length, and the relative uniqueness of the columns do not change much, then the statistics do not need updating. I have not checked version 8, but with 7.3 and lower the truncate command does not reset the statistics, so we let them age. We try to analyze only tables that grow or whose data varies.
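For example (the table name ps_temp_ld is just an illustration, not from the original post), you would load a representative batch of data once and analyze the table while it is full:

```sql
-- Load a representative data set first, then gather statistics once.
-- COMPUTE reads every row; ESTIMATE is cheaper on very large tables.
ANALYZE TABLE ps_temp_ld COMPUTE STATISTICS;

-- Check what the dictionary now records for the table:
SELECT num_rows, avg_row_len, last_analyzed
  FROM user_tables
 WHERE table_name = 'PS_TEMP_LD';
```

After the table is later truncated and refilled, those statistics stay in place and keep describing a typical load.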
The other option is to delete the stats for these temp tables and see if the jobs run better than with stats that show zero rows.
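Deleting the statistics is a one-line command; again, the table name is hypothetical:

```sql
-- Remove the misleading zero-row statistics so the optimizer
-- stops costing plans against an "empty" table.
ANALYZE TABLE ps_temp_ld DELETE STATISTICS;
```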
If batch jobs are involved, and the load of the temp table is done in a separate step from the use of the table, then you might consider coding a procedure that uses dynamic SQL to analyze the temp table, and placing that procedure call between the two steps.
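A minimal sketch of such a procedure, using the DBMS_SQL package for the dynamic SQL (the procedure name is an assumption; note that under DBMS_SQL, DDL such as ANALYZE executes at parse time, so no separate EXECUTE call is needed):

```sql
CREATE OR REPLACE PROCEDURE analyze_temp_table (p_table IN VARCHAR2) AS
  v_cursor INTEGER;
BEGIN
  v_cursor := DBMS_SQL.OPEN_CURSOR;
  -- The ANALYZE is DDL, so it runs when parsed.
  DBMS_SQL.PARSE(v_cursor,
                 'ANALYZE TABLE ' || p_table || ' COMPUTE STATISTICS',
                 DBMS_SQL.NATIVE);
  DBMS_SQL.CLOSE_CURSOR(v_cursor);
EXCEPTION
  WHEN OTHERS THEN
    IF DBMS_SQL.IS_OPEN(v_cursor) THEN
      DBMS_SQL.CLOSE_CURSOR(v_cursor);
    END IF;
    RAISE;
END;
/
```

The batch stream would then call something like analyze_temp_table('PS_TEMP_LD') after the load step and before the step that reads the table.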
I hope this gives you something to think about.
Mark Powell -- Oracle 7 Certified DBA
- The only advice that counts is the advice that you follow so follow your own
advice -
Received on Mon Aug 31 1998 - 16:52:35 CDT