Statistics gathering approach [message #509286]
Fri, 27 May 2011 01:12
Its_me_ved
Our DBA says they use the default database parameters for statistics. The job (dbms_stats.gather_database_stats_job_proc) is scheduled to run seven nights a week and the parameter setting is 'TYPICAL'. The procedure determines which statistics are stale and gathers them as necessary.
Is this the correct method to gather stats? We have observed lots of performance issues, though. We also checked the LAST_ANALYZED date, and for most of the tables it is not being updated.
For example, there is a table where NUM_ROWS is 38900000, but if I count the rows in the table I get 39900000+.
Please advise!
Regards
Ved
Re: Statistics gathering approach [message #509300 is a reply to message #509286]
Fri, 27 May 2011 01:39
John Watson
Ved, I would trust Uncle Oracle with this. If I remember correctly, the default setting for dbms_stats is that it considers statistics to be "stale" once 10% of the rows have changed since the last analysis. You can check the change counts by querying dba_tab_modifications (you may need to execute dbms_stats.flush_database_monitoring_info first, as the view is not updated in real time), and if you don't like that 10% default you can change it.
I would think that if a table already has about 40 million rows, you could probably double (or halve) the number of rows without any change needed in execution plans.
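The check John describes can be sketched in SQL. This is a minimal sketch, assuming an 11g-style dictionary (the STALE_PERCENT preference exists from 11g onwards); SCOTT.EMP in the last statement is only a placeholder:

```sql
-- Flush the in-memory DML monitoring counters to the dictionary first,
-- since DBA_TAB_MODIFICATIONS is not maintained in real time.
EXEC dbms_stats.flush_database_monitoring_info;

-- Compare the tracked modifications against the optimizer's row count.
SELECT m.table_owner, m.table_name,
       m.inserts, m.updates, m.deletes,
       t.num_rows,
       ROUND(100 * (m.inserts + m.updates + m.deletes) / t.num_rows, 1)
         AS pct_changed
FROM   dba_tab_modifications m
JOIN   dba_tables t
       ON  t.owner      = m.table_owner
       AND t.table_name = m.table_name
WHERE  t.num_rows > 0
ORDER  BY pct_changed DESC;

-- On 11g, the 10% threshold is the STALE_PERCENT preference and can be
-- changed per table (placeholder schema/table):
EXEC dbms_stats.set_table_prefs('SCOTT', 'EMP', 'STALE_PERCENT', '5');
```

Tables whose pct_changed exceeds the threshold are the ones the nightly job should pick up on its next run.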
Re: Statistics gathering approach [message #509301 is a reply to message #509294]
Fri, 27 May 2011 01:40
Its_me_ved
So we should check DBA_SCHEDULER_JOBS to see if there is anything wrong with the job.
The documentation says this procedure gathers statistics if more than 10% of the data (by table) has changed since the last time statistics were collected.
So, in a table of say 38900000 rows, if 1000000 rows are added, statistics need not be gathered?
Or should we also gather statistics manually, to make sure that the stats are up to date?
Thanks for your reply.
Regards
Ved
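A sketch of that scheduler check. Note the job name depends on the version (an assumption here, since the thread does not state it): on 10g the automatic job is GATHER_STATS_JOB in DBA_SCHEDULER_JOBS, while on 11g it runs as the "auto optimizer stats collection" automated maintenance task:

```sql
-- 10g: the nightly stats job is a regular scheduler job.
SELECT job_name, enabled, state, last_start_date, failure_count
FROM   dba_scheduler_jobs
WHERE  job_name = 'GATHER_STATS_JOB';

-- 11g: it is an automated maintenance task instead.
SELECT client_name, status
FROM   dba_autotask_client
WHERE  client_name = 'auto optimizer stats collection';

-- 11g: recent runs of that autotask.
SELECT client_name, job_status, job_start_time
FROM   dba_autotask_job_history
WHERE  client_name = 'auto optimizer stats collection'
ORDER  BY job_start_time DESC;
```

A disabled job/task, or a string of failed runs, would explain LAST_ANALYZED not moving.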
Re: Statistics gathering approach [message #509311 is a reply to message #509301]
Fri, 27 May 2011 02:20
Michel Cadot
Quote: So, in a table of say 38900000 rows, if 1000000 rows are added, statistics need not be gathered?
About 2.6% modified => No.
Quote: Or should we also gather statistics manually, to make sure that the stats are up to date?
With only ~2.6% drift, the statistics are up to date (unless you have a very specific case of skewed data for which the default job is not appropriate; see my previous answer).
Regards
Michel
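If you do decide a one-off manual gather is warranted for a specific table, a minimal sketch (the schema and table names are placeholders, not from this thread):

```sql
-- Manual, one-off gather for a single table. CASCADE => TRUE also
-- refreshes the statistics on its indexes; AUTO_SAMPLE_SIZE lets
-- Oracle choose an appropriate sample.
BEGIN
  dbms_stats.gather_table_stats(
    ownname          => 'SCOTT',        -- placeholder schema
    tabname          => 'BIG_TABLE',    -- placeholder table
    estimate_percent => dbms_stats.auto_sample_size,
    cascade          => TRUE);
END;
/
```

For routine maintenance, though, the default job remains the simpler option, as discussed above.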