Statistics gathering approach (Oracle 10.2.0.3)
Statistics gathering approach [message #509286] Fri, 27 May 2011 01:12
Its_me_ved
Messages: 979
Registered: October 2009
Location: India
Senior Member
The DBA says they use the default database settings for statistics. The job (dbms_stats.gather_database_stats_job_proc) is scheduled to run seven nights a week, and the STATISTICS_LEVEL parameter is set to 'TYPICAL'. The procedure determines which statistics are stale and gathers them as necessary.
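For reference, those settings can be confirmed like this (a minimal sketch; in 10g the automatic job is named GATHER_STATS_JOB):

  -- Confirm the statistics level (TYPICAL enables automatic stats collection)
  SELECT name, value FROM v$parameter WHERE name = 'statistics_level';

  -- Check that the automatic stats job exists and is enabled
  SELECT job_name, enabled, state
  FROM   dba_scheduler_jobs
  WHERE  job_name = 'GATHER_STATS_JOB';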

Is this the correct method to gather stats? We have observed lots of performance issues, though. We also checked the last-analyzed date:
for most of the tables it is not being updated.
For example, there is a table where num_rows is 38,900,000,
but a count on the table returns 39,900,000+ rows.

Please advise!

Regards
Ved
Re: Statistics gathering approach [message #509294 is a reply to message #509286] Fri, 27 May 2011 01:24
Michel Cadot
Messages: 68641
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
It is correct unless something goes wrong.
So leave it as it is for the database as a whole, and manually gather the stats for those objects for which the default is not appropriate (you can lock the stats for some tables so that the default job will skip them).
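For example (a sketch; APP and BIG_TABLE are placeholder names):

  -- Lock the stats so the nightly job skips this table
  EXEC DBMS_STATS.LOCK_TABLE_STATS(ownname => 'APP', tabname => 'BIG_TABLE');

  -- Gather them manually on your own schedule; FORCE overrides the lock
  BEGIN
    DBMS_STATS.GATHER_TABLE_STATS(
      ownname          => 'APP',
      tabname          => 'BIG_TABLE',
      estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
      cascade          => TRUE,
      force            => TRUE);
  END;
  /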

Quote:
there is a table where num_rows is 38,900,000,
but a count on the table returns 39,900,000+ rows

So the error is (39,900,000 - 38,900,000) / 38,900,000 ≈ 2.6%, which is quite good, isn't it?
Statistics are not exact numbers, statistics are... statistics.
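To see how far the stored figure has drifted, compare the dictionary value against a live count (a sketch; APP and BIG_TABLE are placeholder names):

  SELECT num_rows, last_analyzed
  FROM   dba_tables
  WHERE  owner = 'APP' AND table_name = 'BIG_TABLE';

  SELECT COUNT(*) FROM app.big_table;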

Regards
Michel
Re: Statistics gathering approach [message #509300 is a reply to message #509286] Fri, 27 May 2011 01:39
John Watson
Messages: 8929
Registered: January 2010
Location: Global Village
Senior Member
Ved, I would trust Uncle Oracle with this. If I remember correctly, the default behaviour of dbms_stats is to consider statistics "stale" once 10% of the rows have changed since the last analysis. You can check the change counts by querying dba_tab_modifications (you may need to execute dbms_stats.flush_database_monitoring_info first, as the view is not updated in real time). In 10.2 that 10% threshold is fixed; from 11g onwards you can change it per table with the STALE_PERCENT preference of dbms_stats.set_table_prefs.
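A sketch of that check (the table owner APP is a placeholder):

  -- Push the in-memory DML counters into the dictionary view
  EXEC DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO;

  -- See how much each table has changed since its last analysis
  SELECT table_name, inserts, updates, deletes, truncated
  FROM   dba_tab_modifications
  WHERE  table_owner = 'APP'
  ORDER  BY inserts + updates + deletes DESC;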
I would think that if a table already has about 40 million rows, you could probably double (or halve) the number of rows without any change needed in execution plans.
Re: Statistics gathering approach [message #509301 is a reply to message #509294] Fri, 27 May 2011 01:40
Its_me_ved
Messages: 979
Registered: October 2009
Location: India
Senior Member
So we should check DBA_SCHEDULER_JOBS to see if there is anything wrong.

The documentation says this procedure gathers statistics when more than 10% of the data (per table) has changed since the last time statistics were collected.

So, in a table of, say, 38,900,000 rows, if 1,000,000 rows are added,
statistics need not be gathered?

Or should we also gather statistics manually to make sure that stats are up to date?

Thanks for your reply.

Regards
Ved
Re: Statistics gathering approach [message #509311 is a reply to message #509301] Fri, 27 May 2011 02:20
Michel Cadot
Messages: 68641
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Quote:
So, in a table of, say, 38,900,000 rows, if 1,000,000 rows are added,
statistics need not be gathered?

1,000,000 / 38,900,000 ≈ 2.6% modification, well below the 10% threshold => No.

Quote:
Or should we also gather statistics manually to make sure that stats are up to date?

With a 2.6% error the statistics are up to date (unless you have a very specific case of skewed data for which the default job is not appropriate; see my previous answer).
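In that skewed-data case, the usual fix is a manual gather with a histogram on the skewed column. A sketch (APP, BIG_TABLE and STATUS are placeholder names):

  BEGIN
    DBMS_STATS.GATHER_TABLE_STATS(
      ownname    => 'APP',
      tabname    => 'BIG_TABLE',
      method_opt => 'FOR COLUMNS STATUS SIZE 254',  -- histogram on the skewed column
      cascade    => TRUE);
  END;
  /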

Regards
Michel
