RE: PeopleSoft query
Date: Tue, 14 Jul 2009 23:11:40 +0100
Message-ID: <A6ADB710763A476D90FB9E0F818C7B14@GOFASTER4>
>May I know what's good about OPTIMIZER_DYNAMIC_SAMPLING = 4?
>and the effects of this on the *WRK tables
Level 4 applies ODS to some additional scenarios.
See http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/stats.htm#sthref1111
My experience is that this is generally beneficial with GP.
However, ODS is not a silver bullet. I have also found scenarios where Oracle doesn't choose the best execution plan with any level of ODS, but does if you explicitly collect statistics. In GP, there are a couple of statements that need a LEADING hint to stabilise the join order; after that Oracle generally does the right thing on its own.
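Just to illustrate (the table and column names below are invented, not real GP objects), setting the parameter and adding a LEADING hint look like this:

  ALTER SESSION SET optimizer_dynamic_sampling = 4;

  SELECT /*+ LEADING(w) */ r.emplid, r.calc_rslt_val
  FROM   ps_gp_wrk_dummy w
  ,      ps_gp_rslt_dummy r
  WHERE  r.emplid     = w.emplid
  AND    r.cal_run_id = :1;

OPTIMIZER_DYNAMIC_SAMPLING can also be set in the spfile so that it applies instance-wide; the hint goes into the stored statement itself.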
>"I would always edit the dms scripts delivered by Oracle -
>that is the source - and then reload the statement"
>-> How do you reload the statement once you have edited the DMS script
>and put some hints?
With PeopleSoft's Data Mover utility.
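As a rough sketch (the statement name and SQL below are invented), the delivered scripts use the Data Mover STORE command, so after editing the SQL you rerun the script (or just the relevant STORE command) in Data Mover and the stored statement in PS_SQLSTMT_TBL is replaced:

  STORE GPPDPRUN_S_EXAMPLE
  SELECT /*+ LEADING(w) */ r.emplid
    FROM ps_gp_wrk_dummy w
       , ps_gp_rslt_dummy r
   WHERE r.emplid = w.emplid
  ;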
>"When you get to a new version (or get a patch) do a flat file
>comparison of the new and old dms scripts."
>-> How do you make comparisons on the old and new scripts? Do you use
>some kind of diff utility to compare the lines of the new and old?
Any file comparison tool. Or diff or sdiff on Unix. But the PeopleSoft upgrade instructions will not remind you to do this. They do not anticipate that anyone would change the stored statements.
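For example, on Unix (the file names are purely illustrative):

  diff old/gppdprun.dms new/gppdprun.dms
  sdiff -s old/gppdprun.dms new/gppdprun.dms    # side-by-side, suppressing common lines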
>"This is an either or choice with GP. Either you have the
>program analyze the tables at runtime, or you remove and lock
>the stats and use ODS. If you do both GPPDPRUN will crash
>because the analyze command will error."
>-> Which is better? The problem we are having now is that the run
>times are inconsistent that optimizer really goes bad
>sometimes. How do you compute stats on SYSADM tables? how
>about indexes? and how often you rebuild? how do you collect
>histograms?
>
There is no clear-cut answer to this question. It depends.
For a relatively small system where you don't need streamed payroll processing you might want to keep things as close to the default as possible. You can adjust the Process Scheduler configuration so that you never have more than one payroll calc process running at any one time, and then you can probably stick with the payroll processes maintaining statistics on the working storage tables. You also have to accept that the Cobol program will issue an ANALYZE command, while the rest of the statistics have been produced with dbms_stats - which is not ideal. If you work out which stored statement is causing the performance problem then you might be able to hint it so that it always uses the preferred execution plan.
However, if you need any concurrent processing you will probably need to introduce Global Temporary Tables, and if you use 'payroll streams' then you will need partitioning. At that point you have to delete and lock the statistics on those tables and use ODS. The exact number of partitioned and GT tables varies with the country extensions that have been loaded; in GPUK you have around 79 global temporary working storage tables and 38 partitioned tables. You will probably need to hint some statements in this arrangement too.
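Deleting and locking the statistics is just a couple of dbms_stats calls per table (the table name below is made up; in practice you would work through the GT and partitioned working storage tables):

  BEGIN
    dbms_stats.delete_table_stats(ownname => 'SYSADM', tabname => 'PS_GP_WRK_DUMMY');
    dbms_stats.lock_table_stats  (ownname => 'SYSADM', tabname => 'PS_GP_WRK_DUMMY');
  END;
  /

With no statistics present and ODS enabled the optimizer samples the tables at parse time, which is why, as above, the payroll process must no longer be allowed to analyse them itself.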
Where I have suggested adding hints to the stored statements, you could also
use stored outlines. The SQL statements use bind variables, and so will be
the same each time. I haven't done this myself because it adds to the
administrative overhead. You have to think about upgrading your outlines in
synchronisation with upgrading the stored statements, but it is an option.
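A minimal sketch of the outline approach (the outline name, category and statement are invented for illustration):

  CREATE OR REPLACE OUTLINE gp_wrk_stmt1
  FOR CATEGORY gp_outlines
  ON SELECT r.emplid FROM ps_gp_rslt_dummy r WHERE r.cal_run_id = :1;

  ALTER SESSION SET use_stored_outlines = gp_outlines;

The category then has to be enabled (here at session level) for the outlines to be picked up.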
regards
David Kurtz
Go-Faster Consultancy Ltd.
tel: +44 (0)7771 760660
fax: +44 (0)7092 348865
mailto:david.kurtz@go-faster.co.uk
web: www.go-faster.co.uk
Book: PeopleSoft for the Oracle DBA: http://www.psftdba.com
DBA Blogs: PeopleSoft: http://blog.psftdba.com, Oracle: http://blog.go-faster.co.uk
PeopleSoft DBA Forum: http://groups.yahoo.com/group/psftdba