Home » RDBMS Server » Performance Tuning » Low performing query (10.1.0.5.0 - 64bit, AIX 6)
Low performing query [message #457287] Sun, 23 May 2010 01:23
cherry
Messages: 56
Registered: December 2007
Member
hi everyone

below is a sql that i execute 3 times a day, with 3 different parameters. for one of the parameters this query consistently takes around 90 mins to complete; for the other 2 parameters it completes in a few mins.

statistics for all the tables are up to date. i'm attaching the explain plan of the query. can anyone help in fine-tuning this?

as a side note, the 'gl_reval_account_ranges' table has a lot of ranges defined for the time-consuming parameter, in comparison to the other 2 parameters.

SELECT /*+ PARALLEL(f,2) */ c.account_type, 
       b.currency_code, 
       DECODE(account_type, 'A',1, 'L',2, 'O',3, 'R',4, 'E',5) X,
       SUM(d.begin_balance_dr - d.begin_balance_cr + d.period_net_dr - d.period_net_cr) ytd_bal,
       SUM(a.accounted_dr - a.accounted_cr) rev_gain_loss,
       gdr1.conversion_date today_date, 
       gdr1.conversion_type today_type,
       gdr1.conversion_rate today_rate,
       gdr2.conversion_date yesterday_date,
       gdr2.conversion_type yesterday_type,
       gdr2.conversion_rate yesterday_rate
  FROM gl_je_lines a, 
       gl_je_headers b, 
       gl_code_combinations c, 
       gl_balances d, 
       fnd_flex_values_vl e, 
       gl_daily_rates gdr1, 
       gl_daily_rates gdr2, 
       gl_reval_account_ranges f
WHERE a.je_header_id = b.je_header_id
  AND c.code_combination_id = a.code_combination_id
  AND b.je_source = 'Revaluation'
  AND a.set_of_books_id = :CF_SOB_ID 
  AND a.effective_date = TO_DATE(:P_Date) --'23-Aug-06'
  AND a.line_type_code IS NULL
  AND a.code_combination_id = d.code_combination_id  
  AND c.segment5 = e.flex_value					  			 			
  AND d.currency_Code= e.description
  AND d.code_combination_id = c.code_combination_id
  AND d.period_name = TO_CHAR(TO_DATE(:P_Date),'Mon-YY')
  AND d.currency_code <> 'KWD'
  AND d.set_of_books_id = a.set_of_books_id 
  AND c.chart_of_accounts_id= :P_COA_ID
  AND gdr1.from_currency=d.currency_code
  AND gdr2.from_currency=d.currency_code
  AND gdr1.from_currency=b.currency_code
  AND gdr2.from_currency=b.currency_code
  AND gdr1.to_currency='KWD'
  AND gdr2.to_currency='KWD'
  AND gdr1.conversion_date = TO_DATE(:P_Date) --'23-Aug-06'
  AND gdr2.conversion_date = TO_DATE(:P_Date) - 1 --'22-Aug-06' 
  AND c.segment1 BETWEEN f.segment1_low AND f.segment1_high 
  AND c.segment2 BETWEEN f.segment2_low AND f.segment2_high 
  AND c.segment3 BETWEEN f.segment3_low AND f.segment3_high 
  AND c.segment4 BETWEEN f.segment4_low AND f.segment4_high 
  AND c.segment5 BETWEEN f.segment5_low AND f.segment5_high 
  AND c.segment6 BETWEEN f.segment6_low AND f.segment6_high 
  AND f.revaluation_id = :P_REVAL_ID --11000 
  AND NOT EXISTS (SELECT 'x'
                    FROM gl_code_combinations gcc, gl_revaluations gr
                   WHERE gcc.segment1 IN (gr.gain_segment1, gr.loss_segment1)
                     AND gcc.segment2 IN (gr.gain_segment2, gr.loss_segment2)
                     AND gcc.segment3 IN (gr.gain_segment3, gr.loss_segment3)
                     AND gcc.segment4 IN (gr.gain_segment4, gr.loss_segment4)
                     AND gcc.segment6 IN (gr.gain_segment6, gr.Loss_segment6)
                     AND gr.set_of_books_id = :CF_SOB_ID
                     AND gcc.code_combination_id=a.code_combination_id
		  )
  AND EXISTS ( SELECT 'x'
                 FROM gl_je_batches jb, 
                      gl_je_headers jh, 
                      gl_je_lines   jl, 
                      gl_revaluations gr, 
                      gl_code_combinations gcc, 
                      fnd_flex_values_vl ffv
                WHERE jb.je_batch_id = jh.je_batch_id
                  AND jh.je_header_id = jl.je_header_id
                  AND jb.default_effective_date = TO_DATE(:P_Date) --'23-Aug-06' 
                  AND jl.line_type_code = 'GAIN_LOSS'
                  AND jh.set_of_books_id = :CF_SOB_ID
                  AND jh.set_of_books_id = gr.set_of_books_id
                  AND gr.revaluation_id = :P_REVAL_ID --11000 
                  AND ffv.description = jh.currency_code
                  AND jl.code_combination_id = gcc.code_combination_id
                  AND gcc.segment1 IN (gr.gain_segment1, gr.loss_segment1)
                  AND gcc.segment2 IN (gr.gain_segment2, gr.loss_segment2)
                  AND gcc.segment3 IN (gr.gain_segment3, gr.loss_segment3)
                  AND gcc.segment4 IN (gr.gain_segment4, gr.loss_segment4)
                  AND gcc.segment5 = ffv.flex_value
                  AND gcc.segment6 IN (gr.gain_segment6, gr.Loss_segment6)
                  AND jh.je_header_id = b.je_header_id 
		)
GROUP BY b.currency_code, 
         DECODE(account_type, 'A',1,'L',2,'O',3,'R',4,'E',5),
         c.account_type,
         gdr1.conversion_date, 
         gdr1.conversion_type, 
         gdr1.conversion_rate, 
         gdr2.conversion_date, 
         gdr2.conversion_type,
         gdr2.conversion_rate
ORDER BY X, 
         b.currency_code

i'm attaching the explain plan in pdf for better readability.
thanks for your help.
Re: Low performing query [message #457304 is a reply to message #457287] Sun, 23 May 2010 09:51
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
>statistics for all the tables are up to date. im attaching the explain plan of the query. can anyone help in fine tuning this?

Which PLAN was posted the "fast" PLAN or the "slow" PLAN?

Do the PLANS differ between fast & slow runs?
Re: Low performing query [message #457504 is a reply to message #457287] Mon, 24 May 2010 23:00
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
Query tuning usually follows the same basic process:

1) define all PK,UK,FK constraints
2) make sure these constraints are supported by an index
3) collect stats using DBMS_STATS
4) break the query into parts and incrementally add parts in so you can see the cost of each part
5) tune the most expensive parts (rewrites/index changes/histograms/alternative storage strategies)
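
Step 3, for example, might be run like this (just a sketch; the schema name 'GL' is an assumption for illustration, so adjust to your install):

```sql
-- Sketch only: owner name is an assumption.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname => 'GL',           -- assumed owner of the GL tables
    tabname => 'GL_JE_LINES',
    cascade => TRUE);          -- gather stats on the table's indexes too
END;
/
```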

Sometimes getting 10 times as much data just takes 10 times as long.

Good luck, Kevin
Re: Low performing query [message #457556 is a reply to message #457504] Tue, 25 May 2010 03:37
cherry
Messages: 56
Registered: December 2007
Member
hi, both queries, fast & slow, use the same plan.

thanks kevin for the steps. i usually follow them when i'm authoring a query. in this case, i have inherited it from another developer. though it returns the desired output, it consumes a lot of time.

rewriting the query is on my mind, but then i'd have to study the business logic & go back to the users asking them to test something that is running perfectly fine for them. in my situation, this is easier said than done.

any other suggestions?
Re: Low performing query [message #457632 is a reply to message #457556] Tue, 25 May 2010 07:55
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
I gave my suggestions. Nothing is free in our business. You have to do work when you want something. In this case you have many hours ahead of you. What kind of work? Well...

How much time does adding each correlated subquery add to the query?
How much time does adding the table F add to the query?
How much time does each incremental step in the query add?
Which step shows a huge jump in time?
Which step causes the generation of a large intermediary rowset?
What table is your query's driving table?
Have you built a query diagram for this query?
Do you know if each join is supported by an index?
How many rows does each set of parameters return in the final result and how much time does it take?
How many rows are returned from each set of parameters when you remove the GROUP BY and ORDER BY clauses?
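
The question about join-supporting indexes can usually be answered from the data dictionary; for instance (a sketch, with one of the query's tables picked as an example):

```sql
-- List the indexed columns of one joined table; repeat per table.
SELECT index_name, column_name, column_position
  FROM all_ind_columns
 WHERE table_name = 'GL_JE_LINES'
 ORDER BY index_name, column_position;
```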

You need to do some work. I don't care if you inherited it from someone else; that is the business, so get used to it. I don't care if your users will be required to re-test; you should be doing enough testing to satisfy yourself that any rewrite is semantically equivalent to the current query before you ask your users to test, and if your users can't be bothered to spend the time and effort to test your final fixes then you don't need to bother trying to make it go faster, because they obviously don't want it badly enough.

Show us what work you have done so far. If you think there was some obvious or magical quick fix, there ain't.

Kevin x79427
Re: Low performing query [message #457639 is a reply to message #457632] Tue, 25 May 2010 08:35
cherry
Messages: 56
Registered: December 2007
Member
hey kevin, cool. don't take the matter personally :) notwithstanding your monumentally detailed answer, you are in no position to understand how things work on my side. anyhow, i appreciate your response. thanks.
Re: Low performing query [message #457660 is a reply to message #457287] Tue, 25 May 2010 09:48
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
Cherry, (Dukov's redhead). I guess I should not be surprised as you never properly thanked me for guiding you to Rivet City.

Hint: insulting someone of whom you have asked a question does not usually result in your getting an answer to the question. I have nothing to lose if you don't get an answer, and who else reading this thread will bother to wade into the waters after your last reply to me?

Fact: you are not special. You might think you have a UNIQUE position but you don't. After 30 years of doing it I understand the business pretty well and the various difficulties you are working under. So do all the other moderators and experts on this forum as most of them have 20+ years in the Oracle field. Between us we have experienced all the same difficulties and feelings you are experiencing and feeling.

Observation: so far you have a) pleaded for help, b) not shown any work of your own other than posting a query and plan, c) complained about how hard it would be for you to do any actual work, d) tried to bully experts into doing your work for you.

Fortunately for you I don't take things personally anymore. I am so good at what I do that in the Oracle Seas, no one can make me feel inferior and scare me into compliance as might have been the case in my early years. So if you want to move forward with this you will have to start showing us your work.

Do you know what a query diagram is? If yes, show us yours. For this query it will take you about 15 minutes to build one and then about 1 hour to figure out how to create the diagram in a way that you can either inline it in your post, or attach it as a document. If you do not know what a query diagram is or how to build one, then say so and I will show you. Then you can spend the 75 minutes to build it and put it here.

Have you done any work timing the pieces of the query? If you have, then show us the results and give us your thoughts. If not, then get to it.

If you want anyone on this forum to put out for you then you have to first put out for us.

Kevin
Re: Low performing query [message #457665 is a reply to message #457660] Tue, 25 May 2010 10:01
Michel Cadot
Messages: 68716
Registered: March 2007
Location: Saint-Maur, France, https...
Senior Member
Account Moderator

Cherry, (Dukov's redhead).


Re: Low performing query [message #457684 is a reply to message #457287] Tue, 25 May 2010 11:30
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member

Here is my version of the query diagram. From this I can see several things:

how tables are joined
where constant tests are applied to the query and thus possible driving tables
additional separate components like correlated subqueries
where special constructs like for example outer-join might cause issues
what index strategies might make sense (multiple lines entering a single table are often candidates for a composite index)

Based on these simple items, I can plan on how to step through the query one piece at a time to determine how much time is added by each new step.

Naturally this does not show everything. All tools have their limits. In this case, I have not yet diagrammed the correlated subqueries (I leave that to you). Additionally there is no hint that a group by/order by is being done. On the other hand, this technique is valid for all databases, not just Oracle, so it has value as a standardizable practice for those of us who deal with more than one database. Additionally, it is extensible, so you can add as much information as you feel is of value, assuming you can figure out some kind of non-cluttering notation to represent it.

Try your hand at creating diagrams for the correlated subqueries.

Also, use this diagram to plan how you are going to break the query into its parts for testing. As a simplified start, use the same driving table that the plan you supplied uses.

As you can see, tuning individual SQL can take time. I have not figured out a fast way to do it. Maybe that is why I spend so much of my time making sure that the databases I build and work with meet a minimum standard set of practices. This allows me to minimize poor sql so I have less work to do overall.

Kevin
  • Attachment: SCAN0019.JPG
    (Size: 57.98KB, Downloaded 5231 times)

[Updated on: Tue, 25 May 2010 11:40]


Re: Low performing query [message #457775 is a reply to message #457684] Wed, 26 May 2010 00:19
cherry
Messages: 56
Registered: December 2007
Member
kevin,

i don't wish to respond to your 2nd last post(i'm sticking to technicalities here). if you feel offended with any of my replies, i'm extremely sorry. i did not mean to be rude to you or others here.

& thank you AGAIN for sharing your constructive thoughts.
regards
Re: Low performing query [message #457816 is a reply to message #457775] Wed, 26 May 2010 01:49
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
We are square. I was a bit harsh.

So what can you provide us as regards the timings on your query?

Were you able to break off any of its parts to see what time is added? If we are lucky you will be able to isolate some piece of the query as being much more expensive compared to the query overall. Then we can work on tuning that piece.

Is the diagram I provided of any value to you? I usually only create these diagrams for the most difficult of queries. Then I use them like a road map. For example, from the diagram and a driving table, I can easily figure out how to construct incrementally bigger queries that each add a step to the query. Then I can time each new iteration and know how much more time the addition has added. Eight times out of ten I will find one iteration that exhibits a big jump in time. Then I know where to focus tuning efforts. For example, here would be one possible start:

Get timings on this query (remember to account for caching of sql and data so as not to skew your timings)

SELECT (a.accounted_dr - a.accounted_cr) rev_gain_loss
  FROM gl_je_lines a
WHERE a.set_of_books_id = :CF_SOB_ID 
  AND a.effective_date = TO_DATE(:P_Date) --'23-Aug-06'
  AND a.line_type_code IS NULL
/


Then add the related correlated subquery and get timings on this new iteration.

SELECT (a.accounted_dr - a.accounted_cr) rev_gain_loss
  FROM gl_je_lines a
WHERE a.set_of_books_id = :CF_SOB_ID 
  AND a.effective_date = TO_DATE(:P_Date) --'23-Aug-06'
  AND a.line_type_code IS NULL
  AND NOT EXISTS (SELECT 'x'
                    FROM gl_code_combinations gcc, gl_revaluations gr
                   WHERE gcc.segment1 IN (gr.gain_segment1, gr.loss_segment1)
                     AND gcc.segment2 IN (gr.gain_segment2, gr.loss_segment2)
                     AND gcc.segment3 IN (gr.gain_segment3, gr.loss_segment3)
                     AND gcc.segment4 IN (gr.gain_segment4, gr.loss_segment4)
                     AND gcc.segment6 IN (gr.gain_segment6, gr.Loss_segment6)
                     AND gr.set_of_books_id = :CF_SOB_ID
                     AND gcc.code_combination_id=a.code_combination_id
		  )
/


Subtract the two and you know how much work the correlated subquery adds to the process. Maybe it adds most of the time. If it does, then the correlated subquery would be the thing to tune. For example, I would rewrite this correlated subquery by replacing this line with another and see what the effect is. Sometimes adding constants in correlated subqueries can be good and sometimes not.

                     AND gr.set_of_books_id = :CF_SOB_ID            -- current line

                     AND gr.set_of_books_id = a.set_of_books_id     -- alternative line


I also wonder how the IN tests in the subquery are affecting your runtimes. As I read it, you need to do 32 compares for each GR row you fetch. One alternative might be to generate the 32 different combinations of segment values from GR so that you can directly find a row on GCC. Then again, Oracle might be doing that for us already. It is just a thought; not sure if it can go anywhere. Indeed it is premature to talk about it until we know whether this correlated subquery is in fact part of the timing problem.
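To make that idea concrete, here is a sketch of the expansion reduced to two segments (the real pairs are segments 1, 2, 3, 4 and 6, giving 2^5 = 32 combinations; whether this actually helps is an open question):

```sql
-- Hypothetical rewrite sketch: turn each gain/loss IN pair into an
-- equality probe by expanding the combinations with a 2-row generator.
SELECT gcc.code_combination_id
  FROM gl_revaluations gr,
       (SELECT 'G' which FROM dual UNION ALL SELECT 'L' FROM dual) p1,
       (SELECT 'G' which FROM dual UNION ALL SELECT 'L' FROM dual) p2,
       gl_code_combinations gcc
 WHERE gr.set_of_books_id = :CF_SOB_ID
   AND gcc.segment1 = DECODE(p1.which, 'G', gr.gain_segment1, gr.loss_segment1)
   AND gcc.segment2 = DECODE(p2.which, 'G', gr.gain_segment2, gr.loss_segment2);
```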

Anyway, I think you get the idea. Keep building up the query one step at a time by adding in one table or one component (this means selected columns, from clause, where clause) and take differences to determine where all your time is going.

Please, post some query breakdowns and their times and tell us where the time is going.

Good luck, Kevin

[Updated on: Wed, 26 May 2010 01:57]


Re: Low performing query [message #457904 is a reply to message #457287] Wed, 26 May 2010 06:04
michael_bialik
Messages: 621
Registered: July 2006
Senior Member
Post TKPROF.

Re: Low performing query [message #458050 is a reply to message #457816] Thu, 27 May 2010 01:11
cherry
Messages: 56
Registered: December 2007
Member
thanks much Kevin for the details.

i'm already on the job of breaking the query into bits. your diagram is the guiding force. my suspect is the 2nd correlated sub-query in the select, where it again has to scan through 80% of the tables already being called in the main select.

as a side note: the problem/barrier on my side is that we have access only to the dev env, which is primitive, in the sense that we hardly have 10% of the data that exists in prod/test, & this data is barely meaningful as it was keyed in by the implementors for experimenting. apart from the difference in the setups, the versions of the db & apps in the dev env are entirely different from those of prod/test, as dev was never patched. since our dba's & users sit at remote locations, sometimes getting an appropriate outcome from either of them is a challenge. tuning is a time-consuming exercise, but with the above bottlenecks it becomes uninspiring.

hi @michael_bialik, i will get back asap with the details.

thanks.
Re: Low performing query [message #460790 is a reply to message #458050] Mon, 14 June 2010 17:08
mikerault
Messages: 21
Registered: June 2010
Location: Alpharetta, Ga
Junior Member
The optimizer may not be properly evaluating the last two tables in your join due to the limit of 2000 on query paths considered. Since the number of query paths is based on the factorial of the table count, it stops considering tables for optimization after about the 6th table in the from clause. At least in Oracle 9, if you moved tables from the end of the from clause to the beginning it would sometimes generate a totally different explain plan, usually better. I have not verified whether there is some magic mojo added to 10 and 11 to move the more important tables up in the hierarchy of pathway consideration, but I somehow doubt it is much different from version 9.

If it works well on some values but not on others, have you considered that you may have skewed data?
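
If skew is the issue, one thing to try is a histogram on the skewed column (a sketch; the schema and column names are assumptions, and note that with bind variables the optimizer only peeks at the first value it sees, which can blunt the benefit):

```sql
-- Sketch: gather a histogram on an assumed skewed column.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname    => 'GL',
    tabname    => 'GL_REVAL_ACCOUNT_RANGES',
    method_opt => 'FOR COLUMNS revaluation_id SIZE 254');
END;
/
```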
Re: Low performing query [message #462881 is a reply to message #460790] Mon, 28 June 2010 07:56
cherry
Messages: 56
Registered: December 2007
Member
i've been able to tune the query to make it run in under 3 mins, compared to the earlier 90 mins. i had to step through it piece by piece.

i rearranged the where clauses by first employing the columns that are indexed, followed by the un-indexed ones. also, i commented out a few conditions where the high & low values are the same, making the 'between' clause unnecessary.

thanks all for your support.

SELECT   c.account_type, 
         a.currency_code,
         DECODE (c.account_type, 'A', 1, 'L', 2, 'O', 3, 'R', 4, 'E', 5) x,
         SUM (d.begin_balance_dr
              - d.begin_balance_cr
              + d.period_net_dr
              - d.period_net_cr
             ) ytd_bal,
         SUM(b.accounted_dr - b.accounted_cr) rev_gain_loss,
         gdr1.conversion_date today_date, 
         gdr1.conversion_type today_type,
         gdr1.conversion_rate today_rate,
         gdr2.conversion_date yesterday_date,
         gdr2.conversion_type yesterday_type,
         gdr2.conversion_rate yesterday_rate
    FROM gl_je_headers a,
         gl_je_lines b,
         gl_code_combinations c,
         gl_balances d,
         gl_revaluations gr,
         gl_reval_account_ranges f,
         fnd_flex_values_vl e,
         gl_daily_rates gdr1,
         gl_daily_rates gdr2
   WHERE a.je_header_id = b.je_header_id
     AND b.code_combination_id = c.code_combination_id
     AND c.code_combination_id = d.code_combination_id
     AND gr.set_of_books_id = a.set_of_books_id
     AND gr.revaluation_id = f.revaluation_id
     AND c.segment5 = e.flex_value
     AND a.period_name = TO_CHAR (TO_DATE (:P_Date), 'Mon-YY')
     AND a.period_name = d.period_name
     AND d.currency_code <> 'KWD'
     AND gr.revaluation_id = :P_REVAL_ID
     AND a.je_source = 'Revaluation'
     AND a.set_of_books_id = :CF_SOB_ID
     AND b.line_type_code IS NULL
     AND b.effective_date = TO_DATE(:P_Date)
     AND gdr1.from_currency = d.currency_code
     AND gdr2.from_currency = d.currency_code
     AND gdr1.from_currency = a.currency_code
     AND gdr2.from_currency = a.currency_code
     AND gdr1.to_currency = 'KWD'
     AND gdr2.to_currency = 'KWD'
     AND gdr1.conversion_date = TO_DATE(:P_Date)
     AND gdr2.conversion_date = TO_DATE(:P_Date) - 1
     AND d.currency_code IN (SELECT ffv.description
                               FROM fnd_flex_values_vl ffv, fnd_flex_value_sets ffvs
                              WHERE ffvs.flex_value_set_id = ffv.flex_value_set_id
                                AND ffvs.flex_value_set_name = 'NBK_CURRENCY'
                                AND enabled_flag = 'Y'
                                AND TRUNC (SYSDATE) <= NVL (end_date_active, TRUNC (SYSDATE)))
     AND c.segment3 BETWEEN f.segment3_low AND f.segment3_high
     AND c.segment2 BETWEEN f.segment2_low AND f.segment2_high
     --AND c.segment5 BETWEEN f.segment5_low AND f.segment5_high
     --AND c.segment4 BETWEEN f.segment4_low AND f.segment4_high
     --AND c.segment1 BETWEEN f.segment1_low AND f.segment1_high
     --AND c.segment6 BETWEEN f.segment6_low AND f.segment6_high
     AND NOT EXISTS (
            SELECT 'x'
              FROM gl_code_combinations gcc, gl_revaluations gr
             WHERE gcc.segment3 IN (gr.gain_segment3, gr.loss_segment3)
               AND gcc.segment2 IN (gr.gain_segment2, gr.loss_segment2)
               AND gcc.segment4 IN (gr.gain_segment4, gr.loss_segment4)
               AND gcc.segment1 IN (gr.gain_segment1, gr.loss_segment1)
               AND gcc.segment6 IN (gr.gain_segment6, gr.loss_segment6)
               AND gr.set_of_books_id = :CF_SOB_ID
               AND gcc.code_combination_id = b.code_combination_id)
     AND EXISTS (
            SELECT 'x'
              FROM gl_je_batches jb,
                   gl_je_headers jh,
                   gl_je_lines jl,
                   gl_code_combinations gcc,
                   gl_revaluations gr,
                   fnd_flex_values_vl ffv
             WHERE jb.je_batch_id = jh.je_batch_id
               AND jh.je_header_id = jl.je_header_id
               AND jl.code_combination_id = gcc.code_combination_id
               AND jb.default_period_name = TO_CHAR (TO_DATE(:P_Date), 'Mon-YY')
               AND jb.default_effective_date = TO_DATE(:P_Date)
               AND jl.line_type_code = 'GAIN_LOSS'
               AND jh.set_of_books_id = :CF_SOB_ID
               AND jh.set_of_books_id = gr.set_of_books_id
               AND gr.revaluation_id = :P_REVAL_ID
               AND jh.currency_code = ffv.description
               AND gcc.segment3 IN (gr.gain_segment3, gr.loss_segment3)
               AND gcc.segment2 IN (gr.gain_segment2, gr.loss_segment2)
               AND gcc.segment5 = ffv.flex_value
               --AND gcc.segment4 IN (gr.gain_segment4, gr.loss_segment4)
               --AND gcc.segment1 IN (gr.gain_segment1, gr.loss_segment1)
               --AND gcc.segment6 IN (gr.gain_segment6, gr.loss_segment6)
               AND jh.je_header_id = a.je_header_id)
GROUP BY c.account_type,
         a.currency_code,
         DECODE (c.account_type, 'A', 1, 'L', 2, 'O', 3, 'R', 4, 'E', 5),
         gdr1.conversion_date, 
         gdr1.conversion_type, 
         gdr1.conversion_rate, 
         gdr2.conversion_date, 
         gdr2.conversion_type,
         gdr2.conversion_rate
ORDER BY X, 
         a.currency_code

Re: Low performing query [message #462906 is a reply to message #462881] Mon, 28 June 2010 10:35
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
Way to go Cherry. What an inspiring improvement.

Thank you for sharing your solution. Your willingness to come back with how you fixed it is the kind of sharing that makes our website work.

Kevin
Re: Low performing query [message #471311 is a reply to message #462906] Fri, 13 August 2010 05:08
khailey
Messages: 3
Registered: June 2007
Location: portland, oregon
Junior Member
awesome - I love the breakdown in the query diagram.
The query diagram can be a lot of work, so I've been designing functionality in a product called DB Optimizer at Embarcadero to do it automatically. Would be super curious to know what you think, Kevin, and maybe discuss some ideas.
Here is my blog:
db-optimizer dot blogspot dot com
and a link to a 10 minute video extract on visual sql tuning diagrams. The extract is from a webinar that Jonathan Lewis and I collaborated on:

Explanation of Visual SQL Tuning on video here
db-optimizer dot blogspot dot com/2010/06/jonathan-lewis-webinar-replay.html

Best
Kyle
Re: Low performing query [message #471312 is a reply to message #471311] Fri, 13 August 2010 05:09
khailey
Messages: 3
Registered: June 2007
Location: portland, oregon
Junior Member
ps sorry for the URLs with "dot" instead of "." but the URLs were flagged, so I had to modify them
Re: Low performing query [message #471313 is a reply to message #471312] Fri, 13 August 2010 05:11
cookiemonster
Messages: 13958
Registered: September 2008
Location: Rainy Manchester
Senior Member
That's because you're a new member. We do that to stop spambots.
Re: Low performing query [message #471315 is a reply to message #471312] Fri, 13 August 2010 05:20
khailey
Messages: 3
Registered: June 2007
Location: portland, oregon
Junior Member
I found your query diagram example by googling "query diagram". Your example is about the only interesting thing I found on query diagrams other than a couple of references to Dan Tow's work. It's intriguing to me that query diagramming is apparently so rarely discussed. Am I missing something? I've been working on Oracle for 20 years and have hardly heard a peep about diagramming queries.

According to one of my seminar attendees there is a product for DB2 that does query diagrams, but I haven't found anything on query diagrams from DB2. The only place I've seen query diagrams formally talked about is Dan Tow's book SQL Tuning. Jonathan Lewis and I recently did a webinar on the "visual approach" to SQL tuning where we laid out parts of a formal approach, and Stephane Faroult has some examples in his excellent book "Refactoring SQL Applications". Stephane lays out some excellent visual analysis scenarios, but he doesn't really formalize the process of diagramming the query. I've been formalizing sql diagramming bit by bit as part of creating the specifications for the developers of DB Optimizer.
Re: Low performing query [message #471365 is a reply to message #457287] Fri, 13 August 2010 11:16
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
You are correct. It is not talked about very much. I like them because they give some clarity on what a query "looks like". As you point out, they are "visualization" tools.

I am going on vacation today so I won't be able to get back to you till the 23-aug.

I would be honored to assist in any way I can.

I first got introduced to query diagrams back in 1990 when I read a book called The Ingres Papers. This was a great read by my standards and even though it is pretty old, it isn't actually dated, as everything in the book is still relevant in some way. There was a chapter on query diagrams. Dan Tow later created his book to show people the basics of optimization and how to follow two basic rules:

1) from where you are currently in a query, where can you go next
2) which potential next step serves to keep the intermediary result smallest

Thus the basic driver of Dan's work is predominantly to keep intermediary result sets to as few rows as possible as you go. This is a very good rule. I would call it an 80% rule, as most queries will yield a great if not the best plan using it.

For the other 20% (and this percentage is increasing), new plan methods (hash join etc.) offer different advantages and are hard to consider at a casual level. This is another reason Dan's method works for most people: it does not require a Master's Degree in Math to understand, and thus is available to the masses.

With the advent of SOX and other "overheads???" the industry is starting to see the emergence of additional documentation needs. Query Diagrams are among these. When doing query diagrams I note there are at least two variations:

1) standard ERD
2) augmented ERD (multiple references to the same table have their own boxes).

The first shows how the query fits to the data model.
The second allows for plan analysis on the query.

There are a lot of ways to get the first. No automated way to get the second. If you are adding the functionality of the second diagram in addition to the first, you will be creating a unique tool.

Send me an email if you would like to discuss further. I will check out your URLs. Again, I will be out until 23-Aug.

Looking forward to it. Kevin