DECODE in WHERE CLAUSE Performance (Oracle 11g, Windows XP)
DECODE in WHERE CLAUSE Performance [message #527327]
Mon, 17 October 2011 09:53
ninan (Senior Member)
Messages: 163, Registered: June 2011, Location: Noida
The following query gets its input parameters from the front-end application, which users query to get reports.
There are many drop-down boxes, such as LOB, FAMILY, BRAND, etc.
The user may or may not select values from the drop-down boxes.
If the user selects one or more values against a drop-down box, the query has to fetch all matching rows from the DB. If the user doesn't select any value, it has to fetch all the records; in this case the application sends the value 'DEFAULT' (which is not a value in the DB) so that the DB will fetch all the records.
To achieve this I wrote the query below using DECODE, which a colleague suggested will hamper performance.
In the query below, all the V_ variables are defined in the procedure and receive the values selected by the user as comma-separated strings; here V_SELLOB is such a variable and LOB_DESC is a column in the DB:
DECODE (V_SELLOB, 'DEFAULT', V_SELLOB, LOB_DESC) IN
Can anyone suggest an alternative to this usage?
OPEN v_refcursor FOR
SELECT /*+ FULL(a) PARALLEL(a, 5) */
*
FROM items a
WHERE a.sku_status = 'A'
AND a.RPT_FLAG =
DECODE (V_COSTTYPE, '1', 'U', '2', 'O', '4', 'O')
AND DECODE (V_SELBU, '-1', V_SELBU, BU_ID) IN
(SELECT *
FROM THE (
SELECT CAST (
item_comma_to_tab (V_SELBU) AS item_commatotab_type)
FROM DUAL))
AND DECODE (V_SELLOB, 'DEFAULT', V_SELLOB, LOB_DESC) IN
(SELECT *
FROM THE (
SELECT CAST (
item_comma_to_tab (V_SELLOB) AS item_commatotab_type)
FROM DUAL))
AND DECODE (V_SELBRAND, 'DEFAULT', V_SELBRAND, BRAND) IN
(SELECT *
FROM THE (
SELECT CAST (
item_comma_to_tab (V_SELBRAND) AS item_commatotab_type)
FROM DUAL))
AND DECODE (V_SELFP, 'DEFAULT', V_SELFP, FAMILY_PARENT) IN
(SELECT *
FROM THE (
SELECT CAST (
item_comma_to_tab (V_SELFP) AS item_commatotab_type)
FROM DUAL))
AND DECODE (V_SELFAMILY, 'DEFAULT', V_SELFAMILY, FAMILY_DESC) IN
(SELECT *
FROM THE (
SELECT CAST (
item_comma_to_tab (V_SELFAMILY) AS item_commatotab_type)
FROM DUAL));
[Updated on: Mon, 17 October 2011 09:56]
Re: DECODE in WHERE CLAUSE Performance [message #527333 is a reply to message #527332]
Mon, 17 October 2011 10:38
cookiemonster (Senior Member)
Messages: 13958, Registered: September 2008, Location: Rainy Manchester
The problem with using OR in a query is the same as with using functions in a query: both can prevent index usage.
UNION ALL is unlikely to be an improvement.
If performance really is a problem (and it might not be), then the best solution would probably be to open a different query depending on the parameters. So you'd have a separate query for each combination of supplied parameters, each checking only the relevant ones. No ORs or DECODEs.
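A minimal sketch of that idea, assuming only two of the five optional parameters for brevity; the variable and cursor names mirror the original procedure, but the branch structure is purely illustrative:

if v_sellob = 'DEFAULT' and v_selbrand = 'DEFAULT' then
   -- user picked nothing: no optional predicates at all
   open v_refcursor for
      select *
        from items a
       where a.sku_status = 'A'
         and a.rpt_flag = decode(v_costtype, '1', 'U', '2', 'O', '4', 'O');
elsif v_selbrand = 'DEFAULT' then
   -- only LOB chosen: filter on lob_desc alone, so an index on it stays usable
   open v_refcursor for
      select *
        from items a
       where a.sku_status = 'A'
         and a.rpt_flag = decode(v_costtype, '1', 'U', '2', 'O', '4', 'O')
         and a.lob_desc in
             (select *
                from the (select cast (item_comma_to_tab(v_sellob)
                                       as item_commatotab_type)
                            from dual));
else
   null;  -- one branch per remaining combination, omitted here
end if;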
Re: DECODE in WHERE CLAUSE Performance [message #528666 is a reply to message #527537]
Wed, 26 October 2011 06:01
Kevin Meade (Senior Member)
Messages: 2103, Registered: December 1999, Location: Connecticut USA
Your problem here is a common one. You have what is essentially a generic query, and you have no clue what criteria people will enter. Thus you have written a worst-case query and will pay the performance penalty all the time. Indeed, your hint says to do the same full table scan regardless of what criteria have been entered, so at this point it does not really matter what you do. Additionally, the DECODEs as you have them will disable indexes on these columns anyway.
If you really want the best query performance, then consider using DYNAMIC SQL. Construct a query with a WHERE clause that matches the criteria actually supplied, then open it. Include the FULL/PARALLEL hints as you desire, but only for the cases where you don't want to use indexes. Since this is DYNAMIC SQL, I am not sure whether bind variables will make a difference with respect to parsing. DYNAMIC SQL will put a limit on the scalability of your app, but I would suggest no worse a limit than doing a full table scan every time.
This should give you some idea:
set serveroutput on

declare
   vsql     varchar2(32000);
   vdummy   varchar2(1) := 'X';
   c1       sys_refcursor;
   vdummy_2 varchar2(1);
begin
   vsql := 'select * from dual where dummy = :1';
   open c1 for vsql using vdummy;   -- dynamic statement text, bound at open time
   fetch c1 into vdummy_2;
   close c1;
   dbms_output.put_line('vdummy_2 = '||vdummy_2||'...');
end;
/

vdummy_2 = X...

PL/SQL procedure successfully completed.
The point is that you have five subqueries here that will always be executed, no matter what the user enters, so you will always pay a performance hit for all of them. Using DYNAMIC SQL, you can build a query that executes only the ones you need based on what was entered.
HOWEVER... in your case I am not sure how much difference any of this will make. The nature of your subqueries is that they do not go to the database for data; you are just unpacking a comma-delimited string that was given to you as input. There is no real I/O involved in these subqueries, so there is not much to optimize there.
You should look deeper into the statistics of your data and answer these questions:
1) What is the number of rows in items?
2) What is the number of rows in items for SKU_STATUS = 'A' and each value of RPT_FLAG?
3) How much does each additional parameter from your delimited strings improve the filtering (row count again)? Consider that you will not be able to use more than one index as an access path to the data even if the user enters multiple values, unless you use BITMAP indexes; but don't go there unless you really have the right kind of system for them.
Your goal with these questions is to understand how well each scenario filters and, based on that, to get some idea of whether there is any tuning you can do at all and where it makes sense. For example, if you have excellent filtering on every one of the five related attributes, then you should likely have five indexes of the form (SKU_STATUS, RPT_FLAG, <specific column>).
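For instance (the index names are placeholders, not anything from your system):

create index items_sts_flag_bu_idx  on items (sku_status, rpt_flag, bu_id);
create index items_sts_flag_lob_idx on items (sku_status, rpt_flag, lob_desc);
create index items_sts_flag_brd_idx on items (sku_status, rpt_flag, brand);
create index items_sts_flag_fp_idx  on items (sku_status, rpt_flag, family_parent);
create index items_sts_flag_fam_idx on items (sku_status, rpt_flag, family_desc);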
Also, you should consider removing the function calls and instead using inline SQL with an alternative way of unpacking the delimited string. If, for example, your delimited string 1) contains no dups and 2) has no noise between items, then you can do something like this:
select a.*
from (
      -- unpack the comma-delimited V_SELBU into rows, one element per row
      select substr(
                ','||V_SELBU||','
               ,instr(','||V_SELBU||',',',',1,rownum)+1
               ,instr(','||V_SELBU||',',',',1,rownum+1)-instr(','||V_SELBU||',',',',1,rownum)-1
             ) bu_id
      from dual
      connect by level <= length(V_SELBU)-length(replace(V_SELBU,','))+1  -- element count = commas + 1
     ) b
   , items a
where a.bu_id = b.bu_id
and a.sku_status = 'A'
and a.rpt_flag = decode(V_COSTTYPE, '1', 'U', '2', 'O', '4', 'O')
/
create index items_sts_flag_bu_idx on items (sku_status, rpt_flag, bu_id)
/
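As a quick sanity check of the substr/instr logic, here is the same expression run against the literal list 'ABC,DEF,GHI' (a sample value made up for illustration):

select substr(',ABC,DEF,GHI,'
             ,instr(',ABC,DEF,GHI,',',',1,level)+1
             ,instr(',ABC,DEF,GHI,',',',1,level+1)-instr(',ABC,DEF,GHI,',',',1,level)-1
             ) bu_id
  from dual
connect by level <= 3;

-- returns three rows: ABC, DEF, GHI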
This does not require a function call in your query to unpack the data, which removes possible context-switching overhead. It also converts your lookup into a simple join, which again is only valid if the input variable contains no dups; but you should be able to control that. It also will only work if this is a true delimited string with nothing extra. For example:
ABC,DEF,GHI works fine and is my guess at what you have
"ABC","DEF","GHI" won't work unless you account for the double quotes in the substr
ABC, DEF, GHI won't work unless <space>DEF and <space>GHI are the values you want
But again, your app controls what is in the variables, so none of this should be an issue for you. This rewrite will allow the use of indexes against your filter columns. I have not tested the syntax of the above; I leave that to you.
Lastly, as long as the criteria filter the data down to a small enough percentage: if your buddy says he won't create five indexes for you, you can create just one with all the attributes in it. It won't be as efficient as five indexes each tuned for a specific situation, but it will allow you to filter data before heading to the table, and it will let you filter on multiple criteria at once when your users select more than one, which you can't do with the five individual indexes.
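For example (again, the index name is just a placeholder):

create index items_all_filters_idx on items
   (sku_status, rpt_flag, bu_id, lob_desc, brand, family_parent, family_desc);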
So in summary:
1) use DYNAMIC SQL to build a query with only the pieces you need each time (see the sketch after this list).
2) build a query using a different unpacking method so that you can remove the references to the table functions and have a query whose syntax allows the use of indexes.
3) make sure you understand the filtering potential of your input criteria and create indexes appropriately.
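A rough sketch of point 1, reusing the variable names from the original procedure; the string building is illustrative only, and since your app controls the parameter contents the spliced-in literals should be safe, but validate them anyway:

vsql := 'select * from items a where a.sku_status = ''A'''
     || ' and a.rpt_flag = decode(:ct, ''1'', ''U'', ''2'', ''O'', ''4'', ''O'')';
if v_sellob != 'DEFAULT' then
   -- append this predicate only when the user actually picked a LOB
   vsql := vsql
        || ' and a.lob_desc in (select * from the (select cast ('
        || 'item_comma_to_tab(''' || v_sellob || ''') as item_commatotab_type) from dual))';
end if;
-- same pattern for v_selbu, v_selbrand, v_selfp, v_selfamily
open v_refcursor for vsql using v_costtype;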
Good luck, not an easy nut to crack. Kevin
[Updated on: Wed, 26 October 2011 06:15]