Odd Parallelism Behavior (11.2.0.4) when DEGREE=DEFAULT and INSTANCES=DEFAULT
From: Chris Taylor <christopherdtaylor1994_at_gmail.com>
Date: Tue, 15 Mar 2016 13:05:41 -0500
Message-ID: <CAP79kiQ+RJoa8aWVwPCQ+1XaSdiCfE71xfJEM8vuwhd+=sc6nw_at_mail.gmail.com>
Today we've had some queries suddenly experience terrible performance degradation. The execution plan showed that parallelism was being used but wasn't specified explicitly in the query.
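For reference, we spotted it from the PX operations (PX COORDINATOR / PX SEND / PX BLOCK ITERATOR) in the plan output. I pulled the plan with something along these lines (the sql_id is just a placeholder):

    -- show the cursor's plan; parallel steps show up as PX operations
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR('<sql_id>', NULL, 'TYPICAL'));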
I found that 3 indexes were attempting to use parallel processing, so I looked at their definitions and found that the DEGREE and INSTANCES settings were "DEFAULT" instead of an explicit "1".
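In case anyone wants to check their own systems, this is roughly what I ran to find them and set them back to serial (the index name below is made up):

    -- find indexes whose DEGREE or INSTANCES is set to DEFAULT
    SELECT owner, index_name, degree, instances
    FROM   dba_indexes
    WHERE  TRIM(degree) = 'DEFAULT'
       OR  TRIM(instances) = 'DEFAULT';

    -- reset a specific index back to serial
    ALTER INDEX my_schema.my_index NOPARALLEL;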
So I fixed that, but what I'm trying to understand is this: how does Oracle determine the default degree of parallelism - what controls/influences the "DEFAULT" degree?
I thought the default was 1, but obviously that isn't true. Is there some way I can find the calculated DEFAULT degree of parallelism that Oracle might be using?
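The only thing I've looked at so far are the instance parameters that I assume feed into the calculation - my understanding is that with DEGREE=DEFAULT the DOP works out to roughly CPU_COUNT x PARALLEL_THREADS_PER_CPU per instance, but I'd like to confirm that:

    -- parameters that (I believe) drive the calculated default DOP
    SELECT name, value
    FROM   v$parameter
    WHERE  name IN ('cpu_count',
                    'parallel_threads_per_cpu',
                    'parallel_max_servers',
                    'parallel_degree_policy');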
Thanks,
Chris
--
http://www.freelists.org/webpage/oracle-l

Received on Tue Mar 15 2016 - 19:05:41 CET