Re: Updating 100 M rows table

From: Sanjay Mishra <smishra_97_at_yahoo.com>
Date: Fri, 22 Jan 2010 13:52:42 -0800 (PST)
Message-ID: <175736.17864.qm_at_web51306.mail.re2.yahoo.com>

Thanks, Kellyn. I am not that Sanjay. This is a data warehouse, and thanks for the suggestion. I am working from the Ask Tom link that Josh sent and trying a few of the approaches.

SM


________________________________
From: Kellyn Pedersen <kjped1313_at_yahoo.com>
To: "smishra_97_at_yahoo.com" <smishra_97_at_yahoo.com>; "oracle-l_at_freelists.org" <oracle-l_at_freelists.org>; Josh.Collier@banfield.net
Sent: Thu, January 21, 2010 6:22:32 PM
Subject: RE: Updating 100 M rows table


Now if you are the same Sanjay Mishra who co-authored a book on parallel execution, I know for a fact that we do NOT have to go over that topic with you... I read that book when I was researching a white paper and presentation... :)

Is this an OLTP or OLAP environment? (Some folks might think I'm crazy for asking that, but I have a 2TB OLTP, so I will... :))

If you are just looking for a couple of options to use with your current knowledge:
If you have the ability to "quiet" this table and it is partitioned, then performing a CTAS of each affected partition, applying the update in the new table, and exchanging the partition back in with an UPDATE INDEXES clause is pretty easy to do, unless there is a massive number of partitions.  A rough sketch is below.
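
Something along these lines, just to show the shape of it (hypothetical names: BIG_TAB, partition P_2010_01, and a STATUS column, not your actual schema; you would also need matching local indexes on the new table before the exchange):

-- 1. Build a corrected copy of one partition, NOLOGGING and in parallel.
CREATE TABLE big_tab_p_2010_01
NOLOGGING PARALLEL 8
AS
SELECT /*+ PARALLEL(t, 8) */
       t.id,
       CASE WHEN t.status = 'OLD' THEN 'NEW' ELSE t.status END AS status,  -- the "update"
       t.created_dt
FROM   big_tab PARTITION (p_2010_01) t;

-- 2. Create matching local index copies on the new table (statements omitted),
--    then swap it in and maintain the global indexes.
ALTER TABLE big_tab
  EXCHANGE PARTITION p_2010_01
  WITH TABLE big_tab_p_2010_01
  INCLUDING INDEXES
  WITHOUT VALIDATION
  UPDATE GLOBAL INDEXES;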

A table redefinition: creating the new table, performing the update on the new table, then syncing up and getting rid of the original.  I've never done this myself, but the DBA I work with loves this option.  A rough DBMS_REDEFINITION sketch is below.
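
Roughly like this with DBMS_REDEFINITION (hypothetical names SCOTT.BIG_TAB and an interim table BIG_TAB_INT that you create first with the desired structure; the "update" is folded into the column mapping):

DECLARE
  l_errors PLS_INTEGER;
BEGIN
  -- verify the table can be redefined (by primary key, the default)
  DBMS_REDEFINITION.CAN_REDEF_TABLE('SCOTT', 'BIG_TAB');

  -- start the redefinition; the column mapping applies the change while rows are copied
  DBMS_REDEFINITION.START_REDEF_TABLE(
    uname       => 'SCOTT',
    orig_table  => 'BIG_TAB',
    int_table   => 'BIG_TAB_INT',
    col_mapping => 'id id, CASE WHEN status = ''OLD'' THEN ''NEW'' ELSE status END status');

  -- clone indexes, constraints, triggers and grants onto the interim table
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS(
    uname      => 'SCOTT',
    orig_table => 'BIG_TAB',
    int_table  => 'BIG_TAB_INT',
    num_errors => l_errors);

  -- catch up with changes made during the copy, then swap the tables
  DBMS_REDEFINITION.SYNC_INTERIM_TABLE('SCOTT', 'BIG_TAB', 'BIG_TAB_INT');
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('SCOTT', 'BIG_TAB', 'BIG_TAB_INT');
END;
/
-- BIG_TAB_INT now holds the old data and can be dropped.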

I know this is always up for debate, but freelists and initrans.  I've actually been able to gain the same performance on huge updates by hiking the freelists and initrans as I would have with nologging (a Data Guard environment where I got tired of rebuilding the secondary after this monthly load process).  When doing so, you must remember to match the new settings on the indexes as well.  This may give you added speed on top of nologging, though!  A rough sketch of the DDL is below.
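
Something like this (hypothetical names BIG_TAB and BIG_TAB_IX1; keep in mind FREELISTS only matters under manual segment space management, ASSM ignores it):

ALTER TABLE big_tab INITRANS 8 STORAGE (FREELISTS 8);
ALTER INDEX big_tab_ix1 INITRANS 8 STORAGE (FREELISTS 8);

-- The new settings only affect blocks formatted from now on; to push INITRANS
-- into existing blocks you would have to rebuild the segments, e.g.:
ALTER TABLE big_tab MOVE;          -- rewrites the table blocks (its indexes go UNUSABLE)
ALTER INDEX big_tab_ix1 REBUILD;   -- rewrites the index blocks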

Hope this helps,

Kellyn Pedersen
Multi-Platform DBA
I-Behavior Inc.
http://www.linkedin.com/in/kellynpedersen
www.dbakevlar.blogspot.com

"Go away before I replace you with a very small and efficient shell script..."

--- On Thu, 1/21/10, Josh Collier <Josh.Collier_at_banfield.net> wrote:


>From: Josh Collier <Josh.Collier_at_banfield.net>
>Subject: RE: Updating 100 M rows table
>To: "smishra_97_at_yahoo.com" <smishra_97_at_yahoo.com>, "oracle-l_at_freelists.org" <oracle-l@freelists.org>
>Date: Thursday, January 21, 2010, 2:35 PM
>
>
>Asktom
> 
>http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:6407993912330 
>  
>From: oracle-l-bounce_at_freelists.org [mailto:oracle-l-bounce_at_freelists.org] On Behalf Of Sanjay Mishra
>Sent: Thursday, January 21, 2010 1:33 PM
>To: oracle-l_at_freelists.org
>Subject: Updating 100 M rows table
>  
>Can anyone share some experience with big updates? I have a table with 30 columns (no LOB or LONG) in 10g, and it has a few indexes. I need to update 20 million records out of 100 million; what is the best approach? We can use NOLOGGING, as there is no DR associated with this database and it is not in FORCE LOGGING mode.
> 
>TIA
>Sanjay
>  

--
http://www.freelists.org/webpage/oracle-l
Received on Fri Jan 22 2010 - 15:52:42 CST
