Update issue with high volume table [message #135077]
Mon, 29 August 2005 16:05
Nits
Friends,
Our data warehouse application issues an update against a table that has 2 billion rows (more than one terabyte) using the primary key. For some reason we cannot do partition elimination, and I want to know what performance problems to expect from this DML activity. A rough sketch of the statement shape is below.
I would really appreciate your feedback on this issue.
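To give a rough idea, the statement is of this shape (the table and column names here are made up, since I cannot post our actual schema):

UPDATE fact_sales                    -- hypothetical target table, ~2 billion rows, partitioned
   SET net_amount = :new_amount
 WHERE sale_id    = :sale_id;        -- primary-key predicate only; the partition key is not
                                     -- referenced, so no partition elimination is possible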
thanks,
Nits
Re: Update issue with high volume table [message #135229 is a reply to message #135208]
Tue, 30 August 2005 09:23
Nits
Actually, we have a job that updates the target table (2 billion rows in production) using the primary key. The bulk update (against 40 million rows at present) takes around 3 seconds to process each batch of 10,000 rows. I would like your feedback on how the update will behave when we reach 2 billion rows in production without partition elimination: is it OK to update a 2 billion row table by primary key without doing partition elimination? A rough sketch of how the job batches the update is below.
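For reference, the job works roughly like this (the table, column, and variable names are made up; this is only a sketch of the batching approach, not our production code): it fetches the keys and new values in batches of 10,000 and issues a bulk update by primary key.

DECLARE
  CURSOR c_changes IS
    SELECT sale_id, new_amount          -- hypothetical staging table with the changed rows
      FROM stg_amount_changes;
  TYPE t_ids  IS TABLE OF stg_amount_changes.sale_id%TYPE;
  TYPE t_amts IS TABLE OF stg_amount_changes.new_amount%TYPE;
  l_ids  t_ids;
  l_amts t_amts;
BEGIN
  OPEN c_changes;
  LOOP
    -- fetch the next batch of 10,000 changed rows
    FETCH c_changes BULK COLLECT INTO l_ids, l_amts LIMIT 10000;
    EXIT WHEN l_ids.COUNT = 0;

    -- bulk update of the target table by primary key;
    -- sale_id is the PK but not the partition key, so no partition pruning
    FORALL i IN 1 .. l_ids.COUNT
      UPDATE fact_sales
         SET net_amount = l_amts(i)
       WHERE sale_id    = l_ids(i);

    COMMIT;   -- commit per batch (roughly the ~3 seconds per 10,000 rows mentioned above)
  END LOOP;
  CLOSE c_changes;
END;
/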
thanks,
nits