Hybrid Columnar Compression [message #578770]
Mon, 04 March 2013 18:52 |
pzlj6x
Messages: 107 | Registered: May 2005 | Location: Louisville
Senior Member
We have a denormalized table with an average row length of about 300 bytes. The table is range-partitioned on a date column and hash-subpartitioned on an ID field, and it is expected to contain over 1 trillion records. It is insert-only; no updates.
Question:
Can we write data into the table while it is defined for HCC high compression at the partition level? Or does the current partition have to be uncompressed for inserts? Can someone please help explain the compression methodology that can be applied?
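For reference, a rough sketch of the kind of DDL we mean (all names are made up; COMPRESS FOR QUERY HIGH is just one of the HCC levels, and HCC requires Exadata or other HCC-capable storage):

-- Hypothetical sketch only: ~300-byte rows, range partitions on a date
-- column, hash subpartitions on an id, HCC declared at table level so
-- new partitions inherit it.
CREATE TABLE sales_fact (
  sale_date  DATE          NOT NULL,
  sale_id    NUMBER        NOT NULL,
  payload    VARCHAR2(300)
)
COMPRESS FOR QUERY HIGH
PARTITION BY RANGE (sale_date)
SUBPARTITION BY HASH (sale_id) SUBPARTITIONS 16
( PARTITION p2013_03 VALUES LESS THAN (DATE '2013-04-01'),
  PARTITION p2013_04 VALUES LESS THAN (DATE '2013-05-01')
);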
Thx,
R.
Re: Hybrid Columnar Compression [message #578784 is a reply to message #578770]
Tue, 05 March 2013 01:55 |
John Watson
Messages: 8962 | Registered: January 2010 | Location: Global Village
Senior Member
Direct loads will be HCC compressed; conventional inserts will be OLTP compressed. I think it becomes clear when you think it through: the HCC block format is different (a row is distributed throughout the blocks of a compression unit), so the compression can't be done in the buffer cache, only in the PGA.
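To illustrate, a sketch using made-up table names (sales_fact and sales_staging are hypothetical; the DBMS_COMPRESSION constants noted are the 11.2 values):

-- Direct-path insert: bypasses the buffer cache; rows are packed into HCC
-- compression units in the PGA and written above the high-water mark.
INSERT /*+ APPEND */ INTO sales_fact
SELECT sale_date, sale_id, payload FROM sales_staging;
COMMIT;

-- Conventional insert: goes through the buffer cache, so the row can be
-- OLTP-compressed at best, never HCC-compressed.
INSERT INTO sales_fact VALUES (DATE '2013-03-04', 42, 'row payload');
COMMIT;

-- Verify how a given row actually ended up stored
-- (in 11.2: 1 = none, 2 = OLTP, 4 = QUERY HIGH, ...).
SELECT DBMS_COMPRESSION.GET_COMPRESSION_TYPE(USER, 'SALES_FACT', ROWID)
       AS comp_type
FROM   sales_fact
WHERE  ROWNUM = 1;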
By the way, if you find this useful, a fair return would be details of the compression ratios you are achieving.
--
John Watson
Oracle Certified Master DBA
http://skillbuilders.com