Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Re: Capacity planning for VERY large table
Thanx for responding.
Here are the answers to your questions:
I'm trying to convince the application designers to keep only the last 10% of the data (maybe even less). Still, the data volumes are going to be impressive.
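Even at 10% retention the volumes stay large. A quick back-of-the-envelope check (a sketch using the figures from this thread; the 60-byte row length is from the original post, the Oracle overhead factor is not modelled here):

```python
# Rough capacity arithmetic for the table discussed in this thread.
ROWS = 25 * 10**9        # 25,000,000,000 rows (from the original post)
ROW_BYTES = 60           # raw row length (from the original post)

raw_tb = ROWS * ROW_BYTES / 10**12
print(f"raw data: {raw_tb:.1f} TB")          # 1.5 TB before Oracle overhead

kept_rows = ROWS // 10   # keep only the most recent 10%
kept_gb = kept_rows * ROW_BYTES / 10**9
print(f"after 10% retention: {kept_rows:,} rows, ~{kept_gb:.0f} GB raw")
```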
Thanx again to all "respondees". Michael.
In article <969958077.2130.1.nnrp-12.9e984b29_at_news.demon.co.uk>,
"Jonathan Lewis" <jonathan_at_jlcomp.demon.co.uk> wrote:
>
>
> Interesting challenge.
>
> Do all 25 columns always appear in all queries ?
> Do the typical ranges for each predicate cover about 20 values ?
> Are the values 0..255 used fairly evenly in any given column ?
> Is the distribution of values in a column highly clustered, or very
> scattered ?
> Is this very static data so that it can be bitmap indexed ?
>
> When the smoke has died down, how many rows should
> a 'typical' query return ?
>
> Arithmetic:
> 256 values per column to cover 25,000,000,000 rows
> Selecting on a range of 21 values
> => one column in predicate targets 10% of the data
>
> To reduce the target data to 25 rows, (for sub-second
> access - assume 25 rows = 25 physical disc reads)
> we need to apply the predicates for 9 columns.
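The selectivity arithmetic above can be checked quickly (a sketch: each BETWEEN predicate covers 21 of the 256 possible values, which the estimate above rounds to 10%):

```python
# Check of the selectivity estimate from the post above.
ROWS = 25 * 10**9
per_column = 21 / 256            # ~8.2% of rows per predicate, rounded to 10% above

approx = ROWS * 0.10**9          # nine predicates at the rounded 10%
exact = ROWS * per_column**9     # nine predicates at the exact 21/256

print(f"rounded estimate: {approx:.0f} rows")   # 25 rows
print(f"exact estimate:   {exact:.1f} rows")    # ~4 rows
```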
>
> Estimate of size of bitmap,
> assuming poor compression: one byte per row
> assuming terrific compression: one bit per row
>
> In a good case, each bitmap index would be 3GB,
> in a poor case 25GB - and you need to keep about
> 10% of nine of them in memory to achieve sub-second
> responses.
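Those index sizes follow directly from the row count (a sketch; "one bit per row" and "one byte per row" are the compression bounds assumed above):

```python
# Bitmap index size bounds for 25 billion rows, per the estimate above.
ROWS = 25 * 10**9

best_gb = ROWS / 8 / 10**9    # one bit per row  -> ~3.1 GB per index
worst_gb = ROWS / 10**9       # one byte per row -> 25 GB per index

# Keeping ~10% of nine such indexes in memory for sub-second response:
cache_best = 9 * 0.10 * best_gb     # ~2.8 GB
cache_worst = 9 * 0.10 * worst_gb   # 22.5 GB

print(f"per index: {best_gb:.1f} GB - {worst_gb:.0f} GB")
print(f"memory for 10% of nine: {cache_best:.1f} GB - {cache_worst:.1f} GB")
```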
>
> Even if Oracle uses a perfect range reduction
> algorithm, the very first step of the query has
> to process that 3GB.
>
> I think you need much tighter queries, with predicates
> on columns with far higher cardinality before you can
> hope to get high-speed queries.
>
> --
>
> Jonathan Lewis
> Yet another Oracle-related web site: http://www.jlcomp.demon.co.uk
>
> Michael Bialik wrote in message <8qoc4j$cfc$1_at_nnrp1.deja.com>...
> >Hi.
> >
> > I have a very large table ( 25 * 10**9 rows = 25,000,000,000 rows ).
> > The row length is 60 bytes, so the table size is 1.5 TB - 2.0 TB
> > ( including Oracle overhead ).
> >
> > The table contains 24 numeric fields and 1 description.
> > Each numeric field ( F01, F02, ... F24 ) may contain values
> > from 0 to 255 only.
> >
> > The SELECT is going to look like that:
> >
> > SELECT description FROM large_table
> > WHERE
> > F01 between 24 and 44 AND
> > F02 between 66 and 86 AND
> > F03 between 1 and 21 AND
> > ...
> > F24 between 235 and 255.
> >
> > Has anybody worked with such big tables?
> > What response time ( with what type of computer ) may I expect?
> > Is it possible to achieve 1 second response time with Oracle ( 8i )?
> > On what computer?
> >
> > TIA. Michael
> >
>
>
Sent via Deja.com http://www.deja.com/
Before you buy.
Received on Tue Sep 26 2000 - 15:26:53 CDT