Re: Creating 100,000+ tables.

From: Joel Garry <joel-garry_at_home.com>
Date: 9 Dec 2005 14:48:57 -0800
Message-ID: <1134168537.816326.147530@z14g2000cwz.googlegroups.com>


>Are you sure that's what would happen? Maybe someone should try
>it....? Obtain some empirical evidence and report back (because I
>honestly have no clue). Seems like Oracle should have the technology
>in place to manage such things. 100 concurrent requests to create
>tables is unlikely, but it should be able to deal with it -- even if it
>comes with a cost to performance.

No, I'm not sure, and yes, someone should try it with statspack or traces and report back. In V6 and V7 I saw much lower-volume table-creation scripts skip the occasional table because the DBWR would hiccup. More modern Oracle has better ways of handling transaction volume, but some things may go south fast over a certain transaction rate. Oracle can usually deal with it - but it may be much slower than just serializing to begin with.

Remember, a commit is done before and after each DDL statement. See http://www.ixora.com.au/newsletter/2001_09.htm#commits
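
A quick way to see that for yourself (a throwaway sketch - the table names are made up, run it in a scratch schema):

  CREATE TABLE commit_demo (n NUMBER);

  INSERT INTO commit_demo VALUES (1);      -- uncommitted change

  CREATE TABLE commit_demo2 (n NUMBER);    -- DDL: the INSERT gets committed implicitly

  ROLLBACK;                                -- too late, nothing left to roll back

  SELECT COUNT(*) FROM commit_demo;        -- still shows the row

So every one of those 100,000+ creates is dragging at least two commits along with it.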

There may also be issues because the dictionary tables that track tables just aren't designed to be high-volume tables. If you are just sequentially adding new tables to the dictionary (as when you create a database and run all those cat scripts and one CPU is pegged), that is easier than 100 processes contending to pound on the same blocks, interested transaction lists and headers. You are going to have some very hot blocks. I would expect you would see lots of the things mentioned in sections 4 through 9 of
http://www.oracle.com/technology/oramag/oracle/03-jan/o13expert.html - although whatever the choke point is will overshadow all the others, and that will vary by version, configuration and specific user code.
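
If someone does run the experiment, something along these lines (just a sketch - exact event names vary by version) should show where the hot blocks and waits pile up while the 100 sessions are busy creating tables:

  -- buffer busy waits broken down by block class (segment header, undo header, ...)
  SELECT class, count, time
  FROM   v$waitstat
  ORDER  BY count DESC;

  -- system-wide wait events; look for buffer busy waits, enqueues, log file sync
  SELECT event, total_waits, time_waited
  FROM   v$system_event
  WHERE  event LIKE 'buffer busy%'
     OR  event LIKE 'enq%'
     OR  event LIKE 'log file sync%'
  ORDER  BY time_waited DESC;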

I've made the dunderhead mistake in the past of trying to run more processes than I've configured Oracle (or unix) for, so that's where that caution comes from, not from an inherent limit. D'Oh! (But it's an easy mistake to make when you suddenly increase the number of processes.)
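
Checking the ceiling before the test would have saved me. Something like this (column names from memory, so verify against your version) shows how close you are to the configured limits:

  SELECT resource_name, current_utilization, max_utilization, limit_value
  FROM   v$resource_limit
  WHERE  resource_name IN ('processes', 'sessions');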

Frank wrote:

>I would make the initial extent as small as possible (assuming
>it's all about table creation, and you do not have
>some crazy plan to actually *use* 100k+ tables)

But isn't there a minimum with LMTs anyway? (Is it 64K or 5*64K? I've forgotten... and I don't recall seeing official documentation on it.)
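
Easy enough to check empirically rather than dig for docs - something like this (table and tablespace names are made up; point it at whatever LMT you have handy):

  CREATE TABLE tiny_extent_test (n NUMBER)
    STORAGE (INITIAL 1K)
    TABLESPACE users;

  INSERT INTO tiny_extent_test VALUES (1);   -- make sure a segment actually exists

  -- see what Oracle really allocated for the 1K request
  SELECT segment_name, bytes, blocks
  FROM   user_extents
  WHERE  segment_name = 'TINY_EXTENT_TEST';

  DROP TABLE tiny_extent_test;

If the 1K request comes back rounded up to 64K, that's your answer.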

I could imagine having 1000 users using 100 tables each... or 10 apps users...

jg

--
@home.com is bogus.
Christmas songs before Christmas is like singing Happy Birthday before
someone's birthday.
Received on Fri Dec 09 2005 - 16:48:57 CST
