RE: Choosing data file size for a multi TB database?
On very large data files running on a buffered filesystem, wouldn't the
single-writer lock cause foreground processes (that are trying to read
data) to wait while DBWR is checkpointing?
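(A rough way to check for that from inside the database, assuming the
standard v$system_event view is available, is to watch the write-related
wait events while a checkpoint is in flight, e.g.:

    SELECT event, total_waits, time_waited
      FROM v$system_event
     WHERE event IN ('free buffer waits',
                     'write complete waits',
                     'db file parallel write');

If readers really are stalling behind DBWR on those big files, it should
show up as growing 'write complete waits' and 'free buffer waits'.)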
From: oracle-l-bounce_at_freelists.org
[mailto:oracle-l-bounce_at_freelists.org] On Behalf Of Branimir Petrovic
Sent: Friday, September 02, 2005 6:54 PM
To: oracle-l_at_freelists.org
Subject: RE: Choosing data file size for a multi TB database?
What about checkpointing against tens of thousands of data files? Surely
the more-the-merrier rule cannot hold there. For that reason (or due to a
fear factor) I was under the, maybe false, impression that a smaller number
(in the hundreds) of relatively larger data files (20 GB or so) might be
the better choice.
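(For scale: 10 TB at 20 GB per file works out to roughly 512 data files;
the count only climbs into the tens of thousands once individual files
drop below about 1 GB, so "hundreds vs. tens of thousands" is really a
question of 20 GB files vs. sub-GB files.)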
The other very real problem with a 10 TB database that I can easily
foresee, but for which I do not know a proper solution, is how one would
go about the business of regularly verifying taped backup sets. Have
another humongous piece of hardware just for that purpose? Fully trust the
rust? (i.e. examine the backup logs and never try restoring, or...) What
do people do to ensure multi-TB monster databases are surely and truly
safe and restorable/rebuildable?
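(Something like RMAN's RESTORE ... VALIDATE at least reads the backup
pieces back off tape without writing a single block, e.g., assuming 9i or
later and channels already configured for the tape library:

    RMAN> RESTORE DATABASE VALIDATE;
    RMAN> RESTORE ARCHIVELOG ALL VALIDATE;

That proves the pieces are readable and consistent, though it still means
pulling the whole 10 TB through the tape drives, so it does not make the
"humongous hardware" question go away.)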
Branimir
-----Original Message-----
From: Tim Gorman [mailto:tim_at_evdbt.com]
Sent: Friday, September 02, 2005 5:59 PM
To: oracle-l_at_freelists.org
Subject: Re: Choosing data file size for a multi TB database?
Datafile sizing has the greatest regular impact on backups and restores. Given a large multi-processor server with 16 tape drives available, which would do a full backup or full restore fastest?
Be sure to consider what type of backup media you are using, how much concurrency you will be using, and the throughput of each device.
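(Back-of-the-envelope, with made-up numbers: if each of 16 drives streams
around 30 MB/s, the set moves roughly 480 MB/s, so 10 TB is on the order
of 6 hours if the work spreads evenly. But a given datafile is read or
written by one channel at a time, so a handful of multi-terabyte files
cannot keep 16 drives busy, while a few hundred 20 GB files can.)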
There is nothing "unmanageable" about hundreds or thousands of datafiles; I don't know why that's cited as a concern. Oracle 8.0 and above has a limit of 65,535 datafiles per database, but otherwise large numbers of files are not something to be concerned about. Heck, the average distribution of a Java-based application comprises 42 million directories and files and nobody ever worries about that...
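(For what it's worth, counting and sizing them is a one-liner against the
standard dictionary views, e.g.:

    SELECT tablespace_name,
           COUNT(*) AS files,
           ROUND(SUM(bytes)/1024/1024/1024) AS gb
      FROM dba_data_files
     GROUP BY tablespace_name;

so even thousands of files stay easy to enumerate and script against.)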
--
http://www.freelists.org/webpage/oracle-l

Received on Fri Sep 02 2005 - 18:50:05 CDT