Oracle FAQ | Your Portal to the Oracle Knowledge Grid
Subject: RE: best case scenarios for export/import
2 x (IBM p690, 16 CPUs, 16GB RAM, Oracle 9.2.0.1); the export file, taken from 8.1.6.1, was 34GB.
The import ran as 12 parallel streams (6 on each machine) with no constraints,
custom parfiles (each stream imports only certain schemas), buffer of either
8M or 16M, 12GB of undo on each instance, and all tablespaces LMT.
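A per-stream parfile along these lines might look roughly like the following sketch (schema and file names are hypothetical; each of the 12 streams would get its own copy pointing at a different schema). The `ignore=y` setting is what lets rows load into pre-created tables instead of failing on the `CREATE TABLE`:

```
file=full_export.dmp
log=stream01.log
fromuser=SCHEMA_A
touser=SCHEMA_A
buffer=16777216
indexes=n
constraints=n
ignore=y
```

The buffer value here is the 16M variant; the 8M streams would use 8388608 instead.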
When testing the migration plan, I used to (everything below was scripted):
1. Drop the database.
2. Create the database (total datafiles are about 140GB).
3. Pre-create all objects with no data and no indexes.
4. Run all imports.
5. Validate that everything is okay.
6. Trash everything and start all over again.
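Generating the per-stream parfiles is the kind of step that scripts well. A minimal sketch in shell, assuming hypothetical schema names and dump file (only four streams shown; the real setup used twelve):

```shell
#!/bin/sh
# Sketch: write one imp parfile per parallel stream, each stream
# importing a distinct schema. Schema and file names are hypothetical.
set -e
mkdir -p parfiles
i=0
for schema in HR SALES ORDERS STATS; do
  i=$((i + 1))
  cat > "parfiles/stream$i.par" <<EOF
file=full_export.dmp
log=stream$i.log
fromuser=$schema
touser=$schema
buffer=16777216
indexes=n
constraints=n
ignore=y
EOF
done
# The streams would then be launched in the background, e.g.:
#   for p in parfiles/stream*.par; do imp system parfile="$p" & done; wait
```

Splitting schemas across streams by size (largest schemas first, one per stream) tends to keep the streams finishing at roughly the same time.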
Last week (Mon-Fri) I actually tested the whole process at least twice a day, proofing the scripts, always looking for UNKNOWN problems and always scanning for KNOWN problems.
Total time: 1 hour and 30 minutes for the import, using pre-created objects (no indexes).
This is all from memory; we did migrate on Oct 12, and we had a good-sized window. AIX created a problem with a shared disk while we were in the middle of the import, so we trashed everything we had done and restarted. The databases were up and running, fully verified, completely analyzed, with the necessary indexes (interMedia etc.) rebuilt, and running Active/Active on RAC one minute before user testing was supposed to start.
Raj
QOTD: Any clod can have facts, but having an opinion is an art!
-----Original Message-----
Sent: Friday, December 20, 2002 9:04 AM
To: Multiple recipients of list ORACLE-L
Good day, all:
I'm looking for real-life best-case scenarios for running import/export . . . I've been playing with this for quite some time and would like to know how fast I can realistically expect this to go, particularly for the import.
I'd be interested to hear others' experiences: how fast have you been able to import data? What parameters have you used? It's both for informational purposes and a sanity check.
For example: I'm now trying to import a dump file of approx. 6.5GB, which breaks
down into 12GB of data and 4GB of indexes.
I'm using the following parameters on the first import, to load just the data (I then
rerun with the indexfile parameter to get the indexes):
recordlength=65535
buffer=15000000 (15M)
commit=y
indexes=n
constraints=n
grants=n
This will import in approx. 36 hours using a single 3GB rollback segment.
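The second pass mentioned above (rerunning with the indexfile parameter) might be sketched as a parfile like this, with hypothetical file names. When `indexfile` is set, imp writes the CREATE INDEX statements (plus commented-out CREATE TABLE statements) to the named script instead of executing them, so no data is touched:

```
file=full_export.dmp
full=y
indexfile=create_indexes.sql
log=index_pass.log
```

The generated create_indexes.sql can then be edited if desired (for example to add NOLOGGING or a larger sort area) and run in SQL*Plus once the data pass completes.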
What kind of experiences have you had?
Thanks
bill
-- Please see the official ORACLE-L FAQ: http://www.orafaq.net
-- Author: Jamadagni, Rajendra, INET: Rajendra.Jamadagni_at_espn.com
Received on Fri Dec 20 2002 - 10:09:24 CST