Subject: First large table (Oracle-L)
Currently our database is small, with tables having no more than 200,000
rows.
Now we have a requirement to load data from text files and hold approx 15 million rows at any one time.
The Oracle platform is 8.1.7 on AIX 4.3.3.
The data will be queried from a client PC using Crystal Reports
The table definition is something like:
Column1  NUMBER(6)
Column2  NUMBER(2)
Column3  NUMBER(1)
Column4  VARCHAR2(20)
Column5  NUMBER(8)
Column6  NUMBER(8)
Column7  NUMBER(2)
Column8  VARCHAR2(2)
Column9  VARCHAR2(2)
Column10 DATE         (today's date)
Column11 VARCHAR2(8)
Column12 VARCHAR2(8)
Column13 VARCHAR2(20)
Column14 VARCHAR2(60)
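To make that concrete, here is a rough sketch of the DDL I am picturing, range-partitioned on Column10 with one partition per day; the table name, the partition names and the partitioning scheme itself are only my working assumptions at this point:

    CREATE TABLE big_table (
      Column1  NUMBER(6),
      Column2  NUMBER(2),
      -- ... columns 3 through 13 as listed above ...
      Column14 VARCHAR2(60)
    )
    PARTITION BY RANGE (Column10) (
      -- one partition per day of data, roughly 90 live partitions
      -- for a rolling 3-month window
      PARTITION p20050119 VALUES LESS THAN (TO_DATE('2005-01-20', 'YYYY-MM-DD')),
      PARTITION p20050120 VALUES LESS THAN (TO_DATE('2005-01-21', 'YYYY-MM-DD'))
      -- ... further daily partitions added as needed
    );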
The report will select records using columns 14, 13 and 10.
The report will only be run once a day.
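So the daily report query will be something along these lines; the exact predicates on columns 14, 13 and 10, and the bind variable names, are still guesses on my part:

    SELECT *
      FROM big_table
     WHERE Column14 = :text_value
       AND Column13 = :code_value
       AND Column10 = :report_date;   -- may end up as a date range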
Data from the text files will be loaded into the table by sqlldr throughout the day, with approx 5,000 records per file. Approx 250,000 records will be loaded in a day, i.e. approx 50 files per day.
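Each file would be loaded with a SQL*Loader control file roughly like the one below; the file name, the comma delimiter, the date format and the field order are assumptions on my part:

    -- load_big_table.ctl  (assumed comma-delimited feed)
    LOAD DATA
    INFILE 'feed_20050119_01.txt'
    APPEND
    INTO TABLE big_table
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
      Column1, Column2, Column3, Column4,
      Column5, Column6, Column7, Column8, Column9,
      Column10 DATE "YYYY-MM-DD",
      Column11, Column12, Column13, Column14
    )

invoked as something like: sqlldr userid=... control=load_big_table.ctl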
Data will be deleted once a day: rows where column 10 equals the date 3 months in the past will be deleted, i.e. a day's worth of data, approx 250,000 records.
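As currently planned that purge is just a plain delete, something like (table name as in the sketch above):

    DELETE FROM big_table
     WHERE Column10 = ADD_MONTHS(TRUNC(SYSDATE), -3);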
Data must be recoverable.
Although this does not seem too much of a problem, I would like to know what techniques I should be considering, and anything else worth thinking about. In particular:

3. Deletion of data and rollback segments: by deleting based on the table partition, will this minimise rollback requirements? A rough sketch of what I mean is below.
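With one partition per day, the daily purge could become a dictionary operation instead of a 250,000-row delete; the partition and table names follow the sketch above:

    ALTER TABLE big_table DROP PARTITION p20041019;  -- the day that has just passed the 3-month mark

If that is the right approach, the rollback generated by the purge should be negligible compared with the delete shown earlier.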
Any other advice on tuning this?
Any comments will be very much appreciated.
John
--
http://www.freelists.org/webpage/oracle-l

Received on Wed Jan 19 2005 - 06:04:11 CST