Optimization for Large-Volume Data Streams [message #301658]
Thu, 21 February 2008 04:44
ramsatish
Messages: 8  Registered: November 2006
Hi,
I have 25 tables in my Oracle database, each containing about 50,000 records on average.
These tables are refreshed every day (the previous day's data is deleted and the current day's data is inserted).
There is a requirement to generate multiple reports, all in the same format, for all the accounts. The reports are based on
the last seven days' data, so I need to keep a week's worth of data.
My idea is to create a single table that stores the data of all 25 accounts for the last 7 days. That table would hold
25 (accounts) * 7 (days) * 50,000 (records per account) = 8,750,000 records.
As a trial, I implemented the plan with 3 accounts and 3 days of data, so the table held
3 (accounts) * 3 (days) * 50,000 (records per account) = 450,000 records.
I observed that even simple queries against this table take a long time.
What is the best or most optimised way to access a large volume of data in less time?
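For reference, this is roughly the consolidated table I have in mind (the column names here are just placeholders, not my real schema). A composite index on (account_id, report_date) should let the per-account, seven-day report queries do an index range scan instead of a full table scan:

```sql
-- Rough sketch of the consolidated table; column names are placeholders.
CREATE TABLE weekly_account_data (
  account_id   NUMBER NOT NULL,   -- which of the 25 accounts the row belongs to
  report_date  DATE   NOT NULL,   -- the day the row was loaded for
  -- ... the remaining report columns go here ...
  some_value   NUMBER
);

-- Composite index matching the report predicates (account first, then date),
-- so a query filtering on account_id and a 7-day date range can range-scan.
CREATE INDEX ix_weekly_acct_date
  ON weekly_account_data (account_id, report_date);
```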
My system configuration is:
OS: Windows 2000
Database: Oracle 8i
RAM: 2 GB
Hard disk: 120 GB
Processor: Pentium
Thanks in Advance
Regards,
Sriram Satish