Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


RE: linux scripting question

From: Mark W. Farnham <mwf_at_rsiz.com>
Date: Wed, 21 Mar 2007 11:06:42 -0400
Message-ID: <01d701c76bca$89a694f0$1100a8c0@rsiz.com>


Exactly!

Then you combine this cron with something like extmon (or any other history of used size within your Oracle databases) and you have yourself an overall tool for "acreage" disk farm capacity planning (as distinct from throughput planning, which is an entirely different exercise.)

-----Original Message-----
From: Rodd Holman [mailto:Rodd.Holman_at_gmail.com]
Sent: Wednesday, March 21, 2007 10:07 AM
To: mwf_at_rsiz.com
Cc: niall.litchfield_at_gmail.com; 'oracle-l'
Subject: Re: linux scripting question

If you want historical tracking of this, cron this to run at a specific interval and redirect the output to a file somewhere. Externally map the directory as an Oracle directory and make the file an external table. Then create yourself a history table, and as the file updates, kick off a proc in the db to load your history from the updated file. I'm actually working through a process like this right now to provide some trending data on disk utilization for our Exec VPs.
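[Editor's note: the snapshot half of that process might look like the sketch below. The script name, output path, and CSV layout are assumptions for illustration, not Rodd's actual code.]

```shell
#!/bin/sh
# Hypothetical cron snapshot script -- the "redirect the output to a
# file somewhere" step above. A crontab entry might look like:
#   0 * * * * /usr/local/bin/df_snapshot.sh
# The default path below is assumed; point it wherever the Oracle
# directory object will be mapped.

OUTFILE=${OUTFILE:-./df_history.dat}

# Append one CSV row per filesystem:
#   timestamp,filesystem,kbytes,used,avail,mountpoint
# df -kP gives stable POSIX columns; NR > 1 skips the header line.
df -kP | awk -v ts="$(date '+%Y-%m-%d %H:%M:%S')" '
    NR > 1 { print ts "," $1 "," $2 "," $3 "," $4 "," $6 }' >> "$OUTFILE"
```

On the database side, a CREATE DIRECTORY pointing at the file's location plus an ORACLE_LOADER external table over the CSV would expose the accumulated rows to the proc that loads the history table.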

Mark W. Farnham wrote:
>
> What I would do is ls -lR every day to a date named file and then diff
> any two days and pick off the rows and fields I wanted and net up the
> ins versus the outs with awk or perl.
>
> Then if you want to drill in on more details about the changes, you've
> got them. If your arbitrary time granule is smaller than a day, then
> you'd need more frequent snaps, and of course this process dies by
> Zeno at some level of willingness to spend space for the data.
>
>
> <http://www.orawin.info>
>
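[Editor's note: a minimal sketch of that diff-and-net-up step, assuming daily ls -lR snapshots already exist. The function and file names are illustrative, not Mark's actual script.]

```shell
#!/bin/sh
# Snapshots would come from a daily cron entry such as:
#   5 0 * * * ls -lR /u01 > /var/tmp/lsr.`date +\%Y\%m\%d`

# Net up the byte "ins versus the outs" between two ls -lR snapshots.
# Only regular files are counted (mode string starts with "-").
lsr_delta() {
    diff "$1" "$2" | awk '
        # After the ">"/"<" diff marker, an ls -l line has the size in field 6.
        /^> -/ { added   += $6 }   # rows only in the newer snapshot
        /^< -/ { removed += $6 }   # rows only in the older snapshot
        END { printf "net change: %d bytes\n", added - removed }'
}
```

Hunk headers, "---" separators, and "total" lines from ls fall through the two patterns, so only file rows contribute to the sums.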

--
http://www.freelists.org/webpage/oracle-l
Received on Wed Mar 21 2007 - 10:06:42 CDT

