Performance issue reading a file in Unix [message #471374]
Fri, 13 August 2010 12:59
amit.sehrawat
Messages: 29 Registered: September 2009 Location: India
Junior Member
Hi,
I have an extraction job: I am running a shell script to extract data from a database and write it to CSV files.
I have divided the task into two operations:
1) Extract the account number of each client from outfile.txt and, on some condition, write it into a temp table.
2) Extract data from the database for each client and write it to the output file.
Operation 1 takes 7 minutes for 1300 accounts, which is huge, although operation 2 takes hardly any time, just seconds.
My code for reading the file is:
ExtractClient()
{
    # Timestamp kept from the original script (not used below).
    DATE_RS=`date +%y%m%d%H%M%S`
    # A new sqlplus session is spawned here for every account --
    # with 1300 accounts this connect overhead likely dominates the 7 minutes.
    sqlplus -s "$sChaineConnect" << FINREQ > "$DIR_SAVE_FILE/SQLResult.txt"
insert into Migration_table(col1,col2,col3,col4)
    (SELECT col1, col2, col3, col4 FROM client
      WHERE client.col1 = '$1' AND col3 IN (4,5));
commit;
exit;
FINREQ
    FSQLError    # existing error-check routine
}

# Read input_file.txt one account number per line.
while read num_cnt
do
    ExtractClient "$num_cnt"
done < input_file.txt
Can someone please tell me how I can decrease this time?
Or is there a better way of parsing a file in UNIX and inserting the data into a database? Any idea would help; I am out of options here.
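One idea I am considering (an untested sketch; it reuses the same Migration_table, input_file.txt, $sChaineConnect and $DIR_SAVE_FILE names as above, and SQL_FILE/batch_insert.sql is just a name I made up) is to generate one SQL script containing all the INSERTs and run sqlplus a single time, so the connect cost is paid once instead of once per account:

# Untested sketch: write every INSERT into one script, then run a
# single sqlplus session against it.
SQL_FILE="$DIR_SAVE_FILE/batch_insert.sql"

{
    echo "WHENEVER SQLERROR EXIT FAILURE"
    while read num_cnt
    do
        echo "insert into Migration_table(col1,col2,col3,col4)"
        echo "  (SELECT col1, col2, col3, col4 FROM client"
        echo "    WHERE client.col1 = '$num_cnt' AND col3 IN (4,5));"
    done < input_file.txt
    echo "commit;"
    echo "exit;"
} > "$SQL_FILE"

sqlplus -s "$sChaineConnect" @"$SQL_FILE" > "$DIR_SAVE_FILE/SQLResult.txt"

If that is still too slow, would loading the whole account list with SQL*Loader or an external table and then doing one INSERT ... SELECT joined against it be faster still?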