Shell Script to read from log file and update to Oracle DB table

I have a requirement to read a Splunk log file for certain parameters and use that data to update an Oracle 11g DB table once those parameters are found.

For example:

Splunk log file name is: app.log

The input parameters in the log file would look like:

[timestamp] amount=100,name=xyz,time=19 May 2018 13:45 PM

The expected output from the shell script: amount should be read into a variable and 100 assigned to it. This value 100 should then be stored in a DB table in Oracle.

I may have to use an awk script for this. I am not getting anywhere with this as I am new to shell scripting.

tail -f|egrep -wi 'amount' /apps/JBoss/log/app.log 

Commands like this don't seem to work.
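
One reason it fails: tail -f needs the log file as its argument, but here the path is passed to egrep instead, so tail just waits on stdin. Assuming the goal is to follow the log and filter lines containing amount, a minimal correction would be:

tail -f /apps/JBoss/log/app.log | grep --line-buffered -wi 'amount'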

You may easily capture such values using Perl's regex.

amt=$(perl -ne 'if (/amount=(\d+)/) { print $1; exit }' /apps/JBoss/log/app.log)

If you want to use pure shell commands,

amt=$(grep amount app.log | cut -f1 -d',' | cut -f2 -d'=')
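
Since the question mentions awk, an equivalent one-liner is also possible (a sketch assuming the same amount=&lt;number&gt;, layout shown above):

amt=$(awk -F'[=,]' '/amount=/ {print $2; exit}' app.log)

Here -F'[=,]' splits fields on both = and , so the number after amount= lands in $2, and exit stops after the first matching line.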

You may use this variable in the insert query from sqlplus:

sqlplus -s USER/PWD<<SQL
INSERT INTO yourtable(column_name) VALUES(${amt});
commit;
exit
SQL
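
Putting the two steps together, a minimal end-to-end sketch (assuming the /apps/JBoss/log/app.log path from the question and the hypothetical yourtable/column_name names used above):

#!/bin/bash
# Sketch: read the first amount value from the log and insert it into Oracle.
LOG=/apps/JBoss/log/app.log

amt=$(grep -m1 amount "$LOG" | cut -f1 -d',' | cut -f2 -d'=')

if [ -n "$amt" ]; then
  sqlplus -s USER/PWD <<SQL
INSERT INTO yourtable(column_name) VALUES(${amt});
commit;
exit
SQL
fi

The -m1 keeps only the first match; drop it if you need every value (see the array approach below).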

For an input file (app.log) like:

[timestamp] amount=100,name=xyz,time=19 May 2018 13:45 PM                
[timestamp] amount=150,name=xyz,time=19 May 2018 13:45 PM                
[timestamp] amount=200,name=xyz,time=19 May 2018 13:45 PM                

you could use grep's -P flag (PCRE):

arr=($(grep -oP "(?<=amount=)\d+" app.log))

This will store the values of amount in an array arr. Output:

echo ${arr[@]}
100 150 200
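
If each of those values needs to be stored, a hedged sketch is to loop over the array and run one insert per value (again assuming the hypothetical yourtable/column_name names from the earlier answer):

for amt in "${arr[@]}"; do
  sqlplus -s USER/PWD <<SQL
INSERT INTO yourtable(column_name) VALUES(${amt});
commit;
exit
SQL
done

Starting one sqlplus session per value keeps the sketch simple; for a large number of rows you would batch the INSERT statements into a single session instead.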
