bsdjunkie
November 11th, 2003, 15:38
Can anyone recommend a good tool to parse Checkpoint FW-1 logs? Currently I need to take the data and report on which rule numbers were not hit over a period of time. Was just wondering if anyone knew of something out there before I sit down and write one. Not sure if I've got the time to do so here at work :P

Btw, the format looks like the following:

datetime=26Oct2003 23:59:59 action=accept fw_name=myfirewall dir=inbound src=x.x.x.x dst=y.y.y.y rule=74 proto=tcp/http 80
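
The rule number is the eighth whitespace-separated field there, so something like this would pull it out (just a sketch; "fw1.log" is a stand-in name and the field position is assumed stable):

[code]
# print the bare rule number from each record, e.g. "74" from "rule=74"
awk '{ print $8 }' < fw1.log | cut -f2 -d=
[/code]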

frisco
November 11th, 2003, 16:37
If you're looking through a single logfile for rules not used (as in you rotate every day):

[code:1:ebb4d57db3]#!/bin/sh
# Print firewall rule numbers that never appear in the log: pull the
# rule= field (8th whitespace-separated column), strip the "rule=" prefix,
# sort the unique numbers, then print any gaps in the sequence.
p=0
for c in `awk '{ print $8 }' < "$1" | cut -f2 -d= | sort -nu`; do
	s=`expr $c - 1`
	if [ $s -ne $p ]; then
		# rules p+1 through c-1 were never hit; print each one
		n=`expr $p + 1`
		while [ $n -le $s ]; do
			echo $n
			n=`expr $n + 1`
		done
	fi
	p=$c
done
[/code:1:ebb4d57db3]

Run as `scriptname [filename]`
Only had the one line to work with, so it may not be too robust, and it can probably be optimized somewhere... (boring day at work for me).
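
For example, a hypothetical run against one day's log (script name and output are made up; note it only reports gaps below the highest rule that was actually hit):

[code]
# hypothetical invocation; each output line is an unused rule number
$ sh findgaps.sh fw-26Oct2003.log
12
13
40
[/code]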

bsdjunkie
November 11th, 2003, 17:30
Frisco, thanks a lot, that looks like what I was looking for. Trouble is, due to the size of the logs, I received the following error:

sort: /var/tmp/sort.pYWwjL6687: Too many open files


Is this a limitation of space on my /var partition?

Btw, the average daily log is about 500-600 MB and I'm looking to run against a month or so at a time.

frisco
November 11th, 2003, 17:35
If you run only `awk '{ print $8 }' < $1 | cut -f2 -d= | sort -nu` do you get the same problem? (substitute filename for $1)

bsdjunkie
November 11th, 2003, 17:46
I get the same error just running that.

frisco
November 11th, 2003, 17:53
[quote]sort: /var/tmp/sort.pYWwjL6687: Too many open files

Is this a limitation of space on my /var partition?[/quote]

I'm thinking it might be a restriction of your login class or ulimit. "Too many open files" points at the file descriptor limit, so try raising openfiles-cur (and maybe maxproc-cur) in login.conf, or your fd limit via 'ulimit -n'.
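
A rough sketch, assuming the sh/ksh builtin and an OpenBSD-style login.conf (the values are only guesses for your box):

[code]
# raise the fd limit for the current shell, then re-run the pipeline
ulimit -n 512
awk '{ print $8 }' < fw.log | cut -f2 -d= | sort -nu

# or raise the login class defaults in /etc/login.conf and log back in:
#   default:\
#           :openfiles-cur=512:\
#           :maxproc-cur=256:
[/code]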

Then again, reading the manpage for sort, there are some limitations to it. First try running 'sort -unH' instead (add an H to it).

bsdjunkie
November 11th, 2003, 18:00
sort -unH produces the following:

sort: cannot allocate memory


I'll play with the ulimit stuff here quick as well...

bsdjunkie
November 11th, 2003, 18:03
Btw, what are good values for ulimit and maxproc-cur?
ulimit is currently at 128.
maxproc-cur = 64

frisco
November 11th, 2003, 18:06
It may be that there simply is too much data for sort to handle (check out the BUGS section in sort(1) if you're on OpenBSD; perhaps you'll get better results from sort on a different OS).

If that is the problem, then we'll need to get trickier and split the file up first (or you rotate your logs every few hours!)
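
Something like this untested split-then-merge sketch ("fw.log" and the chunk size are placeholders) would sidestep sort's single-file limits:

[code]
# pull out the rule numbers and split them into fixed-size pieces
awk '{ print $8 }' < fw.log | cut -f2 -d= | split -l 1000000 - chunk.
# sort each piece on its own, then merge the already-sorted pieces
for f in chunk.??; do sort -nu "$f" > "$f.sorted"; done
# the merged output can then feed the gap-printing loop from the script above
sort -mnu chunk.*.sorted
rm -f chunk.*
[/code]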

bsdjunkie
November 11th, 2003, 18:08
I have no control over log rotation. It's set to one day, and that's the end of it. =P

bsdjunkie
November 11th, 2003, 18:10
Reading the man page, I see any file over 704 MB must be split. The file I'm working on is about 675 MB. I'm wondering if it's just not enough RAM to handle it here on this machine.

frisco
November 11th, 2003, 18:15
[quote]Btw, what are good values for ulimit and maxproc-cur?
ulimit is currently at 128.[/quote]


128 fds is good for users; daemons may need more. We were having problems yesterday because someone set Oracle's max fds too low...

[quote]maxproc-cur = 64[/quote]

This will depend on how much you do. I am currently running 87 different processes (and I'm being slow right now since I'm about to leave), so I have my login class's maxproc set to 256. On some machines, for some accounts, I have this set to 4 so I'm guaranteed the account never runs out of control (it should only ever be doing 4 things at a time).


But the "cannot allocate memory" error is different. Check the other settings in login.conf as well as 'ulimit -a'. Default limits are kind of low when you start working on large files.

frisco
November 11th, 2003, 18:19
[quote]Reading the man page, I see any file over 704 MB must be split. The file I'm working on is about 675 MB. I'm wondering if it's just not enough RAM to handle it here on this machine.[/quote]

The data you're actually working on will be less than that, though, since it's being parsed by awk and cut first. I think it's a memory limitation of your login class.
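
One quick way to check (a sketch, with "fw.log" standing in for the real file) is to count the bytes that actually reach sort:

[code]
# the extracted rule numbers should be a small fraction of the 675 MB log
awk '{ print $8 }' < fw.log | cut -f2 -d= | wc -c
[/code]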

bsdjunkie
November 11th, 2003, 18:47
After cranking up those values, it appears to be working. At least it's been running about 20 minutes without giving me any errors so far :P

I think I'll use this as an excuse for work to buy me a big burly *nix box instead of this regular desktop for this type of stuff ;)

elmore
November 12th, 2003, 11:05
Might wanna check out logrep. I was just looking at it; it supports a variety of applications including Checkpoint FW-1.

http://logrep.sourceforge.net

frisco
November 12th, 2003, 12:30
[quote]Might wanna check out logrep. I was just looking at it; it supports a variety of applications including Checkpoint FW-1.[/quote]

Does it require a "big burly *nix box"? If not, I don't think it'll meet bsdjunkie's current requirements!

bsdjunkie
November 12th, 2003, 12:52
Frisco, after tweaking all the values, it's working great: for about 650 MB of data, it takes about 1.5-2 minutes to complete. =)