-
September 9th, 2005, 10:29 PM
#1
Member
PIX firewall logs
We have a Cisco PIX firewall in place at work. Currently we are using Kiwi to gather the FW logs and send them to a Windows 2K3 server. There is so much data (about 1.5 GB of text files per day) that it is impossible to look through these logs manually. What to do? Any suggestions?
What are you guys using out there to "go through" or analyze your FW/router logs?
-
September 10th, 2005, 05:40 AM
#2
Junior Member
How much traffic are you generating in/out of the PIX?
Do you know your bandwidth usage?
Your logging level might be set a little high.
"Poor planning on your part does not necessitate an emergency on my part." -Unknown
-
September 10th, 2005, 05:50 AM
#3
Perl, grep, and regular expressions. You need to figure out a layered filtering scheme to handle the parsing of the data. That, or you could always import things into a SQL server and use that to filter your data, maybe even provide some type of web-based reporting.
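A layered scheme like that might look something like this sketch (the filenames are placeholders, and the sample lines only approximate real PIX deny/teardown message text):

```shell
# Sample input standing in for a real Kiwi syslog dump
printf '%s\n' \
  '%PIX-4-106023: Deny udp src inside:10.0.0.5 dst outside:8.8.8.8' \
  '%PIX-6-302014: Teardown TCP connection 12345' \
  '%PIX-4-106023: Deny tcp src outside:1.2.3.4 dst inside:10.0.0.9' \
  > pix.log

# Layer 1: strip messages you have decided are noise (e.g. UDP denies)
grep -v 'Deny udp' pix.log > pass1.log

# Layer 2: strip routine connection teardowns
grep -v 'Teardown TCP' pass1.log > pass2.log

# Layer 3: keep only the events worth a human's attention
grep 'Deny tcp' pass2.log > review.log
```

Each layer writes out what it did not filter, so you can sanity-check any pass by comparing its input and output files; once the noise patterns are stable, the same pattern lists could feed a SQL import instead of flat files.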
"When I get a little money I buy books; and if any is left I buy food and clothes." - Erasmus
"There is no programming language, no matter how structured, that will prevent programmers from writing bad programs." - L. Flon
"Mischief my ass, you are an unethical moron." - chsh
Blog of X
-
September 10th, 2005, 06:09 AM
#4
I use batch files to filter out the stuff I'm not interested in seeing, like outbound UDP (which is denied anyway), using find with the '/v' option to print all lines not containing "outbound udp", and redirecting the output to another file with '>'.
I make one 'find /v' entry for each message I think is garbage, using the string that's common to all similar messages, and redirect the output to another file for each one I want to remove.
When I get rid of all the garbage, I start breaking down the last file that was created into smaller categories using an HTML format:
REM head.txt contains the HTML header tags (black/yellow/larger font)
type head.txt > some.htm
REM filtered.log is the last file left over after the 'find /v' passes
find "some string" filtered.log >> some.htm
find "another string" filtered.log >> some.htm
REM foot.txt contains the closing tags
type foot.txt >> some.htm
start some.htm
Kiwi can also archive to a CSV format, which can be opened with Excel (or imported into a VB app?) and filtered to search for what you're looking for.
This works for me, but my logs are only 15-45 MB before filtering.
This is a slow process, so I set Kiwi to archive the log and schedule the batch file to run well before I get in. It's a cumbersome method, but hopefully someone will want to 'one-up' me and post a better one.
edit:
Juridian's right! Perl would be a hell of a lot faster and more efficient. I've just been too busy to change it.
Bukhari:V3B48N826 “The Prophet said, ‘Isn’t the witness of a woman equal to half of that of a man?’ The women said, ‘Yes.’ He said, ‘This is because of the deficiency of a woman’s mind.’”
-
September 10th, 2005, 03:12 PM
#5
You didn't mention a budget or anything in your original post and you say it is for your work so here goes....
My workplace has purchased a product called Huntsman from an Australian firm called Tier-3:
http://www.tier-3.com/
This software can monitor a variety of log files, store results in a SQL database, and let you set up user-defined alerts so that you are notified about the events you are interested in.
Note 1: I was not involved in the original purchase, but I believe this is not cheap - I have seen the support bill and it is a lot of $AUS.
Note 2: We do not yet have this fully deployed in our environment, but I have seen it working in test environments and it has real potential from what I can tell.
You would not buy this only to monitor your PIX logs, but as a much broader security monitoring tool.
-
September 10th, 2005, 10:52 PM
#6
Cash is the name of the game here. If you have enough horsepower and enough cash, pump your syslog dump into a SIM (Security Information Manager) product such as NeuSecure by http://www.Guarded.net.
The sky is the limit as to what you can do with the data when a solution like this gets its teeth in your data.
If you're broke, then Perl and simple piped Linux commands such as cat <filename> | grep 'something to find' will do the trick, as long as you know how to write expressions in a way that the shell won't interpret them.
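For example (a minimal sketch; the log line and filename are made up), single-quoting the pattern hands it to grep untouched, so the regex metacharacters never reach the shell:

```shell
# A made-up PIX-style log line to search through
echo 'Deny tcp src outside:1.2.3.4/1025 dst inside:10.0.0.9/80' > fw.log

# Single quotes protect the pattern; unquoted, the shell could
# glob-expand the '*' before grep ever sees it
grep 'outside:.*dst inside' fw.log
```

As a side note, grep takes a filename directly, so 'grep pattern file' does the same job as the cat-into-grep pipe with one less process.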
Anyway, just another 2 cents...
Our scars have the power to remind us that our past was real. -- Hannibal Lecter.
Talent is God given. Be humble. Fame is man-given. Be grateful. Conceit is self-given. Be careful. -- John Wooden
-
September 12th, 2005, 06:12 PM
#7
Member
I am going to try to filter out some of the data to make it more intelligible. I am just getting started with analyzing PIX logs. Do you guys have any suggestions on what I could safely "trim" without shooting myself in the foot?
Thanks
-
September 13th, 2005, 01:31 AM
#8
The SANS perimeter protection (GCFW) training has a section on doing the kind of filtering/munging you mention. You might try the reading room there to see if anyone has expanded on the topic with their own paper.
I poked around a little on infosecwriters.com but couldn't find anything that looked helpful. You might try a more in-depth search there, though.
-
September 13th, 2005, 01:52 AM
#9
Member
I just downloaded a 30-day trial of Firegen. Cool product - it takes the logs and converts them into usable information with graphs/comparisons/etc. It's relatively cheap to purchase too. Looks like I found my software to compile the logs.
Thanks for your help guys