Recently I found myself in the unhappy position of needing to sift through slightly more than a billion Checkpoint Firewall-1 log lines, looking for specific patterns of access. The problem was that many of the exported fwm log files had differing column positions, and there had been many ruleset changes over the course of 11 months’ worth of log data. Many of the excellent FW1 log summarization tools (such as Peter Sundstrom’s fwlogsum) couldn’t handle the hundreds of files and differing column positions.
The final scripted solution processed over 11,000 lines/second — and still took over 23 hours for the first run.
Log file exports via fwm logexport can have variable column positioning; only the record ID number “num” is *always* column number one. I see three viable alternatives for handling the changing column positions in the ASCII log files exported via fwm, so that the log processing can be automated:
fwm logexport -i fw1-binary-logfile -o fw1-ascii-logfile.txt -n -p
- Parse the header line (line #1) of every log file and dynamically map (rearrange) the columns to a pre-determined standard in memory before further processing (painful, expensive)
- Tell Checkpoint fwm to export in a fixed column ordering
- create logexport.ini and place it in the $FWDIR/conf directory
- eg. on fwmgmtsrv: C:\WINDOWS\FW1\R65\FW1\conf
- logexport.ini:
[Fields_Info]
included_fields = num,date,time,orig,origin_id,type,action,alert,i/f_name,
i/f_dir,product,rule,src,dst,proto,service,s_port,xlatesrc,xlatedst,
nat_rulenum,nat_addtnl_rulenum,xlatesport,xlatedport,user,
partner,community,session_id,ipv6_src,ipv6_dst,
srckeyid,dstkeyid,CookieI,CookieR,msgid,elapsed,
bytes,packets,start_time,snid,ua_snid,d_name,id_src,ua_operation,
sso_type_desc,app_name,auth_domain,uname4domain,wa_headers,
result_desc,r_dest,comment,url,redirect_url,enc_desc,e2e_enc_desc,
auth_result,attack,log_sys_message,
rule_uid,rule_name,service_id,resource,reason,cat_server,
dstname,SOAP Method,category,ICMP,message_info,
TCP flags,rpc_prog,Total logs,
Suppressed logs,DCE-RPC Interface UUID,Packet info,
message,ip_id,ip_len,ip_offset,fragments_dropped,during_sec
- Use OPSEC LEA tools to extract event log records instead of exporting via fwm logexport
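The first alternative (dynamically remapping columns from each file’s header line) can be sketched roughly as follows. This is a minimal illustration, not the script I used: the ';' separator and the short STANDARD column list are assumptions — adjust both to match your actual fwm logexport output.

```python
# Sketch of alternative 1: read the header line of each exported file,
# build a column-name -> position map, and emit rows in a fixed order.
# The separator and STANDARD column list below are illustrative only.

SEP = ";"
STANDARD = ["num", "date", "time", "orig", "action", "src", "dst", "service"]

def remap_lines(lines, standard=STANDARD, sep=SEP):
    """Yield `lines` with columns reordered to `standard`.

    The first line must be the fwm logexport header; columns missing
    from a given file are emitted as empty strings.
    """
    it = iter(lines)
    header = next(it).rstrip("\n").split(sep)
    pos = {name: i for i, name in enumerate(header)}
    yield sep.join(standard)
    for line in it:
        fields = line.rstrip("\n").split(sep)
        yield sep.join(
            fields[pos[c]] if c in pos and pos[c] < len(fields) else ""
            for c in standard
        )
```

As noted above, doing this for every record in every file is the painful, expensive option — which is why forcing a fixed export ordering via logexport.ini is attractive.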
Once the ASCII log files are available for processing, my fw1logsearch.pl script can be used to find complex patterns of interest. Any matching records found by fw1logsearch are output with an initial FW1 header line, so fw1logsearch can be used iteratively to build very complex search criteria. fw1logsearch can also write out a discard file, allowing completely negative-logic searches: 100% of the input data ends up separated into a “matched” file and a “didn’t match” file. Some examples of how I’ve used it are shown here:
gunzip -c fwlogs/2009*gz | \
fw1logsearch.pl --allinclude \
-S '10\.1\.1[1359]\.|10\.2\.1[01]\.|192\.168\.2[245]\.' \
-d '10\.1\.1[1359]\.|10\.2\.1[01]\.|192\.168\.2[245]\.' \
-p '^1310$|^1411$|^1812$|^455' | \
fw1logsearch.pl -S '192\.168\.22\.14$|10\.2\.11\.12$' |\
fw1logsearch.pl --allexclude \
-S '^192\.168\.24\.12$' -P '^1310$' --rejectfile 192-168-24-12-port-1310.txt
Line by line:
1. Unzip the compressed ASCII log files, feed them to the first instance of fw1logsearch.pl
2. First fw1logsearch – all conditions must be true for any events to match
Source address must NOT be in any of the following regex ranges:
10.1.11.* 10.1.13.* 10.1.15.* 10.1.19.*
10.2.10.* 10.2.11.*
192.168.22.* 192.168.24.* 192.168.25.*
Destination address must be in one of the same regex ranges.
Service (destination port) must be one of:
Exactly 1310, 1411, or 1812, or any port starting with 455
No protocol is specified, so it will match either TCP or UDP
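The anchoring in that service pattern matters: without the ^ and $ anchors, 1310 would also match ports like 13100 or 21310, while the deliberately unanchored ^455 matches any port beginning with 455. A quick check of the regex semantics (Python is used here purely to illustrate; fw1logsearch.pl itself is Perl, and both use the same alternation/anchor behaviour):

```python
import re

# The service pattern from the pipeline above.
port_pat = re.compile(r'^1310$|^1411$|^1812$|^455')

# Fully anchored alternatives match only the exact port numbers...
assert port_pat.search("1310")
assert not port_pat.search("13100")   # would match if the $ anchor were missing
# ...while the unanchored ^455 matches any port starting with 455.
assert port_pat.search("4550")
assert not port_pat.search("1455")
```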
fw1logsearch.pl will output any matching events to stdout, including a FW1 log header line, so the next instance of fw1logsearch.pl continues filtering the result set.
3. The second fw1logsearch.pl specifies Source Address must not be any of the following
192.168.22.14
10.2.11.12
4. The last fw1logsearch.pl excludes port 1310 from 192.168.24.12, and puts all those records into a separate reject file, while writing the other records to stdout.
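The --rejectfile mechanism in step 4 amounts to partitioning the stream: every input record lands in exactly one of two outputs, which is what makes the fully negative-logic searches mentioned earlier possible. A minimal sketch of that split, with a hypothetical predicate standing in for the --allexclude column tests (the column positions here are illustrative, not the real fwm layout):

```python
import re

def partition(lines, is_reject):
    """Split records into (kept, rejected); together they cover 100% of input."""
    kept, rejected = [], []
    for line in lines:
        (rejected if is_reject(line) else kept).append(line)
    return kept, rejected

# Hypothetical stand-in for "--allexclude -S '^192\.168\.24\.12$' -P '^1310$'":
# reject only when BOTH the source and the service patterns match.
def reject_1310_from_host(line):
    fields = line.split(";")
    src, service = fields[0], fields[1]      # illustrative column positions
    return bool(re.search(r'^192\.168\.24\.12$', src)) and \
           bool(re.search(r'^1310$', service))
```

With --allexclude, a record matching only one of the two patterns stays in the kept output — that is the difference from the default single-match behaviour described in the help text below.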
This script has been used to process over 4 billion records within the project I wrote it for, and it precisely identified all uses of the particular business cases I needed to modify. The result was zero outages and no unintended business interruption.
Basic syntax/help file:
Usage: fw1logsearch.pl
[-a|--incaction|-A|--excaction <action regex>]
[-p|--incservice|-P|--excservice <dst port regex>]
[-b|--incs_port|-B|--excs_port <src port regex>]
[-s|--incsrc|-S|--excsrc <src regex>]
[-d|--incdst|-D|--excdst <dst regex>]
[-o|--incorig|-O|--excorig <fw regex>]
[-r|--incrule|-R|--excrule <rule-number regex>]
[-t|--incproto|-T|--excproto <proto regex>]
[--dnscache <dns-cache-file>]
[--resolveip]
[--allinclude]
[--allexclude]
[--rejectfile <file>]
[--debug <level>]
fw1logsearch.pl will search a fwm logexport text file for regex patterns specified for supported columns (such as service, src, dst, rule, action, proto and orig).
Include and exclude regex matches may be specified on the same command line; by default a line is included (printed) or excluded (rejected) when any single pattern matches. Specify --allinclude or --allexclude to require a match on all specified column regex patterns.
Regex patterns can be enclosed in single quotes to include characters that are special to the shell, such as the 'or' (|) operator.
Header will be output only if there are any matching lines.
Example invocations:
$ cat 2008-07-07*txt | \
fw1logsearch.pl \
-p '53|domain' \
-d '192.168.1.2|host1|10.10.1.2|host2' \
-o '192.168.2.3|10.10.2.4|10.10.4.5' \
-S '64.65.66.67|32.33.34.35|10.10.*|192.168.*' \
--resolveip
Will require the destination port (service) to match 53 or domain; the destination IP to be any of 192.168.1.2, host1, 10.10.1.2, or host2; the reporting firewall (origin) to be any of 192.168.2.3, 10.10.2.4, or 10.10.4.5; and the source IP to NOT be any of 64.65.66.67, 32.33.34.35, 10.10.*, or 192.168.*. Any lines that match these criteria will be displayed, and the orig, src, and dst columns will use the default DNS cache file (dynamically built/managed) to perform name resolution, replacing the IP addresses where possible.
Include regex patterns:
-a --incaction Rule action (accept, deny)
-b --incs_port Source port (s_port)
-p --incservice Destination port (service)
-s --incsrc Source IP|hostname
-d --incdst Destination IP|hostname
-o --incorig Reporting FW IP|hostname
-r --incrule Rule number that triggered entry
-t --incproto Protocol of connection
Exclude regex patterns:
-A --excaction Rule action (accept, deny)
-B --excs_port Source port (s_port)
-P --excservice Destination port (service)
-S --excsrc Source IP|hostname
-D --excdst Destination IP|hostname
-O --excorig Reporting FW IP|hostname
-R --excrule Rule number that triggered entry
-T --excproto Protocol of connection
Other options:
--debug {level} Turn on debugging
--dnscache Specify location of DNS cache file to be used with
the Resolve IPs option
--resolveip Resolve IPs for orig, src, and dst columns AFTER filtering
--rejectfile Write out all rejected lines to a specified file
Download fw1logsearch.pl