Grep not dropping event when using multiple matches
Activity
Danny May 8, 2013 at 12:58 AM
Thanks. I ended up filtering in rsyslog to reduce traffic on the network and the amount of processing that logstash would have to do.
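For reference, filtering at the rsyslog side can be done with a property-based filter before the messages are forwarded. A minimal sketch (the matched substring is taken from the sample line below; adjust to your message format):

```
# Discard health-check noise before it reaches logstash.
# "stop" discards the message; older rsyslog versions use "& ~" instead.
:msg, contains, "/internal/health" stop
```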

Philippe Weber April 18, 2013 at 6:57 PM
You should not use the @ prefix; simply use the same field names as in the COMBINEDAPACHELOG pattern.

Danny April 18, 2013 at 6:24 PM
Thanks for the quick reply. Please see the attached screenshot: I have changed my grep filter to reference the @request and @response fields, and the events are still not being dropped. I also upgraded logstash to 1.1.10.
Cheers

Philippe Weber April 18, 2013 at 4:58 AM
Don't use the @fields. prefix; logstash adds that internally.
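Following that advice, the grep filter would reference the bare capture names that COMBINEDAPACHELOG produces (request and response). A hedged sketch of the corrected filter, based on the configuration reported below:

```
grep {
  type => "syslog"
  # Use the capture names from COMBINEDAPACHELOG directly,
  # without the @fields. prefix.
  match => [ "request", "\/internal\/health", "response", "200" ]
  negate => true
}
```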
Description
I'm using a grok filter followed by a grep filter, but the grep filter is not dropping the events.
grok {
  type => "syslog"
  # See the following URL for a complete list of named patterns
  # logstash/grok ships with by default:
  # https://github.com/logstash/logstash/tree/master/patterns
  #
  # The grok filter will use the below pattern and on successful match use
  # any captured values as new fields in the event.
  pattern => "%{COMBINEDAPACHELOG} %{NUMBER:duration}"
  keep_empty_captures => true
}
grep {
  type => "syslog"
  match => [ "@fields.request", "\/internal\/health", "@fields.response", "200" ]
  negate => true
}
Sample log line:
10.12.31.28 - - [17/Apr/2013:21:32:59 0700] "GET /TotalLossFastTrack/internal/health HTTP/1.0" 200 16 "" "nginx_upstream_check_module/1.2" 1000
I've verified that the grep matches correctly by adding add_tag => [ "DELETE" ], which tags the event as expected; but when I ask it to drop the event via negate => true, the event is not dropped.
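The expected behaviour with multiple match pairs can be sketched in Python. This is only an illustration of the intended semantics described above (all pairs must match, and negate => true should then drop the event), not logstash's actual implementation; the helper and field names are hypothetical:

```python
import re

def grep_filter(event, matches, negate=False):
    """Return the event if it should be kept, or None if dropped.

    matches is a list of (field, pattern) pairs; the event matches
    only if every pattern matches its field. Without negate, grep
    keeps matching events; with negate, it drops them instead.
    """
    all_match = all(
        re.search(pattern, str(event.get(field, "")))
        for field, pattern in matches
    )
    keep = (not all_match) if negate else all_match
    return event if keep else None

# The sample log line's captured fields:
event = {"request": "/TotalLossFastTrack/internal/health", "response": "200"}
matches = [("request", r"/internal/health"), ("response", r"200")]

print(grep_filter(event, matches, negate=True))  # prints None: event dropped
```

Running this against the sample event shows that both patterns match, so with negate => true the health-check event should be dropped rather than passed through.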
BTW: Awesome software.