Grok Regexp Threw Exception on syslog file input

Description

I'm using the file input plugin and getting an error in the Logstash logs, and it's crashing Logstash.

I'm parsing rsyslog files only.

message=>"Grok regexp threw exception", :exception=>"invalid byte sequence in UTF-8", :field=>"message"

This is highlighted in a few other Jira cases (https://logstash.jira.com/browse/LOGSTASH-615 and https://logstash.jira.com/browse/LOGSTASH-496), but the notes there suggest using the charset option, which is deprecated.
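For reference, the deprecation is because the charset setting moved from the inputs to the codecs in 1.2.x, so the non-deprecated equivalent should look something like the sketch below (the path and the ISO-8859-1 charset are assumptions about where the bad bytes come from):

    input {
      file {
        path => "/var/log/syslog"        # assumed path; adjust to the rsyslog files being tailed
        codec => plain {
          charset => "ISO-8859-1"        # assumed source encoding of the stray bytes
        }
      }
    }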

Activity

James Hodgkinson
September 23, 2013, 10:32 PM

We're also seeing the following errors in the logstash logs after the initial error:

{:timestamp=>"2013-09-24T01:00:00.839000+1000", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>#<Encoding::ConverterNotFoundError: code converter not found (ASCII-8BIT to UTF-8)>, :backtrace=>["org/jruby/RubyString.java:7590:in `encode'", "json/ext/GeneratorMethods.java:71:in `to_json'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/event.rb:169:in `to_json'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch.rb:163:in `flush'", "org/jruby/RubyArray.java:1617:in `each'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch.rb:158:in `flush'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1332:in `each'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:216:in `buffer_flush'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:193:in `buffer_flush'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:159:in `buffer_receive'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch.rb:153:in `receive'", "(eval):246:in `initialize'", "org/jruby/RubyProc.java:255:in `call'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb:247:in `output'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb:212:in `outputworker'", "file:/usr/share/logstash/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb:140:in `start_outputs'"], :level=>:warn}

Markus Fischer
September 30, 2013, 8:31 AM
Edited

I'm seeing this too on production systems. Currently my logstash agents won't run for longer than a few hours, because sooner or later an invalid UTF-8 sequence leaks into the system (I get those in nginx logs, for example) and the agent stops.

I would prefer to have the message simply not match but still be sent to redis, with maybe a warning in the log.
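In the meantime, a possible workaround is sketched below: transcode the message before grok sees it. This assumes the ruby filter is available in your build and that the stray bytes are really Latin-1; adjust the encoding to whatever your sources actually emit:

    filter {
      ruby {
        # Hypothetical scrub step: re-tag the assumed ISO-8859-1 bytes and convert
        # them to UTF-8 so neither grok nor the JSON serialisation downstream
        # ever sees an invalid sequence.
        code => "event['message'] = event['message'].force_encoding('ISO-8859-1').encode('UTF-8') if event['message']"
      }
    }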

Lance A. Brown
November 11, 2013, 2:47 PM

Just ran into this problem in production on an OSSEC server. Logstash is picking up the OSSEC archive.log and feeding it into Elasticsearch. I would definitely prefer that this not crash Logstash.

Assignee

Logstash Developers

Reporter

James Hodgkinson

Affects versions
