File input not performing any actions
Description
Philippe Weber February 25, 2015 at 5:46 AM
Closing this old issue; the use case is too specific and impossible to reproduce.
If you still experience issue with the file input please report new bug to https://github.com/logstash-plugins/logstash-input-file/issues
john Phan November 5, 2013 at 11:54 PM
After much debugging and patience, I was able to get the logs out. However, all the log lines are encoded and the output messages are scrambled.
An example of this is shown below.
???!?=Fs ???4? ?=Hs????4? ?=Hs????4? ?=g???a ?%??I?2?4? ?=Hs ???????? ?=Hs ??c4? ?/O? ??~A\? ??c4? ?=Fs ???4? As ??c4? ?=Fs ???4???u} ??O? i?1?{?? ???h?A?{?? ???h?A?{?????????&????6?{?? ?? i?1?{?? ?? i?=???#?=Fs ??c4? ?=Hk??? ?x??? ?=Fs ??c4? ?=G??c4? ?=Fs ???6??_???????x?J????4? ?=Hs ?????@i???o?????+????????w J?????H~ ????i???Bs ???????J???? ? <_???? ?+??W?_??? ?+O??OW??C<? h_??????O?????/oW?^?P????;(oW^NO4????>7i?7???5 ? ????V??~$???q\>???F~;??????8n??ok_?S???=??@?????JkqBs??A ??H~ ????????2?4?} ?? ?0?????z??????!????z|?????qk`Ds??_??8.? ????qkPBs_q? ???h?+ ?>Os?s??V`w?? ??????1.??h?+??=@s????o?Gh?+?? *=??????????RVZ?????=j ??S???J????[???s] ????? +>Ds_y<????k.o???C4?#?u|?????C4?w?J ??W??o?Eh?????? ???8? h?+?a??{+>Ds??_??8>T|??????$/??????? ??C4??? |Ds_y??? ?? ???c???? ????V:?y??J?k*????? ???3X|FP?+?<??H~;?? ?? ???c o>? ???<? ???????M4???;h?Ih?+???oX?Cs?v? |?$?????)U9?K $????K?? ??W??*#4? ?? j?Ds_yyt????? ????[?#????:>???Q?#??????????gi?k h! ????X! ????Ka?M??? $4?k.???T?<.??h?+oo??????????????=????<zDs_y;???????:>? ????????@e?h?+ow?~ E??k??}k?Bs_q??? ?? ????w1? ??B4???'?????W ?|?? ???)?&?}??&?? ? D4?????>?????C4????[?!????????}?q{??%???<?p?8x ~zS?~?_* ?}?????#4? ?W?u ?? ????Ds_y?>??W^?=???<?w?=? ?x??? ?}???VZ?????Ms_?O??? v ???? ??C6?#?u|?<n?~c?q*=Bs????}???Q? ?? i?0?;?? ?? i?@?{???c ? 4w?@s ??A4w?@s ??a4w ? Hk?"o?A*^A4w?@s ???6??_???C?zA\?[ ? Hs????4w ? Hs ??q?????~?? i?0?;?? ????:>?q? C??? ??????4w ? G???4w ? Fs ??a4w ? Hs??u|?K B?;??? h?@?;?? ???h?@?;?? ????z|?????4w ?????1??? e4w ? Fs ??a4w ? Ds? 4w ? Fs ???4??_??? ?? i?@?{?o???c4w ? Fs??N_Xz?JKai)+-Hs ??{~ ??|* Ds? 4w ? Hs????4w ? hc????zy??? ????:>???Y?A?;?? ???h?@?;?? ???h?@?;??? h?@?;?????? ? ?? ?????+??}+=Fs ???4w ????????? ?????1n??J????4w ? '???4w ? ??g?? ?1??+/4????o?g????u|??????4w ? Ds?M??=?H?4w ? Hs????4w ? '??? ? !?? ????:>?? "? Fs ???4w ??????[ ?@???h?@?;?? ?? i?@?{???c o ?8???h? ?; ???h?8?? ???h?@?;?? ????v|?<? R????/???h?@?;?? ?? i?0?;?? ?? i?0?;?? ???????0?? ??$P C?;?? ?? i?W?m??~?? ?? 
i?0?{??????fk? .? i?0?;?? ?? i?@?;??? i?0?;?? ?? i?@ ????>??????4w ? Hs ??a4w ? Fs ??a4w ? Hs??u|???Oo???a??? ?? i?0?;?? ?? i?=???#? Fs ??a4w ? Hk??? ?x??? ? Fs ??a4w ? G??a4w ? Fs ???6??_??????76 ??c4w ? Fs ???4w ??Hs????4w ??Hs'??{~ ??|??? ?;??N??? ?;??N?? i?DZ? y{ R?J??? ?;??N????:?^ ?? ?R??h?D?;??N???h?D?;????4?[??1??N?? i?D?{???c ?O0??????? ??Hs'??y ??Hs'??i4w"??Fs'???4??_???4 ???h?<??N???h?D?;??N?? i?D?{????? ??Hs'??{~ ??|kPFs'??i4w"??Fs'??I4w?@s'??i4w"??Hs??u|?K?A?;??N????v|+=Fs'??i4w ????SVZ
They should be plain text, with at most the odd ASCII escape, as these are standard combined-log-format lines being pumped out by Nginx.
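As a quick sanity check on the claim above, a clean combined-format line is pure ASCII, so forcing a captured line to ASCII should still yield a valid string. A minimal Ruby sketch (the sample line is illustrative, not taken from the real logs):

```ruby
# A well-formed Nginx combined-log line is plain ASCII. If force-encoding a
# captured line to ASCII produces an invalid string, the bytes were already
# mangled before Logstash read them. Sample line is made up for illustration.
line = '203.0.113.7 - - [05/Nov/2013:11:49:00 +0000] "GET / HTTP/1.1" 200 612 "-" "curl/7.29.0"'
puts line.dup.force_encoding("ASCII").valid_encoding?  # => true
```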
Incomplete
Details
Assignee
Logstash Developers
Reporter
john Phan
Labels
Affects versions
Created November 5, 2013 at 11:54 AM
Updated February 25, 2015 at 5:46 AM
Resolved February 25, 2015 at 5:46 AM
I am using logstash V1.2.1.
I have installed s3fs 1.71 using the following guide.
http://juliensimon.blogspot.co.uk/2013/08/howto-aws-mount-s3-buckets-from-linux.html
This program allows me to mount an S3 bucket onto a system as though it were an NFS drive.
After mounting this, I use the file input so I can process log files from Apache. The main reason I don't use the S3 input is outlined in ticket LOGSTASH-1548.
However, when trying to process the files, nothing occurs. With -vv output turned on on the system trying to process these files, all I get is the following information: it gets stuck, repeating the same line indefinitely.
Command run to start the process:
/usr/bin/java -Xmx512m -jar /opt/logstash/logstash.jar agent -f /opt/logstash/logstash.conf.bck6 --log /opt/logstash/logstash-test.log --pluginpath /opt -vv
Command-line output:
Sending logstash logs to /opt/logstash/logstash-test.log. Adding plugin path {:path=>"/opt", :level=>:debug, :file=>"/opt/logstash/logstash-1.2.1-flatjar.jar!/logstash/agent.rb", :line=>"275", :method=>"configure_plugin_path"} Reading config file {:file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/agent.rb", :level=>:debug, :line=>"290", :method=>"load_config"} Compiled pipeline code: @inputs = [] @filters = [] @outputs = [] @input_file_1 = plugin("input", "file", LogStash::Util.hash_merge_many({ "add_field" => {"Component" => "Nginx"} }, { "add_field" => {"log_file" => "Access"} }, { "add_field" => {"Environment" => "TEST"} }, { "sincedb_path" => "/opt/logstash/.sincedb_testing" }, { "codec" => plugin("codec", "plain", LogStash::Util.hash_merge_many({ "charset" => "ASCII" })) }, { "type" => "NginxL" }, { "path" => "/mnt/s3/ec2/test-hostname/mnt/www/logs/access*" })) @inputs << @input_file_1 @filter_grok_2 = plugin("filter", "grok", LogStash::Util.hash_merge_many({ "type" => "NginxL" }, { "pattern" => "%{IPORHOST:c-ip} %{USER:ident} %{USER:auth} \\[%{HTTPDATE:timestamp}\\] \\\"(?:%{WORD:cs-method} %{NOTSPACE:cs-uri-stem}(?: HTTP/%{NUMBER:httpversion})?|-)\\\" %{NUMBER:sc-status} (?:%{NUMBER:sc-bytes}|-) ?(?:%{QS:referrer}|-)? ?(?:%{QS:User-Agent})? ?(?:%{QS:cookies})?" })) @filters << @filter_grok_2 @filter_date_3 = plugin("filter", "date", LogStash::Util.hash_merge_many({ "type" => "NginxL" }, { "match" => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"] })) @filters << @filter_date_3 @output_elasticsearch_4 = plugin("output", "elasticsearch", LogStash::Util.hash_merge_many({ "host" => "127.0.0.1" })) @outputs << @output_elasticsearch_4 @filter_func = lambda do |event, &block| extra_events = [] @logger.info? 
&& @logger.info("filter received", :event => event) newevents = [] extra_events.each do |event| @filter_grok_2.filter(event) do |newevent| newevents << newevent end end extra_events += newevents @filter_grok_2.filter(event) do |newevent| extra_events << newevent end if event.cancelled? extra_events.each(&block) return end newevents = [] extra_events.each do |event| @filter_date_3.filter(event) do |newevent| newevents << newevent end end extra_events += newevents @filter_date_3.filter(event) do |newevent| extra_events << newevent end if event.cancelled? extra_events.each(&block) return end extra_events.each(&block) end @output_func = lambda do |event, &block| @logger.info? && @logger.info("output received", :event => event) @output_elasticsearch_4.receive(event) end {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb", :line=>"24", :method=>"initialize"} config LogStash::Codecs::Plain/@charset = "ASCII" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! 
For more information on plugin milestones, see http://logstash.net/docs/1.2.1/plugin-milestones {:level=>:warn, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"202", :method=>"validate_milestone"} config LogStash::Inputs::File/@add_field = {"Component"=>"Nginx", "log_file"=>"Access", "Environment"=>"TEST"} {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@sincedb_path = "/opt/logstash/.sincedb_testing" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain charset=>"ASCII"> {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@type = "NginxL" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@path = ["/mnt/s3/ec2/test-hostname/mnt/www/logs/access*"] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@debug = false {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@stat_interval = 1 {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@discover_interval = 15 {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Inputs::File/@sincedb_write_interval = 15 {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config 
LogStash::Inputs::File/@start_position = "end" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} You are using a deprecated config setting "type" set in grok. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. You can achieve this same behavior with the new conditionals, like: `if [type] == "sometype" { grok { ... } }`. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"type", :plugin=><LogStash::Filters::Grok --->, :level=>:warn, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"62", :method=>"config_init"} You are using a deprecated config setting "pattern" set in grok. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"pattern", :plugin=><LogStash::Filters::Grok --->, :level=>:warn, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"62", :method=>"config_init"} config LogStash::Filters::Grok/@type = "NginxL" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@pattern = ["%{IPORHOST:c-ip} %{USER:ident} %{USER:auth} \\[%{HTTPDATE:timestamp}\\] \\\"(?:%{WORD:cs-method} %{NOTSPACE:cs-uri-stem}(?: HTTP/%{NUMBER:httpversion})?|-)\\\" %{NUMBER:sc-status} (?:%{NUMBER:sc-bytes}|-) ?(?:%{QS:referrer}|-)? ?(?:%{QS:User-Agent})? 
?(?:%{QS:cookies})?"] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@tags = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@exclude_tags = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@add_tag = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@remove_tag = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@add_field = {} {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@remove_field = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@match = {} {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@patterns_dir = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@drop_if_match = false {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@break_on_match = true {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@named_captures_only = true {:level=>:debug, 
:file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@keep_empty_captures = false {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@singles = true {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Grok/@overwrite = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} You are using a deprecated config setting "type" set in date. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. You can achieve this same behavior with the new conditionals, like: `if [type] == "sometype" { grok { ... } }`. If you have any questions about this, please visit the #logstash channel on freenode irc. 
{:name=>"type", :plugin=><LogStash::Filters::Date --->, :level=>:warn, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"62", :method=>"config_init"} config LogStash::Filters::Date/@type = "NginxL" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@match = ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@tags = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@exclude_tags = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@add_tag = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@remove_tag = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@add_field = {} {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Filters::Date/@remove_field = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Codecs::Plain/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@host = "127.0.0.1" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config 
LogStash::Outputs::ElasticSearch/@type = "" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@tags = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@exclude_tags = [] {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@document_id = nil {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@port = "9300-9400" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@embedded = false {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@embedded_http_port = "9200-9300" {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@max_inflight_requests = 50 {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@flush_size = 100 {:level=>:debug, 
:file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"} config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1 {:level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/config/mixin.rb", :line=>"98", :method=>"config_init"}
Log output from logstash-test.log once the server starts up correctly:
log4j, [2013-11-05T11:49:29.420] INFO: org.elasticsearch.node: [J2] started {:timestamp=>"2013-11-05T11:49:23.432000+0000", :message=>"Adding type with date config", :type=>"NginxL", :field=>"timestamp", :format=>"dd/MMM/yyyy:HH:mm:ss Z", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/filters/date.rb", :line=>"168", :method=>"setupMatcher"} {:timestamp=>"2013-11-05T11:49:23.435000+0000", :message=>"Pipeline started", :level=>:info, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb", :line=>"69", :method=>"run"} {:timestamp=>"2013-11-05T11:49:23.580000+0000", :message=>"log4j java properties setup", :log4j_level=>"DEBUG", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/logging.rb", :line=>"86", :method=>"setup_log4j"} {:timestamp=>"2013-11-05T11:49:23.638000+0000", :message=>"New ElasticSearch output", :cluster=>nil, :host=>"127.0.0.1", :port=>"9300-9400", :embedded=>false, :level=>:info, :file=>"logstash/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch.rb", :line=>"114", :method=>"register"} {:timestamp=>"2013-11-05T11:49:34.511000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:49:49.517000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:50:04.523000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:50:19.536000+0000", :message=>"_discover_file_glob: 
/mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:50:34.542000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:50:49.549000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:51:04.577000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:51:19.584000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:51:34.590000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:51:49.596000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", :line=>"117", :method=>"_discover_file"} {:timestamp=>"2013-11-05T11:52:04.602000+0000", :message=>"_discover_file_glob: /mnt/s3/ec2/test-hostname/mnt/www/logs/access*: glob is: []", :level=>:debug, :file=>"logstash/logstash-1.2.1-flatjar.jar!/filewatch/watch.rb", 
:line=>"117", :method=>"_discover_file"}
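The key line in that output is the repeating "glob is: []": the file input's discovery pass (_discover_file) is essentially a Dir.glob over the configured path, and it is matching no files at all. A minimal sketch of the same check, runnable in irb on the host (the /tmp path is an example; substitute the real pattern):

```ruby
require "fileutils"

# Reproduce what the file input's discovery pass does: expand the glob.
# An empty array means there is nothing for the input to tail at all --
# worth verifying directly when the path sits on a FUSE mount like s3fs.
FileUtils.mkdir_p("/tmp/glob-demo")
File.write("/tmp/glob-demo/access.log", "")
p Dir.glob("/tmp/glob-demo/access*")  # => ["/tmp/glob-demo/access.log"]
```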
No action seems to be taken by logstash to process the log files. Can anyone assist me in working out why this is?
input {
  file {
    add_field => { "Component" => "Nginx" }
    add_field => { "log_file" => "Access" }
    add_field => { "Environment" => "TEST_DATA" }
    sincedb_path => "/opt/logstash/.sincedb_testing"
    codec => plain { charset => "ASCII" }
    type => "NginxL"
    path => "/mnt/s3/ec2/blanked-for-security/mnt/www/logs/access*"
  }
}

filter {
  grok {
    type => "NginxL"
    pattern => "%{IPORHOST:c-ip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:cs-method} %{NOTSPACE:cs-uri-stem}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:sc-status} (?:%{NUMBER:sc-bytes}|-) ?(?:%{QS:referrer}|-)? ?(?:%{QS:User-Agent})? ?(?:%{QS:cookies})?"
  }
  date {
    type => "NginxL"
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    host => "127.0.0.1"
  }
}
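To rule out the grok pattern itself, the leading fields can be approximated with a plain Ruby regexp and exercised against a sample line. This is a rough stand-in for the real grok patterns, not the grok library, and the sample line is made up:

```ruby
# Rough plain-regexp stand-in for the start of the grok pattern above
# (%{IPORHOST:c-ip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] ...).
COMBINED = %r{
  \A(?<c_ip>\S+)\s(?<ident>\S+)\s(?<auth>\S+)\s
  \[(?<timestamp>[^\]]+)\]\s
  "(?<method>\S+)\s(?<uri>\S+)(?:\sHTTP/(?<httpversion>[\d.]+))?"\s
  (?<status>\d+)\s(?<bytes>\d+|-)
}x
line = '203.0.113.7 - - [05/Nov/2013:11:49:00 +0000] "GET /index.html HTTP/1.1" 200 612'
m = COMBINED.match(line)
puts m[:timestamp]  # => 05/Nov/2013:11:49:00 +0000
```

If a real line from the mounted logs fails to match even this simplified pattern, the problem is the bytes in the file, not the grok config.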