
Failed to output to elasticsearch_http

Description

Hi,
I'm trying the elasticsearch_http output plugin; the config looks like this:

elasticsearch_http {
    host => "10.42.217.97"
    port => 9200
    flush_size => 300
    idle_flush_time => 5
}
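For reference, a minimal sanity check that the Elasticsearch bulk endpoint itself accepts writes from this host, independent of the plugin's chunked encoding, might look like this (plain Ruby sketch; the index/type names are made up, only the host and port come from the config above):

require 'net/http'

# POST two bulk-API lines (action metadata + document) with a normal
# Content-Length body instead of chunked transfer encoding.
# "logstash-test" / "logs" are example names, not taken from this report.
body = %Q({"index":{"_index":"logstash-test","_type":"logs"}}\n) +
       %Q({"message":"hello"}\n)

response = Net::HTTP.start('10.42.217.97', 9200) do |http|
  http.post('/_bulk', body)
end
puts response.code
puts response.body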

After starting the Logstash agent, I got this error in the log:

{:timestamp=>"2013-09-17T00:31:58.860000-0700",
 :message=>"Failed to flush outgoing items",
 :outgoing_count=>300,
 :exception=>#<Errno::EPIPE: Broken pipe - Broken pipe>,
 :backtrace=>[
   "org/jruby/RubyIO.java:1297:in `syswrite'",
   "jar:file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/connection.rb:210:in `write'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/protocol.rb:83:in `write_http_body_chunked'",
   "org/jruby/RubyArray.java:1617:in `each'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/protocol.rb:83:in `write_http_body_chunked'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/protocol.rb:64:in `write_http_body'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/request.rb:78:in `execute'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/agent.rb:313:in `execute'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/ftw/agent.rb:205:in `post!'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch_http.rb:82:in `post'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch_http.rb:77:in `flush'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:219:in `buffer_flush'",
   "org/jruby/RubyHash.java:1332:in `each'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:216:in `buffer_flush'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:193:in `buffer_flush'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/stud/buffer.rb:159:in `buffer_receive'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch_http.rb:59:in `receive'",
   "(eval):74:in `initialize'",
   "org/jruby/RubyProc.java:255:in `call'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb:247:in `output'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb:212:in `outputworker'",
   "file:/usr/share/logstash/indexer/logstash-1.2.1-flatjar.jar!/logstash/pipeline.rb:140:in `start_outputs'"],
 :level=>:warn}

The corresponding error in the Elasticsearch log:

[2013-09-17 00:31:46,042][WARN ][http.netty ] [Collector] Caught exception while handling client http traffic, closing connection [id: 0x7bcabc3d, /10.42.219.99:24448 => /10.42.217.97:9200]
java.lang.NumberFormatException: For input string: ""
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Integer.parseInt(Integer.java:504)
    at org.elasticsearch.common.netty.handler.codec.http.HttpMessageDecoder.getChunkSize(HttpMessageDecoder.java:621)
    at org.elasticsearch.common.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:318)
    at org.elasticsearch.common.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:101)
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500)
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:74)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:109)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:90)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:724)
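From that trace, Netty's HttpMessageDecoder.getChunkSize is handed an empty string where a chunk-size line should be, which suggests the chunked body written by FTW's write_http_body_chunked is malformed; Elasticsearch then closes the connection, which would explain the Errno::EPIPE on the Logstash side above. For context, a rough sketch of the chunked framing the decoder expects (plain Ruby, hypothetical payload):

# Each chunk is "<hex size>\r\n<data>\r\n"; the body ends with "0\r\n\r\n".
# An empty size line (a bare "\r\n" where the hex size should be) would be
# what makes Integer.parseInt("") fail inside Netty's getChunkSize.
def chunked_body(*parts)
  framed = parts.map { |p| format("%x\r\n%s\r\n", p.bytesize, p) }.join
  framed + "0\r\n\r\n"
end

puts chunked_body(%Q({"index":{}}\n), %Q({"message":"hello"}\n))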

Any ideas?

Environment

None

Status

Assignee

Jordan Sissel

Reporter

Zz Chen

Fix versions

Affects versions

1.2.1

Priority