
Error writing to elasticsearch

Description

We are currently running logstash 1.1.13 on Ubuntu, with elasticsearch 0.90.2 on a two-node cluster.

When we write traces containing characters such as é or í, we find the following error in the logstash log file:

{:message=>"Error writing to elasticsearch", :response=>#<FTW::Response:0x6f229f11 @headers=HTTP::Headers::<{"content-type"=>"text/plain; charset=UTF-8", "content-length"=>"70"}>, @body=#<FTW::Connection(@4046) @destinations=["xxx.xxx.xxx.xxx:9200"] @connected=true @remote_address="10.35.167.205" @secure=false>, @status=400, @logger=#<Cabin::Channel:0x22956f6b @subscriber_lock=#<Mutex:0xc5eb8a>, @metrics=#<Cabin::Metrics:0x41eab16b @channel=#<Cabin::Channel:0x22956f6b ...>, @metrics={}, @metrics_lock=#<Mutex:0x1726099c>>, @data={}, @subscribers={}, @level=:info>, @reason="Bad Request", @version=1.1>, :response_body=>"No handler found for uri [/logstash-2013.08.15/test] and method [GET]", :level=>:error}
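The response is an HTTP 400 and, oddly, the response_body complains about a GET against the index URI, which suggests that what elasticsearch received on the wire was not the request logstash meant to send. To rule out elasticsearch itself, one can index a document containing a multibyte character directly, bypassing logstash; a minimal Ruby sketch (host, index, and type here are placeholders taken from the log above):

require "net/http"
require "uri"
require "json"

# Index one document with a multibyte character straight into
# elasticsearch; if this succeeds, the cluster handles UTF-8 fine
# and the problem is in the logstash output path.
uri = URI("http://localhost:9200/logstash-2013.08.15/test")
doc = { "@message" => "El usuario se ha registrado con éxito" }.to_json

request = Net::HTTP::Post.new(uri.path, "Content-Type" => "application/json")
request.body = doc  # Net::HTTP derives Content-Length from the body's bytes

response = Net::HTTP.start(uri.host, uri.port) { |http| http.request(request) }
puts "#{response.code} #{response.body}"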

The elasticsearch log file shows the following error message:

[2013-08-14 13:53:59,652][DEBUG][action.index ] [xxx.xxx.xxx.xxx] [logstash-2013.08.15][2], node[kAqv4xJHTB6uyWCmcWKyrw], [P], s[STARTED]: Failed to execute [index {[logstash-2013.08.15][test][_IpC9zxLTLiOvWFc4XZBow], source[{"@source":"file://control-node/tmp/test/test.log","@tags":["test"],"@fields":{"logLevel":["INFO"],"petitionID":["113002"],"userID":["Unknown"],"type":["Server"],"received":["2013-08-14T11:53:45.058Z"]},"@timestamp":"2013-08-15T06:27:58.615Z","@source_host":"control-node","@source_path":"/tmp/test/test.log","@message":"2013-08-15 08:27:58,615 [Server] INFO - [113002] [Unknown] [Server] El usuario se ha registrado con éxito test@gmail.com nil null","@type":"test"]}]
org.elasticsearch.index.mapper.MapperParsingException: Failed to parse
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:509)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:430)
    at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:297)
    at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:211)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:532)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:430)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:679)
Caused by: org.elasticsearch.common.jackson.core.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: [B@410f87d7; line: 1, column: 0]) at [Source: [B@410f87d7; line: 1, column: 979]
    at org.elasticsearch.common.jackson.core.JsonParser._constructError(JsonParser.java:1378)
    at org.elasticsearch.common.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:599)
    at org.elasticsearch.common.jackson.core.base.ParserMinimalBase._reportInvalidEOF(ParserMinimalBase.java:532)
    at org.elasticsearch.common.jackson.core.base.ParserBase._handleEOF(ParserBase.java:479)
    at org.elasticsearch.common.jackson.core.json.UTF8StreamJsonParser._skipWSOrEnd(UTF8StreamJsonParser.java:2512)
    at org.elasticsearch.common.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:626)
    at org.elasticsearch.common.xcontent.json.JsonXContentParser.nextToken(JsonXContentParser.java:48)
    at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:461)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:486)
    ... 8 more
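The "Unexpected end-of-input: expected close marker for OBJECT" at column 979 is characteristic of a truncated request body: elasticsearch reads exactly Content-Length bytes, so a client that computes Content-Length from the character count rather than the byte count comes up one byte short for every multibyte UTF-8 character such as é. Note that the source elasticsearch logged above appears to stop just before the JSON's final closing brace, which is consistent with that. A minimal illustration of the mismatch in Ruby (the language of the FTW client in the first log):

# "é" occupies one character but two bytes in UTF-8, so sizing an HTTP
# body with String#length instead of String#bytesize understates
# Content-Length by one byte per such character, and the server sees
# truncated JSON.
message = "El usuario se ha registrado con éxito"
puts message.length    # => 37 (characters)
puts message.bytesize  # => 38 (bytes)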

logstash.conf

input {
  file {
    'debug' => false
    'path' => ['/tmp/test/test.log']
    'tags' => 'test'
    'type' => 'test'
    'discover_interval' => 0
    'start_position' => 'beginning'
  }
}
filter {
  multiline {
    'pattern' => '(^\sat)|(^java)|(^com)|(^Cause)|(^\s+.)|(^org)'
    'tags' => 'test'
    'type' => 'test'
    'what' => 'previous'
  }
  grok {
    'add_field' => ['received', '%{@timestamp}']
    'pattern' => '%{TIMESTAMP_ISO8601:date} \[%{DATA}\] %{USERNAME:logLevel}( )+- (\[%{INT:petitionID}\] ((\[%{USERNAME:userID}\] \[%{USERNAME:type}\] )|(\[%{USERNAME:server}\] )))?((?<iphone>%{USERNAME}\.09)|%{GREEDYDATA})'
    'tags' => 'test'
    'type' => 'test'
  }
  date {
    'match' => ['date', 'yyyy-MM-dd HH:mm:ss,SSS']
    'tags' => 'test'
    'type' => 'test'
  }
  mutate {
    'remove' => ['date']
    'tags' => 'test'
    'type' => 'test'
  }
}
output {
  elasticsearch_http {
    host => "x.x.x.x"
    flush_size => 1
  }
}
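One thing worth trying in the config itself is making the input character set explicit. This is only a hedged sketch: it assumes the input-level charset option is available in this 1.1.x release (UTF-8 is its documented default, so this helps only if the file is being read under a different locale):

input {
  file {
    'charset' => 'UTF-8'  # assumption: input-level charset option exists here
    'debug' => false
    'path' => ['/tmp/test/test.log']
    'tags' => 'test'
    'type' => 'test'
    'discover_interval' => 0
    'start_position' => 'beginning'
  }
}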

We have verified that the data file is UTF-8 encoded, and we have also tried running logstash with java -Dfile.encoding=UTF8, with the same results.
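For completeness, the file's encoding can be checked from Ruby itself rather than trusting the editor; a minimal check using the path from the config above:

# Read the raw bytes and ask Ruby whether they form valid UTF-8; an
# invalid byte sequence here would point at the log producer rather
# than at logstash or elasticsearch.
data = File.read("/tmp/test/test.log", mode: "rb").force_encoding("UTF-8")
puts data.valid_encoding? ? "valid UTF-8" : "invalid UTF-8"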

How can we get these traces written to elasticsearch? Any suggestions?

Environment

None

Status

Assignee

Logstash Developers

Reporter

Francisco Rubio

Labels

Affects versions

1.1.13

Priority