logstash doesn't insert events in the new daily index anymore
Description
Activity
Archa May 15, 2013 at 12:33 PM
Hi today it's a beautiful day where everything is ok, logstash creates its daily index and currently continues to process logs !
It seems that the problem was the memory allowed to ES, it gave 1 Go, but now it's 2 Go.
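For reference, this is roughly how the heap was raised. A minimal sketch for ES 0.20.x, which reads the ES_HEAP_SIZE environment variable in bin/elasticsearch.in.sh; the install path below is a placeholder, not the real one:

```shell
# Give elasticsearch a 2 GB heap before starting it
# (ES 0.20.x picks up ES_HEAP_SIZE from bin/elasticsearch.in.sh)
export ES_HEAP_SIZE=2g
/path/to/elasticsearch/bin/elasticsearch
```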
Thank you very much Philippe for the help.
Archa May 14, 2013 at 1:19 PM
Hey, thank you! I spent a lot of time getting a mapping like that for varnish!
There is nothing in the elasticsearch log.
"you are on localhost, but who knows..."
And you are right! Thank you for this suggestion: I checked, and indeed they run with DIFFERENT java versions... (logstash on java 7 and ES on java 6).
I switched ES to java 7.
Now wait and see
Thank you again !
Philippe Weber May 14, 2013 at 12:15 PM
It is recommended to validate that the ES version AND the java version are the same on both ends; you are on localhost, but who knows...
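A quick way to compare the two JVMs is to extract the major version from `java -version` output on each side; the helper below is a hypothetical sketch (the version strings are examples):

```shell
# Extract the major version (e.g. "1.7") from a full java version
# string like "1.7.0_25", so two hosts can be compared easily
java_major() {
  echo "$1" | cut -d. -f1-2
}

# Example: the two versions seen in this thread do not match
java_major "1.7.0_25"   # logstash side
java_major "1.6.0_45"   # elasticsearch side
```

Run `java -version` on both the logstash and the elasticsearch side and feed the results through the same comparison.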
Philippe Weber May 14, 2013 at 12:08 PM
That's a nice default mapping.
No other clues about the error in the elasticsearch logs?
Archa May 14, 2013 at 11:31 AM
Ok, thank you. Here is my mapping configuration (it is dynamic):
rsyslog:
  properties:
    '@fields':
      dynamic: "true"
      properties:
        agent:
          type: "string"
        backend:
          type: "string"
        bytes:
          type: "string"
        cached:
          type: "string"
        clientip:
          type: "string"
        httpversion:
          type: "string"
        nb_hit:
          type: "string"
        pid:
          type: "string"
        program:
          type: "string"
        referrer:
          type: "string"
        request:
          type: "string"
        response:
          type: "string"
        verb:
          type: "string"
    '@message':
      type: "string"
    '@source':
      type: "string"
    '@source_host':
      type: "string"
    '@source_path':
      type: "string"
    '@tags':
      type: "string"
    '@timestamp':
      type: "date"
      format: "dateOptionalTime"
    '@type':
      type: "string"
Hi,
I'm using logstash 1.1.9 with elasticsearch 0.20.5.
For two weeks logstash created one index per day with no problem, but for the last 3 or 4 days, after logstash creates the new index at midnight, it no longer inserts events into ES.
To test, I configured logstash to create an index per hour, and it works!
Restarting logstash solves my issue, but I can't figure out why I have this problem... any idea?
Here is my logstash.conf output :
elasticsearch {
  host => "127.0.0.1"
  cluster => "logcluster"
  node_name => "logstash-client"
  index => "logstash-%{+YYYY.MM.dd}"
}
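The `%{+YYYY.MM.dd}` sprintf pattern in the index name is expanded per event from its timestamp, which is why a fresh index appears each day at midnight. An equivalent name can be computed with GNU date (the `-d` flag is an assumption about your date(1)):

```shell
# Compute the daily index name logstash would use for a given UTC day
date -u -d "2013-05-14" +"logstash-%Y.%m.%d"
```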
Consequently, I get a "Parse Failure [No mapping found for [@timestamp] in order to sort on]" in Kibana, because the index is empty...
Thank you for the help,
Archa
EDIT:
I let logstash create an index per hour today, and sadly, the same problem occurred 7 hours later... restarting logstash solves my issue...