Elasticsearch and Logstash integration error
Description
Philippe Weber February 5, 2014 at 9:04 AM
No reply from the issue reporter.
Philippe Weber May 21, 2013 at 5:05 AM
Did you send enough sample data? elasticsearch_http buffers events up to flush_size (default 100) before sending them to Elasticsearch; see http://logstash.net/docs/1.1.12/outputs/elasticsearch_http#setting_flush_size
Also, just a clarification: even with Elasticsearch running as a separate process, you can still use the Logstash elasticsearch output.
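For quick tests with only a handful of events, a low flush_size forces each event out immediately instead of waiting for the 100-event buffer to fill. A minimal sketch of the output section (the host value is a placeholder, not from the original report):

```
output {
  elasticsearch_http {
    host => "127.0.0.1"
    port => 9200
    flush_size => 1   # flush every event; the default of 100 would buffer small test inputs
  }
}
```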
Incomplete
Details
Assignee: Philippe Weber
Reporter: rohit
Affects versions:
Created May 21, 2013 at 3:12 AM
Updated February 5, 2014 at 9:04 AM
Resolved February 5, 2014 at 9:04 AM
Hi,
this is my Logstash configuration file:
input {
  tcp {
    type => "dummy"
    port => 3333
  }
}
output {
  elasticsearch_http {
    host => "my ip address"
    port => 9200
  }
}
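(For reference, one way to check whether events reach the output stage at all in Logstash 1.1.x is to add a stdout output alongside elasticsearch_http; this sketch is not part of the original config, and the host value remains a placeholder:)

```
output {
  stdout {
    debug => true   # print each event to the console as it passes through
  }
  elasticsearch_http {
    host => "my ip address"
    port => 9200
  }
}
```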
I am actually running Elasticsearch as a separate process (version 0.20.5), so I am using elasticsearch_http as the output. I am running:
1) java -jar logstash-1.1.9-monolithic.jar agent -f logstash-simple.conf
2) ./elasticsearch -f
as two separate processes.
Now, when I netcat any file to it (nc localhost 3333 < dummyfile1.txt), Elasticsearch does not receive it. When I run curl -s http://localhost:9200/_status?pretty=true | grep logstash, the command returns nothing; I cannot see any indices of the form logstash-2013.*.*.
Why is Logstash unable to pump the data into Elasticsearch, and why is Elasticsearch not receiving it?
P.S.: I cannot see any error messages either. Where can I find Logstash's logs so I can figure out what is happening? :)
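On the log question: the Logstash 1.1.x agent accepts a --log flag to write to a file and -v/-vv flags for verbose output (a hedged sketch; the log path is illustrative, not from the original report):

```
java -jar logstash-1.1.9-monolithic.jar agent -f logstash-simple.conf \
  --log /tmp/logstash.log -vv   # -vv enables debug-level output
```

Without --log, Logstash writes to stdout/stderr of the terminal it was started from.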