  LOGSTASH-2247

Logstash crashes and Google BigQuery feature upgrade


    • Type: Support
    • Status: Answered
    • Resolution: Unresolved
    • Affects versions: 1.4.0, 1.4.3
    • Fix versions: None
    • Labels:


      1. Running Logstash, I get the following error over and over again:
      IOError: closed stream
      flush at org/jruby/RubyIO.java:2199
      size at org/jruby/RubyFile.java:1108
      method_missing at /opt/logstash/lib/logstash/outputs/google_bigquery.rb:565
      receive at /opt/logstash/lib/logstash/outputs/google_bigquery.rb:194
      handle at /opt/logstash/lib/logstash/outputs/base.rb:86
      worker_setup at /opt/logstash/lib/logstash/outputs/base.rb:78

      I'm trying to understand what I'm doing wrong.
      I've tested my configuration and everything seems more than fine,
      yet sometimes it works and sometimes it just fails for no apparent reason.
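The trace above points at a flush on an already-closed IO. As a minimal sketch (my reconstruction, not the plugin's actual code), this is the kind of race that produces exactly that `IOError`: one thread closes or removes the temp file while the output worker still holds the handle and tries to flush it:

```ruby
require "tempfile"

# Minimal reproduction of the "IOError: closed stream" from the trace:
# calling #flush on an IO that has already been closed (e.g. by the
# deleter thread) raises IOError in the worker that still holds it.
f = Tempfile.new("bq-upload")
f.write("some buffered event data")
f.close                 # simulates another thread closing the temp file
begin
  f.flush               # the output worker still flushes the old handle
rescue IOError => e
  puts e.message        # => "closed stream"
end
```

This matches the `flush at org/jruby/RubyIO.java` frame at the top of the trace, which suggests a lifecycle race on the temp file rather than a configuration mistake.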

      2. Google BigQuery output plugin
      I think it's a good idea to add a threshold attribute for allowed errors within a BigQuery upload job; you can see an example in the BigQuery portal.
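For context, BigQuery load jobs already expose such a threshold via the `maxBadRecords` field of `configuration.load` in the jobs API. A hypothetical plugin setting (the name `max_bad_records` below is my assumption, not an existing option) could pass it straight through to the job configuration:

```ruby
# Sketch of how an allowed-errors threshold could map onto the BigQuery
# load-job configuration. "maxBadRecords" is a real jobs.insert field;
# the plugin-side option name is hypothetical.
def build_load_job_config(table_id, max_bad_records)
  {
    "configuration" => {
      "load" => {
        "destinationTable" => { "tableId" => table_id },
        "maxBadRecords"    => max_bad_records,  # bad rows tolerated before the job fails
        "sourceFormat"     => "NEWLINE_DELIMITED_JSON"
      }
    }
  }
end

p build_load_job_config("logstash_events", 10)
```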

      3. Unknown bug in file handling within the Google BigQuery plugin
      Within the plugin there are two keys you can set:
      deleter_interval_secs - sets the interval, in seconds, for deleting the temp file from disk
      uploader_interval_secs - sets the interval, in seconds, between collecting data and uploading it to BigQuery

      I've noticed that even though the file had been uploaded and was fully written, once it is deleted per the deleter_interval_secs key, Logstash crashes because it can't find the file
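One way the deleter/uploader race described above could be avoided is sketched below. This is my assumption about a fix, not the plugin's actual code: the deleter only removes files the uploader has explicitly marked as finished, and tolerates files that are already gone:

```ruby
require "set"
require "tempfile"

# Hypothetical guard against the deleter racing the uploader:
# - only delete paths the uploader has marked as done ("uploaded" set)
# - tolerate an already-missing file instead of crashing
uploaded = Set.new          # paths the uploader has finished with
mutex    = Mutex.new        # protects the shared set across threads

def safe_delete(path, uploaded, mutex)
  mutex.synchronize do
    return false unless uploaded.include?(path)   # still being written/uploaded
    File.delete(path) if File.exist?(path)        # ignore already-deleted files
    uploaded.delete(path)
    true
  end
end

f = Tempfile.new("bq")
path = f.path
f.close
mutex.synchronize { uploaded << path }   # uploader marks the file as done
puts safe_delete(path, uploaded, mutex)  # => true
```

With a guard like this, the `deleter_interval_secs` timer firing early would simply skip files still in flight instead of pulling them out from under the uploader.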





              • Assignee: Logstash Developers (Inactive)
              • Reporter: Dave Ezrakhovich
              • Votes: 0
              • Watchers: 2
              • Created: