class LogStash::Outputs::SolrHTTP
This output lets you index and store your logs in Solr. If you want to get started quickly, use Solr 4.4 or above in schemaless mode, which will try to guess your fields automatically. To turn that on, you can use the example included in the Solr archive:
[source,shell]
----
tar zxf solr-4.4.0.tgz
cd example
mv solr solr_                     # back up the existing sample conf
cp -r example-schemaless/solr/ .  # put the schemaless conf in place
java -jar start.jar               # start Solr
----
You can learn more at lucene.apache.org/solr/[the Solr home page].
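In a Logstash pipeline, this output would be enabled in the `output` section. The option names below are inferred from the instance variables used in `register` and `flush` (`@solr_url`, `@flush_size`, `@idle_flush_time`, `@document_id`); the values are only illustrative, so treat this as a sketch rather than a definitive configuration:

```
output {
  solr_http {
    solr_url => "http://localhost:8983/solr"
    flush_size => 100
    idle_flush_time => 1
    # document_id => "%{my_id_field}"   # optional: derive the Solr id from a field
  }
}
```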
Public Instance Methods
flush(events, close=false)
[source,ruby]
----
# File lib/logstash/outputs/solr_http.rb, line 60
def flush(events, close=false)
  documents = []  # this is the array of hashes that we push to Solr as documents
  events.each do |event|
    document = event.to_hash()
    document["@timestamp"] = document["@timestamp"].iso8601  # make the timestamp ISO
    if @document_id.nil?
      document["id"] = UUIDTools::UUID.random_create  # add a unique ID
    else
      document["id"] = event.sprintf(@document_id)    # or use the one provided
    end
    documents.push(document)
  end
  @solr.add(documents)
rescue Exception => e
  @logger.warn("An error occurred while indexing: #{e.message}")
end
----
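The document-building step in `flush` can be exercised on its own: each event hash gets an ISO 8601 timestamp and an `id`, either generated or supplied. A minimal sketch of that logic, where `SecureRandom.uuid` stands in for `UUIDTools::UUID.random_create` (an assumption made only to keep the sketch dependency-free):

```ruby
require "time"          # adds Time#iso8601
require "securerandom"

# Build a Solr document from an event hash, mirroring the flush logic above.
# SecureRandom.uuid is a stand-in for UUIDTools::UUID.random_create.
def build_document(event, document_id = nil)
  document = event.dup
  document["@timestamp"] = document["@timestamp"].iso8601  # Solr expects ISO 8601
  document["id"] = document_id.nil? ? SecureRandom.uuid : document_id
  document
end

doc = build_document({ "@timestamp" => Time.utc(2014, 1, 2, 3, 4, 5),
                       "message"    => "hello" })
puts doc["@timestamp"]  # => "2014-01-02T03:04:05Z"
```

With no `document_id`, every document gets a fresh random id; passing one (as `event.sprintf(@document_id)` does in the real code) makes re-indexing the same event overwrite the same Solr document instead of duplicating it.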
receive(event)
[source,ruby]
----
# File lib/logstash/outputs/solr_http.rb, line 54
def receive(event)
  buffer_receive(event)
end
----
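Note that `receive` never talks to Solr directly: it only queues the event via `buffer_receive`, and the buffer calls `flush` once `:max_items` events have accumulated or `:max_interval` seconds have passed (see `buffer_initialize` in `register` below). A simplified stand-in for that buffer-then-flush pattern, not the real Logstash buffer code, and with time-based flushing omitted:

```ruby
# Minimal sketch of the buffer-then-flush pattern used by this output.
class TinyBuffer
  def initialize(max_items:, &flusher)
    @max_items = max_items
    @flusher   = flusher  # called with the batch, like this output's flush
    @events    = []
  end

  # Queue an event; flush once the batch is full.
  def receive(event)
    @events << event
    flush if @events.size >= @max_items
  end

  def flush
    return if @events.empty?
    @flusher.call(@events)
    @events = []
  end
end

batches = []
buffer = TinyBuffer.new(max_items: 2) { |events| batches << events.dup }
buffer.receive("a")
buffer.receive("b")   # batch of 2 reached: flush fires
buffer.receive("c")   # still buffered
puts batches.inspect  # => [["a", "b"]]
```

Batching like this trades a little latency for far fewer HTTP round trips to Solr, which is why `flush_size` and `idle_flush_time` are the knobs worth tuning for throughput.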
register()
[source,ruby]
----
# File lib/logstash/outputs/solr_http.rb, line 43
def register
  require "rsolr"
  @solr = RSolr.connect :url => @solr_url
  buffer_initialize(
    :max_items    => @flush_size,
    :max_interval => @idle_flush_time,
    :logger       => @logger
  )
end
----