Logstash-Elasticsearch-Kibana Setup on Ubuntu


I am trying to set up the ELK stack on an Ubuntu sandbox and am stuck on an issue: Logstash is not sending data to Elasticsearch. I have been following the Elasticsearch documentation.

Kibana and Elasticsearch connectivity looks fine, but I think Kibana is reporting that it can't find any data. I have spent a couple of hours trying to figure this out, with no luck...

I would appreciate any help fixing this issue. Thank you very much!

Here are the setup details.

Logstash setup:

sirishg@sirishg-vm:/u02/app/logstash-2.1.1/bin$ ./logstash -f /u02/app/logstash-2.1.1/first-pipeline.conf
Settings: Default filter workers: 1
Logstash startup completed

first-pipeline.conf:

    # The # character at the beginning of a line indicates a comment.
    # Use comments to describe your configuration.
    input {
        file {
            path => "/u02/app/logstash-tutorial-dataset.log"
            start_position => beginning
        }
    }
    filter {
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        geoip {
            source => "clientip"
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]
        }
        stdout {
            codec => rubydebug
        }
    }
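
(Not part of the original post.) As a quick sanity check, the pipeline file can be validated before starting Logstash; Logstash 2.x supports a --configtest flag for this, and --debug for verbose output once the syntax check passes. A minimal sketch, assuming the paths shown above:

    # Sketch: validate the pipeline definition, then run it with verbose output.
    cd /u02/app/logstash-2.1.1/bin
    ./logstash -f /u02/app/logstash-2.1.1/first-pipeline.conf --configtest
    ./logstash -f /u02/app/logstash-2.1.1/first-pipeline.conf --debug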

Elasticsearch setup:

Health check report:

{"cluster_name":"my-application","status":"yellow","timed_out":false,"number_of_nodes":1,"number_of_data_nodes":1,"active_primary_shards":1,"active_shards":1,"relocating_shards":0,"initializing_shards":0,"unassigned_shards":1,"delayed_unassigned_shards":0,"number_of_pending_tasks":0,"number_of_in_flight_fetch":0,"task_max_waiting_in_queue_millis":0,"active_shards_percent_as_number":50.0} 

Startup logs:

sirishg@sirishg-vm:/u02/app/elasticsearch-2.1.1/bin$ ./elasticsearch
[2016-01-16 18:17:36,591][INFO ][node                     ] [node-1] version[2.1.1], pid[3596], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-16 18:17:36,594][INFO ][node                     ] [node-1] initializing ...
[2016-01-16 18:17:36,798][INFO ][plugins                  ] [node-1] loaded [], sites []
[2016-01-16 18:17:36,907][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
[2016-01-16 18:17:43,349][INFO ][node                     ] [node-1] initialized
[2016-01-16 18:17:43,350][INFO ][node                     ] [node-1] starting ...
[2016-01-16 18:17:43,693][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2016-01-16 18:17:43,713][INFO ][discovery                ] [node-1] my-application/8bftdwzcszanc9_p2vyyvw
[2016-01-16 18:17:46,878][INFO ][cluster.service          ] [node-1] new_master {node-1}{8bftdwzcszanc9_p2vyyvw}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-16 18:17:46,980][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2016-01-16 18:17:46,991][INFO ][node                     ] [node-1] started
[2016-01-16 18:17:47,318][INFO ][gateway                  ] [node-1] recovered [1] indices into cluster_state
[2016-01-16 18:20:03,866][INFO ][rest.suppressed          ] /logstash-*/_mapping/field/* Params: {ignore_unavailable=false, allow_no_indices=false, index=logstash-*, include_defaults=true, fields=*, _=1452986403826}
[logstash-*] IndexNotFoundException[no such index]
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:636)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:133)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:77)
    at org.elasticsearch.action.admin.indices.mapping.get.TransportGetFieldMappingsAction.doExecute(TransportGetFieldMappingsAction.java:57)
    at org.elasticsearch.action.admin.indices.mapping.get.TransportGetFieldMappingsAction.doExecute(TransportGetFieldMappingsAction.java:40)
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:70)
    at org.elasticsearch.client.node.NodeClient.doExecute(NodeClient.java:58)

Kibana status:

sirishg@sirishg-vm:/u02/app/kibana-4.3.1-linux-x86/bin$ ./kibana
  log   [18:18:36.697] [info][status][plugin:kibana] Status changed from uninitialized to green - Ready
  log   [18:18:36.786] [info][status][plugin:elasticsearch] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [18:18:36.852] [info][status][plugin:kbn_vislib_vis_types] Status changed from uninitialized to green - Ready
  log   [18:18:36.875] [info][status][plugin:markdown_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.883] [info][status][plugin:metric_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.907] [info][status][plugin:spyModes] Status changed from uninitialized to green - Ready
  log   [18:18:36.936] [info][status][plugin:statusPage] Status changed from uninitialized to green - Ready
  log   [18:18:36.950] [info][status][plugin:table_vis] Status changed from uninitialized to green - Ready
  log   [18:18:37.078] [info][listening] Server running at http://0.0.0.0:5601
  log   [18:18:37.446] [info][status][plugin:elasticsearch] Status changed from yellow to green - Kibana index ready

Kibana UI errors:

Error: Please specify a default index pattern
KbnError@http://localhost:5601/bundles/commons.bundle.js:58172:21
NoDefaultIndexPattern@http://localhost:5601/bundles/commons.bundle.js:58325:6
loadDefaultIndexPattern/<@http://localhost:5601/bundles/kibana.bundle.js:97911:1
processQueue@http://localhost:5601/bundles/commons.bundle.js:42358:29
scheduleProcessQueue/<@http://localhost:5601/bundles/commons.bundle.js:42374:28
$RootScopeProvider/this.$get</Scope.prototype.$eval@http://localhost:5601/bundles/commons.bundle.js:43602:17
$RootScopeProvider/this.$get</Scope.prototype.$digest@http://localhost:5601/bundles/commons.bundle.js:43413:16
$RootScopeProvider/this.$get</Scope.prototype.$apply@http://localhost:5601/bundles/commons.bundle.js:43710:14
$LocationProvider/this.$get</<@http://localhost:5601/bundles/commons.bundle.js:39839:14
jQuery.event.dispatch@http://localhost:5601/bundles/commons.bundle.js:22720:16
jQuery.event.add/elemData.handle@http://localhost:5601/bundles/commons.bundle.js:22407:7

Logstash debug logs:

  {:timestamp=>"2016-01-17t11:07:06.287000-0500", :message=>"reading config file", :config_file=>"/u02/app/logstash-2.1.1/first-pipeline.conf", :level=>:debug, :file=>"logstash/agent.rb", :line=>"325", :method=>"local_config"} {:timestamp=>"2016-01-17t11:07:06.420000-0500", :message=>"compiled pipeline code:\n        @inputs = []\n        @filters = []\n        @outputs = []\n        @periodic_flushers = []\n        @shutdown_flushers = []\n\n          @input_file_1 = plugin(\"input\", \"file\", logstash::util.hash_merge_many({ \"path\" => (\"/u02/app/logstash-tutorial-dataset.log\") }, { \"start_position\" => (\"beginning\") }))\n\n          @inputs << @input_file_1\n\n          @filter_grok_2 = plugin(\"filter\", \"grok\", logstash::util.hash_merge_many({ \"match\" => {(\"message\") => (\"%{combinedapachelog}\")} }))\n\n          @filters << @filter_grok_2\n\n            @filter_grok_2_flush = lambda |options, &block|\n              @logger.debug? && @logger.debug(\"flushing\", :plugin => @filter_grok_2)\n\n              events = @filter_grok_2.flush(options)\n\n              return if events.nil? || events.empty?\n\n              @logger.debug? && @logger.debug(\"flushing\", :plugin => @filter_grok_2, :events => events)\n\n                          events = @filter_geoip_3.multi_filter(events)\n  \n\n\n              events.each{|e| block.call(e)}\n            end\n\n            if @filter_grok_2.respond_to?(:flush)\n              @periodic_flushers << @filter_grok_2_flush if @filter_grok_2.periodic_flush\n              @shutdown_flushers << @filter_grok_2_flush\n            end\n\n          @filter_geoip_3 = plugin(\"filter\", \"geoip\", logstash::util.hash_merge_many({ \"source\" => (\"clientip\") }))\n\n          @filters << @filter_geoip_3\n\n            @filter_geoip_3_flush = lambda |options, &block|\n              @logger.debug? && @logger.debug(\"flushing\", :plugin => @filter_geoip_3)\n\n              events = @filter_geoip_3.flush(options)\n\n              return if events.nil? || events.empty?\n\n              @logger.debug? && @logger.debug(\"flushing\", :plugin => @filter_geoip_3, :events => events)\n\n                \n\n              events.each{|e| block.call(e)}\n            end\n\n            if @filter_geoip_3.respond_to?(:flush)\n              @periodic_flushers << @filter_geoip_3_flush if @filter_geoip_3.periodic_flush\n              @shutdown_flushers << @filter_geoip_3_flush\n            end\n\n          @output_elasticsearch_4 = plugin(\"output\", \"elasticsearch\", logstash::util.hash_merge_many({ \"hosts\" => [(\"localhost:9200\")] }))\n\n          @outputs << @output_elasticsearch_4\n\n          @output_stdout_5 = plugin(\"output\", \"stdout\", logstash::util.hash_merge_many({ \"codec\" => (\"rubydebug\") }))\n\n          @outputs << @output_stdout_5\n\n  def filter_func(event)\n    events = [event]\n    @logger.debug? && @logger.debug(\"filter received\", :event => event.to_hash)\n              events = @filter_grok_2.multi_filter(events)\n              events = @filter_geoip_3.multi_filter(events)\n    \n    events\n  end\n  def output_func(event)\n    @logger.debug? 
&& @logger.debug(\"output received\", :event => event.to_hash)\n    @output_elasticsearch_4.handle(event)\n    @output_stdout_5.handle(event)\n    \n  end", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"38", :method=>"initialize"} {:timestamp=>"2016-01-17t11:07:06.426000-0500", :message=>"plugin not defined in namespace, checking plugin file", :type=>"input", :name=>"file", :path=>"logstash/inputs/file", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"} {:timestamp=>"2016-01-17t11:07:06.451000-0500", :message=>"plugin not defined in namespace, checking plugin file", :type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"} {:timestamp=>"2016-01-17t11:07:06.465000-0500", :message=>"config logstash::codecs::plain/@charset = \"utf-8\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.468000-0500", :message=>"config logstash::inputs::file/@path = [\"/u02/app/logstash-tutorial-dataset.log\"]", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.469000-0500", :message=>"config logstash::inputs::file/@start_position = \"beginning\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.472000-0500", :message=>"config logstash::inputs::file/@codec = <logstash::codecs::plain charset=>\"utf-8\">", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.480000-0500", :message=>"config logstash::inputs::file/@add_field = {}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.481000-0500", :message=>"config logstash::inputs::file/@stat_interval = 1", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.492000-0500", :message=>"config logstash::inputs::file/@discover_interval = 15", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.493000-0500", :message=>"config logstash::inputs::file/@sincedb_write_interval = 15", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.496000-0500", :message=>"config logstash::inputs::file/@delimiter = \"\\n\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.498000-0500", :message=>"plugin not defined in namespace, checking plugin file", :type=>"filter", :name=>"grok", :path=>"logstash/filters/grok", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"} {:timestamp=>"2016-01-17t11:07:06.515000-0500", :message=>"config logstash::filters::grok/@match = {\"message\"=>\"%{combinedapachelog}\"}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.524000-0500", :message=>"config logstash::filters::grok/@add_tag = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.532000-0500", :message=>"config logstash::filters::grok/@remove_tag = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} 
{:timestamp=>"2016-01-17t11:07:06.535000-0500", :message=>"config logstash::filters::grok/@add_field = {}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} {:timestamp=>"2016-01-17t11:07:06.536000-0500", :message=>"config logstash::filters::grok/@remove_field = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"} 

Elasticsearch recent logs:

    sirishg@sirishg-vm:/u02/app/elasticsearch-2.1.1/bin$ ./elasticsearch
    [2016-01-17 11:00:23,467][INFO ][node                     ] [node-1] version[2.1.1], pid[3418], build[40e2c53/2015-12-15T13:05:55Z]
    [2016-01-17 11:00:23,470][INFO ][node                     ] [node-1] initializing ...
    [2016-01-17 11:00:23,698][INFO ][plugins                  ] [node-1] loaded [], sites []
    [2016-01-17 11:00:23,853][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
    [2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] initialized
    [2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] starting ...
    [2016-01-17 11:00:27,605][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
    [2016-01-17 11:00:27,616][INFO ][discovery                ] [node-1] my-application/rd4s1zodqxoj3_g-n22nnq
    [2016-01-17 11:00:31,121][INFO ][cluster.service          ] [node-1] new_master {node-1}{rd4s1zodqxoj3_g-n22nnq}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
    [2016-01-17 11:00:31,259][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
    [2016-01-17 11:00:31,260][INFO ][node                     ] [node-1] started
    [2016-01-17 11:00:31,830][INFO ][gateway                  ] [node-1] recovered [2] indices into cluster_state

Have you been able to make it work? A few comments:

1) The fact that you have Kibana running on "0.0.0.0" may be a sign of something going wrong; check its configuration and connectivity to Elasticsearch (see the kibana.yml sketch after this list).

2) Which index are you putting the information into? logstash-*? (See the index-listing example after this list.)

3) If all else fails, update to the current versions: 2.3.* for Elasticsearch and 4.4.* for Kibana.

4) In order to have Logstash capture the file and read it (and therefore send data to Elasticsearch), you should write to the file again (i.e. change the file creation/modification timestamps). This part does not come easy: Logstash (the file input) keeps a pointer to the last line it read from the file, or something along those lines (see the sincedb sketch after this list).
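
A few sketches to go with the comments above; these are my additions under stated assumptions, not part of the original question or answer.

For 1), the relevant Kibana 4.x settings live in config/kibana.yml. A minimal sketch of the settings being discussed:

    # config/kibana.yml (Kibana 4.3) - sketch of the relevant settings
    server.host: "0.0.0.0"                      # listen address for the Kibana web UI
    elasticsearch.url: "http://localhost:9200"  # where Kibana reaches Elasticsearch
    kibana.index: ".kibana"                     # index Kibana stores its own settings in

For 2), Elasticsearch can be asked directly which indices exist. With its default settings, the Logstash elasticsearch output writes to daily logstash-YYYY.MM.DD indices, so something matching logstash-* should appear once events are flowing:

    curl 'http://localhost:9200/_cat/indices?v'
    curl 'http://localhost:9200/logstash-*/_count?pretty'

For 4), the file input records how far it has read in a sincedb file, so re-running Logstash over an unchanged file produces no new events. When testing against a static sample file, a common workaround is to point sincedb_path at /dev/null so the read position is never persisted:

    input {
        file {
            path => "/u02/app/logstash-tutorial-dataset.log"
            start_position => beginning
            sincedb_path => "/dev/null"   # for testing only: forget read positions between runs
        }
    }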

You may have got this working by now, so maybe I am just blowing in the wind; on the other hand, maybe this can help someone.

