
kube-elk-filebeat's People

Contributors

komljen


kube-elk-filebeat's Issues

Elasticsearch can't mount external volume

I created an Elasticsearch deployment with a hostPath volume like this:

volumeMounts:
  - mountPath: /data
    name: storage
volumes:
  - name: storage
    hostPath:
      path: /var/lib/elasticsearch

But on the host, there is nothing in /var/lib/elasticsearch/.
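Two things worth checking here (assumptions on my part, not confirmed by the report): a hostPath directory is created on whichever node the pod is actually scheduled to, so the data may exist on a different host than the one being inspected; and Elasticsearch only writes to the mount if its configured path.data points at the mountPath (in the official images the default data directory is /usr/share/elasticsearch/data, not /data). A hedged sketch, mounting the image's default data path and asking kubelet to create the host directory:

```yaml
# Hypothetical fragment, not the repo's manifest. path.data differs between
# images; /usr/share/elasticsearch/data is the official images' default.
volumeMounts:
  - mountPath: /usr/share/elasticsearch/data
    name: storage
volumes:
  - name: storage
    hostPath:
      path: /var/lib/elasticsearch
      type: DirectoryOrCreate   # create the host dir if missing; requires Kubernetes 1.8+
```

If the image really does use /data, the simpler check is whether the pod landed on the node you are inspecting.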

Not able to create an index in Kibana

First of all, I want to say thanks for the nice blog post (https://crondev.com/elk-stack-kubernetes/).

I am using Kubernetes 1.6.4 with RBAC.

I am using your ELK and Filebeat manifests.

Everything is running fine, but I am not able to create an index in Kibana.

Logs from Elasticsearch:

[2017-06-08T12:07:41,753][INFO ][o.e.c.m.MetaDataCreateIndexService] [08gKlRV] [filebeat-2017.06.08] creating index, cause [auto(bulk api)], templates [], shards [5]/[1], mappings []
[2017-06-08T12:07:42,412][INFO ][o.e.c.m.MetaDataMappingService] [08gKlRV] [filebeat-2017.06.08/Azlfbk47TfyOaSLgudBmGg] create_mapping [kube-logs]
[2017-06-08T12:08:51,098][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [buildNum]
[2017-06-08T12:08:51,100][INFO ][o.e.c.m.MetaDataCreateIndexService] [08gKlRV] [.kibana] creating index, cause [api], templates [], shards [1]/[1], mappings [server, config]
[2017-06-08T12:11:54,986][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [title]
[2017-06-08T12:11:54,987][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [timeFieldName]
[2017-06-08T12:11:54,989][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [intervalName]
[2017-06-08T12:11:54,989][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [fields]
[2017-06-08T12:11:54,989][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sourceFilters]
[2017-06-08T12:11:54,989][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [fieldFormatMap]
[2017-06-08T12:11:54,991][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [title]
[2017-06-08T12:11:54,991][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [timeFieldName]
[2017-06-08T12:11:54,991][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [intervalName]
[2017-06-08T12:11:54,992][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [fields]
[2017-06-08T12:11:54,992][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sourceFilters]
[2017-06-08T12:11:54,992][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [fieldFormatMap]
[2017-06-08T12:11:54,993][INFO ][o.e.c.m.MetaDataMappingService] [08gKlRV] [.kibana/2c9YzwJfRqm4d2XAVr_0jA] create_mapping [index-pattern]

Logs from Kibana:

{"type":"response","@timestamp":"2017-06-08T12:27:14Z","tags":[],"pid":6,"method":"get","statusCode":200,"req":{"url":"/bundles/4b5a84aaf1c9485e060c503a0ff8cadb.woff2","method":"get","headers":{"host":"logging.cloudapps.cloud-cafe.in","x-real-ip":"10.18.5.55","connection":"close","x-forwarded-for":"10.18.5.55, 10.18.5.55","x-forwarded-host":"logging.cloudapps.cloud-cafe.in","x-forwarded-port":"80","x-forwarded-proto":"http","x-original-uri":"/bundles/4b5a84aaf1c9485e060c503a0ff8cadb.woff2","x-scheme":"http","user-agent":"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36","origin":"http://logging.cloudapps.cloud-cafe.in","accept":"*/*","referer":"http://logging.cloudapps.cloud-cafe.in/bundles/commons.style.css?v=14849","accept-encoding":"gzip, deflate, sdch","accept-language":"en-US,en;q=0.8,bn;q=0.6","cache-control":"max-stale=0","x-bluecoat-via":"62c9536659579913"},"remoteAddress":"10.244.1.1","userAgent":"10.244.1.1","referer":"http://logging.cloudapps.cloud-cafe.in/bundles/commons.style.css?v=14849"},"res":{"statusCode":200,"responseTime":1,"contentLength":9},"message":"GET /bundles/4b5a84aaf1c9485e060c503a0ff8cadb.woff2 200 1ms - 9.0B"}
{"type":"response","@timestamp":"2017-06-08T12:27:14Z","tags":[],"pid":6,"method":"get","statusCode":404,"req":{"url":"/elasticsearch/logstash-*/_mapping/field/*?_=1496924834408&ignore_unavailable=false&allow_no_indices=false&include_defaults=true","method":"get","headers":{"host":"logging.cloudapps.cloud-cafe.in","x-real-ip":"10.18.5.55","connection":"close","x-forwarded-for":"10.18.5.55, 10.18.5.55","x-forwarded-host":"logging.cloudapps.cloud-cafe.in","x-forwarded-port":"80","x-forwarded-proto":"http","x-original-uri":"/elasticsearch/logstash-*/_mapping/field/*?_=1496924834408&ignore_unavailable=false&allow_no_indices=false&include_defaults=true","x-scheme":"http","accept":"application/json, text/plain, */*","kbn-version":"5.3.2","user-agent":"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36","referer":"http://logging.cloudapps.cloud-cafe.in/app/kibana","accept-encoding":"gzip, deflate, sdch","accept-language":"en-US,en;q=0.8,bn;q=0.6","cache-control":"max-stale=0","x-bluecoat-via":"62c9536659579913"},"remoteAddress":"10.244.1.1","userAgent":"10.244.1.1","referer":"http://logging.cloudapps.cloud-cafe.in/app/kibana"},"res":{"statusCode":404,"responseTime":4,"contentLength":9},"message":"GET /elasticsearch/logstash-*/_mapping/field/*?_=1496924834408&ignore_unavailable=false&allow_no_indices=false&include_defaults=true 404 4ms - 9.0B"}
{"type":"response","@timestamp":"2017-06-08T12:28:54Z","tags":[],"pid":6,"method":"get","statusCode":200,"req":{"url":"/elasticsearch/filebeat-*/_mapping/field/*?_=1496924933416&ignore_unavailable=false&allow_no_indices=false&include_defaults=true","method":"get","headers":{"host":"logging.cloudapps.cloud-cafe.in","x-real-ip":"10.18.5.55","connection":"close","x-forwarded-for":"10.18.5.55, 10.18.5.55","x-forwarded-host":"logging.cloudapps.cloud-cafe.in","x-forwarded-port":"80","x-forwarded-proto":"http","x-original-uri":"/elasticsearch/filebeat-*/_mapping/field/*?_=1496924933416&ignore_unavailable=false&allow_no_indices=false&include_defaults=true","x-scheme":"http","accept":"application/json, text/plain, */*","kbn-version":"5.3.2","user-agent":"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36","referer":"http://logging.cloudapps.cloud-cafe.in/app/kibana","accept-encoding":"gzip, deflate, sdch","accept-language":"en-US,en;q=0.8,bn;q=0.6","cache-control":"max-stale=0","x-bluecoat-via":"62c9536659579913"},"remoteAddress":"10.244.1.1","userAgent":"10.244.1.1","referer":"http://logging.cloudapps.cloud-cafe.in/app/kibana"},"res":{"statusCode":200,"responseTime":10,"contentLength":9},"message":"GET /elasticsearch/filebeat-*/_mapping/field/*?_=1496924933416&ignore_unavailable=false&allow_no_indices=false&include_defaults=true 200 10ms - 9.0B"}
{"type":"response","@timestamp":"2017-06-08T12:28:54Z","tags":[],"pid":6,"method":"get","statusCode":200,"req":{"url":"/api/kibana/filebeat-*/field_capabilities","method":"get","headers":{"host":"logging.cloudapps.cloud-cafe.in","x-real-ip":"10.18.5.55","connection":"close","x-forwarded-for":"10.18.5.55, 10.18.5.55","x-forwarded-host":"logging.cloudapps.cloud-cafe.in","x-forwarded-port":"80","x-forwarded-proto":"http","x-original-uri":"/api/kibana/filebeat-*/field_capabilities","x-scheme":"http","accept":"application/json, text/plain, */*","kbn-version":"5.3.2","user-agent":"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36","referer":"http://logging.cloudapps.cloud-cafe.in/app/kibana","accept-encoding":"gzip, deflate, sdch","accept-language":"en-US,en;q=0.8,bn;q=0.6","cache-control":"max-stale=0","x-bluecoat-via":"62c9536659579913"},"remoteAddress":"10.244.1.1","userAgent":"10.244.1.1","referer":"http://logging.cloudapps.cloud-cafe.in/app/kibana"},"res":{"statusCode":200,"responseTime":14,"contentLength":9},"message":"GET /api/kibana/filebeat-*/field_capabilities 200 14ms - 9.0B"}

Logs from Logstash:

Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2017-06-08T12:07:32,586][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2017-06-08T12:07:32,632][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"26821dff-3864-4ef6-a5ef-797a2ea1f4bd", :path=>"/usr/share/logstash/data/uuid"}
[2017-06-08T12:07:33,734][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2017-06-08T12:07:33,739][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
log4j:WARN No appenders could be found for logger (org.apache.http.client.protocol.RequestAuthCache).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[2017-06-08T12:07:34,161][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x7638001c URL:http://elasticsearch:9200/>}
[2017-06-08T12:07:34,163][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-06-08T12:07:34,374][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-06-08T12:07:34,402][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2017-06-08T12:07:34,931][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x7c989f50 URL://elasticsearch:9200>]}
[2017-06-08T12:07:35,110][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-06-08T12:07:36,355][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5000"}
[2017-06-08T12:07:36,421][INFO ][logstash.pipeline        ] Pipeline main started
[2017-06-08T12:07:36,579][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Logs from filebeat:

2017/06/08 12:36:05.983084 metrics.go:39: INFO Non-zero metrics in the last 30s: filebeat.harvester.closed=1 filebeat.harvester.open_files=-1 filebeat.harvester.running=-1 libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=477 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=2 registrar.states.update=2 registrar.writes=2
2017/06/08 12:36:10.192418 log.go:116: INFO File is inactive: /var/log/containers/nginx-ingress-lb-v3qch_kube-system_nginx-ingress-lb-d3bda964189c4323bac4646ae41cf57e3b042caa7dd01a7229c7bd85a74a1258.log. Closing because close_inactive of 5m0s reached.
2017/06/08 12:36:35.983058 metrics.go:39: INFO Non-zero metrics in the last 30s: filebeat.harvester.closed=1 filebeat.harvester.open_files=-1 filebeat.harvester.running=-1 libbeat.logstash.call_count.PublishEvents=3 libbeat.logstash.publish.read_bytes=18 libbeat.logstash.publish.write_bytes=1471 libbeat.logstash.published_and_acked_events=3 libbeat.publisher.published_events=3 publish.events=4 registrar.states.update=4 registrar.writes=3
2017/06/08 12:37:05.983047 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=514 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:37:35.983062 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=3 libbeat.logstash.publish.read_bytes=18 libbeat.logstash.publish.write_bytes=2127 libbeat.logstash.published_and_acked_events=14 libbeat.publisher.published_events=14 publish.events=14 registrar.states.update=14 registrar.writes=3
2017/06/08 12:38:05.983105 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=491 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:38:35.983159 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=474 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:39:05.983074 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=476 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:39:35.983056 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=481 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:40:05.982969 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=480 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:40:35.983054 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=480 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
2017/06/08 12:41:05.983044 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.write_bytes=481 libbeat.logstash.published_and_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1

Filebeat logs from the master host:

2017/06/08 12:34:16.674892 tcp.go:26: WARN DNS lookup failure "logstash": lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:51125->10.96.0.10:53: i/o timeout
2017/06/08 12:34:16.674915 single.go:140: ERR Connecting error publishing events (retrying): lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:51125->10.96.0.10:53: i/o timeout
2017/06/08 12:34:35.977411 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:35:06.073742 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:35:35.977417 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:36:05.977435 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:36:06.774576 tcp.go:26: WARN DNS lookup failure "logstash": lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:46842->10.96.0.10:53: i/o timeout
2017/06/08 12:36:06.774599 single.go:140: ERR Connecting error publishing events (retrying): lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:46842->10.96.0.10:53: i/o timeout
2017/06/08 12:36:35.977407 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:37:05.977406 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:37:35.977442 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:37:56.875027 tcp.go:26: WARN DNS lookup failure "logstash": lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:55181->10.96.0.10:53: i/o timeout
2017/06/08 12:37:56.875146 single.go:140: ERR Connecting error publishing events (retrying): lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:55181->10.96.0.10:53: i/o timeout
2017/06/08 12:38:05.977408 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:38:35.978374 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:39:05.977406 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:39:35.977417 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:39:46.878325 tcp.go:26: WARN DNS lookup failure "logstash": lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:52687->10.96.0.10:53: i/o timeout
2017/06/08 12:39:46.878352 single.go:140: ERR Connecting error publishing events (retrying): lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:52687->10.96.0.10:53: i/o timeout
2017/06/08 12:40:05.977415 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:40:35.977412 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:41:05.980660 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:41:35.977419 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/06/08 12:41:36.977157 tcp.go:26: WARN DNS lookup failure "logstash": lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:45589->10.96.0.10:53: i/o timeout
2017/06/08 12:41:36.977280 single.go:140: ERR Connecting error publishing events (retrying): lookup logstash on 10.96.0.10:53: read udp 10.244.0.9:45589->10.96.0.10:53: i/o timeout
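The repeated DNS failures above show the Filebeat pod on the master cannot resolve the "logstash" service through kube-dns (10.96.0.10). One possible cause, offered only as an assumption since the DaemonSet spec is not shown: if the Filebeat pods run with hostNetwork enabled, they use the node's resolver by default and cannot see in-cluster service names unless the DNS policy is overridden. A minimal sketch of the relevant pod-spec fields:

```yaml
# Hypothetical pod-spec fragment for a host-networked DaemonSet.
# ClusterFirstWithHostNet (Kubernetes 1.6+) routes lookups through kube-dns
# even though the pod shares the host's network namespace.
spec:
  hostNetwork: true
  dnsPolicy: ClusterFirstWithHostNet
```

If the DaemonSet does not use hostNetwork, the same symptom can instead point at the overlay network or kube-proxy being broken on the master node specifically.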
