Introduction


pfSense/OPNsense + Elastic Stack

pfelk dashboard

Prerequisites

  • Ubuntu Server v20.04+ or Debian Server 11+ (stretch and buster tested)
  • pfSense v2.5.0+ or OPNsense 23.0+
  • Minimum of 8 GB of RAM (Docker requires more); 32 GB recommended (Wiki Reference)
  • Remote logging set up on the firewall (Wiki Reference)

pfelk is a highly customizable open-source tool for ingesting and visualizing your firewall traffic with the full power of Elasticsearch, Logstash, and Kibana.

Key features:

  • ingest and enrich your pfSense/OPNsense firewall traffic logs by leveraging Logstash

  • search your indexed data in near-real-time with the full power of Elasticsearch

  • visualize your network traffic with interactive dashboards, maps, and graphs in Kibana

Supported entries include:

  • pfSense/OPNsense setups
  • TCP/UDP/ICMP protocols
  • KEA-DHCP (v4/v6) message types with dashboard - in development
  • DHCP (v4/v6) message types with dashboard - deprecated
  • IPv4/IPv6 mapping
  • pfSense CARP data
  • OpenVPN log parsing
  • Unbound DNS Resolver with dashboard and Kibana SIEM compliance
  • Suricata IDS with dashboard and Kibana SIEM compliance
  • Snort IDS with dashboard and Kibana SIEM compliance
  • Squid with dashboard and Kibana SIEM compliance
  • HAProxy with dashboard
  • Captive Portal with dashboard
  • NGINX with dashboard

pfelk aims to replace the vanilla pfSense/OPNsense web UI with extended search and visualization features. You can deploy this solution via ansible-playbook, docker-compose, a bash script, or manually.

pfelk overview


Quick start

Installation

docker-compose

Script installation method

  • Download the installer script from the pfelk repository:
    $ wget https://raw.githubusercontent.com/pfelk/pfelk/main/etc/pfelk/scripts/pfelk-installer.sh
  • Make the script executable:
    $ chmod +x pfelk-installer.sh
  • Run the installer script:
    $ sudo ./pfelk-installer.sh
  • Configure Security here
  • Templates here
  • Finish Configuring here
  • YouTube Guide

Manual installation method

Roadmap

This is the experimental public roadmap for the pfelk project.

See the roadmap »

Comparison to similar solutions

Comparisons »

Contributing

Please refer to the CONTRIBUTING file. Collectively we can enhance and improve this product. Issues, feature requests, PRs, and documentation contributions are encouraged and welcome!

License

This project is licensed under the terms of the Apache 2.0 open source license. Please refer to LICENSE for the full terms.

Contributors

fktkrt, ztroop

Issues

Unable to install using Ansible

I followed the guide and attempted to run the Ansible install, but I am presented with the error below.

TASK [elasticsearch : Debian - Add elasticsearch repository] ******************************************************************************************************************************************************************************************************************
failed: [localhost] (item={'repo': 'deb https://artifacts.elastic.co/packages/7.x/apt stable main', 'state': 'present'}) => {"ansible_loop_var": "item", "changed": false, "item": {"repo": "deb https://artifacts.elastic.co/packages/7.x/apt stable main", "state": "present"}, "msg": "apt cache update failed"}

PLAY RECAP ********************************************************************************************************************************************************************************************************************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0
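An `apt cache update failed` at this task usually means the Elastic signing key or the apt HTTPS transport is missing on the target host. A hedged recovery sketch follows: the repo line is taken from the task output above, while the package names and key URL are the standard Elastic 7.x ones, not something this playbook guarantees, and `valid_deb_line` is a hypothetical helper, not part of pfelk.

```shell
#!/bin/sh
# Manual recovery steps (run as root on the target host):
#
#   apt-get install -y apt-transport-https gnupg
#   wget -qO- https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add -
#   echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" \
#     > /etc/apt/sources.list.d/elastic-7.x.list
#   apt-get update
#
# A malformed repo line also makes `apt update` fail, so sanity-check the
# format, which must be "deb <url> <suite> <component...>":
valid_deb_line() {
  echo "$1" | grep -qE '^deb(-src)? +https?://[^ ]+ +[^ ]+ +[^ ]+'
}

valid_deb_line "deb https://artifacts.elastic.co/packages/7.x/apt stable main" \
  && echo "repo line OK"
```

If the repo line checks out, the failure is almost always the missing GPG key or missing `apt-transport-https` package addressed by the commented commands.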

Travis-CI Fails After Updates (configuration files/main.yml)

Describe the bug
Travis-CI is failing. I updated the configuration files and amended the logstash main.yml file with updated paths, but Travis-CI continues to fail. I made updates to main.yml and reverted to the previously working version, but still received Travis-CI failures. I will peel back and continue to troubleshoot later next week as time permits.

To Reproduce
Steps to reproduce the behavior:
Revert to the previously working main.yml and/or amend it with updates; Travis-CI fails.

Unable to install using Ansible playbook - GeoIP fails

The directions are a bit unclear about the fact that the Ansible install will fail at these steps:

  • install maxmind
  • download GeoIP databases
  • setup a cron job for automated updates

Assuming that I need to let it fail, then get on the target host and edit the /etc/GeoIP.conf file?

Once I do that and re-run the Ansible playbook, I then get this:

fatal: [pfelk.yourcompany.org]: FAILED! => {
"changed": true,
"cmd": "geoipupdate -d /usr/share/GeoIP/",
"delta": "0:00:00.006563",
"end": "2020-02-25 02:02:09.074367",
"invocation": {
"module_args": {
"_raw_params": "geoipupdate -d /usr/share/GeoIP/",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 1,
"start": "2020-02-25 02:02:09.067804",
"stderr": "error loading configuration file: 'EditionIDs' is in the config multiple times",
"stderr_lines": [
"error loading configuration file: 'EditionIDs' is in the config multiple times"
],
"stdout": "",
"stdout_lines": []
}
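The stderr above is geoipupdate rejecting a /etc/GeoIP.conf that defines `EditionIDs` more than once; the fix is a single `EditionIDs` line naming every edition. A sketch, where the account values are placeholders and `dup_keys` is a hypothetical helper (not part of pfelk):

```shell
#!/bin/sh
# A valid GeoIP.conf defines each key exactly once, e.g.:
#
#   AccountID 123456
#   LicenseKey your_license_key
#   EditionIDs GeoLite2-City GeoLite2-ASN
#
# Helper: print any config key that appears more than once.
dup_keys() {
  awk '!/^[[:space:]]*#/ && NF { n[$1]++ } END { for (k in n) if (n[k] > 1) print k }' "$1"
}

# Demo on a deliberately broken config:
cfg=$(mktemp)
printf 'AccountID 123456\nEditionIDs GeoLite2-City\nEditionIDs GeoLite2-ASN\n' > "$cfg"
dup_keys "$cfg"    # prints: EditionIDs
rm -f "$cfg"
```

After collapsing the duplicates into one line, `geoipupdate -d /usr/share/GeoIP/` should run cleanly and the playbook step can be retried.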

Logstash unable to start after installation. Issue with grok file?

Followed your ansible install and everything is working except for logstash. I'm getting this error:

ERROR] 2020-03-13 20:55:28.125 [[main]-pipeline-manager] javapipeline - Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<RuntimeError: Grok pattern file does not exist: /etc/logstash/conf.d/patterns/pf-12.2019.grok>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.2.0/lib/logstash/filters/grok.rb:445:in `block in add_patterns_from_files'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.2.0/lib/logstash/filters/grok.rb:443:in `add_patterns_from_files'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.2.0/lib/logstash/filters/grok.rb:282:in `block in register'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.2.0/lib/logstash/filters/grok.rb:278:in `block in register'", "org/jruby/RubyHash.java:1428:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.2.0/lib/logstash/filters/grok.rb:273:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:200:in `block in register_plugins'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:199:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:502:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:212:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:154:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:109:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/01-inputs.conf", "/etc/logstash/conf.d/11-firewall.conf", "/etc/logstash/conf.d/12-suricata.conf", "/etc/logstash/conf.d/13-snort.conf", "/etc/logstash/conf.d/15-others.conf", 
"/etc/logstash/conf.d/20-geoip.conf", "/etc/logstash/conf.d/50-outputs.conf"], :thread=>"#<Thread:0x4ec02327 run>"}
[ERROR] 2020-03-13 20:55:28.146 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
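The RuntimeError above means the configs reference a grok pattern file that was never deployed. One quick way to list every absolute .grok path mentioned in the Logstash configs and flag the missing ones — `missing_grok_files` is a sketch helper, not part of pfelk:

```shell
#!/bin/sh
# Print every missing .grok file referenced by the *.conf files under a
# Logstash config directory.
missing_grok_files() {
  grep -rhoE '/[^ "]+\.grok' "$1"/*.conf 2>/dev/null | sort -u |
  while read -r p; do
    [ -e "$p" ] || echo "missing: $p"
  done
}

# Usage on the default layout:
#   missing_grok_files /etc/logstash/conf.d
# Fix any "missing:" line by copying the pattern file from the pfelk
# repository (or re-running the playbook) before restarting Logstash.
```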

no syslogs from pfSense; Couldn't find any Elasticsearch data

Describe the bug
Sounds similar to issue pfelk/pfelk#118

To Reproduce
Steps to reproduce the behavior:

  1. install Ubuntu 18
  2. install pfElk via ansible
  3. access Kibana frontend
  4. setup pfSense to forward logs per Git doc
  5. Kibana frontend complains "Couldn't find any Elasticsearch data"

Firewall System (please complete the following information):

  • pfSense 2.4.5-RELEASE

Operating System (please complete the following information):

NAME="Ubuntu"
VERSION="18.04.4 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.4 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic

Installation method (manual, ansible-playbook, docker, script):
ansible-playbook

Elasticsearch, Logstash, Kibana (please complete the following information):

  • Version of ELK components (dpkg -l [elasticsearch]|[logstash]|[kibana])
    I don't know; playbooks never prompted for version

Elasticsearch, Logstash, Kibana logs:

  • Elasticsearch logs (tail -f /var/log/elasticsearch/[your-elk-cluster-name].log)
	at org.elasticsearch.action.support.ChannelActionListener.onFailure(ChannelActionListener.java:56) [elasticsearch-7.7.1.jar:7.7.1]
	at org.elasticsearch.search.SearchService.lambda$runAsync$0(SearchService.java:413) [elasticsearch-7.7.1.jar:7.7.1]
	at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:44) [elasticsearch-7.7.1.jar:7.7.1]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:692) [elasticsearch-7.7.1.jar:7.7.1]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.7.1.jar:7.7.1]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
	at java.lang.Thread.run(Thread.java:832) [?:?]
[2020-06-04T23:36:51,119][INFO ][o.e.c.r.a.AllocationService] [node-1] Cluster health status changed from [RED] to [GREEN] (reason: [shards started [[.kibana_task_manager_1][0]]]).
[2020-06-04T23:55:09,287][INFO ][o.e.c.m.MetaDataIndexTemplateService] [node-1] adding template [.management-beats] for index patterns [.management-beats]
  • Logstash logs (tail -f /var/log/logstash/logstash-plain.log)
      # This setting must be a path
      # File does not exist or cannot be opened /etc/logstash/conf.d/template/pf-geoip-template.json
      template => "/etc/logstash/conf.d/template/pf-geoip-template.json"
      ...
    }
  }
[2020-06-05T00:14:35,247][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:119)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:80)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1169)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1156)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:43)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:82)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:939)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:82)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:342)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)", "org.jruby.runtime.Block.call(Block.java:139)", "org.jruby.RubyProc.call(RubyProc.java:318)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:748)"]}
[2020-06-05T00:14:35,278][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::IllegalStateException` for `PipelineAction::Create<main>`", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:129:in `create'", "org/logstash/execution/ConvergeResultExt.java:57:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:355:in `block in converge_state'"]}
[2020-06-05T00:14:35,389][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::IllegalStateException` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:129:in `create'", "org/logstash/execution/ConvergeResultExt.java:57:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:355:in `block in converge_state'"]}
[2020-06-05T00:14:35,451][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
[2020-06-05T00:15:16,578][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.7.1"}

Additional context
The logstash port is not up and listening:
netstat --listen | grep 5140

Attach the pfELK Error Log (error.pfelk) for Better Assistance
"error.pfelk" doesn't exist on filesystem

sudo find / -type f -name error.pfelk
justin@pfelk1:/$
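The listener check from the Additional context above can also be done with ss; an empty result means Logstash never bound the syslog input, so the pipeline errors need fixing before troubleshooting pfSense's remote logging. `port_in_ss_output` below is a sketch helper for filtering the listing, not part of pfelk:

```shell
#!/bin/sh
# On the pfelk host, either of these shows the UDP 5140 listener:
#
#   ss -lun | grep 5140
#   netstat -uln | grep 5140
#
# Helper: scan `ss -lun`-style text on stdin for a ":<port> " column.
port_in_ss_output() {
  grep -q ":$1 "
}

printf 'UNCONN 0 0 0.0.0.0:5140 0.0.0.0:*\n' | port_in_ss_output 5140 \
  && echo "udp/5140 is listening"
```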

File doesn't exist during Logstash deploy

Ansible complains that it can't deploy 05-firewall.conf. Looking at the source, it doesn't actually exist there, or in your repo.

It exists in the v6.1 tag, but not in master.

Permission errors

After running the ansible script, logstash does not start because the user "logstash" has no execute permission on the folder /etc/pfelk/conf.d/ and therefore cannot read the *.conf files in there.
Manually setting the permissions to 755 lets logstash start successfully.
The same applies to the databases and patterns folders.

[2021-10-05T20:36:09,306][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-10-05T20:36:09,346][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.14.0", "jruby.version"=>"jruby 9.2.19.0 (2.5.8) 2021-06-15 55810c552b OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-10-05T20:36:15,375][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"/etc/pfelk/conf.d/*.conf"}
[2021-10-05T20:36:15,435][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2021-10-05T20:36:16,029][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-10-05T20:36:21,003][INFO ][logstash.runner ] Logstash shut down.
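A fix sketch for the above: the chmod targets come straight from the report, while `world_traversable` is a hypothetical helper for checking the execute (traverse) bit, not part of pfelk:

```shell
#!/bin/sh
# As root, restore traversal/read permissions so the logstash user can reach
# the config files:
#
#   chmod 755 /etc/pfelk /etc/pfelk/conf.d /etc/pfelk/conf.d/databases /etc/pfelk/conf.d/patterns
#
# Helper: does "others" have the execute (traverse) bit on a directory?
# (An odd last octal digit means the execute bit is set for others.)
world_traversable() {
  case $(stat -c %a "$1") in
    *[1357]) return 0 ;;
    *) return 1 ;;
  esac
}

d=$(mktemp -d)
chmod 750 "$d"
world_traversable "$d" || echo "not traversable by others"
chmod 755 "$d"
world_traversable "$d" && echo "traversable by others"
rmdir "$d"
```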

Logstash not tagging(?) things correctly

Describe the bug
Logs should be mutated into usable data that can be ingested and displayed on dashboards. This is not happening: the message is not translated into usable data.

To Reproduce
Steps to reproduce the behavior:
Install pfelk using this project, then look at the dashboards.

Screenshots
image

Platform Version informations (please complete the following information):

- printf "$(uname -srm)\n$(cat /etc/os-release)\n":  
Linux 5.3.13-1-pve x86_64
NAME="Ubuntu"
VERSION="18.04.4 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.4 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
 - ansible 2.9.6
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/dist-packages/ansible
  executable location = /usr/bin/ansible
  python version = 2.7.17 (default, Nov  7 2019, 10:07:09) [GCC 7.4.0]

Elastic, Logstash, Kibana:

  • Version of ELK (dpkg -l elasticsearch|logstash|kibana): 7.6.0

Relevant entries of ELK service logs/status

  • service elasticsearch|logstash|kibana status
  • tailing corresponding logs eg. /var/log/logstash-plain.log
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2020-03-16 18:31:57.669 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.6.1"}
[INFO ] 2020-03-16 18:32:01.309 [Converge PipelineAction::Create<main>] Reflections - Reflections took 42 ms to scan 1 urls, producing 20 keys and 40 values
[INFO ] 2020-03-16 18:32:02.516 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[WARN ] 2020-03-16 18:32:02.736 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ] 2020-03-16 18:32:02.939 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2020-03-16 18:32:02.942 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}   
[INFO ] 2020-03-16 18:32:03.010 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[INFO ] 2020-03-16 18:32:03.019 [[main]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/GeoIP/GeoLite2-City.mmdb"}
[INFO ] 2020-03-16 18:32:03.060 [[main]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/GeoIP/GeoLite2-ASN.mmdb"}
[INFO ] 2020-03-16 18:32:03.113 [Ruby-0-Thread-6: :1] elasticsearch - Using default mapping template 
[INFO ] 2020-03-16 18:32:03.193 [Ruby-0-Thread-6: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2020-03-16 18:32:03.506 [[main]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/GeoIP/GeoLite2-ASN.mmdb"}
[INFO ] 2020-03-16 18:32:03.587 [[main]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/GeoIP/GeoLite2-City.mmdb"}
[WARN ] 2020-03-16 18:32:03.686 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[INFO ] 2020-03-16 18:32:03.690 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/01-inputs.conf", "/etc/logstash/conf.d/11-firewall.conf", "/etc/logstash/conf.d/12-suricata.conf", "/etc/logstash/conf.d/13-snort.conf", "/etc/logstash/conf.d/15-others.conf", "/etc/logstash/conf.d/20-geoip.conf", "/etc/logstash/conf.d/50-outputs.conf"], :thread=>"#<Thread:0x297fca02 run>"}
[INFO ] 2020-03-16 18:32:08.349 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2020-03-16 18:32:08.461 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2020-03-16 18:32:08.485 [[main]<udp] udp - Starting UDP listener {:address=>"0.0.0.0:5140"}  
[INFO ] 2020-03-16 18:32:08.570 [[main]<udp] udp - UDP listener started {:address=>"0.0.0.0:5140", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
[INFO ] 2020-03-16 18:32:08.805 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
