
ansible-role-logstash's Issues

/opt/logstash expected but /usr/share/logstash used

Installed on ubuntu

fatal: [54.154.5.17]: FAILED! => {"changed": false, "failed": true, "module_stderr": "Shared connection to 54.154.5.17 closed.\r\n", "module_stdout": "Traceback (most recent call last):\r\n File "/tmp/ansible__6axEB/ansible_module_command.py", line 212, in \r\n main()\r\n File "/tmp/ansible__6axEB/ansible_module_command.py", line 155, in main\r\n os.chdir(chdir)\r\nOSError: [Errno 2] No such file or directory: '/opt/logstash'\r\n", "msg": "MODULE FAILURE"}

# dpkg -L logstash|grep 'bin/logstash-plugin$'
/usr/share/logstash/bin/logstash-plugin
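The failing task chdirs into a hard-coded /opt/logstash, while the deb (and rpm) packages now install to /usr/share/logstash. A minimal sketch of one possible fix, assuming the role parameterizes the path (the variable name logstash_dir is an assumption, not the role's actual API):

```yaml
# defaults/main.yml (hypothetical variable name)
logstash_dir: /usr/share/logstash

# tasks/plugins.yml (sketch: use the variable instead of /opt/logstash)
- name: Get list of installed plugins.
  command: ./bin/logstash-plugin list
  args:
    chdir: "{{ logstash_dir }}"
  register: logstash_plugins_list
  changed_when: false
```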

Update repositories URI

  • Allow choosing new repository URIs
  • Update the URI to https://artifacts.elastic.co/packages/oss-7.x/yum/repodata/repomd.xml

Set the apt repository url variable

Hello,
I am using your role to install Logstash in a private environment that doesn't have access to the internet, so I need to replace the external Elastic repository with our internal one.

Is it possible to set the repository URL as a variable, so we can point it at our private repository?

Thanks for your great work !
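A minimal sketch of what such an override could look like, assuming the role moved its repository definitions into defaults (the variable names are assumptions; the yum URL is the one mentioned above):

```yaml
# defaults/main.yml (hypothetical variable names)
logstash_apt_repo: "deb https://artifacts.elastic.co/packages/7.x/apt stable main"
logstash_yum_repo_url: "https://artifacts.elastic.co/packages/oss-7.x/yum"
```

A private mirror could then override either variable in group_vars without touching the role.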

Disable default config installation

The role installs at least these configuration files:

    - 01-beats-input.conf
    - 10-syslog.conf
    - 11-nginx.conf
    - 12-apache.conf
    - 14-solr.conf
    - 15-drupal.conf
    - 30-elasticsearch-output.conf

It should be configurable not to install these files.
The Logstash configuration itself should be handled by a separate, project-dependent role.
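A minimal sketch of how the role could gate this, assuming a boolean default and a copy task over the filter files (the variable and handler names are assumptions):

```yaml
# defaults/main.yml (hypothetical)
logstash_install_default_filters: true

# tasks/main.yml (hypothetical sketch)
- name: Copy default Logstash filter configuration.
  copy:
    src: "filters/{{ item }}"
    dest: /etc/logstash/conf.d/
  with_items:
    - 01-beats-input.conf
    - 10-syslog.conf
    - 30-elasticsearch-output.conf
  when: logstash_install_default_filters
  notify: restart logstash
```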

How would I add custom filters the Ansible way?

I use this role via librarian-ansible which pulls the role from Galaxy. That means that I have a librarian_roles/ dir at the top level of my Configuration Management Repo which is git-ignored. Thus, I can't modify anything in there.
How can I add custom filters so that they are used by this role?
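One common pattern is to leave the Galaxy-managed role untouched and ship project-specific filters from your own repository in post_tasks (the source path and the handler name here are assumptions):

```yaml
post_tasks:
  - name: Copy custom Logstash filters from the project repo.
    copy:
      src: "{{ item }}"
      dest: /etc/logstash/conf.d/
      mode: '0644'
    with_fileglob:
      - files/logstash/*.conf
    notify: restart logstash
```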

The conditional check 'logstash_ssl_key_file' failed

I am seeing a strange issue, I am not sure if it's me or the role and was hoping for some input.

I have the following in my group_vars:

logstash_ssl_dir: /etc/pki/logstash
logstash_ssl_certificate_file: domain.com.crt
logstash_ssl_key_file: domain.com.key

And I get the following from the playbook run:

TASK [geerlingguy.logstash : Ensure Logstash SSL key pair directory exists.] ******************************
task path: /home/anthony/repos/ansible/roles/geerlingguy.logstash/tasks/ssl.yml:2
fatal: [52.211.217.99]: FAILED! => {
    "failed": true, 
    "msg": "The conditional check 'logstash_ssl_key_file' failed. The error was: error while evaluating conditional (logstash_ssl_key_file): 'domain' is undefined\n\nThe error appears to have been in '/home/anthony/repos/ansible/roles/geerlingguy.logstash/tasks/ssl.yml': line 2, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n---\n- name: Ensure Logstash SSL key pair directory exists.\n  ^ here\n"

I can confirm the directory and keys exist on the remote server with the correct permissions.
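A likely cause: a bare-variable conditional like `when: logstash_ssl_key_file` is evaluated as a Jinja2 expression, so the string value `domain.com.key` is parsed as an attribute lookup on an undefined variable named `domain`. A hedged sketch of a safer conditional (the task body is an assumption about what ssl.yml does):

```yaml
- name: Ensure Logstash SSL key pair directory exists.
  file:
    path: "{{ logstash_ssl_dir }}"
    state: directory
  # Test definedness/emptiness instead of letting the string value
  # itself be re-evaluated as a Jinja2 expression.
  when: logstash_ssl_key_file is defined and logstash_ssl_key_file | length > 0
```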

Undefined variable: '__java_packages' is undefined

Hi,

I run AWX version 10.0.0 with Ansible 2.9.5.

I am trying to use geerlingguy.logstash but got the following error:

"msg": "The task includes an option with an undefined variable. The error was: '__java_packages' is undefined\n\nThe error appears to be in '/tmp/awx_121_6qgkvwxg/project/TEST-Elk_stack/roles/geerlingguy.java/tasks/main.yml': line 20, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: Define java_packages.\n ^ here\n",
"_ansible_no_log": false

Here is my playbook, which is just the example from the role's README:

---
- hosts: tag_elastic_role_logstash

  pre_tasks:
    - name: Use Java 8 on Debian/Ubuntu.
      set_fact:
        java_packages:
          - openjdk-8-jdk
      when: ansible_os_family == 'Debian'

  roles:
    - geerlingguy.java
    - geerlingguy.elasticsearch
    - geerlingguy.logstash

Could you please advise on this?


Cheers

check mode fails

With the "--check" option, the role fails:

TASK [logstash : Get list of installed plugins.] *******************************
skipping: [xxx]

TASK [logstash : Install configured plugins.] **********************************
fatal: [xxx]: FAILED! => {"failed": true, "msg": "The conditional check 'item not in logstash_plugins_list.stdout' failed. The error was: error while evaluating conditional (item not in logstash_plugins_list.stdout): Unable to look up a name or access an attribute in template string ({% if item not in logstash_plugins_list.stdout %} True {% else %} False {% endif %}).\nMake sure your variable name does not contain invalid characters like '-': argument of type 'StrictUndefined' is not iterable\n\nThe error appears to have been in '/xxx/logstash/tasks/plugins.yml': line 16, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: Install configured plugins.\n  ^ here\n"}
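One possible fix, assuming the listing task is safe to run even during a dry run (it only reads state) and defaulting the registered result so the conditional never hits an undefined variable (the paths and variable names follow the error output above but are otherwise assumptions):

```yaml
- name: Get list of installed plugins.
  command: ./bin/logstash-plugin list
  args:
    chdir: /usr/share/logstash
  register: logstash_plugins_list
  changed_when: false
  check_mode: no   # run even with --check, so stdout is always defined

- name: Install configured plugins.
  command: ./bin/logstash-plugin install {{ item }}
  args:
    chdir: /usr/share/logstash
  with_items: "{{ logstash_install_plugins }}"
  when: item not in logstash_plugins_list.stdout | default('')
```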

Issue installing logstash on centos 7 (AWS)

Hi.

First of all thanks for this awesome job.

I am trying to use this role to install logstash along with filebeat and java. My playbook looks like:

---
- name: My awesome machine
  hosts: all
  become: true
  roles:
    - role: geerlingguy.java
    - role: geerlingguy.logstash
      logstash_install_plugins:
        - logstash-input-beats
        - logstash-output-amazon_es
    - role: geerlingguy.filebeat
      filebeat_output_logstash_enabled: false
      filebeat_enable_logging: true
      filebeat_log_level: info
      filebeat_create_config: false
      filebeat_prospectors:
        - input_type: log
          paths:
            - "/var/lib/docker/containers/*/*.log"
          json.message_key: log
          json.keys_under_root: true
          processors:
            - add_docker_metadata: ~

The output says:

TASK [geerlingguy.logstash : Get list of installed plugins.] **********************************************************************************************************************************
fatal: [ec2-XX-XXX-XXX-XXX.eu-central-1.compute.amazonaws.com]: FAILED! => {"changed": false, "module_stderr": "Shared connection to ec2-XX-XXX-XXX-XXX.eu-central-1.compute.amazonaws.com closed.\r\n", "module_stdout": "Traceback (most recent call last):\r\n  File \"/tmp/ansible_1fjH5w/ansible_module_command.py\", line 271, in <module>\r\n    main()\r\n  File \"/tmp/ansible_1fjH5w/ansible_module_command.py\", line 217, in main\r\n    os.chdir(chdir)\r\nOSError: [Errno 2] No such file or directory: '/opt/logstash'\r\n", "msg": "MODULE FAILURE", "rc": 1}

The machine does not have the /opt/logstash directory, so the task fails.
I will try to fix it in a fork.

Thanks!

Not Logstash 7.x compatible

To be expected as v7.x came out this month :)

https://github.com/geerlingguy/ansible-role-logstash/blob/master/templates/30-elasticsearch-output.conf.j2#L5

[2019-04-17T16:21:44,245][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"%{[@metadata][beat]}-%{+YYYY.MM.dd}", id=>"ad98956be5916254196fbc3e0b2ed41672bc1bce2a010672de6a04d45ac7af65", hosts=>[http://es1:9200, http://es2:9200, http://es3:9200], document_type=>"%{[@metadata][type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_6570343b-ed27-4d7b-a140-ec50b2d0075a", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>false, ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
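Based on the settings visible in the warning above, a 7.x-compatible version of the output template would simply drop the deprecated setting; a hedged sketch (the hosts value is a placeholder):

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    # document_type => "%{[@metadata][type]}"  # removed: unsupported in ES 7.x
  }
}
```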

https://github.com/geerlingguy/ansible-role-logstash/blob/master/files/filters/14-solr.conf#L9

[2019-04-17T16:20:49,288][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::PluginLoadingError", :message=>"Couldn't find any filter plugin named 'multiline'. Are you sure this is correct? Trying to load the multiline filter plugin resulted in this error: no such file to load -- logstash/filters/multiline", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:211:in `lookup_pipeline_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:137:in `lookup'", "org/logstash/plugins/PluginFactoryExt.java:200:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:184:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):293:in `initialize'", "org/jruby/RubyKernel.java:1047:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:43:in `block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:96:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:165:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:96:in `exclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:39:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:334:in `block in converge_state'"]}

UseParNewGC error on Ubuntu 18.04

The ansible-role-java role has recently been installing Java 10 on Ubuntu 18.04, and since UseParNewGC was deprecated in Java 9 and removed entirely in Java 10 (source), this role (ansible-role-logstash) fails at the installation step. It would be good to mention this in the README, so users know they need to pin a working Java version in their playbook like this:

  vars:
    java_packages:
      - openjdk-8-jre

Thanks for your great work :-)

Installation fails on Ubuntu 18.04LTS due to apt repo issue

Environment = Ubuntu Server 18.04LTS, using apt, x64

The play runs as expected until it reaches the apt repository stage. The repo GPG key is added successfully, and the check for an existing Logstash install succeeds (it is not yet installed), but refreshing the apt repos fails with the error "does not have a Release file", so the apt update is blocked and the play ends. This seems to be caused by the apt repo the role adds: it does not match the current Logstash documentation on elastic.co, so perhaps it just needs updating.

A workaround is to install Logstash by hand (or in a separate task), disable the error-halting behaviour for this step of the role, and let the play proceed to the Logstash configuration, which then works as normal.
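For reference, a hedged sketch of managing the documented Elastic apt repository in a pre_task (the 7.x branch is an assumption about the version being installed):

```yaml
pre_tasks:
  - name: Add the Elastic apt repository as documented on elastic.co.
    apt_repository:
      repo: deb https://artifacts.elastic.co/packages/7.x/apt stable main
      state: present
```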

Multiline filter issue with solr

There is a multiline filter reference in this filter that fails in Logstash 6.12, because the multiline filter plugin is no longer installed; only the codec remains:

ansible-role-logstash/files/filters/14-solr.conf

  multiline {
    pattern => "(([^\s]+)Exception.+)|(at:.+)"
    stream_identity => "%{logsource}.%{@type}"
    what => "previous"
  }

The symptom was Logstash would start and would not bind on port 5044. The log showed an error

[ERROR] 2018-09-29 01:26:13.695 [LogStash::Runner] agent - Cannot create pipeline {:reason=>"Couldn't find any filter plugin named 'multiline'. Are you sure this is correct? Trying to load the multiline filter plugin resulted in this error: Problems loading the requested plugin named multiline of type filter. Error: NameError NameError"}

Removing this conf allowed Logstash to listen on port 5044 successfully. It was a bit confusing, because the process started but was simply not listening on the beats input port.
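In Logstash 6.x and later, multiline handling lives in the codec rather than a filter plugin; a hedged sketch of the equivalent on a file input (the log path is hypothetical, and stream_identity has no codec counterpart):

```conf
input {
  file {
    path => "/var/log/solr/solr.log"   # hypothetical path
    codec => multiline {
      pattern => "(([^\s]+)Exception.+)|(at:.+)"
      what => "previous"
    }
  }
}
```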

Enable installation of plugins

@geerlingguy I was thinking about opening a pull request that adds options to enable/install (some) Logstash plugins.
Is this something you'd like in this repo, or would you prefer that I create a separate role with this one as a dependency?
