
Comments (10)

okkez commented on June 7, 2024

> Now it works. Does it mean two records with the same id need to appear within 5 seconds?

Yes. You can customize this via flush_interval.
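
For example, if the paired start/stop records can arrive more than 5 seconds apart, a larger flush_interval gives the filter more time to join them. A minimal sketch based on the filter from the example below (the value is in seconds; @NORMAL is the label used there):

<filter dummy>
  @type concat
  key payload
  stream_identity_key id
  multiline_start_regexp /^start/
  # wait up to 30 seconds for the matching record before flushing
  flush_interval 30
  timeout_label @NORMAL
</filter>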

You can try https://github.com/fluent/fluentd-benchmark to estimate performance.


okkez commented on June 7, 2024

This is expected behavior. Check the documentation for timeout_label and try the following.

<source>
  @type dummy
  dummy [
   {"date": "2018-10-10", "id": "unique_id_01", "payload": "start"},
   {"date": "2018-10-10", "id": "unique_id_02", "payload": "start"},
   {"date": "2018-10-10", "id": "unique_id_01", "payload": "stop success"},
   {"date": "2018-10-10", "id": "unique_id_02", "payload": "stop fail"}
  ]
  tag dummy
  @label @INPUT
</source>

<label @INPUT>
  <filter dummy>
    @type concat
    key payload
    stream_identity_key id
    multiline_start_regexp /^start/
    flush_interval 5
    timeout_label @NORMAL
  </filter>
  <match dummy>
    @type relabel
    @label @NORMAL
  </match>
</label>

<label @NORMAL>
  <match dummy>
    @type stdout
  </match>
</label>

2018-10-16 18:35:21.030265578 +0900 dummy: {"date":"2018-10-10","id":"unique_id_01","payload":"start\nstop success"}
2018-10-16 18:35:22.031409080 +0900 dummy: {"date":"2018-10-10","id":"unique_id_02","payload":"start\nstop fail"}
2018-10-16 18:35:25.034645575 +0900 dummy: {"date":"2018-10-10","id":"unique_id_01","payload":"start\nstop success"}
2018-10-16 18:35:26.035902926 +0900 dummy: {"date":"2018-10-10","id":"unique_id_02","payload":"start\nstop fail"}


okkez commented on June 7, 2024

Please use stream_identity_key.

<source>
  @type tail
  path /path/to/log
  <parse>
    @type regexp
    expression /(?<date>2018-10-10) (?<unique_id>unique_id_\d\d) (?<message>.+)/
  </parse>
</source>

<filter>
  @type concat
  key message
  stream_identity_key unique_id
  multiline_start_regexp /^start/
</filter>
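
With stream_identity_key, each unique_id keeps its own buffer, so interleaved lines from different ids are concatenated separately. A sketch of the expected result for logs like "2018-10-10 unique_id_01 start" followed later by "2018-10-10 unique_id_01 stop success" (assuming the regexp above matches):

{"date":"2018-10-10","unique_id":"unique_id_01","message":"start\nstop success"}
{"date":"2018-10-10","unique_id":"unique_id_02","message":"start\nstop fail"}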


like-inspur commented on June 7, 2024

I added the config for input and filter, and want to output records to Elasticsearch. But there is no output for this input, and /var/log/td-agent/td-agent.log shows this:
2018-10-16 14:54:39 +0800 [info]: #0 disable filter chain optimization because [Fluent::Plugin::ConcatFilter, Fluent::RecordModifierFilter] uses #filter_stream method.


okkez commented on June 7, 2024

Can you show me your full configuration and example logs?


like-inspur commented on June 7, 2024

current config:

example logs:
2018-10-10 unique_id_01 start
2018-10-10 unique_id_02 start
2018-10-10 unique_id_01 stop success
2018-10-10 unique_id_02 stop fail


okkez commented on June 7, 2024

First, check your configuration using filter_stdout, like the following:

<source>
  @type tail
  # ... snip...
  tag audit
</source>
<filter audit>
  @type stdout # check parsed log
</filter>
<filter audit>
  @type concat
  # ... snip ...
</filter>
<filter audit>
  @type stdout # check concatenated log
</filter>
# ... snip ...
<match audit>
  @type copy
  <store>
    @type stdout
  </store>
  <store>
    @type elasticsearch
    # ... snip ...
  </store>
</match>


like-inspur commented on June 7, 2024

There is no error in parsing the logs, but there is no output from the concatenation step, only a warning after a while like this:
2018-10-16 19:24:08 +0800 [warn]: #0 dump an error event: error_class=Fluent::Plugin::ConcatFilter::TimeoutError error="Timeout flush: audit:unique_id_01" location=nil tag="audit" time=2018-10-16 19:24:08.770271142 +0800 record={"date"=>"2018-10-10", "id"=>"unique_id_01", "Payload"=>"start\nstop success"}
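
With only multiline_start_regexp, the filter cannot tell that a message is complete until the next start line for that id arrives, so the last buffered message for each id is flushed by the timeout and reported as an error event. Setting timeout_label, as in the full example above, routes those flushed records back into the pipeline instead. A minimal sketch, assuming the tag is audit and the key names match the record shown in the warning:

<filter audit>
  @type concat
  key Payload
  stream_identity_key id
  multiline_start_regexp /^start/
  flush_interval 5
  # records flushed by the timeout are re-emitted under this label
  timeout_label @NORMAL
</filter>

<match audit>
  @type relabel
  @label @NORMAL
</match>

<label @NORMAL>
  <match audit>
    @type elasticsearch
    # ... snip ...
  </match>
</label>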


like-inspur commented on June 7, 2024

Now it works. Does it mean two records with the same id need to appear within 5 seconds? And how can I estimate how many records per second fluentd can handle?


okkez commented on June 7, 2024

Closing.

