
Comments (19)

danhanks avatar danhanks commented on August 18, 2024 1

I can't recall all the reasons why, but I ended up writing it as a standalone daemon/exporter that watches (via inotify) for json datafiles generated by dsc. It parses those files, and generates a bunch of metrics that can then be scraped regularly by Prometheus. I imagine some of this code could also be re-purposed into an output module for dsc-datatool, but I need to do more reading about dsc-datatool to see how that would work.
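The exporter idea described above can be sketched briefly. The JSON layout and metric name below are illustrative assumptions, not dsc's actual schema; a real exporter would also watch the spool directory (e.g. via inotify) and serve the lines over HTTP for Prometheus to scrape:

```python
import json

def dsc_json_to_prometheus(raw: str) -> list[str]:
    """Convert a hypothetical dsc-style JSON snapshot into Prometheus
    exposition-format lines. The input layout here is an illustrative
    assumption, not dsc's real schema."""
    data = json.loads(raw)
    lines = [f"# TYPE {data['metric']} counter"]
    for row in data["rows"]:
        # Sort labels so output is deterministic between runs.
        labels = ",".join(f'{k}="{v}"' for k, v in sorted(row["labels"].items()))
        lines.append(f"{data['metric']}{{{labels}}} {row['value']}")
    return lines

sample = '''{"metric": "pcap_stats",
             "rows": [{"labels": {"server": "test-server", "ifname": "eth0"},
                       "value": 5625}]}'''
for line in dsc_json_to_prometheus(sample):
    print(line)
```

A scraping endpoint on top of this would typically use prometheus_client or a plain HTTP handler; the parsing core is the part dsc-specific code would need to replace with dsc's real file format.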

from dsc-datatool.

Daniel15 avatar Daniel15 commented on August 18, 2024

@danhanks in DNS-OARC/dsc#204 you said that you were writing a script to run in cron that would convert the data to a format that Prometheus can use. Did you end up implementing that?

I haven't looked too far into it but I guess using https://github.com/prometheus/influxdb_exporter could be an option as well. It exposes an InfluxDB-like web API and converts all metrics to Prometheus format.

jelu avatar jelu commented on August 18, 2024

Hey @Daniel15, since Dan hasn't responded, are you going ahead with the development of this?

danhanks avatar danhanks commented on August 18, 2024

@Daniel15, @jelu,

Yes, I did end up writing that script. I would be happy to contribute it, just need to get permission from my employer.

jelu avatar jelu commented on August 18, 2024

@danhanks Great to hear, hope you can share. It would be very nice if this could then be turned into an output module for dsc-datatool.

danhanks avatar danhanks commented on August 18, 2024

@Daniel15 @jelu,

I have received approval to contribute this code. Where in the repo would it make sense to put it? contrib, maybe?

jelu avatar jelu commented on August 18, 2024

@danhanks Just put it anywhere really, or do a gist. Will look at reworking it into a plugin once I'm back from holidays.

jelu avatar jelu commented on August 18, 2024

@danhanks Back from holidays, did you put the code somewhere?

danhanks avatar danhanks commented on August 18, 2024

@jelu Thanks for the reminder. Here you go: https://gist.github.com/danhanks/9c59734f380ac56a8c1bdb7bec54bdb4

Let me know if you have any questions.

jelu avatar jelu commented on August 18, 2024

Thanks @danhanks.

By the looks of it, it's not something I can add to contrib right away, nor make into a dsc-datatool module.

The script seems very specific to your needs and your setup. It assumes a lot of things that might not match what others want or have. Someone would need to work on it a bit to make it suitable for general use, e.g. command-line options for most settings, and maybe using dnspython for DNS number-to-text conversion rather than hardcoded lists.

An output module could be made, but it would likely depend solely on node_exporter or some other mechanism for delivering the stats to Prometheus. While the formats are similar, they are not the same: things like grouping, help text, histograms and summaries are not done in InfluxDB; that was handled in Grafana. I could probably create an output module quite quickly, but I would need someone with Prometheus knowledge and a setup to test it.

Do any of you (@danhanks @Daniel15) have time/want to take on any of this?

jguidini avatar jguidini commented on August 18, 2024

Hi All!
A Prometheus exporter for DSC would be great! I have been looking for one for some time. Here I cannot use InfluxDB (which dsc-datatool needs), so Grafana cannot be used to show the data.
I can test the software if needed.
Thanks all for the idea and effort!

jelu avatar jelu commented on August 18, 2024

@jguidini Are you able to use Prometheus node_exporter in your setup?

jguidini avatar jguidini commented on August 18, 2024

@jelu Yes.

jelu avatar jelu commented on August 18, 2024

@jguidini Please try this branch with the node_exporter.

You'll need to use the new output --output ";Prometheus;file=<file>" and write the file somewhere before moving it into the node_exporter textfile directory (as described in their documentation, linked above).

This generates output as:

# TYPE pcap_stats counter
pcap_stats{server="test-server",node="test-node",pcap_stat="filter_received",ifname="eth0"} 5625 1563520560000
pcap_stats{server="test-server",node="test-node",pcap_stat="kernel_dropped",ifname="eth0"} 731 1563520560000
pcap_stats{server="test-server",node="test-node",pcap_stat="pkts_captured",ifname="eth0"} 4894 1563520560000
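Each sample line above follows the exposition format name{labels} value timestamp_ms. As a purely illustrative sketch (this helper is hypothetical, not part of dsc-datatool, and its comma-split of labels is naive but sufficient for lines like these), pulling one line apart looks like:

```python
import re

# name{labels} value [timestamp]; label values contain no commas here.
LINE = re.compile(r'^(?P<name>\w+)\{(?P<labels>[^}]*)\}\s+'
                  r'(?P<value>\S+)(?:\s+(?P<ts>\d+))?$')

def parse_metric_line(line: str) -> dict:
    """Split one exposition-format sample into name, labels, value and
    optional millisecond timestamp. Naive label split: assumes no commas
    or escapes inside label values."""
    m = LINE.match(line)
    pairs = dict(pair.split("=", 1) for pair in m.group("labels").split(","))
    return {"name": m.group("name"),
            "labels": {k: v.strip('"') for k, v in pairs.items()},
            "value": float(m.group("value")),
            "timestamp_ms": int(m.group("ts")) if m.group("ts") else None}
```

The trailing millisecond timestamp is valid in the exposition format itself, but, as comes up later in this thread, node_exporter's textfile collector rejects it.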

jelu avatar jelu commented on August 18, 2024

@jguidini If you run into problems or need help setting it up, maybe it's easier if we talk on OARC's Mattermost; find me here: https://chat.dns-oarc.net/community/channels/oarc-software.

jguidini avatar jguidini commented on August 18, 2024

@jelu I've installed Prometheus and configured node_exporter on a DNS server at our site. I installed dsc-datatool and generated a file in the Prometheus format. In Grafana I added your dashboards, but now I'm working on getting node_exporter to read the file generated by dsc-datatool (via --collector.textfile.directory /opt/prometheus/data) so Prometheus can collect the metrics.

jguidini avatar jguidini commented on August 18, 2024

@jelu I found the error in the node_exporter debug log:

Jan 18 16:39:15 bee10 node_exporter[4794]: ts=2022-01-18T19:39:15.328Z caller=textfile.go:219 level=error collector=textfile msg="failed to collect textfile data" file=datatool.prom err="failed to parse textfile data from \"/opt/prometheus/data/datatool.prom\": text format parsing error in line 120: invalid escape sequence '\\='"

From file (datatool.prom):

   120 asn_all{server="bee10",node="recursivo",ipversion="IPv4",asn="PzQ\="} 60 1642521960000
   121 asn_all{server="bee10",node="recursivo",ipversion="IPv6",asn="PzY\="} 27 1642521960000
   122 country_code{server="bee10",node="recursivo",countrycode="BR"} 125 1642521960000
   123 country_code{server="bee10",node="recursivo",countrycode="PzQ\="} 60 1642521960000
   124 country_code{server="bee10",node="recursivo",countrycode="PzY\="} 27 1642521960000

To test, I removed \= throughout the file (with our old friend sed); then from the node_exporter log:

Jan 18 16:45:15 bee10 node_exporter[5934]: ts=2022-01-18T19:45:15.368Z caller=textfile.go:219 level=error collector=textfile msg="failed to collect textfile data" file=datatool.prom err="failed to parse textfile data from \"/opt/prometheus/data/datatool.prom\": text format parsing error in line 4054: invalid escape sequence '\\ '"

Again, sed on some entries... and another error:

Jan 18 16:50:01 bee10 node_exporter[6873]: ts=2022-01-18T19:50:01.268Z caller=textfile.go:219 level=error collector=textfile msg="failed to collect textfile data" file=datatool.prom err="textfile \"/opt/prometheus/data/datatool.prom\" contains unsupported client-side timestamps, skipping entire file"

Now I don't know how to solve this.
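For context on these errors: the Prometheus exposition format allows only three escape sequences inside label values (backslash, double quote, and newline), so \= and an escaped space are rejected by the parser. A minimal sketch of a conforming escaper (a hypothetical helper, not dsc-datatool's actual code):

```python
def escape_label_value(value: str) -> str:
    """Escape a label value per the Prometheus exposition format:
    only backslash, double quote and newline may (and must) be escaped.
    Everything else, including '=' and spaces, is emitted as-is."""
    return (value.replace("\\", "\\\\")
                 .replace('"', '\\"')
                 .replace("\n", "\\n"))

print(f'asn_all{{asn="{escape_label_value("PzQ=")}"}} 60')
# prints: asn_all{asn="PzQ="} 60
```

Note the "=" passes through unescaped; backslash-escaping it, as in the generated file above, is what made the textfile collector reject the line.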

jelu avatar jelu commented on August 18, 2024

@jguidini I've fixed the quoting, you can git pull to get the updated code.

I've also removed the timestamps because, while the format supports them, I found "Note: Timestamps are not supported." in the node_exporter docs 😕

I see that you've joined our Mattermost, so let's continue the discussion there 🙂

jelu avatar jelu commented on August 18, 2024

Is there anyone here who is up for writing a guide on how to set this up?

   Prometheus' node_exporter
       This output can be used together with Prometheus' node_exporter's Textfile Collector
       to automate statistics gathering, but some specific setup and requirements must be
       met.

       You must hide the timestamp with the option timestamp=hide, because timestamps are
       not supported by the Textfile Collector.

       You must make sure only one XML file from a server+node combination is processed at
       a time. Otherwise you will have multiple data points for the same metric in the
       generated files, and because the Textfile Collector does not support timestamps it
       cannot separate the measurements.

       You must make sure that only one file (per server+node combo) is generated for the
       Textfile Collector to read, and it should be the same file between runs. See the
       Textfile Collector's documentation on how to set that up atomically.
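The atomic-update requirement in the last paragraph is commonly met by writing to a temporary file on the same filesystem and then renaming it into place, since rename is atomic on POSIX filesystems. A hedged Python sketch of that pattern (paths and names are illustrative, not part of dsc-datatool):

```python
import os
import tempfile

def publish_textfile(content: str, dest: str) -> None:
    """Atomically publish metrics for node_exporter's textfile collector:
    write to a temp file in the destination directory, then rename over
    the target. os.replace is atomic on POSIX, so the collector sees
    either the old file or the new one, never a partial write."""
    directory = os.path.dirname(dest) or "."
    fd, tmp = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as fh:
            fh.write(content)
        os.replace(tmp, dest)
    except BaseException:
        os.unlink(tmp)  # clean up the temp file on failure
        raise

# Illustrative destination; a real setup would target the directory
# passed to node_exporter via --collector.textfile.directory.
dest = os.path.join(tempfile.gettempdir(), "datatool.prom")
publish_textfile('# TYPE pcap_stats counter\npcap_stats{ifname="eth0"} 1\n', dest)
```

The temp file must live in the same directory (same filesystem) as the destination; a rename across filesystems falls back to copy-and-delete, which is not atomic.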
