Log4JS - GELF appender

This is an optional appender for log4js-node.

npm install @log4js-node/gelf

The GELF appender supports sending log messages over UDP to a GELF-compatible server such as Graylog. It uses node's core UDP support and does not require any other dependencies. If you use this appender, remember to call log4js.shutdown when your application terminates, so that all messages have been sent to the server and the UDP socket can be closed. The appender supports passing custom fields to the server both in the config and in individual log messages (see the examples below).
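
For example, a minimal sketch of flushing the appender on termination; the SIGTERM handler here is illustrative and not part of the appender itself:

const log4js = require('log4js');

log4js.configure({
  appenders: { gelf: { type: '@log4js-node/gelf' } },
  categories: { default: { appenders: ['gelf'], level: 'info' } }
});

const logger = log4js.getLogger();
logger.info('service starting');

// Flush any pending UDP messages and close the socket before exiting.
process.on('SIGTERM', () => {
  log4js.shutdown(() => process.exit(0));
});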

Configuration

  • type - @log4js-node/gelf
  • host - string (defaults to localhost) - the GELF server hostname
  • port - integer (defaults to 12201) - the port the GELF server is listening on
  • hostname - string (defaults to OS.hostname()) - the hostname used to identify the origin of the log messages
  • facility - string (optional)
  • customFields - object (optional) - fields to be added to each log message; custom fields must start with an underscore (a sketch combining all of these options follows this list)
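
Example (all options)

A sketch combining every option listed above; the host, port, hostname, facility, and custom-field values here are placeholders rather than project defaults:

log4js.configure({
  appenders: {
    gelf: {
      type: '@log4js-node/gelf',
      host: 'gelf.example.com',
      port: 12201,
      hostname: 'my-app-01',
      facility: 'my-app',
      customFields: { '_environment': 'staging' }
    }
  },
  categories: {
    default: { appenders: ['gelf'], level: 'info' }
  }
});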

Example (default config)

log4js.configure({
  appenders: {
    gelf: { type: '@log4js-node/gelf' }
  },
  categories: {
    default: { appenders: ['gelf'], level: 'info' }
  }
});

This will send log messages to a server at localhost:12201.

Example (custom fields in config)

log4js.configure({
  appenders: {
    gelf: { type: '@log4js-node/gelf', host: 'gelf.server', customFields: { '_something': 'yep' } }
  },
  categories: {
    default: { appenders: ['gelf'], level: 'info' }
  }
});

This will result in all log messages having the custom field _something set to 'yep'.
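
Conceptually, the UDP datagram carries a GELF JSON document in which the custom field sits alongside the standard fields. The sketch below follows the GELF 1.1 specification and is illustrative only; the exact field set produced by the appender may differ:

{
  "version": "1.1",
  "host": "my-app-01",
  "short_message": "something happened",
  "timestamp": 1611283489.123,
  "level": 6,
  "_something": "yep"
}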

Example (custom fields in log message)

log4js.configure({
  appenders: {
    gelf: { type: '@log4js-node/gelf', customFields: { '_thing': 'isathing' } }
  },
  categories: {
    default: { appenders: ['gelf'], level: 'info' }
  }
});
const logger = log4js.getLogger();
logger.error({ GELF: true, _thing2: 'alsoathing' }, 'oh no, something went wrong');

This will result in a log message with the custom fields _thing and _thing2. Note that log message custom fields will override config custom fields.
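
For instance, a short sketch of that override behaviour (with this config, the _thing value sent for the message below would be 'overridden', not 'isathing'):

log4js.configure({
  appenders: {
    gelf: { type: '@log4js-node/gelf', customFields: { '_thing': 'isathing' } }
  },
  categories: {
    default: { appenders: ['gelf'], level: 'info' }
  }
});
const logger = log4js.getLogger();
// The per-message field overrides the value configured above.
logger.error({ GELF: true, _thing: 'overridden' }, 'oh no, something went wrong');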

gelf's Issues

short_message not appearing

I'm in the process of testing out log4js-node as a replacement for bunyan, and checking that the gelf transport works as expected.

I'm using gelf to send data to logstash. It's working for the most part, except that short_message doesn't appear in kibana. If I change the code to use _short_message instead, it works as expected. I've tried several different layouts - currently testing with dummy, but I've tested basic and the default.

My appender looks like this:

gelf: {type: '@log4js-node/gelf', host: 'xxx', port: 12201, facility: 'log-test', layout: {type: 'dummy'}, customFields: {_foo: 'bar'}}

This is really weird. The facility property gets sent correctly, and timestamp, host, level, etc. all work.

According to the logstash docs short_message is expected, and it works with the existing bunyan setup.

I can't see any important difference between the code in this repo vs the gelf-stream package used for bunyan.

I also checked gelf-stream's dep gelfling and it's not adding the _.

I am still looking for configuration issues that could be causing this, but I'm pretty confused because the messages are being sent to a different index but the same server, with all the same config as the working bunyan version.

Any ideas?

Not able to specify timestamp with the message

Hi,
I am using @log4js-node/gelf. I am sending a message with timestamp info, which is not being picked up by the module.

const log4js = require('log4js');

log4js.configure({
  appenders: {
    gelf: { type: '@log4js-node/gelf', host: 'localhost', port: 12201 }
  },
  categories: {
    default: { appenders: ['gelf'], level: 'info'  }
  }
});

module.exports = log4js;

const logger = log4js.getLogger();

logger.info({ GELF: true, _tool: 'proxy-parser', _uniqueId: 'test', _timestamp: 1611283489  }, 'very very old message');

But the message is getting logged with the current timestamp in Graylog.

Looks like the issue is with this line:

msg.timestamp = msg.timestamp || new Date().getTime() / 1000; // log should use millisecond

It should have been:

msg.timestamp = msg._timestamp || new Date().getTime() / 1000; // log should use millisecond

Can this be fixed?

Thank You,
Phani
