
LogFlow

A simple log agent for use with Elasticsearch and Kibana. Its main job is to take a line of log from disk and transform it into a JSON document that can be indexed by Elasticsearch.

Install from NuGet

Command line:

cmd> nuget install LogFlow

Package Manager console in Visual Studio:

PM> Install-Package LogFlow

Example

Read a log line from disk:

2013-09-11 11:53:43 WARN All log are belong to us

and send it to Elasticsearch as formatted JSON:

{
	"@timestamp": "2013-09-11 11:53:43",
	"LogLevel": "WARN",
	"Message": "All log are belong to us"
}
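LogFlow builds this document with Newtonsoft.Json's LINQ-to-JSON types (the processor example later in this README adds fields with `JValue`). For reference, an equivalent document can be constructed directly like this:

```csharp
using System;
using Newtonsoft.Json.Linq;

class Demo
{
    static void Main()
    {
        // Build the same document Elasticsearch would receive.
        var json = new JObject(
            new JProperty("@timestamp", "2013-09-11 11:53:43"),
            new JProperty("LogLevel", "WARN"),
            new JProperty("Message", "All log are belong to us"));

        Console.WriteLine(json.ToString());
    }
}
```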

Project setup

Create a project of type Class Library and install the NuGet package:

PM> Install-Package LogFlow

Two config files used by LogFlow will be added to your project: LogFlow.exe.config, which configures storage and the web interface, and NLog.config, which configures logging.

Flow

A flow has one input, multiple processors and one output.

public class MyLogFlow : Flow
{
    public MyLogFlow()
    {
        CreateProcess("InsteadOfClassName")
            .FromInput(new FileInput("C:\\MyLogPath", Encoding.UTF8, true))
            .Then(new MyLogLineParser())
            .ToOutput(new ElasticSearchOutput(new ElasticSearchConfiguration()
            {
                Host = "localhost",
                Port = 9200,
                IndexNameFormat = @"\m\y\L\o\g\s\-yyyyMM" // new index each month
            }));
    }
}
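The `IndexNameFormat` is a standard .NET `DateTime` format string: the backslashes escape literal characters, so the format above yields one index name per month. A quick check of what it produces:

```csharp
using System;

class Demo
{
    static void Main()
    {
        // Backslash-escaped characters are emitted literally;
        // yyyyMM expands to the year and month.
        var name = new DateTime(2013, 9, 11).ToString(@"\m\y\L\o\g\s\-yyyyMM");
        Console.WriteLine(name); // prints "myLogs-201309"
    }
}
```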

The included input is FileInput.

The included outputs are ElasticSearchOutput and ConsoleOutput.

No processors are included; you write your own for your log format.

Processors

A processor takes a Result and returns a Result. The Result contains the line that was read, the resulting JSON structure, the event timestamp, and the ability to cancel the flow and read the next line.

public class MyLogLineParser : LogProcessor
{
    public override Result Process(Result result)
    {
        // Example line: "2013-09-11 11:53:43 WARN All log are belong to us"
        var logLine = result.Line;

        // The timestamp ends at the second space.
        var timestampPosition = logLine.IndexOf(" ", logLine.IndexOf(" ") + 1);
        var eventTimeStamp = DateTime.Parse(
            logLine.Substring(0, timestampPosition).Trim(),
            CultureInfo.InvariantCulture,
            DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);

        // The log level is the next space-delimited token; the rest is the message.
        var logLevelPosition = logLine.IndexOf(" ", timestampPosition + 1);
        var logLevel = logLine.Substring(timestampPosition, logLevelPosition - timestampPosition).Trim();
        var message = logLine.Substring(logLine.IndexOf(" ", logLevelPosition)).Trim();

        result.EventTimeStamp = eventTimeStamp;
        result.Json.Add("LogLevel", new JValue(logLevel));
        result.Json.Add("Message", new JValue(message));

        return result;
    }
}

Try not to overwrite the Json property; only add to it, because earlier steps might have added data to it. There is also a MetaData property to help you transport data through the flow without it being saved to the output.
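As a sketch of that guideline, the hypothetical processor below only adds fields to `Json` and uses `MetaData` to pass an intermediate value along the flow. The dictionary-style `MetaData` access shown here is an assumption about the `Result` type; verify it against your LogFlow version.

```csharp
public class EnrichProcessor : LogProcessor
{
    public override Result Process(Result result)
    {
        // Add to the existing JSON; never reassign result.Json,
        // since earlier processors may already have populated it.
        result.Json.Add("Host", new JValue(Environment.MachineName));

        // Transport a value to later processors without writing it
        // to the output. NOTE: dictionary-style access is an assumption.
        result.MetaData["RawLineLength"] = result.Line.Length;

        return result;
    }
}
```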

Set up debugging

  • Successfully build the project in Debug
  • Project properties > Debug > Start external program > Select LogFlow.exe in the debug output folder
  • Press F5

Install as a service

LogFlow.exe install

logflow's People

Contributors

andlju, captainjinx, emilcardell, gittest0101, hcanber, laviniac, madstt, offerta-jenkins, torkelo, wallymathieu


logflow's Issues

Skip lines

How do I skip lines?

IIS log files start with some column description lines. When I return a null result or just an empty result from the LogParser I get exceptions.

Need to clear cache and restart service to index log rows

Hello,
we've noticed that sometimes (sorry to be vague here, but we're not exactly sure how and when this appears) we need to stop the LogFlow service, clear the temporary files stored in StoragePath, and then restart the service to get log rows indexed again.

During that period LogFlow does not appear to index anything, even though there are rows to index; the following line is repeated (with different timestamps, of course) in the LogFlow log file:

2015-03-09 09:54:52.5988 TRACE PaymentFlow: Enqueuing file 'd:\path\current.log' for processing.

After a restart of the service the following is printed in the log file:

2015-03-09 14:23:47.6314 INFO PaymentFlow: Stopping.
2015-03-09 14:23:48.6315 INFO PaymentFlow: Stopped FileSystemWatcher for path D:\path\current.log
2015-03-09 14:23:49.6470 INFO PaymentFlow: Stopped.
2015-03-09 14:24:07.3814 TRACE Starting
2015-03-09 14:24:07.3971 TRACE Assembly Path:D:(path)
2015-03-09 14:24:07.7720 TRACE Number of flows found: 1
2015-03-09 14:24:07.8033 INFO PaymentFlow: Starting.
2015-03-09 14:24:07.8033 INFO PaymentFlow: Starting FileInput.
2015-03-09 14:24:07.8033 INFO PaymentFlow: Adding all current files as changed.
2015-03-09 14:24:07.8033 TRACE PaymentFlow: Enqueuing file 'D:\path\current.log' for processing.
2015-03-09 14:24:07.8033 INFO PaymentFlow: Started FileSystemWatcher for path D:\path\current.log
2015-03-09 14:24:07.8033 INFO PaymentFlow: Started.
2015-03-09 14:24:07.8346 TRACE PaymentFlow: (2fbcca25-8454-43f2-bf5d-ef21b08ecaa5) from 'D:\payment\current.log' at byte position 0.
2015-03-09 14:24:07.8346 TRACE PaymentFlow: (2fbcca25-8454-43f2-bf5d-ef21b08ecaa5) line '(logrow)' read.
2015-03-09 14:24:08.1939 TRACE PaymentFlow: (2fbcca25-8454-43f2-bf5d-ef21b08ecaa5) Indexed successfully.

We're using NLog to create the log file:

  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <include file="${basedir}/NLogLayouts.config" />
    <targets async="true">
      <target name="file" xsi:type="File" layout="${FileLayout}" fileName="D:\path\current.log" archiveFileName="D:\path\{#}.log" archiveEvery="Day" archiveNumbering="Date" archiveDateFormat="yyyy-MM-dd" maxArchiveFiles="7" concurrentWrites="false" />
    </targets>
    <rules>
      <logger name="*" minlevel="Info" writeTo="file" />
    </rules>
  </nlog>

Any ideas? =)

Failed to try examples from source

I tried running LogFlow.Examples (from source in VS), but Nancy seems to have a dependency on Newtonsoft.Json 4.5.

When I disabled EnableNancyHealthModule (set to false in config) I get another assembly-loading error. Not sure what yet; I will have to check the Fusion assembly-loading logs.

Very line oriented

It would be nice to pipe XML logs using this tool, for instance log4j- or log4net-formatted file logs.

Performance and flow

I am not really sure how the FileInput is supposed to work.

It looks like it is opening and closing the file on every line read.

The reason I ask is that I see extremely poor performance: when using FileInput piped to ConsoleOutput, I only get about 100 lines per second.

File Path formats

It appears that you cannot use a single FileInput to get multiple logs in many directories.

new FileInput(@"D:\Logs\**\*.log" , Encoding.UTF8, true)

Did not seem to work.
