opc40772 / pfsense-graylog
Pfsense Logs Parsed by Graylog
License: GNU General Public License v3.0
Hi,
Thank you for this, it's really good stuff, excellent work!
I'm having problems using the provided dashboard: when I load it, most of the information isn't shown (only the interfaces and IPs). As far as I can tell, this is caused by a field that is never created in Graylog anywhere in the project: timestamp_graf, which is used by several panels on the dashboard.
Can you please clarify whether that field was intended to be the "real_timestamp" field?
Regards.
I have a Graylog cluster set up, and I am receiving the following errors in the logs:
2018-09-26T14:08:30.076+01:00 INFO [InputStateListener] Input [Syslog UDP/5a982448687cf8128c10ce6e] is now STOPPING
2018-09-26T14:08:30.077+01:00 INFO [InputStateListener] Input [Syslog UDP/5a982448687cf8128c10ce6e] is now STOPPED
2018-09-26T14:08:30.077+01:00 INFO [InputStateListener] Input [Syslog UDP/5a982448687cf8128c10ce6e] is now TERMINATED
2018-09-26T14:08:30.078+01:00 INFO [InputStateListener] Input [Syslog UDP/5a982448687cf8128c10ce6e] is now STARTING
2018-09-26T14:08:30.115+01:00 INFO [connection] Opened connection [connectionId{localValue:19, serverValue:5956}] to syslog01:27017
2018-09-26T14:08:30.118+01:00 ERROR [BundleImporter] Error while creating entities in content pack. Starting rollback.
java.lang.IllegalStateException: Configured lookup table doesn't exist
at org.graylog2.inputs.extractors.LookupTableExtractor.(LookupTableExtractor.java:62) ~[graylog.jar:?]
at org.graylog2.inputs.extractors.ExtractorFactory.factory(ExtractorFactory.java:72) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.addExtractor(BundleImporter.java:435) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.addExtractors(BundleImporter.java:422) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.createMessageInput(BundleImporter.java:400) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.createInputs(BundleImporter.java:356) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.runImport(BundleImporter.java:187) [graylog.jar:?]
at org.graylog2.bundles.BundleService.applyConfigurationBundle(BundleService.java:112) [graylog.jar:?]
at org.graylog2.bundles.BundleService.applyConfigurationBundle(BundleService.java:105) [graylog.jar:?]
at org.graylog2.rest.resources.system.bundles.BundleResource.applyBundle(BundleResource.java:184) [graylog.jar:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:143) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:315) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:297) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305) [graylog.jar:?]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154) [graylog.jar:?]
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:384) [graylog.jar:?]
at org.glassfish.grizzly.http.server.HttpHandler$1.run(HttpHandler.java:224) [graylog.jar:?]
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) [graylog.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
2018-09-26T14:08:30.118+01:00 ERROR [BundleImporter] Error while removing grok patterns during rollback.
java.lang.IllegalArgumentException: invalid hexadecimal representation of an ObjectId: [PFSENSE_LOG_DATA]
at org.bson.types.ObjectId.parseHexString(ObjectId.java:549) ~[graylog.jar:?]
at org.bson.types.ObjectId.(ObjectId.java:239) ~[graylog.jar:?]
at org.graylog2.grok.MongoDbGrokPatternService.load(MongoDbGrokPatternService.java:61) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.deleteCreatedGrokPatterns(BundleImporter.java:267) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.rollback(BundleImporter.java:228) [graylog.jar:?]
at org.graylog2.bundles.BundleImporter.runImport(BundleImporter.java:197) [graylog.jar:?]
at org.graylog2.bundles.BundleService.applyConfigurationBundle(BundleService.java:112) [graylog.jar:?]
at org.graylog2.bundles.BundleService.applyConfigurationBundle(BundleService.java:105) [graylog.jar:?]
at org.graylog2.rest.resources.system.bundles.BundleResource.applyBundle(BundleResource.java:184) [graylog.jar:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:143) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:315) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:297) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305) [graylog.jar:?]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154) [graylog.jar:?]
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:384) [graylog.jar:?]
at org.glassfish.grizzly.http.server.HttpHandler$1.run(HttpHandler.java:224) [graylog.jar:?]
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) [graylog.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
2018-09-26T14:08:30.119+01:00 ERROR [BundleImporter] Rollback unsuccessful.
2018-09-26T14:08:30.119+01:00 ERROR [AnyExceptionClassMapper] Unhandled exception in REST resource
java.lang.RuntimeException: java.lang.IllegalStateException: Configured lookup table doesn't exist
at org.graylog2.bundles.BundleImporter.runImport(BundleImporter.java:200) ~[graylog.jar:?]
at org.graylog2.bundles.BundleService.applyConfigurationBundle(BundleService.java:112) ~[graylog.jar:?]
at org.graylog2.bundles.BundleService.applyConfigurationBundle(BundleService.java:105) ~[graylog.jar:?]
at org.graylog2.rest.resources.system.bundles.BundleResource.applyBundle(BundleResource.java:184) ~[graylog.jar:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:143) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) ~[graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) ~[graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:315) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:297) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305) [graylog.jar:?]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154) [graylog.jar:?]
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:384) [graylog.jar:?]
at org.glassfish.grizzly.http.server.HttpHandler$1.run(HttpHandler.java:224) [graylog.jar:?]
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) [graylog.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.lang.IllegalStateException: Configured lookup table doesn't exist
at org.graylog2.inputs.extractors.LookupTableExtractor.(LookupTableExtractor.java:62) ~[graylog.jar:?]
at org.graylog2.inputs.extractors.ExtractorFactory.factory(ExtractorFactory.java:72) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.addExtractor(BundleImporter.java:435) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.addExtractors(BundleImporter.java:422) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.createMessageInput(BundleImporter.java:400) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.createInputs(BundleImporter.java:356) ~[graylog.jar:?]
at org.graylog2.bundles.BundleImporter.runImport(BundleImporter.java:187) ~[graylog.jar:?]
... 30 more
2018-09-26T14:08:30.131+01:00 WARN [NettyTransport] receiveBufferSize (SO_RCVBUF) for input SyslogUDPInput{title=Pfsense-Logs, type=org.graylog2.inputs.syslog.udp.SyslogUDPInput, nodeId=20c96645-7700-48ea-9ede-f828c812397b} should be 262144 but is 212992.
2018-09-26T14:08:30.130+01:00 INFO [connection] Opened connection [connectionId{localValue:18, serverValue:5955}] to syslog01:27017
2018-09-26T14:08:30.133+01:00 INFO [InputStateListener] Input [Syslog UDP/5a982448687cf8128c10ce6e] is now RUNNING
From what I can tell, it is complaining about the service-ports lookup table, but the CSV file is in the specified location.
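For reference, Graylog's CSV file data adapter reads a header row and then looks up the columns named in the adapter's key/value settings, so the file has to match what the adapter configuration expects. A minimal sketch of a service-port lookup file, assuming the adapter is configured with key column "port" and value column "service" (both column names here are hypothetical, not taken from this project):

```csv
"port","service"
"80","http"
"443","https"
"1900","ssdp"
```

If the column names, separator, or quote character in the file do not match the adapter settings, the lookup table fails even though the file exists at the configured path.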
Hello
I cannot log in to the Graylog container as the root user. Can anyone help me with this?
While trying to import your pfsense_custom_template into my ES via Cerebro, I am getting the following error:
[2018-06-26 12:58:30,262][DEBUG][action.admin.indices.template.put] [Sabra] failed to put template [pfsense-custom]
MapperParsingException[Failed to parse mapping [message]: No handler for type [keyword] declared on field [PFSENSE_UDP_DATA]]; nested: MapperParsingException[No handler for type [keyword] declared on field [PFSENSE_UDP_DATA]];
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:291)
at org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService.validateAndAddTemplate(MetaDataIndexTemplateService.java:213)
at org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService.access$200(MetaDataIndexTemplateService.java:57)
at org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService$2.execute(MetaDataIndexTemplateService.java:157)
at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:45)
at org.elasticsearch.cluster.service.InternalClusterService.runTasksForExecutor(InternalClusterService.java:480)
at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:784)
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:231)
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: MapperParsingException[No handler for type [keyword] declared on field [PFSENSE_UDP_DATA]]
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:307)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:222)
at org.elasticsearch.index.mapper.object.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:139)
at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:118)
at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:99)
at org.elasticsearch.index.mapper.MapperService.parse(MapperService.java:549)
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:319)
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:289)
I am using ES 2.4.6 with Cerebro 0.8.1 and Graylog 2.4.5.
Does anyone have an idea what is wrong here?
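The keyword field type was only introduced in Elasticsearch 5.0, so a template that uses it cannot be installed on ES 2.4.6; on the 2.x mapping API the closest equivalent is a non-analyzed string. A sketch of that substitution for the field named in the error above (the rest of the template left untouched):

```json
{
  "PFSENSE_UDP_DATA": {
    "type": "string",
    "index": "not_analyzed"
  }
}
```

Upgrading Elasticsearch to 5.x or later would also let the template import as-is.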
My Grafana dashboard is blank. I think I followed all the steps in the document.
Does anyone have any clues on what I could be missing?
Elasticsearch version (apt-cache policy elasticsearch): Installed: 5.6.14
Graylog server (apt-cache policy graylog-server): Installed: 2.4.6-1
Grafana: Installed: 5.4.2
I can see only ovpns statistics in Grafana. My pfSense interfaces are lagg0.{VLAN_ID}.
Looking into the Graylog stream, I see the pfSense fields populated only for ovpns-related items.
I'm new to the Graylog and Grafana world, but I think my issue is in the grok pattern. Using these two log messages as examples:
filterlog: 475,,,1424803213,lagg0.31,match,block,in,4,0x0,,64,39847,0,DF,6,tcp,60,192.168.31.168,95.100.81.146,52414,80,0,S,358918382,,29200,,mss;sackOK;TS;nop;wscale
filterlog: 9,,,1000000103,ovpns1,match,block,in,4,0x0,,1,59729,0,DF,17,udp,199,10.0.8.26,239.255.255.250,59296,1900,179
I tested the grok pattern with a grok pattern tester, and the first example fails, stopping at the iface field, which is parsed as just lagg0.
I edited the grok patterns, replacing the expression WORD:iface with USERNAME:iface:
%{INT:rule},%{INT:sub_rule}?,,%{INT:tracker},%{USERNAME:iface},%{WORD:reason},%{WORD:action},%{WORD:direction},
Now the stream fills the fields correctly, but I still can't see any interface except ovpns in Grafana.
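The interface fix above can be reproduced outside of Graylog. This is a rough Python sketch (not Graylog code) of the relevant prefix of the filterlog pattern: grok's WORD class cannot cross the dot in a VLAN sub-interface like "lagg0.31", while USERNAME's character class allows it.

```python
import re

# Approximate character classes of the two grok patterns involved:
WORD = r"\w+"                  # grok WORD  -- stops at the "." in "lagg0.31"
USERNAME = r"[a-zA-Z0-9._-]+"  # grok USERNAME -- matches "lagg0.31" whole

def prefix_pattern(iface_class):
    # Mirrors %{INT:rule},%{INT:sub_rule}?,,%{INT:tracker},%{...:iface},
    # %{WORD:reason},%{WORD:action},%{WORD:direction},
    return re.compile(
        r"(?P<rule>\d+),(?P<sub_rule>\d+)?,,(?P<tracker>\d+),"
        rf"(?P<iface>{iface_class}),(?P<reason>\w+),(?P<action>\w+),(?P<direction>\w+),"
    )

line = ("475,,,1424803213,lagg0.31,match,block,in,4,0x0,,64,39847,0,DF,6,tcp,"
        "60,192.168.31.168,95.100.81.146,52414,80,0,S,358918382,,29200,,"
        "mss;sackOK;TS;nop;wscale")

assert prefix_pattern(WORD).match(line) is None  # WORD version fails to match
m = prefix_pattern(USERNAME).match(line)
print(m.group("iface"))  # lagg0.31
```

This confirms the parsing side of the fix; an interface still missing from Grafana after this points at the dashboard's templated interface variable rather than the grok pattern.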
Hello,
First of all, many thanks for the great job.
I have one question about Pipeline rule:
Could you please explain the following line?
let source_timestamp = parse_date(substring(to_string(now("America/Habana")),0,23), "yyyy-MM-dd'T'HH:mm:ss.SSS");
In my pipeline statistics, no message is matching the rule. That is probably why I see nothing in my Grafana dashboard.
Many thanks.
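For what it's worth, the pipeline line in question can be read step by step: now("America/Habana") is the current time in the Cuba timezone, to_string renders it as an ISO-8601 string, substring(..., 0, 23) keeps only the first 23 characters ("yyyy-MM-dd'T'HH:mm:ss.SSS", i.e. the zone offset is cut off), and parse_date turns that truncated string back into a timestamp. An illustrative Python equivalent (an assumption-laden sketch, not Graylog code; the exact string rendered by to_string(now(...)) is assumed to start with that 23-character prefix):

```python
from datetime import datetime

# Emulate: parse_date(substring(to_string(now(...)), 0, 23),
#                     "yyyy-MM-dd'T'HH:mm:ss.SSS")
iso = datetime.now().isoformat(timespec="milliseconds")  # e.g. 2018-09-26T14:08:30.076
truncated = iso[:23]                                     # drop anything past .SSS
source_timestamp = datetime.strptime(truncated, "%Y-%m-%dT%H:%M:%S.%f")
print(source_timestamp)
```

So the line effectively re-stamps the message with "now" in a fixed timezone, stripped of its offset; it does not by itself explain zero matches, which usually means the rule's "when" condition never fires for the incoming messages.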
Changed it to
{
  "order": -1,
  "index_patterns": "pfsense_*",
  "settings": {
    "index": {
      "analysis": {
        "analyzer": {
          "analyzer_keyword": {
            "filter": "lowercase",
            "tokenizer": "keyword"
          }
        }
      },
      "max_result_window": "1000000"
    }
  },
  "mappings": {
    "message": {
      "_source": { "enabled": true },
      "dynamic_templates": [
        {
          "internal_fields": {
            "mapping": { "type": "keyword" },
            "match_mapping_type": "string",
            "match": "gl2_*"
          }
        },
        {
          "store_generic": {
            "mapping": { "type": "keyword" },
            "match_mapping_type": "string"
          }
        }
      ],
      "properties": {
        "gl2_processing_timestamp": { "format": "yyyy-MM-dd HH:mm:ss.SSS", "type": "date" },
        "gl2_accounted_message_size": { "type": "long" },
        "gl2_receive_timestamp": { "format": "yyyy-MM-dd HH:mm:ss.SSS", "type": "date" },
        "full_message": { "fielddata": false, "analyzer": "standard", "type": "text" },
        "streams": { "type": "keyword" },
        "source": { "fielddata": true, "analyzer": "analyzer_keyword", "type": "text" },
        "message": { "fielddata": false, "analyzer": "standard", "type": "text" },
        "timestamp": { "format": "yyyy-MM-dd HH:mm:ss.SSS", "type": "date" }
      }
    }
  },
  "aliases": {}
}
This is now broken due to changes in Graylog 3.x (the bundled java-grok library dropped support for underscores in grok pattern names).
Graylog2/graylog2-server#5704
thekrakken/java-grok#108
https://community.graylog.org/t/upgrade-of-graylog-from-2-5-x-to-3-x-results-in/9368
I'm on Graylog 2.4.0 and I created the index, but when I go to apply the imported pfSense content pack I get a generic error telling me to check the logs. I do see that it created the input and extractor, but not the stream and the rest. Any help would be appreciated.
Hi,
How can I install Cerebro on Ubuntu? Can you please help me out?
When running the Query Inspector, data returns as normal, but when looking at the dashboard I get "No Data Available" in every panel, and when I let the query go back more than 6 hours in history I get a "Failed To Parse Query" error when I pull up the dashboard. I had tons of firewall events coming in and all of a sudden they stopped. Please help? The Elasticsearch cluster is green and all.
Please include this information:
Query: iface:$iface AND src_ip:$src_ip
Query Inspector Response:
response: Object
  responses: Array[1]
    0: Object
      took: 1
      timed_out: false
      _shards: Object
        total: 8
        successful: 8
        skipped: 0
        failed: 0
      hits: Object
        total: 0
        max_score: 0
        hits: Array[0]
      aggregations: Object
        2: Object
          buckets: Array[901]
      status: 200
How are you getting the geolocation data from pfSense? My pfSense is only sending the IPs in the firewall log, not the geohash, country code, or lat/long.
After checking out my stream I can see
But not the rest of the expected fields. If I select "All Fields" and, for example, select "Action", nothing appears as I would expect. In turn, this means I have no results in Grafana.
Currently Running:
ElasticSearch: v5.6.12
PfSense: 2.4.4-RELEASE-p1 (amd64)
Graylog: v2.4.7
Hi,
first of all, thank you for your efforts and the info on setting up an easy way to get pfSense firewall logs into a nice dashboard.
I've followed the instructions and also this thread: #5
Now I finally have some data in my Grafana dashboard, but some graphs are still empty ("no datapoints" or "no data available").
These graphs are:
In the realtime logs by iface, the source and destination portname fields are empty.
Regarding the portname, maybe this has something to do with the lookup in the CSV?
java.lang.NullPointerException: null
at org.graylog2.lookup.adapters.CSVFileDataAdapter.doRefresh(CSVFileDataAdapter.java:102) ~[graylog.jar:?]
at org.graylog2.plugin.lookup.LookupDataAdapter.refresh(LookupDataAdapter.java:89) ~[graylog.jar:?]
at org.graylog2.lookup.LookupDataAdapterRefreshService.lambda$schedule$0(LookupDataAdapterRefreshService.java:142) ~[graylog.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_191]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_191]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_191]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_191]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
Does anyone have some advice or help?
I am new to the whole Graylog world. Data is coming through from pfSense, but everything ends up in the default stream and not in "pfsense logs". Where have I taken a wrong turn?