defenxor / dsiem
Security event correlation engine for ELK stack
License: GNU General Public License v3.0
/events only accepts a single event. Code in event.go and handler.go.
I am setting up an EFK stack, and then using dsiem for correlation.
The Fluentd out_http plugin, IIUC, only emits either ndjson or json with an array of events. Docs.
So I can try:
POSTing application/x-ndjson, which fails.
POSTing application/json with an array of normalized events, which also fails.
Possible ways forward:
Different fluentd configs for out_http.
Parse an event or an array of events.
I don't know if that's trivial (or even necessary), or whether it adds extra complexity.
ETA:
I'm running docker deployment in AWS to test Dsiem.
All containers up and running.
Unfortunately, I'm unable to trigger an alert by pinging a host in the same subnet.
Is there any other way to trigger an alert?
The description says:
Dsiem: preconfigured with a directive that correlates Suricata and Ossec events for the above scenario.
I've found the Suricata json plugin and then obtained eve.json from the suricata docker container. The fields match and everything is clear with that. But there's no plugin for ossec. Is the ossec parser hardcoded? How does it work?
need help on updated json format with HTTPS
I have created a logstash configuration file using dpluger to filter the firewall logs. I have attached the file for reference.
After setting the input, SIEM config file, and output, the firewall logs are still not reported to either DSIEM or the Kibana dashboard. Can anyone please help me achieve firewall log visualization in the Kibana dashboard?
Do I need to use the 80_siem.conf file?
Can anyone please share an updated logstash configuration file if possible?
input_file.conf:
input {
  syslog {
    port => 514
    codec => "line"
    tags => ["firewall"]
    grok_pattern => "%{GREEDYDATA:message}"
  }
}
dsiem_firewall.conf:
###############################################################################
###############################################################################
filter {
  if [fields][log_type] == "firewall" and [alert] {
    clone {
      clones => [ "siem_events" ]
    }
    if [type] == "siem_events" {
      mutate {
        id => "tag normalizedEvent"
        remove_field => [ "[fields][log_type]", "type" ]
        add_field => {
          "[@metadata][siem_plugin_type]" => "firewall"
          "[@metadata][siem_data_type]" => "normalizedEvent"
        }
      }
    }
  }
}
filter {
  if [@metadata][siem_plugin_type] == "firewall" {
    date {
      id => "timestamp"
      match => [ "[timestamp]", "ISO8601" ]
      target => "timestamp"
    }
    mutate {
      id => "siem_event fields"
      replace => {
        "title" => "%{[title]}"
        "src_index_pattern" => "siem_events-*"
        "sensor" => "%{[host]}"
        "product" => "Firewall-logs"
        "src_ip" => "%{[src_ip]}"
        "dst_ip" => "%{[dst_ip]}"
        "protocol" => ""
        "category" => "%{[category]}"
        "subcategory" => "%{[subcategory]}"
        "src_port" => "%{[src_port]}"
        "dst_port" => "%{[dst_port]}"
      }
    }
    mutate {
      id => "integer fields"
      convert => {
        "src_port" => "integer"
        "dst_port" => "integer"
      }
    }
    # delete fields except those included in the whitelist below
    prune {
      whitelist_names => [ "@timestamp$", "^timestamp$", "@metadata", "^src_index_pattern$", "^title$", "^sensor$", "^product$",
        "^src_ip$", "^dst_ip$", "^plugin_id$", "^plugin_sid$", "^category$", "^subcategory$",
        "^src_port$", "^dst_port$", "^protocol$", "^custom_label1$", "^custom_label2$", "^custom_label3$",
        "^custom_data1$", "^custom_data2$", "^custom_data3$" ]
    }
  }
}
Hello, nice work!
Any plans to support other enrichment sources besides Wise, such as OpenCTI or MISP directly?
Best regards,
Hi everyone
How can I add AlienVault OTX to dsiem? Or how can I see the OSSIM rules?
Hi DSIEM Team,
While testing your solution, I found that the siem_alarms index is created based on the triggered Suricata rules when carrying out a basic ICMP flood attack. But I am not able to trigger any of the Threat Intel sources such as AlienVault OTX, which should show up as a SIEM alarm along with enriched data for the malicious public IP. I have followed this guide for installation [https://github.com/defenxor/dsiem/blob/master/docs/installation.md] and am running your solution in docker containers as described there.
Do I need to make any extra configuration changes or enable any other settings to get the SIEM alarms along with an enriched Threat Intel data from sources such as Alien Vault OTX?
Hi and Thanks in Advance,
Can we match custom data fields from previous levels as below?
"custom_data1":":1"
Also, can we match other custom data fields in previous levels like below?
"custom_data1":"custom_data2:1"
Please let me know.
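To frame the question: assuming dsiem extends the same ":n" stage-reference syntax used for from/to and ports (as in the portscan directive posted later in this thread) to custom_data fields, a stage-2 rule might look like the sketch below. Whether dsiem actually resolves custom_data references this way is exactly what needs confirming.

```json
{
  "stage": 2,
  "name": "rule2",
  "type": "TaxonomyRule",
  "category": "Access",
  "occurrence": 5,
  "from": ":1",
  "to": ":1",
  "port_from": "ANY",
  "port_to": "ANY",
  "custom_data1": ":1",
  "reliability": 6,
  "timeout": 60
}
```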
I can't start the kibana and logstash docker containers on macOS.
The log shows:
- [ERRCODE: SC_ERR_AFP_CREATE(190)] - Can not open iface 'en5'
How do I solve this problem?
I installed from zip, not docker, to an existing ELK 6.8.6 system with 6 months worth of firewall logs indexed. I followed this document here: https://hakin9.org/dsiem-security-event-correlation-engine-for-elk-stack/
$ cat /var/dsiem/web/dist/assets/config/esconfig.json
{
"elasticsearch": "http://localhost:9200",
"kibana": "http://10.10.10.5:5601"
}
Looking at Kibana it appears the siem indexes were created but not populated. All my firewall logs are in logstash-* . Not sure how else to troubleshoot this. Need some help.
Thanks.
Firstly thank you for the great project!
I wanted to create a directive that consists of only one rule. DSIEM then raises the error "Skipping [...] has only one rule and will therefore never expire." I don't quite understand why it is necessary to have more than one rule, especially because single-rule directives seem to be possible in OSSIM. Is there a workaround for my problem?
Thank you very much in advance.
Hello,
I am having an issue with the demo script (run.sh) when it sets up the filebeat-es index template:
** ensuring filebeat-es index template is correctly installed .. curl: (22) The requested URL returned error: 404 Not Found
the 404 will repeat indefinitely.
Did I do something wrong? I am using the latest Ubuntu, with docker and docker compose installed. I tried both the wlp2s0 and docker0 adapters and the IPs the script gave me (only one option in each case).
Hi Mate,
I just want to know whether DSIEM can be used with Security Onion, which is an ELK-based network sensor.
If yes, then please tell me how, including the steps.
Thanks,
Vikas
Is it possible to create a directive that is the AND of three rules that is order independent? For example, I would like to trigger when rule A, rule B and rule C are satisfied, but I don't care what order they were received. I realize I could write six separate directives for all possible sequential occurrences (ABC, ACB, BAC, BCA, CAB, CBA), but this is cumbersome.
Am I missing something from the docs explaining how to pass credentials to Elasticsearch, Logstash, and Kibana?
I even tried setting DSIEM_WEB_ESURL to http://user:password@elasticsearch:9200, but the web UI still says "Disconnected from ES http://user:password@elasticsearch:9200".
Hi there,
I have followed the instructions and also modified the docker compose file, but it seems like dsiem is unable to connect to the elasticsearch database.
I don't see any errors in the logs, so I am not sure what else to do.
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
a2527ee60473 defenxor/dsiem:latest "/init" 3 minutes ago Up 3 minutes 0.0.0.0:8080->8080/tcp, :::8080->8080/tcp dsiem
b4b3418e1c43 defenxor/suricata:1710 "/bin/bash -c 'chown…" 3 minutes ago Up 3 minutes suricata
8959724c464e docker.elastic.co/elasticsearch/elasticsearch:7.11.0 "/bin/tini -- /usr/l…" 3 minutes ago Up 3 minutes 0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp elasticsearch
83e38b2213ad defenxor/docker-logstash:7.11.0 "/usr/local/bin/dock…" 3 minutes ago Up 3 minutes 5044/tcp, 9600/tcp logstash
07e8143e0a00 docker.elastic.co/kibana/kibana:7.11.0 "/bin/tini -- /usr/l…" 3 minutes ago Up 3 minutes 0.0.0.0:5601->5601/tcp, :::5601->5601/tcp kibana
I know this issue was closed but I have tried
{
"elasticsearch": "http://localhost:9200/",
"kibana": "http://localhost:5601/"
}
and I also tried
{
"elasticsearch": "http://:9200",
"kibana": "http://l/<ipAddress of my vm:5601"
}
But the result is the same. That is "Disconnected from ES http://localhost:9200/: Error: No Living connections" and
" Disconnected from ES http://:9200: Error: No Living connections"
I am running this on an Ubuntu 20.04 VM. The ELK stack is working fine and I also have Suricata installed, with the suricata-* logs and siem_events-* visible in my Kibana. I have followed the instructions in the installation guide.
Hi, I have a problem with dsiem or filebeat. I downloaded the docker file and installed it; the files open and the web UI works well. However, the problem is that after installation DSIEM works well for a little while, then it stops accepting logs from pfSense. Please help me.
Thanks!
Hi DSIEM Team:
Thank you for the DSIEM project, which I believe will be a great one. I've used OSSIM before, and I liked it. Could you consider adding dynamic field association matching, for example on fields like HTTP URL, HTTP hostname, or HTTP method?
Sometimes I want to correlate on HTTP keywords, which is common in business security. I see that you defined custom_data1, custom_data2, and custom_data3. Do I need to re-normalize my logs if I want to use custom_data, for example renaming HTTP URL to custom_data1?
Hello guys,
I am setting up Dsiem for the first time. The question is: should I deploy ELK first and then Dsiem? If yes, then after deploying ELK, how do I integrate Dsiem with it? There is very limited information in the documentation.
Hi, I have encountered some problems in operation. I want to modify the rule values in directives.json, but I don't know how to ensure that dsiem successfully reads the modified result.
Hi, DSIEM Team:
My Elasticsearch server uses SSL and certificate authentication, but esconfig.json does not show an example of loading a certificate. Does DSIEM support the use of Elasticsearch certificates?
Hi there,
thank you for your excellent work!
We are trying to transform OSSIM directives into dsiem ones using your documentation. Although there is no error, and the OSSIM-produced tsv files are parsed with the command ./ossimcnv -i some.xml -o ./some.json -r ossimref/, the outcome is a file with only the following:
{
"directives": null
}
Do you maybe know why this is happening?
I am hosting dsiem on Oracle Ampere instances with os/arch linux/arm64.
After checking the release code, I was able to build a linux/arm64 image, albeit with a little arm twisting. I'd assume this architecture is getting more common, and it would be nice to make it easy for others.
In scripts/gobuild-cmd-release.sh there's a note:
# release only the linux version of the server, there's no testing environment for Win/OSX version for this
# and we use drwmutex that only supports Linux
Is there a testing environment for linux/arm64 atm? From what I gather CircleCI supports arm images.
Would the maintainers be willing to support this architecture?
Here are the changes I had to make: a) use s6-overlay-aarch64 instead of s6-overlay-amd64, b) use the locally built arm64 dsiem binary.

Hi DSIEM people,
Not really an issue per-se, but I'm struggling to understand how you actually implement Intel Feeds for DSIEM.
From what I can gather, you are using Wise for Moloch to collect intel from various sources. But what I'm having trouble understanding is how you grab the normalized event, and then check the data in that event against a piece of intel.
I have read https://github.com/defenxor/dsiem/blob/master/docs/directive_and_alarm.md and https://github.com/defenxor/dsiem/blob/master/docs/ti_vuln_plugins.md but no clearer really.
Would you have any pointers to assist?
Thanks
TLDR: How can I reload directives while dsiem is running?
I am using dsiem for a user-facing product. The setup includes a bunch of containers managed with docker compose. Directives can be updated from the UI and then persisted in the filesystem (a shared volume).
There are a few options I have thought of to apply the changes. One is to mount /var/run/docker.sock into a container, then use the Docker Go SDK to find and restart the dsiem container. This is the solution I am using at the moment. It is still flaky, and despite adding locks to the API calls, it does not seem like a good idea to restart the container every so often from a user-facing operation. What if the restart fails? Of course, part of the mitigation is that I validate the directives.

Hi all, I have a problem when forwarding normalized events.
I deployed Suricata and Wazuh with JSON output, then used Filebeat to send the logs to Logstash. I am using Logstash v7.9.
In the file 80_siem.conf, there are two output types for normalized events:
When more than five requests are received from the same SRC-IP to different ports of the same DST-IP in one minute, an alarm should be generated.
Json-Formatter
{
  "directives": [
    {
      "id": 500002,
      "name": "excludePort_TargetIP",
      "priority": 3,
      "disabled": false,
      "kingdom": "Delivery Attack",
      "category": "Portscan",
      "rules": [
        {
          "stage": 1,
          "name": "rule1",
          "type": "TaxonomyRule",
          "product": [ "Intrusion Detection", "Intrusion Prevention", "Firewall" ],
          "category": "Access",
          "occurrence": 1,
          "from": "ANY",
          "to": "ANY",
          "port_from": "ANY",
          "port_to": "ANY",
          "reliability": 1,
          "timeout": 0,
          "protocol": "ANY"
        },
        {
          "stage": 2,
          "name": "rule2",
          "type": "TaxonomyRule",
          "product": [ "Firewall", "Intrusion Detection", "Intrusion Prevention" ],
          "category": "Access",
          "occurrence": 5,
          "from": ":1",
          "to": ":1",
          "port_from": "ANY",
          "port_to": ":!1",
          "reliability": 6,
          "timeout": 60,
          "protocol": "ANY"
        }
      ]
    }
  ]
}
Related line: "port_to": ":!1"
{"level":"WARN","ts":"2020-09-25T13:07:34.911+0300","msg":"Skipping directive ID 500002 'excludePort_TargetIP' due to error: strconv.Atoi: parsing \":!1\": invalid syntax"}
Hello,
Can we create a rule that negates values other than Plugin_Sid (for example !SRC_IP or !Custom_Data1)?
Our goal is to generate an alarm if a user connecting over VPN does not connect to a server within 10 minutes. So is it possible to generate an alarm for a non-existent event?
Where should we add taxonomy information in order to write a taxonomy rule?
Hi guys!
I tried to install DSIEM and connect it to an existing ELK installation. After all the steps in the manual I get an error like this. Interestingly, I get this error with localhost:9200, but my friend gets it with elastic.local, which we use as the domain name for the elastic server.
I also have ICMP connectivity between the nodes, and both nodes are in one network without any ACL; elastic is accessible by IP from the remote kibana and logstash.
Hi guys, I've been exploring this SIEM for a while and I found that there is only the Dsiem web UI to show the alarm list.
Is there any other page in Dsiem to manage all the configurations via a GUI? I found the dsiem-tools on the download page; is there a web GUI for them, so we can easily manage directive tuning for events or tickets?
Regards,
Aywa
cc : @mmta @rkspx @Cempakers87
Hi, sorry, I'm a newbie here. Just want to ask: I just finished installing Dsiem in docker, but when I want to use the SIEM for the alert and incident management function, it says only a platinum license can use the feature. So do Dsiem and the ELK built-in SIEM feature both need a platinum license to run, or did I somehow set up Dsiem wrongly?
Is there a problem with using a word I set in the CustomData field? For example, when I write a rule that user X, who logs in over VPN, then connects to the Y IP, it does not work.
"label" : "Username"
"content" : "admin"
I can't enter the definition. How can we solve this?
Hi, DSIEM Team:
We will deploy DSIEM in production. Does the DSIEM web UI support login validation now, or will it come later?
Hi all,
I am trying to run the demo of the application on my Ubuntu VM. I cloned it from the website, and below is the error message I am getting. I have not made any changes to the application. Please can I get help on how to solve this.
"
⠿ Container wise Started 35.1s
⠿ Container dsiem-nesd Started 40.6s
** finding target IP address .. done
** preparing nesd CSV file .. done
** verifying 10.0.2.15 in Wise .. done
** verifying 192.168.99.14:80 in Nesd .. done
** ensuring elasticsearch is ready .. done
** preparing es indices .. done
** setting up suricata interface .. done
** making sure target is ready .. done
** ensuring logstash is ready .. done
** ensuring filebeat-es index template is correctly installed .. curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
"
ISSUE 2
I am running this in standalone mode and am following the instructions in the installation.md file:
Run ELK, Suricata, and Dsiem in standalone mode
localhost:9200 gives me this:
"
{
"name" : "6cee26827eb3",
"cluster_name" : "docker-cluster",
"cluster_uuid" : "vJGs6x6kSm-OoYxEBq59xA",
"version" : {
"number" : "7.11.0",
"build_flavor" : "default",
"build_type" : "docker",
"build_hash" : "8ced7813d6f16d2ef30792e2fcde3e755795ee04",
"build_date" : "2021-02-08T22:44:01.320463Z",
"build_snapshot" : false,
"lucene_version" : "8.7.0",
"minimum_wire_compatibility_version" : "6.8.0",
"minimum_index_compatibility_version" : "6.0.0-beta1"
},
"tagline" : "You Know, for Search"
}
"
DSIEM UI
http://localhost:8080/ui/#/data/alarm-list
shows
"no record found"
Please, how do I solve this?
Thanks
Hi DSIEM Team,
I'm trying to integrate Wazuh plugins into DSIEM directives. I generated the directive file successfully, and the parser too (70_siem-plugin-wazuh.conf); I placed it in the logstash pipeline as described and followed the steps.
I sent the wazuh_alerts (json) through logstash and I can see that events were cloned successfully (siem_events-),
but no alarms were indexed in the (siem_alarms / siem_alarm_events-) indexes.
Has anyone had the same issue when integrating plugins in Dsiem?
Btw, I tested suricata within the same pipeline and it works successfully.
Thank you.
Am I supposed to install a kibana plugin? It also says "it could not locate the index-pattern-field (id: @timestamp)".
I am wondering why, because siem_events-* is working fine in kibana.
I look forward to hearing from you.
Regards
Peter
Hi
Thanks for this project ...
Can you please explain how to configure DSIEM to accept traffic from firewall devices over syslog 514 (Cisco ASA / Fortinet / Palo Alto / Checkpoint)?
Thanks
Hi dsiem team,
How can I send alerts to Slack?
The !IP-address and :1 functions are operational. But when writing a rule, requiring an IP different from the previous stage, in the form !:1, is not supported. Can you help with this?
When writing a rule, how can we cross-reference the SRC_IP and DST_IP of a parent rule in a new rule (with SRC_IP and DST_IP swapped)?
I installed dsiem with docker on my ubuntu server. When I open dsiem in my local browser, it is disconnected from ES. Why does this happen, and what else should I configure?
Cannot use network CIDR ranges in the directive rule.
"from": "192.168.2.0/24, 10.10.0.0/24",
"to": "!192.168.2.0/24, !10.10.0.0/24",
Error getting:
{"level":"WARN","ts":"2021-11-23T10:13:57.116Z","msg":"Skipping directive ID 3020 'DNS Test' due to error: !192.168.2.0/24 is not a valid IPv4 address or CIDR"}
When we write and run a correlation rule on a custom data field, the alarm fires for the first username that matches the rule, and a continuous alarm also fires for similar activities of that user. But the alarm does not fire for other users doing similar activity. Could it be a structural problem?
We are trying to send ossec logs from logstash to Dsiem without using Elasticsearch. Below is the logstash configuration; what output plugin is required to get the desired result?
P.S. When I use dtester with the command ./dtester dsiem -f directives_dsiem.json -v -n 10000000, Dsiem shows events; otherwise it shows "0 eps". But the desired result is to get events without using dtester.
I'd value early responses.
input {
syslog {
port => 514
add_field => { "application" => "syslog" }
}
}
filter {
if [application] == "syslog" {
clone {
clones => [ "siem_ossec_events" ]
}
if [type] == "siem_ossec_events" {
mutate {
id => "tag normalizedEvent 50001"
remove_field => [ "application" , "type" ]
add_field => {
"[@metadata][siem_plugin_type]" => "ossec"
"[@metadata][siem_data_type]" => "normalizedEvent"
}
}
}
}
if [@metadata][siem_plugin_type] == "ossec" {
date {
id => "timestamp 50001"
match => [ "[timestamp]", "ISO8601" ]
target => "timestamp"
}
mutate {
id => "siem_event fields 50001"
replace => {
"title" => "%{[description]}"
"src_index_pattern" => "ossec-*"
"sensor" => "%{[host]}"
"product" => "Host Intrusion Detection System"
"src_ip" => "%{[src_ip]}"
"dst_ip" => "%{[dst_ip]}"
"protocol" => "TCP/IP"
"category" => "%{[classification]}"
"plugin_id" => "50001"
"plugin_sid" => "%{[id]}"
"custom_label1" => "message"
"custom_data1" => "%{[message]}"
}
}
mutate {
id => "integer fields 50001"
convert => {
"plugin_id" => "integer"
"plugin_sid" => "integer"
}
}
if [src_ip] == "%{[src_ip]}" {
mutate {
replace => {
"src_ip" => "0.0.0.0"
}
# remove_field => [ "src_ip" ]
}
}
if [custom_data1] == "%{[message]}" { mutate { remove_field => [ "custom_label1", "custom_data1" ]}}
# delete fields except those included in the whitelist below
prune {
whitelist_names => [ "@timestamp$" , "^timestamp$", "@metadata", "^src_index_pattern$", "^title$", "^sensor$", "^product$",
"^src_ip$", "^dst_ip$", "^plugin_id$", "^plugin_sid$", "^category$", "^subcategory$",
"^src_port$", "^dst_port$", "^protocol$", "^custom_label1$", "^custom_label2$", "^custom_label3$",
"^custom_data1$", "^custom_data2$", "^custom_data3$" ]
}
}
if [application] == "dtester" {
mutate {
remove_field => [ "application", "beat", "host.name", "source" ]
add_field => {
"[@metadata][siem_plugin_type]" => "dtester"
"[@metadata][siem_data_type]" => "normalizedEvent"
}
}
}
}
output {
http {
format => "json"
http_method => "post"
url => "http://0.0.0.0:8080"
}
stdout { codec => json }
}
I am hoping to use DSIEM (specifically the correlation engine) without Elastic stack.
So far, I have gotten some pointers from the FAQ. Currently, I can receive and normalize events using fluentd, and then output them using fluentd's http output.
First use something else other than Logstash for normalizing your logs in accordance to Dsiem normalized event specification. For instance, you can use Fluentd for this purpose.
However, I have not yet figured out how to set up DSIEM to receive events using HTTP. For context, I am using Docker with the defenxor/dsiem:latest
image.
Next, send those normalized events to Dsiem through HTTP. Again, should be possible with something like Fluentd HTTP output.
Finally, substitute Filebeat with something else to read Dsiem output (siem_alarms.json), and send it to the final storage or notification destination. In Fluentd this may involve the tail input and json parser plugins sending results to one of Fluentd data output plugin.
Admittedly, I have not spent a lot of time with DSIEM and may have glaring gaps in my understanding of how DSIEM works.
How should I set up/configure DSIEM for this scenario?
ETA:
I have gathered from the logstash config that events are sent as JSON via /events, and I can now successfully receive events. I will reopen this issue if I bump into another blocker.