
dsiem's Issues

[Feature Request] Accept json array in POST /events route

Problem to be solved

/events only accepts a single event. Code in event.go and handler.go

I am setting up an EFK stack, and then using dsiem for correlation.

Fluentd's out_http plugin, IIUC, only emits either ndjson or JSON with an array of events. Docs.

Alternatives I have tried

POSTing application/x-ndjson, which fails.
POSTing application/json with an array of normalized events, which fails.
Different fluentd configs for out_http

Suggested solution

Parse an event or an array of events.

I don't know if that's trivial (or even necessary), or it adds extra complexity.

ETA:

  1. Fixed typo
  2. In the meantime I have set up a tiny proxy to split the array
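For reference, the splitting such a proxy has to do is small. A sketch (the function name and behavior are mine, not part of dsiem or fluentd):

```python
import json

def split_events(body: bytes):
    """Split an out_http POST body into individual event dicts.

    Accepts a single JSON object, a JSON array of objects, or ndjson
    (one JSON document per line); each resulting dict can then be
    POSTed to dsiem's /events route on its own.
    """
    text = body.decode("utf-8").strip()
    try:
        parsed = json.loads(text)
    except json.JSONDecodeError:
        # Not a single JSON document -- treat it as ndjson.
        return [json.loads(line) for line in text.splitlines() if line.strip()]
    return parsed if isinstance(parsed, list) else [parsed]
```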

How to trigger test alarm

I'm running the docker deployment in AWS to test Dsiem.
All containers are up and running.
Unfortunately, I'm unable to trigger an alert by pinging a host in the same subnet.
Is there any other way to trigger an alert?
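One alternative to pinging is to POST a normalized event to dsiem's /events route directly, with fields that match a rule in one of your directives. A sketch (the plugin_id/plugin_sid values and the URL below are placeholders, not values from any shipped directive):

```python
import json
import urllib.request
from datetime import datetime, timezone

def make_test_event():
    # Field names follow dsiem's normalized event spec; plugin_id and
    # plugin_sid are placeholders and must match one of your directive rules.
    now = datetime.now(timezone.utc)
    return {
        "timestamp": now.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "title": "Test event",
        "sensor": "test-sensor",
        "product": "Intrusion Detection System",
        "src_ip": "10.0.0.1",
        "dst_ip": "10.0.0.2",
        "plugin_id": 1001,
        "plugin_sid": 1,
    }

def send(event, url="http://localhost:8080/events"):
    req = urllib.request.Request(
        url,
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # a 2xx status means the event was accepted
```

Repeating the POST enough times to satisfy a rule's occurrence count should then raise an alarm.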

[dsiem-demo] How are ossec logs parsed?

It is said in the description that

Dsiem: preconfigured with a directive that correlates Suricata and Ossec events for the above scenario.

I've found the Suricata JSON plugin and obtained eve.json from the suricata docker container. The fields match and everything is clear with that. But there's no plugin for ossec. Is the ossec parser hardcoded? How does it work?

Error while getting firewall logs to DSIEM and Kibana Dashboard

I have created a logstash configuration file using dpluger to filter the firewall logs; I have attached the file for reference.
After setting the input, SIEM config file, and output, the firewall logs are still not reported to either Dsiem or the Kibana dashboard. Can anyone please help me achieve firewall log visualization in the Kibana dashboard?
Do I need to use the 80_siem.conf file?
Can anyone please share an updated logstash configuration file if possible?
input_file.conf:

# Input from Sophos firewall
# Open port 514 of the logstash
input {
  syslog {
    port => 514
    codec => "line"
    tags => ["firewall"]
    grok_pattern => "%{GREEDYDATA:message}"
  }
}

dsiem_firewall.conf:
###############################################################################
# Dsiem firewall Plugin
# Type: Taxonomy
# Auto-generated by dpluger on 2023-05-09T13:05:10+05:30
###############################################################################

filter {

  # 1st step: identify the source log and clone it to another event with type => siem_events
  if [fields][log_type] == "firewall" and [alert] {
    clone {
      clones => [ "siem_events" ]
    }

    # 2nd step: remove the source log identifier from the clone, so that the clone will not
    # go through the same pipeline as the source log. Also remove the temporary type field,
    # replacing it with a metadata field that will be read by the rest of the siem pipeline.
    if [type] == "siem_events" {
      mutate {
        id => "tag normalizedEvent "
        remove_field => [ "[fields][log_type]" , "type" ]
        add_field => {
          "[@metadata][siem_plugin_type]" => "firewall"
          "[@metadata][siem_data_type]" => "normalizedEvent"
        }
      }
    }
  }
}

# 3rd step: the actual event normalization so that it matches the format that dsiem expects.
#
# Required fields:
#   timestamp (date), title (string), sensor (string), product (string), dst_ip (string), src_ip (string)
#
# For PluginRule type plugins, the following are also required:
#   plugin_id (integer), plugin_sid (integer)
#
# For TaxonomyRule type plugins, the following is also required:
#   category (string)
#
# Optional fields:
#
# These fields are optional but should be included whenever possible since they can be used in directive rules:
#   dst_port (integer), src_port (integer), protocol (string), subcategory (string)
#
# These fields are also optional and can be used in directive rules. They should be used for custom data
# that is not defined in the standard SIEM fields:
#   custom_label1 (string), custom_data1 (string), custom_label2 (string), custom_data2 (string),
#   custom_label3 (string), custom_data3 (string)
#
# This field is optional, and should be included if the original logs are also stored in elasticsearch.
# It allows direct pivoting from the alarm view in the web UI to the source index:
#   src_index_pattern (string)
#
# Other fields from the source log will be removed by the logstash prune plugin below.

filter {
  if [@metadata][siem_plugin_type] == "firewall" {
    date {
      id => "timestamp "
      match => [ "[timestamp]", "ISO8601" ]
      target => [timestamp]
    }
    mutate {
      id => "siem_event fields "
      replace => {
        "title" => "%{[title]}"
        "src_index_pattern" => "siem_events-*"
        "sensor" => "%{[host]}"
        "product" => "Firewall-logs"
        "src_ip" => "%{[src_ip]}"
        "dst_ip" => "%{[dst_ip]}"
        "protocol" => ""
        "category" => "%{[category]}"
        "subcategory" => "%{[subcategory]}"
        "src_port" => "%{[src_port]}"
        "dst_port" => "%{[dst_port]}"
      }
    }

    mutate {
      id => "integer fields "
      convert => {
        "src_port" => "integer"
        "dst_port" => "integer"
      }
    }

    # delete fields except those included in the whitelist below
    prune {
      whitelist_names => [ "@timestamp$" , "^timestamp$", "@metadata", "^src_index_pattern$", "^title$", "^sensor$", "^product$",
        "^src_ip$", "^dst_ip$", "^plugin_id$", "^plugin_sid$", "^category$", "^subcategory$",
        "^src_port$", "^dst_port$", "^protocol$", "^custom_label1$", "^custom_label2$", "^custom_label3$",
        "^custom_data1$", "^custom_data2$", "^custom_data3$" ]
    }
  }
}
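A quick way to sanity-check events against the field spec in the generated comments, before sending them on. A sketch (the helper name is mine, not dpluger's):

```python
# Required fields per the dpluger-generated comments, plus the extra
# field(s) each rule type needs.
REQUIRED = ["timestamp", "title", "sensor", "product", "dst_ip", "src_ip"]
EXTRA = {"TaxonomyRule": ["category"], "PluginRule": ["plugin_id", "plugin_sid"]}

def missing_fields(event, rule_type="TaxonomyRule"):
    """Return the required fields that are absent or empty in `event`."""
    fields = REQUIRED + EXTRA[rule_type]
    return [f for f in fields if event.get(f) in ("", None)]
```

An event with an empty "protocol" (as in the config above) still passes, since protocol is optional; an empty "category" on a Taxonomy plugin would not.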

Threat Intel enrichment

Hello, nice work!

Any plans to add another way to do enrichment instead of Wise? OpenCTI or MISP directly?

Best regards,

Alienvault OTX

Hi everyone,
How can I add AlienVault OTX to dsiem? And how can I see the OSSIM rules?

Trigger Alarm to see Threat Intel Enriched Data

Hi DSIEM Team,

While trying to test and implement your solution, I found that the siem_alarms index was created based on the Suricata rules triggered while carrying out a basic ICMP flood attack. But I am not able to trigger any alarm from Threat Intel sources such as AlienVault OTX, which should show up as a SIEM alarm along with the enriched data for the malicious public IP. I have followed this guide for installation [https://github.com/defenxor/dsiem/blob/master/docs/installation.md] and am running your solution in docker containers as described there.

Do I need to make any extra configuration changes or enable any other settings to get SIEM alarms with enriched Threat Intel data from sources such as AlienVault OTX?

Can it be deployed on macOS?

I can't start the kibana and logstash docker containers on macOS.
The log shows:
- [ERRCODE: SC_ERR_AFP_CREATE(190)] - Can not open iface 'en5'
How do I solve this problem?

No Living Connections - Non Docker


I installed from zip, not docker, to an existing ELK 6.8.6 system with 6 months worth of firewall logs indexed. I followed this document here: https://hakin9.org/dsiem-security-event-correlation-engine-for-elk-stack/

$ cat /var/dsiem/web/dist/assets/config/esconfig.json
{
    "elasticsearch": "http://localhost:9200",
    "kibana": "http://10.10.10.5:5601"
}

Looking at Kibana, it appears the siem indexes were created but not populated. All my firewall logs are in logstash-*. Not sure how else to troubleshoot this. Need some help.

Thanks.

Building directives with only one rule

Firstly thank you for the great project!

I wanted to create a directive which consists of only one rule. DSIEM then raises the error "Skipping [...] has only one rule and will therefore never expire." I don't quite understand why it is necessary to have more than one rule, especially because single-rule directives seem to be possible in OSSIM. Is there a workaround for my problem?

Thank you very much in advance.

filebeat-es index template not correctly installed in demo

Hello,

I am having an issue starting the demo script (run.sh) to setup the filebeat-es index template:

** ensuring filebeat-es index template is correctly installed .. curl: (22) The requested URL returned error: 404 Not Found

The 404 repeats indefinitely.

Did I do something wrong? I am using the latest Ubuntu, with docker and docker compose installed. I tried using both the wlp2s0 and docker0 adapters and the IPs the script gave me (only one option in each case).

Can we use DSIEM with Security Onion?

Hi Mate,

I just want to know whether DSIEM can be used with Security Onion, which is an ELK-based network sensor.
If yes, please tell me how, with the steps.

Thanks,
Vikas

Order Independent "AND" for Directives?

Is it possible to create a directive that is the AND of three rules that is order independent? For example, I would like to trigger when rule A, rule B and rule C are satisfied, but I don't care what order they were received. I realize I could write six separate directives for all possible sequential occurrences (ABC, ACB, BAC, BCA, CAB, CBA), but this is cumbersome.
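Until order-independent matching exists, the six permutations can at least be generated rather than hand-written. A sketch (the rule fragments carry only the fields relevant to ordering, not the complete directive schema):

```python
import itertools

# Illustrative rule fragments for conditions A, B, and C.
RULES = {
    "A": {"name": "rule A", "plugin_id": 1001, "plugin_sid": 1},
    "B": {"name": "rule B", "plugin_id": 1001, "plugin_sid": 2},
    "C": {"name": "rule C", "plugin_id": 1001, "plugin_sid": 3},
}

def permutation_directives(base_id=100000, timeout=3600):
    """Emit one directive per ordering of A, B, C (ABC, ACB, ...)."""
    directives = []
    for i, order in enumerate(itertools.permutations("ABC")):
        directives.append({
            "id": base_id + i,
            "name": "ABC-any-order-%s" % "".join(order),
            "rules": [
                # stage 1 gets timeout 0, later stages get the window
                dict(RULES[key], stage=stage + 1, occurrence=1,
                     timeout=0 if stage == 0 else timeout)
                for stage, key in enumerate(order)
            ],
        })
    return directives
```

The resulting list can then be placed under "directives" in the usual JSON file, after filling in the remaining fields (priority, kingdom, category, from/to, and so on).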

Disconnected from ES http://elasticsearch:9200: Error: No Living connections

Hi there,
I have followed the instructions and also modified the docker compose file, but it seems like dsiem is unable to connect to the elasticsearch database.
I don't see any errors in the logs, so I am not sure what else to do.


CONTAINER ID   IMAGE                                                  COMMAND                  CREATED         STATUS         PORTS                                                 NAMES
a2527ee60473   defenxor/dsiem:latest                                  "/init"                  3 minutes ago   Up 3 minutes   0.0.0.0:8080->8080/tcp, :::8080->8080/tcp             dsiem
b4b3418e1c43   defenxor/suricata:1710                                 "/bin/bash -c 'chown…"   3 minutes ago   Up 3 minutes                                                         suricata
8959724c464e   docker.elastic.co/elasticsearch/elasticsearch:7.11.0   "/bin/tini -- /usr/l…"   3 minutes ago   Up 3 minutes   0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp   elasticsearch
83e38b2213ad   defenxor/docker-logstash:7.11.0                        "/usr/local/bin/dock…"   3 minutes ago   Up 3 minutes   5044/tcp, 9600/tcp                                    logstash
07e8143e0a00   docker.elastic.co/kibana/kibana:7.11.0                 "/bin/tini -- /usr/l…"   3 minutes ago   Up 3 minutes   0.0.0.0:5601->5601/tcp, :::5601->5601/tcp             kibana

Disconnected from ES http://:9200: Error: No Living connection

I know this issue was closed, but I have tried

{
    "elasticsearch": "http://localhost:9200/",
    "kibana": "http://localhost:5601/"
}

and I also tried

{
    "elasticsearch": "http://:9200",
    "kibana": "http://l/<ipAddress of my vm:5601"
}

But the result is the same: "Disconnected from ES http://localhost:9200/: Error: No Living connections" and
"Disconnected from ES http://:9200: Error: No Living connections".
I am running this on an Ubuntu 20.04 VM. The ELK stack is working fine, and I also have Suricata installed, with the suricata-* logs and siem_events-* visible in my kibana. I have followed the instructions in the installation guide.

why doesn't dsiem accept logs?

Hi, I have a problem with dsiem or filebeat. I downloaded the docker file and installed it; the files are opened and the web UI works well. However, the problem is that after installation DSIEM works well for a little while, then it stops accepting logs from pfSense. Please help me,
thanks!

About the use of the custom_data field

Hi DSIEM Team:
Thank you for your DSIEM project, which I believe will be a great project. I've used OSSIM before, and it was kind of nice to me. Could you consider adding dynamic field association matching, for example on fields such as HTTP URL, HTTP hostname, or HTTP method?
Sometimes I want to correlate on HTTP keywords, which is common in business security. I see that you defined custom_data1, custom_data2, custom_data3. Do I need to re-normalize my log if I want to use custom_data, for example renaming HTTP URL to custom_data1?

Setting up Dsiem

Hello guys,

I am setting up Dsiem for the first time. The question is: should I deploy ELK first and then Dsiem? If yes, then after deploying ELK, how do I integrate Dsiem with it? There is very limited information in the documentation.

How to set directives

Hi, I have encountered some problems in operation. I want to modify the rule values in directives.json, but I don't know how to ensure that dsiem successfully reads the modified result.

DSIEM support the use of ElasticSearch certificates?

Hi, DSIEM Team:
My ElasticSearch server uses SSL and certificate authentication, but esconfig.json does not show an example of loading a certificate. Does DSIEM support the use of ElasticSearch certificates?

ossimcnv does not work

Hi there,

thank you for your excellent work!

We are trying to transform OSSIM directives into dsiem ones using your documentation. Although there is no error and the OSSIM-produced TSV files are parsed with the command ./ossimcnv -i some.xml -o ./some.json -r ossimref/, the outcome is a file with only the following:

{
"directives": null
}

Do you maybe know why this is happening?

linux/arm64 server build

I am hosting dsiem on Oracle Ampere instances with os/arch linux/arm64.

After checking the release code, I was able to build a linux/arm64 image, albeit with a little arm twisting. I'd assume this architecture is becoming more common, and it would be nice to make it easy for others.

In scripts/gobuild-cmd-release.sh there's a note:

    # release only the linux version of the server, there's no testing environment for Win/OSX version for this
    # and we use drwmutex that only supports Linux

Is there a testing environment for linux/arm64 atm? From what I gather CircleCI supports arm images.

Would the maintainers be willing to support this architecture?


Here are the changes I had to make 1:

  1. Build an arm64 dsiem binary 2
  2. Update deployments/docker/build/Dockerfile: a) use s6-overlay-aarch64 instead of s6-overlay-amd64, b) use the locally built arm64 dsiem binary
  3. Minor changes to the build scripts

Footnotes

  1. I'd be happy to push a PR if the need arises

  2. Go's cross-compilation is quite nice

Lacking documentation on Using Intel Feeds

Hi DSIEM people,

Not really an issue per-se, but I'm struggling to understand how you actually implement Intel Feeds for DSIEM.

From what I can gather, you are using Wise for Moloch to collect intel from various sources. But what I'm having trouble understanding is how you grab the normalized event, and then check the data in that event against a piece of intel.

I have read https://github.com/defenxor/dsiem/blob/master/docs/directive_and_alarm.md and https://github.com/defenxor/dsiem/blob/master/docs/ti_vuln_plugins.md but no clearer really.

Would you have any pointers to assist?

Thanks

[Question] Reload directives

TLDR: How can I reload directives while dsiem is running?

I am using dsiem for a user-facing product. The setup includes a bunch of containers managed with docker compose. Directives can be updated from the UI and then persisted in the filesystem (a shared volume).

There are a few options I have thought of to apply the changes:

  1. Reload directives from within dsiem as the container is running (say by hitting a certain endpoint). Does not currently seem to be possible.
  2. Use a named pipe to restart the container with regular docker/docker compose command - SO answer. This is probably wielding too much power over the host and a little hacky to get results back.
  3. Mount /var/run/docker.sock into a container, then use the Docker Go SDK to find and restart the dsiem container. This is the solution I am using at the moment. It is still flaky, and despite adding locks to the API calls, it does not seem like a good idea to restart the container every so often from a user-facing operation. What if the restart fails? Of course, part of the mitigation is that I validate the directives.
  4. Use some sort of orchestration. I am not familiar with container orchestration, and I have not investigated this.

Cannot Forward normalize event to Dsiem

Hi all, I have a problem when forwarding normalized events.
I deployed Suricata and Wazuh with JSON output, then used filebeat to send logs to Logstash. I am using Logstash v7.9.
In the file 80_siem.conf, there are two output types for normalized events:

  1. Output to ES -> it has the log index siem*
  2. Output to Dsiem -> I ran Logstash in debug mode and it has this error:
    [HTTP Output Failure] Encountered non-2xx HTTP code 418 {:response_code=>418, :url=>"http://x.x.x.x:8080/events"
    Can anyone help me?
    Thank you, best regards!

Exclude some port numbers

When more than five requests are received from the same SRC-IP to different ports of the same DST-IP in one minute, an alarm should be generated.

Related Rule:

Json-Formatter

{
  "directives": [
    {
      "id": 500002,
      "name": "excludePort_TargetIP",
      "priority": 3,
      "disabled": false,
      "kingdom": "Delivery Attack",
      "category": "Portscan",
      "rules": [
        {
          "stage": 1,
          "name": "rule1",
          "type": "TaxonomyRule",
          "product": [ "Intrusion Detection", "Intrusion Prevention", "Firewall" ],
          "category": "Access",
          "occurrence": 1,
          "from": "ANY",
          "to": "ANY",
          "port_from": "ANY",
          "port_to": "ANY",
          "reliability": 1,
          "timeout": 0,
          "protocol": "ANY"
        },
        {
          "stage": 2,
          "name": "rule2",
          "type": "TaxonomyRule",
          "product": [ "Firewall", "Intrusion Detection", "Intrusion Prevention" ],
          "category": "Access",
          "occurrence": 5,
          "from": ":1",
          "to": ":1",
          "port_from": "ANY",
          "port_to": ":!1",
          "reliability": 6,
          "timeout": 60,
          "protocol": "ANY"
        }
      ]
    }
  ]
}

Error Message:

{Related Line: "port_to": ":!1"}
{"level":"WARN","ts":"2020-09-25T13:07:34.911+0300","msg":"Skipping directive ID 500002 'excludePort_TargetIP' due to error: strconv.Atoi: parsing \":!1\": invalid syntax"}

Request:

  • As can be seen above, we encounter an error when we try to exclude some port numbers. Can such a feature be added, or if it already exists, how do we use it?

NOT rules

Hello,

Can we create a rule that negates values other than Plugin_Sid (for example !SRC_IP or !Custom_Data1)?

Our goal is to generate an alarm if a user who connects via VPN does not connect to a server within 10 minutes. So is it possible to generate an alarm in case of a non-existent event?
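Directive rules fire on events that did occur, so detecting the absence of a follow-up event is a different mechanism. Outside dsiem, the logic looks roughly like this (the event shape and field names here are illustrative, not dsiem's):

```python
from datetime import datetime, timedelta

def users_missing_followup(events, start, expected, window=timedelta(minutes=10)):
    """Return users who produced a `start` event but no `expected`
    event within `window` of it. Each event is a dict with keys
    'ts' (datetime), 'title', and 'user'."""
    pending = {}  # user -> timestamp of their start event
    for ev in sorted(events, key=lambda e: e["ts"]):
        if ev["title"] == start:
            pending.setdefault(ev["user"], ev["ts"])
        elif ev["title"] == expected and ev["user"] in pending:
            if ev["ts"] - pending[ev["user"]] <= window:
                del pending[ev["user"]]
    latest = max(e["ts"] for e in events)
    # whoever is still pending after the window expires triggers an alarm
    return [u for u, t in pending.items() if latest - t > window]
```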

Taxonomy

Where should we add taxonomy information to write taxonomy rule?

Disconnected from ES http://localhost:9200: Error: No Living connections

Hi guys!
I tried to install DSIEM and connect it to an existing ELK installation. After all the steps in the manual I get an error like this. The interesting thing is that I have this error with localhost:9200, while my friend has it with elastic.local, which we use as the domain name for the elastic server.
I also have an ICMP connection between the nodes, and both nodes are in one network without any ACL; elastic is accessible by IP from the remote kibana and logstash.

Dsiem-Tools

Hi guys, I've been exploring this SIEM for a while, and I have only found the Dsiem UI web interface showing the alarm list.
Is there any other page in Dsiem to manage all the configuration via a GUI? I found the dsiem-tools on the download page; is there a GUI web interface for them, so we can easily manage directive tuning for events or tickets?

Regards,
Aywa

cc : @mmta @rkspx @Cempakers87

dsiem vs built-in elk siem

Hi, sorry, I'm a newbie here. I just finished installing Dsiem in docker, but when I want to use the SIEM alert and incident management function, it says only a platinum license can use that feature. So do Dsiem and the ELK built-in SIEM feature both need a platinum license to run, or did I somehow set up Dsiem wrongly?

CustomData special

Is there a problem using a word I set in the CustomData field?
For example, when I write a rule that user X, who connects via VPN, then logs in to IP Y, it does not work.
"label" : "Username"
"content" : "admin"
I can't enter the definition. How can we solve this?

404 error when running demo

Hi all,
I am trying to run the demo of the application on my Ubuntu VM. I cloned it from the website, and below is the error message I am getting. I have not made any changes to the application. Can I please get some help on how to solve this?

"
⠿ Container wise Started 35.1s
⠿ Container dsiem-nesd Started 40.6s
** finding target IP address .. done
** preparing nesd CSV file .. done
** verifying 10.0.2.15 in Wise .. done
** verifying 192.168.99.14:80 in Nesd .. done
** ensuring elasticsearch is ready .. done
** preparing es indices .. done
** setting up suricata interface .. done
** making sure target is ready .. done
** ensuring logstash is ready .. done
** ensuring filebeat-es index template is correctly installed .. curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
curl: (22) The requested URL returned error: 404
"
ISSUE 2
I am running this in standalone mode and am following the instructions in the installation.md file.

Run ELK, Suricata, and Dsiem in standalone mode

localhost:9200 gives me this
"
{
"name" : "6cee26827eb3",
"cluster_name" : "docker-cluster",
"cluster_uuid" : "vJGs6x6kSm-OoYxEBq59xA",
"version" : {
"number" : "7.11.0",
"build_flavor" : "default",
"build_type" : "docker",
"build_hash" : "8ced7813d6f16d2ef30792e2fcde3e755795ee04",
"build_date" : "2021-02-08T22:44:01.320463Z",
"build_snapshot" : false,
"lucene_version" : "8.7.0",
"minimum_wire_compatibility_version" : "6.8.0",
"minimum_index_compatibility_version" : "6.0.0-beta1"
},
"tagline" : "You Know, for Search"
}
"

DSIEM UI
http://localhost:8080/ui/#/data/alarm-list
shows
"no record found"

How do I solve this?
Thanks

Wazuh

Hi DSIEM Team,

I'm trying to integrate a Wazuh plugin into DSIEM directives. I generated the directive file successfully, and the parser too (70_siem-plugin-wazuh.conf); I placed it in the logstash pipeline as described and followed the steps.
I sent the wazuh_alerts (JSON) through logstash, and I can see that events were cloned successfully (siem_events-*),
but no alarms were indexed in the siem_alarms / siem_alarm_events-* indexes.
Has anyone had the same issue when integrating plugins in Dsiem?

Btw, I tested suricata within the same pipeline and it works successfully.

Thank you.

Unable to see siem_alarms in Kibana and Dsiem

Hello DSIEM Team,
While trying to implement your solution, I can only see the siem_events index in kibana. I cannot see siem_alarms in kibana, and I cannot see either siem_events or siem_alarms in the Dsiem UI. The error message I am getting in kibana is:

KbnError@http://localhost:5601/37897/bundles/plugin/kibanaUtils/kibanaUtils.plugin.js:1:47891
SavedObjectNotFound@http://localhost:5601/37897/bundles/plugin/kibanaUtils/kibanaUtils.plugin.js:1:48326
FieldParamType/this.deserialize@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:661365
setParams/<@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:415077
setParams@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:414501
AggConfig@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:414363
agg_configs_AggConfigs/<@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:197161
agg_configs_AggConfigs/<@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:197513
agg_configs_AggConfigs@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:1:197492
createAggConfigs@http://localhost:5601/37897/bundles/plugin/data/data.plugin.js:8:195209
_callee6$@http://localhost:5601/37897/bundles/plugin/discover/discover.chunk.6.js:7:71124
l@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:321:968491
s/o._invoke</<@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:321:968245
_/</e[t]@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:321:968848
discover_asyncGeneratorStep@http://localhost:5601/37897/bundles/plugin/discover/discover.chunk.6.js:7:50339
_next@http://localhost:5601/37897/bundles/plugin/discover/discover.chunk.6.js:7:50685
201/discover_asyncToGenerator/</<@http://localhost:5601/37897/bundles/plugin/discover/discover.chunk.6.js:7:50831
201/discover_asyncToGenerator/<@http://localhost:5601/37897/bundles/plugin/discover/discover.chunk.6.js:7:50570
setupVisualization@http://localhost:5601/37897/bundles/plugin/discover/discover.chunk.6.js:7:70478
d/</<@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:367:95062
d/<@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:367:95200
$digest@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:367:100369
$apply@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:367:102548
callInDigest@http://localhost:5601/37897/bundles/plugin/kibanaLegacy/kibanaLegacy.plugin.js:1:88215
next@http://localhost:5601/37897/bundles/plugin/kibanaLegacy/kibanaLegacy.plugin.js:1:88429
kbnSharedDeps</d</t.prototype.__tryOrUnsub@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:21:42332
kbnSharedDeps</d</t.prototype.next@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:21:41477
kbnSharedDeps</u</t.prototype._next@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:21:40553
kbnSharedDeps</u</t.prototype.next@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:21:40224
kbnSharedDeps</re</t.prototype.debouncedNext@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:429:97522
oe@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:429:97703
kbnSharedDeps</r</t.prototype._execute@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:298:35758
kbnSharedDeps</r</t.prototype.execute@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:298:35574
kbnSharedDeps</o</t.prototype.flush@http://localhost:5601/37897/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:298:33880

Am I supposed to install a kibana plugin? It also says that "it could not locate the index-pattern-field (id: @timestamp)".
I am wondering why, because the siem_events-* index is working fine in kibana.

I look forward to hearing from you.

Regards

Peter

Firewall Logs over Syslog

Hi

Thanks for this project ...

Can you please explain how to configure DSIEM to accept traffic from firewall devices over syslog 514 (Cisco ASA / Fortinet / Palo Alto / Checkpoint)?

Thanks

!:1 usage

The !IP-address and :1 functions are operational. But when writing a rule, referencing an IP different from the previous stage's, in the form !:1, is not supported. Can you help with this?

SRC_IP & DST_IP info replacement

When writing a rule, how can we cross-reference the SRC_IP and DST_IP information of a parent rule in a new stage, with the two swapped (SRC_IP and DST_IP replacement)?

Using CIDR ranges in 'from' & 'to' fields

Cannot use network CIDR ranges in the directive rule.
"from": "192.168.2.0/24, 10.10.0.0/24",
"to": "!192.168.2.0/24, !10.10.0.0/24",

Error getting:
{"level":"WARN","ts":"2021-11-23T10:13:57.116Z","msg":"Skipping directive ID 3020 'DNS Test' due to error: !192.168.2.0/24 is not a valid IPv4 address or CIDR"}
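The error suggests the leading "!" is handed straight to the address parser. The distinction can be reproduced with Python's ipaddress module; this is a sketch of the check a negation-aware parser would do (it mirrors the symptom, it is not dsiem's actual Go code):

```python
import ipaddress

def valid_entry(value):
    """Validate one 'from'/'to' entry, stripping an optional leading
    '!' (negation) before the address/CIDR check."""
    v = value.strip()
    if v == "ANY":
        return True
    if v.startswith("!"):
        v = v[1:]
    try:
        ipaddress.ip_network(v, strict=False)
        return True
    except ValueError:
        return False
```

With the "!" stripped first, both entries in the directive above parse fine; fed in verbatim, "!192.168.2.0/24" fails exactly as the log shows.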

CustomData problem

When we write and run a correlation rule on the custom data field, an alarm is raised for the first username that matches the rule, and a continuous alarm also fires for similar activities of that user. But no alarm is raised for other users doing similar activity. Could it be a structural problem?

Unable to send events from Logstash to Dsiem

We are trying to send ossec logs from logstash to Dsiem without using Elasticsearch. Below is the logstash configuration. What output plugin is required to get the desired result?
P.S. When I use dtester with the command ./dtester dsiem -f directives_dsiem.json -v -n 10000000, Dsiem shows events; otherwise it shows "0 eps". But the goal is to get events without using dtester.
I'd appreciate an early response.

input {
  syslog {
    port => 514
    add_field => { "application" => "syslog" }
  }
}

filter {

  if [application] == "syslog" {
    clone {
      clones => [ "siem_ossec_events" ]
    }

    if [type] == "siem_ossec_events" {
      mutate {
        id => "tag normalizedEvent 50001"
        remove_field => [ "application" , "type" ]
        add_field => {
          "[@metadata][siem_plugin_type]" => "ossec"
          "[@metadata][siem_data_type]" => "normalizedEvent"
        }
      }
    }
  }

  if [@metadata][siem_plugin_type] == "ossec" {
    date {
      id => "timestamp 50001"
      match => [ "[timestamp]", "ISO8601" ]
      target => [timestamp]
    }
    mutate {
      id => "siem_event fields 50001"
      replace => {
        "title" => "%{[description]}"
        "src_index_pattern" => "ossec-*"
        "sensor" => "%{[host]}"
        "product" => "Host Intrusion Detection System"
        "src_ip" => "%{[src_ip]}"
        "dst_ip" => "%{[dst_ip]}"
        "protocol" => "TCP/IP"
        "category" => "%{[classification]}"
        "plugin_id" => "50001"
        "plugin_sid" => "%{[id]}"
        "custom_label1" => "message"
        "custom_data1" => "%{[message]}"
      }
    }

    mutate {
      id => "integer fields 50001"
      convert => {
        "plugin_id" => "integer"
        "plugin_sid" => "integer"
      }
    }

    if [src_ip] == "%{[src_ip]}" {
      mutate {
        replace => {
          "src_ip" => "0.0.0.0"
        }
        # remove_field => [ "src_ip" ]
      }
    }

    if [custom_data1] == "%{[message]}" { mutate { remove_field => [ "custom_label1", "custom_data1" ]}}

    # delete fields except those included in the whitelist below
    prune {
      whitelist_names => [ "@timestamp$" , "^timestamp$", "@metadata", "^src_index_pattern$", "^title$", "^sensor$", "^product$",
        "^src_ip$", "^dst_ip$", "^plugin_id$", "^plugin_sid$", "^category$", "^subcategory$",
        "^src_port$", "^dst_port$", "^protocol$", "^custom_label1$", "^custom_label2$", "^custom_label3$",
        "^custom_data1$", "^custom_data2$", "^custom_data3$" ]
    }
  }

  if [application] == "dtester" {
    mutate {
      remove_field => [ "application", "beat", "host.name", "source" ]
      add_field => {
        "[@metadata][siem_plugin_type]" => "dtester"
        "[@metadata][siem_data_type]" => "normalizedEvent"
      }
    }
  }
}

output {
  http {
    format=>"json"
    http_method=>"post"
    url=>"http://0.0.0.0:8080"
  }
  stdout { codec => json }
}

[Question] Using Dsiem without ELK

I am hoping to use DSIEM (specifically the correlation engine) without the Elastic stack.

So far, I have gotten some pointers from the FAQ. Currently, I can receive and normalize events using fluentd, and then output them using fluentd's http output.

First use something else other than Logstash for normalizing your logs in accordance to Dsiem normalized event specification. For instance, you can use Fluentd for this purpose.

However, I have not yet figured out how to set up DSIEM to receive events using HTTP. For context, I am using Docker with the defenxor/dsiem:latest image.

Next, send those normalized events to Dsiem through HTTP. Again, should be possible with something like Fluentd HTTP output.

Finally, substitute Filebeat with something else to read Dsiem output (siem_alarms.json), and send it to the final storage or notification destination. In Fluentd this may involve the tail input and json parser plugins sending results to one of Fluentd data output plugin.

Admittedly, I have not spent a lot of time with DSIEM and may have glaring gaps in my understanding of how DSIEM works.

How should I set up and configure DSIEM for this scenario?

ETA:

I have gathered from the logstash config that events are sent as JSON via /events, and I can now successfully receive events. I will reopen this issue if I bump into another blocker.
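For the last step (replacing Filebeat), siem_alarms.json can be consumed as line-delimited JSON, which is what the FAQ's tail-input-plus-json-parser suggestion amounts to. A sketch (it assumes one JSON document per line):

```python
import json

def read_alarms(path):
    """Yield alarm dicts from dsiem's siem_alarms.json output,
    treating the file as one JSON document per line."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)
```

A real consumer would tail the file for new lines instead of reading it once, then forward each dict to whatever storage or notification destination replaces Elasticsearch.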
