otrf / security-datasets
Re-play Security Events
License: MIT License
The "Download & Decompress Dataset" example code doesn't work since the URL value needs to be quoted.
The "Read JSON File" example doesn't work since the file extracted from the zip file at https://raw.githubusercontent.com/OTRF/Security-Datasets/master/datasets/atomic/linux/discovery/host/sh_arp_cache.zip isn't a JSON file. It appears to be a raw Linux audit log.
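A minimal sketch of the documented "Download & Decompress Dataset" step with the URL passed as a quoted Python string, which is the fix this issue asks for. The function name download_and_extract is mine, not from the docs.

```python
import io
import urllib.request
import zipfile

# The dataset URL must be a quoted string, not a bare token.
url = "https://raw.githubusercontent.com/OTRF/Security-Datasets/master/datasets/atomic/linux/discovery/host/sh_arp_cache.zip"

def download_and_extract(url, dest="."):
    """Fetch a zipped dataset and extract its members into dest."""
    with urllib.request.urlopen(url) as resp:
        payload = resp.read()
    with zipfile.ZipFile(io.BytesIO(payload)) as zf:
        zf.extractall(dest)
        return zf.namelist()
```

Note that, as reported above, the extracted file may be a raw audit log rather than JSON, so inspect it before feeding it to a JSON reader.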
Hello,
mordor_file = "https://raw.githubusercontent.com/OTRF/mordor/master/datasets/small/windows/lateral_movement/wmi_event_subscription.pcapng"
registerMordorSQLTable(spark, mordor_file, "mordorTable")
registerMordorSQLTable calls downloadMordorFile to download a .tar.gz or .zip dataset file, but here the dataset extension is .pcapng.
https://github.com/hunters-forge/openhunt/blob/de241cef7cd1a385569590dfb94888e63caeef87/openhunt/mordorutils.py#L11-L19
As a result, the playbook gives error:
UnboundLocalError: local variable 'mordorJSONPath' referenced before assignment
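A small sketch of a guard that would avoid this crash: check the dataset extension before handing the URL to the helper, since only .tar.gz and .zip archives are handled in the linked mordorutils.py code. The helper name is_supported_dataset is hypothetical.

```python
# Only these archive types are decompressed by downloadMordorFile,
# per the linked mordorutils.py lines.
SUPPORTED_SUFFIXES = (".tar.gz", ".zip")

def is_supported_dataset(url):
    """Return True if the URL points at an archive the helper can decompress."""
    return url.endswith(SUPPORTED_SUFFIXES)
```

With a check like this, a .pcapng URL could be rejected with a clear message instead of falling through to the UnboundLocalError.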
After creating a listener and executing initial access, the empire-mordor Docker container does not receive a callback. I have tried resetting the Docker container, but that didn't work.
Hello Hunters,
I have an Elastic cluster up and running, and I was interested in using Mordor's datasets to test rules created with Signals and ElastAlert in order to display a Detection Capabilities dashboard. My question is: does the Mordor project support ECS, since I am mapping all of my events with it?
Thank you for your great work
Curious to know if there are plans to update this
I got the following error when building the environment in the cloud via terraform:
aws_instance.hr001 (remote-exec): C:\Users\User>powershell Restart-Computer -Force
aws_instance.hr001: Creation complete after 6m57s [id=i-099f5c4452ff6374b]
aws_instance.helk: Still creating... [7m0s elapsed]
aws_instance.helk: Still creating... [7m10s elapsed]
aws_instance.helk: Still creating... [7m20s elapsed]
aws_instance.helk: Still creating... [7m30s elapsed]
aws_instance.helk: Still creating... [7m40s elapsed]
aws_instance.helk: Still creating... [7m50s elapsed]
aws_instance.helk: Still creating... [8m0s elapsed]
aws_instance.helk (remote-exec): [HELK-INSTALLATION-INFO] Waiting for some services to be up .....
aws_instance.helk: Still creating... [8m10s elapsed]
aws_instance.helk: Still creating... [8m20s elapsed]
aws_instance.helk: Still creating... [8m30s elapsed]
aws_instance.helk: Still creating... [8m40s elapsed]
aws_instance.helk: Still creating... [8m50s elapsed]
aws_instance.helk: Still creating... [9m0s elapsed]
aws_instance.helk: Still creating... [9m10s elapsed]
aws_instance.helk (remote-exec): ***********************************************************************************
aws_instance.helk (remote-exec): ** [HELK-INSTALLATION-INFO] HELK WAS INSTALLED SUCCESSFULLY **
aws_instance.helk (remote-exec): ** [HELK-INSTALLATION-INFO] USE THE FOLLOWING SETTINGS TO INTERACT WITH THE HELK **
aws_instance.helk (remote-exec): ***********************************************************************************
aws_instance.helk (remote-exec): HELK KIBANA URL: https://172.18.39.6
aws_instance.helk (remote-exec): HELK KIBANA USER: helk
aws_instance.helk (remote-exec): HELK KIBANA PASSWORD: hunting
aws_instance.helk (remote-exec): HELK SPARK MASTER UI: http://172.18.39.6:8080
aws_instance.helk (remote-exec): HELK JUPYTER SERVER URL: http://172.18.39.6/jupyter
aws_instance.helk (remote-exec): HELK JUPYTER CURRENT TOKEN: 11439eeca7d1d331f72748f62b5ad7ddc1aaf5afde66c6de
aws_instance.helk (remote-exec): HELK ZOOKEEPER: 172.18.39.6:2181
aws_instance.helk (remote-exec): HELK KSQL SERVER: 172.18.39.6:8088
aws_instance.helk (remote-exec): IT IS HUNTING SEASON!!!!!
aws_instance.helk: Creation complete after 9m16s [id=i-09fc7e775f4e13969]
Error: error executing "/tmp/terraform_1314486142.sh": Process exited with status 5
I was able to RDP to the Windows boxes and use them properly, but something must not have completed successfully. Would you mind providing a few steps to troubleshoot the build when it does not install properly, like where to look and which logs to check? Thank you!
Since the C2 box will only receive callbacks from other boxes in the private subnet, and they can connect to the C2 via its private IP address, I don't think there is a need to have the port open in the public subnet (even though there is a whitelist).
https://github.com/Cyb3rWard0g/mordor/blob/master/environment/shire/aws/terraform/main.tf#L85
I'm following the tutorial (https://mordordatasets.com/consume/kafka.html), but when it comes to putting the .json files into Elastic through kafkacat, it doesn't work.
root@ubuntu:/tmp/kafkacat# ./kafkacat -b ip:9092 -t winlogbeat -P -l empire_dcsync_dcerpc_drsuapi_DsGetNCChanges_2020-09-21185829.json
% ERROR: Failed to produce message (11500 bytes): Local: Unknown topic
root@ubuntu:/tmp/kafkacat# ./kafkacat -L -b ip:9092
% ERROR: Failed to acquire metadata: Local: Timed out
Hi,
The title page mentions that the project contributes malicious and benign datasets. However, all datasets in the datasets/ folder appear to be malicious. Are any benign datasets included and am I just missing them? Otherwise the title page should perhaps be updated.
The "Download & Decompress Dataset" example code doesn't work since the URL value should be specified as a string, using quotation marks.
The "Read JSON File" example doesn't work since the file extracted from the zip file at https://raw.githubusercontent.com/OTRF/Security-Datasets/master/datasets/atomic/linux/defense_evasion/host/sh_binary_padding_dd.zip isn't a JSON file. It appears to be a raw Linux audit log.
I uploaded empire_apt3_2019-05-14223117 with kafkacat, but I could not find the index in Kibana, and I didn't get any error.
I hit the following exceptions while passing a JSON dataset; submitting a PR shortly:
Problem with the inputs argument:
ayman@iMac mordor % scripts/data-shippers/Mordor-Elastic.py --url http://192.168.20.50:9200 inputs datasets/large/apt29/day1/apt29_evals_day1_manual_2020-05-01225525.json
Initializing Elasticsearch connection and index...
Calulating total file size...
N/A% (0 of 2) | | Elapsed Time: 0:00:00 ETA: --:--:--Traceback (most recent call last):
File "/Volumes/Data/Coding/mordor/scripts/data-shippers/Mordor-Elastic.py", line 69, in
total_size = sum([
File "/Volumes/Data/Coding/mordor/scripts/data-shippers/Mordor-Elastic.py", line 72, in
for member in tarfile.open(path).getmembers() if member.isfile()
File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/tarfile.py", line 1611, in open
return func(name, "r", fileobj, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/tarfile.py", line 1675, in gzopen
fileobj = GzipFile(name, mode + "b", compresslevel, fileobj)
File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/gzip.py", line 173, in init
fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')
FileNotFoundError: [Errno 2] No such file or directory: 'inputs'
I submitted PR #40 to fix this issue
Shipping a JSON file:
ayman@iMac mordor % scripts/data-shippers/Mordor-Elastic.py --url http://192.168.20.50:9200 inputs datasets/large/apt29/day1/apt29_evals_day1_manual_2020-05-01225525.json
Initializing Elasticsearch connection and index...
Calulating total file size...
N/A% (0 of 1) | | Elapsed Time: 0:00:00 ETA: --:--:--Traceback (most recent call last):
File "/Volumes/Data/Coding/mordor/scripts/data-shippers/Mordor-Elastic.py", line 69, in
total_size = sum([
File "/Volumes/Data/Coding/mordor/scripts/data-shippers/Mordor-Elastic.py", line 72, in
for member in tarfile.open(path).getmembers() if member.isfile()
File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/tarfile.py", line 1616, in open
raise ReadError("file could not be opened successfully")
tarfile.ReadError: file could not be opened successfully
Running tar -zcvf on the JSON file fixed my issue. I may submit another PR to handle different input file formats when I have a chance.
I went through the walkthrough for installing HELK, and when I try to ingest the JSON files using the data-shipper script, I get an error saying that it is unable to open the JSON file. I was able to get it to work by instead passing the script a tar.gz dataset, and it shows as complete, but when I go to the Discover tab in Kibana, it shows no logs. Also, when I look at the Elasticsearch index management tab, it shows winlogbeat-mordor and the number of events parsed, but its health status is yellow.
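On a single-node Elasticsearch, which is the usual HELK lab setup, yellow index health typically just means replica shards cannot be assigned anywhere. A hedged sketch, assuming an elasticsearch-py style client object; dropping replicas to 0 is only appropriate for a lab, and the function name drop_replicas is mine.

```python
def drop_replicas(es, index="winlogbeat-mordor"):
    """Set the replica count to 0 so a one-node cluster can report green."""
    es.indices.put_settings(
        index=index,
        body={"index": {"number_of_replicas": 0}},
    )
```

The missing documents in Discover are a separate problem, often a time-range filter or index pattern mismatch rather than shard health.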
FIN6 Template: https://github.com/center-for-threat-informed-defense/adversary_emulation_library/tree/master/fin6
Atomic Test #1 - Registry dump of SAM, creds, and secrets
Local SAM (SAM & System), cached credentials (System & Security), and LSA secrets (System & Security) can be enumerated via three registry keys and then processed locally using https://github.com/Neohapsis/creddump7.
Upon successful execution of this test, you will find three files named sam, system, and security in the %temp% directory.
Supported Platforms: Windows
Attack Commands: Run with command_prompt! Elevation Required (e.g. root or admin)
reg save HKLM\sam %temp%\sam
reg save HKLM\system %temp%\system
reg save HKLM\security %temp%\security
Cleanup Commands:
del %temp%\sam >nul 2> nul
del %temp%\system >nul 2> nul
del %temp%\security >nul 2> nul
Tasks:
Any tips on how to start Caldera and Empire in the RTO box? Please, what is the docker command to do this? I'm struggling to get it to work...
Addition following closed issue #26: currently the recorded data is winlogbeat 6.7 data, which does not follow the ECS field mappings of the current 7+ versions (7.8 to be specific). Is there a way to get the recorded datasets with winlogbeat 7+ mappings, or just the raw Windows event logs?
thanks 👍
Hey, thanks for this great project! I have to say, I don't fully understand it... I see it provides JSON log files that were created after running attacks, but how should I use this information as a defender? I read the README and the introductory blog post, but I still don't understand.
Deployment type:
This will allow end users to utilize the least amount of resources.
Deliverable:
I'm in Europe and I tried to run the latest AWS setup, and I got the following error:
aws_instance.hr001 (remote-exec): C:\Users\User>powershell Restart-Computer -Force
aws_instance.hr001: Creation complete after 6m50s [id=i-0f6b7de902d27a413]
Error: error executing "/tmp/terraform_425406088.sh": Process exited with status 5
Error: Error launching source instance: PendingVerification: Your request for accessing resources in this region is being validated, and you will not be able to launch additional resources in this region until the validation is complete. We will notify you by email once your request has been validated. While normally resolved within minutes, please allow up to 4 hours for this process to complete. If the issue still persists, please let us know by writing to [email protected] for further assistance.
status code: 400, request id: 2ca526e8-3587-4200-9670-c709ffa11bc5
on main.tf line 246, in resource "aws_instance" "helk":
246: resource "aws_instance" "helk" {
Is there a way to dynamically choose regions rather than only using us-west?
root@ttp:/home/pfctpot/mordor/scripts/data-shippers# python3 Mordor-Elastic.py --url http://localhost:9200 inputs apt29_evals_day1_manual_2020-05-01225525.json
Initializing Elasticsearch connection and index...
Traceback (most recent call last):
File "Mordor-Elastic.py", line 41, in <module>
"index.mapping.total_fields.limit": 2000
File "/usr/local/lib/python3.7/dist-packages/elasticsearch/client/utils.py", line 168, in _wrapped
return func(*args, params=params, headers=headers, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/elasticsearch/client/indices.py", line 124, in create
"PUT", _make_path(index), params=params, headers=headers, body=body
File "/usr/local/lib/python3.7/dist-packages/elasticsearch/transport.py", line 415, in perform_request
raise e
File "/usr/local/lib/python3.7/dist-packages/elasticsearch/transport.py", line 388, in perform_request
timeout=timeout,
File "/usr/local/lib/python3.7/dist-packages/elasticsearch/connection/http_urllib3.py", line 275, in perform_request
self._raise_error(response.status, raw_data)
File "/usr/local/lib/python3.7/dist-packages/elasticsearch/connection/base.py", line 331, in _raise_error
status_code, error_message, additional_info
elasticsearch.exceptions.RequestError: RequestError(400, 'resource_already_exists_exception', 'index [winlogbeat-mordor/QsS91DnYSw6SeqSQUNG8JQ] already exists')
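The failure is the script unconditionally creating the index on every run. A sketch of a re-run-safe guard, assuming an elasticsearch-py style client; the function name ensure_index is hypothetical, not from Mordor-Elastic.py.

```python
def ensure_index(es, name, settings):
    """Create the Elasticsearch index only if it does not already exist."""
    if not es.indices.exists(index=name):
        es.indices.create(index=name, body=settings)
```

This avoids resource_already_exists_exception when shipping a second dataset into the same winlogbeat-mordor index.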
I found some mismatches between the reported file type and the actual file type: an inconsistency between the file type and the link for two dataset metadata files. You can detect the mismatches with this grep command:
grep -A1 -r "^- type: Host" | grep -B1 network
Here's the output of this command:
datasets/atomic/_metadata/SDWIN-190319021158.yaml:- type: Host
datasets/atomic/_metadata/SDWIN-190319021158.yaml- link: https://raw.githubusercontent.com/OTRF/Security-Datasets/master/datasets/atomic/windows/discovery/network/empire_shell_samr_EnumDomainUsers.zip
--
datasets/atomic/_metadata/SDWIN-200806031938.yaml:- type: Host
datasets/atomic/_metadata/SDWIN-200806031938.yaml- link: https://raw.githubusercontent.com/OTRF/Security-Datasets/master/datasets/atomic/windows/lateral_movement/network/covenant_sharpsc_stop_dcerpc_smb_svcctl.zip
Howdy,
HELK is healthy and working, but I never used Mordor until today. This is the error while trying to:
kafkacat -b $HELK_IP:9092 -t winlogbeat -P -l empire_dcsync_2019-03-01174830.json
as per https://mordor.readthedocs.io/en/latest/import_mordor.html
Tasks:
What is the process to perform this dynamically?
For example:
I would like to enable firewalls on every endpoint to replicate scenarios like this one, where the FW needs to be enabled:
https://twitter.com/HunterPlaybook/status/1166090088461361154
At the moment, this GPO is enabled: https://github.com/Cyb3rWard0g/mordor/tree/ca93d617f5b5a791cb7a67666a272dbf98602ea5/environment/shire/aws/scripts/DC/GPOBackup/disable_windows_defender_firewall
Can this be done dynamically? Thank you in advance.
Seeing the error "% ERROR: Failed to produce message (62261687 bytes): Broker: Message size too large" for various datasets while trying to send the data to HELK.
Hello guys.
@AverageS and I used another way to upload an index for our demo dashboard. It would be great if you could add it to your docs. Here it is:
Extract all the .tar.gz archives in the small_datasets directory:
cd ./small_datasets
find . -name '*.tar.gz' -exec tar -xzf {} \;
import elasticsearch
import json
import os
es_url = "http://<es_ip/domain>:<es_port>"
es_user = ""
es_pass = ""
index_name = ""
_doc_type = ""
es = elasticsearch.Elasticsearch([es_url],http_auth=(es_user, es_pass))
for i in os.listdir():
    if not i.endswith(".json"):
        continue
    with open(i) as f:
        test = []
        for line in f.readlines():
            test.append(json.loads(line))
        for x in test:
            res = es.index(index=index_name, doc_type=_doc_type, body=x)
            print(res['result'])
We weren't able to install kafkacat on Ubuntu 19.10. We were able to get around this by following the instructions posted in dbcli/mssql-cli#252.
I think this could be helpful on https://mordordatasets.com/import_mordor.html
wget http://archive.ubuntu.com/ubuntu/pool/main/o/openssl1.0/libssl1.0.0_1.0.2n-1ubuntu6_amd64.deb
sudo dpkg -i libssl1.0.0_1.0.2n-1ubuntu6_amd64.deb
I got some errors when trying to download the APT29 dataset:
apt29_evals_day1_manual.zip->Size mismatch
apt29_evals_day2_manual.zip->404 File not Found, 404 response
Hello!
Great project! I was wondering if you could document somewhere how the audit policies for the DC and also for the clients are configured.
It would help a bunch to have this information.
Best,
The data set referenced in this yaml isn't in the repo.
small/linux/discovery/host/sh_binary_padding_dd.zip
Hello Roberto,
First, thank you for building Mordor and providing scripts we can use to build our own datasets! I was able to use Mordor-WinEvents.ps1 successfully with native Windows logs, but I also wonder whether this script could be used to convert some pre-recorded .evtx files into .json. For example, this one:
https://github.com/sbousseaden/EVTX-ATTACK-SAMPLES/raw/master/Persistence/persistence_security_dcshadow_4742.evtx
Is this possible at the moment? I was not able to figure out how.
Thanks! Ludek