endgameinc / eqllib
License: MIT License
This is not really a bug, just outdated info in the docs.
eqllib needs the more-itertools package, which requires Python 3:
ERROR: Package 'more-itertools' requires a different Python: 2.7.17 not in '>=3.5'
This dependency makes eqllib fail at launch under Python 2.7:
Traceback (most recent call last):
  File "/usr/local/bin/eqllib", line 11, in <module>
    load_entry_point('eqllib==0.2.0', 'console_scripts', 'eqllib')()
  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 489, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2852, in load_entry_point
    return ep.load()
  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2443, in load
    return self.resolve()
  File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2449, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
  File "/home/seppala/devel/src/coursera/exercises/holiday_hack/eqllib/eqllib/__init__.py", line 4, in <module>
    from .loader import Configuration
  File "/home/seppala/devel/src/coursera/exercises/holiday_hack/eqllib/eqllib/loader.py", line 12, in <module>
    from .schemas import Analytic, BaseNormalization, make_normalization_schema
  File "/home/seppala/devel/src/coursera/exercises/holiday_hack/eqllib/eqllib/schemas.py", line 4, in <module>
    import jsonschema
  File "/usr/local/lib/python2.7/dist-packages/jsonschema-3.2.0-py2.7.egg/jsonschema/__init__.py", line 33, in <module>
    import importlib_metadata as metadata
  File "/usr/local/lib/python2.7/dist-packages/importlib_metadata-1.3.0-py2.7.egg/importlib_metadata/__init__.py", line 9, in <module>
    import zipp
  File "/usr/local/lib/python2.7/dist-packages/zipp-0.6.0-py2.7.egg/zipp.py", line 12, in <module>
    import more_itertools
  File "build/bdist.linux-x86_64/egg/more_itertools/__init__.py", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/more_itertools-8.0.2-py2.7.egg/more_itertools/more.py", line 460
    yield from iterable
               ^
SyntaxError: invalid syntax
Steps to reproduce the behavior:
Just pointing this out in case anyone (like me) is still living in the past and using python2.7.
Best,
Anna
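A quick guard one can run before installing, as a sketch (the version floor comes from the more-itertools error above):

```python
import sys

# more-itertools 8.x (pulled in via jsonschema -> importlib_metadata -> zipp)
# declares python_requires '>=3.5', so fail fast on older interpreters
# instead of hitting a SyntaxError deep inside the dependency chain.
assert sys.version_info >= (3, 5), "eqllib's dependencies need Python 3.5+"
print("Python version OK:", sys.version.split()[0])
```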
Sysmon mapping: original_file_name = "OriginalFileName"
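For context, a sketch of where such a mapping line could live in the Sysmon source TOML; the table name follows the `[fields.mapping]` structure shown in the Bro source file later in this page, and should be treated as an assumption:

```toml
# Hypothetical fragment of the Microsoft Sysmon source file:
# map the normalized original_file_name field to Sysmon's
# OriginalFileName event attribute.
[fields.mapping]
original_file_name = "OriginalFileName"
```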
Hi,
Would it be possible to query data from multiple JSON files, instead of having to merge everything into a single JSON file first?
Regards,
Aniruddha D.
All analytics examples use double quotes, e.g.
process where subtype.create and
process_name == "net.exe" and command_line == "* view*" and command_line != "\\"
whereas the query should use single quotes, i.e. "process ... process_name == 'net.exe' ...", to work.
Just a cosmetic issue, but fixing it would make copy & paste easier :-)
The option --source is no longer there, and -f is gone as well.
Example in docs:
eqllib query -f my-sysmon-data.json --source "Microsoft Sysmon" "process where process_name in ('ipconfig.exe', 'netstat.exe', 'systeminfo.exe', 'route.exe')"
Actual working:
eqllib query -s 'Microsoft Sysmon' "process where process_name in ('ipconfig.exe', 'netstat.exe', 'systeminfo.exe', 'route.exe')"
Add DNS events to the Security Events schema and to the mapping for Sysmon conversion.
Field mappings
query_name = "QueryName"
query_results = "QueryResults"
query_status = "QueryStatus"
I think the rest should be generic fields already mapped.
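A sketch of what the requested Sysmon addition might look like, following the TOML source-file format shown elsewhere on this page; the event table name is an assumption (Sysmon logs DNS queries as Event ID 22):

```toml
# Hypothetical addition to the Microsoft Sysmon source file
# for DNS query events (Sysmon Event ID 22).
[events.dns.mapping]
query_name = "QueryName"
query_results = "QueryResults"
query_status = "QueryStatus"
```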
No sources available when running eqllib convert-data -h:
C:\Users\IEUser\eqllib>eqllib convert-data -h
usage: eqllib convert-data [-h] [--encoding ENCODING]
                           [--format {json,jsonl,json.gz,jsonl.gz}]
                           [--source {}]
                           input-file output-file

positional arguments:
  input-file            Input JSON file
  output-file           Output JSON file

optional arguments:
  -h, --help            show this help message and exit
  --encoding ENCODING, -e ENCODING
                        Encoding of input file
  --format {json,jsonl,json.gz,jsonl.gz}
  --source {}, -s {}    Data source
Steps to reproduce the behavior:
After running the third command, there were a couple of warnings:
warning: manifest_maker: MANIFEST.in, line 3: path 'eqllib/domains/' cannot end with '/'
warning: manifest_maker: MANIFEST.in, line 4: path 'eqllib/sources/' cannot end with '/'
warning: manifest_maker: MANIFEST.in, line 5: path 'eqllib/analytics/' cannot end with '/'
C:\Users\IEUser\eqllib>eqllib convert-data -h
usage: eqllib convert-data [-h] [--encoding ENCODING]
                           [--format {json,jsonl,json.gz,jsonl.gz}]
                           [--source {security,MITRE Cyber Analytics Repository,Microsoft Sysmon}]
                           input-file output-file

positional arguments:
  input-file            Input JSON file
  output-file           Output JSON file

optional arguments:
  -h, --help            show this help message and exit
  --encoding ENCODING, -e ENCODING
                        Encoding of input file
  --format {json,jsonl,json.gz,jsonl.gz}
  --source {security,MITRE Cyber Analytics Repository,Microsoft Sysmon}, -s {security,MITRE Cyber Analytics Repository,Microsoft Sysmon}
                        Data source
I fixed it by removing the trailing '/' from those paths (as the warnings suggest). Not sure if relevant, but I'm using Win10 & Python 3.7.4.
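For reference, a hedged sketch of the fix; the directive name is an assumption based on the warning text, and the real MANIFEST.in may use different directives:

```text
# MANIFEST.in — illustrative only.
# Before (warns: path cannot end with '/'):
#   graft eqllib/domains/
# After (trailing slash removed):
graft eqllib/domains
graft eqllib/sources
graft eqllib/analytics
```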
When running convert-query on a field that uses a normalization function (e.g. baseName(path)), the normalization function is not stripped from the converted query in a handful of queries. This can lead to issues down the road if the function is not found as expected.
Steps to reproduce the behavior:
$ eqllib convert-query -s "Endgame Platform" "registry where registry_value == '*foo*'"
registry where baseName(key_path) == "*foo*"
This fails in particular because there is no translation rule for normalizing wildcard comparisons. If you were checking for foo with no wildcards, you'd get something like this, which is one case where normalization works as expected.
$ eqllib convert-query -s "Endgame Platform" "registry where registry_value == 'foo'"
registry where key_path == "*\\foo"
Instead of linking to undefined functions, an exception should be raised, or the underlying field should be returned as is.
$ eqllib convert-query -s "Endgame Platform" "registry where registry_value == '*blah'"
registry where key_path == "*\\*blah"
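The fix proposed above (raise an exception, or return the underlying field as-is) can be sketched as follows; every name here is hypothetical, not eqllib's actual internals:

```python
def convert_field(field, normalizer, known_functions, strict=True):
    """Sketch: convert a normalized field for a target source, refusing to
    emit calls to normalization functions the target doesn't define.

    normalizer maps field -> (function_name, underlying_field), e.g.
    {"registry_value": ("baseName", "key_path")}.
    """
    func = normalizer.get(field)
    if func is None:
        # No normalization needed; the field passes through unchanged.
        return field
    name, underlying = func
    if name not in known_functions:
        if strict:
            # Fail loudly instead of emitting an undefined function call.
            raise ValueError("unknown normalization function: " + name)
        # Fall back to the bare underlying field.
        return underlying
    return "{}({})".format(name, underlying)
```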
I created a custom source file to parse Bro logs. By default, Bro has key names containing dots, like id.orig_h or id.resp_h. When I set destination_address = 'id.orig_h' and run eqllib, it ignores this mapping. However, if I manually change id.orig_h to dest_addr in the JSON log file and change my source-file statement to destination_address = 'dest_addr', it works.
bro-source.toml
name = "Bro events"
strict = true
domain = "bro-domain"
filter_query = true
[timestamp]
field = "ts"
format = "%Y-%m-%d %H:%M:%S.%f"
[fields.mapping]
ts = "ts"
uid = "uid"
destination_address = 'id.orig_h'
[events.bro_conn]
filter = "conn_state"
[events.bro_conn.mapping]
proto = 'proto'
conn_state = 'conn_state'
local_orig = 'local_orig'
local_resp = 'local_resp'
bro-domain.toml
name = "bro-domain"
fields = [
# Common Fields
"ts",
"uid",
"destination_address"
]
[events.bro_conn]
fields = [
"proto",
"conn_state",
"local_orig",
"local_resp",
"missed_bytes"
]
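The manual rename described above can be automated by flattening the dotted Bro key names (e.g. id.orig_h to id_orig_h) before feeding the log to eqllib; the source file would then map destination_address = 'id_orig_h'. A minimal sketch, with an illustrative record and separator:

```python
import json

def flatten_keys(record, sep="_"):
    """Replace dots in top-level key names so tools that treat '.' as a
    nested-field separator don't misread flat Bro/Zeek JSON keys."""
    return {key.replace(".", sep): value for key, value in record.items()}

# Example Bro conn record with dotted keys:
event = {"ts": "2019-12-20 10:00:00.0", "id.orig_h": "10.0.0.5", "conn_state": "SF"}
print(json.dumps(flatten_keys(event)))
```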
I automatically call optimize on all BaseNodes, but only expressions need to be optimized.
There was an issue where | unique_count wouldn't get optimized correctly. The bug could probably be fixed in eql directly or here; I found a workaround here, so this works too.
Try to normalize a query with | unique_count in it.
Expected behavior: not an exception.
Instead, got this traceback:
Traceback (most recent call last):
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/unittest/case.py", line 59, in testPartExecutor
yield
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/unittest/case.py", line 615, in run
testMethod()
File "/Users/rwolf/Development/eqllib/tests/test_normalization.py", line 119, in test_unmodified_unique_count
normalizer.normalize_ast(original)
File "/Users/rwolf/Development/eqllib/eqllib/normalization.py", line 244, in normalize_ast
return QueryNormalizer(self).walk(node)
File "/Users/rwolf/Development/eqllib/eqllib/normalization.py", line 33, in walk
node = super(QueryNormalizer, self).walk(node, *args, **kwargs)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 225, in walk
slots = [self.walk(v, *args, **kwargs) for name, v in node.iter_slots()]
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 225, in <listcomp>
slots = [self.walk(v, *args, **kwargs) for name, v in node.iter_slots()]
File "/Users/rwolf/Development/eqllib/eqllib/normalization.py", line 33, in walk
node = super(QueryNormalizer, self).walk(node, *args, **kwargs)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 216, in walk
rv = self.autowalk(node, *args, **kwargs)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 178, in autowalk
return [self.walk(n, *args, **kwargs) for n in node]
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 178, in <listcomp>
return [self.walk(n, *args, **kwargs) for n in node]
File "/Users/rwolf/Development/eqllib/eqllib/normalization.py", line 35, in walk
node = node.optimize()
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/ast.py", line 100, in optimize
return Optimizer(recursive=recursive).walk(self)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/optimizer.py", line 24, in walk
return Walker.walk(self, node, *args, **kwargs)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 188, in walk
rv = self.autowalk(node, *args, **kwargs)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 184, in autowalk
return dict({self.walk(k, *args, **kwargs): self.walk(v, *args, **kwargs) for k, v in node.items()})
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/contextlib.py", line 119, in __exit__
next(self.gen)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 172, in set_context
exit_method(node)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/walkers.py", line 116, in _exit_pipe_command
output_schemas = node.output_schemas([NodeInfo(a, TypeHint.Unknown) for a in node.arguments], incoming_schema)
File "/Users/rwolf/.conda/envs/eqllib/lib/python3.7/site-packages/eql/pipes.py", line 129, in output_schemas
first_event_type, = event_schemas[0].schema.keys()
IndexError: list index out of range
The EQL query for "T1174 Password Filter DLL" shows
registry where hive.hklm and
registry_path == "SYSTEM\ControlSet\Control\Lsa\Notification Packages*"
| unique registry_path, process_path
The registry path should be "SYSTEM\*ControlSet*\Control\Lsa\Notification Packages" (that's * around ControlSet), as the condition above does not allow searching for LSA "Notification Packages" under "CurrentControlSet".
https://eqllib.readthedocs.io/en/latest/analytics/ae6ae50f-69f3-4e85-bfe2-2db9d1422517.html
I tried different queries, but every time I get this error: "run_query() got an unexpected keyword argument 'columns'"
$ git clone https://github.com/endgameinc/eqllib
$ cd eqllib
$ python setup.py install
$ sudo eqllib query -f sysmon-data.json "process where command_line = '*l*'"
Traceback (most recent call last):
File "/usr/local/bin/eqllib", line 11, in <module>
load_entry_point('eqllib==0.2.0', 'console_scripts', 'eqllib')()
File "/usr/local/lib/python3.6/dist-packages/eqllib-0.2.0-py3.6.egg/eqllib/main.py", line 231, in normalize_main
return func(config=config, **kv)
TypeError: run_query() got an unexpected keyword argument 'columns'