ebiquity / casie
CyberAttack Sensing and Information Extraction
How do I run it? Can you give detailed steps?
Does anyone know how to generate the .content.nostop.label, .content.json, and .ner.json files used as train and test input? Thanks in advance. Cheers.
Any recommendation on how to address the absence of the .content.nostop.label and .content.json files in the provided corpus?
Can you please provide some help on this issue in order to reproduce the work?
Thank you.
Dear author,
I have preliminarily reproduced the results after supplementing some missing code based on the materials you provided.
However, there is only the Domain-Word2Vec model under the embedding folder you provide; the Transfer-Word2Vec, Cyber-Word2Vec, and pre-built BERT Word2Vec models are missing.
Would it be convenient for you to supply these three missing word2vec model files?
Your reply would be of great help to my research.
I am looking forward to your reply.
Thank you again!
The line "from tensorflow import set_random_seed", which appears in the following files:
- nug_arg_detection.py
- parseJsontoFeatures.py
- prepare.py
- x2index.py
- nug_arg_detection_bert.py
causes the following error:
ImportError: cannot import name 'set_random_seed'
In TensorFlow 2, tf.random.set_seed() should be used instead.
Are you using TensorFlow 1, contrary to what requirements.txt says?
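In the meantime, a version-agnostic seeding helper works around the error. This is just a sketch; the helper name is mine, not from the repo, and it seeds whichever libraries happen to be installed:

```python
import random


def set_all_seeds(seed):
    """Seed Python, NumPy, and whichever TensorFlow major version is installed."""
    random.seed(seed)
    try:
        import numpy as np
        np.random.seed(seed)
    except ImportError:
        pass  # numpy not installed
    try:
        import tensorflow as tf
        if hasattr(tf.random, "set_seed"):
            tf.random.set_seed(seed)  # TF 2.x API
        else:
            tf.set_random_seed(seed)  # TF 1.x API
    except ImportError:
        pass  # tensorflow not installed; nothing more to seed
```

Replacing the failing import with a call to a helper like this lets the same scripts run under both TF 1.x and TF 2.x.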
Hi,
Can you please help me with data generation for content.nostop.label and content.json?
In parseJsontoFeatures.py (lines 25 and 212), what is "wd_search"? Googling turns up nothing, so I guess it is your own code. Would you mind sharing it with me? Thank you very much.
The line "import wd_search" in parseJsontoFeatures.py fails with the following error:
ModuleNotFoundError: No module named 'wd_search'
In which package is this module located? Please advise.
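In the meantime I worked around it with a stand-in. Judging by the name, wd_search is probably a local helper for Wikidata search, so the sketch below wraps the public wbsearchentities endpoint; the function names, signatures, and return shape are all my guesses, not the authors':

```python
WIKIDATA_API = "https://www.wikidata.org/w/api.php"


def build_search_params(term, language="en", limit=5):
    """Build query parameters for Wikidata's wbsearchentities endpoint."""
    return {
        "action": "wbsearchentities",
        "search": term,
        "language": language,
        "limit": limit,
        "format": "json",
    }


def wd_search(term):
    """Hypothetical stand-in for the missing wd_search module (needs network)."""
    import requests  # imported here so the offline helper above has no dependency
    resp = requests.get(WIKIDATA_API, params=build_search_params(term), timeout=10)
    resp.raise_for_status()
    return [hit["label"] for hit in resp.json().get("search", [])]
```

Whether the real wd_search returns labels, entity IDs, or something richer is unknown; this only unblocks the import so the rest of parseJsontoFeatures.py can run.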
nug_arg_detection.py: error: the following arguments are required: -trainfile, -testfile, -directory
Requirements installation (pip install -r requirements.txt) fails due to package versions that cannot be found and version conflicts between packages.
For example (copied from pip output):
ERROR: Cannot install -r requirements.txt (line 1) and numpy==1.17.2 because these package versions have conflicting dependencies.
The conflict is caused by:
The user requested numpy==1.17.2
bert-embedding 1.0.1 depends on numpy==1.14.6
Version 1.0.1 of bert-embedding appears to be the only one present on pypi.org, so I removed the numpy version pin to let pip choose the best one. But that caused another problem:
tensorflow-gpu 2.0.0 requires numpy<2.0,>=1.16.0, but you have numpy 1.14.6 which is incompatible.
Also, I could not install tensorflow-gpu 2.0.0 because pip said there was no compatible wheel. Fortunately, it succeeded when I downloaded the wheel to my local file system and installed it from there. But, in addition to the issues mentioned above, pip complains about the following:
tensorflow-gpu 2.0.0 requires keras-applications>=1.0.8, but you have keras-applications 1.0.7 which is incompatible.
tensorflow-gpu 2.0.0 requires tensorboard<2.1.0,>=2.0.0, but you have tensorboard 1.13.1 which is incompatible.
tensorflow-gpu 2.0.0 requires tensorflow-estimator<2.1.0,>=2.0.0, but you have tensorflow-estimator 1.13.0 which is incompatible.
keras-contrib is not present on pypi.org. It can be installed from GitHub, but the specified version is not found. (Actually, no specific version can be found in that repository.)
If the current version is OK, could you please replace the respective line in requirements.txt with the following:
git+https://[email protected]/keras-team/keras-contrib.git
BTW, the repository README now says they are migrating to tensorflow/addons. If it's OK to install that instead, please suggest which version to take.
Also, pywikibot==3.0.dev0 is no longer found on pypi.org. I guess the closest would be pywikibot==3.0.20170403, but its installation fails because the wheel metadata says the version is 3.0.dev0, which differs from the wheel name.
Also, requests==2.21.0 requires a version of urllib3 different from the one specified (1.25.7).
I am using Python 3.6.13 on Ubuntu 20.04. Do you recommend another version of Python? I might be wrong, but if I remember correctly, some of the listed package versions require Python 3.6 specifically.
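For what it's worth, here is the requirements sketch I ended up with. The pins are my own inferences from the pip errors above, not anything the authors confirmed:

```
# Sketch only: pins inferred from the pip conflict messages, unverified by the authors
numpy>=1.16.0,<2.0          # satisfies tensorflow-gpu 2.0.0; drops the ==1.17.2 pin
tensorflow-gpu==2.0.0
keras-applications>=1.0.8
tensorboard>=2.0.0,<2.1.0
tensorflow-estimator>=2.0.0,<2.1.0
git+https://[email protected]/keras-team/keras-contrib.git
pywikibot==3.0.20170403     # closest available; wheel metadata mismatch may still bite
requests==2.21.0
```

Note that bert-embedding 1.0.1 still pins numpy==1.14.6, so the conflict cannot be resolved inside one requirements file; installing it separately with pip install --no-deps bert-embedding is one escape hatch.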
I want to use my own articles as test data and generate all the json files: .json, .content.json, and .content.nostop.json.
Does anyone know how to create the json files in the annotation folder provided by the author?
Hi,
I was able to create the .content.json files (using Stanford CoreNLP) and the .content.nostop.label files (with this repo). Now the program is looking for .ner.json files. Could you please provide a sample of these files so I know how to generate them? Many thanks in advance.
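For reference, since the authors' schema is undocumented, here is roughly the shape I assumed when building the content files. The field names (sentences, tokens), the regex tokenization, and the tiny stopword list are all my guesses for illustration, not the repo's actual format:

```python
import json
import re

# Minimal stopword list for illustration; a real run would use NLTK's full list.
STOPWORDS = {"a", "an", "the", "is", "are", "was", "were", "of", "to", "in"}


def article_to_content_json(text):
    """Split an article into sentences and tokens (assumed .content.json shape)."""
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    return {"sentences": [{"tokens": re.findall(r"\w+", s)} for s in sentences]}


def drop_stopwords(content):
    """Remove stopwords from every sentence (assumed .content.nostop shape)."""
    return {
        "sentences": [
            {"tokens": [t for t in sent["tokens"] if t.lower() not in STOPWORDS]}
            for sent in content["sentences"]
        ]
    }


doc = article_to_content_json("Attackers exploited a flaw. Data was stolen in the breach.")
print(json.dumps(drop_stopwords(doc)))
```

A sample .ner.json from the authors would let the same approach be extended with entity spans from CoreNLP's ner annotator.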
There is a ModuleNotFoundError: No module named 'wd_search', and I could not find a related package on PyPI. Have the related functions been renamed?
Hi,
I got an error:
realis_identify.py: error: the following arguments are required: -trainfile, -testfile, -directory, -label
I am wondering how to solve this.
Any help is appreciated.
Thx