🔭 I'm currently working on practical applications for LLMs in cyber security.
💬 Ask me about security architecture, generative AI security
- Security Architecture
- Python
- LLMs
AttackGen is a cybersecurity incident response testing tool that leverages the power of large language models and the comprehensive MITRE ATT&CK framework. The tool generates tailored incident response scenarios based on user-selected threat actor groups and your organisation's details.
License: GNU General Public License v3.0
The script to run the app is called "👋_Welcome.py" and does not get renamed during the setup process. Renaming it to "app.py" and following the instructions to run the Streamlit program allows it to execute as normal.
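If you would rather not rename the script, quoting the emoji filename at the shell also works. A quick sketch of the rename approach, using a placeholder file to stand in for the real script:

```shell
# Stand-in for the emoji-prefixed entry script shipped with AttackGen.
touch "👋_Welcome.py"

# Rename it to a plain ASCII name that is easier to type at the shell.
mv "👋_Welcome.py" app.py

# The app would then be launched as usual (commented out here):
# streamlit run app.py
```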
Hello- What a neat project! I'm attempting to exercise the AttackGen instance hosted at https://attackgen.streamlit.app/
Screenshot 01: On the Welcome page, I paste a freshly created OpenAI API key from my ChatGPT Pro account, press Enter to accept it, and make additional selections.
Screenshot 02: On the Generate Threat Group Scenario page, I click "Generate Scenario".
Screenshot 03: The Generate Threat Group Scenario page then presents the following error:
"An error occurred while generating the scenario: Error code: 404 - {'error': {'message': 'The model gpt-4 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"
Screenshot 04: Returning to the Welcome page, the API field is cleared.
Entering the same API key or a newly created one leads to the same results and error message.
Could you please review and advise if something is not working or how I should change my approach to exercising this AttackGen instance?
Respectfully,
Orlando Stevenson
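For context, a 404 `model_not_found` response usually means the account behind the API key has not been granted access to the requested model (a ChatGPT Pro subscription is separate from API model access). One way an app could degrade gracefully is to fall back to a model the account can actually use; this is a hypothetical sketch, not part of AttackGen — `pick_model` and the preference list are assumptions:

```python
def pick_model(available, preferred=("gpt-4", "gpt-4-turbo-preview", "gpt-3.5-turbo")):
    """Return the first preferred model the account can access, else None.

    `available` would come from the provider's model-listing endpoint.
    """
    for model in preferred:
        if model in available:
            return model
    return None

# Example: an account without GPT-4 access falls back to gpt-3.5-turbo.
print(pick_model({"gpt-3.5-turbo", "whisper-1"}))  # → gpt-3.5-turbo
```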
Hello,
Thank you for the excellent project.
I'm encountering an issue while attempting to use Ollama from a Dockerised AttackGen instance. I've modified the welcome.py file, substituting:
response = requests.get("http://localhost:11434/api/tags")
with:
response = requests.get("http://host.docker.internal:11434/api/tags")
I'm able to retrieve the list of available Ollama models. However, when attempting to use the threat group scenario or a custom scenario, I encounter an error. It seems that the application is attempting to establish an HTTP connection to localhost:11434 instead of host.docker.internal:11434.
An error occurred while generating the scenario: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff53750c20>: Failed to establish a new connection: [Errno 111] Connection refused'))
Could you please advise on what needs to be adapted? I've grepped for other occurrences of 11434 but haven't found anything useful.
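One general way to avoid hardcoded hosts entirely is to read the endpoint from an environment variable, so the same code works on the host (localhost) and inside Docker (host.docker.internal). A minimal sketch — `OLLAMA_HOST` is an assumed variable name here, not something AttackGen currently reads:

```python
import os

def ollama_base_url(default_host: str = "localhost:11434") -> str:
    """Build the Ollama base URL from an env var, falling back to localhost."""
    # OLLAMA_HOST is an illustrative name; the app would define its own.
    host = os.getenv("OLLAMA_HOST", default_host)
    return f"http://{host}"

# On the host:             http://localhost:11434
# In Docker, after setting OLLAMA_HOST=host.docker.internal:11434
```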
Thank you!
Hello, firstly I want to thank you for this tool. It's amazing. I have one issue with "Generate Scenario": when I try this function, it fails. The failure output is as follows:
UI Output
An error occurred while generating the scenario: 'ascii' codec can't encode character '\xa3' in position 16: ordinal not in range(128)
Console output
/app/pages/1_🛡️_Threat_Group_Scenarios.py:198: DeprecationWarning: DataFrameGroupBy.apply operated on the grouping columns. This behavior is deprecated, and in a future version of pandas the grouping columns will be excluded from the operation. Either pass include_groups=False to exclude the groupings or explicitly select the grouping columns after groupby to silence this warning.
.apply(lambda x: x.sample(n=1) if len(x) > 0 else None)
/usr/local/lib/python3.12/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The class langchain_community.chat_models.openai.ChatOpenAI was deprecated in langchain-community 0.0.10 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run `pip install -U langchain-openai` and import as `from langchain_openai import ChatOpenAI`.
warn_deprecated(
Thanks in advance for your help
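The `'\xa3'` in the error is the pound sign (£): somewhere a string containing it is being encoded as ASCII, which cannot represent that character. A small reproduction and the usual fix (encode explicitly as UTF-8) — the variable names are illustrative only:

```python
text = "Budget: £500"  # contains U+00A3, which ASCII cannot represent

# Reproduce the reported error:
try:
    text.encode("ascii")
except UnicodeEncodeError as exc:
    print(exc)  # 'ascii' codec can't encode character '\xa3' ...

# Fix: be explicit about UTF-8 when encoding strings or writing files.
data = text.encode("utf-8")
assert data.decode("utf-8") == text
```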
A secrets file was listed in the .gitignore file but was used as the location for an environment variable, causing an error during runtime. Removed it since it wasn't necessary, as the OpenAI key was already utilised in the app.
The little hand-wavy thing at the beginning of the file name 👋_Welcome.py is a PITA when it comes to the command line. Would you consider renaming it to just "Welcome.py"?
How can I use a local Ollama instance?
Hi @mrwadams, this is the error that is displayed while trying to access the sections "Threat_Group_Scenarios", "Custom_Scenarios", and "AttackGen_Assistant":
ModuleNotFoundError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you're on Streamlit Cloud, click on 'Manage app' in the lower right of your app).
Traceback:
File "/home/adminuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 600, in _run_script
exec(code, module.__dict__)
File "/mount/src/attackgen/pages/1_🛡️_Threat_Group_Scenarios.py", line 6, in <module>
from langchain_community.llms import Ollama
After starting the welcome script using Python 3.11.9 and Streamlit 1.33, I entered my OpenAI API key and clicked on "Threat Group Scenarios." At this point I received the error below about a missing Streamlit secrets file:
FileNotFoundError: No secrets files found. Valid paths for a secrets.toml file are: /home/username/.streamlit/secrets.toml, /home/username/sh/learning/chatgpt-for-cybersecurity/attackgen/.streamlit/secrets.toml
Traceback:
File "/home/username/mambaforge/envs/openai/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
exec(code, module.__dict__)
File "/home/username/sh/learning/chatgpt-for-cybersecurity/attackgen/pages/1_🛡️_Threat_Group_Scenarios.py", line 30, in <module>
if "LANGCHAIN_API_KEY" in st.secrets:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/username/mambaforge/envs/openai/lib/python3.11/site-packages/streamlit/runtime/secrets.py", line 345, in __contains__
return key in self._parse(True)
^^^^^^^^^^^^^^^^^
File "/home/username/mambaforge/envs/openai/lib/python3.11/site-packages/streamlit/runtime/secrets.py", line 214, in _parse
raise FileNotFoundError(err_msg)
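In recent Streamlit versions, even checking `"LANGCHAIN_API_KEY" in st.secrets` raises this error when no secrets file exists at all. Creating a minimal (even empty) secrets file in the project's `.streamlit/` directory is usually enough to get past the check; the commented key below is only needed if you actually use LangSmith:

```toml
# .streamlit/secrets.toml — the file just has to exist; contents are optional.
# LANGCHAIN_API_KEY = "<your LangSmith key>"
```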
I want to use AttackGen but do not have a LangChain API key. This project's documentation indicates:
"If you do not wish to use LangSmith, you can delete the LangSmith related environment variables from the top of the following files"
I commented out the environment variables section in those files. I tried all four environment variables, then just the last one (API_KEY), but receive the same failure:
File "/home/bbb/venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 542, in _run_script
exec(code, module.__dict__)
File "/.../attackgen/attackgen/pages/1_🛡️_Threat_Group_Scenarios.py", line 28, in <module>
client = Client()
^^^^^^^^
File "/home/bbb/venv/lib/python3.11/site-packages/langsmith/client.py", line 480, in __init__
_validate_api_key_if_hosted(self.api_url, self.api_key)
File "/home/bbb/venv/lib/python3.11/site-packages/langsmith/client.py", line 269, in _validate_api_key_if_hosted
raise ls_utils.LangSmithUserError(
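A defensive pattern that avoids this class of failure is to construct the LangSmith client only when a key is actually configured. This is a sketch of the general idea, not AttackGen's current code; `make_langsmith_client` is a hypothetical helper:

```python
import os

def make_langsmith_client():
    """Return a LangSmith Client only when an API key is configured, else None."""
    if not os.getenv("LANGCHAIN_API_KEY"):
        return None
    from langsmith import Client  # imported lazily so the dependency stays optional
    return Client()

# Downstream code then checks for None instead of crashing at import time.
```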
Hello. I'm sure it's something simple and that I'm just being clumsy, but I've tried on the website and by cloning the repository locally, and when I generate a scenario, I get the following error message:
An error occurred while generating the scenario: Error code: 404 - {'error': {'message': 'The model gpt-4-turbo-preview does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Thank you in advance.
Hi,
It would be really useful to be able to ask it more questions and to elaborate on what has been generated.
Something like the following.
Is this repo a one-off PoC, or do you plan to keep investing in improving it?
--- "a/pages/2_\360\237\233\240\357\270\217_Custom_Scenarios.py"
+++ "b/pages/2_\360\237\233\240\357\270\217_Custom_Scenarios.py"
@@ -441,7 +441,35 @@ try:
st.markdown("---")
st.markdown(st.session_state['custom_scenario_text'])
st.download_button(label="Download Scenario", data=st.session_state['custom_scenario_text'], file_name="custom_scenario.md", mime="text/markdown")
-
+ st.session_state["question"] = st.text_input("Ask question:")
+ if st.button("Send Question", key="ask_question"):
+ if st.session_state["question"]:
+ if 'custom_scenario_text' in st.session_state:
+ st.markdown("---")
+ original_text = st.session_state['custom_scenario_text']
+ question = st.session_state['question']
+ messages.append(st.session_state['custom_scenario_text'])
+ messages.append(HumanMessage(
+ content=question
+ ))
+ model = os.getenv('OLLAMA_MODEL')
+ endpoint = os.getenv('OLLAMA_ENDPOINT')
+ st.markdown("Querying LLM")
+ llm = Ollama(model=model, base_url=f"http://{endpoint}")
+ response = llm.invoke(messages, model=model)
+ st.markdown("---")
+ all_content = original_text + "\n\n---\n" + question + "\n\n---\n" + response + "\n\n"
+ st.markdown(original_text)
+ st.markdown("---")
+ st.markdown(question)
+ st.markdown("---")
+ st.markdown(response)
+ st.markdown("---")
+ st.session_state['custom_scenario_text'] = all_content
+ st.session_state.pop('question')
+ else:
+ st.markdown("You must generate a scenario first")
+ st.session_state.pop("question")