Support for Llama3 / Ollama supported LLMs #114
Comments
Hi @t3chn0m4g3, nice to meet you! Thanks for your contribution; I will take charge of this new feature in the following days :) Cheers Mario
@mariocandela Likewise, this is great news! Looking forward to it!
Hi @t3chn0m4g3, download the latest version and take a look here: https://github.com/mariocandela/beelzebub?tab=readme-ov-file#honeypot-llm-honeypots Happy hacking ❤️ Cheers Mario
@mariocandela Closing this is fine, thank you very much for taking this on at warp speed 🤩🚀 I can now start testing; if everything works out I will be happy to integrate this into T-Pot. Thanks again ❤️
@t3chn0m4g3 If I can help you in any way, write to me; I am happy to help you 😊
@mariocandela Thank you, highly appreciated 😍
Lab is set up, now getting to work :)
@mariocandela - Regarding beelzebub/plugins/llm-integration.go, lines 13 to 18 at 628e20e: is there any option in the config or the CLI to override the endpoint with a different host (assuming that Ollama will probably not reside on the honeypot)?
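For anyone landing here later: in the released version the endpoint is configurable per service in the YAML config. A minimal sketch, assuming the field names (`llmModel`, `host`) from the project README of that era; check the linked docs for the authoritative schema:

```yaml
# Hypothetical sketch of an SSH LLM-honeypot service config.
# The llmModel and host keys under plugin are assumptions based
# on the README at the time and may differ in current releases.
apiVersion: "v1"
protocol: "ssh"
address: ":2222"
description: "SSH interactive Llama3"
commands:
  - regex: "^(.+)$"
    plugin: "LLMHoneypot"
serverVersion: "OpenSSH"
serverName: "ubuntu"
passwordRegex: "^(root|password)$"
deadlineTimeoutSeconds: 60
plugin:
  llmModel: "llama3"
  # Point at an Ollama instance on a separate host instead of the honeypot itself
  host: "http://ollama.example.internal:11434/api/chat"
```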
@mariocandela Reading helps... sorry...
@t3chn0m4g3 Don't worry mate, you can find me here for anything 😄 PS: these days I'm working on the documentation ❤️ Thanks for your time!
Thank you @mariocandela! ❤️
@mariocandela It is working 🚀😁❤️ I have some issues (T-Pot => Elastic, Kibana objects) with the logging format. Currently beelzebub/protocols/strategies/ssh.go, lines 32 to 42 at 628e20e, does not split RemoteAddr into ip and port.
I am unfamiliar with Go, so I managed to get this to work as an example:
There is probably an easier / better way 😅, maybe you have an idea? I don't know if the changes lead to any problems on your end. If not, let me know, happy to contribute.
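The change described above (splitting `RemoteAddr` into separate ip and port fields for the event log) can be sketched with Go's standard library. This is a minimal illustration, not the actual patch from the thread; the helper name is hypothetical:

```go
package main

import (
	"fmt"
	"net"
)

// splitRemoteAddr breaks an "ip:port" remote address (as returned by
// net.Conn.RemoteAddr().String()) into separate ip and port strings,
// so they can be logged as distinct fields for Elastic/Kibana.
// Hypothetical helper name, not an identifier from Beelzebub itself.
func splitRemoteAddr(remoteAddr string) (ip, port string) {
	host, p, err := net.SplitHostPort(remoteAddr)
	if err != nil {
		// No port component; log the raw address as the ip.
		return remoteAddr, ""
	}
	return host, p
}

func main() {
	ip, port := splitRemoteAddr("203.0.113.7:51234")
	fmt.Printf("ip=%s port=%s\n", ip, port) // ip=203.0.113.7 port=51234
}
```

`net.SplitHostPort` also handles bracketed IPv6 literals such as `[2001:db8::1]:22`, which a naive split on `:` would get wrong.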
Hi @t3chn0m4g3, sorry for the delay, I added the two pieces of information useful for Kibana :) https://github.com/mariocandela/beelzebub/releases/tag/v3.2.4 To speed up the process I applied the change myself; your solution is valid 😄 Thanks for your time ❤️
Awesome! Thank you! 🤩
Is your feature request related to a problem? Please describe.
Using OpenAI API can involve unforeseen costs.
Describe the solution you'd like
Adding support for Llama3 / Ollama-supported LLMs would allow costs to be kept predictable and make larger installations feasible.