
Support for Llama3 / Ollama supported LLMs #114

Closed
t3chn0m4g3 opened this issue Jul 18, 2024 · 15 comments · Fixed by #115
Labels
enhancement New feature or request

Comments

@t3chn0m4g3
Contributor

Is your feature request related to a problem? Please describe.
Using the OpenAI API can involve unforeseen costs.

Describe the solution you'd like
Adding support for Llama3 / Ollama-supported LLMs could allow for streamlining investment and for larger installations.

@mariocandela
Owner

Hi @t3chn0m4g3,

Nice to meet you!

Thanks for your contribution; I will take charge of this new feature in the coming days :)

Cheers

Mario

@t3chn0m4g3
Contributor Author

t3chn0m4g3 commented Jul 18, 2024

@mariocandela Likewise, this is great news! Looking forward to it!

@mariocandela
Owner

mariocandela commented Jul 21, 2024

Hi @t3chn0m4g3,

Download the latest version and take a look here: https://github.com/mariocandela/beelzebub?tab=readme-ov-file#honeypot-llm-honeypots

Happy hacking ❤️

Cheers

Mario

@mariocandela mariocandela pinned this issue Jul 21, 2024
@mariocandela mariocandela reopened this Jul 21, 2024
@t3chn0m4g3
Contributor Author

t3chn0m4g3 commented Jul 22, 2024

@mariocandela Closing this is fine, thank you very much for taking this on with warp speed 🤩🚀

I can now start testing; if everything works out, I will be happy to integrate this into T-Pot.

Thanks again ❤️

@t3chn0m4g3
Contributor Author

Getting started 👋

@mariocandela
Owner

@t3chn0m4g3 If I can help you in any way, write to me; I am happy to help you 😊

@t3chn0m4g3
Contributor Author

@mariocandela Thank you, highly appreciated 😍

@t3chn0m4g3
Contributor Author

Lab is set up, now getting to work :)

@t3chn0m4g3
Contributor Author

t3chn0m4g3 commented Aug 26, 2024

@mariocandela - In plugins/llm-integration.go I can see the ollamaEndpoint as a const ...

```go
const (
	systemPromptVirtualizeLinuxTerminal = "You will act as an Ubuntu Linux terminal. The user will type commands, and you are to reply with what the terminal should show. Your responses must be contained within a single code block. Do not provide explanations or type commands unless explicitly instructed by the user. Your entire response/output is going to consist of a simple text with \n for new line, and you will NOT wrap it within string md markers"
	systemPromptVirtualizeHTTPServer    = "You will act as an unsecure HTTP Server with multiple vulnerability like aws and git credentials stored into root http directory. The user will send HTTP requests, and you are to reply with what the server should show. Do not provide explanations or type commands unless explicitly instructed by the user."
	LLMPluginName                       = "LLMHoneypot"
	openAIGPTEndpoint                   = "https://api.openai.com/v1/chat/completions"
	ollamaEndpoint                      = "http://localhost:11434/api/chat"
)
```

... is there any option in the config or the CLI to overwrite the endpoint with a different host (assuming that Ollama will probably not reside on the honeypot)?

@t3chn0m4g3
Contributor Author

@mariocandela Reading helps... sorry...

```yaml
apiVersion: "v1"
protocol: "ssh"
address: ":2222"
description: "SSH Ollama Llama3"
commands:
  - regex: "^(.+)$"
    plugin: "LLMHoneypot"
serverVersion: "OpenSSH"
serverName: "ubuntu"
passwordRegex: "^(root|qwerty|Smoker666|123456|jenkins|minecraft|sinus|alex|postgres|Ly123456)$"
deadlineTimeoutSeconds: 60
plugin:
  llmModel: "llama3"
  host: "http://example.com/api/chat" # default: http://localhost:11434/api/chat
```
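The plugin.host key above overrides the compiled-in ollamaEndpoint default. Conceptually the fallback works like this (a minimal sketch with hypothetical names, not the actual Beelzebub code):

```go
package main

import "fmt"

// defaultOllamaEndpoint mirrors the ollamaEndpoint const quoted earlier.
const defaultOllamaEndpoint = "http://localhost:11434/api/chat"

// resolveEndpoint returns the host from the service YAML when set,
// otherwise the compiled-in default (hypothetical helper for illustration).
func resolveEndpoint(configuredHost string) string {
	if configuredHost != "" {
		return configuredHost
	}
	return defaultOllamaEndpoint
}

func main() {
	fmt.Println(resolveEndpoint(""))                            // falls back to the default
	fmt.Println(resolveEndpoint("http://example.com/api/chat")) // override from config wins
}
```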

@mariocandela
Owner

mariocandela commented Aug 26, 2024

@t3chn0m4g3 Don't worry mate, you can find me here for anything 😄

ps: These days I'm working on the documentation ❤️

Thanks for your time!

@t3chn0m4g3
Contributor Author

Thank you @mariocandela! ❤️

@t3chn0m4g3
Contributor Author

t3chn0m4g3 commented Aug 28, 2024

@mariocandela It is working 🚀😁❤️

I have some issues (T-Pot => Elastic, Kibana objects) with the logging format. Currently tr.TraceEvent stores the events as nested JSON objects...

```go
tr.TraceEvent(tracer.Event{
	Msg:         "New SSH Session",
	Protocol:    tracer.SSH.String(),
	RemoteAddr:  sess.RemoteAddr().String(),
	Status:      tracer.Start.String(),
	ID:          uuidSession.String(),
	Environ:     strings.Join(sess.Environ(), ","),
	User:        sess.User(),
	Description: beelzebubServiceConfiguration.Description,
	Command:     sess.RawCommand(),
})
```

... and does not split RemoteAddr into ip and port.

I am unfamiliar with Go, but I managed to get this to work as an example:

```go
remoteAddr, remotePort, err := net.SplitHostPort(sess.RemoteAddr().String())
if err != nil {
	remoteAddr = "unknown"
	remotePort = "unknown"
}

log.WithFields(log.Fields{
	"protocol": tracer.SSH.String(),
	"src_ip":   remoteAddr,
	"src_port": remotePort,
	"status":   tracer.Start.String(),
	"id":       uuidSession.String(),
	"environ":  strings.Join(sess.Environ(), ","),
	"user":     sess.User(),
	"plugin":   beelzebubServiceConfiguration.Description,
	"command":  sess.RawCommand(),
}).Info("New SSH Session")
```

There is probably an easier/better way 😅, maybe you have an idea? I don't know if these changes would lead to any problems on your end. If not, let me know, happy to contribute.
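The net.SplitHostPort approach above can be exercised in isolation. This standalone sketch wraps it in a small helper (the helper name is mine, not Beelzebub's) and shows that IPv6 remote addresses are handled too:

```go
package main

import (
	"fmt"
	"net"
)

// splitRemote splits an "ip:port" remote address into its parts,
// falling back to "unknown" when parsing fails, as in the snippet above.
func splitRemote(addr string) (ip, port string) {
	ip, port, err := net.SplitHostPort(addr)
	if err != nil {
		return "unknown", "unknown"
	}
	return ip, port
}

func main() {
	ip, port := splitRemote("203.0.113.7:51234")
	fmt.Println(ip, port) // 203.0.113.7 51234

	ip, port = splitRemote("[2001:db8::1]:22") // bracketed IPv6 also parses
	fmt.Println(ip, port)
}
```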

@mariocandela
Owner

mariocandela commented Aug 30, 2024

Hi @t3chn0m4g3,

Sorry for the delay, I added the two useful fields for Kibana :)

https://github.com/mariocandela/beelzebub/releases/tag/v3.2.4

To speed up the process I applied the change myself; your solution is valid 😄

Thanks for your time ❤️

@t3chn0m4g3
Contributor Author

Awesome! Thank you! 🤩
