Bug: Random inputs generated automatically in llama-cli #9456
Labels
bug-unconfirmed
low severity
What happened?
I am running the current model with this command:
./llama.cpp/llama-cli -m /home/piuser/Desktop/Abhrant/Meta-Llama-3-8B.Q4_K_S.gguf -n 512 --repeat_penalty 1.0 --color -i -r "User:" -f llama.cpp/prompts/chat-with-bob.txt
When I do this, the CLI starts and the conversation goes on normally. Sometimes, though, a user input appears automatically even when I am not typing anything.
For example:
I asked the question "what can you do? ". I never entered the input "I love you Bob."; it came up automatically after the answer to "what can you do? " was generated. Any idea why?
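For reference, here is the same invocation reformatted with line continuations, plus an explicit input prefix via `--in-prefix` (an existing llama-cli flag that inserts a string before each interactive input). Whether this changes the behavior is an assumption on my part, not a confirmed fix; the rest of the flags are unchanged from the command above.

```shell
# Same command as above, one flag per line for readability.
# --in-prefix " " is an added experiment (assumption, not a confirmed fix).
./llama.cpp/llama-cli \
  -m /home/piuser/Desktop/Abhrant/Meta-Llama-3-8B.Q4_K_S.gguf \
  -n 512 \
  --repeat_penalty 1.0 \
  --color \
  -i \
  -r "User:" \
  --in-prefix " " \
  -f llama.cpp/prompts/chat-with-bob.txt
```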
Name and Version
version: 3733 (1b28061)
built with cc (Debian 12.2.0-14) 12.2.0 for aarch64-linux-gnu
What operating system are you seeing the problem on?
Linux
Relevant log output