I am connecting Fabric to an internal LLM that exposes an Ollama-compatible interface, but I'm noticing that when Fabric is called with a pattern, the prompt appears to be sent with the system role only. Our internal tool rejects any request that lacks a user-role message. Is there an option I'm missing that would send the prompt as a user message instead? I could hack around it by running `--dry-run` and filtering off the dry-run lines before piping the result straight back into `fabric`, but that seems really wacky.
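For context, the failure mode here is about the shape of the chat payload. Below is a minimal sketch of an Ollama-style `/api/chat` request; the endpoint, model name, and message contents are illustrative placeholders, not taken from Fabric's source. A gateway that requires a user-role message would reject a request whose `messages` array contains only the `system` entry.

```sh
# Sketch of an Ollama-style /api/chat request (endpoint and model name
# are placeholders). If the pattern is sent only as the "system" message,
# with no "user" entry, a gateway that requires a user-role message will
# reject the request.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    {"role": "system", "content": "<pattern text would go here>"},
    {"role": "user", "content": "<the piped input would go here>"}
  ]
}'
```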
Replies: 3 comments

- We need more info to help you debug this. Which Ollama model are you using? What is the prompt? Can you show the exact `fabric` command you are using? When you try `--dry-run`, what is being sent to the LLM?
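As a concrete example of that last check, a hypothetical invocation (the pattern name and input text are placeholders):

```sh
# Print what would be sent to the LLM without actually calling it.
# "summarize" and the input text are placeholders.
echo "some input text" | fabric --pattern summarize --dry-run
```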
- @naven87, can you try `--raw`? It will likely solve your issue.
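For reference, a sketch of the suggested invocation (pattern name and input are again placeholders); as I understand it, `--raw` causes the pattern content to be sent in the user role rather than as a system-only message, which is why it can help here:

```sh
# Hedged sketch of the suggested fix (placeholder pattern name/input).
# --raw should send the pattern in the user role instead of as a
# system-only message.
echo "some input text" | fabric --pattern summarize --raw
```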
- It's not a public-facing interface, so it's not something you can check, but `--raw` solved it. Thanks.