How to use DeepSeek LLM via Azure? #17765
-
How to use DeepSeek LLM via Azure? Is this already implemented?
Replies: 4 comments 10 replies
-
Yes, the integration of DeepSeek LLM with Azure is implemented. The codebase includes Azure-specific dependencies and configurations for using DeepSeek LLM with Azure services.
-
@aiwalter were you successful in using the Azure DeepSeek model deployment?
-
I believe you can use `AzureAICompletionsModel`:

```python
import os

from llama_index.llms.azure_inference import AzureAICompletionsModel

llm = AzureAICompletionsModel(
    model_name="DeepSeek-R1",
    endpoint="https://<FOUNDRY ENDPOINT>.services.ai.azure.com/models",
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
)
```

Here's some doc:
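For the snippet above to work, the integration package has to be installed and the credential has to be present in the environment. Assuming a key-based credential copied from your Azure AI Foundry deployment (the key value below is a placeholder), the setup would look something like:

```shell
# Install the llama_index Azure AI inference integration
pip install llama-index-llms-azure-inference

# Key read by os.environ["AZURE_INFERENCE_CREDENTIAL"] in the Python snippet
export AZURE_INFERENCE_CREDENTIAL="<your Azure AI inference key>"
```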
-
@dosubot how can I deactivate the thinking output?
But in this case I would just want to have 'Hello! How can I assist you today? 😊'. Which parameter do I need to set?
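I'm not aware of a model parameter that suppresses this. DeepSeek-R1 emits its reasoning inside `<think>...</think>` tags in the completion text, so one client-side approach is to strip that block before using the response. The `strip_thinking` helper below is a hypothetical sketch, not part of llama_index:

```python
import re

def strip_thinking(text: str) -> str:
    """Remove the <think>...</think> reasoning block that DeepSeek-R1
    prepends to its answer, leaving only the final response."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

raw = "<think>The user greeted me, so I should greet back.</think>Hello! How can I assist you today? 😊"
print(strip_thinking(raw))  # Hello! How can I assist you today? 😊
```

You would apply this to the `text` of the completion returned by the LLM.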