Support Phi-4-mini and Phi-4-multimodal-instruct in LLM text-generation comps on gaudi mode #4223

Triggered via pull request March 4, 2025 06:17
@XinyaoWa synchronize #1335
Status: Success
Total duration: 27m 0s
Artifacts: 9

pr-microservice-test.yml

on: pull_request_target
job1 / Get-test-matrix (33s)
Matrix: Microservice-test
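
The run page shows only the workflow's shape: a pull_request_target trigger, a Get-test-matrix job, and a Microservice-test job that fans out over the computed matrix. The following is a minimal sketch of a workflow with that structure, not the repository's actual pr-microservice-test.yml; the step contents and the hard-coded matrix value are illustrative assumptions.

    # Hypothetical sketch: trigger and job names taken from the run summary above;
    # everything else is illustrative.
    name: pr-microservice-test

    on:
      pull_request_target:
        types: [opened, synchronize, reopened]

    jobs:
      Get-test-matrix:
        runs-on: ubuntu-latest
        outputs:
          matrix: ${{ steps.set-matrix.outputs.matrix }}
        steps:
          - id: set-matrix
            # Assumed step: derive the list of microservices to test
            # (hard-coded here for illustration).
            run: echo 'matrix=["llms/text-generation"]' >> "$GITHUB_OUTPUT"

      Microservice-test:
        needs: Get-test-matrix
        runs-on: ubuntu-latest
        strategy:
          matrix:
            service: ${{ fromJSON(needs.Get-test-matrix.outputs.matrix) }}
        steps:
          - uses: actions/checkout@v4
          - name: Run microservice test
            # Placeholder for the per-service test script.
            run: echo "Testing ${{ matrix.service }}"
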

Artifacts

Produced during runtime
Name                                                       Size
llms_text-generation_bedrock                               482 Bytes
llms_text-generation_native_on_intel_hpu                   459 Bytes
llms_text-generation_native_phi4_multimodal_on_intel_hpu   498 Bytes
llms_text-generation_native_phi4_on_intel_hpu              467 Bytes
llms_text-generation_predictionguard                       178 Bytes
llms_text-generation_service_ollama                        348 Bytes
llms_text-generation_service_tgi                           853 Bytes
llms_text-generation_service_tgi_on_intel_hpu              864 Bytes
llms_text-generation_service_vllm_on_intel_hpu             1.12 KB