Conversation

LogicalGuy77 (Contributor)

Fixes: #97

Summary:

The changes allow users to view ModelMesh-specific resources and status for each component of an InferenceService.

Testing:

  • Deploy an InferenceService in ModelMesh mode.
  • Navigate to its details page.
  • Verify that ModelMesh resources are displayed correctly for each component.
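The first testing step can be sketched with a minimal manifest like the following (the resource name and storageUri are placeholder assumptions; the serving.kserve.io/deploymentMode annotation is what selects ModelMesh mode in KServe):

```yaml
# Hypothetical test manifest; name and storageUri are placeholders.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-sklearn-isvc
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://example-bucket/sklearn/model
```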

@Griffin-Sullivan Griffin-Sullivan (Contributor) left a comment

Just one question, otherwise great work!

Comment on lines 324 to 330
# Try to find ModelMesh deployment (usually named modelmesh-serving)
try:
    modelmesh_deployment = api.get_custom_rsrc(
        **versions.K8S_DEPLOYMENT,
        namespace=namespace,
        name="modelmesh-serving",
    )
Contributor

Are you sure it's safe to assume this deployment is always named modelmesh-serving?

Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

You're correct to point that out. I did some digging, and modelmesh-serving is named per the template at: https://github.com/kserve/modelmesh-serving/blob/05994465603e8baa3d23bae126a085f5f8bb591f/config/internal/base/deployment.yaml.tmpl#L17

  name: {{.ServiceName}}-{{.Name}}
  labels:
    modelmesh-service: {{.ServiceName}}
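So the rendered deployment name is ServiceName and Name joined by a hyphen; a quick Python illustration (the values here are made-up placeholders, not the project's actual defaults):

```python
# Hypothetical rendering of the {{.ServiceName}}-{{.Name}} template above;
# both values are placeholders for illustration only.
service_name = "modelmesh-serving"
name = "predictor"
deployment_name = f"{service_name}-{name}"
print(deployment_name)  # modelmesh-serving-predictor
```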

@Griffin-Sullivan Griffin-Sullivan (Contributor) left a comment

Much more readable now. Just a couple of comments that I think you can adjust to make this even simpler; if it still works for you, we should be good.

Comment on lines 328 to 334
# Fallback: infer from model format (e.g., "sklearn" -> "mlserver-sklearn")
if (
    model_format := component_spec.get("model", {})
    .get("modelFormat", {})
    .get("name")
):
    return f"mlserver-{model_format.lower()}"
Contributor

I'm not sure you really need this. If an inference service doesn't specify the runtime, it'll never work 🤷

Contributor Author

Yup, removed this part and it works just fine.
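For the record, the removed fallback behaves as sketched below in isolation (infer_runtime is a hypothetical standalone wrapper around the snippet, not a function from the PR):

```python
def infer_runtime(component_spec: dict):
    """Map a model format name (e.g. "sklearn") to an MLServer runtime name."""
    if (
        model_format := component_spec.get("model", {})
        .get("modelFormat", {})
        .get("name")
    ):
        return f"mlserver-{model_format.lower()}"
    return None

print(infer_runtime({"model": {"modelFormat": {"name": "SKLearn"}}}))  # mlserver-sklearn
print(infer_runtime({}))  # None
```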

"""
default_name = "modelmesh-serving"
# A configurable environment variable is more practical than discovery
service_name = os.environ.get("MODELMESH_SERVICE_NAME", default_name)
Contributor

So the user is supposed to set this environment variable? Is there no other way to get the modelmesh service? It's probably OK to assume the user didn't change the name of the modelmesh service, IMO.

Contributor Author

I'll remove the env part then.

@LogicalGuy77 LogicalGuy77 changed the title Implement ModelMesh support in ISVC handling Implement ModelMesh support in ISVC handling and shift to python Black formatting Sep 2, 2025
@Griffin-Sullivan Griffin-Sullivan (Contributor) left a comment

Nice work!

@juliusvonkohout juliusvonkohout merged commit 6be7dc1 into kserve:master Sep 5, 2025
9 checks passed

Successfully merging this pull request may close these issues.

Cannot view endpoint details on Kubeflow
3 participants