
Ezkl unable to use model from post-training quantization #942

Open
Mberic opened this issue Feb 20, 2025 · 0 comments
Labels
bug Something isn't working

Comments
Mberic commented Feb 20, 2025

Original model: https://huggingface.co/unity/sentis-face-landmarks

Script used to quantize it:

from onnxruntime.quantization import quantize_dynamic, QuantType

# Define input and output model paths
model_fp32 = "face_landmark.onnx"  # Original model
model_quant = "face_landmark_quantized.onnx"  # Quantized model

# Perform dynamic quantization
quantize_dynamic(
    model_input=model_fp32,
    model_output=model_quant,
    weight_type=QuantType.QInt8  # Quantize weights to int8
)

print(f"Quantized model saved as {model_quant}")

Ezkl CLI version: 19.0.7

After running ezkl gen-settings -M face_landmark_quantized.onnx, this error is produced:

[E] [2025-02-20 07:06:23:602, ezkl] - [graph] [tract] Failed analyse for node #197 "conv2d_1_quant" ConvHir

Mberic added the bug label on Feb 20, 2025