integrations/tflite/ #8856
Replies: 12 comments 44 replies
-
I get the error "Mobile SSD models are expected to have exactly 4 outputs, found 1" when trying to integrate the model on Android; I already added metadata for the input. How am I supposed to handle this, and can you provide an example?
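That "exactly 4 outputs" message usually comes from detectors that expect SSD-style models with four output tensors (boxes, classes, scores, count), such as the TFLite Task Library's ObjectDetector, whereas a YOLOv8 detection export has a single output tensor you post-process yourself. A minimal sketch (assuming TensorFlow is installed; the file name `yolov8n_float32.tflite` is only an example, use your own exported model) to confirm what the exported model actually exposes:

```python
# Print the TFLite model's input/output tensors. A YOLOv8 detection export
# typically shows one output such as [1, 84, 8400], not four SSD-style tensors.
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="yolov8n_float32.tflite")  # assumed path
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print("input:", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"], detail["dtype"])
```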
-
"TensorFlow SavedModel: export failure ❌ 49.4s: generic_type: cannot initialize type "StatusCode": an object with that name is already defined" Im having this error occur when I try to export my yolov8 weights to tflite format. Any idea how to solve this? |
-
I am trying to export the default YOLOv8 model (yolov8n.pt) to TFLite, but after running the code below I get an error.

```python
from ultralytics import YOLO

# Load a model
model = YOLO('yolov8n.pt')  # build a new model from YAML

# Export the model
model.export(format='tflite')
```

The trace log is below (truncated):

```
... What you should do instead is wrap ...
ERROR: input_onnx_file_path: yolov8n.onnx
```
-
Hey, I ran into a problem. My conda list includes absl-py 2.1.0. When I convert the .pt to .tflite, the export log shows:

PyTorch: starting from 'E:\Lady_YOLO\YAML\yolov8n.pt' with input shape (1, 3, 160, 128) BCHW and output shape(s) (1, 84, 420) (6.2 MB)

This is my code (the model is the official one):

```python
from ultralytics import YOLO
```
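The input shape (1, 3, 160, 128) in that log suggests the export was run with a rectangular imgsz. Since the pasted code is cut off after the import, here is a hedged reconstruction of what such an export call might look like; the imgsz values are an assumption read off the log, and the weight path is the one shown there:

```python
from ultralytics import YOLO

# Load the official yolov8n weights (path taken from the log above)
model = YOLO(r"E:\Lady_YOLO\YAML\yolov8n.pt")

# Export with a rectangular input size (height, width) -- assumed from the
# (1, 3, 160, 128) BCHW shape reported during export
model.export(format="tflite", imgsz=(160, 128))
```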
-
I am trying to run a YOLOv8 model with TFLite in C++ on Linux ARMv8l. May I know whether TFLite supports this, and which version supports it? Would appreciate any help.
-
Hi, I trained a YOLOv8 model on a custom dataset with 26 classes, but when I convert the model to TFLite I noticed that its output shape is [1, 30, 8400], and this is what causes the error I get when using the model with Flutter.
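For a 26-class model the [1, 30, 8400] shape is expected: each of the 8400 candidate boxes carries 4 box values (x, y, w, h) plus 26 class scores, and the consuming side has to decode that layout (and apply NMS) itself. As a reference for what the Flutter post-processing needs to mirror, a minimal decoding sketch in Python; the file name and the zero-filled placeholder input are assumptions:

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_float32.tflite")  # assumed path
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Placeholder input; in practice feed a letterboxed RGB image, float32, scaled to 0..1
img = np.zeros(inp["shape"], dtype=np.float32)
interpreter.set_tensor(inp["index"], img)
interpreter.invoke()

pred = interpreter.get_tensor(out["index"])[0]   # (30, 8400)
boxes = pred[:4].T                               # (8400, 4) -> x, y, w, h
scores = pred[4:].T                              # (8400, 26) class scores
class_ids = scores.argmax(axis=1)
conf = scores.max(axis=1)
keep = conf > 0.25                               # confidence threshold; NMS still needed
print(boxes[keep].shape, class_ids[keep][:10], conf[keep][:10])
```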
-
Hello
-
Hi, the following is the content of my requirements.txt:

(...)

and I am using the following code for the conversion:

```python
from ultralytics import YOLO

# Load the YOLOv8 model
model = YOLO("current_best4.pt")

# Export the model to TFLite format
model.export(format="tflite", int8=True, data='./datasets/data.yaml')  # creates 'yolov8n_float32.tflite'

# Load the exported TFLite model
tflite_model = YOLO("yolov8n_int8.tflite")

# Run inference
results = tflite_model("./frame_365.jpg")
results.show()
```

Any help will be greatly appreciated.
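One possible source of confusion in the snippet above: the weights are 'current_best4.pt', so the int8 export will not be named 'yolov8n_int8.tflite'. As far as I can tell, `model.export()` returns the path of the file it writes, so a hedged variant (same weights and data yaml assumed) that avoids hard-coding the output name:

```python
from ultralytics import YOLO

# Export and capture the path of the file actually written, rather than
# guessing its name from the yolov8n examples
model = YOLO("current_best4.pt")
tflite_path = model.export(format="tflite", int8=True, data="./datasets/data.yaml")
print("exported to:", tflite_path)

# Load the exported TFLite model and run inference on one frame
tflite_model = YOLO(tflite_path)
results = tflite_model("./frame_365.jpg")
results[0].show()
```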
-
```python
import cv2
# ... (rest of the script not included in the paste)
cap.release()
```

For this code I'm getting this error:

TensorFlow SavedModel: export failure ❌ 47.0s: No module named 'tensorflow_lite_support'
-
Hi! I'm trying to convert my .pt model to .tflite but I get this error:

Ultralytics YOLOv8.2.73 🚀 Python-3.8.19 torch-2.1.2+cpu CPU (11th Gen Intel Core(TM) i5-1145G7 2.60GHz)
PyTorch: starting from 'runs\classify\train\weights\last.pt' with input shape (1, 3, 64, 64) BCHW and output shape(s) (1, 4) (2.8 MB)
TensorFlow SavedModel: starting export with tensorflow 2.13.0...

I have already tried to install the requirements indicated there, but I get a compatibility error. Can you help me please?
-
I trained a custom YOLOv8-seg model and converted it to TFLite, but when I import it into my Android Studio project it says that it's an invalid model. What should I do? The model works properly in Colab and segments the object in the image.
-
When exporting the pre-trained model to TFLite, my export crashes because the process tries to allocate 26 GB of RAM (exceeding the available limit).

As I understand it, the data parameter in the export is used to verify the quantization (the int8 parameter). When using the default coco8.yaml file, the process complains that the verification is not optimal (it wants at least 1000 images, but coco8 provides just 4). So to improve this, I thought: why not use the entire COCO dataset?

The question being: am I correct that I should use the entire COCO dataset if I want the quantization to become better, or should I just stick to the default? And if the answer is "yes, use the entire COCO set", how do I overcome the huge RAM allocation that requires? Is there a dataset with, say, 1000 images, instead of the 120K+ images of the entire COCO set?
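One possible middle ground, sketched below: int8 calibration only needs a representative sample of images, not the full 120K+ set, so a yaml whose splits list roughly 1000 images can be passed as `data`. This is only a sketch under assumptions: the local COCO paths, the txt-list layout, and the 80 placeholder class names would all need adjusting to your setup.

```python
import random
from pathlib import Path

from ultralytics import YOLO

# Gather candidate images from a local COCO val2017 download (path is an assumption)
images = sorted(Path("datasets/coco/images/val2017").glob("*.jpg"))
subset = random.sample(images, k=min(1000, len(images)))

# Write the subset as a txt list of image paths, one per line
list_file = Path("datasets/coco/calib1000.txt")
list_file.write_text("\n".join(str(p.resolve()) for p in subset) + "\n")

# Minimal dataset yaml pointing both splits at the subset; the class names
# here are placeholders and should be replaced with the real 80 COCO names
yaml_text = (
    "path: datasets/coco\n"
    "train: calib1000.txt\n"
    "val: calib1000.txt\n"
    "names:\n" + "".join(f"  {i}: class{i}\n" for i in range(80))
)
Path("coco_calib.yaml").write_text(yaml_text)

# Run the int8 export against the ~1000-image calibration subset
model = YOLO("yolov8n.pt")
model.export(format="tflite", int8=True, data="coco_calib.yaml")
```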
-
integrations/tflite/
Explore how to improve your Ultralytics YOLOv8 model's performance and interoperability using the TFLite export format suitable for edge computing environments.
https://docs.ultralytics.com/integrations/tflite/
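For reference, the basic export-and-run flow described on the linked docs page, in minimal form (the file name shown is the default produced for yolov8n and may sit inside the created `yolov8n_saved_model` folder; adjust for your own model):

```python
from ultralytics import YOLO

# Export the PyTorch model to TFLite
model = YOLO("yolov8n.pt")
model.export(format="tflite")  # creates 'yolov8n_float32.tflite'

# Load the exported TFLite model and run inference
tflite_model = YOLO("yolov8n_float32.tflite")
results = tflite_model("https://ultralytics.com/images/bus.jpg")
```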