I'm working on a real-time object detection mobile application in Kotlin. I trained a YOLOv5 model on a custom dataset of 4 classes, then converted the model to a .tflite file.
I used this line to convert the YOLOv5 model to TFLite:
!python /content/yolov5/export.py --weights best.pt --include tflite --nms --img 640 --data data.yaml

Output Tensor Details:
======================
Name: StatefulPartitionedCall:0
Shape: [ 1 100 4]
Type: <class 'numpy.float32'>
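
That trace shows only the first output tensor; the sample code below reads four output features, so the converted model exposes four outputs. To check the name, shape, and type of all of them from the app side, a minimal sketch like the following should work (assuming the model is bundled as an asset; "model.tflite" is an illustrative name):

Kotlin
import android.content.Context
import android.util.Log
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.channels.FileChannel

// Sketch: log the name, shape, and data type of every output tensor,
// to see whether any of them is INT32. "model.tflite" is an assumed asset name.
fun dumpOutputTensors(context: Context) {
    val fd = context.assets.openFd("model.tflite")
    val modelBuffer = FileInputStream(fd.fileDescriptor).channel
        .map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
    val interpreter = Interpreter(modelBuffer)
    for (i in 0 until interpreter.outputTensorCount) {
        val t = interpreter.getOutputTensor(i)
        Log.d("TFLite", "output $i: name=${t.name()} " +
                "shape=${t.shape().contentToString()} type=${t.dataType()}")
    }
    interpreter.close()
}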
The problem is that the mobile app keeps generating errors and crashes as soon as the splash screen appears. I don't know whether the problem is coming from the app or from the model.

What I have tried:

I used this sample Kotlin code provided with the model:
val model = ObjectDetection.newInstance(context)

// Creates inputs for reference.
val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 640, 640, 3), DataType.FLOAT32)
inputFeature0.loadBuffer(byteBuffer)

// Runs model inference and gets result.
val outputs = model.process(inputFeature0)
val outputFeature0 = outputs.outputFeature0AsTensorBuffer
val outputFeature1 = outputs.outputFeature1AsTensorBuffer
val outputFeature2 = outputs.outputFeature2AsTensorBuffer
val outputFeature3 = outputs.outputFeature3AsTensorBuffer

// Releases model resources if no longer used.
model.close()
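
The byteBuffer passed to loadBuffer above would typically be built from the camera frame. A minimal sketch of that preprocessing, assuming a Bitmap already scaled to 640×640 and 0..1 float normalization (the normalization is an assumption and has to match how the model was trained):

Kotlin
import android.graphics.Bitmap
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Sketch: pack a 640x640 ARGB_8888 Bitmap into a FLOAT32 NHWC buffer (RGB order).
// The 0..1 normalization is an assumption -- it must match the model's training.
fun bitmapToByteBuffer(bitmap: Bitmap): ByteBuffer {
    val buffer = ByteBuffer.allocateDirect(1 * 640 * 640 * 3 * 4)
    buffer.order(ByteOrder.nativeOrder())
    val pixels = IntArray(640 * 640)
    bitmap.getPixels(pixels, 0, 640, 0, 0, 640, 640)
    for (p in pixels) {
        buffer.putFloat(((p shr 16) and 0xFF) / 255f) // R
        buffer.putFloat(((p shr 8) and 0xFF) / 255f)  // G
        buffer.putFloat((p and 0xFF) / 255f)          // B
    }
    buffer.rewind()
    return buffer
}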

The error generated in the mobile app now is:
java.lang.AssertionError: TensorBuffer does not support data type: INT32

This error is generated by this line:
val outputs = model.process(inputFeature0)

However, I traced the model's output and found that it is of data type float32, so I don't understand why I'm getting this error.
Comments
Member 15950766 20-Mar-23 9:45am    
Can you provide more information about the steps you followed to convert the YOLOv5 model to TFLite?
Fatema Shawki 20-Mar-23 19:43pm    
I used the export script to convert from YOLOv5 to TensorFlow, then converted the TensorFlow SavedModel to TensorFlow Lite using tf.lite.TFLiteConverter:
1)
!python /content/yolov5/export.py --weights /content/yolov5/runs/train/exp/weights/best.pt --include tflite --img 640 --data data.yaml
2)
converter = tf.lite.TFLiteConverter.from_saved_model('/content/yolov5/runs/train/exp/weights/best_saved_model')
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)

1 solution

I suspect the problem may be the use of intArrayOf in the following line:
Kotlin
val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 640, 640, 3), DataType.FLOAT32)

which suggests that you are passing an array of INT32 values (1, 640, 640 and 3), even though you have declared the content to be FLOAT32.
See the TensorBuffer documentation in the TensorFlow Lite API reference for the correct usage.
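
Another thing worth checking: TensorBuffer only supports FLOAT32 and UINT8 contents, which is exactly what the assertion message is complaining about, so the error can also mean that one of the model's four output tensors is INT32 even though the first one is float32. In that case you can bypass the generated ObjectDetection wrapper and drive the raw Interpreter with plain arrays, which accept any tensor type. A minimal sketch, assuming the outputs are boxes [1,100,4], scores [1,100], classes [1,100], and an INT32 detection count [1] (the order and types are assumptions and must be verified against your own model, e.g. with Interpreter.getOutputTensor):

Kotlin
import org.tensorflow.lite.Interpreter
import java.nio.ByteBuffer

// Sketch: run inference with the raw Interpreter instead of the generated wrapper.
// Plain Kotlin arrays work for any tensor type, including INT32 outputs that
// TensorBuffer rejects. Output order, shapes, and types are assumptions --
// verify them with interpreter.getOutputTensor(i) for your own model.
fun detect(interpreter: Interpreter, input: ByteBuffer) {
    val boxes = Array(1) { Array(100) { FloatArray(4) } } // [1, 100, 4], FLOAT32
    val scores = Array(1) { FloatArray(100) }             // [1, 100], FLOAT32
    val classes = Array(1) { FloatArray(100) }            // [1, 100], FLOAT32
    val count = IntArray(1)                               // [1], INT32 (assumed)

    val outputs: Map<Int, Any> = mapOf(
        0 to boxes, 1 to scores, 2 to classes, 3 to count
    )
    interpreter.runForMultipleInputsOutputs(arrayOf(input), outputs)

    val n = count[0]
    // The first n entries of boxes[0], scores[0], and classes[0] are detections.
}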
 