I'm working on a real-time object detection mobile application in Kotlin. I trained a YOLOv5 model on a custom dataset with 4 classes, then converted the model to a .tflite file.
I used this line to convert the yolo model to tflite:
!python /content/yolov5/export.py --weights best.pt --include tflite --nms --img 640 --data data.yaml
Output Tensor Details:
======================
Name: StatefulPartitionedCall:0
Shape: [ 1 100 4]
Type: <class 'numpy.float32'="">
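For reference, the same details can also be dumped on-device with the raw Interpreter API. This is only a minimal sketch; the asset file name best-fp16.tflite is an assumption, use whatever the exported .tflite is called in your assets folder:

import android.content.Context
import android.util.Log
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil

// Logs the name, shape and data type of every input/output tensor of the bundled model
fun dumpTensorDetails(context: Context) {
    // File name is an assumption; adjust to the actual asset name
    val interpreter = Interpreter(FileUtil.loadMappedFile(context, "best-fp16.tflite"))
    for (i in 0 until interpreter.inputTensorCount) {
        val t = interpreter.getInputTensor(i)
        Log.d("TFLite", "input $i: ${t.name()} ${t.shape().contentToString()} ${t.dataType()}")
    }
    for (i in 0 until interpreter.outputTensorCount) {
        val t = interpreter.getOutputTensor(i)
        Log.d("TFLite", "output $i: ${t.name()} ${t.shape().contentToString()} ${t.dataType()}")
    }
    interpreter.close()
}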
The problem is that the mobile app keeps throwing errors and crashes as soon as the splash screen appears, and I don't know whether the problem is coming from the app or from the model.
What I have tried:
I used the sample Kotlin code provided with the model:
import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer
// ObjectDetection is the wrapper class generated by Android Studio's ML Model Binding

val model = ObjectDetection.newInstance(context)

// byteBuffer holds the preprocessed 1 x 640 x 640 x 3 FLOAT32 input (see the sketch after this block)
val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 640, 640, 3), DataType.FLOAT32)
inputFeature0.loadBuffer(byteBuffer)

// Run inference and read the four output tensors
val outputs = model.process(inputFeature0)
val outputFeature0 = outputs.outputFeature0AsTensorBuffer
val outputFeature1 = outputs.outputFeature1AsTensorBuffer
val outputFeature2 = outputs.outputFeature2AsTensorBuffer
val outputFeature3 = outputs.outputFeature3AsTensorBuffer

model.close()
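For completeness, byteBuffer above is meant to be the preprocessed camera frame. This is only a sketch of that preprocessing with the TFLite Support Library, assuming the model expects a 640x640 FLOAT32 RGB input scaled to [0, 1] (the bitmap parameter and the 0/255 normalization values are assumptions):

import android.graphics.Bitmap
import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.common.ops.NormalizeOp
import org.tensorflow.lite.support.image.ImageProcessor
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.support.image.ops.ResizeOp
import java.nio.ByteBuffer

// Resize the frame to 640x640 and scale pixel values from [0, 255] to [0, 1]
val imageProcessor = ImageProcessor.Builder()
    .add(ResizeOp(640, 640, ResizeOp.ResizeMethod.BILINEAR))
    .add(NormalizeOp(0f, 255f))
    .build()

fun preprocess(bitmap: Bitmap): ByteBuffer {
    // Load the camera frame into a FLOAT32 TensorImage and apply the processing pipeline
    var tensorImage = TensorImage(DataType.FLOAT32)
    tensorImage.load(bitmap)
    tensorImage = imageProcessor.process(tensorImage)
    // The resulting buffer matches the 1 x 640 x 640 x 3 FLOAT32 size expected by inputFeature0
    return tensorImage.buffer
}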
The error generated in the mobile app now is:
java.lang.AssertionError: TensorBuffer does not support data type: INT32
This error is generated by this line:
val outputs = model.process(inputFeature0)
However, when I traced the model's output I found that its data type is float32, so I don't understand why I'm getting this error.