Here we discuss the real-time testing of our app on two Android devices.
In the previous article, we went over the TFLite model setup in the Android environment, up to a working demo application.
In this one, we’ll do some real-time testing on Android devices.
I tested the app with two different devices – a Samsung SM-A710FD and a Huawei MediaPad T3 10 – both running Android 7.0.
A piece of advice: don't test the app in your room, because anything and everything that emits or reflects light will be detected as lightning. That's exactly what we've trained the model to do. The almost 300 images in our dataset all show lightning striking at night – a black background with a bright object, which the model learns to classify as lightning. The same thing happens when you point your phone or tablet camera at a computer screen or at the light fixture on the ceiling. I suggest you test the app outside, at night, targeting something like street lights as a lightning substitute. I also ran a test with an iPhone: I opened a lightning image on it and pointed the Android device running the app at the screen. And yes – the app detected lightning!
Note that the targeting (camera pointing) should be abrupt, to mimic the sudden appearance of real lightning.
For manual testing, I took my Android phone, Huawei tablet, and iPhone (loaded with the lightning images) to my balcony. It was a beautiful, clear night. Then I ran the app on my Android device. Have a look at the resulting video and screenshots.
Same for the Huawei tablet – see video and screenshots.
Magic, right? As mentioned earlier, any emitted or reflected light is detected as lightning. To make the model more accurate, you can train it on more data. I did the training with approximately 300 images; give it a try with a larger dataset, say 1,000 images.
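If collecting 1,000 real lightning photos is impractical, one common way to grow a small dataset is simple augmentation. The sketch below is not part of this project's pipeline – it's a hypothetical example of generating three variants (mirror, brightness shift, translation) per image, which would take our ~300 images to about 1,200:

```python
import numpy as np

def augment(image, rng):
    """Produce three simple variants of one training image:
    a horizontal flip, a brightness shift, and a small translation."""
    variants = []
    # Mirrored strike – lightning has no preferred left/right orientation.
    variants.append(np.fliplr(image))
    # Random brightness shift, clipped to the valid 0..255 range.
    shift = rng.integers(-30, 31)
    variants.append(np.clip(image.astype(int) + shift, 0, 255).astype(np.uint8))
    # Shift the frame a few pixels down and right, padding with black.
    dy, dx = rng.integers(0, 10, size=2)
    shifted = np.zeros_like(image)
    shifted[dy:, dx:] = image[:image.shape[0] - dy, :image.shape[1] - dx]
    variants.append(shifted)
    return variants

rng = np.random.default_rng(42)
# Placeholder dataset standing in for the ~300 real lightning images.
dataset = [np.zeros((64, 64), dtype=np.uint8) for _ in range(300)]
augmented = [v for img in dataset for v in augment(img, rng)]
print(len(dataset) + len(augmented))  # 1200 images total
```

Augmented images should still look like plausible night shots – avoid transforms (e.g. inverting colors) that break the dark-background assumption the model depends on.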
In the next article, we’ll discuss the project outcome and "lessons learned" – how the approach we’ve followed can be utilized for similar detection tasks. Stay tuned!