A deep-learning Android app that detects and classifies fruits, vegetables, and flowers, and provides relevant information about each.
This is an example application for TensorFlow Lite on Android. It uses image classification to continuously classify whatever it sees from the device's back camera.
Inference is performed using the TensorFlow Lite Java API. The demo app classifies frames in real time, displaying the top most probable classifications. It allows the user to choose between a floating-point or quantized model, select the thread count, and decide whether to run on the CPU, on the GPU, or via NNAPI.
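As a minimal sketch, not the app's actual source, of how these runtime options map onto the TensorFlow Lite Java API (the helper class and method names below are placeholders chosen for illustration):

```java
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;
import org.tensorflow.lite.nnapi.NnApiDelegate;

import java.nio.MappedByteBuffer;

// Hypothetical helper illustrating the user-selectable options described above.
class ClassifierOptionsSketch {
  static Interpreter createInterpreter(MappedByteBuffer model,
                                       int numThreads,
                                       String device) {
    Interpreter.Options options = new Interpreter.Options();
    options.setNumThreads(numThreads);            // user-selected thread count
    if ("GPU".equals(device)) {
      options.addDelegate(new GpuDelegate());     // run supported ops on the GPU
    } else if ("NNAPI".equals(device)) {
      options.addDelegate(new NnApiDelegate());   // delegate to Android NNAPI
    }
    // "CPU" needs no delegate: plain CPU execution with the chosen thread count.
    return new Interpreter(model, options);
  }
}
```

Switching between the floating-point and quantized models amounts to loading a different model file into the same interpreter setup.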
These instructions walk you through building and
running the demo on an Android device. For an explanation of the source, see
TensorFlow Lite Android image classification example.
Four models are bundled in this app: MobileNetV1 (float), MobileNetV1 (quantized), EfficientNet-Lite (float) and EfficientNet-Lite (quantized). Specifically, we chose "mobilenet_v1_1.0_224" and "efficientnet-lite0". MobileNets are well-established models, while EfficientNets are more recent work. The chosen EfficientNet (lite0) has speed comparable to MobileNetV1, and on the ImageNet dataset EfficientNet-lite0 outperforms MobileNetV1 by roughly 4% in top-1 accuracy.
For details of the models used, see Image classification. Downloading, extracting, and placing the models in the assets folder is managed automatically by download.gradle.
* Android Studio 3.2 (installed on a Linux, Mac or Windows machine)
* Android device in developer mode with USB debugging enabled
* USB cable (to connect the Android device to your computer)
Clone the TensorFlow examples GitHub repository to your computer to get the demo
application.
git clone https://github.com/tensorflow/examples
Open the TensorFlow source code in Android Studio. To do this, open Android Studio and select Open an existing project, setting the folder to examples/lite/examples/image_classification/android.
Select Build -> Make Project
and check that the project builds successfully.
You will need the Android SDK configured in the settings, with at least SDK version 23. The build.gradle file will prompt you to download any missing libraries.
The file download.gradle directs Gradle to download the models used in the example, placing them into the assets folder.
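As a rough sketch of how a model bundled in assets is consumed at runtime, the app can memory-map the .tflite file straight out of the APK. This is illustrative only; the file name you pass in is an assumption, not necessarily what download.gradle produces, and the real example organizes this code differently:

```java
import android.content.Context;
import android.content.res.AssetFileDescriptor;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Hypothetical helper for reading a bundled .tflite model from assets.
class ModelLoaderSketch {
  static MappedByteBuffer loadModelFile(Context context, String assetName)
      throws IOException {
    try (AssetFileDescriptor fd = context.getAssets().openFd(assetName);
         FileInputStream stream = new FileInputStream(fd.getFileDescriptor());
         FileChannel channel = stream.getChannel()) {
      // Map only the region of the APK that holds the model file.
      return channel.map(FileChannel.MapMode.READ_ONLY,
                         fd.getStartOffset(), fd.getDeclaredLength());
    }
  }
}
```

The returned buffer is the model argument accepted by the interpreter sketch shown earlier.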
Connect the Android device to the computer and be sure to approve any ADB permission prompts that appear on your phone. Select Run -> Run app. Choose the connected device as the deployment target; this will install the app on the device.
To test the app, open the app called TFL Classify on your device. When you run the app for the first time, it will request permission to access the camera.
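The prompt comes from the standard Android runtime-permission flow. A hedged sketch of that flow, with an arbitrary request code and helper name not taken from the app's source, looks like this:

```java
import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;

import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

// Hypothetical helper showing the first-run camera permission request.
class CameraPermissionSketch {
  private static final int REQUEST_CAMERA = 1;  // arbitrary request code

  static void requestCameraIfNeeded(Activity activity) {
    if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
      // Triggers the system permission dialog shown on first launch.
      ActivityCompat.requestPermissions(
          activity, new String[] {Manifest.permission.CAMERA}, REQUEST_CAMERA);
    }
  }
}
```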
Re-installing the app may require you to uninstall the previous installations.
Do not delete the contents of the assets folder. If you have deleted the model files, choose Build -> Rebuild to re-download them into the assets folder.