TensorFlow is a multipurpose machine learning framework. TensorFlow can be used anywhere from training huge models across clusters in the cloud, to running models locally on an embedded system like your phone.
This codelab uses TensorFlow Lite to run an image recognition model on an Android device.
If you don't have it installed already, go download and install Android Studio 4.1 Beta 1 or above while you are training your TensorFlow Lite model.
A simple camera app that runs a TensorFlow image recognition program to identify flowers.
Before kicking off the model training, start downloading and installing Android Studio 4.1 Beta 1 or above.
Open the Colab which shows how to train a classifier with Keras to recognize flowers using TensorFlow Lite transfer learning.
The following command will clone the Git repository containing the files required for this codelab:
git clone https://github.com/hoitab/TFLClassify.git
Next, go to the directory where you just cloned the repository. This is where you will be working for the rest of this codelab:
cd TFLClassify
If you don't have it installed already, go install Android Studio 4.1 Beta 1 or above.
Open the project with Android Studio by taking the following steps:
1. Select TFLClassify/build.gradle from your working directory.
2. Select TFL_Classify.start in the dropdown and press the run button.
To add the TensorFlow Lite model to the app:
1. Select the start module in the project explorer on the left hand side.
2. Right-click on the start module or click on File, then New > Other > TensorFlow Lite Model.
3. Select the location of the FlowerModel.tflite you downloaded earlier.
4. Click Finish.
The TODO list makes it easy to navigate to the exact location where you need to update the codelab. You can also use it in your own Android projects to remind yourself of future work. You can add TODO items with code comments that contain the keyword TODO. To access the list of TODOs:
1. Select View > Tool Windows > TODO.
2. By default the list shows TODOs across all modules; grouping it by Modules makes the items for this codelab easier to find.
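For instance, a hypothetical comment like the one below, placed anywhere in the module, would show up as an entry in that window:

// TODO: Replace the placeholder results with real model output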
Start with TODO 1: inside the ImageAnalyzer class, initialize the TensorFlow Lite model you just added:

private class ImageAnalyzer(ctx: Context, private val listener: RecognitionListener) :
ImageAnalysis.Analyzer {
...
// TODO 1: Add class variable TensorFlow Lite Model
private val flowerModel = FlowerModel.newInstance(ctx)
...
}
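You don't need to wire this analyzer up yourself; the start module already attaches it to CameraX. Purely for orientation, here is a rough sketch of how such an analyzer is typically attached (the cameraExecutor and recognitionListener names are illustrative, not the codelab's exact code):

// Needs androidx.camera.core.ImageAnalysis and a background Executor
val imageAnalysis = ImageAnalysis.Builder()
    // Analyze only the latest frame so inference never falls behind the camera
    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
    .build()
    .also {
        it.setAnalyzer(cameraExecutor, ImageAnalyzer(this, recognitionListener))
    }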
Inside the analyze method, convert the incoming ImageProxy into a Bitmap and create a TensorImage object for the inference process:

override fun analyze(imageProxy: ImageProxy) {
...
// TODO 2: Convert Image to Bitmap then to TensorImage
val tfImage = TensorImage.fromBitmap(toBitmap(imageProxy))
...
}
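The toBitmap() helper is already provided in the start module, so you do not have to write the conversion yourself. As a hedged sketch of what such a helper can look like, here is a simplified version that assumes YUV_420_888 frames whose third plane is already interleaved as VU (true on many, but not all, devices); it routes through an intermediate JPEG, which is simple but not the fastest path:

// Needs android.graphics.{Bitmap, BitmapFactory, ImageFormat, Rect, YuvImage}
// and java.io.ByteArrayOutputStream
private fun toBitmap(imageProxy: ImageProxy): Bitmap {
    val yBuffer = imageProxy.planes[0].buffer   // Y plane
    val vuBuffer = imageProxy.planes[2].buffer  // VU plane (NV21-style interleaving assumed)

    // Copy Y followed by interleaved VU into a single NV21 byte array
    val ySize = yBuffer.remaining()
    val vuSize = vuBuffer.remaining()
    val nv21 = ByteArray(ySize + vuSize)
    yBuffer.get(nv21, 0, ySize)
    vuBuffer.get(nv21, ySize, vuSize)

    // Encode the NV21 frame as JPEG, then decode it back into a Bitmap
    val yuvImage = YuvImage(nv21, ImageFormat.NV21, imageProxy.width, imageProxy.height, null)
    val out = ByteArrayOutputStream()
    yuvImage.compressToJpeg(Rect(0, 0, imageProxy.width, imageProxy.height), 90, out)
    val jpegBytes = out.toByteArray()
    return BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.size)
}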
Process the image with the trained model, sort the results by score with the highest probability first, and take the number of top results specified by MAX_RESULT_DISPLAY. You can optionally vary the value of this variable to get more or fewer results:

override fun analyze(imageProxy: ImageProxy) {
...
// TODO 3: Process the image using the trained model, sort and pick out the top results
val outputs = flowerModel.process(tfImage)
.probabilityAsCategoryList.apply {
sortByDescending { it.score } // Sort with highest confidence first
}.take(MAX_RESULT_DISPLAY) // take the top results
...
}
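MAX_RESULT_DISPLAY is a constant that already exists in the start module; as an illustration of the kind of declaration to look for (the value shown here is only an example, yours may differ):

// Maximum number of results shown in the list; raise or lower it to taste
const val MAX_RESULT_DISPLAY = 3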
Convert the top results into a list of Recognition objects ready to be consumed by the RecyclerView via Data Binding:

override fun analyze(imageProxy: ImageProxy) {
...
// TODO 4: Converting the top probability items into a list of recognitions
for (output in outputs) {
items.add(Recognition(output.label, output.score))
}
...
}
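The Recognition class is likewise already defined in the start module. In spirit it is a simple data holder along these lines (a sketch only; property names in the real file may differ):

// Label/confidence pair consumed by the RecyclerView adapter
data class Recognition(val label: String, val confidence: Float) {
    // Human-readable form such as "daisy / 92.3%" for display and logging
    override fun toString(): String =
        "$label / " + String.format("%.1f%%", confidence * 100.0f)
}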
Finally, comment out the placeholder block that generated fake results at the start of the codelab:

// START - Placeholder code at the start of the codelab. Comment this block of code out.
for (i in 0..MAX_RESULT_DISPLAY-1){
items.add(Recognition("Fake label $i", Random.nextFloat()))
}
// END - Placeholder code at the start of the codelab. Comment this block of code out.
With the real model wired in, select TFL_Classify.start and press the run button to try the app on your device.

TensorFlow Lite supports several hardware accelerators to speed up inference on your mobile device. GPU is one of the accelerators that TensorFlow Lite can leverage through a delegate mechanism, and it is fairly easy to use.
Open build.gradle in the start module, or click on TODO 5 under the TODO list, and add the following dependency:

// TODO 5: Optional GPU Delegates
implementation 'org.tensorflow:tensorflow-lite-gpu:2.2.0'
Then, inside the ImageAnalyzer class, set up the model options with the GPU device:

private class ImageAnalyzer(ctx: Context, private val listener: RecognitionListener) :
ImageAnalysis.Analyzer {
...
// TODO 6. Optional GPU acceleration
private val options = Model.Options.Builder().setDevice(Model.Device.GPU).build()
...
}
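Not every device has a GPU the delegate supports. If you want to guard against that, the same GPU package provides a CompatibilityList you can consult before choosing the device. A hedged sketch of that pattern (the CPU fallback and its thread count of 4 are example choices, not part of the codelab's required steps):

// Needs org.tensorflow.lite.gpu.CompatibilityList
private val compatList = CompatibilityList()

private val options = if (compatList.isDelegateSupportedOnThisDevice) {
    // Use the GPU delegate on supported devices
    Model.Options.Builder().setDevice(Model.Device.GPU).build()
} else {
    // Otherwise fall back to multi-threaded CPU inference
    Model.Options.Builder().setNumThreads(4).build()
}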
Then pass the options to the model's method input:

private class ImageAnalyzer(ctx: Context, private val listener: RecognitionListener) :
ImageAnalysis.Analyzer {
...
// TODO 1: Add class variable TensorFlow Lite Model
private val flowerModel = FlowerModel.newInstance(ctx, options)
...
}
Select TFL_Classify.start and press the run button again to see the app running with GPU acceleration.

Here are some links for more information: