Machine learning (ML) algorithms usually work on huge datasets and require enormous computing resources. At the same time, many mobile phones use face recognition to improve photographs or speech recognition for natural language translation. This is achieved by training machine learning models, in particular neural networks, on a powerful computer, often using a GPU (Graphics Processing Unit), while the trained model is scaled down and deployed on the mobile phone.
Not long ago, Google released TensorFlow, the artificial-intelligence library it uses for its own algorithms, as open source and thereby made it available to everybody. The full TensorFlow library is used to create and train models, while the TFLite library allows them to be scaled down and run on smaller machines. Only very recently has TFLite Micro (tflm) been released, which strips the system down even further to make it usable on embedded systems.
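The down-scaling step mentioned above can be sketched in a few lines of Python. The following is a minimal example, assuming TensorFlow 2.x is installed on the PC; the tiny model and the training data are placeholders chosen only to make the conversion runnable, not part of any project in this series.

```python
import numpy as np
import tensorflow as tf

# Placeholder task: learn y = 2x from a handful of samples.
x = np.linspace(-1, 1, 100).astype(np.float32).reshape(-1, 1)
y = 2.0 * x

# A very small Keras model, trained on the PC.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=50, verbose=0)

# Down-scaling: convert the trained model into a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable default quantization
tflite_model = converter.convert()

# The resulting bytes are what gets copied to the small device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print(len(tflite_model), "bytes")
```

The flatbuffer written here is the artifact that tflm later interprets on the micro-controller; the optimization flag shrinks it further through quantization.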
The TinyML approach is described in a book published by O'Reilly, which demonstrates how to use tflm models on three different micro-controllers and walks through four different projects.
The following pages show how to set up the PC to run TensorFlow, how to create models, and how to scale them down for use on the ESP32. A custom version of MicroPython must then be built, including the esp32-camera driver with its MicroPython interface, ulab (a stripped-down version of numpy and scipy), and the tflm library with its MicroPython access routines.
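Because ulab provides a numpy-compatible subset, code written against it can be developed and tested on the PC with the real numpy and then run unchanged on the board. A small sketch (the normalization step is an illustrative example, not code from this series):

```python
# On the custom MicroPython build, ulab supplies the numpy module;
# on the PC, fall back to the real numpy so the same code runs there.
try:
    from ulab import numpy as np
except ImportError:
    import numpy as np

# Typical pre-processing step before feeding data to a tflm model:
# normalize a buffer of sensor readings to zero mean and unit variance.
samples = np.array([0.1, 0.5, 0.9, 0.3], dtype=np.float32)
normalized = (samples - np.mean(samples)) / np.std(samples)
print(normalized)
```

This import pattern is a common way to keep one code base for both the desktop and the micro-controller.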
Finally, the installation of the TinyML examples on the ESP32 is explained.