TensorFlow Lite Support
Are bindings to TensorFlow Lite on the roadmap for this project?
I wrote some code to support minimal inferencing, but I haven't had time to push it forward recently. https://github.com/Nugine/tensorflow-lite
To use this crate, you may need to build tensorflow-lite manually, then use environment variables to specify how to link and where the libraries are located. This way your projects build quickly, without needing internet access or a separate installation step.
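As a rough illustration of that setup (not the crate's documented interface; the variable names TFLITE_LIB_DIR and TFLITE_LINK_KIND are placeholders I made up), a build script could pick up the location of the manually built library from the environment like this:

```rust
// build.rs (hypothetical sketch): TFLITE_LIB_DIR and TFLITE_LINK_KIND are
// assumed names, not the crate's actual configuration variables.
fn main() {
    // Re-run the build script when the relevant environment variables change.
    println!("cargo:rerun-if-env-changed=TFLITE_LIB_DIR");
    println!("cargo:rerun-if-env-changed=TFLITE_LINK_KIND");

    // Directory containing the manually built TensorFlow Lite C library.
    let lib_dir = std::env::var("TFLITE_LIB_DIR")
        .expect("set TFLITE_LIB_DIR to the directory containing the prebuilt library");

    // "static" or "dylib", depending on how tensorflow-lite was built.
    let link_kind = std::env::var("TFLITE_LINK_KIND").unwrap_or_else(|_| "dylib".into());

    println!("cargo:rustc-link-search=native={}", lib_dir);
    println!("cargo:rustc-link-lib={}=tensorflowlite_c", link_kind);
}
```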
Since the safety requirements are unclear, I cannot verify the soundness of this crate.
https://github.com/tensorflow/tensorflow/issues/49899 https://github.com/Nugine/tensorflow-lite/issues/1
This is more of a general comment and not directed specifically at you, but an important part of having support for TensorFlow Lite would be that downstream users don't need to worry about manually building tensorflow-lite or finding pre-compiled binaries. One of the big appeals of Rust is that cargo build or cargo build --target arm-linux-androideabi will Just Work without any extra hassle, which works because all the build complexity is handled by the tensorflow-lite-sys crate (e.g. linking to the correct library if installed, otherwise compiling from source using the code included in the crate).
Most well-designed C APIs are easy to write safe bindings for - the hard part is reliably (cross-)compiling and linking.
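To sketch the fallback pattern described above (this is illustrative only: the crate does not exist, and the environment variable, vendored source path, and library name are assumptions), a hypothetical tensorflow-lite-sys build.rs could first honor a user-provided library location and otherwise compile the vendored sources, e.g. with the cmake crate declared under [build-dependencies]:

```rust
// build.rs sketch for a hypothetical tensorflow-lite-sys crate.
// TFLITE_LIB_DIR and the vendored source path are assumptions for illustration.
fn main() {
    // 1. If the user points at an existing build, just link against it.
    if let Ok(lib_dir) = std::env::var("TFLITE_LIB_DIR") {
        println!("cargo:rustc-link-search=native={}", lib_dir);
        println!("cargo:rustc-link-lib=dylib=tensorflowlite_c");
        return;
    }

    // 2. Otherwise, compile the vendored C API sources with the `cmake` crate,
    //    so `cargo build --target ...` works offline and without a system install.
    //    (The exact output layout depends on the project's CMake install rules.)
    let dst = cmake::Config::new("third_party/tensorflow/tensorflow/lite/c")
        .define("CMAKE_BUILD_TYPE", "Release")
        .build();
    println!("cargo:rustc-link-search=native={}", dst.join("lib").display());
    println!("cargo:rustc-link-lib=static=tensorflowlite_c");
}
```

The point is just that the probe-or-build decision lives in the -sys crate's build script, so downstream users never see it.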
+1 This is quite helpful! (I need to run a deep learning model on Android and iOS with Rust, so tflite rather than tf itself is required.)
+1 :) Are there any updates?