
Using tensorflow in a C program

A project log for Raspberry pi tracking cam

Tracking animals on lower speed boards to replace jetson & coral

lion mclionhead 02/16/2022 at 20:24

TFlite models aren't supported by OpenCV's DNN module.  Instead, you have to build the tensorflow lite library for C++.  This is another port which seems to have been dropped in favor of focusing on python.

The journey begins by downloading an ARM64 release of bazel.  It might work on ARM 32, but the only prebuilt binary is ARM64.

https://github.com/bazelbuild/bazel/releases

It has to be made executable with chmod & then moved to /usr/bin/bazel.
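
Something along these lines, where the filename is just a placeholder for whichever release actually got downloaded:

chmod +x bazel-<version>-linux-arm64
mv bazel-<version>-linux-arm64 /usr/bin/bazel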

Then comes downloading the latest tensorflow release source code from 

https://github.com/tensorflow/tensorflow/releases

Then run python3 configure.py, at which point it says you have to downgrade bazel.  The lion kingdom tries bazel 3.7.2 instead.  Then tensorflow says bazel has to be at least 4.2.1, so the lion kingdom tries 4.2.1.

Use the defaults for all the config options.

Then 

bazel build -c opt //tensorflow/lite:libtensorflowlite.so

There isn't an install script.  It dumps libtensorflowlite.so deep inside /root/.cache/bazel/_bazel_root.

It has to be copied somewhere the dynamic linker can actually find it.
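
The exact path changes with the bazel cache hash, so the easiest move is to let find dig it out & copy it somewhere standard.  The /usr/lib destination is just one reasonable choice:

find /root/.cache/bazel -name libtensorflowlite.so
cp $(find /root/.cache/bazel -name libtensorflowlite.so | head -n 1) /usr/lib/
ldconfig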

Some header files are in tensorflow-2.8.0/tensorflow/lite

Other header files for 3rd party libraries are in ~/.cache/bazel

The example programs are in tensorflow-2.8.0/tensorflow/lite/examples/
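
For reference, the compile line ends up looking something like this.  The paths are assumptions, especially the flatbuffers include directory which lives under a hashed bazel cache directory, & tracker.cc is just a hypothetical source file:

g++ -O2 -o tracker tracker.cc \
    -I /root/tensorflow-2.8.0 \
    -I /root/.cache/bazel/_bazel_root/<hash>/external/flatbuffers/include \
    -L /usr/lib -ltensorflowlite -lpthread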

It's a much bigger deal to make it work in C than python, partly because there isn't a proper include & library structure.

The images are stretched to fit the 320x320 input layer, so unless the model is aspect ratio independent, the training set needs to be stretched the same way.  At such low resolution, small objects can disappear entirely.
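
For reference, a rough sketch of what the C++ side looks like, assuming a uint8 detection model with a 320x320x3 input.  The model filename & the output tensor order are assumptions that depend on how the model was exported:

#include <cstdint>
#include <memory>
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main()
{
    // load the .tflite flatbuffer & build an interpreter around it
    std::unique_ptr<tflite::FlatBufferModel> model =
        tflite::FlatBufferModel::BuildFromFile("model.tflite");
    tflite::ops::builtin::BuiltinOpResolver resolver;
    std::unique_ptr<tflite::Interpreter> interpreter;
    tflite::InterpreterBuilder(*model, resolver)(&interpreter);
    interpreter->AllocateTensors();

    // the camera frame has to be stretched or cropped to 320x320 RGB
    // & copied into the input tensor
    uint8_t *input = interpreter->typed_input_tensor<uint8_t>(0);
    // ... fill input with 320 * 320 * 3 bytes ...

    interpreter->Invoke();

    // SSD style detection models usually emit boxes, classes, scores, & count,
    // but the order depends on how the model was exported
    float *boxes = interpreter->typed_output_tensor<float>(0);
    float *scores = interpreter->typed_output_tensor<float>(2);
    return 0;
}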

The test with the 16:9 cam was nowhere close.

Cropping it to 1:1 made it pop, so it is aspect ratio dependent.  It was really good at tracking a lion once the aspect ratio matched the input layer.  It might even be outdoing face tracking.  It even got all the orientations that it couldn't get in 4:3.  The task is either stretching the training data or somehow reorganizing the 16:9 video to fill a 1:1 frame.
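
A rough sketch of the cropping option, assuming a packed RGB frame in memory with the usual 3 bytes per pixel:

#include <cstdint>
#include <cstring>

// copy the centered height x height square out of a width x height RGB frame
// (assumes width >= height)
void center_crop(const uint8_t *src, uint8_t *dst, int width, int height)
{
    int x0 = (width - height) / 2;
    for (int y = 0; y < height; y++)
        memcpy(dst + y * height * 3,
               src + (y * width + x0) * 3,
               height * 3);
}

The square still has to be scaled down to 320x320 before going into the input tensor, & cropping obviously throws away everything outside the center of the frame.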


In other news that surprised no one, the jetson nano page that everyone has been reloading changed from promising a Feb 19 restock to listing it as discontinued.

Interestingly, archive.org showed it still in production as recently as May 2021.

Nowadays, it's incomprehensible that an embedded GPU ever existed for such a low price.  If embedded GPUs ever come close to that performance again, they're going to be thousands of doll hairs.
