TensorFlow-HRT

compile and build on firefly 3399

kaishijeng opened this issue Dec 30 '17 · 4 comments

I can compile TensorFlow and ComputeLibrary on the Firefly-RK3399. Is there any good reason to use a cross compiler instead of the native compiler on the Firefly-RK3399? Do you provide instructions on how to compile/build on the Firefly?

Thanks,

kaishijeng · Dec 30 '17 18:12

The main reason is build time. There is only 2 GB of memory on the Firefly, so building on a PC with more memory is much faster than building on the Firefly itself. A native build is possible, but not recommended.
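For reference, here is a minimal sketch of setting up a cross toolchain on an Ubuntu/Debian x86 host (the package names are an assumption for Ubuntu; the RK3399 runs 64-bit ARM, i.e. aarch64):

  # Install the GNU cross compilers targeting aarch64 (the RK3399's Cortex-A72/A53 cores)
  sudo apt-get update
  sudo apt-get install gcc-aarch64-linux-gnu g++-aarch64-linux-gnu
  # Sanity check: the cross compiler should report an aarch64 target
  aarch64-linux-gnu-gcc --version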

BR

xifengcun · Jan 01 '18 13:01

I am interested in the native build. Please upload the instructions.

Thanks,

kaishijeng · Jan 01 '18 18:01

The following procedure was tested 6 months ago. Patching Eigen is no longer required.

TensorFlow uses the Bazel build system to create the binaries; Bazel downloads the necessary packages from the internet automatically. However, when building TensorFlow on the Firefly (RK3399) board, there are several issues to resolve:

  1. Use the newest Bazel so the ARM architecture is detected correctly

  2. Insert an external USB stick as a swap partition

  3. Patch Eigen to fix compilation of Jacobi rotations with ARM NEON (not required now)

  4. Set resource limits for the Bazel build

Here are the detailed steps to build TensorFlow from source on the Firefly:

  1. Install Bazel from source

Please visit https://docs.bazel.build/versions/master/install-compile-source.html for the official guide.
Install OpenJDK: sudo apt-get install openjdk-8-jdk
Download the Bazel release: https://github.com/bazelbuild/bazel/releases/download/0.5.2/bazel-0.5.2-dist.zip
Unzip the archive and run bash ./compile.sh
Copy the generated output/bazel to /usr/bin
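A quick sanity check after installing (a minimal sketch; the reported version should match the release you built, 0.5.2 here):

  # Confirm the self-built Bazel is the one on PATH and that it runs on the board
  which bazel        # expected: /usr/bin/bazel
  bazel version      # expected to report 0.5.2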

  2. Add the memory swap partition

Insert a USB stick (>= 2 GB) into a USB port; suppose it shows up as /dev/sdx.
Format it as a swap partition: mkswap /dev/sdx

This command should print a UUID of the form XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX. If it does not, use blkid to get the UUID.
Add the swap entry to /etc/fstab by adding one line:
UUID=XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX none swap sw,pri=5 0 0
Enable the swap with: swapon -a
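Put together, the swap setup looks roughly like this (a sketch; /dev/sdx and the UUID are placeholders for your own stick, and sudo can be dropped if you are running as root):

  # Format the USB stick as swap (this destroys its contents) and note the printed UUID
  sudo mkswap /dev/sdx
  # Look up the UUID again if needed
  sudo blkid /dev/sdx
  # Make the swap persistent across reboots, then enable it
  echo 'UUID=XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX none swap sw,pri=5 0 0' | sudo tee -a /etc/fstab
  sudo swapon -a
  # Verify the extra swap is active
  free -h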

  3. Configure TensorFlow

Run ./configure and just select the default options.

  4. Patch Eigen (not required now)

The patch is included here and comes from: https://bitbucket.org/eigen/eigen/commits/d781c1de9834/
Download all needed external packages first: bazel fetch //tensorflow/examples/label_image
Change directory to where Eigen is unpacked: cd ~/.cache/bazel/bazel_xxx/xxxxxxx/external/eigenarchive
Apply the patch: patch -p1 < eigen.patch
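Steps 3 and 4 can be scripted roughly as follows (a sketch: feeding empty lines with yes "" to accept the configure defaults is an assumption about the interactive script of that era, and the bazel cache path keeps the same machine-specific placeholders as above):

  # Step 3: accept all default answers from ./configure (assumption: empty input selects the default)
  yes "" | ./configure
  # Step 4 (only for older checkouts that still need the Eigen fix)
  bazel fetch //tensorflow/examples/label_image
  cd ~/.cache/bazel/bazel_xxx/xxxxxxx/external/eigenarchive
  patch -p1 < eigen.patch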

  5. Use the command below to build TensorFlow

bazel build -c opt --local_resources 1024,1.0,1.0 --verbose_failures //tensorflow/examples/label_image
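For reference, --local_resources takes three values: available RAM in MB, CPU cores, and I/O capacity. The 1024,1.0,1.0 above deliberately throttles Bazel so the 2 GB board does not run out of memory. With the USB swap enabled you could cautiously raise the limits, for example (an assumption; tune to what your board tolerates):

  # Allow Bazel up to 2 GB RAM and 2 CPU cores; keep --verbose_failures for easier debugging
  bazel build -c opt --local_resources 2048,2.0,1.0 --verbose_failures //tensorflow/examples/label_image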

  6. Try label_image

If everything goes well, after about one hour label_image should be created at bazel-bin/tensorflow/examples/label_image.
Download the model data and the sample photo by:
curl -L "https://storage.googleapis.com/download.tensorflow.org/models/inception_v3_2016_08_28_frozen.pb.tar.gz" | tar -C tensorflow/examples/label_image/data -xz
Now, please run:
bazel-bin/tensorflow/examples/label_image/label_image
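If you want to point the binary at explicit files, a sketch of an invocation is below (the flag names and file names come from the upstream label_image example and are an assumption for this fork; adjust the paths to wherever you extracted the archive):

  # Classify the bundled sample photo with the downloaded Inception v3 graph
  bazel-bin/tensorflow/examples/label_image/label_image \
    --graph=tensorflow/examples/label_image/data/inception_v3_2016_08_28_frozen.pb \
    --labels=tensorflow/examples/label_image/data/imagenet_slim_labels.txt \
    --image=tensorflow/examples/label_image/data/grace_hopper.jpg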

xifengcun · Jan 04 '18 05:01

How do you set up ComputeLibrary in the TensorflowOnACL configuration? I don't see this in your native compilation instructions.

Thanks,

kaishijeng · Jan 04 '18 18:01