Comments (44)

sandeep1404 commented on July 22, 2024

Thanks @Tombana, it is working with Manjaro OS. BNN inference is finally running on the RPi using LCE. I really appreciate your continuous help and patience.

from compute-engine.

Tombana commented on July 22, 2024

@sandeep1404
It seems like the strip utility does not accept aarch64 libraries. Can you comment out the strip command in the build_pip_package.sh script and then try to run it again?
Once that succeeds, a .whl file should appear in the artifacts folder. You can copy that .whl file to your Raspberry Pi and then just use pip install ./filename.whl on it.

Hi there, I have the same problem with the Raspberry Pi OS Lite (64-bit), which was released on 2023-02-21. Is there any suggestion for which OS I should use?

Hi @bywmm, I recommend Manjaro Linux (easy setup) or Arch Linux (for more advanced users) for Raspberry Pi. These distributions usually come with the newest packages.

sandeep1404 commented on July 22, 2024

Hi @Tombana, sorry, I had removed "convert_saved_model" from that line but did not remove convert_keras_model. I commented out the lines above and rebuilt, and now I can install and import LCE. I will try to load the interpreter and run inference, and will post here if I face any issues. I really appreciate your patience in replying to all my queries. Thank you for the help.

Tombana commented on July 22, 2024

Hi @sandeep1404, thanks for trying out Larq Compute Engine!

The lce_benchmark_model binary requires a more recent C library than the one that comes with Raspbian. So you either need to install a more recent OS, or compile lce_benchmark_model on the Raspberry Pi itself. The instructions for that can be found at https://docs.larq.dev/compute-engine/build/arm/ under the tab 'Building with CMake'.

The documentation for doing inference is here: https://docs.larq.dev/compute-engine/inference/

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply. I have a few questions. I built a Larq model for image denoising and it works well. I have converted the model to tflite and now want to run inference on a Raspberry Pi using Larq Compute Engine. As I understand it, I need to perform the following steps; please correct me if I am wrong:

  1. Convert the image-denoising model to a tflite model using the Larq converter.
  2. Build with CMake on a Raspberry Pi 4 with a 64-bit OS.
  3. After building, perform inference in C++ following the documentation at https://docs.larq.dev/compute-engine/inference/

Q1. Are these steps correct? And can we run image-denoising inference on a Raspberry Pi, given that it is a kind of regression problem? All the examples I have seen run image classification on a Raspberry Pi with Larq Compute Engine.

Q2. I am also unable to get a clear picture of how to run inference on a custom model in C++; it is unclear from the documentation. I checked the examples at https://github.com/larq/compute-engine/tree/main/examples, but the process of running image-denoising inference on a Raspberry Pi with LCE is still not clear to me.

Q3. Is there any way to run inference in Python using LCE on a Raspberry Pi? If so, please let me know.

Please help me in this regard. Thanks in advance.

Tombana commented on July 22, 2024

Q1: This is correct.

Q2: The example is not for image classification in particular; it is for general inference. You still have to add code that generates the input to the neural network and code that handles the output.

Q3: This is possible using our "testing" interpreter, see here: https://docs.larq.dev/compute-engine/api/python/#interpreter . To get that to work on aarch64, please see this issue: #653
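To make the input/output part concrete, here is a minimal sketch of such code in Python using the testing interpreter. The file name model.tflite, the helper name run_denoising, and the batch shape are assumptions for illustration, not part of LCE:

```python
def run_denoising(tflite_path, noisy_batch):
    # LCE's Python "testing" interpreter; the import is deferred so this
    # file only needs larq_compute_engine where it actually runs (the Pi).
    from larq_compute_engine import testing

    # Load the flatbuffer produced earlier by the LCE converter.
    with open(tflite_path, "rb") as f:
        flatbuffer_model = f.read()

    interpreter = testing.Interpreter(flatbuffer_model, num_threads=1)
    # For denoising, the output is an image tensor rather than class scores,
    # so "handling the output" means saving or inspecting the images.
    return interpreter.predict(noisy_batch)

# Usage sketch: noisy_batch would be a float32 array, e.g. shape (8, H, W, 1).
```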

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply. So I should use the testing interpreter from the Python API, as described at https://docs.larq.dev/compute-engine/api/python/#interpreter. The interpreter accepts the tflite flatbuffer as input:

larq_compute_engine.testing.Interpreter(
    flatbuffer_model,
    num_threads=1,
    use_reference_bconv=False,
    use_indirect_bgemm=False,
    use_xnnpack=False,
)

lce_model = convert_keras_model(model)
interpreter = Interpreter(lce_model)
interpreter.predict(input_data, verbose=1)

Do I need to run all of this code on the Raspberry Pi for inference, since running it on my own CPU would not reflect the inference time on the Pi; is that correct?

Tombana commented on July 22, 2024

You can run the convert_keras_model part on your other computer. The interpreter.predict(..) part is what you run on the Raspberry Pi.
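That split could look roughly like this; export_lce_model is a hypothetical helper, and convert_keras_model needs the full LCE package (with the converter and its TensorFlow dependency), which is why that part runs on the desktop:

```python
# Desktop side: convert the Keras model and save the flatbuffer to disk.
def export_lce_model(model, out_path="model.tflite"):
    from larq_compute_engine import convert_keras_model  # needs full LCE + TF
    with open(out_path, "wb") as f:
        f.write(convert_keras_model(model))
    return out_path

# Raspberry Pi side: only the interpreter is needed.
# from larq_compute_engine import testing
# interpreter = testing.Interpreter(open("model.tflite", "rb").read())
# output = interpreter.predict(input_data)
```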

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply, I get it now. But what is the difference between doing inference in C++ as described at https://docs.larq.dev/compute-engine/inference and inference via the Python API at https://docs.larq.dev/compute-engine/api/python/#interpreter on the Raspberry Pi? Are they the same or different?

Also, as mentioned in issue #707, I am not able to install LCE on the Raspberry Pi 4. Without installing LCE, how can I load the interpreter and run inference on the RPi 4 using the Python API?

Thank you in advance.

Tombana commented on July 22, 2024

The C++ and Python versions do the same thing. However, the Python version has more overhead, so if latency is crucial you could use C++.
Indeed, the larq-compute-engine package is not available on the RPi4 by default, but #653 explains how you can compile it yourself.

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for answering all my queries patiently. I went through #653 and started building the pip package for LCE as described there, removing a few code lines as mentioned in the issue. I performed the following steps.

The bazel build using the command below completed successfully:

bazel build :build_pip_pkg --config=aarch64

After that the bazel-bin directory was created, but it is empty, and when I run the command below it gives a "no such file or directory" error. I am attaching a picture for reference.

[screenshot]

bazel-bin/build_pip_pkg artifacts --plat=linux_aarch64

What is the issue? Can you please help me resolve it, and how can I verify whether the pip package for LCE is installed or not?

CNugteren commented on July 22, 2024

I believe the issue might be the first log line output after you started bazel build :build_pip_pkg --config=aarch64:

WARNING: Download from https://storage.googleapis.com/mirror.tensorflow.org/... GET returned 404 Not Found

Do you have a working internet connection on your device? If so, perhaps just retry later.

sandeep1404 commented on July 22, 2024

Hi @CNugteren, thanks for the reply. My board is connected to the internet. I retried the build with bazel build :build_pip_pkg --config=aarch64, but it still gives the same warning as shown in the picture and then reports that the build completed successfully:

WARNING: Download from https://storage.googleapis.com/mirror.tensorflow.org/... GET returned 404 Not Found

Tombana commented on July 22, 2024

I think the 404 is not a problem; it just means it tried another mirror afterwards.

It is indeed strange that the bazel-bin directory is empty; maybe bazel behaves differently on aarch64.
Is there something in bazel-out?

What if you try run instead of build:

bazel run :build_pip_pkg --config=aarch64

This command will fail, which is expected, but afterwards the bazel-bin directory may no longer be empty.

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply. There are some files in the bazel-out directory, but I cannot see any files in bazel-bin, even when I use run instead of build; run completes, but there are still no files in bazel-bin. I am attaching a picture for reference. Please have a look and kindly help me in this regard. Thank you in advance.

[screenshot]

Tombana commented on July 22, 2024

Can you show us the contents of bazel-out/aarch64-opt (and maybe some of its subfolders)?
I expect that the bazel-bin contents might be hiding under bazel-out/aarch64-opt/bin/...

It might also be possible to run bazel info bazel-bin. That should print a folder. Can you also show the contents of that?

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the response. I checked the contents of bazel-out/aarch64-opt and its subfolders, and there is nothing in them. I also ran bazel info bazel-bin and checked the printed path; there is nothing inside that path either. I am attaching a picture for your reference. Thank you in advance.

[screenshot]

Tombana commented on July 22, 2024

I have searched a bit but I'm afraid I don't have a solution.

Two things you could try:

  • Since you're building on the Raspberry Pi itself, it may work if you remove the --config=aarch64 flag from bazel build.
  • Don't build build_pip_pkg; build only the Python interpreter target: bazel build //larq_compute_engine/tflite/python:interpreter

If the bazel-bin folder stays empty, I don't think there's more we can do.

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply. I followed all your steps, but I still cannot find anything in the bazel-bin directory. I did find the build_pip_package.sh file in the compute-engine source tree, but when I tried to execute it I got an error; I am attaching pictures for reference. Can you please have a look and help me debug this? Alternatively, is there any other way to run inference on the Raspberry Pi using the Python API? Any clear example would be very helpful. Thanks for the help.

[screenshots]

Tombana commented on July 22, 2024

The bazel errors that you get with bazel build //larq_compute_engine/tflite/python:interpreter are the problem. (I also think you may have modified some BUILD files to exclude //larq_compute_engine/tflite/python:interpreter from build_pip_pkg, otherwise this error would have shown up before.)

I think the problem might be the version of bazel. Can you try bazelisk instead of bazel?
Download it from https://github.com/bazelbuild/bazelisk/releases and then replace the bazel command with ./bazelisk

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for your suggestions; I really appreciate your patience in helping me. I downloaded bazelisk from the link and ran the build with bazelisk instead of bazel, but there are still no files in the bazel-bin directory. For the bazel error during bazel build //larq_compute_engine/tflite/python:interpreter, I am attaching snapshots of the BUILD file and the bazelisk run below:

[screenshots]

Tombana commented on July 22, 2024

I'm afraid I don't have any other ideas. I'm fairly sure the errors from bazel build //larq_compute_engine/tflite/python:interpreter are the main problem, but I don't know how to fix them; I've never tried to use bazel directly on a Raspberry Pi.
It might be easier to build the RPi package on your other computer instead of on the RPi itself. In that case you need --config=aarch64 so that it builds for aarch64.

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply. I will try the other way: building the package on another computer for the RPi. I will post if I face any further issues. Thank you for the help and for your patience in looking at the issues.

sandeep1404 commented on July 22, 2024

Hi @Tombana, I am confused about how to build the package on another computer for the RPi. Can you give me some insight into how to do that? Thanks in advance.

Tombana commented on July 22, 2024

It is the same steps as on the Pi itself, except that you use --config=aarch64 when you run bazel build, and when you run the Python wheel packaging step you use --plat=linux_aarch64, as described in #653.

sandeep1404 commented on July 22, 2024

Hi @Tombana, thanks for the reply. Is there any example or repo with an end-to-end guide for deploying a custom BNN model on a Raspberry Pi with Larq Compute Engine and running inference with the Python API? If so, could you please share it here? Thank you in advance.

Tombana commented on July 22, 2024

There's no such guide.

sandeep1404 commented on July 22, 2024

Hi @Tombana, I followed the instructions in #653 to build the pip package on Ubuntu 20.04. These are my steps.

Step 1: build with the command bazel build :build_pip_pkg --config=aarch64. The logs on my Ubuntu system are:

WARNING: Option 'java_toolchain' is deprecated
WARNING: Option 'host_java_toolchain' is deprecated
INFO: Analyzed target //:build_pip_pkg (9 packages loaded, 68 targets configured).
INFO: Found 1 target...
Target //:build_pip_pkg up-to-date:
  bazel-bin/build_pip_pkg
INFO: Elapsed time: 4.005s, Critical Path: 0.89s
INFO: 4 processes: 4 internal.
INFO: Build completed successfully, 4 total actions

After the successful build I found files in bazel-bin, whereas earlier, when building on the RPi, bazel-bin was empty.

Step 2: next I ran bazel-bin/build_pip_pkg artifacts --plat=linux_aarch64 on my computer; below are the logs:

++ uname -s
++ tr A-Z a-z
+ PLATFORM=linux
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ PIP_FILE_PREFIX=bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/
+ main artifacts --plat=linux_aarch64
+ DEST=artifacts
+ BUILD_FLAG=--plat=linux_aarch64
+ [[ -z artifacts ]]
+ mkdir -p artifacts
++ abspath artifacts
+++ dirname artifacts
++ cd .
+++ basename artifacts
++ echo /home/sandeep/compute-engine/artifacts
++ cd /home/sandeep/compute-engine
+ DEST=/home/sandeep/compute-engine/artifacts
+ echo '=== destination directory: /home/sandeep/compute-engine/artifacts'
=== destination directory: /home/sandeep/compute-engine/artifacts
++ mktemp -d -t tmp.XXXXXXXXXX
+ TMPDIR=/tmp/tmp.T5Njh7hkNJ
++ date
+ echo Tuesday 25 April 2023 10:45:43 PM IST : '=== Using tmpdir: /tmp/tmp.T5Njh7hkNJ'
Tuesday 25 April 2023 10:45:43 PM IST : === Using tmpdir: /tmp/tmp.T5Njh7hkNJ
+ echo '=== Copy Larq Compute Engine files'
=== Copy Larq Compute Engine files
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/setup.py /tmp/tmp.T5Njh7hkNJ
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/MANIFEST.in /tmp/tmp.T5Njh7hkNJ
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/README.md /tmp/tmp.T5Njh7hkNJ
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/LICENSE /tmp/tmp.T5Njh7hkNJ
+ is_linux
+ [[ linux == \l\i\n\u\x ]]
+ touch /tmp/tmp.T5Njh7hkNJ/stub.cc
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ rsync -avm -L '--exclude=*_test.py' bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/larq_compute_engine /tmp/tmp.T5Njh7hkNJ
building file list ... done
larq_compute_engine/
larq_compute_engine/__init__.py
larq_compute_engine/mlir/
larq_compute_engine/mlir/__init__.py
larq_compute_engine/mlir/python/
larq_compute_engine/mlir/python/__init__.py
larq_compute_engine/tflite/
larq_compute_engine/tflite/__init__.py
larq_compute_engine/tflite/python/
larq_compute_engine/tflite/python/__init__.py
larq_compute_engine/tflite/python/interpreter.py
larq_compute_engine/tflite/python/interpreter_base.py
larq_compute_engine/tflite/python/interpreter_wrapper_lite.so

sent 3,704,866 bytes  received 183 bytes  7,410,098.00 bytes/sec
total size is 3,703,194  speedup is 1.00
+ pushd /tmp/tmp.T5Njh7hkNJ
/tmp/tmp.T5Njh7hkNJ ~/compute-engine
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ echo '=== Stripping symbols'
=== Stripping symbols
+ chmod +w /tmp/tmp.T5Njh7hkNJ/larq_compute_engine/tflite/python/interpreter_wrapper_lite.so
+ strip -x /tmp/tmp.T5Njh7hkNJ/larq_compute_engine/tflite/python/interpreter_wrapper_lite.so
strip: Unable to recognize the format of the input file `/tmp/tmp.T5Njh7hkNJ/larq_compute_engine/tflite/python/interpreter_wrapper_lite.so'

These are the folders in bazel-bin after running the above command:

build_pip_pkg  build_pip_pkg.runfiles  build_pip_pkg.runfiles_manifest  examples  external  larq_compute_engine

I am not sure what to do next: which files need to be copied, where they are located, and how to install them on the RPi and run inference with the Python interpreter.

bywmm commented on July 22, 2024

The lce_benchmark_model binary requires a more recent C library than the one that comes with Raspbian. So you either need to install a more recent OS, or compile lce_benchmark_model on the Raspberry Pi itself. The instructions for that can be found here: https://docs.larq.dev/compute-engine/build/arm/ under the tab 'Building with CMake'

The documentation for doing inference is here: https://docs.larq.dev/compute-engine/inference/

Hi there, I have the same problem with the Raspberry Pi OS Lite (64-bit), which was released on 2023-02-21. Is there any suggestion for which OS I should use?

sandeep1404 commented on July 22, 2024

Hi @Tombana, I rebuilt the package after commenting out the strip command, and here are my logs for bazel-bin/build_pip_pkg artifacts --plat=linux_aarch64:

++ uname -s
++ tr A-Z a-z
+ PLATFORM=linux
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ PIP_FILE_PREFIX=bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/
+ main artifacts --plat=linux_aarch64
+ DEST=artifacts
+ BUILD_FLAG=--plat=linux_aarch64
+ [[ -z artifacts ]]
+ mkdir -p artifacts
++ abspath artifacts
+++ dirname artifacts
++ cd .
+++ basename artifacts
++ echo /home/sandeep/compute-engine/artifacts
++ cd /home/sandeep/compute-engine
+ DEST=/home/sandeep/compute-engine/artifacts
+ echo '=== destination directory: /home/sandeep/compute-engine/artifacts'
=== destination directory: /home/sandeep/compute-engine/artifacts
++ mktemp -d -t tmp.XXXXXXXXXX
+ TMPDIR=/tmp/tmp.klVBTWED6Y
++ date
+ echo Wednesday 26 April 2023 12:30:22 PM IST : '=== Using tmpdir: /tmp/tmp.klVBTWED6Y'
Wednesday 26 April 2023 12:30:22 PM IST : === Using tmpdir: /tmp/tmp.klVBTWED6Y
+ echo '=== Copy Larq Compute Engine files'
=== Copy Larq Compute Engine files
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/setup.py /tmp/tmp.klVBTWED6Y
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/MANIFEST.in /tmp/tmp.klVBTWED6Y
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/README.md /tmp/tmp.klVBTWED6Y
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/LICENSE /tmp/tmp.klVBTWED6Y
+ is_linux
+ [[ linux == \l\i\n\u\x ]]
+ touch /tmp/tmp.klVBTWED6Y/stub.cc
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ rsync -avm -L '--exclude=*_test.py' bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/larq_compute_engine /tmp/tmp.klVBTWED6Y
building file list ... done
larq_compute_engine/
larq_compute_engine/__init__.py
larq_compute_engine/mlir/
larq_compute_engine/mlir/__init__.py
larq_compute_engine/mlir/python/
larq_compute_engine/mlir/python/__init__.py
larq_compute_engine/tflite/
larq_compute_engine/tflite/__init__.py
larq_compute_engine/tflite/python/
larq_compute_engine/tflite/python/__init__.py
larq_compute_engine/tflite/python/interpreter.py
larq_compute_engine/tflite/python/interpreter_base.py
larq_compute_engine/tflite/python/interpreter_wrapper_lite.so

sent 3,704,866 bytes  received 183 bytes  7,410,098.00 bytes/sec
total size is 3,703,194  speedup is 1.00
+ pushd /tmp/tmp.klVBTWED6Y
/tmp/tmp.klVBTWED6Y ~/compute-engine
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ echo '=== Stripping symbols'
=== Stripping symbols
+ chmod +w /tmp/tmp.klVBTWED6Y/larq_compute_engine/tflite/python/interpreter_wrapper_lite.so
++ date
+ echo Wednesday 26 April 2023 12:30:22 PM IST : '=== Building wheel'
Wednesday 26 April 2023 12:30:22 PM IST : === Building wheel
+ python setup.py bdist_wheel --plat=linux_aarch64
warning: no files found matching '*.pyd' under directory 'larq_compute_engine'
/home/sandeep/anaconda3/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
+ cp dist/larq_compute_engine-0.11.0-cp310-cp310-linux_aarch64.whl /home/sandeep/compute-engine/artifacts
+ popd
~/compute-engine
+ rm -rf /tmp/tmp.klVBTWED6Y
++ date
+ echo Wednesday 26 April 2023 12:30:23 PM IST : '=== Output wheel file is in: /home/sandeep/compute-engine/artifacts'
Wednesday 26 April 2023 12:30:23 PM IST : === Output wheel file is in: /home/sandeep/compute-engine/artifacts

I got a .whl file in the artifacts folder and copied it to my Raspberry Pi, but when I run pip install ./filename.whl I get the following error: ERROR: larq_compute_engine-0.11.0-cp310-cp310-linux_x86_64.whl is not a supported wheel on this platform. This is the same error as in #653, even though I used --plat=linux_aarch64. Can you please help me resolve this issue? Thank you in advance.

[screenshot]

Tombana commented on July 22, 2024

It seems like --plat=linux_aarch64 might be ignored by the Python tools. Can you try updating the setuptools and wheel packages?

pip install setuptools wheel --upgrade

sandeep1404 commented on July 22, 2024

Hi @Tombana, I tried updating setuptools and wheel with pip install setuptools wheel --upgrade, but I am still facing the same issue as above: ERROR: larq_compute_engine-0.11.0-cp310-cp310-linux_x86_64.whl is not a supported wheel on this platform. What is the issue? Can you please help me resolve it? Thank you in advance.

[screenshot]

Tombana commented on July 22, 2024

I meant upgrading those packages on your Ubuntu machine, and then recreating the .whl file.

sandeep1404 commented on July 22, 2024

Hi @Tombana, I did as you said: I updated the packages on the Ubuntu machine, rebuilt, copied the new .whl file to my RPi, and ran pip install ./filename.whl, but I still get the same error. Here are my logs after updating and rebuilding on my Ubuntu machine:

++ uname -s
++ tr A-Z a-z
+ PLATFORM=linux
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ PIP_FILE_PREFIX=bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/
+ main artifacts --plat=linux_aarch64
+ DEST=artifacts
+ BUILD_FLAG=--plat=linux_aarch64
+ [[ -z artifacts ]]
+ mkdir -p artifacts
++ abspath artifacts
+++ dirname artifacts
++ cd .
+++ basename artifacts
++ echo /home/sandeep/compute-engine/artifacts
++ cd /home/sandeep/compute-engine
+ DEST=/home/sandeep/compute-engine/artifacts
+ echo '=== destination directory: /home/sandeep/compute-engine/artifacts'
=== destination directory: /home/sandeep/compute-engine/artifacts
++ mktemp -d -t tmp.XXXXXXXXXX
+ TMPDIR=/tmp/tmp.8seaMz24he
++ date
+ echo Wednesday 26 April 2023 03:20:00 PM IST : '=== Using tmpdir: /tmp/tmp.8seaMz24he'
Wednesday 26 April 2023 03:20:00 PM IST : === Using tmpdir: /tmp/tmp.8seaMz24he
+ echo '=== Copy Larq Compute Engine files'
=== Copy Larq Compute Engine files
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/setup.py /tmp/tmp.8seaMz24he
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/MANIFEST.in /tmp/tmp.8seaMz24he
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/README.md /tmp/tmp.8seaMz24he
+ cp bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/LICENSE /tmp/tmp.8seaMz24he
+ is_linux
+ [[ linux == \l\i\n\u\x ]]
+ touch /tmp/tmp.8seaMz24he/stub.cc
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ rsync -avm -L '--exclude=*_test.py' bazel-bin/build_pip_pkg.runfiles/larq_compute_engine/larq_compute_engine /tmp/tmp.8seaMz24he
building file list ... done
larq_compute_engine/
larq_compute_engine/__init__.py
larq_compute_engine/mlir/
larq_compute_engine/mlir/__init__.py
larq_compute_engine/mlir/python/
larq_compute_engine/mlir/python/__init__.py
larq_compute_engine/tflite/
larq_compute_engine/tflite/__init__.py
larq_compute_engine/tflite/python/
larq_compute_engine/tflite/python/__init__.py
larq_compute_engine/tflite/python/interpreter.py
larq_compute_engine/tflite/python/interpreter_base.py
larq_compute_engine/tflite/python/interpreter_wrapper_lite.so

sent 3,704,866 bytes  received 183 bytes  7,410,098.00 bytes/sec
total size is 3,703,194  speedup is 1.00
+ pushd /tmp/tmp.8seaMz24he
/tmp/tmp.8seaMz24he ~/compute-engine
+ is_windows
+ [[ linux =~ msys_nt*|mingw*|cygwin*|uwin* ]]
+ echo '=== Stripping symbols'
=== Stripping symbols
+ chmod +w /tmp/tmp.8seaMz24he/larq_compute_engine/tflite/python/interpreter_wrapper_lite.so
++ date
+ echo Wednesday 26 April 2023 03:20:00 PM IST : '=== Building wheel'
Wednesday 26 April 2023 03:20:00 PM IST : === Building wheel
+ python setup.py bdist_wheel --plat=linux_aarch64
warning: no files found matching '*.pyd' under directory 'larq_compute_engine'
/home/sandeep/anaconda3/lib/python3.10/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!

        ********************************************************************************
        Please avoid running ``setup.py`` directly.
        Instead, use pypa/build, pypa/installer, pypa/build or
        other standards-based tools.

        See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
        ********************************************************************************

!!
  self.initialize_options()
+ cp dist/larq_compute_engine-0.11.0-cp310-cp310-linux_aarch64.whl /home/sandeep/compute-engine/artifacts
+ popd
~/compute-engine
+ rm -rf /tmp/tmp.8seaMz24he
++ date
+ echo Wednesday 26 April 2023 03:20:01 PM IST : '=== Output wheel file is in: /home/sandeep/compute-engine/artifacts'
Wednesday 26 April 2023 03:20:01 PM IST : === Output wheel file is in: /home/sandeep/compute-engine/artifacts

Please have a look at it. Am I missing something or going wrong somewhere? Please help me in this regard. Thank you in advance.

Tombana commented on July 22, 2024

Maybe it's a Python version mismatch? I see that you built a Python 3.10 wheel on your Ubuntu machine; maybe the Raspberry Pi has another Python version? They have to be the same. If you want to build a wheel for another Python version, you can do so by running all the bazel steps from a Python virtualenv with that version, e.g. Python 3.9. In that case you have to run bazel clean --expunge first so that bazel reconfigures the Python version.
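One way to sanity-check this before copying a wheel over is to compare the wheel's cpXY tag against the Python that will install it. The helper below is hypothetical, not part of LCE or pip:

```python
import sys

def wheel_matches_interpreter(wheel_filename):
    # Wheel names look like: pkg-0.11.0-cp310-cp310-linux_aarch64.whl;
    # the cpXY tag must match the interpreter that runs `pip install`.
    tag = "cp%d%d" % (sys.version_info.major, sys.version_info.minor)
    return tag in wheel_filename.split("-")

# Run this on the Raspberry Pi: a cp310 wheel will not install under Python 3.9.
```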

sandeep1404 commented on July 22, 2024

Hi @Tombana, I created a new environment with Python 3.9 and rebuilt the packages. Now the .whl file installs successfully (larq_compute_engine-0.11.0), but when I try import larq_compute_engine as lce it throws a ModuleNotFoundError, as shown in the figure below. What are the next steps? Is LCE installed properly, and can I load the interpreter and run inference now? Sorry for troubling you with so many questions; this is part of my thesis work, so I need to resolve this and run inference. Thank you in advance.

[screenshot]

Tombana commented on July 22, 2024

As described in #653, you have to remove from larq_compute_engine.mlir.python.converter import convert_keras_model from __init__.py and remove convert_keras_model from __all__; that should fix this issue.

sandeep1404 commented on July 22, 2024

Hi @Tombana, I started running inference on the RPi. I converted the model to tflite on my computer and loaded the tflite file on the RPi, but I am getting the following error:

Traceback (most recent call last):
  File "/home/pi/larq/larq_bnn_test/larq_bnn_test.py", line 10, in <module>
    interpreter=lce.testing.Interpreter(
  File "/home/pi/.local/lib/python3.9/site-packages/larq_compute_engine/tflite/python/interpreter.py", line 48, in __init__
    from larq_compute_engine.tflite.python import interpreter_wrapper_lite
ImportError: /lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/pi/.local/lib/python3.9/site-packages/larq_compute_engine/tflite/python/interpreter_wrapper_lite.so)

I already have libc6 and libc6-dev installed:

libc6 is already the newest version (2.31-13+rpt2+rpi1+deb11u5).
libc6-dev is already the newest version (2.31-13+rpt2+rpi1+deb11u5).

I have no clue how to resolve this. GLIBC 2.31 is installed and is the latest version for this OS, but the wheel asks for GLIBC_2.32. Can you kindly help me resolve this issue? I am attaching a picture for reference. Thank you in advance.
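For anyone hitting the same error, the glibc version that the Python process sees can be checked with the standard library alone:

```python
import platform

# On a glibc-based Linux this prints e.g. "glibc 2.31"; the wheel above
# was built against a toolchain that references symbols from glibc 2.32.
libc_name, libc_version = platform.libc_ver()
print(libc_name, libc_version)
```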

[screenshot]

Tombana commented on July 22, 2024

You'll have to get an OS for your Raspberry Pi with a newer glibc, such as (probably) Manjaro or Arch.

The alternative would be to build the code with an older compiler so that it does not depend on the newer glibc. Unfortunately, using an older cross-compiler with bazel is not easy. That leaves the option of building on the Raspberry Pi itself, but that gave the "empty bazel-bin" issue.

sandeep1404 commented on July 22, 2024

Hi @Tombana, so does that mean I should use Manjaro OS on my Raspberry Pi? If I install it, do I need to build everything again and generate a new .whl file, or can I reuse the .whl file I already generated and run pip install ./filename.whl? Since the package was built for linux-aarch64, I would expect the older .whl file to work; is that right?

So the plan is: after installing Manjaro on my RPi, I install LCE from the earlier-generated .whl file and run inference. Please correct me if I am wrong. Or can I stay on the current OS and resolve the GLIBC issue? You mentioned that the older-compiler route leads to the "empty bazel-bin" issue, so is switching to the new OS the best option?

Tombana commented on July 22, 2024

That is all correct. You can use Manjaro OS on your Raspberry Pi. If you go to https://manjaro.org/download/ you can select Raspberry Pi as the device and then choose a desktop environment. You can use the .whl file that you built before on your Ubuntu machine; there is no need to rebuild it.

sandeep1404 commented on July 22, 2024

Hi @Tombana, I have a question about the flags of the LCE interpreter, which I call as follows:

larq_compute_engine.testing.Interpreter(
    flatbuffer_model,
    num_threads=1,
    use_reference_bconv=False,
    use_indirect_bgemm=False,
    use_xnnpack=False,
)

What do the flags use_reference_bconv, use_indirect_bgemm and use_xnnpack do? I also have a question about num_threads. The Raspberry Pi has 4 cores and supports multithreading. With num_threads=1 I get an inference time of 0.12 s for a batch of 8 images, while with num_threads=2, 3 or 4 the inference time is almost the same in all three cases, about 0.1 s for a batch of 8 images. Why is that? As the number of threads increases, execution becomes more parallel and inference should get faster; please correct me if I am wrong. Also, what is the maximum value of num_threads we can use on a Raspberry Pi 4? Thank you in advance.

Tombana commented on July 22, 2024

  • use_reference_bconv should be left False. It is only used for unit tests and uses a slow implementation.
  • use_indirect_bgemm can often result in faster inference for binary layers, so you should try both options and see which one is faster.
  • use_xnnpack specifies whether XNNPACK should be used to run all the non-binary layers. Usually it's faster to enable it, but again this should be tested.

As for num_threads, you are right that inference should become faster with more threads, and it should be at most 4.
However, the threads are not used across the batch dimension; they are used within each individual input in the batch.
It is common not to get a 4x speedup with 4 threads: if the input sizes are not large enough, there is a lot of overhead from the parts of the computation that are not parallelized.

By the way, now that you have Manjaro, you can also use lce_benchmark_model again, the one that didn't work originally. Then there's less python overhead and it should give more accurate benchmarks.
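A simple way to compare these options empirically is a small timing helper. The sketch below is library-independent (any predict callable can be passed in); the interpreter construction in the usage comment is illustrative, not a prescribed configuration:

```python
import time

def average_latency(predict_fn, input_data, runs=10, warmup=2):
    # Warm-up calls absorb one-time costs (allocation, lazy initialization).
    for _ in range(warmup):
        predict_fn(input_data)
    start = time.perf_counter()
    for _ in range(runs):
        predict_fn(input_data)
    return (time.perf_counter() - start) / runs

# Usage sketch with the LCE testing interpreter:
# for threads in (1, 2, 4):
#     interp = testing.Interpreter(lce_model, num_threads=threads,
#                                  use_indirect_bgemm=True)
#     print(threads, average_latency(interp.predict, noisy_batch))
```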

sandeep1404 commented on July 22, 2024

Hi @Tombana, thank you for your suggestions. I will try lce_benchmark_model again and observe the results.
