
Comments (19)

Bryanx commented on July 20, 2024 (7 upvotes)

I solved this by using tflite_flutter and tflite_flutter_helper instead of this library. Here is a gist in case anyone is running into this as well: https://gist.github.com/Bryanx/b839e3ceea0f9647ffbc5f90e3091742.

from flutter_tflite.

yumemi-RyoShimizu commented on July 20, 2024 (2 upvotes)

I created an image labeling model with AutoML. Since the model should be quantized, I converted the image to uint8, but the following error was output:
Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).


securingsincity commented on July 20, 2024 (1 upvote)

Based on some searching through the issues last night, #53 and #59 are both related to this one. AutoML Vision Edge outputs a quantized tflite model.

Here are two images from Netron showing the differences between the quantized model and the MobileNet v2 model that flutter_tflite currently supports:

[Netron screenshots of the two models' input tensors]

Note that they are the same except one accepts a uint8 list and the other takes a float32 list.

I'm not entirely sure what would need to change on the flutter_tflite side to support this kind of model, but hopefully this helps.
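For context on what that uint8 input means: TFLite quantized tensors use an affine mapping between real values and uint8. A minimal Python sketch (the scale and zero-point below are hypothetical; real models store their own per tensor):

```python
# Affine quantization as used by TFLite: real = scale * (q - zero_point).

def quantize(real, scale, zero_point):
    """Map a real-valued input to the uint8 domain, clamping to [0, 255]."""
    q = round(real / scale) + zero_point
    return max(0, min(255, q))

def dequantize(q, scale, zero_point):
    """Recover the approximate real value from a uint8 tensor element."""
    return scale * (q - zero_point)

# A common choice for image inputs scaled to [0, 1]:
scale, zero_point = 1.0 / 255.0, 0
print(quantize(1.0, scale, zero_point))  # 255
```

This is why the two graphs differ only in input dtype: the quantized one expects raw uint8 values, while the float one expects already-normalized float32 values.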


waltermaldonado commented on July 20, 2024 (1 upvote)

From what I have seen, if you use the method that runs detections on binary data, you can use a quantized model. In fact, the image-to-ByteList conversion suggested in the docs already uses an 8-bit integer as the unit, as you can see below:

Uint8List imageToByteListUint8(img.Image image, int inputSize) {
  var convertedBytes = Uint8List(1 * inputSize * inputSize * 3);
  var buffer = Uint8List.view(convertedBytes.buffer);
  int pixelIndex = 0;
  for (var i = 0; i < inputSize; i++) {
    for (var j = 0; j < inputSize; j++) {
      var pixel = image.getPixel(j, i);
      buffer[pixelIndex++] = img.getRed(pixel);
      buffer[pixelIndex++] = img.getGreen(pixel);
      buffer[pixelIndex++] = img.getBlue(pixel);
    }
  }
  return convertedBytes.buffer.asUint8List();
}

This conversion should work for a quantized model, but not for a non-quantized one: for float32 input, the convertedBytes buffer needs to be 4 times the size used here, since each value takes 4 bytes instead of 1.
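The 4x factor follows directly from the element widths; a quick Python check (assuming the MobileNet input size of 224 used in the example):

```python
import struct

# Byte sizes for a 1 x 224 x 224 x 3 input buffer, as fed to runModelOnBinary.
input_size = 224
values = 1 * input_size * input_size * 3

uint8_bytes = values * struct.calcsize('B')    # 1 byte per channel value
float32_bytes = values * struct.calcsize('f')  # 4 bytes per channel value

print(uint8_bytes, float32_bytes, float32_bytes // uint8_bytes)  # 150528 602112 4
```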

When I use the detections on image path it works perfectly.

Edit:
For non-quantized models the docs suggest:

Uint8List imageToByteListFloat32(
    img.Image image, int inputSize, double mean, double std) {
  var convertedBytes = Float32List(1 * inputSize * inputSize * 3);
  var buffer = Float32List.view(convertedBytes.buffer);
  int pixelIndex = 0;
  for (var i = 0; i < inputSize; i++) {
    for (var j = 0; j < inputSize; j++) {
      var pixel = image.getPixel(j, i);
      buffer[pixelIndex++] = (img.getRed(pixel) - mean) / std;
      buffer[pixelIndex++] = (img.getGreen(pixel) - mean) / std;
      buffer[pixelIndex++] = (img.getBlue(pixel) - mean) / std;
    }
  }
  return convertedBytes.buffer.asUint8List();
}
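For reference, the same normalization expressed in Python (a mean and std of 127.5 are the values commonly used with MobileNet-style models; your model may expect different ones):

```python
def image_to_float32_list(pixels, mean=127.5, std=127.5):
    """Flatten an HxWx3 nested list of 0-255 RGB values into normalized
    floats, mirroring imageToByteListFloat32 above: each channel value is
    mapped from [0, 255] to roughly [-1, 1] via (value - mean) / std."""
    return [(c - mean) / std for row in pixels for px in row for c in px]

# A single pixel with channel values 0, 127.5, and 255:
print(image_to_float32_list([[(0, 127.5, 255)]]))  # [-1.0, 0.0, 1.0]
```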


zoraiz-WOL commented on July 20, 2024 (1 upvote)

Use this code to train your custom model:

import os

import numpy as np

import tensorflow as tf
assert tf.__version__.startswith('2')

from tflite_model_maker import model_spec
from tflite_model_maker import image_classifier
from tflite_model_maker.config import ExportFormat
from tflite_model_maker.config import QuantizationConfig
from tflite_model_maker.image_classifier import DataLoader

import matplotlib.pyplot as plt

# Unzip the dataset archive (notebook shell command)
!unzip path-of-zip-file -d path-to-save-extract-file

data = DataLoader.from_folder('path-of-custom-folder')
train_data, rest_data = data.split(0.8)
validation_data, test_data = rest_data.split(0.5)
model = image_classifier.create(train_data, validation_data=validation_data)
loss, accuracy = model.evaluate(test_data)
config = QuantizationConfig.for_float16()
model.export(export_dir='path-to-save-model', quantization_config=config, export_format=ExportFormat.TFLITE)
model.export(export_dir='path-to-save-label', quantization_config=config, export_format=ExportFormat.LABEL)
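As far as I know, for_float16() only changes how the weights are stored; the model's inputs and outputs stay float32, which is why it sidesteps the UINT8 error above. A small sketch of why float16 storage roughly halves the weight size:

```python
import struct

# Hypothetical weight values; float16 quantization stores each weight as a
# 16-bit float (2 bytes) instead of a 32-bit float (4 bytes).
weights = [0.1, -1.5, 3.25]

f32_bytes = b''.join(struct.pack('<f', w) for w in weights)
f16_bytes = b''.join(struct.pack('<e', w) for w in weights)

print(len(f32_bytes), len(f16_bytes))  # 12 6
```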


Statyk7 commented on July 20, 2024

I'm having the same problem and am wondering if there is a workaround to handle those models.


Statyk7 commented on July 20, 2024

Have you been able to run the quantized MobileNet version?
It can be found here: https://www.tensorflow.org/lite/guide/hosted_models
I had no success with Mobilenet_V1_1.0_224_quant :(
I have tried runModelOnImage and runModelOnBinary with the image-to-byte functions... no results... (and no errors)

But when using the TensorFlow iOS Sample App it works just fine!
https://github.com/tensorflow/examples/tree/master/lite/examples/image_classification/ios


waltermaldonado commented on July 20, 2024

No, I've never tried those models, but I think they should work as well. Let us see your code; maybe we can find something...


Statyk7 commented on July 20, 2024

I'm using the example provided with the tflite package:
https://github.com/shaqian/flutter_tflite/tree/master/example

With an additional asset for the model (the labels are the same as for the non-quantized model) in pubspec.yaml:
- assets/mobilenet_v1_1.0_224_quant.tflite

Then I load the quantized model instead of the non-quantized one in main.dart:loadModel:
default:
  res = await Tflite.loadModel(
    model: "assets/mobilenet_v1_1.0_224_quant.tflite",
    labels: "assets/mobilenet_v1_1.0_224.txt",
  );

That's it!


waltermaldonado commented on July 20, 2024

Just to clarify, is your non-quantized model a detection model (localization + classification)? Because it seems to me that those quantized models are classification only models.


Statyk7 commented on July 20, 2024

It's an image classification model I believe...


Ehtasha commented on July 20, 2024

@Statyk7 @waltermaldonado

I'm integrating my own custom model into this example, but the app crashes when I send an image to the model using the segmentMobileNet method.

I have also tried runModelOnBinary, but the issue still stands.

My custom model was trained in PyTorch; I converted it to TensorFlow using ONNX and then to .tflite.
The model is not quantized.


andrsdev commented on July 20, 2024

I'm having the same problem! Are there any updates on this?


andrsdev commented on July 20, 2024

Here are my AutoML model properties.

It throws this error:
Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).

[Screenshot of the AutoML model properties]


oncul commented on July 20, 2024

Do you have problems with the AutoML-generated tflite file on iOS?


L-is-0 commented on July 20, 2024

@andrsdev I have the same error here


joknjokn commented on July 20, 2024

Did anyone find a solution to this?

Also getting:
java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32)

I'm trying with this model, and livestreamed camera image (YUV on android):
https://tfhub.dev/google/lite-model/aiy/vision/classifier/birds_V1/2

The page states:

Inputs are expected to be 3-channel RGB color images of size 224 x 224, scaled to [0, 1].
This model outputs to image_classifier.

I've tried a million things now and I can't get it to work. If I try to convert the streamed image to RGB, I get the UINT8/FLOAT32-error above.
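In case it helps anyone debugging the YUV side: a per-pixel sketch of the YUV-to-RGB conversion (full-range BT.601 coefficients assumed; extracting the Y/U/V planes from Android's YUV_420_888 buffers is a separate step not shown here):

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample to 0-255 RGB values."""
    def clamp(x):
        return max(0, min(255, int(round(x))))
    # Standard BT.601 coefficients; U and V are centered on 128.
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # neutral gray: (128, 128, 128)
```

The resulting 0-255 RGB values would then still need the [0, 1] scaling the model page asks for (divide by 255.0) and a float32 buffer, since this particular model is not quantized.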


tobiascornille commented on July 20, 2024

@Bryanx Do you think tflite_flutter_helper alone would solve the issue? I.e. is it compatible with this library?


aboubacryba commented on July 20, 2024

Use this code to train your custom model:

import os

import numpy as np

import tensorflow as tf
assert tf.__version__.startswith('2')

from tflite_model_maker import model_spec
from tflite_model_maker import image_classifier
from tflite_model_maker.config import ExportFormat
from tflite_model_maker.config import QuantizationConfig
from tflite_model_maker.image_classifier import DataLoader

import matplotlib.pyplot as plt

# Unzip the dataset archive (notebook shell command)
!unzip path-of-zip-file -d path-to-save-extract-file

data = DataLoader.from_folder('path-of-custom-folder')
train_data, rest_data = data.split(0.8)
validation_data, test_data = rest_data.split(0.5)
model = image_classifier.create(train_data, validation_data=validation_data)
loss, accuracy = model.evaluate(test_data)
config = QuantizationConfig.for_float16()
model.export(export_dir='path-to-save-model', quantization_config=config, export_format=ExportFormat.TFLITE)
model.export(export_dir='path-to-save-label', quantization_config=config, export_format=ExportFormat.LABEL)

You just saved my life. Thank You !!!!!

