Comments (19)
I solved this by using tflite_flutter and tflite_flutter_helper instead of this library. Here is a gist in case anyone is running into this as well: https://gist.github.com/Bryanx/b839e3ceea0f9647ffbc5f90e3091742.
from flutter_tflite.
I created an image labeling model with AutoML. Since the model is quantized, I converted the image to uint8, but the following error was output:
Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).
Based on some searching of issues last night, #53 and #59 are both related to this one. AutoML Vision Edge outputs a quantized tflite model.
Here are two images from Netron describing the differences between the quantized model and the MobileNet v2 model that flutter_tflite currently supports.
Note that they are the same except that one accepts a uint8 list and the other takes a float32 list.
I'm not entirely sure what would need to change on the flutter_tflite side to support this kind of model, but hopefully this helps.
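A quick way to confirm which input type a given .tflite file actually expects is to inspect its first input tensor with the TensorFlow Lite Python interpreter (a sketch; the path in the usage comment is a placeholder for your model file):

```python
import tensorflow as tf

def describe_input(model_path):
    """Return the dtype and (scale, zero_point) of a model's first input tensor."""
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    return inp["dtype"], inp["quantization"]

# describe_input("assets/model.tflite")
# An AutoML Vision Edge export should report uint8 (with scale/zero_point set),
# while the stock float MobileNet models report float32.
```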
From what I have seen, if you use the method that runs detections on binary data, you can use a quantized model. In fact, the image-to-ByteList conversion suggested in the docs assumes an 8-bit integer per channel value, as you can see below:
Uint8List imageToByteListUint8(img.Image image, int inputSize) {
  var convertedBytes = Uint8List(1 * inputSize * inputSize * 3);
  var buffer = Uint8List.view(convertedBytes.buffer);
  int pixelIndex = 0;
  for (var i = 0; i < inputSize; i++) {
    for (var j = 0; j < inputSize; j++) {
      var pixel = image.getPixel(j, i);
      // One byte per channel: R, G, B.
      buffer[pixelIndex++] = img.getRed(pixel);
      buffer[pixelIndex++] = img.getGreen(pixel);
      buffer[pixelIndex++] = img.getBlue(pixel);
    }
  }
  return convertedBytes.buffer.asUint8List();
}
This conversion should work for a quantized model, but not for a non-quantized one: the convertedBytes buffer for a non-quantized (float32) model needs to be 4 times as large, since each channel value takes 4 bytes instead of 1.
When I use the detections on image path it works perfectly.
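To make that size difference concrete (simple arithmetic, not from the thread), for a 224x224 RGB input:

```python
# Bytes needed for a 1x224x224x3 input tensor, by element type.
input_size = 224
channels = 3
uint8_bytes = 1 * input_size * input_size * channels * 1    # 1 byte per channel value
float32_bytes = 1 * input_size * input_size * channels * 4  # 4 bytes per channel value

assert float32_bytes == 4 * uint8_bytes
print(uint8_bytes, float32_bytes)  # 150528 602112
```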
Edit:
For non-quantized models the docs suggest:
Uint8List imageToByteListFloat32(
    img.Image image, int inputSize, double mean, double std) {
  var convertedBytes = Float32List(1 * inputSize * inputSize * 3);
  var buffer = Float32List.view(convertedBytes.buffer);
  int pixelIndex = 0;
  for (var i = 0; i < inputSize; i++) {
    for (var j = 0; j < inputSize; j++) {
      var pixel = image.getPixel(j, i);
      // Normalize each channel, e.g. mean = 127.5, std = 127.5 for [-1, 1].
      buffer[pixelIndex++] = (img.getRed(pixel) - mean) / std;
      buffer[pixelIndex++] = (img.getGreen(pixel) - mean) / std;
      buffer[pixelIndex++] = (img.getBlue(pixel) - mean) / std;
    }
  }
  // The float data is still handed over the platform channel as raw bytes.
  return convertedBytes.buffer.asUint8List();
}
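Conversely, if a model's input tensor is uint8 but your pixels are already normalized floats, the standard affine quantization mapping can be sketched like this (scale and zero_point are hypothetical here; the real values come from the tensor's quantization parameters, visible in Netron):

```python
import numpy as np

def quantize_input(x, scale, zero_point):
    """Map real-valued inputs to uint8 using the tensor's quantization params."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, 0, 255).astype(np.uint8)

# For an export with scale = 1/255 and zero_point = 0, a pixel already scaled
# to [0, 1] maps back to roughly its original 0..255 byte value.
```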
use this code to train your custom model:
import os
import numpy as np
import tensorflow as tf
assert tf.__version__.startswith('2')
from tflite_model_maker import model_spec
from tflite_model_maker import image_classifier
from tflite_model_maker.config import ExportFormat
from tflite_model_maker.config import QuantizationConfig
from tflite_model_maker.image_classifier import DataLoader
import matplotlib.pyplot as plt

# Unzip the dataset (Colab/Jupyter shell command)
!unzip path-of-zip-file -d path-to-save-extract-file

# Load the images and split: 80% train, 10% validation, 10% test
data = DataLoader.from_folder('path-of-custom-folder')
train_data, rest_data = data.split(0.8)
validation_data, test_data = rest_data.split(0.5)

model = image_classifier.create(train_data, validation_data=validation_data)
loss, accuracy = model.evaluate(test_data)

# Export a float16-quantized model and its label file
config = QuantizationConfig.for_float16()
model.export(export_dir='path-to-save-model', quantization_config=config, export_format=ExportFormat.TFLITE)
model.export(export_dir='path-to-save-label', quantization_config=config, export_format=ExportFormat.LABEL)
I'm having the same problem and wondering if there is a way or workaround to handle those models?
Have you been able to run the MobileNet quantized version?
Can be found here: https://www.tensorflow.org/lite/guide/hosted_models
I have no success with Mobilenet_V1_1.0_224_quant :(
I have tried runModelOnImage and runModelOnBinary using the image-to-byte functions... no results... (and no errors)
But when using the TensorFlow iOS Sample App it works just fine!
https://github.com/tensorflow/examples/tree/master/lite/examples/image_classification/ios
No, I've never tried those models, but I think they should work as well. Let us see your code; maybe we can find something...
I'm using the example provided with the tflite package:
https://github.com/shaqian/flutter_tflite/tree/master/example
With an additional asset for the model (the labels are the same as for the non-quantized model) in pubspec.yaml:
- assets/mobilenet_v1_1.0_224_quant.tflite
Then I load the quantized model instead of the non-quantized one in main.dart:loadModel:
default:
  res = await Tflite.loadModel(
    model: "assets/mobilenet_v1_1.0_224_quant.tflite",
    labels: "assets/mobilenet_v1_1.0_224.txt",
  );
That's it!
Just to clarify, is your non-quantized model a detection model (localization + classification)? Because it seems to me that those quantized models are classification only models.
It's an image classification model I believe...
I'm integrating my own custom model into this example, but the app crashes when we send an image to the model using the segmentMobileNet method.
I have also tried runModelOnBinary, but the issue still stands.
My custom model was trained in PyTorch and converted to TensorFlow via ONNX, then to .tflite.
The model is not quantized.
I'm having the same problem!!! Are there any updates on this?
Here are my AutoML properties.
It throws this error:
Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).
Do you have problems with the AutoML-generated tflite file on iOS?
@andrsdev I have the same error here
Did anyone find a solution to this?
Also getting:
java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32)
I'm trying with this model and a livestreamed camera image (YUV on Android):
https://tfhub.dev/google/lite-model/aiy/vision/classifier/birds_V1/2
The page states:
Inputs are expected to be 3-channel RGB color images of size 224 x 224, scaled to [0, 1].
This model outputs to image_classifier.
I've tried a million things now and I can't get it to work. If I try to convert the streamed image to RGB, I get the UINT8/FLOAT32 error above.
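For reference, the [0, 1] scaling that model page asks for can be sketched like this (Python/NumPy for clarity; in Dart, imageToByteListFloat32 with mean = 0 and std = 255 produces the same scaling):

```python
import numpy as np

def preprocess_rgb(image_u8):
    """Scale an HxWx3 uint8 RGB image to float32 in [0, 1] and add a batch dim."""
    assert image_u8.dtype == np.uint8
    return (image_u8.astype(np.float32) / 255.0)[np.newaxis, ...]

# A 224x224 RGB frame becomes a (1, 224, 224, 3) float32 tensor in [0, 1].
```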
@Bryanx Do you think tflite_flutter_helper alone would solve the issue? I.e. is it compatible with this library?
(quoting the Model Maker training script above)
You just saved my life. Thank you!!!!!