
Comments (18)

galeone commented on August 20, 2024

Right now it is not possible, because the Go API can only define simple graphs, and there is no Go equivalent of the tf.Variable Python class.

If you want to define a model and train it in Go, you have to use Python to define the computational graph of the whole training loop, export this training-loop graph, then load it in Go and execute it (in the same way as executing inference).

By the way, I've started a project that will expose the whole TensorFlow Python API to Go. I haven't had the time to go on with the development (I will soon) - you can read more here: #17 (comment)

If you want to contribute to this project, just let me know.

Arnold1 commented on August 20, 2024

@galeone hi, how can I load a model (saved_model.pb file) with your library?

2020-01-01 10:11:12     912917 saved_model.pb

galeone commented on August 20, 2024

Hi @Arnold1, you can use model.LoadModel(exportDir string, tags []string, options *tf.SessionOptions) (model *Model)

Where:

  • exportDir is the path that contains your saved_model.pb
  • tags is the array of tags (usually something like {"serve"}, but the tag name depends on how you tagged the model during the export in Python)
  • options is usually just nil

You can see an example here: https://github.com/galeone/tfgo#go-code
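
For reference, a minimal sketch of the call; the export directory and the tag here are placeholders for your own:

        package main

        import (
                tg "github.com/galeone/tfgo"
        )

        func main() {
                // "model/export" and "serve" are placeholders: use the directory
                // that contains your saved_model.pb and the tags you exported with.
                model := tg.LoadModel("model/export", []string{"serve"}, nil)
                _ = model // ready for model.Op / model.Exec calls
        }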

Arnold1 commented on August 20, 2024

@galeone thanks so much. By the way, do you have the test_models/export folder somewhere - or could you upload it? And what is []string{"tag"} in tg.LoadModel?

@galeone also, what is the difference between TensorFlow Serving and your lib? And how can I convert a JSON into a tensor?

How can I specify this for my model? See below:

        model := tg.LoadModel("model/123123122", []string{"serve"}, nil)

        fakeInput, _ := tf.NewTensor([1][28][28][1]float32{})
        results := model.Exec([]tf.Output{
                model.Op("LeNetDropout/softmax_linear/Identity", 0),
        }, map[tf.Output]*tf.Tensor{
                model.Op("input_", 0): fakeInput,
        })

        predictions := results[0].Value().([][]float32)
        fmt.Println(predictions)

galeone commented on August 20, 2024

No wait wait. I linked you an example; you don't need my test_models/export folder, you can use your own model.

The tag "tag" is the tag I used when exporting the savedmodel. You already showed that you used the tag "serve" (you're inspecting the saved model structure at the "serve" tag using saved_model_cli), so instead of {"tag"} you have to use {"serve"} (and you did this correctly).

Now, instead of "input_:0" (that's the name of the input placeholder I used in the example), you have to use one (or all) of the input placeholders you have in your model, like Placeholder_1:0, Placeholder_2:0, and so on.

You also have to specify the output you want (instead of my LeNetDropout/etc., you have to use one of the output tensors defined in your saved model, like prob_out:0).
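
If you're unsure about the exact names, you can inspect the SavedModel with saved_model_cli (the directory below is the one from your snippet):

        saved_model_cli show --dir model/123123122 --tag_set serve --all

This lists every signature with its input and output tensors in the name:index format.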

Also, this library is able to do pretty much the same things as TensorFlow Serving (you can create an equivalent of TensorFlow Serving in Go using the library), but it gives you more flexibility, since you can use all the Go bindings.

Arnold1 commented on August 20, 2024

@galeone

I tried the following, but I see:

2020-01-31 08:29:34.379920: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: model/123123122
2020-01-31 08:29:34.400792: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-01-31 08:29:34.421951: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-01-31 08:29:34.482102: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2020-01-31 08:29:34.657523: I tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: model/123123122
2020-01-31 08:29:34.764345: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 384430 microseconds.
panic: op prob:0 not found

code:

        fakeInput0, _ := tf.NewTensor([1]float32{})
        fakeInput1, _ := tf.NewTensor([1]float32{})

        results := model.Exec([]tf.Output{
                model.Op("prob:0", 0),
        }, map[tf.Output]*tf.Tensor{
                model.Op("Placeholder_1:0", 0): fakeInput0,
                model.Op("Placeholder_2:0", 1): fakeInput1,
        })

        predictions := results[0].Value().([][]float32)
        fmt.Println(predictions)

galeone commented on August 20, 2024

You're almost there!
As you can see, the inputs and the outputs are in the format name:number (e.g. prob_out:0), where name is the name of the operation that generated the tensor and number is the index of the output (0 is the first (and usually the only) output of the operation, 1 is the second (if any), and so on).

The tfgo model.Op method wants you to specify the input/output tensors in this way.
Thus, instead of model.Op("prob_out:0", 0) you have to write model.Op("prob_out", 0), and so on for every input/output tensor.
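
For example, applying this to your snippet above (note that the second argument of model.Op is that output index itself, so both placeholders use 0: a placeholder has a single output):

        fakeInput0, _ := tf.NewTensor([1]float32{})
        fakeInput1, _ := tf.NewTensor([1]float32{})

        results := model.Exec([]tf.Output{
                model.Op("prob", 0), // "prob:0" -> name "prob", output index 0
        }, map[tf.Output]*tf.Tensor{
                model.Op("Placeholder_1", 0): fakeInput0,
                model.Op("Placeholder_2", 0): fakeInput1, // output index 0, not 1
        })

        predictions := results[0].Value().([][]float32)
        fmt.Println(predictions)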
After that, it will work :)

Arnold1 commented on August 20, 2024

@galeone ok :)

I tested the entire prediction, but from defining fakeInput0, _ := tf.NewTensor([1]int64{}) to results[0].Value().([][]float32) it takes about 1.2 seconds. Why is it so slow? How can I make it faster?

galeone commented on August 20, 2024

It is slow because on the first run you create the tf.Session, put the tf.Graph inside it, and execute the prediction.
Any other call should be faster, because the session object has already been created and the graph has already been loaded in memory.
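
A minimal sketch to verify this, reusing the names from the snippets above; the first Exec pays the one-time setup cost, the second shows the steady-state latency:

        input, _ := tf.NewTensor([1]float32{})
        feed := map[tf.Output]*tf.Tensor{
                model.Op("Placeholder_1", 0): input,
                model.Op("Placeholder_2", 0): input,
        }
        fetch := []tf.Output{model.Op("prob", 0)}

        start := time.Now()
        model.Exec(fetch, feed) // first call: warm-up + graph execution
        fmt.Println("first call:", time.Since(start))

        start = time.Now()
        model.Exec(fetch, feed) // later calls: graph execution only
        fmt.Println("second call:", time.Since(start))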

Arnold1 commented on August 20, 2024

@galeone is there a way you can implement LoadModel to return an error, so I can do proper error handling instead of a panic? Also, is there a way to suppress the logs when calling LoadModel? e.g. I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 445229 microseconds.

galeone commented on August 20, 2024

I don't think returning an error is the correct behavior. If you fail to load the model, then your application (which is based on that model) won't run, so IMO it is better to panic.
As for the log string, it is generated by the TensorFlow C API, so I don't have control over it. But I guess that if you wrap the model loading in a separate goroutine and capture stderr and stdout, then you'll be able to suppress it.
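
That said, if you really need an error, a caller can convert the panic on their side with Go's recover. A minimal sketch (loadModel is a hypothetical wrapper, not part of the library):

        func loadModel(exportDir string, tags []string) (model *tg.Model, err error) {
                // tg.LoadModel panics on failure; turn that into a regular error.
                defer func() {
                        if r := recover(); r != nil {
                                err = fmt.Errorf("loading SavedModel from %s: %v", exportDir, r)
                        }
                }()
                return tg.LoadModel(exportDir, tags, nil), nil
        }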

Arnold1 commented on August 20, 2024

@galeone if the model cannot be loaded, I want to give the user an error message so they understand why the model cannot be loaded and can take action... do you know what I mean?

arnwas commented on August 20, 2024

Hi, thanks. In #2 there seems to be a link to your work, but it gives a 404. Where is that "tree/rtf" now? Cheers, Arno

galeone commented on August 20, 2024

Currently, this has been moved to a separate repo, which is not working yet (it's a long and complex project and I don't have a lot of time to work on it): https://github.com/galeone/rtf

Arnold1 commented on August 20, 2024

@galeone what is the difference between your lib and the Go bindings here? https://www.tensorflow.org/install/lang_go

Also, what TensorFlow versions does your lib support? Is it suggested to use tf 2.1.x?

galeone commented on August 20, 2024

This library wraps the official Go bindings and adds some additional functionality. However, we are going off topic, so I'm closing this issue.

Arnold1 commented on August 20, 2024

@galeone ok. Also, what TensorFlow versions does your lib currently support? Is it suggested to use tf 2.1.x? I'm asking because I didn't see anything in the README...

galeone commented on August 20, 2024

The latest supported version is TensorFlow 1.15, but only because I haven't tested it with other C libraries (there is still no C library for TensorFlow 2.x).

However, if you use this library only to load SavedModels, you can export them using TensorFlow 2.x (I've tested with 2.0) in Python and load them here (using the C library of TensorFlow 1.15); it works.

