
go-onnxruntime's Introduction

go-onnxruntime


Go binding for the Onnxruntime C++ API. It is used by the Onnxruntime agent in MLModelScope to perform model inference in Go.

Installation

Download and install go-onnxruntime:

go get -v github.com/c3sr/go-onnxruntime

The binding requires the Onnxruntime C++ library and several Go packages.

Onnxruntime C++ Library

The Go binding for the Onnxruntime C++ API in this repository is built against Onnxruntime v1.6.0.

To install Onnxruntime C++ on your system, follow the instructions in the Onnxruntime installation guide and refer to the dockerfiles.

The Onnxruntime C++ libraries are expected to be under /opt/onnxruntime/lib.

The Onnxruntime C++ header files are expected to be under /opt/onnxruntime/include.

If you get an error about not being able to write to /opt, perform the following:

sudo mkdir -p /opt/onnxruntime
sudo chown -R `whoami` /opt/onnxruntime

If you want to change the paths to the Onnxruntime C++ API, you also need to change the corresponding paths in lib.go.
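For reference, the paths in lib.go are cgo directives along these lines. This is a sketch under the assumption that the binding uses standard #cgo flags; the exact flags and package contents may differ:

```go
package onnxruntime

// Hypothetical sketch of the #cgo directives in lib.go; adjust the
// -I and -L paths if Onnxruntime is installed somewhere other than
// /opt/onnxruntime.
// #cgo CXXFLAGS: -std=c++14 -I/opt/onnxruntime/include
// #cgo LDFLAGS: -L/opt/onnxruntime/lib -lonnxruntime
import "C"
```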

Go Packages

You can install the Go dependencies using Dep:

dep ensure -v -vendor-only

This installs the dependencies in vendor/.

Configure Environment Variables

Configure the linker environment variables, since the Onnxruntime C++ library is in a non-system directory. Place the following in either your ~/.bashrc or ~/.zshrc file:

Linux

export LIBRARY_PATH=$LIBRARY_PATH:/opt/onnxruntime/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/onnxruntime/lib

macOS

export LIBRARY_PATH=$LIBRARY_PATH:/opt/onnxruntime/lib
export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:/opt/onnxruntime/lib

Check the Build

Run go build to check that the dependencies, installation, and library paths are set up correctly. On Linux, the default is to build with GPU support; if you don't have a GPU, run go build -tags=nogpu instead of go build.

Note: The CGO interface passes Go pointers to the C API, which the cgo runtime reports as an error. Disable the check by placing:

export GODEBUG=cgocheck=0

in your ~/.bashrc or ~/.zshrc file and then run either source ~/.bashrc or source ~/.zshrc.

Examples

Examples of using the Go Onnxruntime binding to do model inference are under examples.

Credits

Some of the logic of conversion between Go types and Ort::Values is borrowed from go-pytorch.

go-onnxruntime's People

Contributors

jakepu, yen-hsiang-chang


go-onnxruntime's Issues

CPU docker version has nvml.h dependencies

Hi, I am using the prebuilt CPU-only docker image as a development environment, and it seems the module still depends on the nvml-go module.

sudo docker run --rm -it --name go-onnx-cpu -v $PWD:/go/src/gitlab.com/aiteam c3sr/go-onnxruntime:amd64-cpu-onnxruntime1.7.1-latest

Under this environment, when I try to run the Go module, the issue below shows up. However, this is a CPU-only onnxruntime build, so there shouldn't be any GPU dependencies here?

/go/pkg/mod/github.com/c3sr/[email protected]/nvml.go:21:11: fatal error: nvml.h: No such file or directory
 // #include <nvml.h>
           ^~~~~~~~
compilation terminated.

My Go test code is as follows:

package main

import (
	"context"
	"path/filepath"
	"github.com/c3sr/dlframework/framework/options"
)

var (
	graph_file = "model.onnx"
	batchSize  = 3
)

func main() {
	dir, _ := filepath.Abs("./_fixtures")
	graph := filepath.Join(dir, graph_file)

	device := options.CPU_DEVICE
	ctx := context.Background()

	opts := options.New(options.Context(ctx),
		options.Graph([]byte(graph)),
		options.Device(device, 0),
		options.BatchSize(batchSize))
	_ = opts // opts is unused in this repro; silence the compiler
}

Can't be built from prebuilt binaries

Can't get it to compile on Linux without a GPU from the official binaries.

What I've done

  • Downloaded the latest .tgz file from the onnxruntime releases (onnxruntime-linux-x64-1.9.0.tgz)
  • Created an empty folder /opt/onnxruntime
  • Extracted the tgz file into that folder
  • Set up .bashrc as described in the instructions

Error

go get -v -tags='nogpu' github.com/c3sr/go-onnxruntime
github.com/c3sr/go-onnxruntime
# github.com/c3sr/go-onnxruntime
In file included from ../go/pkg/mod/github.com/c3sr/[email protected]/device.go:3:
./cbits/predictor.hpp:10:10: fatal error: onnxruntime_c_api.h: No such file or directory
   10 | #include <onnxruntime_c_api.h>
      |          ^~~~~~~~~~~~~~~~~~~~~
compilation terminated.

.bashrc setup

export LIBRARY_PATH=$LIBRARY_PATH:/opt/onnxruntime/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/onnxruntime/lib

Values of variables

echo $LIBRARY_PATH $LD_LIBRARY_PATH
:/opt/onnxruntime/lib :/opt/onnxruntime/lib

How /opt/onnxruntime/ looks

tree /opt/onnxruntime/
/opt/onnxruntime/
├── GIT_COMMIT_ID
├── include
│   ├── cpu_provider_factory.h
│   ├── onnxruntime_c_api.h
│   ├── onnxruntime_cxx_api.h
│   ├── onnxruntime_cxx_inline.h
│   ├── onnxruntime_run_options_config_keys.h
│   ├── onnxruntime_session_options_config_keys.h
│   └── provider_options.h
├── lib
│   ├── libonnxruntime.so -> libonnxruntime.so.1.9.0
│   └── libonnxruntime.so.1.9.0
├── LICENSE
├── Privacy.md
├── README.md
├── ThirdPartyNotices.txt
└── VERSION_NUMBER
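One thing worth noting: LIBRARY_PATH and LD_LIBRARY_PATH only affect linking and runtime loading, while the fatal error here is a missing header, i.e. an include-path problem. Assuming the binding honors Go's standard cgo environment variables (an assumption; lib.go may hardcode its paths instead), exporting these before rebuilding may help:

```shell
# Point cgo at the extracted Onnxruntime headers and libraries.
# These are standard Go toolchain variables, not specific to this binding.
export CGO_CFLAGS="-I/opt/onnxruntime/include"
export CGO_CXXFLAGS="-I/opt/onnxruntime/include"
export CGO_LDFLAGS="-L/opt/onnxruntime/lib"
```

Then re-run go get -v -tags='nogpu' github.com/c3sr/go-onnxruntime.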

Failed to run the simple example due to missing variable declaration inside Metric.h

I have CUDA, the CUDA driver, and cuDNN installed on my Ubuntu system, and I followed the instructions in the c3sr/go-onnxruntime repo for installation.

  • Successfully installed the ONNXRuntime C++ API (built from source with CUDA; it built and installed successfully)

  • nvidia-smi and nvcc --version work

  • I can compile a CUDA file and run the NVIDIA test example

  • The g++ and gcc versions are the same:

    g++ (Ubuntu 11.3.0-1ubuntu1~22.04) 11.3.0
    gcc (Ubuntu 11.3.0-1ubuntu1~22.04) 11.3.0

When I run the example code, I continuously get the error below: ‘NVPA_RawMetricsConfigOptions’ was not declared in Metric.h. Am I missing something?

$ go run main.go

(github.com/c3sr/go-cupti)

In file included from utils.cpp:17:
../../../go/pkg/mod/github.com/c3sr/[email protected]/csrc/Metric.h: In function ‘bool NV::Metric::Config::GetConfigImage(std::string, std::vector<std::__cxx11::basic_string >, std::vector&, const uint8_t*)’:
../../../go/pkg/mod/github.com/c3sr/[email protected]/csrc/Metric.h:68:17: error: ‘NVPA_RawMetricsConfigOptions’ was not declared in this scope; did you mean ‘NVPA_RawMetricsConfig’?
68 | NVPA_RawMetricsConfigOptions metricsConfigOptions = { NVPA_RAW_METRICS_CONFIG_OPTIONS_STRUCT_SIZE };
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
| NVPA_RawMetricsConfig
../../../go/pkg/mod/github.com/c3sr/[email protected]/csrc/Metric.h:69:17: error: ‘metricsConfigOptions’ was not declared in this scope
69 | metricsConfigOptions.activityKind = NVPA_ACTIVITY_KIND_PROFILER;
| ^~~~~~~~~~~~~~~~~~~~
In file included from ../../../go/pkg/mod/github.com/c3sr/[email protected]/csrc/Metric.h:4,
from utils.cpp:17:
../../../go/pkg/mod/github.com/c3sr/[email protected]/csrc/Metric.h:72:45: error: ‘NVPA_RawMetricsConfig_Create’ was not declared in this scope; did you mean ‘NVPW_CUDA_RawMetricsConfig_Create’?
72 | RETURN_IF_NVPW_ERROR(false, NVPA_RawMetricsConfig_Create(&metricsConfigOptions, &pRawMetricsConfig));
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
../../../go/pkg/mod/github.com/c3sr/[email protected]/csrc/Utils.h:7:26: note: in definition of macro ‘RETURN_IF_NVPW_ERROR’
7 | NVPA_Status status = actual;

Failed to create simple model on cpu

I'm trying to do inference with a model generated with Keras and saved in onnx format, but I'm getting an error while creating the model. The code is based on the provided example.

Error

go run -tags='nogpu' main.go 
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0xcd36ff]

goroutine 1 [running]:
main.main()
        /home/idk/golang_onnx/main.go:27 +0x1ff
exit status 2

Code

package main

import (
	"context"
	"fmt"
	"path/filepath"

	"github.com/c3sr/dlframework/framework/options"
	"github.com/c3sr/go-onnxruntime"
	"github.com/c3sr/tracer"
)

func main() {
	device := options.CPU_DEVICE
	graph := filepath.Join("/home/idk/golang_onnx", "model.onnx")

	ctx := context.Background()
	opts := options.New(options.Context(ctx),
		options.Graph([]byte(graph)),
		options.Device(device, 0),
		options.BatchSize(9))

	opts.SetTraceLevel(tracer.FULL_TRACE)

	span, ctx := tracer.StartSpanFromContext(ctx, tracer.FULL_TRACE, "onnxruntime_batch")
	defer span.Finish()

	predictor, err := onnxruntime.New(
		ctx,
		options.WithOptions(opts),
	)
	defer predictor.Close()
	fmt.Println(err)
}
