
surrealml's Introduction

SurrealML

This package stores machine learning models with metadata in Rust so they can be used on the SurrealDB server.

What is SurrealML?

SurrealML is a feature that allows you to store trained machine learning models in a special format called 'surml'. This enables you to run these models in either Python or Rust, and even upload them to a SurrealDB node to run the models on the server.

Prerequisites

  1. A basic understanding of Machine Learning: You should be familiar with ML concepts, algorithms, and model training processes.
  2. Knowledge of Python: Proficiency in Python is necessary as SurrealML involves working with Python-based ML models.
  3. Familiarity with SurrealDB: Basic knowledge of how SurrealDB operates is required since SurrealML integrates directly with it.
  4. Python Environment Setup: A Python environment with necessary libraries installed, including SurrealML, PyTorch or SKLearn (depending on your model preference).
  5. SurrealDB Installation: Ensure you have SurrealDB installed and running on your machine or server.

Installation

To install SurrealML, make sure you have Python installed. Then, install the SurrealML library and either PyTorch or SKLearn, based on your model choice. You can install the package with both PyTorch and SKLearn with the command below:

pip install "git+https://github.com/surrealdb/surrealml#egg=surrealml[sklearn,torch]"

If you want to use SurrealML with sklearn you will need the following installation:

pip install "git+https://github.com/surrealdb/surrealml#egg=surrealml[sklearn]"

For PyTorch:

pip install "git+https://github.com/surrealdb/surrealml#egg=surrealml[torch]"

For TensorFlow:

pip install "git+https://github.com/surrealdb/surrealml#egg=surrealml[tensorflow]"

After that, you can train your model and save it in the SurrealML format.
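
To verify the installation, a quick sanity check along these lines should import cleanly (the one-liner below is just an illustration, not part of the package):

python -c "from surrealml import SurMlFile, Engine; print('surrealml ready')"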

Compilation config

If nothing is configured, the crate will compile the ONNX runtime into the binary. This is the default behaviour. However, you have two more options:

  • If you want to use an ONNX runtime that is installed on your system, you can set the environment variable ONNXRUNTIME_LIB_PATH before you compile the crate. This will make the crate use the ONNX runtime that is installed on your system (see the sketch after the build commands below).
  • If you want to statically compile the library, you can download it from https://github.com/surrealdb/onnxruntime-build/releases/tag/v1.16.3 and then build the crate this way:
$ tar xvf <onnx-archive-file> -C extract-dir
$ ORT_STRATEGY=system ORT_LIB_LOCATION=$(pwd)/extract-dir cargo build
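
For the first option, the workflow is simply to export the variable before building. A minimal sketch (the library path below is an assumption; point it at wherever the ONNX runtime is installed on your system):

$ export ONNXRUNTIME_LIB_PATH=/usr/local/lib/libonnxruntime.so
$ cargo build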

Quick start with Sk-learn

Sk-learn models can also be converted and stored in the .surml format, enabling developers to load them in any Python version, as we are not relying on pickle. Metadata in the file also enables other users of the model to use it out of the box without having to worry about normalising the data or getting the inputs in the right order. You will also be able to load your sk-learn models in Rust and run them, meaning you can use them in your SurrealDB server. Saving a model is as simple as the following:

from sklearn.linear_model import LinearRegression
from surrealml import SurMlFile, Engine
from surrealml.model_templates.datasets.house_linear import HOUSE_LINEAR # click on this HOUSE_LINEAR to see the data

# train the model
model = LinearRegression()
model.fit(HOUSE_LINEAR["inputs"], HOUSE_LINEAR["outputs"])

# package and save the model
file = SurMlFile(model=model, name="linear", inputs=HOUSE_LINEAR["inputs"], engine=Engine.SKLEARN)

# add columns in the order of the inputs to map dictionaries passed in to the model
file.add_column("squarefoot")
file.add_column("num_floors")

# add normalisers for the columns
file.add_normaliser("squarefoot", "z_score", HOUSE_LINEAR["squarefoot"].mean(), HOUSE_LINEAR["squarefoot"].std())
file.add_normaliser("num_floors", "z_score", HOUSE_LINEAR["num_floors"].mean(), HOUSE_LINEAR["num_floors"].std())
file.add_output("house_price", "z_score", HOUSE_LINEAR["outputs"].mean(), HOUSE_LINEAR["outputs"].std())

# save the file
file.save(path="./linear.surml")

# load the file
new_file = SurMlFile.load(path="./linear.surml", engine=Engine.SKLEARN)

# Make a prediction (both should be the same due to the perfectly correlated example data)
print(new_file.buffered_compute(value_map={"squarefoot": 5, "num_floors": 6}))
print(new_file.raw_compute(input_vector=[5, 6]))

Raw ONNX models

You may not have a model that is supported by the surrealml library. However, if you can convert the model into ONNX format yourself, you can simply use the ONNX engine when saving your model with the following code:

file = SurMlFile(model=raw_onnx_model, name="linear", inputs=HOUSE_LINEAR["inputs"], engine=Engine.ONNX)
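
As an illustration, here is a minimal sketch of producing such a raw ONNX model from a scikit-learn estimator using skl2onnx (the use of skl2onnx and its to_onnx helper is an assumption for this example, not part of the surrealml API):

import numpy as np
from skl2onnx import to_onnx
from sklearn.linear_model import LinearRegression
from surrealml import SurMlFile, Engine

# train a simple model on random data
X = np.random.rand(100, 2).astype(np.float32)
y = np.random.rand(100).astype(np.float32)
model = LinearRegression().fit(X, y)

# convert the trained model to ONNX ourselves
raw_onnx_model = to_onnx(model, X[:1])

# package it with the ONNX engine and save it
file = SurMlFile(model=raw_onnx_model, name="linear", inputs=X[:1], engine=Engine.ONNX)
file.save(path="./linear_onnx.surml")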

Python tutorial using PyTorch

First we need one script where we create and store the model. In this example we will build a simple linear regression model to predict the house price using the number of floors and the square footage.

Defining the data

We can create some fake data with the following python code:

import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np


squarefoot = np.array([1000, 1200, 1500, 1800, 2000, 2200, 2500, 2800, 3000, 3200], dtype=np.float32)
num_floors = np.array([1, 1, 1.5, 1.5, 2, 2, 2.5, 2.5, 3, 3], dtype=np.float32)
house_price = np.array([200000, 230000, 280000, 320000, 350000, 380000, 420000, 470000, 500000, 520000], dtype=np.float32)

We then compute the parameters used to normalise the data, which gives better convergence, with the following:

squarefoot_mean = squarefoot.mean()
squarefoot_std = squarefoot.std()
num_floors_mean = num_floors.mean()
num_floors_std = num_floors.std()
house_price_mean = house_price.mean()
house_price_std = house_price.std()

We then normalise our data with the code below:

squarefoot = (squarefoot - squarefoot.mean()) / squarefoot.std()
num_floors = (num_floors - num_floors.mean()) / num_floors.std()
house_price = (house_price - house_price.mean()) / house_price.std()

We then create our tensors so they can be loaded into our model, and stack them together with the following:

squarefoot_tensor = torch.from_numpy(squarefoot)
num_floors_tensor = torch.from_numpy(num_floors)
house_price_tensor = torch.from_numpy(house_price)

X = torch.stack([squarefoot_tensor, num_floors_tensor], dim=1)

Defining our model

We can now define our linear regression model with loss function and an optimizer with the code below:

# Define the linear regression model
class LinearRegressionModel(nn.Module):
    def __init__(self):
        super(LinearRegressionModel, self).__init__()
        self.linear = nn.Linear(2, 1)  # 2 input features, 1 output

    def forward(self, x):
        return self.linear(x)


# Initialize the model
model = LinearRegressionModel()

# Define the loss function and optimizer
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

Training our model

We are now ready to train our model on the data we have generated, running 1000 epochs with the following loop:

num_epochs = 1000
for epoch in range(num_epochs):
    # Forward pass
    y_pred = model(X)

    # Compute the loss
    loss = criterion(y_pred.squeeze(), house_price_tensor)

    # Backward pass and optimization
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Print the progress
    if (epoch + 1) % 100 == 0:
        print(f"Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}")

Saving our .surml file

Our model is now trained, and we need some example data to trace the model, which we create with the code below:

test_squarefoot = torch.tensor([2800, 3200], dtype=torch.float32)
test_num_floors = torch.tensor([2.5, 3], dtype=torch.float32)
test_inputs = torch.stack([test_squarefoot, test_num_floors], dim=1)

We can now wrap our model in the SurMlFile object with the following code:

from surrealml import SurMlFile, Engine

file = SurMlFile(model=model, name="linear", inputs=test_inputs[:1], engine=Engine.PYTORCH)

The name is optional, but the inputs and model are essential. We can now add some metadata to the file, such as our inputs and outputs, with the following code. Metadata is not essential; it just helps with some types of computation:

file.add_column("squarefoot")
file.add_column("num_floors")
file.add_output("house_price", "z_score", house_price_mean, house_price_std)

It must be stressed that the add_column calls need to be made in the same order as the input tensors the model was trained on, as these columns act as key bindings that convert dictionary inputs into the model's inputs. We also need to add the normalisers for our columns, but these will be automatically mapped, so we do not need to worry about the order in which they are added. Again, normalisers are optional; you can normalise the data yourself:

file.add_normaliser("squarefoot", "z_score", squarefoot_mean, squarefoot_std)
file.add_normaliser("num_floors", "z_score", num_floors_mean, num_floors_std)

We then save the model with the following code:

file.save("./test.surml")

Loading our .surml file in Python

If you have followed the previous steps, you should have a .surml file saved with all our metadata. We load it with the following code:

from surrealml import SurMlFile, Engine

new_file = SurMlFile.load("./test.surml", engine=Engine.PYTORCH)

Our model is now loaded. We can now perform computations.

Buffered computation in Python

This is where the computation utilises the data in the header. We can do this by merely passing in a dictionary as seen below:

print(new_file.buffered_compute({
    "squarefoot": 1.0,
    "num_floors": 2.0
}))
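
For completeness, you can also bypass the header with raw_compute and pass an input vector directly; in that case any normalisation is up to you. A minimal sketch (the literal values below are assumed to already be z-score normalised):

print(new_file.raw_compute(input_vector=[1.0, 2.0]))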

Uploading our model to SurrealDB

We can upload our trained model with the following code:

url = "http://0.0.0.0:8000/ml/import"
SurMlFile.upload(
    path="./linear_test.surml",
    url=url,
    chunk_size=36864,
    namespace="test",
    database="test",
    username="root",
    password="root"
)
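
Alternatively, if you have the surreal CLI available, the same file can be imported from the command line (the connection details here assume a local instance; add credentials if your server requires them):

surreal ml import ./linear_test.surml --namespace test --database test --conn http://localhost:8000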

Running SurrealQL operations against our trained model

With this, we can run SurrealQL statements against our database. To test this, we can create the following rows:

CREATE house_listing SET squarefoot_col = 500.0, num_floors_col = 1.0;
CREATE house_listing SET squarefoot_col = 1000.0, num_floors_col = 2.0;
CREATE house_listing SET squarefoot_col = 1500.0, num_floors_col = 3.0;

SELECT * FROM (
    SELECT
        *,
        ml::house-price-prediction<0.0.1>({
            squarefoot: squarefoot_col,
            num_floors: num_floors_col
        }) AS price_prediction
    FROM house_listing
)
WHERE price_prediction > 177206.21875;

What is happening here is that we are feeding the columns from the house_listing table into a model we uploaded called house-price-prediction with a version of 0.0.1. We then get the results of that trained ML model as the column price_prediction, and we use the calculated predictions to filter the rows, giving us the following result:

[
  {
    "id": "house_listing:7bo0f35tl4hpx5bymq5d",
    "num_floors_col": 3,
    "price_prediction": 406534.75,
    "squarefoot_col": 1500
  },
  {
    "id": "house_listing:8k2ttvhp2vh8v7skwyie",
    "num_floors_col": 2,
    "price_prediction": 291870.5,
    "squarefoot_col": 1000
  }
]

Loading our .surml file in Rust

We can now load our .surml file with the following code:

use crate::storage::surml_file::SurMlFile;

let mut file = SurMlFile::from_file("./test.surml").unwrap();

Raw computation in Rust

You can have an empty header if you want. This makes sense if you're doing something novel or complex, such as convolutional neural networks for image processing. To perform a raw computation, you can simply do the following:

file.model.set_eval();
let x = Tensor::f_from_slice::<f32>(&[1.0, 2.0, 3.0, 4.0]).unwrap().reshape(&[2, 2]);
let outcome = file.model.forward_ts(&[x]);
println!("{:?}", outcome);

However, if you want to use the header, you need to perform a buffered compute.

Buffered computation in Rust

This is where the computation utilises the data in the header. We can do this by wrapping our File struct in a ModelComputation struct with the code below:

use crate::execution::compute::ModelComputation;

let compute_unit = ModelComputation {
    surml_file: &mut file
};

Now that we have this wrapper we can create a hashmap with values and keys that correspond to the key bindings. We can then pass this into a buffered_compute that maps the inputs and applies normalisation to those inputs if normalisation is present for that column with the following:

use std::collections::HashMap;

let mut input_values = HashMap::new();
input_values.insert(String::from("squarefoot"), 1.0);
input_values.insert(String::from("num_floors"), 2.0);

let outcome = compute_unit.buffered_compute(&mut input_values);
println!("{:?}", outcome);

surrealml's People

Contributors

ce11an, maxwellflitton, sgirones, timpratim


surrealml's Issues

Installation fails

I tried installing SurrealML using
pip install git+https://github.com/surrealdb/surrealml in Python 3.11.6.

Unfortunately I am running into this error:

       --> modules/utils/src/execution/onnx_environment.rs:9:39
        |
      9 | pub static LIB_BYTES: &'static [u8] = include_bytes!("../../onnx_driver/target/debug/libonnxruntime.dylib");
        |                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        |
        = note: this error originates in the macro `include_bytes` (in Nightly builds, run with -Z macro-backtrace for more info)

      error: could not compile `surrealml-core` (lib) due to 1 previous error

My default toolchain is nightly-x86_64-apple-darwin (1.77).

I also installed onnxruntime via pip and homebrew just in case.

Please tell me what I'm missing. Are there any installation instructions?

Feature: shell.nix ?

Having a shell.nix would allow Nix flakes users to easily install this. Could someone make a PR if you're already using one for a local build?

Model name and version missing when importing `.surml` file into SurrealDB.

Example

Example code taken from the README.md:

from sklearn.linear_model import LinearRegression
from surrealml import SurMlFile, Engine
from surrealml.model_templates.datasets.house_linear import HOUSE_LINEAR # click on this HOUSE_LINEAR to see the data


if __name__ == '__main__':

    # train the model
    model = LinearRegression()
    model.fit(HOUSE_LINEAR["inputs"], HOUSE_LINEAR["outputs"])

    # package and save the model
    file = SurMlFile(model=model, name="linear", inputs=HOUSE_LINEAR["inputs"], engine=Engine.SKLEARN)

    # add columns in the order of the inputs to map dictionaries passed in to the model
    file.add_column("squarefoot")
    file.add_column("num_floors")

    # add normalisers for the columns
    file.add_normaliser("squarefoot", "z_score", HOUSE_LINEAR["squarefoot"].mean(), HOUSE_LINEAR["squarefoot"].std())
    file.add_normaliser("num_floors", "z_score", HOUSE_LINEAR["num_floors"].mean(), HOUSE_LINEAR["num_floors"].std())
    file.add_output("house_price", "z_score", HOUSE_LINEAR["outputs"].mean(), HOUSE_LINEAR["outputs"].std())

    # save the file
    file.save(path="./linear.surml")

    # load the file
    new_file = SurMlFile.load(path="./linear.surml", engine=Engine.SKLEARN)

    # Make a prediction (both should be the same due to the perfectly correlated example data)
    print(new_file.buffered_compute(value_map={"squarefoot": 5, "num_floors": 6}))
    print(new_file.raw_compute(input_vector=[5, 6]))

I get the outputs successfully:

[5.013289451599121]
[5.013289451599121]

I can also import the model into the database:

surreal start --log trace --user root --pass root --bind 0.0.0.0:8000 file:mydatabase.db    

and

surreal ml import linear.surml --namespace test --database test --conn http://localhost:8000

gives

2024-02-03T09:42:40.689501Z  INFO surreal::cli::ml::import: The SurrealML file was imported successfully

However, when looking at the trace we see that the model name and version are undefined:

2024-02-03T09:42:40.689088Z DEBUG request:process:executor: surrealdb::dbs::executor: Executing: DEFINE MODEL ml::``<> COMMENT '' PERMISSIONS FULL otel.kind="server" http.request.method="POST" url.path="/ml/import" network.protocol.name="http" network.protocol.version="1.1" otel.name="POST /ml/import" http.route="/ml/import" http.request.id="59568a8f-f6f0-41bf-be09-c29446f6ddbd" client.address="127.0.0.1"

This becomes more apparent when performing a query with the model:

surreal sql --username root --password root --namespace test --database test

running

test/test>  ml::linear<0.0.1>({squarefoot: 1000, num_floors: 1})

gives

["The model 'ml::linear<0.0.1>' does not exist"]

Versions

  • surreal: 1.1.1 for macos on aarch64
  • python: 3.10.11
  • surrealml: latest

Potential Fix

This behaviour arises from add_name and add_version being missing from surml_file.py.
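
Once those methods are added, the packaging step would presumably include explicit name and version calls along these lines (add_name is hypothetical here and based on the proposed fix; add_version is the call used in the multi-output issue further down):

file = SurMlFile(model=model, name="linear", inputs=HOUSE_LINEAR["inputs"], engine=Engine.SKLEARN)

# hypothetical calls once the fix lands, so the model is defined in
# SurrealDB as ml::linear<0.0.1> rather than ml::``<>
file.add_name("linear")
file.add_version("0.0.1")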

Acceptance

  • Fix import error.
  • Update examples and tests.
  • Create an integration test with SurrealDB to check that a .surml file can be imported and predictions can be queried.
  • Add an error message/warning to the user that they are importing a .surml file into SurrealDB.

[ERROR]: `sklearn` example doesn't run

import numpy as np
from sklearn.ensemble import RandomForestClassifier

from surrealml import SurMlFile

num_classes = 2
X = np.random.rand(100, 28)
y = np.random.randint(num_classes, size=100)

skl_model = RandomForestClassifier(n_estimators=10, max_depth=10)
skl_model.fit(X, y)

test_file = SurMlFile(
    model=skl_model,
    name="random forrest classifier",
    inputs=X,
    sklearn=True,
)
test_file.save("./test_forrest.surml")

# load model and execute a calculation
random_floats = list(np.random.rand(28))
test_load = SurMlFile.load("./test_forrest.surml")

Running this produces the following output and traceback:

Model saved with digest: 747dc97cf71db25ea44fe440cdd726d46651fbd5
/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/torch/onnx/utils.py:825: UserWarning: no signature found for <torch.ScriptMethod object at 0x297134bf0>, skipping _decide_input_format
  warnings.warn(f"{e}, skipping _decide_input_format")
================ Diagnostic Run torch.onnx.export version 2.0.0 ================
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Traceback (most recent call last):
  File "/Users/ayush/Projects/ML/surrealml_demo/sklean.py", line 13, in <module>
    test_file = SurMlFile(
                ^^^^^^^^^^
  File "/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/surrealml/surml_file.py", line 35, in __init__
    self.file_id = self._cache_model()
                   ^^^^^^^^^^^^^^^^^^^
  File "/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/surrealml/surml_file.py", line 59, in _cache_model
    torch.onnx.export(traced_script_module, self.inputs, file_path)
  File "/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/torch/onnx/utils.py", line 506, in export
    _export(
  File "/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/torch/onnx/utils.py", line 1548, in _export
    graph, params_dict, torch_out = _model_to_graph(
                                    ^^^^^^^^^^^^^^^^
  File "/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/torch/onnx/utils.py", line 1113, in _model_to_graph
    graph, params, torch_out, module = _create_jit_graph(model, args)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ayush/miniconda3/envs/surml/lib/python3.11/site-packages/torch/onnx/utils.py", line 956, in _create_jit_graph
    flattened_args = tuple(torch.jit._flatten(tuple(args))[0])
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: numpy.ndarray

Bug: server shuts down if an invalid file is used on /ml/import

SurrealDB ver 1.1.1, macOS Sonoma

Steps to reproduce

Start SurrealDB with --auth.
Use Postman or any HTTP client you like and send a POST to http://127.0.0.1:8000/ml/import with all the needed fields (headers ns|db and basic auth), but send a wrong file, for example a file you exported out of the database. I just don't have a valid ML file, which is what is expected here, so I used any file =)

Expected behaviour

I expect an error (400), as on other http endpoints here

Actual behaviour:

The server immediately shuts down on a request to /ml/import with:

2024-01-27T14:38:33.286782Z INFO surreal::env: Running 1.1.1 for macos on aarch64
2024-01-27T14:38:33.287232Z INFO surreal::dbs: ✅🔒 Authentication is enabled 🔒✅
2024-01-27T14:38:33.287677Z INFO surrealdb::kvs::ds: Starting kvs store at file://my_test_database.db
2024-01-27T14:38:33.372192Z INFO surrealdb::kvs::ds: Started kvs store at file://my_test_database.db
2024-01-27T14:38:33.375527Z INFO surrealdb::node: Started node agent
2024-01-27T14:38:33.377418Z INFO surrealdb::net: Started web server on 0.0.0.0:8000
thread 'surrealdb-worker' panicked at /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/surrealml-core-0.0.3/src/storage/surml_file.rs:71:46:
range end index 757932081 out of range for slice of length 3717
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
[1] 8348 abort surreal start --auth file:my_test_database.db

Here it is with RUST_BACKTRACE=full:

2024-01-27T14:47:03.865968Z INFO surrealdb::net: Started web server on 0.0.0.0:8000
thread 'surrealdb-worker' panicked at /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/surrealml-core-0.0.3/src/storage/surml_file.rs:71:46:
range end index 757932081 out of range for slice of length 3717
stack backtrace:
   0: 0x1057e1114 - __mh_execute_header
   1: 0x105459b74 - __mh_execute_header
   2: 0x1057b97cc - __mh_execute_header
   3: 0x1057e48b4 - __mh_execute_header
   4: 0x1057e44fc - __mh_execute_header
   5: 0x1057e5384 - __mh_execute_header
   6: 0x1057e4f3c - __mh_execute_header
   7: 0x1057e4ea8 - __mh_execute_header
   8: 0x1057e4e9c - __mh_execute_header
   9: 0x1064f5148 - __ZN7rocksdb6ribbon6detail34BandingConfigHelper1MaybeSupportedILNS0_25ConstructionFailureChanceE1ELy128ELb0ELb0ELb1EE11GetNumSlotsEj
  10: 0x1064f5238 - __ZN7rocksdb6ribbon6detail34BandingConfigHelper1MaybeSupportedILNS0_25ConstructionFailureChanceE1ELy128ELb0ELb0ELb1EE11GetNumSlotsEj
  11: 0x105dcc49c - __mh_execute_header
  12: 0x1051987c8 - __mh_execute_header
  13: 0x104f344b4 - __mh_execute_header
  14: 0x1051df22c - __mh_execute_header
  15: 0x104f37790 - __mh_execute_header
  16: 0x1051df22c - __mh_execute_header
  17: 0x104f3412c - __mh_execute_header
  18: 0x1051deeac - __mh_execute_header
  19: 0x104f35c68 - __mh_execute_header
  20: 0x104fc3cbc - __mh_execute_header
  21: 0x104f328d4 - __mh_execute_header
  22: 0x1051deeac - __mh_execute_header
  23: 0x104f57940 - __mh_execute_header
  24: 0x104f608f0 - __mh_execute_header
  25: 0x105101c00 - __mh_execute_header
  26: 0x105edeb50 - __mh_execute_header
  27: 0x105ee2320 - __mh_execute_header
  28: 0x105ecf9b0 - __mh_execute_header
  29: 0x105ecf6e8 - __mh_execute_header
  30: 0x1057e7af8 - __mh_execute_header
  31: 0x186b5e034 - __pthread_joiner_wake
[1] 8442 abort surreal start --auth file:my_test_database.db

Uploaded SurrealML Model Did Not Give Expected Results

Describe the bug
I did not get the expected results with the uploaded surrealml model.

To Reproduce

  1. Created a multi-output linear regression with sklearn:
import sklearn
import skl2onnx
import numpy as np
from sklearn.linear_model import LinearRegression
from surrealml import SurMlFile, Engine

n_samples = 100
n_features = 3
n_outputs = 3
X = np.array(np.random.rand(n_samples, n_features), dtype=np.float32)
Y = np.array(np.random.rand(n_samples, n_outputs), dtype=np.float32)

model = LinearRegression()
model.fit(X,Y)

  2. Stored the model as a SurrealML file:
file = SurMlFile(
	model=model, 
	name="testprediction", 
	inputs=X[:1], 
	engine=Engine.SKLEARN
)

file.add_version(version="0.0.1")
file.save(path="./lineartestprediction.surml")

new_file = SurMlFile.load(path="./lineartestprediction.surml", engine=Engine.SKLEARN)

  3. This is the expected output I get using raw compute in Python: (screenshot)

  4. However, when I uploaded the model to SurrealDB and queried it, this is what I get. It only returns the first output, but I expected three: (screenshot)

Expected behavior
It should return all three outputs: [0.5089617371559143, 0.5183728933334351, 0.5182746648788452]

Environment

Platform: Desktop
OS: Darwin
Architecture: aarch64
WebView: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko)
Version: 2.0.6
Flags: featureFlags: false, models_view: true, apidocs_view: true, themes: false, newsfeed: true
