
redisai-js's Introduction


Caution

RedisAI is no longer actively maintained or supported.

We are grateful to the RedisAI community for their interest and support.

RedisAI

RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is to be a "workhorse" for model serving by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.

To read RedisAI docs, visit redisai.io. To see RedisAI in action, visit the demos page.

Quickstart

RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.

The following sections describe how to get started with RedisAI.

Docker

The quickest way to try RedisAI is by launching its official Docker container images.

On a CPU only machine

docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic

On a GPU machine

For GPU support you'll need a machine that has the Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit, and Docker 19.03+ installed. For detailed information, check out the nvidia-docker documentation.

docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic

Building

You can compile and build the module from its source code. The Developer page has more information about the design and implementation of the RedisAI module and how to contribute.

Prerequisites

  • Packages: git, python3, make, wget, g++/clang, & unzip
  • CMake 3.0 or higher needs to be installed.
  • CUDA 11.3 and cuDNN 8.1 or higher needs to be installed if GPU support is required.
  • Redis v6.0.0 or greater.

Get the Source Code

You can obtain the module's source code by cloning the project's repository using git like so:

git clone --recursive https://github.com/RedisAI/RedisAI

Switch to the project's directory with:

cd RedisAI

Building the Dependencies

Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:

bash get_deps.sh

Alternatively, you can run the following to fetch the backends with GPU support.

bash get_deps.sh gpu

Building the Module

Once the dependencies have been built, you can build the RedisAI module with:

make -C opt clean ALL=1
make -C opt

Alternatively, run the following to build RedisAI with GPU support:

make -C opt clean ALL=1
make -C opt GPU=1

Backend Dependencies

RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and the supported backends. This is extremely important, since the serialization mechanism of one version might not match another. To make sure your model will work with a given RedisAI version, check the backend documentation for incompatible features between the version of your backend and the version RedisAI is built with.

RedisAI   PyTorch   TensorFlow   TFLite   ONNXRuntime
1.0.3     1.5.0     1.15.0       2.0.0    1.2.0
1.2.7     1.11.0    2.8.0        2.0.0    1.11.1
master    1.11.0    2.8.0        2.0.0    1.11.1

Note: Keras and TensorFlow 2.x are supported through graph freezing. See this script for an example of exporting a frozen graph from Keras and TensorFlow 2.x.

Loading the Module

To load the module upon starting the Redis server, simply use the --loadmodule command line switch, the loadmodule configuration directive, or the Redis MODULE LOAD command with the path to the module's library.

For example, to load the module from the project's path with a server command line switch use the following:

redis-server --loadmodule ./install-cpu/redisai.so
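
For reference, the two other loading methods mentioned above look like this (a minimal sketch reusing the same CPU build path; adjust it to your installation):

# redis.conf
loadmodule ./install-cpu/redisai.so

# at runtime, from redis-cli
redis> MODULE LOAD ./install-cpu/redisai.so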

Give it a try

Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module are described here.
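
For example, the following redisai-js snippet sets a tensor and reads it back from a locally running server. This is a minimal sketch assuming the client's exported Client, Tensor, and Dtype helpers and a RedisAI instance listening on the default port:

const redis = require('redis');
const redisai = require('redisai-js');

(async () => {
    // Wrap a plain node_redis connection with the redisai-js client.
    const nativeClient = redis.createClient();
    const aiclient = new redisai.Client(nativeClient);

    // Store a 1x2 float32 tensor and read it back.
    const tensorA = new redisai.Tensor(redisai.Dtype.float32, [1, 2], [2, 3]);
    await aiclient.tensorset('tensorA', tensorA);
    const reply = await aiclient.tensorget('tensorA');
    console.log(reply); // the stored tensor; its values should be [2, 3]

    nativeClient.quit();
})();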

Client libraries

Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones:

Project              Language                License        Author          URL
JRedisAI             Java                    BSD-3          RedisLabs       Github
redisai-py           Python                  BSD-3          RedisLabs       Github
redisai-go           Go                      BSD-3          RedisLabs       Github
redisai-js           Typescript/Javascript   BSD-3          RedisLabs       Github
redis-modules-sdk    TypeScript              BSD-3-Clause   Dani Tseitlin   Github
redis-modules-java   Java                    Apache-2.0     dengliming      Github
smartredis           C++                     BSD-2-Clause   Cray Labs       Github
smartredis           C                       BSD-2-Clause   Cray Labs       Github
smartredis           Fortran                 BSD-2-Clause   Cray Labs       Github
smartredis           Python                  BSD-2-Clause   Cray Labs       Github

The full documentation for RedisAI's API can be found at the Commands page.

Documentation

Read the docs at redisai.io.

Contact Us

If you have questions, want to provide feedback, report an issue, or contribute some code, reach out through the RedisAI forum or the project's Discord channel.

License

RedisAI is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).

redisai-js's People

Contributors

chayim, dengliming, dependabot[bot], dvirdukhan, filipecosta90, gkorland, leibale, snyk-bot


redisai-js's Issues

Use TF model object as parameter in new redisai.Model

As a Node.js developer, I'm using the redisai-js module to connect my Node.js application to RedisAI. I save and load my models in TensorFlow.js format (model.json, weights.bin), so it would be useful to create the RedisAI model starting from these two saved files. Currently the code I use doesn't work because of the model format:

    const m = await tf.loadLayersModel(`file://model/AX-model/model.json`)
    const myModel = new redisai.Model(redisai.Backend.TF, 'CPU',['a','b'], ['c'], m)
    const r = await aiclient.modelset('mlmodel', myModel)

So I'm interested in finding a solution: in my Node.js application I also struggle to save TensorFlow.js models directly in Redis' in-memory store. I'd like to choose which operations are done in the Node.js application and which in RedisAI, for example training the model in the Node.js application and predicting in RedisAI:

    const model = tf.sequential()
    model.add(tf.layers.dense({inputShape: [1], units: 1}))
    model.add(tf.layers.dense({units: 1}))
    model.compile({
        optimizer: 'sgd',   // tf.train.adam(),
        // loss: tf.losses.meanSquaredError,
        loss: 'meanSquaredError',
        metrics: ['mse']
    })

    const myModel = new redisai.Model(redisai.Backend.TF, 'CPU', ['a', 'b'], ['c'], model)
    const r = await aiclient.modelset('mlmodel', myModel)
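
For context, the approach that works with the current API is to pass the serialized model bytes rather than a live TensorFlow.js model object: RedisAI's TF backend expects a frozen graph (see the graph-freezing note in the README above). The sketch below assumes the model has already been exported to a frozen .pb file; the file path is hypothetical:

    const fs = require('fs');
    const redis = require('redis');
    const redisai = require('redisai-js');

    (async () => {
        const aiclient = new redisai.Client(redis.createClient());

        // Read a frozen TensorFlow graph exported ahead of time (hypothetical path).
        const modelBlob = fs.readFileSync('./model/AX-model/frozen_graph.pb');

        // Pass the serialized graph bytes instead of a tf.LayersModel object.
        const myModel = new redisai.Model(redisai.Backend.TF, 'CPU', ['a', 'b'], ['c'], modelBlob);
        const reply = await aiclient.modelset('mlmodel', myModel);
        console.log(reply); // 'OK' if the blob is a valid frozen graph for the loaded TF backend
    })();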

Support for variadic arguments on SCRIPTRUN

Further reference:
https://oss.redislabs.com/redisai/master/commands/#aiscriptrun

Example of how to provide an arbitrary number of inputs after the $ sign:

redis> AI.TENSORSET mytensor1 FLOAT 1 VALUES 40
OK
redis> AI.TENSORSET mytensor2 FLOAT 1 VALUES 1
OK
redis> AI.TENSORSET mytensor3 FLOAT 1 VALUES 1
OK
redis> AI.SCRIPTRUN myscript addn INPUTS mytensor1 $ mytensor2 mytensor3 OUTPUTS result
OK
redis> AI.TENSORGET result VALUES
1) FLOAT
2) 1) (integer) 1
3) 1) "42"
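
A hedged sketch of what the requested call could look like from redisai-js, assuming a scriptrun(key, functionName, inputs, outputs) method and assuming the variadic inputs after the $ marker would be passed as extra entries in the inputs array (this mirrors the redis-cli example above and is the behavior being requested, not an existing guarantee):

    const redis = require('redis');
    const redisai = require('redisai-js');

    (async () => {
        const aiclient = new redisai.Client(redis.createClient());

        // One fixed input, then a variadic list of tensors after the '$' separator.
        const result = await aiclient.scriptrun(
            'myscript',
            'addn',
            ['mytensor1', '$', 'mytensor2', 'mytensor3'],
            ['result']
        );
        console.log(result); // expected: 'OK'
    })();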

Getting a basic example going in TypeScript

I have been using redisai-py for a while. Now I am starting to integrate use of redisai into my website with redisai-js.

I am using typescript and ts-node in development. Below is the code I added following the example and the error I received.

import redis from "redis";
import redisai from "redisai-js";

(async () => {
    const nativeClient = redis.createClient({ url: "redis://redis:6379" });
    const rai = new redisai.Client(nativeClient );
})();
Error: Cannot find module './backend'
Require stack:
- /app/node_modules/redisai-js/lib/index.js
- /app/src/index.ts
at Function.Module._resolveFilename (internal/modules/cjs/loader.js:880:15)
at Function.Module._load (internal/modules/cjs/loader.js:725:27)
at Module.require (internal/modules/cjs/loader.js:952:19)
at require (internal/modules/cjs/helpers.js:88:18)
at Object.<anonymous> (/app/node_modules/redisai-js/src/index.ts:2:1)
at Module._compile (internal/modules/cjs/loader.js:1063:30)
at Module._extensions..js (internal/modules/cjs/loader.js:1092:10)
at Object.require.extensions.<computed> [as .js] (/app/node_modules/ts-node/src/index.ts:851:44)
at Module.load (internal/modules/cjs/loader.js:928:32)
at Function.Module._load (internal/modules/cjs/loader.js:769:14)
[nodemon] app crashed - waiting for file changes before starting...

Syntax Error upon import

I'm getting a syntax error doing the simplest of things:

$ npm install --save redisai-js

I copy and paste the Vanilla JS tensor example from the README.md into a file named redisai-js.js. And then I run it:

$ node redisai-js.js

And it gives me this error:

/Users/guyroyse/code/redis-modules-assessment/node_modules/redisai-js/lib/tensor.js:49
    });
    ^

SyntaxError: Unexpected token '}'
    at wrapSafe (internal/modules/cjs/loader.js:931:16)
    at Module._compile (internal/modules/cjs/loader.js:979:27)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1035:10)
    at Module.load (internal/modules/cjs/loader.js:879:32)
    at Function.Module._load (internal/modules/cjs/loader.js:724:14)
    at Module.require (internal/modules/cjs/loader.js:903:19)
    at require (internal/modules/cjs/helpers.js:74:18)
    at Object.<anonymous> (/Users/guyroyse/code/redis-modules-assessment/node_modules/redisai-js/lib/index.js:10:16)
    at Module._compile (internal/modules/cjs/loader.js:1015:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1035:10)

I pulled the repo down myself, built everything, and npm installed the folder on my machine, and it worked. Seems like something is wrong with our npm deploy? Maybe?
