Comments (2)
Note that the TRTIS build itself is no longer based on Bazel but on CMake. In the same spirit, we could make sure that TRTIS is easy to use as a CMake external project.
from server.
We have no plans to support Bazel builds. We do support CMake, and it should be easy to integrate the TRTIS build into an existing CMake build.
from server.
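As a rough illustration of the kind of integration the comment above describes, here is a minimal sketch of pulling the server build into an existing CMake project with `ExternalProject_Add`. The Git tag and the build options passed through `CMAKE_ARGS` are assumptions for illustration, not values confirmed by the server's own build documentation; consult the repository's build instructions for the actual option names.

```cmake
# Sketch: building Triton (TRTIS) as a CMake external project.
# GIT_TAG and the TRITON_* options below are illustrative assumptions.
cmake_minimum_required(VERSION 3.17)
project(my_app)

include(ExternalProject)

ExternalProject_Add(triton_server
  GIT_REPOSITORY https://github.com/triton-inference-server/server.git
  GIT_TAG        main                      # pin a real release tag in practice
  CMAKE_ARGS
    -DCMAKE_BUILD_TYPE=Release
    -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/triton-install
  # The external build installs into triton-install; downstream targets
  # can then point their include/link paths at that prefix.
)
```

The advantage of `ExternalProject_Add` over vendoring the source is that the server keeps its own configure step and toolchain requirements isolated from the enclosing project; the cost is that its headers and libraries are only available after the external step runs, so downstream targets must declare a dependency on `triton_server`.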
Related Issues (20)
- Copy GPU Tensor to CPU on Python backend without using pytorch HOT 2
- conda-pack failing: Failed to initialize Python stub for auto-complete HOT 4
- Dynamic batching that supports static batch size with padding HOT 10
- UNAVAILABLE: Internal: FileNotFoundError: [Errno2] No usable temporary directory found in ['/tp', '/var/tmp','/usr/tmp', '/tmp/python_Lsk/3'] env_6Lp HOT 1
- How does share memory speed up inference? HOT 6
- Memorystore Redis IAM AUTH
- Response caching GPU tensors HOT 1
- Abnormal system memory usage while enabling GPU metrics HOT 1
- Request for Improved Metrics and Real-Time Concurrency Reporting in Triton Inference Server
- Python AsyncIO infer does not support shared memory HOT 1
- client silent failure - E0422 05:03:24.145960 1 pb_stub.cc:402] An error occurred while trying to load GPU buffers in the Python backend stub: failed to copy data: invalid argument HOT 3
- CUDA Graph not work HOT 4
- [RFE] HandleGenerate equivalent for sagemaker_server.cc HOT 1
- The time spent on the inference request process far exceeds the model inference time. How can I determine where this additional time is being consumed?
- Casting NumPy string array to np_utils.Tensor disproportionately increases latency HOT 2
- On server/deploy/oci -> running "helm install example ." to deploy the Inference Server and pod doesn't get to running due to Liveness probe failed & Readiness probe failed HOT 1
- trt_profile_max_shapes not supported for ONNX-TRT backend HOT 1
- Failed to initialize Python stub + ModuleNotFoundError: No module named 'nvtabular', 'merlin' HOT 1
- does triton support different model-repository assemble into a batch? HOT 1
- Question: Which backends automatically warm up models? HOT 1